Honey, who shrank the chatbot? Even AI has been counting calories.

When Shrinkflation Meets AI: The Great Chatbot Diet

In a small college town, long before the age of apps and algorithms, a quirky system of shared auto-rickshaws became a lifeline for students. These budget-friendly rides were like a limousine service for the masses, ferrying groups of students for mere pennies. But when fuel prices surged during the Iraq War, the auto drivers didn’t raise their rates. Instead, they found a creative solution: squeezing more passengers into each rickshaw.

First, it was five passengers. Then seven. Soon, the rickshaws became sardine cans on wheels, with students clinging to each other and sacrificing any notion of personal space. This was the original ride-sharing economy, long before smartphones and surge pricing algorithms entered the picture.

It wasn’t just the rides that adapted to economic pressures. The idlis—a popular South Indian snack—at nearby cafés began to shrink in size every trimester. While this might have been a blessing for waistlines, it wasn’t exactly welcomed by the scrawny, perpetually hungry students. Back then, there wasn’t a fancy term like “shrinkflation” to describe this phenomenon.

Shrinkflation: From Snacks to AI

Fast forward to today, and shrinkflation is everywhere. Open a bag of chips, and you might feel like you’ve purchased a packet of air with a side of disappointment. “Family-sized” cartons of juice now seem designed for a minimalist family of two. Shrinkflation has become as common as selfies at tourist spots.

But now, shrinkflation has reached an unexpected frontier: the world of Generative Artificial Intelligence (GenAI). Once celebrated for their verbose and insightful responses, AI chatbots are now showing signs of cutting back—on words, that is.

When AI Goes on a Diet

Not long ago, AI chatbots like OpenAI’s ChatGPT could churn out detailed, coherent answers with the enthusiasm of a history professor on caffeine. But those days of verbosity now feel like a distant memory. Today, these chatbots seem to have taken a vow of silence—or at least brevity—right when things get interesting.

For instance, ChatGPT used to build up to grand points with elaborate explanations. Now, it often stops mid-sentence, leaving users hanging like a season finale cliffhanger. Rival AI services like Perplexity have embraced minimalism, offering rapid-fire bullet points that feel more like SparkNotes than insightful answers. Google’s Gemini, meanwhile, opts for “concise summaries,” akin to a polite friend who doesn’t want to overwhelm you with too much information.

Where once you could ask about the history of philosophy and receive a dissertation, now you’re more likely to get: “Plato said some stuff. Aristotle disagreed. Skip to Nietzsche. The end.” At this rate, it wouldn’t be surprising if future AI responses were delivered in emojis or haikus.

The Cost of Words

So, what’s behind this shift? The answer lies in the economics of AI. Chatbots process text as tokens: small chunks, each roughly a word or part of a word. Every token read or generated requires computational power, and with millions of users asking everything from “What’s the meaning of life?” to “Why does my cat ignore me?” the costs add up quickly.
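To make the economics concrete, here is a minimal sketch of why shorter answers save money. The per-token rates and query volumes below are invented for illustration, not any vendor’s actual pricing; the point is only that cost scales linearly with output length.

```python
# Hypothetical illustration of per-token serving costs.
# The rates below are assumptions for the sketch, not real vendor pricing.

INPUT_RATE = 0.000005   # assumed dollars per input token
OUTPUT_RATE = 0.000015  # assumed dollars per output token

def query_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated dollar cost of serving one chat request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# One verbose answer vs. one terse answer to the same question.
verbose = query_cost(input_tokens=200, output_tokens=1500)
terse = query_cost(input_tokens=200, output_tokens=150)
print(f"verbose: ${verbose:.4f}, terse: ${terse:.4f}")

# Multiplied across millions of daily queries, trimming output
# tokens is one of the few levers a provider can pull quickly.
daily_savings = (verbose - terse) * 5_000_000  # assumed 5M queries/day
print(f"savings at 5M queries/day: ${daily_savings:,.0f}")
```

Even at fractions of a cent per query, the gap between a long answer and a short one compounds into real money at scale, which is why brevity shows up as a cost control.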

Anecdotally, a single GPT-4 query consumes enough electricity to power a small LED bulb for an hour. Multiply that by millions of users, and the energy costs become staggering—enough to make even Bitcoin miners blush. To manage these expenses, AI companies have started trimming the fat, imposing limits on the number of tokens in both input and output.

This “AI shrinkflation” might be a temporary measure to balance the books without charging users directly. However, it’s likely that pricing models will soon evolve. Imagine a future where simple answers are free, but detailed, dissertation-level responses come with a price tag. Just as airlines charge for extra baggage, AI companies might charge for every token exceeding a free tier.

Adapting to the New Normal

While this shift may be frustrating, it also serves as a reminder to appreciate what we have—or, in this case, what we had. In the spirit of Thanksgiving, it’s worth savoring the moments when AI chatbots do deliver dazzlingly precise answers or inspiring insights.

Chatbots are products of the post-free-social-media era, where “Everything-as-a-Service” (XaaS) is the norm. Their pricing models will continue to evolve, likely incorporating additional costs beyond current subscription fees. For now, we can be thankful that we’re not yet being charged for every word or token.

Looking Ahead

As AI continues to evolve, it’s clear that the days of unlimited, verbose responses may be behind us. But this doesn’t have to be a bad thing. Perhaps the shift toward brevity will encourage us to ask better, more focused questions. Or maybe it will inspire new innovations in how we interact with AI.

For now, though, it might be time to take a break, reflect, and enjoy a cup of coffee—or perhaps a Kafkaesque moment of contemplation about the future of AI.

Originally Written by: Innovation Investor
