Gemini is an increasingly good chatbot, but it’s still a bad assistant

Google leverages the theoretical power of generative AI to give Gemini access to data across multiple apps. When it works, this can be very handy. For example, you can ask Gemini to check your email for a specific message, extract data, and pass it to another app. I was excited about this functionality at first, but in practice, it makes me miss the way Assistant would just fail without wasting my time.

I was reminded of this issue recently when I asked Gemini to dig up a shipment tracking number from an email—something I do fairly often. It appeared to work just fine, with the robot citing the correct email and spitting out a long string of numbers. I didn’t realize anything was amiss until I tried to look up the tracking number. It didn’t work in Google’s search-based tracker, and going to the US Postal Service website yielded an error.

That’s when it dawned on me: The tracking number wasn’t a tracking number; it was a confabulation. It was a believable one, too. The number was about the right length, and like all USPS tracking numbers, it started with a nine. I could have looked up the tracking number myself in a fraction of the time it took to root out Gemini’s mistake, which is very, very frustrating. Gemini appeared confident that it had completed the task I had given it, but getting mad at the chatbot wouldn’t do any good—it can’t understand my anger any more than it can understand the nature of my original query.
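The plausibility is easy to understand: common domestic USPS tracking numbers are long strings of digits (often 20–22) beginning with a 9, so a surface-level format check can't distinguish a real number from a confabulated one. A minimal sketch of such a heuristic, assuming the 20–22-digit format (the function name is mine, and this is not an official USPS validator):

```python
import re

def looks_like_usps_tracking(number: str) -> bool:
    """Rough format heuristic for common domestic USPS tracking
    numbers (assumed here to be 20-22 digits starting with 9).
    Passing this check does NOT mean the number is real -- a
    plausible fabrication sails right through, which is exactly
    the problem with Gemini's made-up number."""
    digits = re.sub(r"\s+", "", number)
    return bool(re.fullmatch(r"9\d{19,21}", digits))
```

A fabricated number of the right shape passes this check just as easily as a genuine one; only the carrier's own lookup can tell them apart, which is why the error surfaced so late.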

At this point, I would kill for Assistant’s “Sorry, I don’t understand.”

This is just one of many similar incidents I’ve had with Gemini over the last year—I can’t count how many times Gemini has added calendar events to the wrong day or put incorrect data in a note. In fairness, Gemini usually gets these tasks right, but its mechanical imagination wanders often enough that its utility as an assistant is suspect. Assistant just couldn’t do a lot of things, but it didn’t waste my time acting like it could. Gemini is more insidious, claiming to have solved my problem when, in fact, it’s sending me down a rabbit hole to fix its mistakes. If a human assistant operated like this, I would have to conclude they were incompetent or openly malicious.
