Ask ChatGPT to estimate the carbs in your lunch. Now ask it again. And again. Five hundred times. You’d expect the same answer each time. It’s the same photo, the same model, the same question. But you won’t get the same answer. Not even close — and the differences are large enough to cause a
What in the picture indicates any form of filling?
What you can see is cheese, and there is probably butter too, but those two have essentially zero carbohydrates, so adding carbs for a filling would be pure speculation.
There are no carbohydrates to see beyond the bread.
There is no evidence of any filling, as there is zero bulge in the bread.
The answer should be based on what can be seen, with a note to that effect, and a caveat that the count could be higher if there is a filling that isn't visible.
The AI could ask about a possible filling, instead of just making shit up with zero evidence.
If a friend texted me the same picture and question, I would do exactly what you described: make an educated guess, and it wouldn't change if they asked again.
Unless I was lazy and Googled it.
Google's carbohydrate tool says 8g, and then the AI Overview immediately contradicts that: "A standard cheese sandwich typically contains between 25 and 35g."
Friendly reminder that LLMs don’t do math, they guess what number should come next, just like words.
It can probably link the image to the words "a photo of a sandwich on a plate", and interpret the question as "how many calories are in a sandwich", but from there it is just guessing at the shape of an answer, not finding any truth.
It knows sandwiches have calories and those tend to be 3-4 digit numbers, but also all numbers kinda look the same, so what’s to say it’s not 2, 5, or 12 digits?
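To make that concrete, here's a toy sketch in plain Python of what digit-by-digit sampling looks like. All the probabilities below are invented for illustration; a real model's distribution comes from its training data, not from anything in the photo.

```python
import random

# Toy model of an LLM emitting a carb estimate one digit token at a time.
# The weights are made up: the point is only that each digit is *sampled*,
# so repeated runs produce different "answers" to the identical question.
def sample_carb_estimate(rng):
    # First, "decide" how many digits the number will have. Models have
    # mostly seen 1-3 digit carb counts, so those get the highest weight.
    n_digits = rng.choices([1, 2, 3], weights=[0.2, 0.6, 0.2])[0]
    digits = [str(rng.randint(1, 9))]                     # leading digit, never 0
    digits += [str(rng.randint(0, 9)) for _ in range(n_digits - 1)]
    return int("".join(digits))

rng = random.Random(42)
estimates = [sample_carb_estimate(rng) for _ in range(5)]
print(estimates)  # five different "estimates" for the same sandwich
```

Nothing in that loop ever looks at the sandwich; it only samples plausible-looking numbers, which is exactly why asking five hundred times gives a spread of answers.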
Tool-powered agents can do math, though. The issue is the fuzziness of the inputs: it doesn't know the weight, the ingredients, or anything beyond a picture. These tools can be useful, but not for this. Maybe one day, but not yet.
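A minimal sketch of that point: assume an agent with an exact arithmetic tool. The sums come out right every time; the variance comes entirely from the guessed inputs. Every weight and carb density below is invented for illustration.

```python
# A perfect "calculator tool" doesn't fix guessed inputs.
# The tool sums carbs exactly -- this part an agent genuinely can do --
# but the model still has to invent the ingredient list from a photo.
def carb_tool(ingredients):
    """Sum carbs over (grams, carbs_per_gram) pairs. Exact arithmetic."""
    return sum(grams * carbs_per_gram for grams, carbs_per_gram in ingredients)

# Two equally plausible guesses at the same sandwich photo:
guess_a = [(60, 0.50), (30, 0.0)]               # 60g bread, 30g cheese
guess_b = [(90, 0.50), (20, 0.0), (15, 0.05)]   # bigger bread, plus a spread

print(carb_tool(guess_a))  # 30.0
print(carb_tool(guess_b))  # 45.75
```

Both totals are arithmetically perfect, and they still disagree by 50%, because the disagreement lives in the guessed ingredients, not in the math.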
Whoever claims an AI (LLM or agent) can do this and charges their users for it is lying to and defrauding them.
The apps are advertising that they can do this, though. Many of them are aggressively sponsoring YouTubers who claim you can basically just wave your phone over the food and it takes away all the "work" of traditional calorie-counting apps.
To be fair, there's no way of knowing what the filling is, so the AI may be padding its guess based on that too.
Nope, Claude and Gemini both guessed fewer carbs than are in the bread.
But the AI assumes itself infallible; at the very least it could ask…
That's true. It should ask follow-up questions, or at least state its assumptions clearly.