I would pay something like $5 per day for this if it was a real person who looks at my photos, categorizes them accurately, and enters them into MyFitnessPal.
I won't trust an AI until it has a massive database and a million users. Until then, it's just not going to work well enough. You also need to convince users why they should use your app, instead of MyFitnessPal (which already has a massive database, including barcodes.)
I agree with your suggestion of human assessment because of the problem posed by food calories varying significantly with preparation.
Take soda, for example: if I gave you a picture of soda in a glass, could you tell whether it was diet or regular? You might scoff at such an edge case, but it quickly becomes more common when you look into food preparation techniques. This is why caloric estimation is really difficult, and why restaurants don't list their calories: the calories of a meal do not equal the sum of its parts.
All these solutions are common because people want a signal to tell them to stop eating, but they are insufficient, as people will simply ignore it due to hunger cravings (as happens on diets). Any nutritionist service, in addition to detailing calories, would need to incentivize the patient to recognize the need to lose weight, or set up a helpline.
It would be difficult to account properly for ingredients such as butter and oil which are hard to see in a finished dish and drastically affect the calories.
But people would probably still pay as long as it's a good faith attempt, since they can't necessarily do better themselves, so it would be just as accurate and save time.
And with geolocation, you could figure out where people are; if they're at a chain restaurant, you'd know the calories exactly.
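As a rough sketch of the geolocation idea: match the user's coordinates against known chain locations and, on a hit, look up published menu calories. All names, coordinates, and calorie numbers below are invented for illustration.

```python
import math

# Hypothetical data: chain locations and their published menu calories.
CHAIN_LOCATIONS = [
    # (name, latitude, longitude)
    ("BurgerChain #42", 40.7484, -73.9857),
    ("SaladChain #7", 40.7410, -73.9897),
]

MENU_CALORIES = {
    "BurgerChain #42": {"cheeseburger": 540, "fries": 380},
    "SaladChain #7": {"cobb salad": 470},
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def calories_from_location(lat, lon, dish, radius_m=50):
    """Return published calories if the user is at a known chain, else None."""
    for name, clat, clon in CHAIN_LOCATIONS:
        if haversine_m(lat, lon, clat, clon) <= radius_m:
            return MENU_CALORIES.get(name, {}).get(dish)
    return None
```

A real version would query a places API instead of a hard-coded list, but the lookup shape would be similar.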
I have considered something like this as a business model, although I was thinking of nutrition coaching instead of mere logging, but that's actually a really great idea.
Built using those newfangled convolutional neural network things. Trained a Caffe model on a set of about a million food-related images. Can't believe how accurate the classifier is...
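For readers unfamiliar with how such a classifier produces a label: the network emits one raw score per food class, which is turned into probabilities and argmax'd. A minimal sketch of that final step (the class list and scores here are invented; the real model's classes come from WordNet terms, per the author):

```python
import math

# Illustrative class list; the real classifier has far more categories.
CLASSES = ["pizza", "sushi", "salad", "donut"]

def softmax(scores):
    """Convert raw network scores to probabilities (numerically stable)."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_prediction(scores):
    """Return (label, probability) for the highest-scoring class."""
    probs = softmax(scores)
    i = max(range(len(probs)), key=lambda k: probs[k])
    return CLASSES[i], probs[i]
```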
UI needs work. Would love feedback on how to make food logging more intuitive.
Your app kinda sucks now, but the detection is great. You could also do food identification as a service...
I'm trying to use fatsecret/myfitnesspal for counting calories. Something like this could make it much easier!
To me, a better workflow would be: take pictures of everything you eat and classify later (at night, maybe on the PC). Of course, this software would simplify the classification, the user would only fix any errors and adjust sizes.
This workflow would be well suited to your app, which (as I understand it) operates on a server.
Congrats, I understand the relevance of AI for simple apps now ;)
Someone should combine this AI with something like Google glass (so you don't have to pull out your phone), a basic calorie/nutrition filter, and a shock bracelet[1].
That way when I look at a Krispy Kreme donut I get immediate feedback!
I've been thinking of this exact thing for a few years now, good to see others are interested too. I think there is HUGE potential for an app that can accurately (within a reasonable error range) count macro-nutrients through pictures of food, but I think that is WAY more difficult than you would imagine. How can you train a classifier to tell how much butter is in a dish? How can you account for food that is literally underneath other food?
Maybe a combination of spectrometer, weight, and photos would be a future solution. I certainly don't have an idea of how it would be implemented, but I think using data from all 3 of those sources could be a good step.
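One standard way to combine the three sources, if each reported a calorie estimate plus an uncertainty, would be inverse-variance weighting: trust the more certain sensor more. A hedged sketch (the numbers are illustrative, not real sensor characteristics):

```python
# Fuse calorie estimates from several hypothetical sources
# (e.g. photo classifier, scale, spectrometer), each given as
# (kcal, stddev). Lower stddev means higher weight.

def fuse_estimates(estimates):
    """estimates: list of (kcal, stddev) pairs -> fused kcal estimate."""
    weights = [1.0 / (sd * sd) for _, sd in estimates]
    total = sum(weights)
    return sum(w * kcal for (kcal, _), w in zip(estimates, weights)) / total
```

With equal uncertainties this reduces to a plain average; as one sensor's uncertainty shrinks, the fused value converges to that sensor's reading.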
Another angle is post-consumption. I've always wondered if precisely monitoring body temperature might tell you something about consumption, since there is this 'thermic effect of food'. Would be hard to get right though ...
In the lab, we can use doubly-labelled water to precisely measure calorie intake and burn, but it's so damned expensive. If only it were commercially viable.
This has HUGE potential. Some of the comments here disregard the fact that AI can do a better job than humans. An expert chef looking at dishes will make an educated guess on the ingredients and quantities that went into a dish and will have a better idea of the caloric content. The AI could learn a similar skill, identifying whether a dish contains pastry, buttery sauces, fatty meat etc etc.
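The "educated chef" idea above could be sketched as a two-stage pipeline: a classifier tags likely components in the dish, then rough per-component calorie priors are summed. Everything here is an illustrative guess, not real nutrition data:

```python
# Hypothetical calorie priors per detected component (kcal per serving).
COMPONENT_KCAL = {
    "pastry": 300,
    "buttery sauce": 200,
    "fatty meat": 350,
    "steamed vegetables": 60,
}

def estimate_dish_kcal(detected_components):
    """Sum calorie priors for components the classifier detected.

    Unknown components contribute nothing; a real system would instead
    widen the uncertainty range when it sees something unrecognized.
    """
    return sum(COMPONENT_KCAL.get(c, 0) for c in detected_components)
```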
You have a business opportunity here if you train that on fresh vegetables. Many people have trouble with these issues:
1. Knowing what the hell a (example) rhubarb is or looks like.
2. Knowing if it looks fresh or about to go bad.
Being able to aim a camera slowly across an aisle with items getting highlighted would solve the first problem. Spinning a vegetable around in front of the camera could solve the second. So, try that and make it a paid app.
You know, it's funny: being a former fat guy, I'm not entirely familiar with all of my vegetables. I took this app to a farmers market recently and it was actually helping me identify a few things ...
Had never seen a beet before un-cut. Ashamed of my ignorance ;)
Pretty nifty, but please consider allowing creation of an account with a simple email address, or even anonymously. I really don't want to have to connect my Facebook or other social-networking information to your app in order to use it.
Thanks for allowing one to try it out without logging in via another service; it'd be great to be able to actually have an account.
This is not compatible with my device, which is a LeTV X500 running Android 5.0.2. Is there any reason it might not work? I'm in the UK, if that might be the reason.
I'm new to Cordova (the toolkit I used to build the app), and I did notice that the number of supported devices is kind of low for my build. I may have included a plugin that narrows down the pool. I need to dig into this.
Hey subcosmos, great work! We've been working on something pretty similar, an app called Logameal (only available in the UK). We have run into (are running into) a lot of those UI issues too and would be glad to help. Let me know if you'd like to chat.
I remember the latest Android camera app being able to estimate distances (thus simulating photos with large apertures/shallow depth of field).
I wonder if you could use this visual distance algorithm to somehow judge portion sizes/plate sizes?
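If a depth estimate were available, portion size would follow from the pinhole camera model: real size = pixel size × depth / focal length (in pixels). A back-of-envelope sketch; the focal length here is an assumed value, where a real app would read it from the camera intrinsics:

```python
def plate_diameter_cm(pixel_width, depth_cm, focal_length_px=3000.0):
    """Estimate a plate's real diameter from its width in pixels
    and an estimated distance to it, via the pinhole camera model."""
    return pixel_width * depth_cm / focal_length_px
```

Knowing the plate is, say, 25 cm across gives a scale reference for everything on it.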
Isn't it strange that in [1] the low end of the "Normal" BMI range carries a higher mortality risk than the low end of the "Obese" range? I'd even venture to say that the "Overweight" range has, on average, a lower mortality risk than "Normal".
PS: another plus for the metric system. The imperial form needs feet + inches for height, while the metric one only needs centimeters.
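The units point is easy to see in code: the metric BMI formula needs only kilograms and centimeters, while the imperial one juggles feet, inches, pounds, and a conversion factor of 703.

```python
def bmi_metric(weight_kg, height_cm):
    """BMI = kg / m^2."""
    h = height_cm / 100.0
    return weight_kg / (h * h)

def bmi_imperial(weight_lb, height_ft, height_in):
    """BMI = 703 * lb / in^2; height must first be flattened to inches."""
    total_in = height_ft * 12 + height_in
    return 703.0 * weight_lb / (total_in * total_in)
```

The same person (70 kg / 175 cm, i.e. about 154 lb / 5'9") gets roughly the same BMI from both, within the rounding in the 703 factor.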
If you click the 'no barcode' button to pull down a fatsecret nutrition record, there is a servings pulldown that is displayed. It's kinda hard to see. Working on the CSS.
That is very cool subcosmos! I am also very interested in both cooking+nutrition and AI (see for example my web site cookingspace.com). My contact information is in my HN profile - shoot me an email if you want to talk about AI, food technology and business, etc.
This should be in use at those self-checkout machines at food markets. I imagine that you place the food item on the scanner and it will automatically register it, or ask for human input when something unrecognized is found.
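That flow is essentially a confidence-threshold gate: accept the classifier's answer when it is confident, otherwise defer to the operator. A minimal sketch, with an invented threshold and labels:

```python
# Below this confidence, the checkout asks a human instead of
# auto-registering the item. The value is an illustrative guess.
CONFIDENCE_THRESHOLD = 0.85

def register_item(predicted_label, confidence, ask_human):
    """Auto-register confident predictions; defer the rest to a human.

    ask_human is a callback that takes the classifier's best guess
    (useful as a pre-filled suggestion) and returns the final label.
    """
    if confidence >= CONFIDENCE_THRESHOLD:
        return predicted_label
    return ask_human(predicted_label)
```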
Where did you get your training data set?
I'm trying to build a network to recognize faces from ID cards and match them to faces/selfies. One of the hard things has been getting a training data set.
AJRF | 10 years ago:
I find myself sometimes not logging food on that app because I couldn't be bothered to search for it.
On the other hand if the product has a barcode I always scan it in because that is convenient.
AJRF | 10 years ago:
Things like the main screen, app icon, and colour palette.
zopf | 10 years ago:
We're working on a similar component for our commercial behavioral-economics-driven app suite, targeted toward patients with chronic diseases.
Do you use location as a way to filter the set of possible foods, as in Google's im2calories project/paper last year?
They do some pretty awesome depth calculation stuff too: https://www.google.com/?ion=1&espv=2#q=im2calories+type:pdf
Edit: oops, should have read further. I see that you pulled appropriate terms from WordNet and used ImageNet to gather training images from Flickr. Cool!
[1]: http://pavlok.com/hello.php
internaut | 10 years ago:
Let's call it: "American Dystopia"
subcosmos | 10 years ago:
Sorry about this!
rymate1234 | 10 years ago:
Looking forward to trying the app once I can install it, though; it looks neat.
lj3 | 10 years ago:
:(
subcosmos | 10 years ago:
Aim high ...
[1] https://www.infino.me/join/
virtualritz | 10 years ago:
I had to do the entire process twice because there is no (obvious) way to change the serving size to 2.
Zolomon | 10 years ago:
Would decrease cheating perhaps?