16x Faster, Hands-free, Eyes-free Diet Tracking
Today anyone can pull up their bank or credit card account to track spending over time, segment spending into categories, and identify potential areas to save money. There’s no hassle, no fees, and no extra exertion required to log and access your financial information – all you have to do is use your credit/debit card as you normally would.
I believe a similar capability should be available for dietary consumption. In an ideal world, there would be an effortless, non-invasive way to track the foods you consume – all you would have to do is eat, and the information (calories, macros, micros, time of day, meal type, etc.) would be automatically logged and accessible. If someone chose to change their diet or their weight, they could use this data to view consumption over time, segment consumption by various categories, and identify potential areas to cut, substitute, or add food. The user-choice aspect is key – I believe users should have the ability to access this data if they so choose, but not necessarily that it should be front and center 24/7 (re: eating disorders, and the risk of further complicating our already complicated relationships with food and health – see critical nutritionism).
The described capability would require a non-invasive super-device that hasn’t been developed yet. MyFitnessPal is currently one of the most popular apps for tracking dietary intake. Many physique competitors and professional weight-class athletes rely on MyFitnessPal to meticulously track their intake during specific times of the year. In addition, some elderly users, newly diagnosed with intimidating chronic diseases, hope to reclaim their health with the app’s help. MyFitnessPal is great, but there’s still room for improvement.
Today’s version requires the user to open the app, navigate the menu, type in the food, select the amount, confirm, and repeat for every item consumed. It’s a tedious hassle that requires, on average, 47.26 seconds of undivided attention per item. There’s a way to move dietary tracking one step closer to the effortless, hassle-free ideal described above: one that’s 16x faster than the current method, hands-free, and eyes-free!
Google Assistant can be used to operate a phone with just your voice. If I say “track 2 cups of cheerios”, my modded phone will immediately pull up MyFitnessPal and log 2 cups of Cheerios. It takes 2.76 seconds to say that phrase, and I can be multitasking while I say it. By contrast, it takes 47.26 seconds of undivided attention to tap the app, navigate the menu, tap “Add Food”, type “cheerios”, select “2 cups”, and tap “Save”. For users who want to beat a chronic disease, those who just want to “make weight”, and even those who have a hard time seeing the screen, I believe this voice-controlled, automated tool breaks down the time, attention, and ability barriers to engaging with MyFitnessPal.
- Root your Android phone (iOS coming soon). Instructions for rooting an Android device are well documented here and here – it shouldn’t take more than 20 minutes, and it’s free.
- Install four apps on your phone: MyFitnessPal, Tasker ($1.99), AutoVoice ($1.99), and AutoInput ($1.99). Tasker, AutoVoice, and AutoInput are powerful long-term investments; together they allow you to automate almost any set of phone actions and link that automated set of actions to a voice command.
- Download this Tasker profile and transfer it to your phone. Open it in Tasker, and turn the profile on.
- Integrate AutoVoice with Google Assistant / OK Google.
- Grant accessibility permissions to AutoVoice, AutoInput, and Tasker.
- Open the Settings app and navigate to Apps. Tap AutoVoice to view its permissions, and make sure the Contacts permission is turned on.
- Open the actual AutoVoice app, tap Devices, then add your Google account. You should then be able to say “OK Google”, wait for Assistant to pull up, say “tell AutoVoice to track 1.5 tablespoons of extra virgin olive oil”, and it should work.
You can substitute the quantity, units, and food item for your own, but the sentence must follow the structure “tell AutoVoice to track (quantity) (units) of (food)”:

- quantity accepts decimal numbers
- units accepts everything from tablespoons to containers, but try to keep the units to something that’s likely to be found in MyFitnessPal
- food can be multi-word, meaning you can say things like “extra virgin olive oil” and “low-fat chobani greek yogurt”
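To make the grammar concrete, here is a minimal Python sketch of how a phrase matching the “track (quantity) (units) of (food)” structure could be parsed. This is purely illustrative – the actual matching happens inside the AutoVoice/Tasker profile, and the assumption here that units are a single word is mine, not the profile’s.

```python
import re

# Illustrative pattern for "track (quantity) (units) of (food)":
# quantity is a decimal number, units is assumed to be one word,
# and food greedily captures the rest of the phrase.
COMMAND = re.compile(
    r"track\s+(?P<quantity>\d+(?:\.\d+)?)\s+(?P<units>\S+)\s+of\s+(?P<food>.+)",
    re.IGNORECASE,
)

def parse_command(phrase):
    """Return (quantity, units, food) for a tracking phrase, or None."""
    m = COMMAND.search(phrase)
    if not m:
        return None
    return float(m.group("quantity")), m.group("units"), m.group("food").strip()
```

For example, `parse_command("tell AutoVoice to track 1.5 tablespoons of extra virgin olive oil")` yields `(1.5, "tablespoons", "extra virgin olive oil")`, while a phrase that doesn’t follow the structure yields `None`.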
Mobile apps have been increasingly adopted as tools for reinforcing health behaviors, and MyFitnessPal is one of the most popular for tracking dietary intake. Improved usability, via speech recognition and automation, enhances the app’s potential as a tool for influencing positive behavior change. This is especially true for elderly users newly diagnosed with chronic diseases who hope to reclaim their health with the app: a hands-free, eyes-free, 16x faster way to engage with it is particularly relevant and valuable to these users.
That being said, there is some exciting work being done with various wearables that have the potential to detect and characterize eating – this would move one step closer to the aforementioned effortless ideal: