My decision to study an odd, seemingly unrelated combination of Nutrition and Mathematics during college was intentional. I’m also legally blind, so broader issues of accessibility have often been top of mind throughout my life. Health-Tech is a nice little nexus where I can bring each of those areas together. In particular, I think that, done right, mathematics (and by extension technology) has the potential to break down accessibility barriers to people’s well-being.
Health-Tech is an exciting and ever-changing industry, so I thought I’d share a little bit about my own personal setup, personal projects, and where I see the industry moving in the future. Personally, I have a lot of issues when it comes to looking at screens, reading text, navigating menus, and other sight-dependent tasks, so the setups I have listed here revolve around minimizing the amount of user input and physical ability necessary to use these apps. Other more universal barriers, such as time and attention, also get touched on. Oh, and since health-tech is a broad category, I’d like to clarify that I’m interested in the fitness-tech and diet-tech side of things.
Google Fit – this is my central integration app that automatically pulls data from the other tools. It autodetects different forms of physical activity (walking, running, biking) via the phone’s built-in accelerometer and gyroscope. I don’t own a wearable yet, but I hear it syncs quite well with those too. I had heard about accuracy issues with Fitbit, but with Google Fit on Android Wear now able to autodetect and log strength training routines, I’m seriously considering a wearable purchase. Personally, I’m not a huge fan of the focus on “steps” in most wearables, but as a low-key powerlifter, the autodetection of strength training is a big draw for me. As the central integrator in my setup, Google Fit also pulls caloric intake from Myfitnesspal, weight history from Weight GURUs, and biking activity from Mapmyride.
Weight GURUs + Tasker – a smart scale paired with automation makes for effortless weight tracking. Weight GURUs, a Bluetooth scale and accompanying app, stores all weigh-in data (weight, body fat, water weight, etc). However, to import said data, the app must be open on your phone and Bluetooth must be activated when you step on the scale. That last nuance makes for some effort on the user’s part – you’d have to pick up your phone and open the app every single time you weigh yourself…unless there was a way to have the app and Bluetooth automatically launch without user input. That’s where Tasker comes in. I know that I weigh myself every morning at roughly 5:15 AM, so I have a time-triggered Tasker profile that automatically launches Bluetooth and the Weight GURUs app at 5:10 AM – all I need to do is step on my scale every morning when I wake up. And as already mentioned, the Weight GURUs historical weight data feeds straight into Google Fit. This is super helpful for me as I cut to make a weight class for powerlifting competitions.
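For the curious, the two steps that Tasker profile chains together can be sketched in plain Python. This is just an illustration of the logic (a time trigger, then "enable Bluetooth" and "launch the app"), using adb shell commands you could run from a computer; the Weight GURUs package name below is a placeholder, not the real one.

```python
from datetime import datetime

# Placeholder package name – the real Weight GURUs package may differ.
SCALE_APP_PACKAGE = "com.example.weightgurus"  # hypothetical

def weigh_in_commands(package: str = SCALE_APP_PACKAGE) -> list[list[str]]:
    """Return the two adb commands mirroring the Tasker profile:
    1) enable Bluetooth, 2) launch the scale's companion app."""
    return [
        ["adb", "shell", "svc", "bluetooth", "enable"],
        ["adb", "shell", "monkey", "-p", package,
         "-c", "android.intent.category.LAUNCHER", "1"],
    ]

def is_trigger_time(now: datetime, hour: int = 5, minute: int = 10) -> bool:
    """Time-trigger check analogous to the 5:10 AM Tasker profile."""
    return (now.hour, now.minute) == (hour, minute)
```

In practice Tasker does all of this on-device with no computer involved; the sketch just shows how little "logic" the automation actually needs.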
Myfitnesspal + Tasker + Autovoice – speech recognition and automation capabilities make it so that I can just say “track 1.5 tablespoons of extra virgin olive oil” and the relevant dietary data is automatically logged in Myfitnesspal, which of course feeds directly into Google Fit. Luckily, I didn’t have to build my own natural language processor or write my own machine learning code – Google Assistant already took care of that. The voice-control capability I connected to Myfitnesspal this past year is nice, but it still requires action and expended effort on the user’s part; the ideal would be some form of automatic detection and logging where all the person has to do is eat, and that’s it (e.g. a smart bowl). For now, this is the least-effort, most accurate way to monitor dietary intake that I’ve seen.
Mapmyride + Tasker – commuting to and from work, I bike 7 miles roundtrip every weekday. At 3.5 miles each way, it’s not a huge distance for a bike ride, but I’m probably expending at least 100 calories just by traveling to and from work every weekday. Mapmyride and the other apps in the Mapmyfitness suite (Mapmywalk, Mapmyhike, etc) all require the user to initiate the tracking and logging…unless there was a way to have the app automatically launch without user input. That’s where Tasker comes in again. I know that I leave home at roughly 5:45 AM, so I have a time-triggered Tasker profile that automatically launches Mapmyride and initiates tracking at 5:40 AM. I never need to pull out my phone or touch any buttons – all I need to do is bike. And as expected, the Mapmyride data feeds straight into Google Fit.
Sheets + Tasker + Autovoice – For powerlifting, I plan and track my workouts in a Google Sheets spreadsheet. I haven’t quite figured out how accurate the calorie data would be if I linked the sheet with Google Fit. But with speech recognition, I can make entries and adjustments to the sheet with just my voice. And with automation, Tasker can automatically pull the exercise routines into Google Fit – the calories burned may be off for strength training, though, since the estimate appears to be time-based rather than repetition-based. As for future projects, I’m planning to work with the Google Fit API and some average statistics on caloric expenditure in powerlifting routines.
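As a first sketch of that future project, here’s roughly what pushing a strength-training session to the Google Fit REST API might look like. The URL shape follows the documented Users.sessions resource, but treat the field values (especially the activity-type code and application name) as assumptions to double-check against the API reference before relying on them.

```python
# Sessions are written with a PUT to the Users.sessions endpoint.
FIT_SESSIONS_URL = "https://www.googleapis.com/fitness/v1/users/me/sessions/{session_id}"

def build_session(session_id: str, name: str, start_ms: int, end_ms: int,
                  activity_type: int = 80) -> tuple[str, dict]:
    """Return the (URL, JSON body) pair for a Google Fit session write.
    activity_type 80 should correspond to weightlifting in the Fit
    activity-type list, but verify against the current docs."""
    url = FIT_SESSIONS_URL.format(session_id=session_id)
    body = {
        "id": session_id,
        "name": name,
        "startTimeMillis": start_ms,
        "endTimeMillis": end_ms,
        "activityType": activity_type,
        "application": {"name": "sheets-powerlifting-sync"},  # hypothetical app name
    }
    return url, body
```

The missing piece, of course, is the calorie model: mapping sets, reps, and load from the sheet onto a per-session expenditure estimate before writing it to Fit.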
More To Come…
Items I Will Try In The Future: Wearables, Smart Water Bottles
Thoughts On Some Areas: Logging Dietary Intake Via Pictures, App Space
Future of Health Tech: Noninvasive Autodetecting of Dietary Intake, Smart Eating Utensils