My First Post - Building Mobile Apps with Flutter
I’ve been building solo for a while, but just recently started getting into cross-platform mobile apps using Flutter.
I’ve been working on my latest project, HeartHealthAI, for the past two months and just published it on the App Store. It uses AI to analyze your meals from a photo and gives you a Heart Health Score based on 8 nutrition factors like sugar, sodium, fiber, processing, and more.
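To give a feel for the scoring idea, here's a toy illustration of how eight per-factor subscores could roll up into one Heart Health Score. This is purely hypothetical: the factor names beyond the four I mentioned are placeholders, and the app's real scale and weighting may differ.

```python
from statistics import mean

def heart_health_score(subscores: dict[str, int]) -> int:
    """Roll 0-10 per-factor subscores up into a single 0-100 score."""
    return round(mean(subscores.values()) * 10)

# Sugar, sodium, fiber, and processing are the named factors; the last
# four are hypothetical placeholders for the rest of the rubric.
meal = {
    "sugar": 3, "sodium": 2, "fiber": 6, "processing": 3,
    "saturated_fat": 4, "cholesterol": 5, "additives": 3, "portion": 6,
}
print(heart_health_score(meal))  # → 40
```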
It also includes a GPT-4.1-powered chat assistant that gives personalized feedback based on your meals, preferences, and allergies.
Still early days, but I’m hoping to get it in front of more users. I’m here to engage with the Product Hunt community a bit before doing a proper launch.
Would love to connect with other mobile builders or folks working in health and AI.
Replies
Hey Reid, welcome to Product Hunt 👋
HeartHealthAI sounds really interesting.
I'm guessing it maybe requires some manual input beyond just a photo. I imagine it would be kind of hard to figure out all the ingredients in a dish just from a photo.
I like the idea of a scoring system. I feel like that would be really easy for users to understand.
How has your experience with Flutter been? I've toyed around with using it before, but I've never fully built anything with it. Did you do other mobile programming before doing cross-platform with Flutter?
@jakecrump Hey Jake! I appreciate the warm welcome. :)
I'm pretty excited about it! Let me explain how it works.
Here’s some Panda Express I had earlier, which was definitely not healthy:
First, the AI analyzes the image, either from a photo the user takes or from their library.
In this picture, you can see the description and ingredient list that’s returned. I’ve been using GPT-4.1 Mini, and I feel like it does a great job describing what’s in the image.
Next, the user can opt to make changes to the description by using the "edit" function along with the text box. For example, the user could type in "Cooked with minimal salt", which is something that would be hard to pick up from the image alone. GPT-4.1 Mini is used again to work out what the user wants to modify based on the context.

Once the user is satisfied with the description, they can press the Heart Health Analysis button. The description is fed into GPT-4.1 one last time along with a rubric and the user's dietary preferences, allergies, and disliked foods. Below, you can see the page that is returned, which includes calories, macronutrients, a score for each heart health factor, a summary, and some recommendations. These logs can be stored in the calendar, and the user can use the chat feature to talk with an AI assistant that remembers the food logs.
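In pseudocode, the flow looks roughly like this. This is a hypothetical Python sketch with the model call stubbed out so it runs offline; the actual app is Flutter/Dart, and the function names, prompts, and canned responses here are illustrative, not the real code.

```python
def call_model(prompt: str, model: str = "gpt-4.1-mini") -> str:
    """Stand-in for an LLM call, stubbed with canned replies for the sketch."""
    if prompt.startswith("Describe"):
        return "Orange chicken with fried rice"
    if prompt.startswith("Revise"):
        return "Orange chicken with fried rice, cooked with minimal salt"
    return "sodium: 4/10"

def describe_image(image_ref: str) -> str:
    # Step 1: the model describes the meal from the photo.
    return call_model(f"Describe the meal in this image: {image_ref}")

def apply_user_edit(description: str, note: str) -> str:
    # Step 2: the model folds the user's correction into the description.
    return call_model(f"Revise this description: {description!r} given: {note!r}")

def analyze(description: str, rubric: str, preferences: str) -> str:
    # Step 3: the final description is scored against the rubric and the
    # user's dietary preferences, allergies, and disliked foods.
    return call_model(f"Score {description!r} using {rubric!r} for {preferences!r}")

desc = describe_image("panda_express.jpg")
desc = apply_user_edit(desc, "Cooked with minimal salt")
result = analyze(desc, "8-factor heart-health rubric", "no shellfish")
```

The key design point is that the same model mediates every step, so the user never edits raw structured data, only natural-language descriptions.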
I got a little carried away and forgot to answer your last question. Yes, I have tried React Native with Expo. I think they're both great, but I had a better experience with Flutter. With React Native, I feel like you really have to know what you're doing when using a development build. I just found Flutter easier to integrate with a backend, and it's nice to only have to debug on one device. I'd definitely say it's worth trying; I'm glad I did.