Have you ever stared into your lowly little fridge and wondered how on earth you could make anything out of the stuff you keep in there? Fret no more! Being a college student myself, I'm often too lazy to cook, and whenever I do want to make something, I find myself not knowing what to do with the ingredients I have. They end up sitting in the fridge until I realize they've gone bad. That's wasteful. That's bad. It's time for a change.
It's a mobile app that lets you take a picture of your groceries and the food items in your fridge, then intelligently suggests recipes you can make with what you have. It also inventories the items in your fridge and gives each one a freshness score, so you know what's in there and what's about to go bad.
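To give a flavor of the freshness score, here's a minimal sketch of one way to model it, assuming each item is logged with the date it was added and a rough shelf life. The `ShelfItem` type, the shelf-life table, and the linear decay are illustrative assumptions, not the app's actual data model.

```typescript
// Hypothetical inventory model: freshness decays linearly from 1 (fresh) to 0 (expired).
interface ShelfItem {
  name: string;
  addedAt: Date;         // when the item was logged into the fridge
  shelfLifeDays: number; // rough shelf life for this kind of food
}

// Rough shelf-life lookup (days) -- illustrative values only.
const SHELF_LIFE_DAYS: Record<string, number> = {
  milk: 7,
  spinach: 5,
  chicken: 2,
  apple: 30,
};

function freshnessScore(item: ShelfItem, now: Date = new Date()): number {
  const ageDays = (now.getTime() - item.addedAt.getTime()) / (1000 * 60 * 60 * 24);
  const remaining = 1 - ageDays / item.shelfLifeDays;
  return Math.max(0, Math.min(1, remaining)); // clamp to [0, 1]
}

// Example: milk added three days ago scores roughly 0.57.
const milk: ShelfItem = {
  name: "milk",
  addedAt: new Date(Date.now() - 3 * 24 * 60 * 60 * 1000),
  shelfLifeDays: SHELF_LIFE_DAYS["milk"],
};
console.log(freshnessScore(milk).toFixed(2));
```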
We used React Native for the mobile app, hooked it up to Google's Vision API for image recognition, and pulled recipe data from the Spoonacular API.
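Roughly, the flow is: snap a photo, send it to the Vision API for label detection, treat the confident labels as ingredient candidates, and ask Spoonacular what recipes they can make. Below is a simplified sketch of that pipeline in TypeScript; the confidence threshold, key handling, and lack of error handling are assumptions and shortcuts, not our exact code.

```typescript
// Minimal sketch of the pipeline: image -> Vision API labels -> Spoonacular recipes.
// googleKey and spoonacularKey are assumed to be supplied by the caller.

type Recipe = { id: number; title: string };

async function labelsFromImage(base64Image: string, googleKey: string): Promise<string[]> {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${googleKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        requests: [
          {
            image: { content: base64Image },
            features: [{ type: "LABEL_DETECTION", maxResults: 10 }],
          },
        ],
      }),
    }
  );
  const data = await res.json();
  // Keep only reasonably confident labels as ingredient candidates.
  return (data.responses?.[0]?.labelAnnotations ?? [])
    .filter((label: { score: number }) => label.score > 0.7)
    .map((label: { description: string }) => label.description.toLowerCase());
}

async function recipesForIngredients(
  ingredients: string[],
  spoonacularKey: string
): Promise<Recipe[]> {
  const query = encodeURIComponent(ingredients.join(","));
  const res = await fetch(
    `https://api.spoonacular.com/recipes/findByIngredients?ingredients=${query}&number=5&apiKey=${spoonacularKey}`
  );
  return res.json();
}
```

For a production app the API keys would belong on a backend rather than in the mobile client, but for a hackathon demo, calling the REST endpoints directly keeps things simple.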
This was our first time using Google Cloud's Vision API, and it took us a while to get it talking to the app. We also weren't familiar with React Native, so there were a lot of firsts and a lot of frustration.
We're proud that we produced a visually compelling app. We spent a lot of time on the UI, and I think it shows. And since React Native is cross-platform, we're also proud that the product works on both iOS and Android.
That proper planning is important, and that CSS is hard (CSS is a programming language, fight me)
We plan to keep polishing the app and finding ways to improve the UI. Deploying it to the app stores would be a dream.
This project is a submission for SquirrelHacks (HackISU) 2019. My teammates were Mirza Nor Azman and Mujahid Anuar. At the time of the hackathon, we were all students at the University of Wisconsin-Madison.
Here's the link to the Devpost submission.