- View of the dashboard, where you can also see CO2 saved
- Items scanned, plus a leaderboard among friends
- Recycling history; click any item to see detailed information
- Scanning feature that uses ML Kit and Gemini to scan and categorize an item
- Profile tab that lets the user change their profile picture
- Report section with detailed information on your weekly / monthly habits
- Breakdown of which categories you scan the most
- Recent history showing which detection source was mainly used
- Settings page to change your username and colorblind options
Inspiration
We saw the high cost of Meta's smart glasses, which can also assist people who cannot see, and wanted to build something similar for phones, a device almost everyone already owns and knows how to use. Accessibility tools shouldn't require expensive hardware. We wanted to build an app that empowers anyone, including visually impaired users, to recycle correctly using just their smartphone camera and voice commands.
What it does
It uses ML Kit first, then Gemini vision as a fallback, to scan an item with the phone's camera. It then uses text-to-speech (TTS) to announce the item (e.g. food scraps or clothing), whether or not it is hazardous, and a short description, which helps users who have trouble with sight. The user can then tap or say "I recycled this" to save the item and calculate the CO2 emissions saved, or tap or say "Scan again" if the item was misidentified or they need to scan more. Beyond the scanning feature, we implemented ways to compete with friends and track who has done more to help the Earth. Lastly, there are colorblind options in the settings page for users who need them.
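The ML Kit-first, Gemini-fallback flow described above can be sketched roughly as follows. This is an illustrative sketch, not the actual implementation: the `Detector` type, the confidence threshold, and the detector stubs are all assumptions.

```typescript
// Hypothetical result shape and detector interface (assumptions for illustration).
type DetectionResult = { label: string; confidence: number; hazardous: boolean };
type Detector = (imageUri: string) => Promise<DetectionResult | null>;

// Assumed cutoff below which the on-device result is not trusted.
const CONFIDENCE_THRESHOLD = 0.7;

async function scanItem(
  imageUri: string,
  mlkitDetect: Detector,   // fast, on-device (ML Kit image labeling)
  geminiDetect: Detector,  // slower, cloud-based fallback (Gemini vision)
): Promise<{ result: DetectionResult; source: "mlkit" | "gemini" }> {
  // Try the on-device model first: free, fast, and works offline.
  const local = await mlkitDetect(imageUri);
  if (local && local.confidence >= CONFIDENCE_THRESHOLD) {
    return { result: local, source: "mlkit" };
  }
  // Fall back to Gemini when ML Kit is unsure or finds nothing.
  const remote = await geminiDetect(imageUri);
  if (!remote) throw new Error("Could not identify item");
  return { result: remote, source: "gemini" };
}

// Builds the sentence read aloud via TTS (e.g. expo-speech) before the user
// confirms with "I recycled this" or retries with "Scan again".
function announcement(r: DetectionResult): string {
  const hazard = r.hazardous
    ? "This item is hazardous."
    : "This item is safe to handle.";
  return `Detected: ${r.label}. ${hazard}`;
}
```

Keeping the detectors behind one `Detector` interface means the fallback logic never needs to know which backend produced the label, which also makes the "which detection source was mainly used" stat in the report section easy to record.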
How we built it
We used Claude and GitHub Copilot as our AI tools during development. For simulation and testing we used Expo to run the app on our phones. For the database we used Supabase, which stores everything from user information to the items users have scanned.
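As a rough sketch of how a confirmed scan might be shaped before being saved to Supabase: the table name, column names, and `buildScanRow` helper below are all assumptions for illustration, not the app's actual schema.

```typescript
// Hypothetical row shape for a scanned item (assumed column names).
type ScanRow = {
  user_id: string;
  label: string;                        // e.g. "plastic bottle"
  category: string;                     // e.g. "plastic"
  detection_source: "mlkit" | "gemini"; // which detector produced the label
  co2_saved_kg: number;                 // computed by the app on confirmation
  scanned_at: string;                   // ISO-8601 timestamp
};

// Builds the row the app would persist once the user says "I recycled this".
function buildScanRow(
  userId: string,
  label: string,
  category: string,
  source: "mlkit" | "gemini",
  co2SavedKg: number,
): ScanRow {
  return {
    user_id: userId,
    label,
    category,
    detection_source: source,
    co2_saved_kg: co2SavedKg,
    scanned_at: new Date().toISOString(),
  };
}

// With supabase-js, saving would then look roughly like:
//   await supabase.from("scans").insert(buildScanRow(...));
```

Storing the detection source per row is what makes the report section's "which detection source was mainly used" breakdown a simple aggregate query.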
Challenges we ran into
The main challenge was the scanning portion, since we had to rely on ML and it would sometimes identify the wrong item or have trouble recognizing what the item was at all.
Accomplishments that we're proud of
We are proud of how accessible the app is. Going into this, we wanted to make sure it didn't just cater to the typical user but also kept in mind those who cannot see where the dashboard or the scan button is.
What we learned
Time management, and how long it takes to integrate so many different tools, like ML Kit and Gemini, and then connect the database so it stores everything properly. It was a mess at first, but we were able to get everything moving smoothly.
What's next for virtuCycle
This would be a great app to have on the app store. We want to see visually impaired people actually use the app and tell us what works well and what still falls short for their situation.
Built With
- gemini
- mlkit
- react
- sql
- supabase
- typescript