Terra API at HackMIT
Last week, we took part in HackMIT, one of the biggest hackathons in the US, held at the Massachusetts Institute of Technology. The event brings some of the brightest hackers in the country to Boston to build together.
During the hackathon, we presented the Spiderman challenge: hackers had to use our Unified API or Streaming API to build a great idea within 24 hours. We brought prizes to match, including Garmin watches, the Terra Iron Duck, and a big gold Spiderman statue.
Alex gave a workshop to the hackers covering the basics of APIs, interfacing with webhooks, authentication, accessing data from wearables and sensors, streaming real-time heart rate, and more.
Most importantly, we challenged the strongest MIT hackers to a push-up competition, but we were disappointed to find that we were much stronger. We've given them three more weeks to train, though, and we'll be back in Boston, at HarvardHack, to be challenged again.
Back to the topic, though: the hackers built some incredible solutions within 24 hours. Here are a few of them:
TerraRium
Hackers: Sahil Jagtap, Sam Mathew
What is it: An app that transforms your daily activities into an exciting virtual adventure. It's like having a personal fitness game: your in-game character reflects your real-life health and energy levels as you move and stay active.
How was it built: ‘To create TerraRium, we used the power of modern technology. We built it with tools like Next.js and Node.js, and we used Firebase for handling user data and authentication. We also connected to the Terra API to gather health, sleep, and activity data and a bit of Math Magic in our calculation algorithm.’
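The team doesn't share their actual formula, but the "Math Magic" that maps wearable data to an in-game energy level might look something like this sketch (the function name, inputs, weights, and caps are all hypothetical):

```python
def energy_score(steps: int, sleep_hours: float, resting_hr: int) -> int:
    """Map a day's wearable metrics to a 0-100 in-game energy level.

    Purely illustrative: TerraRium's real calculation is not public.
    """
    step_part = min(steps / 10_000, 1.0) * 40          # up to 40 pts for activity
    sleep_part = min(sleep_hours / 8.0, 1.0) * 40      # up to 40 pts for sleep
    # Lower resting heart rate earns more points, capped at 20.
    hr_part = min(max(0.0, 1.0 - (resting_hr - 50) / 50) * 20, 20.0)
    return round(step_part + sleep_part + hr_part)
```

A character at full energy would need a day of hitting the step goal, a full night's sleep, and a low resting heart rate under this toy scheme.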
Rhythm AI
Hackers: Vilhjalmur Vilhjalmsson, Vishruth Bharath, Erik Nymo, Sveinung Myhre
What is it: Autonomously generated, nuanced comments on workouts, powered by GPT and Terra.
How was it built: ‘For the front end we used React and material UI to create a user-friendly and appealing interface for our product. Firebase serves as the auth provider, and Firestore as the database. On the main page, all the comments displayed are fetched from the user's collection of comments in Firestore. The backend is written in Python with the Flask library. We created an endpoint to take a .gpx file, process the data, get a comment, and add it to the database. This ties together several APIs and produces our end product, workout-specific feedback, and comments.’
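The ".gpx processing" step on the backend could start with something like this sketch, which extracts trackpoints using Python's standard library and the standard GPX 1.1 namespace (the function name and output shape are assumptions; the real pipeline would also derive pace, distance, and the other stats fed to GPT):

```python
import xml.etree.ElementTree as ET

# Standard GPX 1.1 namespace used by most fitness exports.
GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def parse_trackpoints(gpx_xml: str) -> list[dict]:
    """Extract latitude, longitude, and elevation from a GPX document."""
    root = ET.fromstring(gpx_xml)
    points = []
    for trkpt in root.iterfind(".//gpx:trkpt", GPX_NS):
        ele = trkpt.find("gpx:ele", GPX_NS)
        points.append({
            "lat": float(trkpt.get("lat")),
            "lon": float(trkpt.get("lon")),
            "ele": float(ele.text) if ele is not None else None,
        })
    return points
```

From a list like this, the Flask endpoint could compute workout summaries before asking GPT for a comment and writing it to Firestore.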
VibeAI
Hackers: Toya Takahashi, Mohit Hulse, Yue Chen Li, Uzay-G Girit
What is it: Your AI Dance Partner. Vibe.AI's automatic choreography feature makes it easier for anyone to dance! Simply play the music you love, and our AI will generate custom dance routines that perfectly match the music's rhythm and mood.
How was it built: ‘Vibe.AI is developed using Flutter for the mobile app and Python for AI modeling. We used the Terra API to integrate other wearables into our stack in the future seamlessly. We adapted Stanford University's music generation AI model EDGE, incorporating their research findings to enhance our AI capabilities. Additionally, we rely on Modal's cloud computing services, including their cloud-based infrastructure, for on-demand ML training.’
Mood Music
Hackers: Aileen Han
What is it: Mood Music uses the text of a scene from the movie of your life, together with your sleep patterns from the night before, to generate music to play in the background of that scene!
How was it built: Using the Flask framework, Aileen used Meta's MusicGen model on Hugging Face to generate the music, and the Terra API to fetch the sleep pattern data.
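MusicGen takes a text prompt, so the glue between sleep data and music could be as simple as this sketch (the mood thresholds and wording are invented for illustration, not taken from the project):

```python
def build_music_prompt(scene_text: str, sleep_efficiency: float) -> str:
    """Fold last night's sleep quality (0.0-1.0) into a MusicGen text prompt.

    The mood mapping here is purely illustrative.
    """
    if sleep_efficiency >= 0.85:
        mood = "bright, energetic"
    elif sleep_efficiency >= 0.6:
        mood = "mellow, steady"
    else:
        mood = "slow, dreamy, low-energy"
    return f"{mood} background music for a scene: {scene_text}"
```

The resulting string would then be passed to the hosted MusicGen model, with the Terra API supplying the sleep-efficiency figure.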
ChatMD
Hackers: Justus Beck, Peter Berggren, TheOkster Oki
What is it: ChatMD takes in a list of symptoms from the user and comes up with possible diagnoses. For each diagnosis, it finds a reliable source for the diagnosis and then checks to see if this source says that the diagnosis is compatible with the symptoms. Finally, it outputs an explanation of where this diagnosis fits and doesn't fit the symptom descriptions and other symptoms that may confirm the diagnosis.
How was it built: The back end uses GPT-4 to analyze a user's symptoms and identify possible diagnoses. These are parsed into a Python list using GPT-3.5-turbo. For each diagnosis, it finds five potential resource links. The first valid, concise link for each disease is scraped to obtain raw HTML, which is then cleaned by removing non-text and unnecessary elements. Finally, GPT-3.5-turbo analyzes the refined website and the suspected disease and symptoms to provide the user with a comprehensive explanation.
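The HTML-cleaning step, removing non-text elements before the page goes to GPT-3.5-turbo, could be sketched with Python's standard-library parser like this (class and function names are assumptions, not the team's code):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Strip a scraped page down to its visible text, dropping the
    contents of <script>, <style>, and <noscript> blocks."""

    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def clean_html(raw: str) -> str:
    """Return only the visible text of an HTML page."""
    parser = TextExtractor()
    parser.feed(raw)
    return " ".join(parser.chunks)
```

Shrinking the page to plain text keeps the prompt within the model's context window and strips markup the model doesn't need.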
Bite
Hackers: Pranav Tadepalli, Mingkuan Yan, Katherine Huang, Alexander Kranias
What is it: Bite is an intelligent wearable accessory that uses computer vision to track your daily food and nutritional intake. It makes food tracking more convenient and informative.
How it works: Their wearable device, affixed to a velcro wrist strap, encompasses a Raspberry Pi 4b, miniature camera, IMU, and buzzer. It utilizes a 3-axis IMU to measure the user’s hand orientation and acceleration, translating this into spherical angle coordinates. Through signal processing and a state machine model, the device identifies when the wrist is motionless and recognizes a custom gesture to capture meal images. These images are analyzed via the LogMeal API for food segmentation, classification, and nutritional estimation. It provides audible feedback, processes nutritional data, and offers personalized dietary recommendations using GPT-3.5. Users can view meal details, total nutrient intake, and integrated calorie data from the Terra API on a comprehensive dashboard.
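The step that translates IMU readings into spherical angle coordinates could look like this simplified sketch (the function name is an assumption; the real device also filters the signal and runs a state machine over these angles):

```python
import math

def spherical_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Convert a 3-axis accelerometer reading into spherical angles.

    Returns (theta, phi) in degrees: theta is the tilt from the +z
    axis (vertical), phi the azimuth in the x-y plane. A simplified
    take on Bite's orientation step.
    """
    r = math.sqrt(ax * ax + ay * ay + az * az)
    if r == 0:
        raise ValueError("zero-magnitude reading")
    theta = math.degrees(math.acos(az / r))  # tilt from vertical
    phi = math.degrees(math.atan2(ay, ax))   # heading in the x-y plane
    return theta, phi
```

Watching theta settle near a steady value is one way a state machine could decide the wrist is "motionless" and wait for the capture gesture.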