January, 2020
#Communication
#Interaction

SeeFit



SeeFit is an application that makes exercise entertaining by providing auditory and visual feedback based on the user’s body movement. The project started from an observation during my volunteer work at a service center where people with disabilities exercise. They became more energized and engaged with the exercise as they danced with one another while listening to music. This inspired me to create a website that teaches users how to exercise, detects their body movements, gives feedback on those movements, and generates sounds corresponding to them.



Hyejun Youn, Brian Li, Hannah Han, David Wang
My Role - Project Leader | UI/UX Designer | Developer. Worked on research, ideation, prototyping, front-end coding with HTML, CSS, and JavaScript, video production in After Effects, and 3D animation.


Tools
Figma, Mixamo, Unity, p5.js, OpenCV, After Effects, JavaScript




Our virtual coach gives feedback based on its detection of a user’s body movement. If a user moves correctly, the coach gives positive feedback, so that the user gradually learns the correct movements. SeeFit also makes exercise more entertaining by providing auditory and visual feedback based on the user’s body movement. Ultimately, we want to add a multiplayer section that allows interaction among different users, with the movements of each user generating sound, and the sounds combining to create music.
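The feedback loop could be sketched roughly like this — a minimal illustration, not SeeFit's actual code. The pose format (joint name → {x, y}), the joint names, and the 15° tolerance are all assumptions for the example:

```javascript
// Hypothetical sketch of pose-comparison feedback.
// Poses are assumed to be objects mapping joint name -> {x, y} coordinates.

// Angle at joint B formed by the segments B->A and B->C, in degrees.
function jointAngle(a, b, c) {
  const ab = Math.atan2(a.y - b.y, a.x - b.x);
  const cb = Math.atan2(c.y - b.y, c.x - b.x);
  let deg = Math.abs((ab - cb) * 180 / Math.PI);
  return deg > 180 ? 360 - deg : deg;
}

// Compare the user's elbow angle against the coach's reference pose.
function feedbackForElbow(userPose, coachPose, toleranceDeg = 15) {
  const u = jointAngle(userPose.shoulder, userPose.elbow, userPose.wrist);
  const r = jointAngle(coachPose.shoulder, coachPose.elbow, coachPose.wrist);
  return Math.abs(u - r) <= toleranceDeg
    ? "Great form!"            // positive feedback reinforces the movement
    : "Adjust your arm angle"; // corrective hint
}
```

Comparing joint angles rather than raw coordinates keeps the check invariant to where the user stands in the camera frame.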







We built our platform using p5.js and an open-source library called OpenPose; specifically, we used the MPI model instead of COCO. We also edited some of the functionality to better fit our purposes. For example, we took out the hand and finger calculations and focused only on the major joints to improve computational speed, since we were mainly interested in large-scale, general body movements during a workout.
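The keypoint-trimming idea can be illustrated like this — a hypothetical sketch, not our edited OpenPose code; the simplified joint names and the keypoint object shape are assumptions (OpenPose actually emits an ordered keypoint array per person):

```javascript
// Sketch of trimming a pose-estimation result down to major joints only,
// dropping hand/finger keypoints to reduce per-frame computation.
const MAJOR_JOINTS = new Set([
  "head", "neck",
  "leftShoulder", "rightShoulder",
  "leftElbow", "rightElbow",
  "leftWrist", "rightWrist",
  "leftHip", "rightHip",
  "leftKnee", "rightKnee",
  "leftAnkle", "rightAnkle",
]);

// Keep only large-scale joints from a list of named keypoints.
function filterMajorJoints(keypoints) {
  return keypoints.filter(kp => MAJOR_JOINTS.has(kp.name));
}
```

Skipping the fine-grained hand keypoints is what lets the detection keep up with fast, whole-body workout movements in the browser.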