Contributed to ETL pipelines and to the development and end-to-end testing of production multimodal machine learning solutions for wearables at Meta.
Drawing on extensive experience in contextual AI, multimodal modeling, user experience research, and data science, I focus on combining modalities such as EMG, IMU, and computer vision to advance wearable technology.
I bring over seven years of experience in sensorimotor neuroscience and novel interaction paradigms.