eWEAR Seminar: AI Insights from Wearable Sensors (Stanford University)
Tonight, I had the opportunity to remotely attend a seminar held at Stanford University, where two insightful presentations shed light on advancements in wearable sensors.
Wearable Sensing and Generative Deep Learning for Gait Dynamics Assessment
Speaker: Tian Tan, Postdoctoral Researcher in Radiology, Stanford University
Key Applications of Gait Dynamics Assessment:
- Injury Prevention: Reducing the risk of physical injuries through better understanding of movement patterns.
- Disease Assessment: Early detection and monitoring of conditions like Parkinson’s or diabetic neuropathy.
- Mobility Assistance: Enhancing assistive devices for improved functionality.
Traditional Methods:
- Marker-Based Motion Tracking and Force Plates
These systems are accurate but come with significant drawbacks:
- Confined to controlled environments.
- High operational costs.
The Shift to Wearable IMUs:
- Inertial Measurement Units (IMUs) offer portability and affordability, enabling data collection in real-world environments such as homes and outdoor settings.
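As a rough illustration of how raw IMU data turns into gait information, here is a minimal sketch of step detection from a single vertical-acceleration channel. The sampling rate, thresholds, and synthetic signal are my own assumptions for illustration; they were not part of the talk, which used far more sophisticated models.

```python
# Minimal sketch: detect step candidates from one IMU's vertical acceleration
# with simple peak detection. Sampling rate, thresholds, and the synthetic
# walking-like signal are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

FS = 100.0  # assumed IMU sampling rate in Hz


def detect_steps(vertical_acc: np.ndarray, fs: float = FS) -> np.ndarray:
    """Return indices of acceleration peaks that likely correspond to steps."""
    # Remove the gravity/offset component so step impacts stand out.
    centered = vertical_acc - np.mean(vertical_acc)
    # Require peaks at least ~2 m/s^2 above the mean and at least 0.4 s apart
    # (both thresholds are rough assumptions for ordinary walking).
    peaks, _ = find_peaks(centered, height=2.0, distance=int(0.4 * fs))
    return peaks


# Synthetic signal: gravity plus a ~1.8 Hz step-like oscillation and noise.
t = np.arange(0, 10, 1 / FS)
acc = 9.81 + 6.0 * np.maximum(np.sin(2 * np.pi * 1.8 * t), 0) + 0.3 * np.random.randn(t.size)

steps = detect_steps(acc)
cadence = len(steps) / (t[-1] - t[0]) * 60  # steps per minute
print(f"Detected {len(steps)} steps, cadence ~ {cadence:.0f} steps/min")
```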
Challenges in Gait Analysis Models:
- Data Bottlenecks: high-cost experiments often result in small, limited datasets.
- Model Architecture: dependence on strict input-output pairs limits adaptability.
Key Studies Shared:
Self-Supervised Learning:
- Enables models to learn without labeled data, drastically reducing dependency on costly, annotated datasets.
- Enhances adaptability and scalability.
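To make the self-supervised idea concrete, here is a minimal sketch of masked-reconstruction pretraining for an IMU encoder in PyTorch: the model learns to fill in hidden portions of unlabeled sensor windows, so no gait labels are required. The architecture, masking ratio, and random stand-in data are assumptions for illustration, not the setup from the presented study.

```python
# Minimal sketch: self-supervised pretraining of an IMU encoder by masked
# reconstruction. Architecture, masking ratio, and the random "dataset" are
# illustrative assumptions.
import torch
import torch.nn as nn

WINDOW, CHANNELS = 128, 6  # assumed 6-axis IMU window (accel + gyro)


class IMUEncoder(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        self.decoder = nn.Conv1d(hidden, CHANNELS, kernel_size=1)

    def forward(self, x):  # x: (batch, CHANNELS, WINDOW)
        return self.decoder(self.net(x))


def mask_windows(x, ratio: float = 0.3):
    """Zero out a random fraction of time steps; the model must fill them in."""
    mask = torch.rand(x.shape[0], 1, x.shape[2]) < ratio
    return x.masked_fill(mask, 0.0), mask


model = IMUEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):                        # toy training loop
    batch = torch.randn(32, CHANNELS, WINDOW)  # stands in for unlabeled IMU data
    corrupted, mask = mask_windows(batch)
    recon = model(corrupted)
    # Reconstruction loss evaluated only on the masked positions.
    loss = ((recon - batch) ** 2)[mask.expand_as(batch)].mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After pretraining on large amounts of unlabeled data, the encoder can be fine-tuned on the much smaller labeled gait datasets that lab experiments produce.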
Generative Diffusion Models:
- Compared traditional end-to-end models with generative diffusion approaches.
- Used diffusion models to estimate external forces and predict responses to gait modifications, eliminating the need for expensive experiments.
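Below is a minimal sketch of the generative-diffusion idea: a denoiser learns to recover force curves from noise, conditioned on IMU-derived kinematics, and can then be sampled to estimate forces without a force plate. The network size, noise schedule, and random stand-in data are assumptions for illustration, not the model presented in the talk.

```python
# Minimal sketch: conditional diffusion over ground-reaction-force curves.
# Network size, schedule, and random stand-in data are illustrative assumptions.
import torch
import torch.nn as nn

T = 200                                  # number of diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)    # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

LEN = 100  # samples per force curve (assumed)


class Denoiser(nn.Module):
    """Predicts the noise added to a force curve, given the noisy curve,
    the diffusion step, and a conditioning vector from IMU kinematics."""

    def __init__(self, cond_dim: int = 16, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LEN + 1 + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, LEN),
        )

    def forward(self, noisy_force, t, cond):
        t_feat = (t.float() / T).unsqueeze(1)  # normalized timestep feature
        return self.net(torch.cat([noisy_force, t_feat, cond], dim=1))


model = Denoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):                  # toy training loop
    force = torch.randn(32, LEN)         # stands in for measured force curves
    cond = torch.randn(32, 16)           # stands in for IMU-derived kinematics
    t = torch.randint(0, T, (32,))
    noise = torch.randn_like(force)
    a_bar = alphas_bar[t].unsqueeze(1)
    noisy = a_bar.sqrt() * force + (1 - a_bar).sqrt() * noise
    loss = ((model(noisy, t, cond) - noise) ** 2).mean()  # standard DDPM loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At inference time, the reverse process starts from pure noise and iteratively denoises it into a force estimate for a new conditioning vector, which is what allows force-plate-style outputs from wearable measurements alone.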
MEMS & Sensors in the Metaverse
Speaker: Sneha Kadetotad, Engineering Manager, Motion Sensors, Meta Reality Labs
From Text to Immersion:
Meta Reality Labs is spearheading hardware and software innovations for the Metaverse, focusing on:
- Virtual Reality (VR)
- Wearables
Current Applications of MEMS & Sensors:
Hand & Body Tracking:
- Sensors: Optical depth sensors, machine vision cameras, human vision cameras.
Eye & Face Tracking:
- Sensors: Machine vision cameras.
Challenges in AR/VR Development:
- Capabilities & Performance: Delivering high-fidelity experiences with an advanced feature set.
- Wearability & Social Acceptance: Ensuring devices are comfortable, inconspicuous, and have extended battery life.
- Affordability: Balancing innovation with cost-efficiency.
Emerging Innovations in MEMS & Sensors:
- Technological Advancements: developments in MEMS, CMOS, specialty technologies, and new materials.
- Platformization: advanced packaging and integration.
- Intelligent MEMS & Sensors: pioneering smart functionalities for next-generation applications.
New Features & Experiences:
- Health & Wellness Monitoring: Leveraging sensors for continuous health tracking.
- Force Feedback: Improving the tactile realism of interactions in virtual environments.
- Active Displays & Optics: Advancing visual fidelity for AR/VR devices.
Meta is now shifting focus toward building complete platforms, aiming to redefine AR/VR experiences.