PPG Sensing for Human-Centered Interaction
An end-to-end system that captures PPG via remote video and contact sensors, processes signals in real time, and translates physiological change into interactive feedback.
Overview
This project explores how photoplethysmography (PPG) from remote video (rPPG) and contact sensors can become meaningful signals for interaction, identity, and health-focused systems.
Figure: End-to-end pipeline
Motivation
Physiological signals are rich but often treated as backend analytics. This work asks how PPG can power real-time, human-centered interactions in the wild.
How can PPG enable responsive interaction?
We focus on multimodal sensing and robust processing so interaction logic can respond to state changes, not just static biometrics.
System Architecture
Three coordinated layers connect sensing to signal processing to interaction.
Signal Processing Pipeline
From face-based ROI extraction to filtered signals and validated features.
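The core of this pipeline can be sketched in a few lines. Below is a minimal, simplified version assuming a pre-cropped face ROI is already available as a frame stack; the fixed green-channel average, the 0.7–4 Hz band, and the function name are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_bpm(frames, fps=30.0):
    """Estimate heart rate from a stack of face-ROI frames (T, H, W, 3), RGB order."""
    # 1. Spatial average of the green channel per frame -> raw rPPG trace.
    trace = frames[:, :, :, 1].mean(axis=(1, 2))
    # 2. Remove DC and band-pass to the plausible heart-rate band
    #    (0.7-4 Hz, i.e. roughly 42-240 BPM).
    trace = trace - trace.mean()
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, trace)
    # 3. The dominant spectral peak within the band gives the BPM estimate.
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    power = np.abs(np.fft.rfft(filtered)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(power[band])]
```

A real pipeline would add motion-robust ROI tracking and feature validation on top of this spectral estimate, as the section notes.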
Interaction Design
Physiological change becomes state-aware feedback and implicit interaction triggers.
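One way such state-aware triggers could be structured is sketched below: track relative deviation from a slowly adapting baseline rather than absolute values. This is a hypothetical illustration; the thresholds, the exponential-moving-average baseline, and the state names are assumptions, not the system's actual logic.

```python
class StateTrigger:
    """Map a streaming heart-rate estimate to coarse interaction states
    based on relative change from a slowly adapting baseline."""

    def __init__(self, alpha=0.02, rise=0.10, fall=-0.08):
        self.alpha = alpha      # baseline adaptation rate (EMA weight)
        self.rise = rise        # +10% above baseline -> "elevated"
        self.fall = fall        # -8% below baseline -> "relaxed"
        self.baseline = None

    def update(self, bpm):
        if self.baseline is None:
            self.baseline = bpm
            return "neutral"
        rel = (bpm - self.baseline) / self.baseline
        # Adapt the baseline slowly so short-lived changes stay visible.
        self.baseline += self.alpha * (bpm - self.baseline)
        if rel >= self.rise:
            return "elevated"
        if rel <= self.fall:
            return "relaxed"
        return "neutral"
```

Keying the states to relative change lets the same interaction logic serve users with very different resting heart rates.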
Evaluation & Insights
rPPG degrades under motion and changing illumination, but combining it with contact-sensor context improves signal stability and helps users interpret the feedback.
Contribution
An end-to-end PPG sensing and interaction system with a modular pipeline and design insights for real-time, state-aware feedback.
End-to-End System
Modular Pipeline
Design Insights
Multimodal Validation
Adaptable Setups
Interaction Logic
"A full pipeline from sensing through interaction makes PPG usable in real time."
"Multimodal synchronization is essential for reliable signal interpretation."
"Users respond to relative physiological change more intuitively than raw values."
Future Directions
Scaling toward deployment, authentication, and longitudinal sensing.
Real-World Deployment
Evaluate stability across varied lighting, motion, and environments.