Multimodal AI Assistant for Astronaut Mental Health Monitoring
This is the demo screen where the AI bot talks and the user can interact with it.
Talk to Maitri using voice or upload audio files. Drag to rotate the 3D model.
Status: Ready to Listen
Real-time voice interaction enabled
You said:
Waiting for input...
Hello! I'm Maitri, your AI companion. I'm here to support your mental health during the mission.
Voice Recognition Inactive
Click "Start Listening" to activate
Try these emotions:
YOLO-based facial expression analysis with real-time emotion detection (see the sketch below)
Positive emotional state detected (Confidence: 94%)
Low mood detected - support initiated (Confidence: 87%)
Stress indicators detected - intervention suggested (Confidence: 91%)
High stress - relaxation protocol activated
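A minimal sketch of how the facial-expression step could map a camera frame to one of the responses above, assuming the Ultralytics YOLO API, OpenCV for the camera feed, and a hypothetical classification checkpoint (maitri_emotion_cls.pt) fine-tuned on emotion classes; the class names in RESPONSES are also assumptions.

```python
# Hedged sketch: classify one camera frame with a YOLO classification model.
# The weights file and the emotion class names are hypothetical placeholders.
import cv2
from ultralytics import YOLO

model = YOLO("maitri_emotion_cls.pt")  # hypothetical fine-tuned weights

RESPONSES = {
    "happy": "Positive emotional state detected",
    "sad": "Low mood detected - support initiated",
    "anxious": "Stress indicators detected - intervention suggested",
    "stressed": "High stress - relaxation protocol activated",
}

cap = cv2.VideoCapture(0)            # on-board camera feed
ok, frame = cap.read()
if ok:
    result = model(frame)[0]         # one inference pass
    label = result.names[result.probs.top1]
    confidence = float(result.probs.top1conf)
    print(f"Confidence: {confidence:.0%}")
    print(RESPONSES.get(label, "Neutral state"))
cap.release()
```

A detection model plus a separate per-face emotion classifier would fit the same interface; a single classification head is assumed here only to keep the sketch short.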
✓ Normal
✓ Optimal
✓ Normal
✓ Good
MFCC & spectrogram-based voice analysis for emotion detection (see the sketch below)
Voice analysis indicates stable emotional state with positive undertones.
Analysis: Speech patterns show reduced stress markers and steady pitch variation
Samples analyzed: 1,247 | Accuracy: 94.2%
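The MFCC and spectrogram features behind this analysis can be sketched with librosa; the file name, 16 kHz sample rate, and 13-coefficient choice are illustrative assumptions rather than the deployed model's settings.

```python
# Feature-extraction sketch for the voice-analysis step (assumed parameters).
import librosa
import numpy as np

y, sr = librosa.load("sample.wav", sr=16000)          # uploaded audio, mono 16 kHz
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)    # MFCC matrix: (13, frames)
mel = librosa.feature.melspectrogram(y=y, sr=sr)      # mel spectrogram
log_mel = librosa.power_to_db(mel, ref=np.max)        # log scale for the classifier

# Simple pitch-variation cue, in the spirit of "steady pitch variation" above.
f0 = librosa.yin(y, fmin=80, fmax=400, sr=sr)         # fundamental-frequency track

print("MFCC shape:", mfcc.shape)
print("Log-mel shape:", log_mel.shape)
print("Pitch std dev (Hz):", float(np.std(f0)))
```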
Maitri is continuously monitoring all parameters. Offline-first support enabled.
Next Check-in: In 2 hours | Ground Contact: Available
System Status: All sensors operational | Last sync: --:--:--
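A hedged sketch of the offline-first idea behind the status lines above: check-ins are queued locally and flushed only when ground contact is available. The queue file name and the ground_link_available() check are hypothetical placeholders.

```python
# Offline-first queue sketch: store check-ins locally, sync when a link exists.
import json
import time
from pathlib import Path

QUEUE = Path("maitri_offline_queue.jsonl")  # hypothetical local store

def ground_link_available() -> bool:
    """Placeholder for the real downlink check."""
    return False  # assume no contact in this sketch

def record_checkin(payload: dict) -> None:
    """Append a check-in locally so nothing is lost while offline."""
    with QUEUE.open("a") as f:
        f.write(json.dumps(payload) + "\n")

def sync_if_possible() -> int:
    """Send queued check-ins when the link is up; return how many were sent."""
    if not ground_link_available() or not QUEUE.exists():
        return 0
    sent = sum(1 for _ in QUEUE.open())
    QUEUE.unlink()  # the real system would clear only after a ground ACK
    return sent

record_checkin({"ts": time.time(), "mood": "stable", "stress": "low"})
print("Queued check-ins synced:", sync_if_possible())
```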