Create an account and log in to it on both the VR and mobile apps to sync across devices.
Real-time biometric data is collected through the mobile app and transmitted to the server.
Check your biometric analysis results on the dashboard in MUA XR. Based on your emotional state, MUA suggests the meditation that best suits you.
MIND-C AI learns from accumulated biometric data and pre- and post-session changes through repeated use, allowing it to infer emotional states with greater precision.
User Profile / HR / HRV / Blood Oxygen (SpO2) / Activity
Emoji-based mood meter
Converted into a basis for meditation recommendations
For highly stressed users
For users suffering from burnout
For users who need inner stability
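The mapping from an estimated emotional state to the three program categories above could be sketched as a simple rule, for illustration only (the thresholds and function name are assumptions, not MUA's actual logic):

```python
def recommend_program(valence: float, arousal: float) -> str:
    """Illustrative rule mapping an estimated (valence, arousal)
    state in [-1, 1] x [-1, 1] to one of three program categories."""
    if valence < 0 and arousal > 0.3:
        return "stress relief"     # unpleasant and highly activated
    if valence < 0 and arousal < -0.3:
        return "burnout recovery"  # unpleasant and depleted
    return "inner stability"       # otherwise, a grounding practice
```

In practice a learned model would replace these fixed thresholds, but the interface (state in, program out) stays the same.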
• 90% said XR helped them reach deep meditation
• Improved heart rate and HRV
• Reduced time to enter deep meditation
• Longer and better sleep than with traditional meditation
Valence: the degree of positive or negative emotion
Positive emotion (pleasant) ↔ Negative emotion (unpleasant)
Arousal: the level of excitement or activation
High arousal (excited, tense) ↔ Low arousal (calm)
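The two axes above divide the valence-arousal plane into four quadrants; a minimal sketch (the quadrant labels are illustrative, not MUA's taxonomy):

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to a
    coarse quadrant of the circumplex model of affect."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"   # pleasant, high arousal
    if valence >= 0:
        return "calm/content"    # pleasant, low arousal
    if arousal >= 0:
        return "tense/stressed"  # unpleasant, high arousal
    return "sad/fatigued"        # unpleasant, low arousal
```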
MUA works seamlessly with your existing smartwatch.
Biometric signals measured by the watch are sent in real time to our servers, where they are transformed into indicators for emotional state estimation.
With repeated use, your own foundation model is built, enhancing meditation recommendations and enabling increasingly precise emotion estimation.
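As one example of how a raw watch signal becomes an indicator, HRV is commonly summarized as RMSSD over beat-to-beat (RR) intervals. A minimal sketch, assuming RR intervals in milliseconds (this is a standard HRV formula, not MUA's actual server pipeline):

```python
import math

def rmssd(rr_ms: list[float]) -> float:
    """RMSSD: root mean square of successive differences between
    RR intervals (ms), a standard time-domain HRV indicator."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

Higher RMSSD generally reflects stronger parasympathetic (rest-and-recover) activity, which is why HRV is a useful input for arousal estimation.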
MUA helps organizations maintain well-being and boost productivity.
Using a smart mirror, biometric signals are measured in a contactless manner and sent to MUA’s servers for emotional state analysis. Once the analysis is complete, MUA recommends the meditation most needed by the individual at that moment.
By analyzing data patterns across industries and groups, we provide optimized, specialized content and allow managers to track meditation outcomes through a dedicated dashboard—enabling effective mental health management.
Powered by
Emotion recognition based on physiological signals using valence-arousal model (Basu, 2015)
Wearable-Based Affect Recognition—A Review (Schmidt, 2019)
WESAD, a multimodal dataset for wearable stress and affect detection (Schmidt, 2018)
Autonomic Nervous System Activity Distinguishes Among Emotions (Ekman, 1983)
Basic emotions are associated with distinct patterns of cardiorespiratory activity (Rainville, 2006)
Improving Real-Life Estimates of Emotion Based on Heart Rate: A Perspective on Taking Metabolic Heart Rate Into Account (Brouwer, 2018)
Physiological Pattern of Human Emotion State based on SPO2 sensor (Wibawa, 2016)
Emotion Recognition in Elderly Based on SpO2 and Pulse Rate Signals Using Support Vector Machine (Hakim, 2018)
Predicting Emotion with Biosignals: A Comparison of Classification and Regression Models for Estimating Valence and Arousal Level Using Wearable Sensors (Siirtola, 2023)
Heart rate as a measure of emotional arousal in evolutionary biology (Wascher, 2021)