Shared Expression

Emteq is developing an emotion-sensing platform that can be integrated into VR and AR headsets, enabling more sophisticated social interaction in virtual experiences and improving the quality of data captured within them.

Our facial expressions are at the core of our social interactions, enabling us to silently and instantly communicate our feelings, desires and intentions. VR, often called an empathy machine, represents a new paradigm in human-computer interaction: a naturalistic interface driven by physical motion. But the empathy machine needs emotional input, and Emteq has created emteqVR™ to provide it.

emteqVR™ is a facial-sensing machine learning platform that provides unparalleled insight into emotion, measured through facial gestures and biometric responses. This ground-breaking technology heralds fourth-generation VR: a new paradigm for studying human-human and human-digital interactions for researchers, educators and content creators.



The following selection of published research papers provides an overview of our emotion-sensing technology and the background science behind it:

FACETEQ: A novel platform for measuring emotion in VR

Mavridou, I., McGhee, J.T., Hamedi, M., Fatoorechi, M., Cleal, A., Balaguer-Ballester, E., Seiss, E., Cox, G. and Nduka, C., 2017, March. IEEE (pp. 441-442). IEEE.

Using facial gestures to drive narrative in VR

Mavridou, I., Hamedi, M., Fatoorechi, M., Archer, J., Cleal, A., Balaguer-Ballester, E., Seiss, E. and Nduka, C., 2017, October. In Proceedings of the 5th Symposium on Spatial User Interaction (pp. 152-152). ACM.

A System Architecture for Emotion Detection in Virtual Reality

Mavridou, I., Seiss, E., Kostoulas, T., Balaguer-Ballester, E. and Nduka, C., EuroVR conference, October 2018

If you are interested in reading more about the academic foundations upon which our idea has been built, below is a selection of the key papers.

EMG-based facial gesture recognition through versatile elliptic basis function neural network
Hamedi, M., Salleh, SH., Astaraki, M., Noor, M.
BioMedical Engineering OnLine. 2013; 12: 73

Facial neuromuscular signal classification by means of least square support vector machine for MuCI
Hamedi, M., Salleh, SH., Astaraki, M., Noor, M.
Applied Soft Computing. 2015; 30: 1-822

Human facial neural activities and gesture recognition for machine-interfacing applications
Hamedi, M., Salleh, SH., Tan, TS., Ismail, K., Ali, J., Dee-Uam, C., Pavaganun, C., Yupapin, PP.
International Journal of Nanomedicine. 2011; 6: 3461–3472

Surface Electromyography-Based Facial Expression Recognition in Bi-Polar Configuration
Hamedi, M., Salleh, SH., Tan, TS., Kamarulafizam, I.
Journal of Computer Science. 2011; 7 (9) : 1407-1415

Robust Facial Expression Recognition for MuCI: A Comprehensive Neuromuscular Signal Analysis
Hamedi, M., Salleh, SH., Ting, CM., Astaraki, M., Noor, M.
IEEE Transactions on Affective Computing. 2016; PP (99): 1

Time-Frequency Facial Gestures EMG Analysis using Bilinear Distribution
Hamedi, M., Salleh, SH., Kamarulafizam, I., Noor, M.
IEEE International Conference on Signal and Image Processing Applications (ICSIPA). 2015; 169-173

Facial Gesture Recognition Using Two-Channel Bio-Sensors Configuration and Fuzzy Classifier: A Pilot Study
Hamedi, M., Rezazadeh, IM., Firoozabadi, M.
International Conference on Electrical, Control and Computer Engineering. 2011; 338-340

Analysis of Neurophysiological Reactions to Advertising Stimuli by Means of EEG and Galvanic Skin Response Measures
Ohme, R., Reykowska, D., Wiener, D., Choromanska, A.
Journal of Neuroscience, Psychology, and Economics. 2009; 2 (1): 21-31

Valence Lasts Longer than Arousal: Persistence of Induced Moods as Assessed by Psychophysiological Measures
Gomez, P., Zimmerman, P.G., Guttormsen, S., Danuser, B.
Journal of Psychophysiology. 2009; 23: 7-17

Enhanced Facial EMG Activity in Response to Dynamic Facial Expressions
Sato, W., Fujimura, T., Suzuki, N.
International Journal of Psychophysiology. 2008; 70 (1): 70–74

Electromyographic Responses to Static and Dynamic Avatar Emotional Facial Expressions
Weyers, P., Mühlberger, A., Hefele, C., Pauli, P.
Psychophysiology. 2006; 43 (5): 450–453

Computing Emotion Awareness through Facial Electromyography
Broek, L., Schut, H., Westerink, J.H.D., van Herk, J., Tuinenbreijer, K.
Computer Science (Human-Computer Interaction). 2006; 9379: 51-62

Electromyographic Activity Over Facial Muscle Regions Can Differentiate the Valence and Intensity of Affective Reactions
Cacioppo, J.T., Petty, R.E., Losch, M.E., Kim, H.S.
Journal of Personality and Social Psychology. Feb 1986; 50 (2): 260-268