We are enabling emotional interaction in VR. Virtual reality offers an unparalleled way to understand how we interact with the world. Fields as diverse as mental health and wellbeing, advertising, gaming, retail and education will be disrupted by this new interactive space.
Today's VR systems allow physical interaction with the virtual environment through head movement and limb tracking (1st and 2nd generation VR). Eye-tracking technologies are now emerging, but the vast majority of non-verbal communication happens through facial expression, which continuously reflects our emotions and intentions. 4th generation VR will be driven by truly naturalistic interaction, and that's where Emteq comes in.
Why Virtual Reality?
Our facial expressions are at the core of our social interactions, enabling us to silently and instantly communicate our feelings, desires and intentions. Often called an empathy machine, VR represents a new paradigm in Human-Computer Interaction: a naturalistic interface driven by physical motion. But the empathy machine needs emotional input, and faceteq™ provides it.
What is it?
faceteq™ from Emteq is a facial sensing platform that delivers unparalleled insight into emotion, measured through facial gestures and biometric responses. This ground-breaking technology gives researchers, educators and content creators a new way to study human-human and human-digital interactions.
Who is it for?
faceteq is for:
- Developers wanting to iterate rapidly by understanding everything about their users: where they looked, where they went, what they did, and how they responded
- Creative agencies looking to ensure they are maximising their return on investment
- Brands wanting to build an emotional connection with their audience
- Market researchers needing to A/B test content
- Researchers interested in understanding the basis of our psychological and emotional responses to the world
- Innovators wanting to create novel methods of human-computer interaction that connect us better with the world
- Healthcare professionals, for whom facial movements are an important aspect of conditions ranging from stroke and depression to Parkinson's disease and facial paralysis
How does it work?
We combine a range of facial sensing techniques, including electrical muscle activity (EMG), eye movement detection, heart rate and heart rate variability, stress response and head position, to sample what your face is doing 1,000 times per second. Our AI engine translates that information into your physical expression and emotional state in near real time.
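To make the idea concrete, the sketch below shows one common way a high-rate EMG stream can be smoothed into a muscle-activation level: root-mean-square amplitude over short windows. This is an illustrative toy, not Emteq's actual pipeline; the sample rate matches the 1,000 Hz figure above, but all names, window sizes and signal values are assumptions.

```python
import math

# Hypothetical sketch: turning a 1,000 Hz stream of facial EMG samples
# into a smoothed muscle-activation level. Names and parameters are
# illustrative, not Emteq's actual processing chain.

SAMPLE_RATE_HZ = 1000          # samples per second, as described above
WINDOW_MS = 50                 # smoothing window length
WINDOW_SIZE = SAMPLE_RATE_HZ * WINDOW_MS // 1000   # 50 samples

def rms_activation(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def activation_stream(emg_samples):
    """Yield one smoothed activation value per 50 ms window."""
    for start in range(0, len(emg_samples) - WINDOW_SIZE + 1, WINDOW_SIZE):
        yield rms_activation(emg_samples[start:start + WINDOW_SIZE])

# Example: one simulated second of signal, with a "smile" burst of
# higher muscle activity in the second half.
signal = [0.01] * 500 + [0.5] * 500
levels = list(activation_stream(signal))
print(len(levels))             # 20 windows in one second
print(levels[0] < levels[-1])  # True: activation rises during the burst
```

A real system would layer classification on top of such features (mapping activation patterns across several facial muscles to expressions), which is where the AI engine described above comes in.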