What is it?
Our facial expressions are at the core of our social interactions, enabling us to silently and instantly communicate our feelings, desires and intentions. Often called an empathy machine, VR represents a new paradigm in human-computer interaction: a naturalistic interface using physical motion. But the empathy machine needs emotional input, and emteq currently provides two solutions:
faceteq™ is a facial sensing platform that provides unparalleled insight into emotions by measuring facial gestures and biometric responses. This ground-breaking technology offers a new paradigm for studying human-human and human-digital interactions for researchers, educators and content creators.
OCOsense™ is a wearable platform for measuring facial muscle activity and emotional responses through glasses and VR & AR headsets. Using a range of multi-modal, patent-pending sensor technologies, proprietary algorithms and live data streaming, we have enabled the collection and interpretation of human emotional responses in real time, in the real world.
Who is it for?
faceteq™ and OCOsense™ are for:
- Developers wanting to rapidly iterate by understanding everything about their users: where they looked, where they went, what they did, and how they responded
- Creative agencies looking to ensure that they are maximising their return on investment
- Brands wanting to ensure they're making an emotional connection with their audience
- Market Researchers needing to A/B test content
- Researchers interested in understanding the basis of our psychological and emotional responses to the world
- Innovators wanting to create novel methods of human-computer interaction to allow better connection with the world
- Healthcare professionals. Facial movements are an important aspect of conditions ranging from stroke and depression to Parkinson's disease and facial paralysis
How does it work?
We use a range of facial sensing techniques, including electrical muscle activity, eye movement detection, heart rate and heart rate variability, stress response and head position, to recognise what your face is doing 1,000 times per second. Our AI engine tracks and translates that information back into your physical expression and emotional state in near real time.
We provide an API for developers and an iOS app to analyse data captured via our OCOsense™ glasses.
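As an illustration of the kind of metric that can be derived from the biometric signals described above, here is a minimal sketch computing RMSSD, a standard time-domain measure of heart rate variability. The function name and the sample inter-beat intervals are hypothetical and are not part of the emteq API; this shows only the underlying calculation.

```python
import math

def rmssd(ibis_ms):
    """Root mean square of successive differences between
    inter-beat intervals (in milliseconds) -- a standard
    time-domain heart rate variability metric."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical inter-beat intervals (ms) from a short recording window
sample_ibis = [800, 810, 790, 820, 805]
print(round(rmssd(sample_ibis), 2))  # 20.16
```

A higher RMSSD generally reflects greater parasympathetic activity, which is why heart rate variability is often used alongside facial muscle activity as an indicator of emotional state.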
Where can I get one?
faceteq™ is currently available as a beta product for academic and market research customers. Contact us now for more information on pricing and availability.
OCOsense™ will be available for pilot customers in Q4 2018.