Posted by Dr Charles Nduka.

Activity monitoring from accelerometers: more than steps

“One small step for man, one giant leap for mankind”

Whilst that particular step by Neil Armstrong was one of the most important in history, counting steps in modern times has become routine thanks to low-cost accelerometers inside our smartphones and fitness devices. As this week marks the 50th anniversary of the moon landing, it’s fitting to hark back to the era when accelerometers weighed kilograms and cost tens of thousands of dollars each. Today’s accelerometers, which measure movement in space (and which are often combined with a gyroscope to measure rotation, the pair collectively called an inertial measurement unit, or IMU), are little larger than a grain of salt and cost pennies.

Whilst smartphone-based measurement of physical activity has a role, a phone cannot provide a true reflection of the person’s movements as there is no guarantee that the device is being held.

An employee at a Chinese firm that wanted to monitor employees' activities came up with a clever hack.

Wrist-based activity tracking is better, but numerous studies have found that its accuracy is poor, with different devices worn on the same wrist providing wildly varying step counts.

Few would doubt that accurate tracking of a person’s orientation and acceleration has many important applications. Current applications of inertial sensing include animation, sports training and rehabilitation. There is also growing awareness of its potential to guide clinical diagnosis of physical and mental health conditions.

Current methods of motion tracking either require the use of privacy-invading cameras or devices strapped to limbs. Examples of image-based activity recognition systems include Google’s open-source PoseNet, built on TensorFlow.

Whilst cameras offer a low-cost way to identify activities, unless multiple cameras are used the person being tracked can simply disappear from the line of sight. The invasion of privacy is also an issue, making computer vision technology unsuited to many situations.

These issues preclude its use in assessing naturalistic behaviours or tracking activities over prolonged periods.

As technology advances, many wearable devices have attempted to overcome this limitation by using an inertial sensor, also called an inertial measurement unit (IMU). An IMU is a small and relatively inexpensive device that combines an accelerometer, a gyroscope and often a magnetometer to report an object’s acceleration, angular rate and heading relative to a global reference frame. IMUs have been placed in many personal electronic devices currently on the market, including smartwatches and fitness trackers, as well as in all current-generation smartphones. IMU sensors in wearable devices have proved popular with consumers and have spawned a range of fitness companies.
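The accelerometer and gyroscope inside an IMU are complementary: integrating the gyroscope’s angular rate gives a smooth estimate that drifts over time, whilst the accelerometer’s gravity vector gives an absolute but noisy tilt estimate. One common way to combine them is a complementary filter. The sketch below (a minimal illustration with made-up readings, not any particular device’s firmware) estimates a single pitch angle:

```python
import math

def accel_pitch(ax, ay, az):
    """Estimate pitch (radians) from the accelerometer alone,
    using the direction of gravity. Absolute, but noisy."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch_prev, gyro_rate, accel, dt, alpha=0.98):
    """Blend gyro integration (smooth but drifting) with the
    accelerometer tilt estimate (noisy but drift-free)."""
    ax, ay, az = accel
    gyro_est = pitch_prev + gyro_rate * dt  # integrate angular rate
    accel_est = accel_pitch(ax, ay, az)     # absolute tilt from gravity
    return alpha * gyro_est + (1 - alpha) * accel_est

# Simulated device lying flat (gravity on z), not rotating:
# the pitch estimate should stay at zero.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel=(0.0, 0.0, 9.81), dt=0.01)
print(round(pitch, 3))  # → 0.0
```

The blend factor `alpha` controls how much the filter trusts the gyro over the accelerometer; values near 1 suppress accelerometer noise while still correcting gyro drift over time.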

Despite the potential to provide real-time tracking of human activity, and clear interest from users in receiving this information, the output fed back to users is quite limited, often restricted to simply the number of steps taken (2).

Part of the reason is that wrist-worn devices are simply not an optimal placement for an IMU. The lack of uniformity in how a phone is carried, or the naturalistic use of an arm, can obscure the type of activity being performed, so IMUs in smartphones or on the wrist are likely to introduce an unacceptable amount of noise. For this reason, it is challenging to use IMU data to capture the more subtle features of common activities (such as distinguishing sitting on a sofa from sitting on a chair).

Emteq has created smart glasses to provide users with more detailed feedback about their behaviours. Unlike wrist-worn sensors, glasses-worn sensors remain aligned with the central axis of the body, which gives unparalleled information about posture. Head movement also tends to remain relatively stable within a given activity type. The output from head-worn IMUs is therefore less variable, making it easier to develop algorithms that recognise the features common to each activity type.
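To make the point about variability concrete: activity-recognition pipelines typically slice the accelerometer stream into fixed windows and compute summary statistics per window, and a steadier signal yields cleaner, more separable features. A minimal sketch with simulated readings (the specific values and window size are invented for illustration):

```python
import math
import statistics

def accel_magnitude(sample):
    """Magnitude of a 3-axis accelerometer sample (m/s^2)."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def window_features(samples, window=50):
    """Mean and standard deviation of |a| per non-overlapping window:
    a crude proxy for activity intensity and variability."""
    feats = []
    for i in range(0, len(samples) - window + 1, window):
        mags = [accel_magnitude(s) for s in samples[i:i + window]]
        feats.append((statistics.mean(mags), statistics.stdev(mags)))
    return feats

# Simulated data: sitting still (gravity only) vs walking
# (an oscillating vertical component on top of gravity).
sitting = [(0.0, 0.0, 9.81)] * 100
walking = [(0.0, 0.0, 9.81 + 3.0 * math.sin(i / 3.0)) for i in range(100)]

sit_feats = window_features(sitting)
walk_feats = window_features(walking)
print(sit_feats[0][1] < walk_feats[0][1])  # lower variability when still → True
```

Per-window statistics like these are the usual input to an activity classifier; a head-worn sensor tightens each activity’s cluster of feature values, making the classes easier to separate.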

Past studies have shown that Parkinson’s disease and even limb injuries can be tracked using IMU data. We are looking at the potential of using this data to provide information about emotional states. 

If you are interested in helping to develop new technologies to improve lives, here’s your chance.

Emteq is sponsoring a machine learning challenge in collaboration with the Ubiquitous Computing conference (UbiComp) and the International Symposium on Wearable Computers (ISWC). The 2019 Emteq activity recognition challenge is open now. The goal of the competition is to recognise different activity types, such as walking and watching a movie, using the output from an IMU sensor.
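A task like this can be prototyped with very simple tools before reaching for deep learning. As an illustration only (the feature values below are invented, not challenge data), a nearest-centroid classifier over per-window mean and standard deviation of acceleration magnitude:

```python
# Hypothetical labelled feature vectors: (mean |a|, std |a|) per window.
train = {
    "walking": [(10.5, 2.1), (10.8, 2.4), (10.2, 1.9)],
    "watching a movie": [(9.80, 0.10), (9.82, 0.15), (9.79, 0.08)],
}

def centroid(points):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(sample):
    """Assign the label whose class centroid is nearest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    cents = {label: centroid(pts) for label, pts in train.items()}
    return min(cents, key=lambda label: dist(sample, cents[label]))

print(classify((10.4, 2.0)))   # → walking
print(classify((9.8, 0.1)))    # → watching a movie
```

Real challenge entries would of course use richer features and stronger models, but even this sketch shows why separable features matter more than classifier sophistication.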

The prize for first place will be £2,000. Second place will receive OCOsense smart glasses valued at £1,500, and the third-placed entrant will receive an Oculus Go VR headset.

Further information on the competition is available here:

1. Zhao J. A Review of Wearable IMU (Inertial-Measurement-Unit)-based Pose Estimation and Drift Reduction Technologies. Journal of Physics: Conference Series. IOP Publishing; 2018. p. 042003.

2. Henriksen A, Haugen Mikalsen M, Woldaregay AZ, Muzny M, Hartvigsen G, Hopstock LA, et al. Using Fitness Trackers and Smartwatches to Measure Physical Activity in Research: Analysis of Consumer Wrist-Worn Wearables. J Med Internet Res. 2018 Mar 22;20(3):e110.

Is clinical VR the new CBT?

Virtual reality is offering a new way to treat the mental health disorders that affect 25% of the UK population. The success of clinical VR may be due to its ability to exploit the predictive nature of the human brain, surrounding you with a safe, controlled version of your world. Continuing advances in VR technology will pave the way for innovative new treatments, changing the lives of many people. In this article we explore the application of immersive virtual reality as a treatment medium for mental health disorders, and the concept of Predictive Coding.

Monitoring emotional response in VR

A recent study published by Digital Catapult highlights the need for a better way to monitor emotional response in VR market research. There is no doubt that VR can revolutionise market research for products and experiences with a high emotional connection, such as vehicles, apparel or retail. Until now, however, the available solutions have lacked the accuracy required. Emteq is pioneering the measurement of affect, engagement and arousal in VR, and we are currently trialling a prototype device that can be inserted into a VR headset; contact us for more information.

The Reality of Virtual Market Research

Using questionnaires and interviews for market research is subject to bias and often fails to capture the emotions that were felt at the time. VR brings a fresh angle to market research by immersing people into the product they are exploring. As opposed to looking at images of a car prototype, VR can be used to literally put people in the driving seat – or at least feel as though they are.

Immersive VR research trends

Here at Emteq we recently surveyed respected research organisations for their thoughts on the use of VR technology in their work. The response was fantastic: 87 responses from researchers in 12 different countries. The responses led us to believe that the cost of current eye-tracking and emotion-measurement solutions is a barrier to widespread adoption, and that the current range of packaged solutions is complex and lacks key features. At Emteq, we are creating a simple solution to sense human emotions from the face, with VR being the perfect use case for our technology. If you would like to understand more about how Emteq can improve the quality of research in VR, visit

Virtual Reality that puts a smile back on your face

Emteq is developing technology that allows computers to read your facial expressions and emotions. This technology, named Facial Remote Activity Monitoring Eyewear (FRAME), is part of an £800,000 project funded by the National Institute for Health Research’s Invention for Innovation Programme. FRAME is being developed by a consortium led by Nottingham Trent University in collaboration with the Queen Victoria Hospital in West Sussex, Brighton-based technology company Emteq, Coventry University, and the charity Facial Palsy UK.

Virtually drug-free treatments for mental health conditions you didn't know about

An exciting opportunity to minimise reliance on pharmacological treatments for mental health lies in Virtual Reality. Advances in Virtual Reality technology allow you to enter a world that is authentic enough to trigger your mind and body to behave as if it’s the real world.

Emteq's technology to create facial expression sensing glasses

Smart specs that know when you are smiling are being developed by Emteq to help rehabilitate people with facial paralysis.

The face in Virtual Reality
"The addition of this new and exciting science to the surgical armamentarium is an important step and is a virtual certainty"

The quote above is not in response to the recent ground-breaking live VR surgery performed by Shafi Ahmed. In fact I wrote these lines in the British Journal of Surgery 22 years ago. Re-reading the article now, I cringe at my gushing enthusiasm - "it may be five or even 10 years before computers are capable of producing convincing images...". I glossed over the fact that the Silicon Graphics computers required to run the system cost $60,000 or more and the upper end of performance was 15 frames per second.

To many, the current excitement about VR may seem equally overblown, but as I've previously grazed my knees falling off the Hype Cycle, I know that this time it's different...