“One small step for a man...one giant leap for mankind”
Whilst that particular step by Neil Armstrong was one of the most important in history, counting steps in modern times has become routine thanks to low-cost accelerometers inside our smartphones and fitness devices. As this week marks the 50th anniversary of the moon landing, it’s fitting to hark back to the era when accelerometers weighed kilograms and cost tens of thousands of dollars each. Today’s accelerometers, which measure movement in space (and are often combined with a gyroscope to measure rotation, the pair being known as an inertial measurement unit, or IMU), are only a little larger than a grain of salt and cost pennies.
Whilst smartphone-based measurement of physical activity has a role, a phone cannot provide a true reflection of a person’s movements because there is no guarantee that the device is being carried or held.
Wrist-based activity tracking is better, but numerous studies have found its accuracy to be poor, with different devices worn on the same wrist producing wildly varying step counts.
Few would doubt that accurately tracking a person’s orientation and acceleration has many important applications. Current applications of inertial sensing include animation, sports training and rehabilitation. There is also growing awareness of its potential to guide the clinical diagnosis of physical and mental health conditions.
Current methods of motion tracking either require privacy-invading cameras or devices strapped to limbs. Examples of image-based activity recognition systems include Google’s open-source PoseNet, built on TensorFlow.
Whilst cameras offer a low-cost way to identify activities, unless multiple cameras are used the person being tracked can simply disappear from the line of sight. The potential invasion of privacy is a further concern, making computer vision technology unsuited to many situations.
These issues preclude its use in assessing naturalistic behaviours or in tracking activities over prolonged periods.
As technology advances, many wearable devices have attempted to overcome this limitation by using an inertial sensor, also called an inertial measurement unit (IMU). An IMU is a small and relatively inexpensive device that typically combines an accelerometer, gyroscope and magnetometer to report an object’s acceleration, angular rate and orientation relative to a global reference frame. IMU sensors are found in many personal electronic devices currently on the market, including smartwatches and fitness trackers, as well as in all current-generation smartphones. IMU sensors in wearable devices have proved popular with consumers and have spawned a range of fitness companies.
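To illustrate how an IMU’s sensors complement each other, here is a minimal sketch (not any particular device’s firmware, and all names are hypothetical) of a classic complementary filter: the gyroscope integrates well over short timescales but drifts, while the accelerometer’s gravity reading is noisy but drift-free, so the two are blended to estimate a tilt angle.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Estimate the pitch angle (radians) by fusing gyroscope and accelerometer.

    gyro_rates: pitch-axis angular rates in rad/s (drifts when integrated alone).
    accel_samples: (ax, ay, az) readings in m/s^2 (noisy gravity reference).
    alpha weights the integrated gyro estimate against the accelerometer tilt.
    """
    pitch = 0.0
    for rate, (ax, ay, az) in zip(gyro_rates, accel_samples):
        # Tilt implied by where gravity appears in the accelerometer axes.
        accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        # Blend: trust the gyro short-term, let the accelerometer correct drift.
        pitch = alpha * (pitch + rate * dt) + (1 - alpha) * accel_pitch
    return pitch
```

With a stationary sensor tilted so that gravity reads (-2.899, 0, 9.372) m/s², the estimate converges towards the true tilt of about 0.3 rad even with the gyroscope reporting zero.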
Despite the potential to provide real-time tracking of human activity, and clear interest from users in receiving this information, the output fed back to users is quite limited, often restricted to simply the number of steps taken (2).
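Step counting itself is a simple computation, which is partly why it dominates the feedback devices give. As a rough, hedged sketch (not how any commercial tracker actually works; the function and thresholds are illustrative assumptions), a naive step counter just looks for debounced peaks in acceleration magnitude above the gravity baseline:

```python
import math

def count_steps(accel, threshold=11.0, min_gap=10):
    """Naive step counter over (ax, ay, az) samples in m/s^2.

    A 'step' is a sample whose acceleration magnitude exceeds `threshold`
    (gravity ~9.81 m/s^2 plus foot-strike impact), with at least `min_gap`
    samples since the last counted step to avoid double-counting one impact.
    """
    steps = 0
    last_peak = -min_gap
    for i, (ax, ay, az) in enumerate(accel):
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps
```

Real devices layer filtering and per-user calibration on top of this idea, but the simplicity of the underlying signal processing contrasts with the richer activity information an IMU could, in principle, provide.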
Part of the reason is that wrist-worn devices are not well placed to make optimal use of an IMU. The lack of uniformity in how a phone is carried, or the naturalistic use of an arm, can obscure the type of activity being performed, so IMU sensors in smartphones or on the wrist are likely to pick up an unacceptable amount of noise. This makes it challenging to use IMU data to capture the more subtle features of common activities (such as distinguishing sitting on a sofa from sitting on a chair).
Emteq has created smart glasses to provide users with more detailed feedback about their behaviours. Unlike IMU sensors on the wrist, glasses-worn sensors remain aligned with the central axis of the body, which gives unparalleled information about posture. Head movement also tends to remain relatively stable within each type of activity. This means the output from head-worn IMU sensors shows less variability, making it easier to develop algorithms that recognise the features common to each activity type.
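The idea of recognising activities from stable, low-variability signals can be sketched in a few lines. The following is a minimal illustration (not Emteq’s algorithm; the feature choice, function names and centroid values are all assumptions): summarise each window of IMU samples with simple statistics, then assign the window to the nearest activity prototype.

```python
import math
from statistics import mean, stdev

def window_features(accel_window):
    """Summarise one window of (ax, ay, az) samples in m/s^2 as the
    (mean, standard deviation) of acceleration magnitude -- a common
    starting point for activity-recognition features."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_window]
    return (mean(mags), stdev(mags))

def classify(features, centroids):
    """Nearest-centroid classifier: `centroids` maps an activity label
    to a prototype feature tuple; return the closest label."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))
```

For example, a near-still window (magnitude hovering around gravity with little variance) would land closer to a hypothetical “sedentary” prototype such as (9.8, 0.1) than to a “walking” prototype such as (10.5, 2.0). Real systems replace the hand-set centroids with models trained on labelled data.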
Past studies have shown that Parkinson’s disease and even limb injuries can be tracked using IMU data. We are looking at the potential of using this data to provide information about emotional states.
If you are interested in helping to develop new technologies to improve lives, here’s your chance.
Emteq is sponsoring a machine learning challenge in collaboration with the Ubiquitous Computing Society (UbiComp; www.ubicomp.org) and the International Society for Wearable Computing (ISWC; iswc.net/). The 2019 Emteq activity recognition challenge is open now. The goal of the competition is to recognise different activity types, such as walking and watching a movie, from the output of an IMU sensor.
The prize for first place is £2,000. Second place will receive OCOsense smart glasses valued at £1,500, and the third-placed entrant will receive an Oculus Go VR headset.
Further information on the competition is available here:
1. Zhao J. A Review of Wearable IMU (Inertial-Measurement-Unit)-based Pose Estimation and Drift Reduction Technologies. In: Journal of Physics: Conference Series. IOP Publishing; 2018. p. 42003.
2. Henriksen A, Haugen Mikalsen M, Woldaregay AZ, Muzny M, Hartvigsen G, Hopstock LA, et al. Using Fitness Trackers and Smartwatches to Measure Physical Activity in Research: Analysis of Consumer Wrist-Worn Wearables. J Med Internet Res. 2018 Mar 22;20(3):e110. Available from: https://www.ncbi.nlm.nih.gov/pubmed/29567635