If your phone buzzed right now, would it distract you from reading this? Of course it would, because switching between tasks takes a toll on our attention. Epps, at the University of New South Wales, Australia, wants to help us handle those distractions.
Epps is building a wearable system that tracks human movements to understand what task you’re doing, how difficult it is, and when you switch to something else. His goal is to help us control our multitasking lives.
“Computing devices are with us all the time. Our smartphones are on our person. I think they need to be able to understand what we’re doing in order to interrupt us at the right time,” says Epps.
Epps’s team has made a device that straps to a baseball cap and can work out the intensity of a task – and when a person switches to another task – just from their head movements. Twenty university students tried the baseball cap while doing arithmetic problems of varying difficulty. The sensor could tell with 70 to 80 per cent accuracy how difficult their task was: people moved their heads more during the simple arithmetic problems, and less during the harder ones. The hat was over 90 per cent accurate at determining when they moved on to a new problem, as their pattern of movement shifted.
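The signal the article describes – more head movement for easy problems, less for hard ones, and a sharp shift in the movement pattern at a task switch – can be illustrated with a toy sketch. This is not the team's actual method; the windowing, function names, and thresholds below are all illustrative assumptions, and head motion is assumed to be pre-processed into a per-sample magnitude:

```python
import statistics

def window_means(motion, size):
    """Split a head-motion magnitude series into fixed windows
    and return the mean movement in each window."""
    return [statistics.mean(motion[i:i + size])
            for i in range(0, len(motion) - size + 1, size)]

def classify_difficulty(window_mean, threshold=0.5):
    # The study found MORE head movement during simple problems
    # and LESS during hard ones, so low movement maps to "hard".
    return "easy" if window_mean > threshold else "hard"

def detect_switches(means, jump=0.3):
    """Flag a task switch wherever the movement pattern shifts
    sharply between consecutive windows."""
    return [i for i in range(1, len(means))
            if abs(means[i] - means[i - 1]) > jump]

# Synthetic trace: lots of movement (easy task), then little (hard task).
trace = [0.8] * 10 + [0.1] * 10
means = window_means(trace, size=5)
labels = [classify_difficulty(m) for m in means]
switches = detect_switches(means)
```

On this synthetic trace, the first two windows come out "easy", the last two "hard", and a single switch is flagged at the boundary between them.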
Plenty of wearable devices measure whole-body movement to count steps, or track things like skin conductance to assess stress. But few have tried Epps’s approach, which is useful for tracking what we’re doing while sitting down. “I see a great potential in the use of head movement,” says Mazilu at the Swiss Federal Institute of Technology in Zurich.
If you’re brainstorming, Mazilu says, the device could switch your phone to silent or deliver only emergency notifications. It could also tell you when you need to take a break, or make risky jobs safer by giving workers a ping or other cue to pay extra attention when their task becomes particularly demanding.
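The notification behaviour Mazilu describes amounts to a simple gating rule: let emergencies through always, and hold routine pings while the wearer's estimated workload is high. A minimal sketch of such a policy, with a hypothetical load scale and threshold of my own choosing:

```python
def should_notify(estimated_load, is_emergency):
    """Decide whether to deliver a notification, given an estimated
    cognitive load between 0.0 (idle) and 1.0 (fully engaged).

    The 0.7 cutoff is an illustrative assumption, not a value
    from the study."""
    if is_emergency:
        return True  # emergencies always get through
    return estimated_load < 0.7  # hold routine pings during demanding work
```

A scheduler could re-evaluate this each time the wearable updates its load estimate, releasing queued notifications once the load drops.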
Gathering patterns of data that describe humans doing different tasks has more potential than just helping us work more efficiently. Thad Starner at the Georgia Institute of Technology in Atlanta wants to use the data from wearables to train machines. By digesting the data generated by wearables on millions of humans doing simple tasks, robots and AI can get a better understanding of our lives, and that might let them learn to open doors and climb stairs as well as we do.
Epps’s team is building a new prototype made from cheap components that can be worn on glasses, which tracks eye movement and speech as well as head motion.
Journal reference: Engineering in Medicine and Biology Society 2015, 37th Annual International Conference of the IEEE.