What are you doing now? Your next watch may know the answer.
Two new smartwatch prototypes developed at Carnegie Mellon University in Pittsburgh, Pennsylvania, can guess what their wearer is up to by tracking subtle signals in their skin and muscles. The technology could allow the owner to answer calls, track activities, and more – all without needing to be touched.
The first, named EM-Sense, can figure out what object its owner is touching. By fitting the smartwatch with a radio receiver, EM-Sense can use its wearer as a living antenna, picking up the electromagnetic “noise” that electrical objects emit and that travels through the human body.
So far, the system has been trained to recognise the unique electromagnetic signals of 23 common items, including desk lamps, refrigerators and computer trackpads.
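In spirit, recognising an object from its electromagnetic signature is a matching problem: compare a new reading against the stored fingerprints of known objects. The sketch below is purely illustrative, not EM-Sense's actual pipeline; the band-power "spectra" and distance-based matching are invented stand-ins.

```python
# Hypothetical sketch: classify an object by its electromagnetic "noise"
# fingerprint. The spectra below (relative power in four frequency bands)
# are toy values, not real EM-Sense training data.

import math

TRAINED_SIGNATURES = {
    "desk lamp":    [0.9, 0.1, 0.0, 0.0],
    "refrigerator": [0.2, 0.7, 0.1, 0.0],
    "trackpad":     [0.0, 0.1, 0.3, 0.6],
}

def classify_em_signal(spectrum):
    """Return the trained object whose fingerprint is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAINED_SIGNATURES,
               key=lambda name: dist(TRAINED_SIGNATURES[name], spectrum))

# A noisy reading that resembles the trackpad fingerprint:
print(classify_em_signal([0.05, 0.15, 0.25, 0.55]))  # trackpad
```

The real system reportedly distinguishes 23 objects; a production classifier would use richer spectral features and a trained model rather than raw nearest-neighbour matching.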
Developed in the lab of Chris Harrison, a professor of human-computer interaction, EM-Sense is envisioned as a tool for people to augment their everyday activities. If the watch senses that you’ve jumped on your motorcycle, it might open a map to guide you to your next destination or display a stored reminder to pick up milk from a shop. Or if it knows you have stepped on some scales, it might automatically log your weight.
A separate prototype, named Tomo, tracks the wearer’s hand gestures in real time. It relies on an imaging technique called electrical impedance tomography to see inside the arm: the watch band is studded with copper electrodes that bounce electrical signals between one another, building a picture of the muscles inside the wrist.
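Electrical impedance tomography typically works by driving a small current between one pair of electrodes while measuring the voltage across the others, then rotating the driving pair around the band. The sketch below shows only that measurement pattern; the electrode count and the `measure` callback are illustrative assumptions, not Tomo's actual hardware interface.

```python
# Hypothetical sketch of an EIT measurement sweep: each adjacent electrode
# pair takes a turn injecting current while every other adjacent pair senses
# voltage, yielding one "frame" of readings for the wrist cross-section.

def eit_frame(n_electrodes, measure):
    """Collect one frame. `measure(drive, sense)` stands in for the
    hardware read-out of one drive-pair/sense-pair combination."""
    frame = []
    for d in range(n_electrodes):
        drive = (d, (d + 1) % n_electrodes)
        for s in range(n_electrodes):
            sense = (s, (s + 1) % n_electrodes)
            if set(drive) & set(sense):
                continue  # skip sense pairs sharing an electrode with the drive pair
            frame.append(measure(drive, sense))
    return frame

# With 8 electrodes, each of the 8 drive pairs yields 5 usable sense pairs:
readings = eit_frame(8, lambda drive, sense: 0.0)
print(len(readings))  # 40
```

Each frame of readings is then fed to a reconstruction or classification step; a full EIT system would also solve an inverse problem to recover an impedance image.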
To test their invention, Harrison and a graduate student got 10 people to wear Tomo armbands and wristbands, and collected data on their hand motions. A machine learning algorithm then learned to accurately recognise 13 gestures: a thumbs up, a fist, a pinch between the thumb and different fingers, and so on.
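The recognition step can be pictured as a simple supervised classifier: average the impedance readings collected for each gesture during training, then assign a new reading to the nearest average. This nearest-centroid sketch is an assumption for illustration, with toy two-dimensional features, not the algorithm the researchers used.

```python
# Hypothetical sketch of gesture recognition as nearest-centroid
# classification over impedance feature vectors. Features and labels
# below are toy stand-ins for Tomo's real training data.

def train_centroids(labelled_samples):
    """Average the feature vectors collected for each gesture."""
    sums, counts = {}, {}
    for label, vec in labelled_samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {g: [x / counts[g] for x in acc] for g, acc in sums.items()}

def predict(centroids, vec):
    """Pick the gesture whose centroid is closest to the new reading."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda g: sqdist(centroids[g], vec))

samples = [
    ("fist",      [1.0, 0.1]), ("fist",      [0.9, 0.2]),
    ("thumbs up", [0.1, 1.0]), ("thumbs up", [0.2, 0.9]),
]
model = train_centroids(samples)
print(predict(model, [0.85, 0.15]))  # fist
```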
In one demonstration, the researchers hooked Tomo up to a Samsung Galaxy smartwatch. This allowed the wearer to flip through new messages with a flick of the hand to the right or left, or answer incoming phone calls by making a fist.
“Smartwatch capabilities in general are super-limited right now,” says the researcher who led one of the projects. “We’re just starting to tap into the full potential of having a computer on your wrist.”
Both projects were presented on Monday at the Symposium on User Interface Software and Technology in Charlotte, North Carolina.
Image credit: Chris Harrison