Smart homes fitted out with appliances that can be remotely controlled are already here – the next challenge is to control your home by thought alone
Think, and it happens (Image: Disability Images / Alamy Stock Photo)
FEELING cold? Your home already knows, and turns up the heat. Sick of the TV show you are watching? Your home changes the channel. No need for a remote control – just think about what you want and it will happen.
Smart homes fitted out with remotely controlled appliances are already here, and thoughts have been used to control a virtual reality home. Now Akman Aydin at Gazi University in Turkey and her team are attempting to combine the two.
The aim is to improve the home environment for people with movement disabilities, says Akman Aydin, allowing more control and independence. They have developed a system in which people use their thoughts to select from a menu on a screen.
The group use an EEG cap to pick up a signal known as a P300 – a pattern of brain activity that appears when a person intends to do something. An accompanying display shows a list of representative images, for the television, phone and air conditioning, for example.
The idea is that when a person wants to use the phone, and sees an image of a phone, their brain will create a P300 signal. A smart home can respond by preparing the phone to dial a number, via a wireless internet connection.
When Akman Aydin and her colleagues tested their device with five volunteers, all were able to learn to control a phone, light, TV and heater. “They could choose a film and change the volume,” she says.
“All the volunteers learned to control the TV. They could choose a film and change the volume”
Five flashes of an image were enough to pick up the correct P300 signal at least 95 per cent of the time for all five people, and two people were successful 100 per cent of the time. Akman Aydin presented her results at the Engineering in Medicine and Biology Society meeting in Milan, Italy, last month.
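The flash-and-average approach can be illustrated with a toy Python sketch. Everything here is invented for illustration – the sampling rate, signal amplitudes and the Gaussian bump standing in for a P300 are assumptions, not the team's actual signal-processing code – but it shows why averaging over five flashes per item helps: noise cancels out while the P300 evoked by the attended item survives.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 250                      # EEG sampling rate in Hz (assumed)
EPOCH = int(0.6 * FS)         # keep 600 ms of signal after each flash
P300_IDX = int(0.3 * FS)      # a P300 peaks roughly 300 ms post-stimulus
ITEMS = ["TV", "phone", "light", "heater"]
N_FLASHES = 5                 # flashes per item, as in the study

def simulate_epoch(attended: bool) -> np.ndarray:
    """One simulated EEG epoch after a flash: background noise, plus a
    P300-like bump only if the user was attending to the flashed item."""
    sig = rng.normal(0.0, 1.0, EPOCH)
    if attended:
        width = 0.05 * FS     # made-up bump width
        bump = 3.0 * np.exp(-0.5 * ((np.arange(EPOCH) - P300_IDX) / width) ** 2)
        sig += bump
    return sig

def classify(target: str) -> str:
    """Average the epochs from N_FLASHES flashes of each menu item and
    pick the item whose average shows the largest deflection near 300 ms."""
    scores = {}
    for item in ITEMS:
        epochs = [simulate_epoch(item == target) for _ in range(N_FLASHES)]
        avg = np.mean(epochs, axis=0)          # averaging suppresses noise
        scores[item] = avg[P300_IDX - 10 : P300_IDX + 10].mean()
    return max(scores, key=scores.get)

print(classify("phone"))
```

With these toy parameters the attended item is picked out reliably; in real EEG the P300 is far noisier, which is why accuracy in the study was 95 rather than 100 per cent for some volunteers.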
EEG is not the only approach. Ogawa at the Advanced Telecommunication Research Institute in Kyoto, Japan, is instead using a type of brain scan called functional near-infrared spectroscopy. A cap shines laser light onto the head, and this lights up blood vessels close to the surface of the brain. Light passing through the vessels is measured to give an indication of which areas are actively using oxygen.
The group asked three volunteers to try out their system in a real smart home environment. Controlling the TV meant activating the part of the brain involved in movement, so the volunteers were asked to move a limb. “The system got it right 80 per cent of the time,” says Ogawa, who presented his work at the same meeting.
But because the system relies on detecting changes in blood oxygenation in the brain, it takes a while to respond. “The problem is that you wave your arm but you can’t watch TV until 17 seconds later,” he says.
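That lag can be sketched with a toy model of the slow haemodynamic response. The time constant and threshold below are invented for illustration, not Ogawa's measured values, but they show why a detector watching blood oxygenation cannot fire until many seconds after the movement itself.

```python
import numpy as np

FS = 10                              # fNIRS sampling rate in Hz (assumed)
T = 30                               # seconds of signal simulated

def hemodynamic_response(t: np.ndarray) -> np.ndarray:
    """Idealised rise in blood oxygenation after movement starts at t = 0:
    the signal builds over many seconds rather than instantly (toy model,
    time constant chosen arbitrarily)."""
    return np.where(t > 0, 1.0 - np.exp(-t / 6.0), 0.0)

t = np.arange(0, T, 1.0 / FS)        # movement begins at t = 0
oxy = hemodynamic_response(t)

THRESHOLD = 0.9                      # detector fires once oxygenation is high
latency = t[np.argmax(oxy > THRESHOLD)]
print(f"TV switches on {latency:.1f} s after the arm wave")
```

With these made-up constants the detector fires roughly 14 seconds after the movement; lowering the threshold responds faster but risks false triggers from ordinary fluctuations, which is the trade-off a quicker, more sensitive system would have to win.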
He wants to develop a quicker and more sensitive system that can pick up subtle changes in brain activity that are associated with thoughts of movement and so would not require the user to move.
Both approaches hold promise, says a researcher at the University of Chicago. “They indicate a future in which brain-machine interfaces are connected to almost everything in the house,” he says.
This article appeared in print under the headline “Thought control hits home”