Thought Control of Surroundings & Communication

A new system from g.tec lets participants translate thoughts into smart-home commands.

This Future Watch topic was inspired by Can the Mind Control the Home?, an article by Rachel Cericola published in Electronic House, 7/8/2011. The general idea of brain-machine interface research is to give disabled people more control over their surroundings, but market-ready products still seem years away. Watch the videos below and let us know what you think in a Reply below.

Think about how helpful it would be if the air conditioning came on automatically when you felt warm, without your having to reach for a remote control. Or how valuable it might be if doors opened themselves when you approached with your hands full. Participants in this new research can also control lights and thermostats, and even publish Twitter posts, but with today’s technology they have to wear special headgear with EEG sensors.

Rachel’s article references Control your home with thought alone, by Duncan Graham-Rowe, New Scientist, 7/5/2011. Graham-Rowe describes a scene where “Two friends meet in a bar in the online environment Second Life to chat about their latest tweets and favorite TV shows. There’s nothing unusual in that – except that both of them have Lou Gehrig’s disease, otherwise known as amyotrophic lateral sclerosis (ALS), and it has left them so severely paralyzed that they can only move their eyes.”

In this video demo of a brain-computer interface, the subject concentrates on a flashing letter, number, or command while electroencephalography (EEG) sensors detect the P300 brain-wave response and determine the desired action. Typing characters this way is time-consuming, so the technology is better suited to triggering control functions. Imagine prompting the home intercom to ask, “Who’s there?” when the doorbell rings, or to unlock the door and say, “Come on in” if the visitor is recognized.
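The selection logic behind a P300 speller can be sketched in a few lines: each candidate item flashes repeatedly, the EEG epochs following each flash are averaged to suppress noise, and the item whose averaged epoch shows the strongest response in the P300 window (roughly 250–450 ms after the flash) is chosen. The simulation below is a minimal illustration with made-up signal shapes and parameters; it is not g.tec’s actual pipeline.

```python
# Minimal P300 speller sketch with simulated EEG.
# All shapes, timings, and amplitudes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

SAMPLE_RATE = 250             # Hz (assumed)
EPOCH_LEN = 200               # samples per post-flash epoch (0.8 s)
P300_WINDOW = slice(63, 113)  # ~250-450 ms after the flash

def simulate_epoch(is_target: bool) -> np.ndarray:
    """One EEG epoch after a flash: background noise, plus a P300-like
    bump if the flashed item is the one the user is attending to."""
    epoch = rng.normal(0.0, 1.0, EPOCH_LEN)
    if is_target:
        t = np.arange(EPOCH_LEN)
        # Gaussian bump centered ~350 ms after the flash
        epoch += 2.0 * np.exp(-((t - 88) ** 2) / (2 * 10 ** 2))
    return epoch

def classify(items, target, flashes_per_item=15):
    """Average the epochs recorded after each item's flashes and pick
    the item with the largest mean amplitude in the P300 window."""
    scores = {}
    for item in items:
        epochs = [simulate_epoch(item == target) for _ in range(flashes_per_item)]
        avg = np.mean(epochs, axis=0)           # averaging suppresses noise
        scores[item] = avg[P300_WINDOW].mean()  # P300 evidence for this item
    return max(scores, key=scores.get)

commands = ["LIGHT ON", "LIGHT OFF", "DOOR OPEN", "TV"]
print(classify(commands, target="DOOR OPEN"))
```

Averaging is the key step: the P300 bump is buried in noise on any single trial, but it adds coherently across flashes while the noise averages out, which is also why typing letter-by-letter with this scheme is slow.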

This second video demo shows Honda’s work on brain-machine interface technology, which is intended to let users control service robots through thought alone. The next question is how long it will take to refine the technology, with better response times and accuracy, and without the need for expensive and complicated EEG sensors. I’d say that’s at least five years away, possibly ten.

One Comment

  1. RELATED ARTICLES:

    Facebook Finally Released Details on Their Top Secret Brain-Computer Interface (Futurism, 4/19/2017)
    The Cyborg in Us All (NY Times)
    Mind control of helicopters now. What might be next?
    Young Innovators and The Future of Healthcare
    First Human Brain-To-Brain Interface Lets Scientist Control Colleague’s Body (VIDEO)
    COMMENT: Just as the memories of a lab rat in Brazil were recorded and then transmitted to a rat at Duke, imagine the day when you can “plug in” to learn a new language, or how to play an instrument, or other applications of mind-machine and mind-mind interfaces. When will that occur? And will these capabilities be used for good or evil?

    Imagine the implications of a $1,000 computer that becomes as powerful as the human brain (Ray Kurzweil projects that by 2037) or as powerful as the human race (2049) and how eventually that’s a $0.01 embedded processor that’s connected to trillions of similar processors in an Internet of Things, or dozens or thousands of cell-sized processors living in, and powered by, our bodies.

    What will become of humans? After all, the biology of humans and other living organisms has evolved slowly – over many centuries – while tech innovation has evolved exponentially, following Moore’s Law. Heady stuff (pun intended).
