Editor's note: This post about IBM's 5 in 5 prediction of mind reading technology is by Kevin Brown of IBM Software Group's Emerging Technologies.
One of the many great things about working with the Emerging Technology Services team is that I am always focused on “what’s next.” For a long time, speech recognition fit into this category as the computing industry looked to make technology more pervasive, free our fingertips from typing, and help us become more productive.
We are benefitting from this today with voice recognition in our cars, on our smartphones, and even in automated phone services for banks and travel reservations.
Now that speech recognition is becoming mainstream, and many other forms of human-computer interaction have come along, such as touch and gesture recognition, we are thinking about what’s next - or, in the case of the IBM 5 in 5, what's next by 2017. In my view there will be huge leaps made in bioinformatics. That is a large topic, so I am referring more specifically to the use of sensors to understand our thoughts.
No longer just wishful thinking
While much of the brain remains a mystery, progress has been made in understanding and reading electrical brain activity, to the point where we can use computers to see how the brain responds to facial expressions, gauge excitement and concentration levels, and detect the thoughts of a person without them physically taking any action.
So the idea is to use these electrical brain signals to carry out everyday activities such as placing a phone call or turning on the lights, or even for rehabilitation in the healthcare space. In fact, that is what initially inspired me to look at this field more closely.
In March 2009, Shah, an IBM colleague, had a stroke which left him completely paralyzed, unable to use his muscles or speak. His brain, however, was working fine - a condition called Locked-In Syndrome - which means he can communicate only with his eyes: looking up for yes, and down for no.
Coincidentally, my wife happened to be his occupational therapist, and I demonstrated to her a device that I had recently been investigating called the EPOC from Emotiv. The device has several sensors that sit on your head and read electrical brain impulses. You can train the device so that thinking a particular thought triggers an action on your computer. For example, using Emotiv's software, you can see a cube on your computer screen, think about moving it to the left, and it will move. While I was initially interested in connecting it to email systems and smartphones for business users, it immediately became clear to us how this could help Shah.
Shah, being a techie himself, was open to testing it out. Amazingly, after only 8 seconds of training, he could move the cube at will on the computer screen. We then connected the device to software which could eventually allow control of his environment. Operating the headset requires considerable concentration, however, so more development of the technology and more training in its use may be needed to make it entirely effective. I'm sure this will continue developing over the next 5 years.
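To give a sense of how this kind of trained thought-to-action mapping might look in software, here is a minimal sketch in Python. Everything in it is hypothetical for illustration - the class, the thought labels, and the callbacks are my own inventions, not Emotiv's actual SDK, which streams classified events from the headset rather than taking labels directly.

```python
# Hypothetical sketch: dispatch detected "mental commands" to actions.
# A real headset SDK would classify raw EEG signals against trained
# patterns and emit labelled events; here we model only the dispatch.

class ThoughtController:
    def __init__(self):
        # Maps a trained thought label to a callback action.
        self.actions = {}

    def train(self, label, action):
        """Associate a trained thought pattern with an action."""
        self.actions[label] = action

    def on_detect(self, label):
        """Called when the classifier reports a detected thought."""
        action = self.actions.get(label)
        return action() if action else None


controller = ThoughtController()
controller.train("push_left", lambda: "cube moves left")
controller.train("lights_on", lambda: "lights switched on")

print(controller.on_detect("push_left"))   # cube moves left
print(controller.on_detect("lights_on"))   # lights switched on
```

The point of the design is that the hard part - classifying brain activity into labels - is isolated from the easy part, wiring labels to actions, which is why the same trained thought could move a cube one day and control lights or a phone the next.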
This isn’t the only example of progress in this area. Scientists at UC Berkeley have designed and developed a special MRI scan that can model our visual thoughts not only while we are awake but, even more intriguingly, while we are dreaming.
All in the Applications
This is a case where the technology has become cheap enough and mobile enough to be a consumer device, but it will take the development of some compelling applications and innovative, imaginative uses over the next few years to really make people eager to use it.
By 2017, like all technology, the EPOC and similar devices will probably get smaller. I can imagine a version with completely dry sensors that I'd wear all the time, perhaps embedded in a baseball cap, detecting a finer range of thought patterns and connected directly to my mobile phone - allowing me to interact with the world just by thinking particular thoughts. I could wonder what the traffic will be like on the way home, and that information would pop up in front of me.
Think also about smarter cities: if everyone were wearing the device and open to sharing their thoughts, city heat maps could be created showing how people are feeling, building a picture of the mental health of a city. Or musicians could create elaborate pieces based on what they are thinking about.
The applications are endless; we just have to build them. Think this topic is the most likely prediction, or maybe just the most innovative, among the Next 5 in 5? Vote for it by clicking "like" on IBM's Smarter Planet.