
Mind Controllers: Will computers soon know how to read our emotions?


Touching a device or telling it what to do works well enough, but each action always starts with a thought. So why not do away with the physical component altogether and make a device respond simply by thinking about it?

This capability is at the core of technology being developed by Emotiv, an Australian-born company headquartered in San Francisco.

Emotiv is the developer of the EPOC, a brain control interface device consisting of an electronic headset with sensors that can read brain activity and use this as a means of controlling a device or feeding information into a computer. The technology can also be used to detect facial expressions and movements, such as a wink or a frown, and show these on a virtual representation (or avatar) of your face.
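In software terms, that description suggests two event streams coming off the headset: facial-expression readings that can be mirrored onto an avatar, and mental commands that can be translated into device actions. The sketch below illustrates that split; the event names and classes are invented for illustration and do not reflect Emotiv's actual SDK.

```python
# A schematic of the two event streams described above. All names here
# are hypothetical placeholders, not Emotiv's real API.

class Avatar:
    def mirror(self, expression: str) -> None:
        # Reproduce a detected expression on the virtual face.
        print(f"avatar shows: {expression}")

class Wheelchair:
    def execute(self, command: str) -> None:
        # Translate a detected mental command into a device action.
        print(f"wheelchair does: {command}")

def handle_event(event: dict, avatar: Avatar, chair: Wheelchair) -> None:
    """Route expression events to the avatar, mental commands to the device."""
    if event["kind"] == "expression":          # e.g. "wink", "frown"
        avatar.mirror(event["name"])
    elif event["kind"] == "mental_command":    # e.g. "forward", "turn_left"
        chair.execute(event["name"])

handle_event({"kind": "expression", "name": "wink"}, Avatar(), Wheelchair())
handle_event({"kind": "mental_command", "name": "forward"}, Avatar(), Wheelchair())
```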

The EPOC has already shown its potential for controlling actions in games, and may one day help the physically impaired to operate devices such as electric wheelchairs.

But Emotiv’s president Tan Le says EPOC can do more than just replace what we already do with our hands.

“It’s about total communication. It’s about having a device that is sensing your bio-signals, so it’s truly understanding how you feel about certain things and how you are experiencing things, and tailoring experiences to you.”

Take, for example, typing a query such as “sports car” into a search engine. EPOC could simultaneously sense the visual image you have in mind, and tell that you are more likely to be searching for information on an MG than a Corvette.

“We are going to be able to create searches that are far more tailored,” Le predicts. “There are lots of things in our visual cortex that we can access even today. This is not even in a 10-year future. I can really easily see this in the three- to five-year horizon. You can add a lot of dimensionality to existing technologies to give a better clue into how people are experiencing or thinking about certain things.”
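One way to picture the “added dimensionality” Le describes is as a second relevance signal blended into an ordinary keyword search. The sketch below is a toy illustration under invented assumptions: there is no public API like this, and the per-tag “affinity” dictionary simply stands in for whatever a future headset might report.

```python
# A minimal sketch of bio-signal-assisted search re-ranking.
# The Result fields, tags, and weights are all hypothetical.

from dataclasses import dataclass

@dataclass
class Result:
    title: str
    text_score: float      # relevance from the ordinary keyword match
    style_tags: set[str]   # editorial tags, e.g. {"classic", "roadster"}

def rerank(results: list[Result], affinity: dict[str, float]) -> list[Result]:
    """Blend keyword relevance with a per-tag affinity signal that a
    headset might one day supply (e.g. {'classic': 0.9, 'muscle': 0.1})."""
    def blended(r: Result) -> float:
        bio = max((affinity.get(t, 0.0) for t in r.style_tags), default=0.0)
        return 0.7 * r.text_score + 0.3 * bio
    return sorted(results, key=blended, reverse=True)

results = [
    Result("Corvette buyer's guide", 0.9, {"muscle"}),
    Result("MG MGB restoration tips", 0.8, {"classic", "roadster"}),
]
# A reading suggesting the searcher pictured a classic roadster:
print([r.title for r in rerank(results, {"classic": 0.9})])
# -> the MG article ranks first, despite its lower keyword score
```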

Another example would connect EPOC up to an entertainment system, such as an online video library, and help you select what you want to see by sensing what you are in the mood for.

“If I really want to see an action-based movie, then it’s going to dish up for me relevant content along those lines, because it knows that’s what I’m looking for without me having to verbalise it or go through a whole menu system.

“There’s going to be a combination of all of these mechanisms that will converge, that will make our interactions a lot more compelling and a lot faster. So in terms of how this technology is evolving, it’s all about being able to provide another dimension into understanding the person and providing a level of individualisation and personalisation that’s not possible today with directive-type input commands.”
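Stripped to its simplest form, the entertainment scenario is a mood-to-genre lookup: the headset supplies a mood, and the library filters itself accordingly. The toy example below assumes, purely for illustration, that the headset's output can be reduced to a single mood label; the label set and catalogue are invented.

```python
# A toy illustration of mood-driven content selection. The mood labels,
# genre mapping, and catalogue are invented for this example.

CATALOGUE = {
    "Heat":               {"action", "thriller"},
    "Before Sunrise":     {"romance", "drama"},
    "Mad Max: Fury Road": {"action"},
    "Paddington":         {"comedy", "family"},
}

MOOD_TO_GENRES = {
    "energised":    {"action", "thriller"},
    "reflective":   {"drama", "romance"},
    "lighthearted": {"comedy", "family"},
}

def suggest(mood: str) -> list[str]:
    """Return titles whose genres overlap the genres mapped to the mood."""
    wanted = MOOD_TO_GENRES.get(mood, set())
    return [title for title, genres in CATALOGUE.items() if genres & wanted]

print(suggest("energised"))  # ['Heat', 'Mad Max: Fury Road']
```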

Yet another scenario is emotion-based searching. Le says it will be possible to automatically tag and index photos based on how you feel about them.

“If you have the EPOC on while you’re viewing those, we can tag and index all of those experiences so that when you recall that experience, you can bring up those images again. We’ve seen another research group leveraging the power of cloud computing to create a bunch of indices and tags for generic images. So then with lots and lots of people tagging and indexing images, they can create a neuro-based image search technology where you just think about an image or whatever it is you’re trying to find, and it will locate it based on your visual recall and how other people have tagged and referenced that image.”
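At its core, this is an index keyed by feeling rather than by keyword: record an affect reading against each photo as it is viewed, then rank photos by that reading at recall time. The sketch below assumes, hypothetically, that the headset yields a per-photo signal-strength score; the function names and data shapes are invented.

```python
# A sketch of an emotion-tagged photo index, under the invented
# assumption that each viewing yields a (feeling, strength) reading.

from collections import defaultdict

index: dict[str, list[tuple[str, float]]] = defaultdict(list)

def record_viewing(photo: str, feeling: str, strength: float) -> None:
    """Store the photo under the felt label, weighted by signal strength."""
    index[feeling].append((photo, strength))

def recall(feeling: str, top_n: int = 3) -> list[str]:
    """Bring back the photos most strongly associated with a feeling."""
    ranked = sorted(index[feeling], key=lambda entry: entry[1], reverse=True)
    return [photo for photo, _ in ranked[:top_n]]

record_viewing("beach_2010.jpg", "joy", 0.8)
record_viewing("graduation.jpg", "joy", 0.6)
print(recall("joy"))  # ['beach_2010.jpg', 'graduation.jpg']
```

Aggregated across many users, the same structure becomes the crowd-built "neuro-based image search" index Le describes, with each entry reinforced by how other people have tagged and referenced the image.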

Le says the outcome will be devices that understand you as a person, and how you operate, and are able to respond in a more compelling way.

Right now the EPOC is a headset, but Le says Emotiv is working hard to shrink it, with the goal of making it an unobtrusive lightweight device that can be worn continuously.

“Then it’s all about getting consumer acceptance of the technology itself, so people feel comfortable having the device on all the time and interacting with anything in their world using a combination of their hands, their gestures, and a personal bio-sensing device that augments that experience.”

The above article is an extract from A Faster Future, by Brad Howarth and Janelle Ledwidge, launched last month. Howarth is a freelance journalist, author and speaker with more than 16 years’ experience covering the technology and marketing industries. Ledwidge is a professional coach, communications practitioner and digital media industry specialist who for the last 14 years has worked with some of Australia’s leading digital media and ICT companies.

A Faster Future includes interviews and profiles of more than 100 of the world’s leading thinkers on technology and business about the future of online broadband services. Learn more or order your copy at www.afasterfuture.com.