The Mind and Consciousness as an Interface

Julian Bleecker and Nicolas Nova, both from the Near Future Laboratory, presented on the future of user interfaces. Julian argued that the way we talk about interfaces points toward an ever more direct coupling between thought and action: basically brain control.

He kicked off the presentation with a set of clips from science fiction films (e.g. Brainstorm). In one of them, people were able to directly control replicant versions of themselves. If you are trying to control a computer with your mind, you will likely need to concentrate on a single thing, which goes against the natural way our brains work. We can’t let our minds wander anymore. What does that mean for our imagination? He then focused on how hands are very much a way that we exert control over the world.

After Julian’s cultural backdrop, Nicolas showed some real (scientific) examples. He showed the “hello world” tweet of mind control (sent via EEG), a monkey operating a robotic arm with its brain and a weird device made by Neurowear:

[youtube=http://www.youtube.com/watch?v=w06zvM2x_lw]

There are basically two ways of creating this type of interface:

  • By implanting sensors directly (and invasively) into the brain. This approach is used a lot in research on helping people with disabilities.
  • By using non-invasive measurements such as EEG or fMRI. We are getting better at interpreting the data that comes out of these devices (see the sketch after this list).
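
To make the non-invasive route concrete, here is a minimal sketch, my own illustration rather than anything shown in the talk, of how EEG output is often interpreted: estimate how much of the signal’s power falls in a frequency band and treat that as a feature. The sampling rate and cutoff below are hypothetical, and the “EEG” is simulated.

```python
# Illustrative sketch (not from the talk): interpret a single EEG channel
# by estimating the fraction of power in the alpha band (8-12 Hz), which
# tends to rise when a person relaxes with eyes closed.
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz; typical for consumer EEG headsets

def alpha_ratio(eeg_samples, fs=FS):
    """Fraction of total power that falls in the alpha band (8-12 Hz)."""
    freqs, psd = welch(eeg_samples, fs=fs, nperseg=fs * 2)
    alpha = psd[(freqs >= 8) & (freqs <= 12)].sum()
    return alpha / psd.sum()

# Simulated data: four seconds of a 10 Hz "alpha" rhythm buried in noise.
t = np.arange(0, 4, 1 / FS)
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(t.size)

# Hypothetical cutoff: a crude "relaxation detector" of the kind some of
# the applications below are built on.
print("relaxed" if alpha_ratio(eeg) > 0.3 else "not relaxed")
```

Band power like this underlies the relaxation-style applications mentioned below; real systems classify far richer features, but the shape of the pipeline is the same.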

There is a whole set of applications for which this can be used. Examples include gaming, spelling applications, 2D cursor control, relaxation tools, access to dreams/consciousness, brain training programs, brain-to-brain communication, a modern-day lie detector, mind-controlled whatever (see the Mind controlled parachute) or zen-like interfaces (like the PLX wave).

The interaction design space (or repertoire) that this opens up spans these dimensions:

  • Explicit versus implicit user interactions
  • Synchronous versus asynchronous interaction
  • Detection of cognitive states/brain activity
  • Stand-alone brain-computer interface (BCI) or BCI plus other physiological data (e.g. a heartbeat or turning your head)

There are a few problems:

  1. It is easy to measure somebody’s basic cognitive state, but it is very hard to reconstruct what that state means semantically.
  2. It will be hard to train users. They will have to learn a new vocabulary, and the feedback you get from most of these systems is hard to interpret directly.
  3. Signal versus noise: brain signals are weak and easily swamped by artifacts like eye blinks and muscle movement (see the sketch below).
  4. Taking context into account is hard. Most existing projects are currently done in the lab (the skateboard below is an exception!).

[vimeo http://vimeo.com/37232050]
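
Problem 3 deserves a concrete illustration. The sketch below, again my own and not from the talk, shows the two most common defenses: band-pass filtering the raw signal and discarding epochs whose amplitude betrays an artifact such as an eye blink. Every cutoff and threshold in it is a hypothetical placeholder.

```python
# Illustrative sketch (not from the talk) of the signal-versus-noise problem:
# band-pass filter the EEG and reject epochs contaminated by large artifacts.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # sampling rate in Hz

def bandpass(eeg_samples, low=1.0, high=40.0, fs=FS, order=4):
    """Keep roughly the 1-40 Hz range where most EEG activity of interest lives."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg_samples)  # zero-phase filtering

def reject_artifacts(epochs, max_amplitude=100.0):
    """Drop epochs whose peak amplitude (in microvolts) looks like an artifact."""
    return [e for e in epochs if np.max(np.abs(e)) < max_amplitude]

# Simulated data: four one-second epochs, one contaminated by a huge "blink".
epochs = [np.random.randn(FS) * 10 for _ in range(4)]
epochs[2][100:120] += 500.0  # simulated eye-blink artifact

clean = [bandpass(e) for e in reject_artifacts(epochs)]
print(f"kept {len(clean)} of {len(epochs)} epochs")
```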

There are important questions to ask about the future. We need to build an interaction design perspective and raise design questions, not only address technological problems. What’s the equivalent of the blue screen of death for brain-controlled interfaces, and what will happen to social norms in the long run?