Giving Your Computer The Finger


John Underkoffler explains the human-computer interface he first designed as part of his advisory work for the film Minority Report. The system, called "g-speak", is now real and working. Note the gloves Underkoffler is wearing. (Image from Wikipedia)

For as long as we've had personal computers, we've been trying to decide how we interact with these things. The mouse and graphical display were a major improvement over text-only input, though I know some of you out there will argue for the keyboard to this very day (you know who you are). Nevertheless, every time we come up with something we like in the way people and machines interact (what we call HMI, or human-machine interface), we decide that while it's okay, it still isn't quite right. And that's fine. As my wife likes to point out, if people like something, they'll want to change it; it's when they don't use it that you know you've lost them.

So it is with the humble mouse and the graphical user interface (which came out of Xerox PARC back in the 70s). It seemed like an awfully good idea . . . and it still is. When Apple released its own version of the graphical desktop inspired by Xerox, personal computing changed forever. Point here, click there, and magical things happen. Right-click and menus pop up, into which we dig ever deeper to make other things happen. The advent of clicking and dragging brought real-life cause and effect onto the desktop's two-dimensional space. Hold on to this virtual object and drag it to a new location, or deposit it into this virtual container, whether it be a trash can or a file folder. Introducing motion into an otherwise static environment enhanced human-machine interaction.
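For the curious, that cause and effect is easy to see in code. Here's a minimal sketch using Python's standard tkinter toolkit (the DragDemo class, the canvas size, and the coloured square are my own inventions, purely for illustration): a virtual object you can grab and drag around, driven by the same press, move, and release mouse events that underlie every desktop drag-and-drop.

```python
# A minimal click-and-drag sketch using Python's standard tkinter library.
# A coloured square on a canvas can be grabbed and moved with the mouse,
# showing the press -> motion -> release event cycle behind drag and drop.

import tkinter as tk

class DragDemo:
    def __init__(self, root):
        self.canvas = tk.Canvas(root, width=400, height=300, bg="white")
        self.canvas.pack()
        # Our "virtual object": a simple rectangle.
        self.item = self.canvas.create_rectangle(50, 50, 110, 110, fill="steelblue")
        self.last = None  # last known mouse position while dragging
        self.canvas.tag_bind(self.item, "<ButtonPress-1>", self.on_press)
        self.canvas.tag_bind(self.item, "<B1-Motion>", self.on_motion)
        self.canvas.tag_bind(self.item, "<ButtonRelease-1>", self.on_release)

    def on_press(self, event):
        # Remember where the drag started.
        self.last = (event.x, event.y)

    def on_motion(self, event):
        # Move the object by however far the pointer travelled.
        dx, dy = event.x - self.last[0], event.y - self.last[1]
        self.canvas.move(self.item, dx, dy)
        self.last = (event.x, event.y)

    def on_release(self, event):
        # Let go: the drag is over.
        self.last = None

root = tk.Tk()
root.title("Click and drag")
DragDemo(root)
root.mainloop()
```

Run it with any stock Python 3 install; no extra packages needed. The pattern is the same one a file manager uses when you drag an icon to the trash, just scaled down.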

In the movie adaptation of Philip K. Dick's "Minority Report", Tom Cruise stands in front of a virtual screen, manipulating the computer system with hand gestures: sifting through visions of future crimes, pulling this image here, setting that one aside, zooming in, pushing that one back, and looking for information on the individuals concerned. For techie geeks like me, that user interface was the real star of the show and, years later, what I remember most clearly about the film. Whatever its practical usefulness, it was seriously cool.

That gesture-based interface was designed by John Underkoffler, and it has since become an actual product called the "g-speak Spatial Operating Environment", developed by his company, Oblong Industries. Underkoffler also worked on other visualization and interface techniques, including holography and animation, while at MIT. For a really cool demonstration, and a fascinating talk by Underkoffler, visit ted.com and pop his name into the search field.

The idea of gesture-based systems is obviously an attractive one, because we keep exploring it. If you've seen "Iron Man 2", you'll recall that Tony Stark interacts with his own supercomputer via gestures, no special gloves required. In that natural setting, the idea behind the tech becomes downright sexy. But Stark doesn't just use gestures; he also talks to the system in an almost conversational way, issuing commands as thoughts pop into his head. The system reacts to his speech and actions organically, as though it were just an extension of himself, much like his Iron Man suit. Too fanciful for you? Scientists at Germany's Fraunhofer FIT have developed what might be called the next generation of gesture-based systems. Unlike Oblong's g-speak, this three-dimensional interface doesn't require any special gloves, just like Tony Stark's.

Ever since computers started coming into the hands of everyday users, we have been trying to reinvent the way people interact with these things. From inputting code via jumpers and switches, to keyboards, to the graphical UI that made Apple a household word (the company, not the fruit), it seems we can't ever find an interface we like. At least not for long. Most of us work happily (more or less) with a keyboard and mouse, but the combination is limiting, hence all these fascinating developments in human-machine interface (HMI) design. We want to touch, wave to, pinch, tap on, and talk to our machines. This is, I believe, part of the attraction of a computing marketplace increasingly dominated by ever-smarter smartphones, iPads, Android tablets, and the BlackBerry PlayBook. What could be more direct than touching in order to make things happen? It's natural. Reach out and touch.

From the humble mouse, to touch screens, to science-fiction ideas like artificial intelligences that respond naturally to our speech, to direct neural interfaces like those in the nightmarish world of The Matrix, we keep looking for other ways to interact with computers.

How about you, dear reader? Are you ready to just plug in? What's your favorite interface between human and machine?
