The name of the game in technology today is ‘interaction’. This is the touchscreen generation, where interfacing with technology is an everyday reality. We’re not quite at the level of a science fiction movie yet, but those concepts are already in play: researchers and developers are working on models that would let us, the end users, live a science-fiction-like lifestyle where interacting with the technology around us is done with a simple wave of the hand.
We’ve seen simple devices like Wacom’s pen tablets, used by artists and designers alike to create works of art or shape our various environments. Tablet PCs have allowed those on the go to interact directly with a computer without breaking their pace, and the touchscreen mobile phone lets us tap into the vastness of the internet, or work out a simple 2 + 2, without pressing a physical key. Touch technology has come a long way since the tablet PC and the iPhone, and that’s saying quite a lot considering the latter still reigns at the top of any touchscreen device list.
The day is fast approaching when we’ll be able to pull Tom Cruise’s Minority Report-style moves with our home computers, making technology even more intimately interactive. We’ve already seen Microsoft’s Surface, an interactive table that brings multi-touch functionality right to a table top. Think of it as a rather large touchscreen mobile, albeit not quite pocket-sized, of course.
Motorola is working with researchers to develop Anywhere Touch technology that would let users turn any surface they like into a touchscreen interface, complete with multi-touch support. They call it ‘tactilizing’ a surface. The software leverages acoustics, analyzing the sound waves emanating from the point of a touch to pinpoint its location and turn virtually any object into a touch device. The claim is that it’s cost effective as well as intuitive enough to be made affordable. A similar idea is the virtual keypad, which uses a micro projector to display a virtual keyboard on any flat surface; users tap the surface as if it were a standard keyboard and the results appear on screen. The device connects to a PC or even a mobile handset via Bluetooth.
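To get a feel for how acoustic touch localization could work, here’s a rough Python sketch. Everything in it is an illustrative assumption, not Motorola’s actual method: four hypothetical sensors at the corners of a one-metre surface, a made-up wave speed, and a simple grid search over the time *differences* of arrival (since the moment of the tap itself is unknown).

```python
SPEED = 2000.0  # assumed acoustic wave speed in the surface material (m/s)

# Hypothetical sensor positions (metres) at the corners of a 1 m x 1 m surface
SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

def arrival_times(point):
    """Time for the wave to travel from a touch point to each sensor."""
    return [((point[0] - sx) ** 2 + (point[1] - sy) ** 2) ** 0.5 / SPEED
            for sx, sy in SENSORS]

def locate(times, step=0.01):
    """Grid search for the point whose predicted arrival-time differences
    best match the measured ones (absolute tap time is unknown)."""
    measured = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    steps = int(1.0 / step) + 1
    for i in range(steps):
        for j in range(steps):
            p = (i * step, j * step)
            pred = arrival_times(p)
            # Compare time differences relative to the first sensor
            err = sum((m - (t - pred[0])) ** 2
                      for m, t in zip(measured, pred))
            if err < best_err:
                best, best_err = p, err
    return best

# Simulate a tap at (0.30, 0.70) and recover its position from timing alone
touch = (0.30, 0.70)
estimate = locate(arrival_times(touch))
```

A real system would face noise, echoes, and dispersion in the material, but the core geometry, triangulating a tap from when its sound reaches each sensor, is the same.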
Researchers at MIT have showcased an even more advanced form of interactivity, much like what Minority Report featured. In fact, Hiroshi Ishii of MIT’s Media Lab was responsible for the concept system shown in that very movie. The touchscreen generation is slowly moving into the world of gesture control: MIT researchers have devised a concept that uses optical sensors placed behind an ordinary LCD panel to track hand movements in front of it. The sensors feed those movements to the computer, letting users manipulate on-screen data simply by moving their hands around. We’ll soon be able to slide, push, and zoom into or out of data, pulling it all, quite literally, out of thin air.
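Once a system can track hand positions frame by frame, turning them into gestures is largely geometry. Here’s a minimal, hypothetical Python sketch of one such gesture, the two-handed zoom: if the tracked hands move apart between frames, zoom in; if they move together, zoom out. The threshold and the "hold" dead zone are illustrative assumptions, not MIT’s implementation.

```python
def distance(p, q):
    """Euclidean distance between two tracked hand positions."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def zoom_factor(prev_pts, curr_pts):
    """Ratio of hand separation between frames: >1 means hands moved apart."""
    return distance(*curr_pts) / distance(*prev_pts)

def interpret(prev_pts, curr_pts, threshold=0.1):
    """Classify the frame-to-frame change as a zoom gesture or a hold."""
    f = zoom_factor(prev_pts, curr_pts)
    if f > 1 + threshold:
        return "zoom in"
    if f < 1 - threshold:
        return "zoom out"
    return "hold"
```

For example, hands at (0, 0) and (1, 0) spreading to (0, 0) and (2, 0) doubles the separation, which this sketch reads as "zoom in". Slides and pushes fall out of the same idea, comparing positions across frames rather than separations.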
Microsoft’s Project Natal, showcased earlier this year, will use similar technology, taking the interactivity of the Nintendo Wii to a whole new level in the gaming industry. Where the Wii relies on motions detected by controllers held in the hand, Natal is designed to map the user’s entire body and replicate it in the game. You’ll be able to move in any direction and make any gesture to control your on-screen character.
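At its simplest, mapping a tracked body onto a game character means transforming each tracked joint from sensor space into the game’s coordinate space. This tiny Python sketch is purely illustrative (the joint names, scale, and offset are made-up assumptions, not Natal’s actual pipeline):

```python
def mirror_pose(joints, scale=1.0, offset=(0.0, 0.0)):
    """Map sensor-space joint positions onto avatar-space coordinates.

    `joints` is a dict of joint name -> (x, y) in sensor space;
    the result uses the same names in the game's coordinate system.
    """
    return {name: (x * scale + offset[0], y * scale + offset[1])
            for name, (x, y) in joints.items()}

# A tracked head at 0.5 m across, 2.0 m up, mapped to a 100-pixels-per-metre avatar
pose = mirror_pose({"head": (0.5, 2.0)}, scale=100)
```

A real system tracks dozens of joints in 3D at every frame, but the principle is the same: whatever the body does, the avatar does.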
The evolution of human-technology interfacing may still have a long way to go before it looks and feels exactly the way science fiction writers have led us to imagine. But when you think about it, it doesn’t seem like that long a wait.