Microsoft’s Kinect points the way to intuitive interfaces
Microsoft’s Steve Ballmer announced a few weeks ago that the Xbox 360 Kinect had sold over 8 million units in its first 60 days. Microsoft has a hit with Kinect, and the product deserves the kudos it has generated. In case you missed it, Kinect is a controller-free gaming sensor for the Xbox 360 that lets players control the game they are playing through movement. And this technological marvel is just the beginning of more natural ways for people to interact with computing.
The Kinect interface for the Xbox 360 is brilliant. Microsoft describes it as a “controller-free gaming and entertainment experience.” Think Wii without the wand. The interface works with nothing but speech and movement.
We were fortunate to do much of the work on Kinect in the lead-up to its release, and it was a really fun product to work on. We knew MSFT had a hit coming, as Kinect really is a magical experience. No matter how sophisticated you are with technology, during your first two minutes with Kinect you will resemble a monkey looking into a mirror for the first time.
You peer intently at the image on the screen, raise an arm and see the arm on the screen go up too. You raise a leg and the avatar does too. You do weird little dance moves and watch the avatar do them too. You get a silly grin on your face and say “cool!”
Brian Eno famously said in Wired magazine over a decade ago that “there should be more Africa in computing.” Africa has now come to computing, and this is just the beginning. Gesture interfaces will become a common computing interface, letting us do easily many activities that are very hard with a mouse and keyboard. Dance and fitness games are early examples. Much more will follow, and some dazzling hacks are coming to light as people experiment with the technology. A recent CNET article suggests that Microsoft may actually be encouraging hacks.
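To make the idea of a gesture interface concrete, here is a minimal sketch of the kind of check a gesture-driven application might run on each frame of skeleton data. The joint names, coordinate convention and sample values are assumptions for illustration only, not the actual Kinect SDK; any skeleton-tracking library could feed a function like this.

```python
# Hypothetical gesture check on skeleton joint positions, as a depth sensor
# might report them. Joint names and coordinates are illustrative assumptions.

from typing import Dict, Tuple

Joint = Tuple[float, float, float]  # (x, y, z) in metres, y pointing up


def hand_raised(skeleton: Dict[str, Joint], margin: float = 0.10) -> bool:
    """Return True if the right hand is held clearly above the head."""
    hand_y = skeleton["hand_right"][1]
    head_y = skeleton["head"][1]
    return hand_y > head_y + margin


# One fabricated frame of joint data, purely for illustration.
frame = {
    "head": (0.0, 1.60, 2.0),
    "hand_right": (0.3, 1.85, 2.0),
}

if hand_raised(frame):
    print("Gesture recognised: hand raised")
```

The point is how little stands between raw body-tracking data and a usable command: once the sensor gives you joint positions, a gesture is just a small rule over them.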
Gesture interfaces are only the beginning of a host of more natural, intuitive user interfaces. Voice interfaces will get much better, and mobile phone users will find it far easier to interact with their devices while on the move. Think of the reduction in the accident rate if people could simply talk to their phones rather than peering at a small screen while driving to look up that long-lost contact. We are also seeing the first brain implants, enabling direct interfaces between the brain and hardware and spawning a field called BCI, or brain-computer interfaces. Experiments are already underway on using thoughts to control computing interfaces.
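As a rough sketch of the voice-interface idea, the snippet below maps a recognised phrase to an action such as dialling a contact. The recognise() stub stands in for whatever speech-to-text engine a phone would use; it is a placeholder assumption, not a real API.

```python
# Minimal voice-command dispatcher: turn a recognised phrase into an action.
# recognise() is a placeholder for any speech-to-text engine.

def recognise() -> str:
    """Placeholder for a speech-to-text engine; returns a transcript."""
    return "call alice"  # pretend the driver just said this


def dispatch(transcript: str) -> str:
    """Map a simple 'call <name>' phrase to a dialling action."""
    words = transcript.lower().split()
    if len(words) > 1 and words[0] == "call":
        return f"Dialling contact '{' '.join(words[1:])}'"
    return f"Sorry, I didn't understand: {transcript!r}"


print(dispatch(recognise()))
```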
Computing will become increasingly interesting as we move away from the mouse and the keyboard to more intuitive and even invasive interfaces. We will likely become much closer to our computers, even having computing interfaces integrated with our bodies and brains.