Developers are always finding more intuitive and natural ways for people to interact with their devices. ABI Research has released a report stating that the latest technology turning heads is gesture recognition for smartphones and tablets.
Pantech began selling its Vega LTE handset with gesture recognition technology in Korea in November last year. The report predicts that by 2017, 600-million smartphones will use gesture recognition technology.
This is merely the next step for user interfaces. What I can't quite understand, though, is the appeal of hands-free interaction with a handset you're already holding. A little ironic? It's proposed that tablets will make heavier use of the technology, and at least that makes sense.
Gesture recognition is by no means new, but it hasn't been adopted by the big players yet, which suggests it hasn't been seen as a viable interface until now.
There are a couple of reasons for this. First, it's costly to develop, particularly when it comes to choosing a tracking method: camera, infrared, or ultrasound. Second, battery power is an issue. With every technology update, battery technology has needed to keep pace, or the update would not hold up.
So from button pressing to touchscreens, voice interaction to gesture, what could possibly be next? Mind control seems the logical step. I wonder when that will become a reality.