Logitech last month became the first company to unveil a set-top device running Google’s new Android-based TV platform, followed by Sony with a host of slick new HDTVs and a Blu-ray disc player. All of these devices are powered by a chip originally designed for computers, but customised for consumer electronics to enable 1080p video, Dolby 7.1 surround sound, DTS and more. And while there have been connected TVs for some time, Google TV is the first to marry the Internet with broadcast TV in an entirely new way.
According to analysts, something on the order of 300 million digital televisions ship every year. That’s about double the number of PCs sold in 2009, so it’s a big market for all the companies involved, and growing.
Speaking to Computer World, analyst Rob Enderle predicted that in the right format and with the right usage model and customer experience, people could adopt Google TV. “Folks probably won’t be doing much browsing on their TV,” he said. “But consuming Internet media? Certainly. Up until now it has been too difficult for most to do that.”
In an interview with PC Magazine, Creative Strategies research analyst Tim Bajarin said: “If you just have the TV emulate the PC experience, then I think that approach will fail. On the other hand, if you turn the various Internet sites that might work on a big screen into channels, with viewing at the heart of the experience, and deliver an experience that consumers are used to on a big screen, then the chance of success is better.”
The concept of bringing the Internet to the TV set has been a hot topic for over a decade, Bajarin said, but only now are we “starting to see products that deliver the promise in a way that consumers may actually find interesting”.
Learning what works, and doesn’t, is one of the lessons of the Viiv PC platform, said Brian David Johnson, Intel Labs futurist and author of Screen Future: The Future of Entertainment, Computing, and the Devices We Love.
Viiv was Intel’s effort to grow the market for PC-based TV and consumer living room experiences, based on a collection of Intel chips and technologies and Windows Media Center.
From a user perspective “we failed with Viiv,” Johnson said. “We learned a lot of things, the most important being we tried to turn the television into a computer.”
Intel’s Gary Palangian, a Google TV program manager in the Digital Home Group, agrees that Viiv was about putting a PC in the living room, while smart TV is about putting your Internet on the TV.
He pointed out, for example, that Viiv was powered by PC chips, while the Intel Atom CE4100 was designed specifically for consumer electronics, offering home theatre-quality audio/video performance, signal processing, surround sound and 3-D graphics in a very small package whose low power consumption also enables fanless designs.
Another key factor in the attempt to get a better grasp of what users want was a research project led by Bell’s group called “The Social Lives of Television”.
Johnson and a team of Intel anthropologists and ethnographers visited hundreds of people in their homes in India, the UK, the US and China to learn how they engaged with their TVs so that Intel could better understand what consumers actually wanted.
“When we started working on the concept 4 years ago, we figured the No. 1 thing people would want in the future is movies-on-demand,” Johnson said. “But our focus groups revealed that what people really wanted on their TVs was Internet access. People saw the Internet as a way they could get whatever they wanted on demand. Watching what they wanted, when they wanted it, and where they wanted was a profound and liberating experience.”
Intel and many other companies are betting big that this knowledge will pay off as television evolves and becomes more PC-like, but without all the trimmings of the traditional PC experience.
Intel President and CEO Paul Otellini recently predicted that “TV is about to change more in the next year than it has in the last 50.” For that to happen, according to analysts and industry gurus, it will need to be an entirely different kind of computing experience.