This will be the ultimate test, with every event slated to be streamed to every type of device for tens of millions of simultaneous viewers worldwide. But just how well informed we’ll be about the end-user experience – whether content, including ads, arrived smoothly at good resolution – is open to question.
No doubt, major failures will be picked up by the press, and we’ll all know to what degree the basic premise – that so much content could be transferred successfully over all types of access networks – has been borne out. But we won’t know whether wide-screen connected TVs consistently received HD-quality pictures or something that pales in comparison to what viewers saw on traditional broadcast, cable and satellite feeds.
Such measures of performance – down to reception on the smallest screens over potentially jammed cellular links in high-density coverage areas – remain one facet of the emerging broadband TV infrastructure that has yet to be implemented on anything approaching mass scale. Even purveyors of TV Everywhere services, who have long relied on quality assurance tools to track performance on their traditional pay TV conduits, have no idea what’s happening at the receiving end of streamed IP content.
This missing ingredient poses a significant challenge to all who believe broadband TV will generate new revenues or even supplant traditional TV. With so much to measure – consistency of acceptable picture resolution, timing of ads in the stream, performance of content security mechanisms, to name a few – ROI projections that don’t factor in the costs of quality assurance will have no credibility.
Over the past year, many suppliers of traditional video quality assurance platforms for the cable and IPTV markets have begun introducing means to give content distributors an end-to-end accounting of what’s happening in the esoteric realm of adaptive streaming over IP, where every few seconds each unicast link is fed a fragment of content matched to the bandwidth availability, format and adaptive player of the receiving device. While some aspects of quality assurance measurement in IP streaming mirror what’s done with traditional TV, especially as regards encoding performance, another realm of functionality unique to adaptive streaming requires all-new modes of measurement and analysis.
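To make the mechanism concrete, here is a minimal sketch – not any vendor’s actual implementation – of how an adaptive player might pick the bitrate for the next fragment from a ladder of available renditions, based on recently measured throughput. The ladder values, safety margin and simple averaging are illustrative assumptions; real players apply more elaborate smoothing and buffer-aware logic.

```python
# Minimal, illustrative sketch of client-side adaptive bitrate selection.
# The bitrate ladder and safety margin below are assumptions for
# demonstration, not taken from any particular player or service.

BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 6000]  # available renditions
SAFETY_MARGIN = 0.8  # only budget 80% of measured throughput

def select_bitrate(throughput_samples_kbps):
    """Pick the highest rendition that fits under recent throughput.

    throughput_samples_kbps: download rates observed for the last few
    fragments; a simple mean stands in for the smoothing real players use.
    """
    if not throughput_samples_kbps:
        return BITRATE_LADDER_KBPS[0]  # no history yet: start conservatively
    estimate = sum(throughput_samples_kbps) / len(throughput_samples_kbps)
    budget = estimate * SAFETY_MARGIN
    # Highest rung that fits the budget, falling back to the lowest rung.
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return candidates[-1] if candidates else BITRATE_LADDER_KBPS[0]
```

Because the player re-runs this decision every few seconds for each fragment, end-to-end quality assurance has to account for per-fragment delivery on every unicast link, not a single continuous stream – which is exactly what makes the measurement problem new.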
Many folks in cable are wary of reliance on adaptive streaming over the long haul, even though they’ve used the technology to stream content to connected devices in the early going. But as CableLabs CTO Ralph Brown notes in this month’s Source Code interview, none of the workarounds proposed so far is without significant downsides. Transcoding MPEG-2-delivered content at the media gateway for streaming over IP to devices in the home incurs processing costs and doesn’t address the need to get the content to devices outside the home. Reliance on cable-optimized technology offered through the PacketCable Multimedia protocol opens a new can of worms that may be no less worrisome than the alternative.
In point of fact, adaptive streaming, with an emerging standard known as MPEG DASH in the offing and legions of technology suppliers working to make the use of the technology as painless as possible, is here to stay and probably offers the best long-term route to success for premium service providers delivering content to IP devices. But the quality assurance question will have to be addressed. It’s an added cost burden, no doubt, but by embracing adaptive streaming and the quality assurance requirements it imposes from the outset, service providers will be better positioned to minimize costs and maximize performance as the migration to IP accelerates.
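For readers unfamiliar with the standard, a DASH presentation is described by an XML manifest, the Media Presentation Description (MPD), which enumerates the renditions a player can switch among. The stripped-down fragment below is purely illustrative – file names, durations and bitrates are hypothetical – but it shows where the bitrate ladder lives and, by extension, what a quality assurance system must reconcile against what devices actually receive.

```xml
<!-- Illustrative MPEG-DASH MPD; all URLs and values are hypothetical -->
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="static"
     mediaPresentationDuration="PT10M"
     minBufferTime="PT2S"
     profiles="urn:mpeg:dash:profile:isoff-on-demand:2011">
  <Period>
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <!-- One Representation per rung of the bitrate ladder -->
      <Representation id="low" bandwidth="800000" width="640" height="360">
        <BaseURL>video_800k.mp4</BaseURL>
      </Representation>
      <Representation id="hd" bandwidth="3000000" width="1280" height="720">
        <BaseURL>video_3000k.mp4</BaseURL>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```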
The suggestion here is to devote serious effort to weighing the solutions and mapping an approach to quality assurance that minimizes costs early on while ensuring all the bases are covered when broadband TV distribution goes mainstream. Fortunately, there are many more options to choose from today than there were six months ago.