Quality Assurance Is Moving Multiscreen into Mainstream

Marty Roberts, SVP, sales & marketing, thePlatform

November 21, 2012 – A growing number of distributors of high-value video are taking a crucial step toward moving multiscreen services into the mainstream by embracing a variety of approaches to measuring and maintaining quality of experience.
 
Given uncertainties about ROI and business models, over-the-top and pay TV providers have been reluctant to invest in costly solutions that would give them the same level of quality assurance that has become a mainstay of cable, telco and satellite TV operations. But now, with consumers making clear there’s high demand for a premium TV experience on connected devices of every description, the stakes are too high to put off facing the quality assurance challenges posed by the adaptive rate (AR) streaming mode of distribution over IP networks.

“Miranda [Technologies] is really happy we made the investments in products for this space,” says Mitchell Askenas, director of business development at Miranda, which was recently acquired by cable hardware supplier Belden, Inc. “We were trying to get ahead of the curve, and it looks like it’s paying off.”

Miranda has quietly brought AR quality assurance (QA) into the mix of QoE and QoS issues addressed by its iControl QA system over the past year. Data generated by the functions added to handle the complexities of AR are analyzed in concert with data gathered from Miranda’s probes and other network resources to pinpoint and diagnose video performance issues across the network, resulting in the same level of quality assurance for AR-streamed content that can be achieved for traditionally delivered pay TV.

OTT suppliers have been especially focused on the QA issue as they seek to provide content useful for TV Everywhere services offered through their multichannel video programming distributor (MVPD) affiliates, Askenas notes. Moreover, programmers are starting to sell advertising inventory unique to the multiscreen streams, which makes QA essential, he adds.

“We’re also seeing a lot more interest from cable operators,” he says. “They want to know what each user experience looks like rather than simply relying on packet analysis.” The reference is to the difference between traditional deep packet inspection (DPI), which measures packet losses and delays, and deep content inspection (DCI), which looks at what’s happening within video frames and across frame sequences.

“This year has been a time for learning about the options for our customers,” Askenas says. “I think next year is when we’ll see significant sales.”

A New Bellwether

One important bellwether of the trend is the recently announced decision by white-label video publisher thePlatform to provide analytics capabilities distributors can use to turn raw data coming in from different points of the network into a coherent picture of what’s going on. “Our customers are using adaptive streaming, because it delivers a better overall experience for users accessing content over broadband networks,” says Marty Roberts, senior vice president of sales and marketing at thePlatform. “But with the variations in degrees of quality from different CDNs based on the different ways they handle streaming modes like HLS (Apple’s HTTP Live Streaming), Smooth (the Microsoft streaming mode) and Adobe’s HDS (HTTP Dynamic Streaming), they need to be able to analyze the overall experience their customers are getting from all these suppliers.”

That’s a tall order. “Tracking and managing QoS around AR is a little bit harder than it is for traditional modes of distribution,” Roberts says. “There are different formats and protocols and different encryption schemes, so there are technical challenges to monitoring and understanding what the quality is for each user experience.”

To address these challenges, thePlatform has partnered with Conviva, whose Conviva Viewer Insights video analytics capabilities will be offered as an integrated component of thePlatform’s mpx video publishing system, providing an additional layer of dynamic reporting within the mpx console at no extra cost to customers. This will allow publishers to quickly access real-time statistics related to the consumer experience, engagement and the relationship between high-quality viewing and audiences, Roberts says.

Now that AR has been widely embraced as the distribution mode for TV Everywhere on the part of big pay TV operators and media companies, maintaining QA “boils down to being good business for them,” Roberts notes. “We’ve seen data showing that if a video takes longer than two seconds to start streaming you will see a real drop-off in viewership. Assuring good user experience keeps viewers engaged for longer viewing times, resulting in more ad avails to support monetization as well as higher viewer satisfaction.”

Intrinsic to the partnership are the device-end data-gathering capabilities of the new default plug-in installed with the video players thePlatform provides. “Data from the user experience on each device is piped back to the Conviva servers for analysis and displayed in our console to give our customers a good understanding of what’s going on,” Roberts says. “It’s delivered in a standard report, so there’s no need for customization.”

Conviva’s analysis of this data displayed in mpx will allow content distributors to determine video performance and its impact on viewer engagement across multiple types of video players and streaming protocols, notes Conviva CEO Darren Feher. “The joint solution will allow thePlatform’s customers to see exactly what every single viewer experiences, at the precise moment it happens, providing actionable intelligence to enhance people’s viewing experiences and ultimately improve online video for both viewers and publishers,” Feher says. Metrics include audience-based quantification of the impact of video quality on viewer engagement and in-depth diagnostics of video quality issues across CDNs, CDN regions, ISPs and viewer host machines.

The partnership also facilitates upselling thePlatform’s customers to the Conviva Precision Video solution, which utilizes the flow of analytics data to optimize quality of experience by maintaining what Conviva calls “preemptive, seamless stream adjustments” on content as it’s delivered to each user. “Precision allows the client to make real-time decisions that optimize the quality of service,” Roberts says.

“The client says, ‘I’m getting a bad stream from this node in this CDN so let’s switch to another node or to another CDN,’” he explains. “The client doesn’t care what server it’s talking to. If one chunk happens to be coming from one server and the next chunk is from another CDN, that’s fine.”

The reference is to how AR employs a “pull” mode of distribution technology that is altogether different from the “push” mode of traditional digital TV. Every few seconds an AR-enabled device, referencing the bitrate options or “adaptation sets” listed for a given piece of content in a manifest file sent from an HTTP server, asks the server to send a fragment or chunk of streamed content at the optimal bitrate, depending on how much bandwidth is available at that moment and how much processing power the device has available for decoding the bit streams.
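
For readers who think in code, that chunk-by-chunk decision can be reduced to a few lines. The Python sketch below shows one plausible way a client might pick a rung from the manifest’s bitrate ladder; the bitrate values, safety margin and choose_bitrate helper are illustrative assumptions, not any vendor’s actual player logic.

```python
# Minimal sketch of pull-mode bitrate selection by a hypothetical AR client.
# The bitrate ladder and safety margin are illustrative assumptions, not a
# real HLS/Smooth/HDS implementation.

# Bitrate options ("adaptation set") advertised in the manifest, in bits per second.
MANIFEST_BITRATES = [400_000, 800_000, 1_500_000, 3_000_000, 6_000_000]

def choose_bitrate(measured_throughput_bps, device_decode_ceiling_bps,
                   safety_margin=0.8):
    """Pick the highest bitrate the device can sustain right now.

    measured_throughput_bps: throughput observed while downloading recent chunks.
    device_decode_ceiling_bps: highest bitrate the device's decoder can handle.
    safety_margin: keep headroom so momentary dips don't stall the buffer.
    """
    budget = min(measured_throughput_bps * safety_margin, device_decode_ceiling_bps)
    eligible = [b for b in MANIFEST_BITRATES if b <= budget]
    # Fall back to the lowest rung if nothing fits the budget.
    return max(eligible) if eligible else MANIFEST_BITRATES[0]

# Every few seconds the client re-evaluates before requesting the next chunk.
if __name__ == "__main__":
    print(choose_bitrate(measured_throughput_bps=2_600_000,
                         device_decode_ceiling_bps=6_000_000))   # -> 1500000
    print(choose_bitrate(measured_throughput_bps=9_000_000,
                         device_decode_ceiling_bps=3_000_000))   # -> 3000000
```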

The basic goal is to ensure video is streamed at the highest level of quality that can be sustained at any given time without dropping frames or triggering buffering interruptions in the flow. But AR introduces a wide range of processes, new to the premium television environment, that complicate the assessment of audio and video quality. And those processes vary from one streaming mode to the next.

Not only are there far more parameters to measure in the AR transcoding, fragmentation and distribution process; there are multiple points in the network where those processes can go wrong, extending from source encoders through origin servers and CDN caching points to all the different types of unmanaged IP-connected devices possessed by end users. Moreover, additional complexities associated with content protection and monetization make the achievement of premium service-level quality assurance all the more daunting.

In some respects Conviva’s Precision Video solution avoids these complications by simply switching the call for chunks from one HTTP server to another throughout the streaming session so as to achieve the best possible quality flow from the CDN tier in the network. But this leaves unaddressed any problems at origin servers or at the content sources that may be contributing to poor performance in the distribution network.
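
The kind of mid-session switching described here can be pictured with a simple client-side sketch; the CdnSelector class, its scoring rule and its thresholds are illustrative assumptions, not Conviva Precision’s actual algorithm.

```python
# Illustrative sketch of client-side multi-CDN chunk selection; the scoring
# rule and thresholds are assumptions, not Conviva Precision's actual logic.
from collections import defaultdict, deque

class CdnSelector:
    def __init__(self, cdns, window=5, max_avg_download_secs=2.0):
        self.cdns = list(cdns)                    # e.g. ["cdn-a", "cdn-b"]
        self.history = defaultdict(lambda: deque(maxlen=window))
        self.max_avg = max_avg_download_secs      # tolerable average chunk time

    def record(self, cdn, download_secs):
        """Log how long the last chunk from this CDN took to download."""
        self.history[cdn].append(download_secs)

    def next_cdn(self):
        """Prefer the CDN with the best recent average; skip ones over budget."""
        def avg(cdn):
            samples = self.history[cdn]
            return sum(samples) / len(samples) if samples else 0.0
        healthy = [c for c in self.cdns if avg(c) <= self.max_avg]
        candidates = healthy or self.cdns         # never strand the player
        return min(candidates, key=avg)

if __name__ == "__main__":
    sel = CdnSelector(["cdn-a", "cdn-b"])
    for t in (0.4, 0.5, 3.5):                     # cdn-a starts to struggle
        sel.record("cdn-a", t)
    sel.record("cdn-b", 0.9)
    print(sel.next_cdn())                         # -> cdn-b
```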

End-to-End QA Challenges

Systems designed to track sources of problems typically employ a combination of DPI and DCI techniques with probes positioned at different points, sometimes in conjunction with plug-ins at the device end such as thePlatform is providing. In some cases traditional DPI isn’t used, but virtually all participants in AR QA agree there needs to be a means of monitoring packet delivery to ensure an even rate that neither overloads the device buffer nor leaves too few buffered packets for smooth video rendering on the device.

This potential for jitter goes to the heart of the fragmentation process, where, when things are going well, a sequence of packets in the video stream is distributed in response to a device request every few seconds. QoS monitoring must be sensitive to which type of AR mode is in play, insofar as sequence durations vary by mode.

Some of the QoS monitoring process is a function of how the fragmentation server is performing; some pertains to what’s happening in transit from the fragmenter to the user device, and some of it is a matter of the time it takes for a device request to get to the fragmenter. Thorough QoS monitoring requires an understanding of what’s happening to interrupt smooth performance when such interruptions occur.
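
A simplified picture of the buffer side of that monitoring appears below; the watermarks, chunk duration and simulate helper are assumptions chosen purely for illustration.

```python
# Simplified buffer-occupancy check for an AR session; thresholds and the
# fixed chunk duration are illustrative assumptions.

CHUNK_SECONDS = 4          # nominal fragment duration for this streaming mode
LOW_WATERMARK = 6          # below this many seconds of buffered video, stalls loom
HIGH_WATERMARK = 30        # above this, the client is pulling faster than needed

def classify_buffer(buffered_seconds):
    if buffered_seconds < LOW_WATERMARK:
        return "underrun-risk"     # delivery or request latency is too high
    if buffered_seconds > HIGH_WATERMARK:
        return "overrun"           # chunks arriving faster than playback drains them
    return "healthy"

def simulate(arrival_intervals):
    """Track buffer depth as chunks arrive at uneven (jittery) intervals."""
    buffered = 10.0
    for gap in arrival_intervals:
        buffered += CHUNK_SECONDS - gap     # each chunk adds CHUNK_SECONDS of video,
        buffered = max(buffered, 0.0)       # playback drains `gap` seconds meanwhile
        yield round(buffered, 1), classify_buffer(buffered)

if __name__ == "__main__":
    # Steady delivery, then a jittery stretch where chunks arrive late.
    for depth, state in simulate([4, 4, 4, 9, 9, 2]):
        print(depth, state)
```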

QoE as measured by DCI techniques pertains to the full range of functionalities that determine what the user sees and hears, which means that some aspects of assuring acceptable QoE in the AR domain are the same as what’s required for QoE in traditional premium TV QA. DCI looks at the video stream on a frame-by-frame basis to identify problems in the encoding process such as blurring, blocking, tiling and splicing errors, taking into account the location and size of impairments as well as their duration and frequency. Gauging audio performance, of course, is also part of this process, now including measurement of loudness differences between programming and ads to ensure conformance with the Commercial Advertisement Loudness Mitigation (CALM) Act.

DCI, however, gets a little harder in the AR arena compared to legacy pay TV services that employ MPEG-2 compression. Whereas MPEG-2 applies encoding algorithms to fixed size macroblocks of pixels within each frame, H.264 encoding employs variable-sized macroblocks. This greatly complicates detection of tiling or macroblocking, which occurs when one or more image components within a frame are blurred or delivered as a single color block.
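
A toy example helps show why fixed-grid assumptions break down. The classic heuristic sketched below compares luminance jumps along an assumed 8-pixel grid with jumps elsewhere, which works reasonably for MPEG-2-style encoding but can miss artifacts when H.264’s variable partitioning shifts block edges off that grid; the grid size and scoring rule are illustrative assumptions.

```python
# Toy blockiness heuristic: compare luminance jumps at an assumed 8-pixel grid
# with jumps at other column boundaries. Works for codecs with a fixed grid
# (MPEG-2-style); variable H.264 partitions can land off-grid and evade it.
# Grid size and the ratio-based score are illustrative assumptions.
import numpy as np

def blockiness_score(luma, grid=8):
    """luma: 2-D array of luminance values. Returns the ratio of on-grid to
    off-grid horizontal discontinuities; values well above 1.0 suggest
    visible blocking aligned to the assumed grid."""
    diffs = np.abs(np.diff(luma.astype(float), axis=1))   # column-to-column jumps
    cols = np.arange(diffs.shape[1])
    on_grid = diffs[:, cols % grid == grid - 1].mean()    # boundaries of 8-px blocks
    off_grid = diffs[:, cols % grid != grid - 1].mean()
    return on_grid / (off_grid + 1e-6)

if __name__ == "__main__":
    # Synthetic frame: flat 8x8 tiles at distinct levels plus a little noise,
    # so on-grid edges dominate the score.
    tiles = np.random.randint(0, 255, size=(9, 12))
    frame = np.kron(tiles, np.ones((8, 8))) + np.random.rand(72, 96)
    print(round(blockiness_score(frame), 1))   # large ratio: block edges dominate
```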

Further complicating matters is the fact that, with AR transcoding, there are multiple streams for each piece of programming, each of which must be monitored for audio/video synching and for synchronization of ancillary content such as closed captioning. This extends to the need to assure proper synching of alternative-language audio or captioning when transnational content distribution is involved.

An important element of QoE that’s unique to AR is the need to go beyond QoS monitoring of streaming performance to track whether the fluctuations in bitrates driven by bandwidth availability stay within acceptable bounds. In other words, if bandwidth availability is persistently limited to the point where the AR system is sending out sub-par quality video, as might happen when an HD TV set persistently receives a stream at a rate that results in sub-par resolution, the user is not getting a good experience even though the QoS measures report that everything is fine.
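
One way to express that check is a session-level metric weighing time spent below a bitrate floor deemed acceptable for the device class; the floors and the time_below_floor helper below are illustrative assumptions, not an industry standard.

```python
# Session-level check of whether delivered bitrates met expectations for the
# screen in use, even when QoS looked clean. Floors are illustrative assumptions.

# Minimum bitrate (bps) considered acceptable per device class (assumed values).
ACCEPTABLE_FLOOR = {
    "phone": 700_000,
    "tablet": 1_500_000,
    "hd_tv": 3_000_000,
}

def time_below_floor(segments, device_class):
    """segments: list of (seconds_played, bitrate_bps) tuples for one session.
    Returns the fraction of viewing time spent below the device's floor."""
    floor = ACCEPTABLE_FLOOR[device_class]
    total = sum(sec for sec, _ in segments)
    below = sum(sec for sec, bps in segments if bps < floor)
    return below / total if total else 0.0

if __name__ == "__main__":
    session = [(120, 6_000_000), (300, 1_500_000), (180, 800_000)]
    # The same stream judged against different screens:
    print(round(time_below_floor(session, "hd_tv"), 2))   # 0.8 -> poor on an HD set
    print(round(time_below_floor(session, "phone"), 2))   # 0.0 -> fine on a phone
```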

The ability to verify performance of content protection mechanisms on AR premium content is also essential to the overall QA regime. Just as different types of devices operate natively with different types of AR fragmentation systems, they also come equipped to support different types of digital rights management (DRM) systems. This means that each fragment over each AR stream must be assigned an encryption key that works with the DRM client embedded in the device.

Premium service quality assurance will have to provide verification that the DRM processes are working. These include not just the encryption mechanisms with appropriate synching of keys to DRMs but also enforcement of usage policies tied to rights metadata and to authorizations assigned by back-office systems to individual users.
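
A toy illustration of that verification step might simply walk the fragments of each stream and confirm every fragment’s key maps to a DRM system the target device supports; the data model below is entirely hypothetical.

```python
# Toy check that every encrypted fragment can be unlocked on the target device:
# each fragment's key ID must map to a DRM system that device supports.
# The data model is hypothetical, for illustration only.

# Which DRM systems each key can be delivered through (assumed mapping).
KEY_TO_DRM = {
    "key-001": {"playready", "widevine"},
    "key-002": {"fairplay"},
}

DEVICE_DRM = {
    "ipad": "fairplay",
    "android_tablet": "widevine",
}

def verify_stream(fragments, device):
    """fragments: list of (fragment_name, key_id). Returns fragments whose key
    cannot be served to the device's DRM client."""
    drm = DEVICE_DRM[device]
    return [name for name, key in fragments
            if drm not in KEY_TO_DRM.get(key, set())]

if __name__ == "__main__":
    hls_fragments = [("seg0001.ts", "key-002"), ("seg0002.ts", "key-001")]
    print(verify_stream(hls_fragments, "ipad"))            # ['seg0002.ts']
    print(verify_stream(hls_fragments, "android_tablet"))  # ['seg0001.ts']
```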

The Miranda Solutions

“There are so many moving parts you must gather information from if you want to identify where the problems are in the network and how to fix them,” says Miranda’s Askenas. “QoS will tell you where something is going wrong, but if you have certain fixes in place like forward error correction or buffering mechanisms, it won’t tell you whether the customer is having a good or bad experience. And QoE doesn’t tell you whether you have a QoS problem.

“We rely on telemetry from network elements and the components Miranda provides to get to sources of problems,” he continues. “The first issue is to determine exactly what the source of the problem is, which requires correlating a lot of data. You might have an alarm saying something is wrong with an encoder, but you have to determine which of hundreds of streams an operator is delivering over the top are affected.”

The second issue is, “How do you verify whether the cause of the alarm is actually impacting customers? The priority is to concentrate on fixing poor customer experience.”

Then, he says, “Once you know there’s an impact on customer experience, you need to know exactly what that experience looks like. Now you know what the source of the problem is, what its significance is to end users and what precisely needs to be fixed to rectify the situation.”

And beyond all this, the data must be aggregated into reports that are useful at the management level. “You need an overall sense of the quality of your service, what your uptime performance looks like, whether you are fulfilling on your commitments to program suppliers, advertisers and subscribers,” Askenas says.
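
The triage sequence Askenas describes (find the source, map it to the affected streams, then rank by customer impact) can be pictured with a simple correlation sketch; the alarm topology and session records below are hypothetical, not iControl’s data model.

```python
# Sketch of the triage flow described above: correlate an element alarm with the
# streams it feeds, then rank by how many active sessions are actually degraded.
# The topology and session structures are hypothetical, not Miranda iControl's.

# Which OTT streams each encoder feeds (assumed topology).
ENCODER_TO_STREAMS = {
    "encoder-12": ["espn-hls-720p", "espn-hls-1080p"],
    "encoder-13": ["cnn-hls-720p"],
}

# Active sessions reporting QoE, keyed by stream (assumed telemetry).
SESSIONS = {
    "espn-hls-720p": [{"rebuffer_ratio": 0.09}, {"rebuffer_ratio": 0.00}],
    "espn-hls-1080p": [{"rebuffer_ratio": 0.15}, {"rebuffer_ratio": 0.12}],
    "cnn-hls-720p": [{"rebuffer_ratio": 0.01}],
}

def triage(alarm_source, rebuffer_threshold=0.05):
    """Return affected streams ordered by the count of degraded sessions."""
    impact = []
    for stream in ENCODER_TO_STREAMS.get(alarm_source, []):
        degraded = sum(1 for s in SESSIONS.get(stream, [])
                       if s["rebuffer_ratio"] > rebuffer_threshold)
        impact.append((stream, degraded))
    return sorted(impact, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    # An alarm fires on encoder-12: which streams, and who is actually hurting?
    print(triage("encoder-12"))
    # -> [('espn-hls-1080p', 2), ('espn-hls-720p', 1)]
```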

Miranda’s iControl relies on whatever sources of telemetry in the network can be used to perform analysis on QoS, whether that data is coming from DPI probes, routers, cable modems or devices. “There’s a tremendous amount of data to draw from; the trick is to aggregate and analyze it to provide an accurate and thorough measure of QoS end to end,” Askenas says.

“We’re not providing the DPI probes; our focus is on providing the data you need for a deep QoE analysis,” he adds. “We have probes that sit on the video network to go deep into the video and audio analysis of the frame. We look at the true customer experience by identifying things at the macroblock level like pixelation, frame freezes and black spaces in the video, audio issues, performance of closed captioning, whether the metadata is included in the stream.”

One set of probes, the firm’s Kaleido multiviewers, sits beyond the cache points to deliver DCI readings on all the content flowing out of the local cache. The other probes, part of the firm’s Densité infrastructure equipment, look at encoder and origin server outputs, providing a view beyond QoS measures to catch encoding problems that QoS monitors can’t detect.

The QoE analysis also covers critical aspects of ad performance. “We can use fingerprinting technology to coordinate with the ad schedule and determine if the right ad is playing out,” Askenas says. “Our iControl fingerprinting process is built into our hardware. The ad insertion management module is part of the video management system and collects data from various elements in the network. It sits on top of all the functionalities, including the QoE mechanisms as well as fingerprinting readouts, to correlate and provide a clear view of what’s going on with ads.”
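
The matching of fingerprints against an ad schedule boils down to a comparison like the one sketched below; the fingerprint values and schedule format are stand-ins, not Miranda’s actual fingerprinting output.

```python
# Toy reconciliation of observed ad fingerprints against the traffic schedule;
# fingerprint values and the schedule format are stand-ins for illustration,
# not Miranda iControl's fingerprinting output.

SCHEDULE = [
    # (avail start in seconds from airtime, expected ad fingerprint)
    (600, "fp-cola-30s"),
    (1200, "fp-auto-15s"),
]

OBSERVED = {
    # avail start -> fingerprint actually detected in the output stream
    600: "fp-cola-30s",
    1200: "fp-insurance-30s",   # wrong creative played out
}

def verify_ads(schedule, observed):
    """Return a list of (avail_start, expected, observed) mismatches."""
    problems = []
    for start, expected in schedule:
        got = observed.get(start)
        if got != expected:
            problems.append((start, expected, got))
    return problems

if __name__ == "__main__":
    for start, expected, got in verify_ads(SCHEDULE, OBSERVED):
        print(f"avail at {start}s: expected {expected}, saw {got}")
```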

Tying all these capabilities into QA on AR streams, Miranda also adds AR-specific metrics having to do with things like fragmentation and buffering. “We’ll abstract an alarm that might be saying there’s too much fragmentation happening on an ESPN stream and analyze whether those fluctuations are really impairing the viewer experience,” Askenas notes. “We’ll look at whether encoders are over-feeding device buffers.”

Rather than relying on its own client software to obtain data from devices, Miranda intends to tap telemetry feeds intrinsic to the players running on connected devices. “The players wrap the decoding with the infrastructure, so we can use iControl to collect and correlate those statistics,” he says. “We aggregate that data with everything else to see where the problems are and what needs to be done to fix them.”

Belgacom, DISH and Other Initiatives

Another supplier reporting rising demand for AR QA solutions is Paris-based Witbe, which recently added Belgacom, the incumbent telecom operator in Belgium, to the list of service providers using the firm’s Multiscreen Quality Manager solution. Employing what it calls “QoE Robots,” Witbe’s platform uses connections to Belgacom set-top boxes in several Belgian cities to log onto Belgacom’s TV Everywhere portal. The robots use Belgacom’s TV Everywhere app for iOS and Android to “watch” live TV and order on-demand content across all devices, explains Witbe president Jean-Michel Planche.

“Delivering multiscreen video services can be tricky as one does not control the networks nor the devices used to watch video streams,” Planche notes. “Controlling the quality of experience is crucial to ensure success, protect brand reputation and secure revenues.”

Belgacom engineers, marketers and managers have access to analytics dashboards reporting KPIs (key performance indicators) such as channel change time, video stream quality, portal responsiveness, delay in launching the app and logging into the portal, success ratio when buying on-demand content, etc. KPIs are available per device type and geographic location, enabling management to focus troubleshooting actions and measure the impact of infrastructure investments on the quality delivered.
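
Rolling such measurements up into per-device, per-city KPIs is straightforward to sketch; the measurement records and KPI names below are made up for illustration, not Witbe’s actual reporting schema.

```python
# Sketch of rolling QoE-robot measurements up into per-device, per-city KPIs;
# the measurement records and KPI names are made up for illustration.
from collections import defaultdict
from statistics import mean

MEASUREMENTS = [
    # device, city, channel-change time (s), VOD purchase succeeded?
    {"device": "ios", "city": "Brussels", "zap_secs": 1.8, "vod_ok": True},
    {"device": "ios", "city": "Brussels", "zap_secs": 2.4, "vod_ok": False},
    {"device": "android", "city": "Antwerp", "zap_secs": 1.1, "vod_ok": True},
]

def kpi_rollup(measurements):
    """Group measurements by (device, city) and compute simple KPIs."""
    groups = defaultdict(list)
    for m in measurements:
        groups[(m["device"], m["city"])].append(m)
    report = {}
    for key, rows in groups.items():
        report[key] = {
            "avg_zap_secs": round(mean(r["zap_secs"] for r in rows), 2),
            "vod_success_ratio": sum(r["vod_ok"] for r in rows) / len(rows),
        }
    return report

if __name__ == "__main__":
    for (device, city), kpis in kpi_rollup(MEASUREMENTS).items():
        print(device, city, kpis)
```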

At the start of the year Witbe contracted to supply its QA solution for DISH Network’s new broadband TV Everywhere service, marking one of the largest deals yet publicized in the AR space. Using Witbe’s QoE Robots, DISH can evaluate service availability, measure application performance, check content integrity and measure perceived quality of video streams delivered to computers, smartphones and tablets, Planche says.

The robots run continuous tests on the DISH broadband feed by replicating user actions on end users’ devices through Wi-Fi connections or 3G/4G cellular networks, Planche explains. The robots interface with multiple types of devices operating in AR or other modes to log into servers, browse program guides, watch live and on-demand TV, configure and access DVR recordings and more.

Witbe, with 12 years’ experience in Europe and two in North America, has other, unannounced North American customers as well, including Comcast and Cogeco of Canada. Comcast is using the platform to run tests of its multiscreen service with the iPad and other connected devices while Cogeco is running set-top box tests, Planche says.

Declaring that the “classical market for probes has hit the wall,” Planche describes the Witbe QoE Robot platform as the source of comprehensive intelligence distributors require to run premium services in the user-centric IP services environment. “In the IP world you can have a good backbone and bad service or a bad backbone and good service,” he says. “QoS without correlation with what the user is seeing is of little value.”

But that’s not to say QoS is not important to the value of what Witbe brings to the table. At the analytics level Witbe correlates the QoE intelligence gathered by its probes with QoS metrics from other sources to precisely identify the nature and sources of problems. “Our technology is to understand the quality of the content the operator is delivering at each strategic point of the network, from the point of ingestion, across the backbone and over the last mile,” Planche says.

While end-to-end QA is the ultimate goal, operators can start slowly with implementation of the Witbe platform to begin gaining control over the AR experience as they explore where they want to go with multiscreen services. “We have different small operators, such as telecom operators in small states like Monaco and Macao, where we can do clever things with just a few robots,” he notes.

Whatever level of penetration a provider wants to reach, the Witbe approach to AR QA does not constitute a big investment, he adds. “We’re delivering information they never dreamed they could get with such a small investment,” he says.

As a growing number of vendors offer solutions to bolster QA, the ecosystem in general is moving in the direction of ever better performance metrics. As thePlatform’s Marty Roberts notes, now that AR is “table stakes” there’s general recognition that an old saw holds for the multiscreen domain much as it has for any other aspect of network service operations: “You can’t improve what you can’t measure.”

Notably, he adds, CDN suppliers are now generating QoS metrics that can be fed into analytics frameworks like the one thePlatform is leveraging from Conviva. “Akamai is the best example of a CDN supplier with robust analytics tools for measuring QoS experience,” he says. “But all of them have some level of QoS metrics.”