Content Ecosystem Archive

Quality Assurance Looms as Big Factor in OTT Biz Case

Kurt Michel, senior director, marketing, IneoQuest Technologies

IneoQuest Prepares to Introduce Solutions that Could Lead to a QA Currency


By Fred Dawson

December 2, 2015 – It looks like broadcasters and programming bundlers seeking to generate new revenues from online streaming will soon be able to attain the essential level of quality assurance that has long been the exclusive purview of network owners.

That’s a bold statement considering the hurdles that must be cleared to achieve end-to-end visibility into all the points where something can go wrong as the ABR (adaptive bitrate) “chunks” comprising each content session make their way over divergent paths to the end user. But IneoQuest Technologies, a leading supplier of service assurance solutions in the MVPD sector, says it can be done.

In the months ahead, IneoQuest intends to roll out a portfolio of products designed to help OTT providers minimize the performance glitches that are endemic to delivering video over unmanaged online infrastructure, says Kurt Michel, senior director of marketing at IneoQuest. “There’s a lot of attention being paid to developing the advertising currency for monetizing OTT, but there’s not a lot of attention being paid to how you achieve the quality assurance that was a given with broadcast TV,” Michel says. “If you want to build a real business on unmanaged infrastructure, that’s essential.”

IneoQuest is not alone in tackling the OTT quality assurance problem. For example, as previously reported, Tektronix has addressed the new quality control (QC) requirements faced by suppliers of file-based on-demand content to the proliferating ecosystem of OTT outlets. The Tektronix post-production solution set includes a highly automated QC platform along with a multi-protocol playback tool enabling highly granular monitoring of files’ conformance to OTT distributors’ ABR and other specifications.

And other entities, such as CDN operator Akamai, where Michel was employed before moving to IneoQuest, and CDN technology supplier Edgeware, are providing insight into what’s happening at edge points where they have a presence. But, as Michel notes, there needs to be a solution that looks at and analyzes problems as they occur on a more ubiquitous basis, with the analytics clout to immediately pinpoint issues by triangulating data gleaned from origins, edges and end devices everywhere.

To get there IneoQuest is applying the expertise it developed for enabling telcos to achieve TV service-caliber quality assurance with packet-based IPTV technology. There the challenge was to aggregate data from the managed network and IPTV set-tops using advanced analytics tools to provide an end-to-end view of performance accessible to all stakeholders, including engineers, field technicians, CSRs and operations managers.

“Now, in the OTT space, we’re looking at different places in the network, as we did in the managed network, measuring at the headend or points of content origin and storage, the quality going in and coming out of the CDNs and measuring with software integrated into the app at the client device,” Michel says. “We can measure all those and correlate them in real time.”

But it’s hard to do in the unmanaged network environment. “In the packet network there are lots of different packets sent out over separate paths and recompiled on a packet-by-packet basis to create a continuous stream of video on the end user’s device,” he notes. “So trying to figure out what’s wrong is a complicated process.”

Collecting all the data needed to perform the analytics that can locate where a problem is occurring and how it is impacting the final end user experience is a multifaceted process.
The solution entails pulling data from a combination of appliance-based devices, virtualized cloud monitoring points and quality monitoring-enabled media players, Michel says.

Virtualization is crucial. “We’re doing it through NFV (network function virtualization) – virtual probes and virtual monitoring and correlation end to end,” Michel says. “We’re taking all the things managed network providers have always done with physical appliances and turning them into virtualized packages that can be deployed on the public cloud.”

To drive data to those virtual modules IneoQuest envisions that once content distributors have engaged with the IneoQuest cloud platform they will have the leverage to get their CDN affiliates and other suppliers to cooperate. “They will be able to say, ‘I want you to apply an API that allows me to access real quality metrics so I can see in real time what the content is doing,’” he says.
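
To make that idea concrete, here is a minimal sketch of what polling such a quality-metrics API might look like. The endpoint URL, response fields and thresholds are assumptions for illustration only; they do not reflect IneoQuest’s or any CDN’s actual interface.

```python
# Hypothetical example: polling a CDN partner's quality-metrics API.
# The endpoint, response fields and thresholds are illustrative assumptions.
import time
import requests

CDN_METRICS_URL = "https://api.example-cdn.com/v1/quality-metrics"  # hypothetical
API_TOKEN = "REPLACE_ME"

def fetch_edge_metrics(channel_id):
    """Return per-edge delivery metrics for one channel from the CDN API."""
    resp = requests.get(
        CDN_METRICS_URL,
        params={"channel": channel_id},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["edges"]  # assumed response shape

def flag_problem_edges(edges, max_error_rate=0.01, min_bitrate_kbps=2500):
    """Flag edge locations whose chunk error rate or delivered bitrate is out of bounds."""
    return [
        e["location"]
        for e in edges
        if e["chunk_error_rate"] > max_error_rate or e["avg_bitrate_kbps"] < min_bitrate_kbps
    ]

if __name__ == "__main__":
    while True:
        edges = fetch_edge_metrics("abc-east-hd")
        problems = flag_problem_edges(edges)
        if problems:
            print("Degraded delivery at:", ", ".join(problems))
        time.sleep(60)  # poll once a minute
```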

Another approach will entail use of passive and active monitoring. By passive, Michel means directing test streams to a single location in order to sample performance. Active monitoring entails use of robotic playback players in strategic locations that can monitor performance on a live stream.

However it’s done, the OTT distributor will be able to discern how much buffering is happening on a given stream, how bandwidth availability at different points is affecting the level of throughput chosen by the ABR process, whether ads are rendering where they’re supposed to in the program and all the other parameters that go into determining whether performance meets viewers’ and advertisers’ expectations. Using client software to monitor viewer behavior such as tuning out or channel switching will allow operations managers to correlate what’s happening upstream with viewer behavior to determine whether poor performance is producing negative audience results.
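
A minimal sketch of the kind of correlation described above, assuming a hypothetical client event schema and an arbitrary buffering threshold: sessions reported by the player are split into poor- and good-quality groups and their early-abandonment rates compared.

```python
# Minimal sketch: correlating client-side QoE data with viewer abandonment.
# The session schema and the 5% buffering threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Session:
    session_id: str
    buffering_ratio: float      # fraction of watch time spent rebuffering
    avg_bitrate_kbps: int       # average ABR rendition delivered
    abandoned_early: bool       # viewer tuned out before the program ended

def abandonment_by_quality(sessions, buffering_threshold=0.05):
    """Compare early-abandonment rates for poor-quality vs. healthy sessions."""
    poor = [s for s in sessions if s.buffering_ratio > buffering_threshold]
    good = [s for s in sessions if s.buffering_ratio <= buffering_threshold]
    rate = lambda group: sum(s.abandoned_early for s in group) / len(group) if group else 0.0
    return {"poor_quality_abandon_rate": rate(poor), "good_quality_abandon_rate": rate(good)}

sessions = [
    Session("a1", 0.12, 1800, True),
    Session("a2", 0.01, 4200, False),
    Session("a3", 0.08, 2400, True),
    Session("a4", 0.00, 4800, False),
]
print(abandonment_by_quality(sessions))
# e.g. {'poor_quality_abandon_rate': 1.0, 'good_quality_abandon_rate': 0.0}
```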

“We can do that and give power to the content owners,” Michel says. “Then I think we can get focused on improving quality.”

This, of course, raises the question of what can be done by the content owner to drive proactive action at points the distributor doesn’t control. Here the issue goes beyond what any one provider can do with individual suppliers to the larger question of ecosystem-wide participation in establishing benchmarks for performance and means of adjusting to meet those benchmarks.

“There needs to be third-party quality validation based on an established currency just as we have with Nielsen validation of ad viewing,” Michel says. “Quality is another currency that the OTT business has to be built on.”

As IneoQuest brings its solutions to the OTT market, the players will have what they need to begin establishing that currency. Such a currency would assign value to viewer behavior as measured against the benchmarks and then use that measure to categorize the quality and, ultimately, the value of the content stream.

There’s a long way to go before such standards come into view. But opening the door to getting there is a major step forward.

Disney/ABC Illuminates Vision For Transition to IP Operations

Mike Strein, director, television & workflow strategy, ABC TV

Executives Discuss How Virtual Master Control Facilitates New Business Strategies

Disney/ABC Television Group made headlines at the NAB Show in April with its decision to convert its entire post-production operation to IP-based “virtual master control” using technology supplied by Imagine Communications. The project has become an industry bellwether as to the feasibility of moving to software-defined virtualization.

By all accounts, the so-called “Columbus Project” requires a lot of work and tight cooperation between the vendor and its client to accomplish such a massive transition. The commitment reflects how essential this is to Disney/ABC’s ability to deliver on new business opportunities, including using direct-to-consumer OTT distribution to better engage consumers with their brands and make better use of their assets to develop niche “channels.”

At IBC in September key participants in the project publicly shared their views of what the transition to all-IP, IT-centric operations is all about. In the transcribed discussion that follows, Mike Strein, director of television and workflow strategy for ABC TV, Stephen Smith, CTO of cloud strategy at Imagine, and Tim Mendoza, vice president of product development for the playout family of Imagine products, explain the scope of the project and its ramifications for the broadcaster’s operations and business development.

Tim Mendoza – The future of virtualized master control is a tremendous topic. We started talking about it two years ago, and now we’re well on our way down the path of virtualizing our solutions, starting with VersioCloud. Mike, as one of our anchor clients at ABC in New York, what is virtualized master control?

Mike Strein – Master control is different at a TV network than it is at a station. At a station it’s a 24-hour operation continually running lists and inserting commercials and that sort of thing. At a TV network like us where we deliver to stations, it’s a time zone management sort of thing. We’ll do it by day part. We’ll run automation lists, which are all Imagine, formerly Harris, products, and insert the commercials on a time zone and day part basis.

Mendoza – How many cities is ABC in?

Strein – We have over 220 affiliate stations. We have a number of time zone feeds that we’re continually updating, fixing things as news gets updated and what not.

Mendoza – Project Columbus is what we’re working on. Your current infrastructure is traditional.

Strein – Columbus is the redesign of our TV network – a complete rebuild, a lot of it using Imagine’s virtualized processing.

Mendoza – I would say your master control is one of the hardest case studies of any master control on the planet. It includes every type of workflow.

Strein – Needlessly complex.

Mendoza – Steve, you’ve been here since day one of our virtualized strategy. What do you think virtualized master control holds for us?

Steve Smith – It’s really the aggregation of all the parts and pieces in a transmission chain. We’ve had that notion of integrated channel playout for quite some time now. But [virtualized master control] not only encompasses the graphics, branding, playback and automation aspects of the chain; there’s a lot more that happens upstream and downstream of playout: audio shuffling, live caption insertion, download normalization, [ATSC] encoding, decoding, Dolby E management. All the discrete parts of your workflow today that are solved traditionally with modular gear need to get virtualized and aggregated into this platform as well.

So when we’re talking about virtualized master control, we’re really talking about that complete origination chain being able to interact with inbound live feeds, interact with file-based content, insert commercials, do the live captioning as those feeds come through. So everything you use to work within a very dynamic and live or linear environment, all done in software.

Strein – And I might want to say we’re really tackling the non-live component first. Live is really the most complex part of that. So the automated playout of content, ingest of commercials and replay of that sort of content is much simpler to do in a virtual environment where you may have that virtual environment not co-located with your broadcast facility.

Mendoza – You brought up a really good point. It doesn’t have to be co-located with the broadcast facility. In the current, traditional days you have miles and miles of coax cable in your NOC. Now you can be hundreds of miles away. In fact, ABC is running 650 miles away.

Strein – We have two facilities. One is 700 miles away. The other is a backup facility in Las Vegas, and that’s 2,000 miles away. It’s kind of scary when you think about it, but it’s why you have multiple instances.

Mendoza – Do you own the facilities or is it an outsourced operation?

Strein – We own the one in North Carolina, which is 700 miles away. The Las Vegas site is a leased facility. That is owned by The Walt Disney Co., not ABC.

Smith – The relationship there is almost like you are contracting with a third party for cloud.

Strein – Yes.

Mendoza – Steve, do you see a lot of clients actually contracting with third parties, or are they doing it themselves in-house?

Smith – I think it’s a progression where, to begin with, to get your hands dirty with the technology and get a sense of what it means to move into a virtualized IP environment, customers kind of want to experiment with it internally just to understand what it is. They look for low-hanging fruit, things that are low risk but have value within the organization to try and cut their teeth on.

A simple one is DR (disaster recovery). You hope to never actually have to use it. So to start playing with the technology in that space is sort of an easy stepping-off point. But really the value in trying to move to software-defined workflow and cloud enablement is to get out of the physical plant that you own and to get into an infrastructure that is actually elastic, that you can scale up and scale down and take advantage of the resources that have been deployed not just for your broadcast day in, day out but for other business systems as well.

Strein – I think one of the key aspects to virtualizing is it can be co-located, but it doesn’t have to be. If you want to move it out, if you feel brave enough to do that, you can do that. But you can also build entirely within your own plant.

Mendoza – Currently you have eight feeds. Does virtualized master control give you the ability to increase those feeds pretty much immediately and take them down pretty much immediately?

Strein – Potentially, yes. You have the ability to spin up more instances. Our satellite distribution has not yet moved out of house, so if it goes to an off-site facility, it has to come back in to distribute it again. But that doesn’t mean it won’t happen sometime in the future as well.

Mendoza – So master control is not only with regard to playout but it also includes a whole slew of other workflows in the chain. One of them is graphics. Can you explain how Disney/ABC is working on the Brandnet project and what exactly that is?

Strein – Brandnet is our affiliate-based graphic-insertion model. We’ve had a couple versions of this. Brandnet 1 we put in place in 2005. Sort of ahead of the game a little bit. At that time it was a Harris IconStation platform, which sits at stations. We file distribute the content to all those boxes, and we in-band trigger the localized content, whether it’s time and temperature, news crawls, any sort of localized content for the stations. The Brandnet 2.0 is a replacement product for that based on the same Versio product we’re putting in Columbus and other areas within the company.

Smith – The Brandnet platform lets you keep that local feel in region while being able to control it from a central location.

Strein – Absolutely.

Mendoza – Why don’t you go through how other graphics can be [part of] a total easy workflow graphics solution [that] flows into a virtualized environment.

Strein – We’re going to be doing almost all our graphics in a nearly live environment. Graphic devices have traditionally been very expensive flashy devices that would sit downstream of playout. The limitations to what you could do were based on the amount of processing power available in that device.

If you look at how the data is actually injected into those graphic templates, most of the time it’s already available in the schedule and you’re pulling information from an RSS feed. But it’s not really that timely; it’s not frame accurate – not as if it’s being generated at the point where a frame is originated. So you can actually create these graphics in time or just in time, rendering them a few seconds ahead of time.

And if we take a backend approach we can actually farm the graphics out and use an [After Effects] render farm to render much more complex graphics with data injected very close to air. So you’re taking that out of the processing power you actually need in the device, because it doesn’t have to be done in real time. I can do it slower than real time. I can do it faster than real time. And then key that element in downstream.

Strein – And as that processing power [requirement] increases, you don’t necessarily have to push the content to the boxes at the stations. You can just have pointers that have the content maybe in the cloud or hosted environment. It gives you a lot more power. The capability of updating things like election results and whatnot is sort of what we’re looking at for this next generation.

Smith – Agility is the word.

Mendoza – Also, it goes beyond graphics and playout. It goes upstream a little bit for schedules as well. In a virtualized environment you can have your virtualized scheduling solution, and the actual playlist of the schedule is the exact same playlist as is being played out. So there’s no more XEP translation between the two playlists, or going back one generation it would be a flat file between the scheduling system and the playlist.

We’re talking real-time API calls between the playout playlist and the scheduling playlist. And the schedule is completely virtualized in the cloud or it can be a private data center. It can be accessed in Android, tablets, Macs, whatever it is. With that and the playout in the virtualized environment you have a complete solution that can be either virtualized in [a Microsoft] Azure, [an HP] Helion or Cisco cloud, or you can have it on bare metal and hosted in your environment.
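
As a purely illustrative sketch of that kind of real-time call, the snippet below pushes a late commercial change directly to a hypothetical playout playlist endpoint instead of exporting a flat file. The URL, payload fields and IDs are assumptions, not Imagine’s actual API.

```python
# Hypothetical sketch: pushing a late schedule change directly to a playout
# playlist over a REST API instead of exchanging flat files.
# Endpoint and payload fields are illustrative, not a real product interface.
import requests

PLAYOUT_API = "https://playout.example.net/v1/channels/ABC-East/playlist"  # hypothetical

def replace_commercial(event_id, new_asset_id, start_time_utc):
    """Swap the asset scheduled in an existing playlist event."""
    payload = {
        "event_id": event_id,
        "asset_id": new_asset_id,
        "start_time": start_time_utc,   # ISO 8601 timestamp
        "transition": "cut",
    }
    resp = requests.put(f"{PLAYOUT_API}/events/{event_id}", json=payload, timeout=5)
    resp.raise_for_status()
    return resp.json()

# A commercial sold minutes before air replaces the previously booked spot.
replace_commercial("evt-20151012-1834", "CM-48812", "2015-10-12T18:34:30Z")
```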

And then there’s the output with regard to this. Now it’s much more than just [a single] stream output. We’re dealing everyday with OTT providers and other types of distribution means not only in the U.S. for ABC but also globally. So how does this environment help us push that out into an OTT type of world?

Smith – The first point is that as we change the wire away from a piece of coax into a piece of Ethernet we can consolidate the business functions that run on that device. We’re shrinking down the footprint of the infrastructure. And that’s where some of the cost savings start coming in.

Once we get rid of that piece of coax, we’re no longer constrained to the aspect ratio of the frame size or the frame rate that SDI imposes on us. So from a single platform I can deliver a transport stream that can go to traditional free-to-air. I can target a cable plant. I can go to an earth station for satellite uplink. I can create adaptive bitrate streams for an OTT service. It’s all the same software platform, the same product, just configured to deliver different streaming outputs.
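
A hedged illustration of that “same platform, different outputs” point: one channel definition fanned out to several delivery targets purely through configuration. The structure and field names below are invented for this sketch and do not correspond to any real product’s schema.

```python
# Illustrative only: one software-defined channel configured for several outputs.
# None of these field names correspond to a real product's configuration schema.
channel = {
    "name": "ABC-East",
    "source": "ip://origin.example.net/abc-east",   # compressed IP contribution feed
    "outputs": [
        {"type": "mpeg-ts", "target": "udp://239.1.1.1:5000", "use": "free-to-air / cable plant"},
        {"type": "mpeg-ts", "target": "udp://10.8.0.5:6000",  "use": "earth station uplink"},
        {"type": "hls-abr", "target": "s3://ott-origin/abc-east/",
         "ladder_kbps": [6000, 3500, 1800, 800],              "use": "OTT service"},
    ],
}

for out in channel["outputs"]:
    print(f'{channel["name"]} -> {out["type"]} -> {out["target"]} ({out["use"]})')
```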

The general message here is it’s not just about changing the wire. The step into the IP domain is the first step that has to be taken before you can get to software-defined workflow and cloud enablement. So if you make that transition into the IP domain, you need to make sure you also remain focused on why you’re actually making that transition – to become more agile and be able to adapt to changes in business needs. I don’t think anybody thinks their business is going to be the same for the next ten years as it was for the last ten years. Our industry is changing, and your infrastructure needs to be able to adapt to those changes as they come.

Being in this software-defined world we can repurpose resources. You can sacrifice your transcode capability to launch new linear services; you can reduce linear service capability to do transcode or any other business function that may be in there. But it’s really around getting to the collective pool of resources that can be applied to many different business functions, instead of having this one business function tied to a single appliance.

Strein – Once your content is there you can repurpose it for whatever you want to do with it.

Mendoza – Everybody has a traditional type of plant. Is this something where you have to go head first into the pool, or can you phase it in over time?

Strein – I think it’s an overbuild, and you have to phase it in. At some point you throw a switch. For us it will be a challenge knowing when to do that. And, as I said earlier, the live component is the difficult part. But not all of our day part is live. So we’ll focus on the things that are achievable at first and slowly bolt the live part in.

Smith – Another easy stepping off point to get into the IP and software-defined world is as infrastructure ages and needs to be replaced, you replace it with something that not only has traditional capabilities but actually will move forward into the IP and software domain as well. So you see that case of picking some point in the operation where you’re going to start making the transition to IP, and then it will just bubble out from there.

Trying to boil the ocean and replace everything at once is going to be cost prohibitive, and starting at either end is going to introduce additional costs as well, because you have to keep on-boarding and off-boarding between the IP world and the SDI world. So you just pick a point and let it grow from there.

Mendoza – IP standards are very important. Help me understand: some vendors out on the floor are proprietary, while we’ve decided to go with pure standards-based IP solutions. What’s the difference?

Smith – You will find organizations that have decided that video on IP should still be treated as signal switching, that we’re still treating the IP plant as if it were a traditional heavy iron based on Raptor and it’s just the wire that’s changing.

If you take that approach, you can’t ever actually get outside of your own datacenter. You can’t use public Internet or anybody else’s infrastructure to transition between sites. As soon as you take that step into a proprietary network infrastructure, you are tied into that one space.

The approach we’ve taken is to partner with Cisco, partner with Juniper, partner with companies that are in the IP space where it is their core business to manufacture this IP technology and then layer software across the top of that. That lets us transition to any one of these manufacturers’ products, and therefore you’re not tied to any specific domain. If you’re a Cisco house, great; if you’re an HP house, great. Stay with whatever it is you have internally. The key point is, do everything in software and leave the hardware layer out.

Mendoza – From my standpoint it would be ludicrous to go to my CEO and ask for millions of dollars for new hardware platforms and compete against the hundreds of billions of dollars that are being spent on R&D by the likes of HP, IBM, Cisco, Juniper Networks, Brocade. We have to be very locked up with the IP IT industry. And from a Disney standpoint I’d think that would be a major driver as well.

Strein – Certainly. We want to leverage the enormity of the growth space in that market. The densities you can achieve with these types of products are amazing.

Smith – I picked the top six companies that play in the IP space, added up how much they spend in R&D every single year. That amount of money is bigger than the total amount of money spent in the broadcast industry every year on technology. Their R&D budgets are bigger than our spend as a global market.

Mendoza – Microsoft spends $4 billion a year just on the Azure infrastructure. With regard to workflow and change management, this is a big deal, not only from an IT standpoint but also from an operator standpoint. How is Disney/ABC trying to change and grow the professionals within your ranks to come to grips with what’s coming?

Strein – There’s a lot of training going on, certainly. The manual processes, manually translating a traffic log into an automation list, a lot of that process should hopefully go away.

Work doesn’t go away. It becomes a different environment. People are constantly tweaking and changing things. Our traffic and sales people will sell commercials up ‘til just before the show airs. So there’s constant updating. Hopefully it becomes simpler and we give our clients more flexibility. That’s the ultimate goal. Make a simpler process and make it easier to change.

Mendoza – What about broadcast engineers turning more into an IT type of world?

Strein – There’s no question. A number of years ago you looked for broadcast engineers with IT strengths. Now it’s almost the opposite. You look for IT engineers with broadcast strengths. I don’t know if it’s at that tipping point yet, but it’s getting there.

Mendoza – I think it is at the tipping point. We see around the world that customers who are holding onto the traditional world are just holding until retirement.

Strein – You don’t need to know everything. That’s what I tell people. You don’t need to know how to configure a switch. You don’t need to know how to configure a memory array. But you need to be able to communicate intelligently to the people who are doing it.

Smith – What about on the operational side, the people that monitor the channels and interact with them on a daily basis. How does this change for them?

Strein – I think what they have to do stays the same. What we’re doing is still the same business. They’re just going about it a different way. So it’s different tools, adapting to those different tools, reacting with the capabilities that are enabled with them.

Use of Watermarking against Piracy Kicks into Higher Gear Worldwide

Alex Terpstra, chairman, Civolution

SoC-Level Support, Cloud-Based Tracking Services Create Foundation for Effective Use of Vendor Solutions in OTT Environment

By Fred Dawson

October 19, 2015 – A more robust global response to online video piracy is finally coming into view, built on forensic watermarking technology, cloud-based support for rapid identification of thieves and growing cooperation among regional entities, content owners and service providers.

While implementation of watermarking has been widely associated with licensing movies for 4K Ultra HD distribution, broadcasters’ increasing reliance on the Internet to deliver on-demand and live programming has added another dimension to demand for the technology. “It’s about securing premium high-value content, and there’s plenty of premium high-value content available today that isn’t necessarily UHD,” says Steve Oetegenn, president of content security provider Verimatrix.

When it comes to thwarting the direct-from-screen recording of content for illicit distribution, the fastest growing type of piracy, watermarking is the only recourse, notes Rory O’Connor, vice president of managed services at Irdeto. “The only way to counter this kind of piracy is through a robust scheme that uses watermarks to identify illegal sources and follows up with enforcement action,” O’Connor says.

While there’s been some drop-off in the number of sites supporting downloading of illicit content, typically through BitTorrent, sites devoted to streaming purloined content and collecting ad revenues from automated online ad networks are on the rise. A recent study conducted by Digital Citizens Alliance and consulting firm MediaLink LLC reported the number of such sites worldwide had jumped by 40 percent between 2013 and 2014.

Live sports streaming has become an especially urgent area of focus for the use of watermarking. O’Connor points to the decision of Barclays Premier League, the U.K. soccer association, to engage Irdeto’s Piracy Control service as a bellwether development in this arena. The league is tapping Irdeto’s comprehensive detection, enforcement, watermarking and investigation capabilities to close down pirate sites and to inhibit illegal distribution of set-tops capable of receiving stolen content.

Verimatrix, which offers a comprehensive spectrum of forensic anti-piracy capabilities within its Video Content Authority System (VCAS) Ultra architecture, last month announced it was enhancing that support with a live profile for its VideoMark watermarking solution that is specifically designed to protect linear content against real-time re-broadcasting threats. “It is well known that re-broadcasting piracy is a growing threat to operator revenues, particularly with sports,” says Verimatrix CTO Petr Peterka. “The VideoMark live profile was developed as a better alternative to effectively identify marks and block leaks of live content in real time.”

The VideoMark live profile enables a flashing mark that is unobtrusive, yet quick to read and analyze, enabling direct interpretation from a mobile device and rapid implementation of steps to block illegal streams as they appear, Peterka notes. A distinguishing characteristic of the VideoMark live profile, he adds, is that it is embedded at the pixel level and can be secured using the Verimatrix proprietary algorithm operating within SoCs (systems-on-chips).

Indeed, the stepped-up efforts to combat use of screen-captured content to feed illicit sites dovetails with preparations to shore up SoC-level protection for 4K UHD content delivered to set-tops and directly to IP-connected 4K TV displays. These preparations include establishing chip-level support for watermarking in SoCs now being produced for next-generation set-tops and 4K TV displays.

“We’re available on all major chipset vendors natively with our VideoMark watermarking technology,” Oetegenn says. This support is part of new security mechanisms performed by walled-off processes within SoCs that include instantiations of hardware roots of trust as required by the Enhanced Content Protection specifications issued by MovieLabs, the tech organization founded by the six leading Hollywood studios.

“The operator has the peace of mind that if they buy a mainstream set-top box using a mainstream system-on-a-chip, that system will not only be pre-enabled for Verimatrix encryption and subscriber management, but also pre-enabled for forensic watermarking,” Oetegenn says. “So there’s no reason not to use [watermarking] anymore. It’s available today.”

“There’s definitely progress being made,” he adds. “We’re working with quite a few operators that are actually using the VideoMark product.”

Civolution, too, points to widespread SoC support for its NexGuard watermarking technology as a sign that watermarking is moving into the mainstream. Named SoC vendors now supporting NexGuard include STMicroelectronics, Sigma Designs and HiSilicon Technologies.

The integration with HiSilicon, the latest chip manufacturer to publicly announce inclusion of support for NexGuard in its SoCs, ensures the watermark is present in video viewed on HiSilicon-supplied set-tops whether the content is captured from the screen or through a set-top video output, notes Jean-Philippe Plantevin, senior vice president of products and solutions at NexGuard.

“This will help operators in their content acquisition discussions with Hollywood studios and sports content owners and rights holders,” Plantevin says. “It also gives operators the flexibility to easily and securely activate forensic watermarking at the STB manufacturing facility or through secure over-the-air downloads in coordination with any conditional access or DRM technology.”

“Our business is accelerating with growing market traction in several important areas,” says Alex Terpstra, chairman of Civolution, which, after selling other lines of business, is focused on watermarking-related products and services provided through NexGuard. Network service providers as well as broadcasters are now taking action to deal with the threat posed by pirates’ ability to deliver high-quality video captured from big TV displays.

“The streaming and pay TV ecosystems have different incentives, but it comes down to similar needs for watermarking to address this kind of piracy,” Terpstra says. “Pay TV operators paying large sums of money for licensing rights lose revenues when the content they deliver is stolen this way.”

“We’re responding to a lot of RFPs from operators calling for watermarking support,” he adds. “We’ve signed a number of pay TV operators this year, including one of the largest in Europe and another large operator in the U.S., which we can’t name.”

A major factor in making use of watermarking more effective in tracking pirates is the emergence of cloud-based support for identifying stolen content and reading the embedded watermarks, which can be done in real time to enable action against live streams while events are in progress. The availability of such services from major content security firms helps to overcome one of the drawbacks of watermarking, which is the fact that every provider uses a proprietary forensic marking scheme which only they or an entity licensed by them can detect and read. Since there’s no inclination among stakeholders to create a single global reading and tracking system, the best alternative is to make it possible to quickly detect what’s happening within any vendor’s watermarking ecosystem through shared access to that ecosystem’s watermarked content and other data.
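
A minimal sketch of the workflow just described, assuming a hypothetical cloud detection endpoint and response format: a monitoring partner submits a captured sample, the vendor’s service returns the embedded identifier, and the operator triggers enforcement. None of the endpoints or fields below represent a real vendor API.

```python
# Hypothetical sketch of cloud-assisted watermark detection: a monitoring partner
# submits a captured sample to the watermark vendor's cloud service, which returns
# the embedded session identifier used to trigger enforcement.
import requests

DETECTION_API = "https://watermark-cloud.example.com/v1/detect"  # hypothetical

def identify_leak(sample_url):
    """Ask the vendor's cloud service to extract the watermark from a suspect stream sample."""
    resp = requests.post(DETECTION_API, json={"sample_url": sample_url}, timeout=30)
    resp.raise_for_status()
    return resp.json()  # assumed shape: {"watermark_id": "...", "confidence": 0.97}

def enforce(watermark_id):
    """Placeholder for operator-side action: revoke the session, issue a takedown notice."""
    print(f"Revoking session and notifying enforcement for watermark {watermark_id}")

result = identify_leak("https://pirate-site.example/stream/segment_0042.ts")
if result["confidence"] > 0.9:
    enforce(result["watermark_id"])
```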

“If you can’t detect the watermark online and detect where it’s coming from, you can’t act fast enough to prevent some of the most damaging activities when it comes to limiting your ability to monetize your assets,” says Peter Oggel, vice president of products at Irdeto. “We’ve made watermarking service an integral part of our security life-cycle service.”

The service includes a menu of options customers can choose from to take action against pirates and users, such as warnings that content is being viewed illegally or take-down notices against repeat offenders, Oggel adds. The strength of Irdeto’s service rests in part on its global piracy monitoring operation, which is used by enforcement authorities, regional organizations and content distributors worldwide to gauge the scale of piracy and to identify illegal sites.

At Verimatrix cloud-based support for tracking and reacting to pirated content that has been watermarked with VideoMark is one of the services offered through the company’s new Verspective Intelligence Center (VIC), a globally interconnected revenue security support engine designed to proactively address revenue threats utilizing information gathered from across the video distribution ecosystem. “Verspective will allow operators globally to opt in, connect into the VIC, and with that we’ll be able to provide a whole multitude of value-added services to our operators,” Oetegenn says.

To combat piracy “we enable real-time watermark detection with the monitoring of piracy and breach attempts,” he explains. “If we’re seeing footprints and patterns within our operator community that are telling us the bad guys are trying to break in, those activities can be very quickly combatted and eliminated before they spread.”

Such capabilities extend to thwarting the types of Periscope attacks that resulted in screen capture and illicit distribution of the Mayweather-Pacquiao fight on May 2 and other programming since Twitter bought the app company earlier this year. In such instances, stopping the streams could have a lasting impact on people’s willingness to rely on illegal sources for the content, Oetegenn notes.

In the case of the Mayweather-Pacquiao fight “it would have been possible to monitor it in real time as that content was being put on the Internet and define exactly which terminal device was being used to play out that content and shut it down,” he says. “Let’s imagine they shut an operation down just before the KO or just before the home run in the baseball game or just before the deciding goal in the soccer match. Ultimately people are going to start wandering away from this type of service and understanding it’s not legal in the first place. It’s not reliable enough to be viewed as true high-value premium content out on the Internet.”

Oetegenn notes his company’s recent acquisition of the video analytics business from Concurrent Computing will greatly enhance its ability to gather and analyze system performance and potential threats. “That analytics platform will be fully integrated with the VIC to the extent that now we’ll be able to have data from every single end device,” he says.

NexGuard has also been stressing the importance of utilizing its cloud service to expedite effective use of watermarking. By teaming with NexGuard, companies specializing in monitoring for piracy can “read our watermarks,” Terpstra says. “If one of our partners finds an illegal file, they can identify the watermarks and use that information to track down the pirates,” he adds.

For example, one such partner, MarkMonitor, a subsidiary of NetResult, does Internet monitoring for sports leagues. Another, Vobile, does anti-piracy work with the studios. “There’s a de facto ecosystem emerging around NexGuard,” Terpstra says.

The NexGuard ecosystem also includes providers of content security who do not have their own watermarking systems, such as NAGRA, Conax and Viaccess-Orca, he adds. NAGRA and Conax, which are partnering on content security solutions, are the latest additions to the NexGuard fold, allowing them to integrate and certify implementation of NexGuard in set-top boxes and to enable pay TV operators’ headend systems to control the application of watermarking.

The integration also brings into play use of NAGRA sister company Kudelski Security to add forensic monitoring, investigation and response services to the enhanced protection portfolio, notes NAGRA chief architect Philippe Stransky. “Our customers already trust us to provide device certification, lifetime device support, security countermeasures and anti-piracy services,” Stransky says. “With the addition of NexGuard, we demonstrate NAGRA’s capability to securely integrate and manage forensic watermarking for all types of content.”

New Solutions Add to Power Of Analytics to Drive TV Biz

Steven Canepa, GM, global media and entertainment industry, IBM

IBM, ContentWise, Edgeware Introduce Unprecedented Range of Capabilities

By Fred Dawson

October 12, 2015 – New responses to surging demand for advanced data analytics in the premium video OTT marketplace are coming from all directions, ranging from a major commitment to media and entertainment by a revamped cloud-focused IBM to innovative solutions from much smaller entities.

“Business analytics is the new battleground,” says Steven Canepa, general manager for the global media and entertainment industry at IBM. “The disruptors in this industry, the new emergent players in this industry, have data at the center of their business model. They’re going to compete in the marketplace by leveraging that data.”

In this environment the challenge has become competing for people’s time against an ever-expanding range of online options, which can only be done effectively through greater attention to each individual’s needs and interests, Canepa adds. “The notion that there are static, segmented sections of the audiences that you can appeal to with media programming is an outdated idea,” he says. “The question becomes, how do you appeal to those consumers on the terms that are interesting to them?”

With billions spent over the past few years on R&D and acquisitions, IBM has put itself in position to respond to the need for big data analytics across multiple industries with strong emphasis on public cloud-based services that seamlessly integrate with in-house facilities to apply a wide range of algorithmic solutions to specific needs. As Canepa’s title suggests, the video entertainment sector has become a major focus for IBM, which has teamed with Imagine Communications as a key market ally in its pursuit of business in this arena.

“The ability to get the distribution platform in place and the ability to marry with it the customer insights are the two fundamental things that any company has to do to get money, to get value out of these new OTT services,” Canepa says. “This is the future of the industry.”

Clearly, IBM is a force to be reckoned with in this scenario. But media companies and network distributors also have opportunities to look at much smaller entrants offering new approaches to using data to compete more effectively for eyeballs and ad dollars.

One of these is ContentWise, a Milan-based company with a significant customer base outside the U.S., which earlier this year announced it was allocating resources to focus on North America. ContentWise sees a significant opportunity for its approach to pulling together, analyzing and turning data from multiple sources into a compelling force for personalized engagement with end users, says CTO Pancrazio Auteri.

“Operators and content providers face enormous challenges today in handling even the simplest of metadata management tasks, and today’s technologies do not have the functionality to support them,” Auteri says. “Delivering true personalization and a superior user experience is only possible if the underlying data is rich, deep and understood by the personalization system.”

The unmet challenges, as Auteri sees them, include the need to “automate metadata processing, validate enriched data before it goes live, consolidate workflows and export clean and richer metadata to existing content management systems.”

According to Massimiliano Bolondi, senior vice president of sales at ContentWise, these are the capabilities ContentWise brings to the table through its Knowledge Factory data processing platform and other product components. These include predictive browsing that takes users to locations with content duration, type, structure and genre suited to their interests; creation of “pseudo-genres” personalized to each user; and a means of instantly reconfiguring and programming UI elements across all client platforms without having to rely on client upgrades.

The key is to be able to leverage as many data sources as possible with fully automated processes that offer a practical, reliable alternative to manually intensive approaches, Bolondi says. At the same time, he adds, the ContentWise platform is designed to ensure managers will have “the last word on things like what types of data get pulled into digital libraries, the way audience reminders and promotions are handled or which subsets of customers are targeted for testing new ideas.”

Knowledge Factory is a set of metadata handling tools, a workflow manager and advanced semantic algorithms that maximize the value of data sources, he explains. “We built our Knowledge Factory to pull data into a knowledge tree that can be accessed through our workflow and processed by our algorithms to support many applications,” he says.

The metadata tools control the ingestion, mash-up, reconciliation and de-duplication of data from multiple commercial and free data providers, including TMS, Rovi, IMDB, Wikipedia and social media websites like Facebook and Twitter, he adds. The platform supports four analytics sets for generating KPIs (key performance indicators): user activity metrics, financial metrics, engagement metrics and recommendation effectiveness metrics. Managers can drill down to each service, catalog item, audience segment, business rule and search keyword to fill out the details underlying these metrics.
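
As a rough illustration of the reconciliation and de-duplication step described above, the sketch below merges records for the same title arriving from several providers. The field names and the simple title-matching rule are assumptions for illustration, not ContentWise’s actual logic.

```python
# Illustrative sketch: records for the same title arriving from several metadata
# providers are de-duplicated and merged into one enriched entry.
# Field names and the matching rule are assumptions.
def normalize_title(title):
    t = title.strip().lower()
    if t.endswith(", the"):
        t = "the " + t[:-5]          # "Big Match, The" -> "the big match"
    return "".join(c for c in t if c.isalnum())

def reconcile(records):
    """Merge per-provider records that describe the same title, keeping the richest fields."""
    merged = {}
    for rec in records:
        key = (normalize_title(rec["title"]), rec.get("year"))
        entry = merged.setdefault(key, {"sources": []})
        entry["sources"].append(rec["provider"])
        for field, value in rec.items():
            if field != "provider" and value and field not in entry:
                entry[field] = value
    return list(merged.values())

records = [
    {"provider": "TMS",       "title": "The Big Match",  "year": 2015, "genre": "Sports"},
    {"provider": "Rovi",      "title": "Big Match, The", "year": 2015, "synopsis": "Season finale..."},
    {"provider": "Wikipedia", "title": "The Big Match",  "year": 2015, "cast": ["A. Player"]},
]
for item in reconcile(records):
    print(item)   # one merged, enriched entry instead of three duplicates
```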

Since shifting its focus to the video entertainment sector in 2007, ContentWise has built a strong base of customers in Europe and other regions outside the U.S., Bolondi notes. For example, Sky Italia has deployed the ContentWise discovery solution to deliver personalized recommendations for its Sky Online service, which was deployed last year with systems integration managed by Ericsson.

“As viewers demand more relevance and convenience in the content they consume, our goal is to make sure they can enjoy moving effortlessly across discovery patterns,” says Sky Italia CTO Pier Paolo Tamma. “Selecting ContentWise’s solution makes this vision a reality.”

Meanwhile, coming at the analytics challenge from another direction is Edgeware, traditionally a supplier of advanced CDN solutions for the MVPD market. Following the previously reported expansion of its Video Consolidation Platform (VCP) to include core as well as edge processing capabilities, Edgeware has added significant data aggregation and analytics solutions that not only serve the needs of network operators but also provide media companies who don’t own their own network facilities an opportunity to gain access to per-session performance analysis that would otherwise be out of reach.

Edgeware’s Convoy 360° Analytics, operating in tandem with the company’s high-density purpose-built Orbit server platform, delivers analysis of real-time and historical business information essential to fulfilling direct-to-consumer OTT business goals, says Matt Parker, director of business development at Edgeware. “This is moving from being just a management interface and portal into a very comprehensive 360 analytics suite that will provide extremely granular information on the content that is being watched, at what frequency, from which location, on what device type, whether it’s VOD, whether it’s live,” Parker explains.

Utilizing the Orbit/Convoy combination, broadcasters can take control over content distribution in ways that are impossible with reliance on third-party distributors, he adds. “By taking ownership of those processing functions in the headend, they’re able to exercise much more control over how the content is ingested, the frequency at which it’s played out across multiple concurrent clients and the management of bulk storage and the way those assets are stored in a logical hierarchy,” he explains. “And then, drilling down, new innovations such as dynamic ad insertion and forensic watermarking are the next generation processes the broadcaster has to take in, because this is how they will monetize next-generation services.”

Equally important, by exploiting these benefits of the VCP portfolio, broadcasters now have access to the 360° analytics platform. “It gives the broadcaster, the original content owner the opportunity to create their own interface using a widget-based approach,” he says. “They can create their own UI layout to get exactly the information they want when they want it to make these actionable decisions through the better use of data.”

Within each dashboard, users can filter by region, distribution, format, devices, content provider and ISP and present the data as a variety of charts, tables or geographical maps. The system also includes support for data export or integration with third-party data processing systems via an open API, Parker notes.
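
A minimal sketch of the kind of drill-down Parker describes, assuming a hypothetical per-session record schema rather than Edgeware’s actual API: records are filtered by any combination of fields before being charted or exported.

```python
# Illustrative sketch: filtering per-session delivery records by device, region
# and format before charting or exporting them. The record schema is an assumption.
sessions = [
    {"region": "HK", "device": "smartphone", "format": "HLS",  "title": "News at 6",  "live": True,  "bitrate_kbps": 1800},
    {"region": "HK", "device": "tablet",     "format": "HLS",  "title": "Drama Ep 4", "live": False, "bitrate_kbps": 3500},
    {"region": "UK", "device": "smart-tv",   "format": "DASH", "title": "News at 6",  "live": True,  "bitrate_kbps": 6000},
]

def filter_sessions(records, **criteria):
    """Return only the session records matching every supplied field/value pair."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

live_mobile_hk = filter_sessions(sessions, region="HK", device="smartphone", live=True)
print(len(live_mobile_hk), "live smartphone sessions in HK")
```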

In an emerging environment where multiple flows of live TV content along with high volumes of stored content must be delivered with consistent performance on par with managed network distribution, broadcasters can’t afford to be at the mercy of faulty points in the external distribution chain. To deal with such situations they must be able to prove the fault doesn’t lie at the point of origin.

“In terms of having a poorly encoded piece of content, missing chunks or bits, etc., they can categorically state we know this content was good when it left our origination point because we measured it, stress tested it,” Parker says. “The rest is up to you.”

Adding to the possibilities is the fact that network operators who use the Edgeware technology to support their own CDNs can provide wholesale services to broadcasters that will generate the same range of performance analytics at edge points, he notes. Or broadcasters themselves can negotiate for placement of the edge platform at peering points of access to local markets, as is the case with Hong Kong OTT video provider TVB.COM.

Since 2008 TVB’s myTV service has offered 24-hour live streaming, free TV channels and catch-up with accessibility to TVB’s TV programs on smartphones, tablets and computers. With the increased popularity, myTV traffic volumes grew beyond the capacity that third-party CDN platforms could cost-effectively support, says TVB COO Kenneth Wong.

“Our customers expect nothing less than the highest quality, regardless of the device on which they access the content,” Wong says. “With Edgeware’s VCP, we can cost effectively deliver the quality our customers demand with the ability to rapidly scale and add new services as the market evolves.”

As broadcasters and network operators take advantage of the unique capabilities of solutions offered by the likes of ContentWise and Edgeware, they will also need access to larger fields of data and analytics capabilities like those touted by IBM. Fortunately, in this emerging environment with open APIs enabling integration of different data and analytics platforms into content providers’ workflows, buyers have the ability to put together best-of-breed options that will give them the full benefits of basing operations on comprehensive data analytics.

IBM has been putting a lot of marketing dollars into promoting Watson, its cognitive computing platform, for every type of business and application under the sun. In the media industry that means being able to find metadata, get smarter, search fragmented repositories and get value out of the platform’s ability to interpret human language in readouts of text in real time, Canepa notes.

“We can now essentially read the Web in real time, analyze that data and bring it into the workflow within the media company,” he says. “In Watson we have the logic to be able to federate search across a set of repositories, put intelligence in it so that we can find the right metadata. We can cleanse it; we can do forensics, and we can integrate that automatically into the workflow.”

Such capabilities are critical to enabling the ad campaigns of the future, where media buyers want to get to consumers wherever they’re spending time. “We have to begin to bring intelligence to things like campaign reporting, rate card optimization, ratings prediction,” Canepa says.

“One of the engines we have in that portfolio can take a bunch of disparate data, weight those different attributes and predict what’s going to happen,” he continues. “If you’re a subscriber management system and you want to understand where churn is coming from or be able to predict churn before it happens, that becomes really important. If you’re in a targeted advertising role and you want to predict your efficacy on your marketing spend against a certain sub segment of the audience, the ability to predict a very complex set of attributes and constantly re-evaluate that becomes a new critical path.”
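
As a toy illustration of the “weight disparate attributes and predict” idea described here, the sketch below scores churn risk from a handful of subscriber attributes. The features, weights and scoring rule are invented for illustration and bear no relation to IBM’s actual models.

```python
# Illustrative sketch only: scoring churn risk by weighting disparate subscriber
# attributes. Features, weights and the scoring rule are invented for illustration.
import math

weights = {
    "buffering_ratio": 4.0,       # poor streaming quality pushes churn risk up
    "support_calls_90d": 0.6,
    "days_since_last_view": 0.05,
    "engagement_score": -2.0,     # heavy, satisfied viewers are less likely to churn
}

def churn_risk(subscriber):
    """Weighted sum of attributes, squashed into a 0-1 risk score."""
    z = sum(weights[k] * subscriber.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

sub = {"buffering_ratio": 0.09, "support_calls_90d": 3,
       "days_since_last_view": 12, "engagement_score": 0.4}
print(f"Churn risk: {churn_risk(sub):.2f}")   # ~0.88 for this example subscriber
```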

The functionalities extend to internal management of the ad sales operations. Canepa cites questions that need answers such as: “Who are my top performing ad sales reps; who are my worst performing ad sales reps? Who’s selling campaigns cross channel? Who do I have make-good exposures with? Where can I do substitutions, deliver a better audience for advertisers at a premium and eliminate financial risk I have on my side of the equation? Those are the kinds of real-time decision-making tools that we can put in the hands of executives when you have a flexible platform architecture that you can extract that data from.”

As Canepa notes, from now on “it’s every media company competing with every other alternative that every consumer has. We all have a unique pattern of usage. We all use a different set of services.”

Old ways of looking at audience divisions by age “don’t tell us anything anymore,” he adds. “Early adopter curves are largely over, and they’re going to continue to collapse. Who is a 23-year-old female today? Each of them is unique with unique patterns of consumption.”

That’s not to say, however, that it’s not possible to use big data to define and target new audience segments. “There are commonalities, and if you can attribute sufficiently what those interests and behaviors are to group those commonalities together, then you can target them as a segment,” Canepa says.

HDR Starts to Roll amid Growing Clarity on Bitrate & Quality Issues

Matthew Goldman, SVP, TV compression technology, Ericsson, and EVP, SMPTE

Minimizing Bandwidth Impact vs. Enabling Backward Compatibility with Better SDR Quality Is a Key Consideration

By Fred Dawson

September 18, 2015 – The pace toward market adoption of high dynamic range-enhanced video has accelerated with the launch of HDR content by Amazon, Walmart’s Vudu and 21st Century Fox, and with indications of commitment to rollouts in the near future on the part of Comcast, Netflix and others.

While there’s continuing debate over HDR formats, there’s now a lot more clarity on some points than there was a few months ago, thanks in part to the Blu-ray Disc Association’s (BDA’s) release of the new Blu-ray Ultra HD specifications and a bulletin on recommended HDR specifications from the Consumer Electronics Association (CEA). But there’s still a long way to go as content creators, vendors and distributors wrestle with key issues such as bandwidth impact, when the quality of mainstream TV displays will be sufficient to merit delivering HDR-enhanced content, and how to strike the balance between too much and too little in the way of eye-popping dynamism.

Driving progress is a widespread desire to deliver a dramatically better consumer viewing experience without waiting for content formatted to the full pixel density of 4K to emerge. As noted by Matthew Goldman, senior vice president of TV compression technology at Ericsson and executive vice president of the Society of Motion Picture and Television Engineers (SMPTE), ever greater numbers of 4K UHD TV sets are entering the market, bringing with them the ability to upscale HD 1080p through rendering tricks that mimic the effect of the higher pixel density delivered by 4K at 3840 x 2160 pixels.

As a result, Goldman notes, there’s little difference in what the viewer sees at a reasonable viewing distance between pure 4K and upscaled 1080p HD content. While 1080 interlaced content presents some challenges to the upscaling process, they’re not insurmountable, he adds.

“The rule of thumb is you get the full impact of HD at three picture heights’ distance – about two and a half meters, which is a typical viewing distance in people’s homes,” he continues. “To get the full benefit of 4K, which is to say, to see a real difference, you have to be half that distance from the screen, which is not what people are accustomed to.”
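
The arithmetic behind that rule of thumb, worked through for an assumed 65-inch 16:9 display (the screen size is an assumption for illustration):

```python
# Worked example of the rule of thumb quoted above, assuming a 65-inch 16:9 display.
# Picture height for a 16:9 screen: height = diagonal * 9 / sqrt(16**2 + 9**2).
import math

diagonal_in = 65
height_m = diagonal_in * 0.0254 * 9 / math.sqrt(16**2 + 9**2)   # ~0.81 m

full_hd_distance = 3.0 * height_m    # ~2.4 m: three picture heights, the HD sweet spot
full_4k_distance = 1.5 * height_m    # ~1.2 m: roughly half that, needed to resolve 4K detail

print(f"Picture height: {height_m:.2f} m")
print(f"HD sweet spot:  {full_hd_distance:.2f} m")
print(f"4K sweet spot:  {full_4k_distance:.2f} m")
```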

All of this has been borne out in focus group tests and by providers of 4K-originated content, who, as previously reported, have found consumers to be underwhelmed by the experience. In contrast, Goldman notes, with HDR, whether it’s offered in conjunction with 4K or 2K pixel density, “You can see the impact of HDR from across the room.”

Given the bandwidth constraints, delivering 4K at four times the density of HD is a potentially costly proposition with a marginal return on the allocation of network capacity compared to HDR. “It’s all about more bang for the bit,” Goldman says.

Just what price will be paid in added bandwidth for HDR apart from 4K depends on many factors, but it will be much lower than for full 4K, even though the sampling rate for encoding HDR (and 4K) is at least 10 bits versus the 8-bit encoding the industry uses today. “Our experiments have shown the bitrate increase could be as little as zero or maybe as much as 20 percent, depending on a variety of factors,” Goldman says.

Ericsson has found that with today’s dynamic encoding capabilities, where the level of compression and hence bandwidth utilization varies widely across picture sequences, HDR in dark spaces actually allows more compression without affecting picture quality than is the case with HD. This tends to balance out bright areas where capturing the nuances requires reducing the amount of compression and raising the bitrate compared to the bitrate for comparable scenes in HD, notes Carl Furgusson, head of strategy for business line compression at Ericsson.

HDR, by definition, means there is more non-linearity in the acceptable compression rate from frame to frame than is the case with the standard dynamic range (SDR) used in traditional television. As embodied in the ITU Rec-709 standard, SDR has a luminance range of 100 nits (candelas per square meter) and a color gamut of 16.78 million colors, while HDR supports many hundreds or even thousands of nits and billions of colors, depending on which HDR format is used, the type of display and which of two advanced color gamuts is in play: the current cinematic DCI P3 standard or the ITU’s Rec-2020.
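
The arithmetic behind those color counts is simply the per-channel bit depth cubed:

```python
# The arithmetic behind the color counts cited above: colors = (2**bits_per_channel)**3.
for bits in (8, 10, 12):
    levels = 2 ** bits
    colors = levels ** 3
    print(f"{bits}-bit sampling: {levels} levels per channel, {colors:,} colors")
# 8-bit  ->   256 levels,     16,777,216 colors (the 16.78 million of Rec-709 SDR)
# 10-bit -> 1,024 levels,  1,073,741,824 colors
# 12-bit -> 4,096 levels, 68,719,476,736 colors
```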

“We’re finding with HDR the fact that you can reach deeper levels of black means artifacts that a high level of compression of SDR content might produce aren’t perceivable, so you can go to that level of compression with HDR,” Furgusson says. “At the top end of brightness you see more artifacts with HDR than HD at a given level of compression, so you have to spend more bitrate to avoid that.”

More testing will be required to develop guidelines around bitrate expectations with HDR, he adds. Asked whether the impact range of 10-15 percent in additional bandwidth that CableLabs has measured for HDR was in the ballpark, he replies, “You can’t go with a straight rule of thumb on this. But HDR will not have the impact on bitrates people anticipated.”

HDR formats, like Dolby Vision, that use 12-bit rather than 10-bit encoding will have a greater bandwidth impact, possibly in the 20-25 percent range, says Patrick Griffis, who serves as executive director of technology strategy in the office of the CTO at Dolby Laboratories and education vice president at SMPTE. But if backward compatibility isn’t required, the dual streams can be compacted into one, resulting in a lower bandwidth penalty. “We’ve worked with major chip vendors to ensure both options are included,” Griffis says.

At IBC in Amsterdam this month, Envivio, set to be acquired by Ericsson pending shareholder approval, offered a dramatic demonstration of how its encoding technology can be used to lower the bitrate of a single-stream version of Dolby Vision HDR with 4K pixel density. The demo, running on a new 1,000-nit Vizio display slated for commercial release by year’s end, showed fast-motion trailer clips from an HDR-enhanced version of the motion picture Oblivion at a bitrate of just 12 Mbps.

The single-stream version of the 12-bit Dolby Vision system added between 1 and 2 Mbps to the total bitrate, or somewhere between 10 and 20 percent, according to an Envivio official. The compromises made in the encoding process to reach the 12 Mbps bitrate were cleverly obscured by applying aggressive compression outside the areas of viewing focus in any given scene, resulting in a high-quality viewing experience that won the requisite approval from Oblivion director Joseph Kosinski.

The idea of making HD a part of the HDR paradigm, now labeled as HDR+, is really a return to the original concept of UHD, which was that it was more about dynamic range than pixel density, Griffis notes. “UHD is really about creating an immersive viewing experience, but the industry got sidetracked for a while by the CE manufacturers’ introduction of 4K TV sets, which focused discussion of UHD on pixel density,” he says. “Now things are getting back to the original intent.”

That intent has been bolstered by the development of SMPTE 2094, a draft standard moving toward adoption that introduces more dynamism into the color transformation process than SMPTE 2086, otherwise known as “Mastering Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut Images.” SMPTE 2086 serves as the metadata component referenced by the SMPTE 2084 Electro-Optical Transfer Function (EOTF) – an alternative to the traditional gamma-based Opto-Electric Transfer Function (OETF) – which is now part of the BDA’s UHD standard and the HDR specifications recommended by the Consumer Electronics Association.

By providing a means of assigning a dynamic brightness dimension to the rendering of colors by TV displays, SMPTE 2094, an adaptation of the metadata instruction set used in Dolby Vision, brings the full HDR experience into play with 10-bit as well as 12-bit sampling systems. With SMPTE 2084 and the ITU Rec-2020 color gamut now specified in both the BDA’s and the CEA’s HDR profiles, a fairly clear HDR roadmap is in place, leaving it to individual industry players to determine whether they want to utilize 12-bit formats like Dolby Vision and the one developed by Technicolor or the 10-bit formats offered by Samsung and others.
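To make the distinction concrete, the sketch below models the kind of static mastering-display metadata SMPTE 2086 carries; the field names and the P3-D65, 1,000-nit example values are illustrative rather than the standard’s exact syntax. SMPTE 2094 layers content-dependent, scene-by-scene transforms on top of this static description.

```python
# Illustrative model of SMPTE ST 2086-style static mastering metadata.
# Field names are descriptive, not the standard's exact syntax; the values
# describe a typical P3-D65 mastering display with a 1,000-nit peak.
from dataclasses import dataclass

@dataclass
class MasteringDisplayMetadata:
    red_primary: tuple       # CIE 1931 (x, y) chromaticity coordinates
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_luminance_nits: float
    min_luminance_nits: float

p3_d65_1000nit = MasteringDisplayMetadata(
    red_primary=(0.680, 0.320), green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060), white_point=(0.3127, 0.3290),
    max_luminance_nits=1000.0, min_luminance_nits=0.005,
)
print(p3_d65_1000nit)
```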

Dolby at its IBC booth offered a dramatic demonstration of what can be done with Dolby Vision using SMPTE 2094 to map HDR-originated content through the transfer function onto both HDR and SDR displays. The demo, using a 2,000-nit display not yet commercially available, provided a glimpse of what HDR will look like as such displays enter the market.

At the same time, using a real-time 2094 mapping capability, Dolby showed how HDR-originated content delivered through the Rec-709 component of the dual-stream Dolby Vision system could be rendered to greatly enhance the SDR display of that content, compared to an SDR rendering that didn’t use the SMPTE 2094 mapping. This is another factor network-based distributors will have to weigh as they consider backward compatibility and its impact on bandwidth.

“The industry is split on the backward compatibility question,” says Ericsson’s Goldman. “Some say it’s essential; others don’t think it’s necessary.” Even for those that do favor backward compatibility, there are other approaches under consideration besides reliance on the dual-stream option that involve separate processing of HDR content for SDR distribution.

But with higher-luminance displays supporting the 68.7 billion colors enabled by Rec-2020 on the near horizon, distributors will have to decide which way to go on HDR formats and sampling rates sooner rather than later. The enhanced SDR quality that comes with using SMPTE 2094 alongside the dual-stream 12-bit format adds one more benefit to weigh in determining whether that route is worth the extra cost in bandwidth.


HDR Requires Care in Addressing Nuances of Human Visual Response

Sean McCarthy, engineering fellow, ARRIS

The Eye’s Reactions to Greater Contrast and Color Depth Pose New Challenges 

September 21, 2015 – Now that MVPDs are turning their focus to HDR, considerations relating to the technology’s impact on viewers that weren’t part of the 4K UHD discussion must be brought into the planning process.

While there’s every reason to expect that the visual impact of HDR’s expanded color and contrast will induce distributors to embrace the technology as a bigger improvement than the 4x increase in pixel density of 4K Ultra HD, the push to bring the HDR viewing experience to the public introduces production issues that demand far more attention to the nuances of human visual perception. In other words, if HDR is to live up to its potential as a major leap forward in video quality, providers will have to be careful not to alienate viewers with unintended negative effects analogous to what happened with 3D TV.

This is the message implicit in a paper delivered at INTX in Chicago this past spring by Sean McCarthy, an expert in visual science who serves as engineering fellow for ARRIS. “Unlike 4K UHD, which has been a fairly straightforward step in the evolution of display resolution built on the SDR (standard dynamic range) foundation, HDR introduces a new paradigm where the dimensions of the new viewing experience must be defined in keeping with basic principles of the human visual response system,” McCarthy says. “This will impact everything that’s done in the creation and dissemination of video content from initial capture through production, postproduction and processing for distribution.”

While such concerns may not be top of mind at this point among service providers, they are a top priority at motion picture studios where directors are now shooting in HDR. “For next-generation imaging, people need to be more aware of the way the eye responds to changes in brightness and other dynamics HDR introduces than has been the case with SDR,” says Patrick Griffis, executive director of technology strategy in the office of the CTO at Dolby Laboratories and vice president of education at the Society of Motion Picture and Television Engineers. “How much difference in brightness you can employ from one scene to the next is a complicated question. It’s a learning process that movie producers are going through as they shoot in HDR.”

As Griffis notes, SMPTE has provided algorithmic tools to simplify dynamic control over contrast and color in the new transfer function designed to support rendering of the HDR parameters on TV screens. The conveyance of parameters maintaining control over contrast in the black-to-white dimension as well as choice and brightness of colors is performed by metadata used in conjunction with the SMPTE 2084 Electro-Optical Transfer Function (EOTF), otherwise known as “Perceptual Quantization (PQ).”
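For reference, the PQ curve itself is compact enough to write down. The sketch below transcribes the SMPTE ST 2084 EOTF, using the constants published in the standard, to map a normalized code value to absolute luminance in nits; it is an illustration, not production color-pipeline code.

```python
# SMPTE ST 2084 "PQ" EOTF: normalized code value (0..1) -> luminance in nits.
# Constants are from the published standard; the function itself is a sketch.
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(code_value: float) -> float:
    """Return display luminance in cd/m^2 (nits) for a PQ code value in [0, 1]."""
    e = code_value ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(1.0))   # -> 10000.0 nits, the PQ ceiling
print(pq_eotf(0.5))   # ~92 nits, illustrating the curve's perceptual (non-linear) spacing
```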

The ability to dynamically control color brightness is the latest addition to the toolbox. These capabilities are embodied in the draft standard SMPTE 2094, known as “Content-Dependent Metadata for Color Volume Transformation of High Luminance and Wide Color Gamut Images.”

Learning how to use these tools in the context of maintaining viewing comfort is a process that will also have to be undertaken by those who are responsible for quality control in content distribution, including instances where SDR content is being converted to HDR for subscribers who own HDR-capable TV sets. This is a tricky domain where engineers employing 10-bit HDR systems will find they are more constrained than users of 12-bit HDR systems, Griffis says.

The challenge is to apply PQ as aggressively as possible without triggering artifacts resulting from crossing the threshold of tolerance for quantization error. “The amount of precision you can apply with PQ depends on how much error you can tolerate,” he says. “What we found is that to maximize precision with a higher error tolerance you need 12-bit sampling.”

Based on McCarthy’s analysis, there are many factors relating to how changes in contrast and color affect the viewing experience that must be taken into account with HDR. “Along with responsiveness to degrees of contrast,” he notes, “developers must consider human perceptual factors such as light and dark adaptation; brightness sensitivity; reaction to ambient light; how color is perceived under different conditions, [and] responses to frame-rate flicker that might be introduced with expansions in dynamic range.”

And that’s not all. “Content producers and distributors will have to determine how human visual response in the HDR environment will impact quality parameters for advertising, channel changes between HDR and SDR programming and the presentation of user interfaces, captioning and other textual and graphic elements,” McCarthy says. “The impact of various HDR modes on bitrate and bandwidth requirements will also be an important consideration, especially for MVPDs.”

One example of the difference in impact between SDR rendering in HD and HDR rendering of the same content can be seen in how HDR affects the size of the pupil, which regulates how much light enters the eye. An SDR signal going from low luminosity to a maximum luminance of 100 nits causes the pupil to reduce its light-admitting aperture from 8 millimeters to about 4.5 mm, McCarthy notes, whereas the pupil diameter shrinks to approximately 2 mm at an HDR-enabled luminance of 1,000 nits – a roughly five-fold further reduction in pupil area compared to SDR.
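A quick back-of-the-envelope calculation from the diameters McCarthy cites (area scales with the square of the diameter; the ratio shown is my own arithmetic) illustrates how much larger the adaptation swings become with HDR.

```python
# Pupil-area arithmetic from the cited diameters; area scales with diameter squared.
import math

def pupil_area_mm2(diameter_mm):
    return math.pi * (diameter_mm / 2) ** 2

dark, sdr_100nits, hdr_1000nits = 8.0, 4.5, 2.0
for label, d in (("dark-adapted", dark), ("SDR @ 100 nits", sdr_100nits),
                 ("HDR @ 1,000 nits", hdr_1000nits)):
    print(f"{label}: {d} mm -> {pupil_area_mm2(d):.1f} mm^2")
print(f"HDR pupil admits ~{pupil_area_mm2(sdr_100nits) / pupil_area_mm2(hdr_1000nits):.1f}x "
      "less light than the SDR pupil at peak luminance")
```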

These differences bring into play considerations about how fast the eye can adapt to increases or reductions in luminosity, since, with HDR, the changes in pupil area from one extreme to the other are much larger than they are with SDR. “Settings for HDR must take into account the impact of luminance on the retinal photoreceptors that determine illuminance in both bright and dark home environments and even outdoor situations in cases where the viewing experience is extended to handheld devices,” McCarthy says. “The level of sensitivity and the speed of adaptation can vary considerably depending on those conditions.”

This is especially significant in instances where the eye has adjusted to a given level of luminosity over an extended period of time in the video sequence – a phenomenon known as “bleaching adaptation.” “The bleaching impact will be an important consideration in determining what average and peak levels of HDR brightness should be in the context of temporal shifts in luminance,” McCarthy notes.

Color, too, is an area requiring special attention with use of HDR. “As luminance increases so does the ability of the human visual system to discriminate between colors at ever smaller gradations,” McCarthy says. “Consequently, more bits would be needed to code color without introducing noticeable errors, particularly at high luminance.”

In other words, a TV show originally recorded in SDR using 8-bit encoding will have to be re-encoded using 10-bit sampling to achieve a satisfactory conversion to HDR. “10-bit encoding may be expected as a minimum bit depth for HDR for any color space,” McCarthy says.


Tech Improvements Fuel TV Nets’ OTT Prospects

Sudheer Sirivara, head of engineering, Microsoft Azure Media Services

Direct-to-Consumer Option Gains Credibility

By Fred Dawson

September 8, 2015 – For the legion of TV networks exploring direct-to-consumer OTT strategies that risk upending the traditional pay TV distribution model, it’s beginning to look like long-standing impediments to delivering a TV-caliber viewing and advertising experience will not stand in the way.

Of course, there are many other dicey issues to be weighed in the direct-to-consumer (DTC) gambit, where live and on-demand content is delivered without requiring viewers to be pay TV subscribers. But, by virtue of the rapid increase in broadband throughput and vendor rollouts of platforms that promise to enable a managed service-like approach to end-to-end content delivery, the subpar performance common to TV program viewing and advertising online may soon be a thing of the past.

The possibilities are evident, for example, in the TV Everywhere offerings from Scripps Networks, which are only available to pay TV subscribers at this point but which demonstrate the capabilities now at hand for building compelling ad revenue-generating alternatives to traditional TV services. Five Scripps networks – HGTV, Food Network, Cooking Channel, Travel Channel and DIY – are streaming live with interstitially placed dynamic advertising tied to location and demographic characteristics, utilizing the TV Everywhere cloud-based platform supplied by Anvato.

“Anvato has greatly simplified the process of deploying our TV brands to screens anywhere and everywhere, and helped us optimize advertisements with dynamic insertion of relevant ads for each authenticated user,” says Alix Baudin, senior vice president and general manager for digital product and operations at Scripps Networks Interactive. “Anvato’s platform and comprehensive toolset provides a compelling, rich TV Everywhere experience for our viewers.”

Of late, Scripps and many other broadcasters have moved away from reliance on client-side apps to facilitate ad placements in favor of server-side solutions that overcome hit-or-miss performance by inserting ads directly into adaptive bitrate (ABR) streams. “Our server-side ad insertion eliminates spinning wheels, ad-blockers and delivers truly seamless viewing experiences on all platforms while maximizing monetization,” notes Anvato CEO Alper Turgut.

Indeed, DAI (dynamic ad insertion) on live programming streams has become vital to the upside potential of the DTC strategies now under serious consideration across the TV programming ecosystem. As a result, bringing together the operational complexities, performance assurance and data analytics essential to making DAI a viable alternative to traditional spot placement has become a top priority, prompting many vendors accustomed to working with network operators in the DAI space to shift their focus to TV programmers hoping to leverage unmanaged network environments to drive new revenues.

“I don’t know of any TV network that’s not serious about going direct to consumer,” says Jay Samit, CEO of SeaChange International, which from its founding has been focused on providing VOD, middleware, workflow management, DAI and other solutions to network operators. “Providing compelling solutions for this [TV programming] market has become a priority for us.”

Beyond advertising, traditional MVPD (multichannel video programming distributor) suppliers’ efforts to capitalize on TV programmers’ need for end-to-end support in the direct-to-consumer arena extend across all relevant service support categories, attesting to the size of the opportunity they see for selling into this segment. This will be strikingly evident at IBC in Amsterdam, where service provider-oriented vendors like Edgeware, a long-time supplier of advanced CDN and cloud DVR technology to telephone and cable companies, are pitching their messages to broadcasters.

In Edgeware’s case, along with calling broadcasters’ attention to its software-defined, hardware-accelerated approach to massively scaling service reach from the core, the vendor is promoting a CDN strategy in which broadcasters could team with network owners to create a much more compelling OTT user experience than is possible with public CDNs. As reported elsewhere, Broadpeak is another traditional service provider vendor that sees a way to foster managed-network quality performance for broadcasters through CDN affiliations with operators.

Edgeware’s CDN strategy also provides broadcasters opportunities to leverage an expanded portfolio of network-based real-time analytics to enable a holistic view into the quality of each user’s experience, which could contribute greatly to satisfying advertisers’ demand for verification of performance. From a public CDN perspective, analytics has long been a component of services from providers like Akamai, but it remains to be seen whether broadcasters will find a way to reach the level of per-session precision that can be attained by traditional network operators.

Meanwhile, turnkey hosted-platform support for DTC operations has grown more robust. For example, Amazon Web Services’ acquisition of Elemental will result in tighter integration of advanced transcoding and packaging capabilities into the Amazon cloud.

“The video market is currently at a major inflection point, and one of a magnitude that only happens every few decades,” says Elemental CEO Sam Blackman in an email message. “Together we intend to provide companies all across the globe the means to efficiently scale their video infrastructures as the industry moves to Internet-based delivery.”

In another sign of the shifting strategic picture, Microsoft’s Azure Media Services has implemented new measures aimed at facilitating live programming distribution and DAI. “We’ve seen a big tidal shift with the unbundling of channels and an emphasis on streaming live as well as on-demand programming,” says Sudheer Sirivara, head of engineering at Microsoft Azure Media Services.

“Content owners want to create their own branded bundles with a viewing experience that meets consumer expectations on all screens,” Sirivara adds. “They don’t want to completely cannibalize the legacy pay TV business, but they want to have their oars in the water with the ability to play around with different combinations of advertising and subscription revenue models.”

In response, Azure Media Services has leveraged its experience with specially built live streaming projects like NBC’s webcasts of the 2014 Winter Olympics and other major events into a pre-packaged set of solutions that any content producer can tap into. “It’s there as a complete end-to-end linear TV workflow with dynamic packaging for mom-and-pop businesses as well as big broadcasters,” Sirivara says.

Server-side DAI is now integral to the Azure service package, he adds. “We can execute DAI through the client on the end-user device, as we’ve always done,” he notes, “but with wide deployment of HTML5-capable browsers and the explosion of devices, people are questioning whether they want to continue deploying clients for each device.”

Moreover, there’s a big challenge in getting ads to run smoothly with the client-side approach. “By using the HTML5 browser as runtime and doing ad insertion on the backend, we are providing a simpler solution that many customers prefer,” Sirivara says.

Presently, he adds, server-side DAI is executed in the cloud. But as DTC services scale, broadcasters may eventually want to insert ads that have been cached as assets at CDN edge points. “Positioning video fragments in the CDN makes sense,” he says. “But when it comes to manifest manipulation (for pulling in ads on the ABR stream), it’s preferable to handle that level of logic at the backend.”
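A minimal sketch of what that backend manifest manipulation looks like for HLS appears below; the segment names, durations and single ad break are illustrative, and this is not Azure Media Services’ actual implementation. The idea is simply that the server rewrites the media playlist so ad segments appear in-band, bracketed by discontinuity tags, before any client sees it.

```python
# Server-side ad insertion via HLS playlist rewriting (illustrative sketch).
def splice_ad(content_segments, ad_segments, break_after):
    lines = ["#EXTM3U", "#EXT-X-VERSION:3", "#EXT-X-TARGETDURATION:6"]
    for i, (uri, duration) in enumerate(content_segments):
        lines += [f"#EXTINF:{duration:.3f},", uri]
        if i == break_after:                       # insert the ad break here
            lines.append("#EXT-X-DISCONTINUITY")   # timestamp/codec reset into the ad
            for ad_uri, ad_duration in ad_segments:
                lines += [f"#EXTINF:{ad_duration:.3f},", ad_uri]
            lines.append("#EXT-X-DISCONTINUITY")   # and back to program content
    return "\n".join(lines)

playlist = splice_ad(
    content_segments=[("seg_001.ts", 6.0), ("seg_002.ts", 6.0), ("seg_003.ts", 6.0)],
    ad_segments=[("ad_001.ts", 6.0), ("ad_002.ts", 6.0)],
    break_after=1,
)
print(playlist)
```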

With about $15 billion sunk into datacenter resources worldwide, Microsoft has developed an ecosystem of partners offering specialized functionalities that can be accessed through Azure Media Services. “Our customers need a robust cloud infrastructure to reach all devices on a global scale, but they need to have the flexibility to access best-of-breed solutions to satisfy their specific needs,” Sirivara says.

One such partner is live video streaming provider iStreamPlanet, in which Turner Broadcasting System has taken a majority stake as it shifts its core technology infrastructure to the cloud. Turner hasn’t acknowledged any DTC plans, but officials have made clear their new positioning gives them the flexibility to move in that direction.

“We’ve worked with iStreamPlanet in the past during the PGA Championship, and they have also delivered world-class events such as the Super Bowl and Olympics to millions of viewers,” says Turner chairman and CEO John Martin. “This partnership will expand our capabilities to offer live events within and outside of the traditional ecosystem and, by bringing iStreamPlanet’s innovative technology in-house, allow us to cultivate future business opportunities on digital platforms.”

But as the technological framework for enabling TV-caliber service over the Internet comes into focus, there remains considerable uncertainty over whether broadcasters can pull off a DTC play without causing further harm to the increasingly fragile traditional pay TV business. Ironically, during the recent stock market turmoil, investors hammered media companies over losses attributable to cord cutting while service providers garnered strong analyst support as providers of broadband connectivity.

For example, Wells Fargo Securities analyst Marci Ryvicker issued an advisory downgrading CBS, Walt Disney Co. and 21st Century Fox while extending buy recommendations for Dish, Comcast, Time Warner Cable and Charter. Noting this has been one of the worst earnings seasons ever in the media sector, Ryvicker said, “We can’t help but think some level of value is transferring from content to distribution.”

Citing ongoing cord cutting, MoffettNathanson analyst Craig Moffett said in a note to investors, “Almost every investor with whom we have spoken has described an almost palpable sense that sector sentiment has changed, some would say perhaps permanently….The process has already begun of sifting through the wreckage and considering positioning ‘the morning after.’”

DTC strategies would seem to put broadcasters in line with the thinking that views broadband as the preferred distribution medium, but not everyone is convinced DTC is a winner. BTIG Research analyst Richard Greenfield in a recent blog post takes issue with DTC as a way out of the revenue squeeze without significant consolidation in the programming sector. Consolidation, he believes, would expand the range of content options from any one DTC distributor.

Challenging Disney’s statements that ESPN or other Disney networks could move to DTC successfully if the company chooses, Greenfield says, “Whereas niche worked in the cable network world, in the DTC world, programmers need to have breadth of content.”

But, as previously reported on a number of occasions, one of the big benefits of DTC for companies like Disney is that there are no channel limits akin to the restricted capacity on MVPD networks, which gives media companies the option to create whatever niche “channels” they want. In the case of ESPN, for example, global reach for minor sports categories that might draw a sizeable audience of devoted fans creates advertising opportunities that otherwise would not exist.

Moreover, established categories where Disney might want to compete with leaders like Scripps’ Food Network create new revenue opportunities and a new competitive arena within niches that aren’t open for competition in the legacy pay TV domain. Disney, the first major content provider to announce plans to move its entire post-production operation to the all-digital consolidated workflow platform developed by Imagine Communications, will be well positioned once the conversion is complete to pull whatever assets it has, from deep archive to the latest productions, into any combination of programs for any interest category, old or new.


Broadpeak Is Targeting NA Ops In Multicast-to-Home Initiative

Nivedita Nouvel, VP, marketing, Broadpeak

French CDN Supplier Prepares to Pursue Opportunities from New Offices in NYC

By Fred Dawson

August 10, 2015 – CDN technology supplier Broadpeak is establishing a presence in North America that could open new synergies between content suppliers pursuing direct-to-consumer OTT strategies and network operators looking for monetization opportunities in the OTT space.

In a nutshell, Broadpeak hopes to persuade network operators that, by supporting a direct-to-home multicast path for OTT content via its nanoCDN technology, they could offer broadcasters and other OTT providers an attractive, lower-cost alternative to streaming ever greater volumes of live content in unicast mode, which subjects every stream – and hence every user experience – to the vicissitudes of congestion and subpar performance. Along with generating revenue from content suppliers who avail themselves of the multicast opportunity, network operators would also benefit from having a multicast solution in place to support their own multiscreen pay TV services, notes Nivedita Nouvel, vice president of marketing at Broadpeak.

“We know network service providers in North America have been deploying their own CDNs and caching servers,” Nouvel says. “We can be part of that story, but where we’re bringing new value and a new approach is in the reconciliation between content providers, network operators and CDN service providers.”

The strategy envisions use of proprietary software running on origin servers to convert ABR (adaptive bitrate) unicast streams into ABR multicast streams for delivery to broadband gateways and IP-enabled set-tops, where the nanoCDN software agent converts the multicast content back to unicast for streaming to IP devices in the home. Distributors can also deploy Broadpeak’s CDN technology in their networks to utilize the company’s multicast technology for delivering live content to devices outside the home.
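The sketch below illustrates the concept in miniature; the multicast address, port and segment-per-datagram framing are assumptions for illustration, and Broadpeak’s actual nanoCDN agent is proprietary software that relies on a robust multicast transport with error repair rather than bare UDP. The gateway process joins a multicast group carrying ABR segments and re-serves whatever it has received to in-home players over ordinary HTTP unicast.

```python
# Conceptual home-gateway agent: receive ABR segments over IP multicast,
# cache them, and serve them to in-home devices over local HTTP unicast.
import socket
import struct
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

MCAST_GRP, MCAST_PORT = "239.1.1.1", 5004   # illustrative multicast address/port
segments = {}                                # segment name -> payload bytes

def receive_multicast():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", MCAST_PORT))
    mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, _ = sock.recvfrom(65535)
        # Assumed framing: "<segment-name>\n<payload>" in each datagram.
        name, _, payload = data.partition(b"\n")
        segments[name.decode()] = payload

class SegmentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = segments.get(self.path.lstrip("/"))
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "video/MP2T")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

threading.Thread(target=receive_multicast, daemon=True).start()
HTTPServer(("0.0.0.0", 8080), SegmentHandler).serve_forever()
```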

By making IP-enabled set-top boxes and media gateways active components of the broadband content delivery infrastructure, the Broadpeak solution delivers other benefits as well, Nouvel notes. For example, when used in conjunction with Broadpeak’s transparent caching capability, the nanoCDN becomes a location for caching the most popular on-demand video content closer to end users, thereby improving quality of experience.

But it’s the support for multicasting live TV for multiscreen access that Broadpeak hopes will anchor its entry into the North American market. “We ended 2014 with 55 percent growth primarily serving the European, APAC and Latin American markets,” says Broadpeak CEO Jacques Le Mancq. “Now we’re opening our offices in New York City and getting in touch with cable and telephone company prospects.”

Broadpeak’s multicasting technology, which overcomes the “joining” limitations of traditional IP multicast, has been one of its great strengths in building business worldwide, Le Mancq adds. “ABR streaming of live content is becoming a big problem,” he notes. “When you have ten million users watching the same event it’s going to cost millions of dollars to support that distribution over unicast streams. We’re engaging now with broadcasters, operators and CPE manufacturers who want to apply our solution to dramatically reduce these costs and ensure a consistently high quality of experience with live content.”
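The economics Le Mancq describes come down to simple arithmetic; in the sketch below, the 5 Mbps per-stream figure and the four ABR profiles are assumptions chosen for illustration.

```python
# Unicast capacity scales with audience size; multicast scales with the number
# of ABR profiles carried. Per-stream bitrate and profile count are illustrative.
viewers, stream_mbps, abr_profiles = 10_000_000, 5, 4
unicast_tbps = viewers * stream_mbps / 1_000_000
multicast_mbps = abr_profiles * stream_mbps
print(f"unicast: ~{unicast_tbps:.0f} Tbps of aggregate delivery capacity")
print(f"multicast: ~{multicast_mbps} Mbps per network branch, regardless of audience size")
```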

As consumers’ use of connected devices to access premium content drives pay TV distributors toward IP-based versions of their broadcast channel services and pulls ever more TV networks into direct-to-consumer initiatives, live broadcast content is taking a much more prominent role in what was once a largely non-linear, on-demand OTT video market. According to the Q4 2014 Global Video Index from online video technology provider Ooyala, the average amount of time users spend per play watching live video on any type of device now far exceeds the time spent per play on on-demand video.

Tracking millions of streams worldwide, Ooyala reported average viewing time for live content hit 34 minutes per play on desktops compared to 2.6 minutes per play with on-demand viewing. Other ratios of live versus on-demand viewing time recorded by Ooyala included connected TVs at 53 to 4.6 minutes, tablets at 9.8 to 3.7 minutes and smartphones at 4.6 to 2.5 minutes.

Of course, the volume of on-demand videos viewed, encompassing YouTube and all the other sources of short clips, is much greater, but Ooyala also records a big surge in the volume of long-form content, including live and on-demand. For example, Ooyala reported that 70 percent of video viewing time on tablets over the period from Q4 2013 to Q4 2014 was spent with content lasting more than ten minutes, and 41 percent of viewing time went to such long-form content on mobile phones. Clearly, a large share of that long-form viewing is going to live content.

This is confirmed by metrics from FreeWheel, the OTT video ad management firm recently acquired by Comcast. In its most recent Video Monetization Report, which surveys the amount of interstitial advertising viewed with live and on-demand programming, FreeWheel reported that 73 percent of ad views on OTT devices in Q1 2015 occurred with consumption of live programming; the corresponding shares on tablets, desktops and smartphones were 30 percent, 13 percent and 7 percent, respectively. While viewing of sports events dominates live online video consumption, FreeWheel reported the amount of regular TV programming viewed as it’s broadcast is on the rise, reaching 7 percent of live viewing in Q1, a 28 percent jump from the previous quarter.

These numbers likely represent just the tip of the iceberg, given the level of commitment to direct-to-consumer initiatives among programming networks and the growing trend toward delivery of “skinny” bundles of live program channels on the part of pay TV distributors. As previously reported, these developments are fueling other multicast initiatives tied to CPE on the part of Akamai and ARRIS.

But Broadpeak appears to be farther along in providing a commercially deployable solution. On the distributor side, an early commitment to the nanoCDN technology comes from France-based Eutelsat, the satellite operator offering support for DTH service providers worldwide. The nanoCDN technology will be incorporated into Eutelsat’s smart LNB (low-noise block downconverter), a dish antenna component designed to bolt interactive value-added services, including Internet broadband, onto broadcast platforms.

Slated for rollout in 2016, the enhanced service capability will use multicast to enable direct-to-home IP video broadcasts. “Eutelsat’s smart LNB is groundbreaking, because it significantly extends the reach of broadcasters inside the connected home,” Le Mancq says.

Broadpeak has made considerable progress gaining support for nanoCDN among OEMs. For example, the company recently forged a deal to embed the technology into Android TV set-tops supplied by Chinese OEM Geniatech. Equipped with a quad-core CPU, Geniatech’s ATV1900 Series set-top is designed to combine legacy and Web-based content and features to deliver “an amazing user experience,” says Geniatech CEO Fang Jijun.

Support for multicasting IP video is essential to that goal, Fang adds. “[G]iven the unpredictable nature of live OTT video consumption, we saw the value in implementing Broadpeak’s nanoCDN agent in our devices,” he says. “The result is a guaranteed high quality of service for our customers at all times and the possibility for operators to deliver live OTT content to virtually any number of subscribers with our STB using the same amount of bandwidth.”

Another partner on the Android set-top front is multiscreen middleware supplier Comigo, which has teamed with Broadpeak to create an integrated solution supporting delivery of personalized, feature-rich live IP video multicast streams to those devices. “This collaboration with Broadpeak transforms the way that IPTV operators deliver live content while providing them with an effective monetization strategy for multiscreen,” says Comigo CEO Motty Lentzitzky.

Broadpeak is also making headway in fostering partnerships with CPE and other ecosystem suppliers through its Broadpeak Open Alliance, Nouvel says, noting alliance partners now include Technicolor, Wi-Fi gateway supplier AirTies and middleware supplier SoftAtHome. Broadpeak also established a relationship with Pace prior to its acquisition by ARRIS, which will allow ARRIS to integrate the nanoCDN technology into Pace-developed terminals if and when operators request that capability, she adds.

By building alliances with set-top manufacturers like Technicolor, which is set to acquire Cisco’s CPE business, Pace and Geniatech, Broadpeak has seeded the market with CPE platforms which network operators can turn to for ready-to-deploy multicast-enabled terminals. But they must be persuaded there’s an advantage to using such devices.

“We need to have the support of network operators,” Nouvel says. “Once operators have deployed the technology in their networks they can easily go to content providers and say, ‘I have this technology in my network that will allow you to greatly reduce your distribution costs while optimizing the quality for end users.’”

But Broadpeak will hold off on engaging North American operators in those efforts until its New York office is up and running with a full staff later this year. “We want to have the office opened first before we’re really going out to see people,” Nouvel says. “We need to have people ready to manage proofs of concept on the tech side so that we’re not just pitching something.”

Broadpeak may find a market open to the idea now that network operators have become more receptive to new technology from non-traditional suppliers. The fact that a home-based point of terminating multicasts of live OTT programming could facilitate operators’ moves into the broadband TV space with their own “skinny” bundles of broadcast pay TV content could turn out to be at least as big an incentive as setting up a revenue-generating means of supporting programmers’ direct-to-home initiatives.
