Applications Archive


New Approach to Developing UIs Fuels OTT Efforts to Win Viewers

Trisha Cooke, VP marketing, You.i TV

You.i TV Scores with Single Codebase Engine for Quickly Mounting Compelling Cross-Platform Experiences

By Fred Dawson

September 16, 2016 – Canadian software innovator You.i TV is making waves in the UI space with a highly flexible development platform that Turner Broadcasting, Sony Crackle, Canadian SVOD operator shomi and a growing number of other players are using to expedite their efforts to raise the bar on user experience.

Turner, for example, has begun to standardize application development on the You.i platform, following successful utilization with apps for TNT, TBS and Turner Classic Movies’ forthcoming direct-to-consumer service FilmStruck. Adding to the momentum, Turner parent Time Warner, Inc.’s investment arm just announced it is leading a $12-million Series B funding round for You.i.

“Delivering video directly to consumers is becoming vital to the media industry, and offering a compelling user interface and app experience is an important piece of this value chain,” says Scott Levine, managing director of Time Warner Investments, who will be joining the You.i TV board. “We were immediately impressed with the You.i TV products, seeing how they create high-quality, unique branded experiences across multiple device platforms, while powering higher engagement rates with users.”

Unlike suppliers of UI solutions that offer MVPDs and other entities fully baked templates complete with recommendation engines and other navigational bells and whistles, You.i gives its customers a toolset that facilitates rapid translation of designers’ visions into practical implementations without the coding hassles that prolong development and prevent full realization of the intended user experience. “We’re enabling the marriage between thinking about what needs to happen in targeting consumers with compelling interfaces and what that interface turns out to be,” says Trisha Cooke, vice president of marketing at You.i.

In so doing, You.i is fueling intensifying competition to win consumers through highly differentiated gateways that convey what’s special about the provider’s offerings in a market awash with me-too viewing options. “You.i Engine brings motion designs to life with pixel-perfect clarity and performance that I’ve never seen before,” says Ann Tebo, director of product management at shomi, the OTT venture backed by Canadian MSOs Rogers Communications and Shaw Communications.

shomi has been using the platform to build an immersive multiscreen experience across iOS, Android, Xbox and PlayStation devices. The range of device platforms developers can reach through the You.i Engine also includes Apple’s tvOS, Amazon Fire, Roku, smart TVs, RDK set-top boxes and more, Cooke notes.

As explained by Cooke, the You.i Engine is an app development platform built on the principles of video game engines, which use design-centric, cross-platform code in conjunction with GPUs (graphics processing units) to expedite development of on-screen presentations. Users of the You.i Engine are able to directly export designs created in Adobe Photoshop and animated in Adobe After Effects into a single codebase that conforms the fully realized UI to every device platform in accord with the requirements of each, Cooke says.

“Production people get an already-coded app to work with,” she notes. “This represents a break with the norm where you see designers handing off their work to the tech guys at a brand, who come back and tell the designer, ‘You need to adjust and compromise so we can work with this’ – to the point that you end up with a fraction of what the design called for.”

Freedom from constraints that have prevented realization of the full potential of innovative designs has been a boon to the cross-platform video aspirations of the Canadian Football League, says Christina Litz, the CFL’s senior vice president of content and marketing. “When we were building CFL Mobile, You.i Engine was the only option that allowed us to realize our vision for the new application without having to compromise on any detail for the fans,” Litz says.

The project took about three months, Cooke notes. “They’re a small league that has been able to do what much bigger entities are doing, which is to use technology to define themselves in the online market.”

You.i customers are finding that once they’ve integrated with the You.i codebase to enable cross-platform rendering of a new UI for one of their brands, they can re-use the codebase for UI development on other brands with minimal recoding. A case in point, Cooke says, is the Canadian content aggregator Corus Entertainment, which leveraged the You.i Engine to create building blocks for their TV Everywhere app, including front-end design, interactions, business logic and back-end integrations, to deliver highly diverse experiences on four core brands across four device platforms. “They were able to launch three brands within six weeks of the first launch,” she notes.

Following its successes with Canadian entertainment outlets, You.i’s engagement with Turner marks an expanding involvement with U.S. entities, including network service providers. “Most of our work now is in U.S.,” Cooke says, noting conversations there have gone from “academic exercises” to commercial dialogs leading to RFPs.

“Service providers are interested in pursuing the OVP (online video publishing) model,” she adds. “That’s where our customers are going.”

You.i has developed two additional avenues for engaging customers beyond its original approach of directly assisting them with the integrations and other aspects of bringing the You.i Engine into their workflows. Now it offers the You.i Engine as a product that can be implemented in-house by customers’ DIY teams, and it works through channel partners like EPAM and Valtech that have integrated the software into their solutions. Valtech, for example, is the channel partner You.i is working with in the CFL engagement.

“Our channel partners are helping us to expand our reach globally,” Cooke says. “There’s no point of the market we can’t serve.”


Connected Cars Are Catching On With Big Implications for MVPDs

Alan Messer, CTO, global connected customer experience, General Motors

GM, AT&T Lead the Way, but There’s Plenty of Room for Cable Operators

By Fred Dawson

August 5, 2016 – Like everything else in the IP world, the business models and opportunities surrounding the connected-car phenomenon are changing at warp speed, signaling that network service providers who may have kicked the Internet-on-wheels tires and passed two or three years ago should think again.

One sign of the fast-changing times is the position occupied by Alan Messer as CTO of global connected customer experience at General Motors. To Messer’s knowledge he is the first to hold such a title in the automotive industry but probably won’t be the only one for long.

“We’re going to see more change in the next five years than we’ve seen in the past 50 in the auto industry,” Messer says, citing use of car connectivity as one of the big four transformative trends underway, which also include the emergence of electric-powered vehicles, autonomous cars and shared use of cars analogous to bike sharing systems now operating in hundreds of cities worldwide.

While tech giants Microsoft, Google and Apple have been evolving various connected-car strategies for several years, AT&T, by far the leading player among network service providers, has established a dominant position through its Drive service. In fact, AT&T’s success raises the question of whether there are meaningful opportunities left for other NSPs, especially cable operators who lack the LTE network support that AT&T has leveraged to gain partnerships with 19 car brands worldwide.

“Absolutely, there’s an opportunity for the local MSOs and MVPDs in general, big time,” says analyst Allan McLennan, who heads up the PADEM Group. “As we advance with network demands, the ability to create new customers and models for connection with over-the-top services of media and entertainment is a natural extension for MVPDs.”

While, according to Parks Associates, as of the start of 2016 only 16 percent of the light vehicles on the road in the U.S. had built-in mobile connectivity to the Internet, a recent study conducted by AT&T and Ericsson found that three out of four consumers consider connected-car services to be an important feature in their next car purchase. Most car manufacturers have at least some models with mobile connections to in-car networks that support a variety of applications.

GM OnStar, the oldest such offering from a car manufacturer, leverages built-in connectivity to support a variety of monthly subscription packages on top of a free basic plan that includes connectivity with data usage surcharges. OnStar premium packages, priced at $20 and up, include offerings such as Protection, covering crash response, roadside assistance and online advisors; Security, with stolen vehicle tracking assistance, ignition block or slowdown of the stolen vehicle and theft alarm notification; and Guidance, with “turn-by-turn navigation” plus travel assistance such as hotel booking and “hands-free calling minutes.”

Chevrolet, the first vehicle brand to offer LTE connectivity on all models, reports a high volume of usage since it introduced the connected service in 2014. “Wireless connectivity has proven to be a beneficial technology for many Chevrolet customers, from contractors who use their Silverado as a mobile office to families using their Suburban on a summer road trip,” says Sandor Piszar, Chevrolet truck marketing director. “As our customers increase their usage of the technology, we are able to make it more affordable for them.”

With 2.1 million connected vehicles purchased so far, Chevrolet customers have consumed more than 3 million gigabytes of data, and Chevy in-vehicle data usage continues to trend upward, Piszar says. For example, more than 60 percent of Suburban owners and passengers use their OnStar 4G LTE Wi-Fi hotspot, with Tahoe and Traverse hotspot usage not far behind. Chevy has cut its data plan rate for all models in half this summer to $10 for 1 gigabyte per month and has added a 4-Gbyte offer priced at $20.

AT&T’s Drive is a value-added service component to the underlying LTE connectivity it provides to Chevy and other brands. Auto makers can choose the services and capabilities that are important to them to complement in-house offerings like those in the OnStar portfolio or Ford’s FordLink service, most of which so far have been directly related to car operations.

As of Q1 2016 about eight million cars were embedded with connectivity to AT&T’s network, including more than 50 percent of all new connected passenger vehicles in the U.S., according to Chris Penrose, senior vice president for Internet of Things (IoT) solutions at AT&T. “It’s incredible to think back to ten years ago when we first started talking with automakers about connecting their cars,” Penrose says. “The interest we are seeing from carmakers and consumers around the world says this revolution is here to stay.”

Beyond providing connectivity and applications of interest to the car companies, it’s clear AT&T’s larger goal is to drive its position in the IoT marketplace. With its launch of a global SIM platform for cars, AT&T has created an environment designed to draw individuals and IoT equipment makers to its network to foster expansion of an IoT ecosystem tied to its brand.

Consumers with Drive service can remotely interact with their AT&T Digital Life smart home service, synching automation of actions such as setting house temperatures, locking and unlocking doors, turning lights on and off, running connected appliances, etc. with the use and location of the car. To help drive car-related applications in that ecosystem the company operates the AT&T Drive Studio in Atlanta, which serves as a working lab and showroom that automakers and third parties can use to build and exhibit innovations.

AT&T isn’t limiting its expansion of IoT into the vehicle environment to owners of connected cars. The company also offers a Wi-Fi in-car plug-in device supplied by ZTE called Mobley, which its mobile customers who don’t have connected cars can use to distribute data accessed from their phones to screens and other devices in the car. AT&T unlimited data plan customers, whether or not they have connected cars, can add the ZTE Mobley to the plan for $40 per month.

Verizon, which has very limited penetration as a supplier to connected-car manufacturers, has also been offering mobile customers an in-vehicle Wi-Fi hotspot service, which it calls Hum. Both AT&T and Verizon are making these hotspot modules available to owners of cars built in 1996 or later, which come with government-mandated On-Board Diagnostics (OBD) ports. OBD provides open access to a vehicle’s diagnostic and performance information, allowing third parties to layer value-added services onto the built-in vehicle platform using smart phones as the front-end interfaces.
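To make the mechanics concrete, the sketch below shows how a third-party app might decode two of the standard OBD-II mode-01 readings (engine RPM and vehicle speed) from raw response bytes. The transport layer – for example, a Bluetooth dongle plugged into the OBD port – is omitted, and the sample frames are fabricated for illustration; this is not code from any of the products mentioned here.

```python
# Sketch of decoding two standard OBD-II mode-01 PIDs from raw response bytes.
# Transport (e.g. a Bluetooth adapter in the OBD port) is omitted, and the
# sample frames below are fabricated for illustration.
def decode_obd(response_hex):
    b = [int(x, 16) for x in response_hex.split()]
    if b[0] != 0x41:                      # 0x41 = response to a mode-01 request
        raise ValueError("not a mode-01 response")
    pid, data = b[1], b[2:]
    if pid == 0x0C:                       # engine RPM = ((256*A) + B) / 4
        return ("engine_rpm", (256 * data[0] + data[1]) / 4)
    if pid == 0x0D:                       # vehicle speed = A km/h
        return ("speed_kmh", data[0])
    return ("unsupported_pid", pid)

print(decode_obd("41 0C 1A F8"))   # ('engine_rpm', 1726.0)
print(decode_obd("41 0D 4B"))      # ('speed_kmh', 75)
```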

But, so far, Verizon’s connected-vehicle services play has been primarily focused on the enterprise fleet management and telematics market. The company recently acquired Telogis, a leader in the vehicular telematics market, to bolster its position in this space.

The upshot of all these developments is that a space has been opened for cable operators to begin leveraging their role in entertainment and information services in ways that suit other needs of connected-car owners that haven’t been the focus of the mobile providers. IoT apps, too, could be part of their play, depending on how deeply they get into that side of the business. Here the role, until such time as they become mobile operators themselves, would be as OTT providers leveraging the underlying connectivity provided by mobile carriers.

“The connected car to me is just another connected exhibition arena for media and entertainment,” says PADEM Group’s McLennan. Indeed, in-car video entertainment is now becoming a practical option with screens positioned for passenger viewing in many new models, creating an environment where cable operators will want to engage with subscribers rather than leaving them to rely solely on OTT video providers.

Cable companies’ local market presence is especially advantageous for creating business models that tie in with car dealers’ needs to generate value-added revenue in the increasingly tight-margined car sales business, McLennan notes. “The dealer revenue model is primarily built off of services – service department, financing, parts, etc., more so than the actual sale of a new car,” he says. “This lends itself to an entirely new perspective and potentially lucrative business model for both the MVPD and the dealer.”

GM’s Messer agrees, noting “the living room on wheels potential is huge.” In fact, he adds, it could be explosive if and when autonomous vehicles take hold.

Beyond simply providing car access to existing content, there’s also an opportunity to build content directly related to the driving experience, such as location-based travelogues that can be curated from the cloud to fit the specific itinerary of a vacationing family. Location-based content with advertising support is something GM is looking at. “We want to enable those services,” Messer says.

Car-specific subscription channels could also be part of the MVPD strategy with a revenue-share for car dealers, McLennan suggests. “A media offering potentially with exclusive packages (front seat/back seat) has strong potential, especially with an already established [dealer] sales channel,” he says.


Vendors Give Broadcasters Tools Enabling OTT Quality Assurance

Kurt Michel, senior director of marketing, IneoQuest

IneoQuest, Tektronix and Edgeware Mitigate Liabilities for Companies that Don’t Own Networks

By Fred Dawson

May 5, 2016 – OTT providers of high-value video content at last have access to quality-assurance solutions suited to assessing whether they’re achieving the end-to-end performance that’s essential to forging ahead with next-generation direct-to-consumer agendas.

Until now, it’s been hard to execute business models predicated on monetizing online delivery of HD- and UHD-caliber content with confidence that goals respecting user experience and fulfillment of advertising commitments are being met. Without that confidence, it’s hard to make a case to consumers and advertisers that online alternatives to traditional pay TV don’t represent a compromise on quality and value.

One vendor that has gone to great lengths to cover all the bases in enabling distributors who don’t own networks to identify and proactively address issues that could damage their value propositions is IneoQuest. Judging from demonstrations of its new FoQus platform solutions at the recent NAB Show in Las Vegas, it appears the company has delivered on commitments to solving the conundrum of how to deliver OTT services with a managed-network level of quality assurance.

NAB also brought to light advancements in this direction on the part of other vendors that have been focused on delivering actionable performance metrics in the direct-to-consumer (DTC) space. Especially noteworthy were new solutions introduced by Tektronix and the feedback from Edgeware regarding the receptivity its previously introduced performance measurement platform is getting from content providers.

Edgeware TV Analytics

Edgeware’s monitoring and analytics product suite, introduced in September as part of the company’s positioning of its solutions for the broadcaster side of the TV ecosystem, has proved to be a point of primary interest to potential customers in this segment, says Johan Bolin, vice president of products at Edgeware. “When we get into meetings with these media companies, the first thing they want to talk about is analytics,” Bolin says. “If they like what they see, it opens discussion about our other solutions.”

The starting point for Edgeware’s TV Analytics platform is the ability to aggregate raw adaptive bit rate (ABR) “chunks” into virtual sessions for meaningful analysis that can be performed on streams at points of origin, CDN edge locations and end devices to gauge bitrates and other indicators of quality on a per-session basis. “If you see the bitrate is way down from the optimum you know you have something happening in the network that’s degrading performance,” Bolin says. “We look at the timestamps on client requests to the server to see how buffering is being managed. If the gaps in requests are unnaturally long, that’s another signal that there’s a problem.”
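A rough sketch of that kind of per-session analysis is shown below: chunk requests from a CDN or origin log are grouped by session, the delivered bitrate is estimated, and unusually long gaps between requests are flagged as possible stalls. The log format, field names and thresholds are illustrative assumptions, not Edgeware’s implementation.

```python
# Minimal sketch of per-session ABR analysis: group chunk requests into
# sessions, estimate the delivered bitrate and flag unusually long gaps
# between requests as suspected rebuffering. All values are assumptions.
from collections import defaultdict

SEGMENT_SECONDS = 4          # assumed ABR segment duration
STALL_GAP_FACTOR = 2.5       # a gap much longer than one segment suggests a stall

def analyze(chunk_log):
    """chunk_log: iterable of dicts with session_id, timestamp (s), bytes."""
    sessions = defaultdict(list)
    for rec in chunk_log:
        sessions[rec["session_id"]].append(rec)

    report = {}
    for sid, chunks in sessions.items():
        chunks.sort(key=lambda r: r["timestamp"])
        duration = len(chunks) * SEGMENT_SECONDS
        avg_bitrate = sum(c["bytes"] for c in chunks) * 8 / duration  # bits/s
        gaps = [b["timestamp"] - a["timestamp"]
                for a, b in zip(chunks, chunks[1:])]
        suspected_stalls = sum(1 for g in gaps
                               if g > STALL_GAP_FACTOR * SEGMENT_SECONDS)
        report[sid] = {"avg_bitrate_kbps": avg_bitrate / 1000,
                       "suspected_stalls": suspected_stalls}
    return report
```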

Without owning the access network, it’s hard for broadcasters to pinpoint the causes of the problems, but at least the Edgeware TV Analytics platform can let them know there’s an issue. Moreover, the platform makes it possible to tie viewer behavior with network behavior in real-time and over long durations to provide extremely granular information on what content is being watched, at what frequency, from which location, on what device type and whether it’s live or VOD, Bolin notes. By correlating data from multiple sources, the platform can look into the relationship between network performance and customer satisfaction, viewing behavior across geographies or screens, viewer mobility, screen swapping and other dynamics, he says.

Edgeware customers, tapping a pool of pre-designed widgets for specific analytic applications or working with Edgeware to design new widgets, can create their own interfaces to collect and analyze information for making actionable decisions, Bolin adds. They can filter the information by data, distribution, format, devices, ISP, etc. and present the data as a variety of charts, tables or geographical maps.

The range of possibilities depends on the volume of data available, which depends on what can be pulled from different types of end devices and the degree to which third-party CDN suppliers are willing to share data. Edgeware is developing open APIs that will make it relatively easy to integrate such feeds into the platform, Bolin says.

Advertising, too, is an important target for Edgeware TV Analytics. “We can see what ads played out, whether a session was broken, whether what we’re seeing matches the volume of ad renderings scheduled by the ad decision server,” he says. “We can map all ads geographically to see the frequency of ad playouts in different areas.”

New IneoQuest Solutions

Taking broadcasters even farther into the process of identifying and rectifying impediments to the quality of consumer experience, the new IneoQuest FoQus platform basically eliminates the disadvantages of being a virtual MVPD compared to a network-based MVPD, says Kurt Michel, senior director of marketing at IneoQuest. By providing visibility and advanced analytics intelligence across the entire video distribution value chain, FoQus allows any OTT provider to ensure the delivery of a reliable, consistent viewing experience to consumers, Michel notes.

“We’ve re-invented everything we’ve done in the traditional network service provider space to bring those capabilities into the open Internet environment,” Michel says. “We’ve created a portfolio of products that gives you that same level of end-to-end access to information even though you don’t own the network.”

That’s a tall order, but judging from the demonstration of the FoQus platform in action on a live video feed at NAB, IneoQuest has met the challenge. In the demo, the system analyzed the availability of the asset, its quality in the pre- and post-encoder phases, what the quality was coming out of the CDN and what the status was at the viewing point, all of which pointed to the convention center’s Wi-Fi system as the cause for a low bitrate. The platform also examined how many people were viewing and determined what the quality was for viewing on phones and other devices.

All the information the administrator sought was instantly presented on the UI. Rather than delivering just the raw quality metrics based on PSNR (Peak Signal-to-Noise Ratio) or MSE (Mean Squared Error), the system employed IneoQuest’s iQ-MOS real-time scoring technique, an on-the-fly execution of the Mean Opinion Score method of grading video. MOS grading relies on algorithms that reflect the actual responses of the human visual system as prescribed in guidelines set by the ITU’s BT.500 video assessment recommendations.
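The distinction Michel draws can be illustrated in a few lines of code: MSE and PSNR are straightforward arithmetic on pixel differences, while a MOS-style grade maps that signal fidelity onto a 1-to-5 opinion scale. The PSNR formula below is standard; the PSNR-to-MOS mapping is a crude placeholder, not IneoQuest’s iQ-MOS technique, which the company says models the human visual system per ITU-R BT.500.

```python
# Toy illustration of raw quality metrics versus a MOS-style grade.
# The MSE and PSNR formulas are standard; the PSNR-to-MOS mapping is a
# made-up placeholder, not IneoQuest's iQ-MOS algorithm.
import math

def mse(ref, test):
    """Mean squared error between two equal-length pixel sequences (0-255)."""
    return sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)

def psnr(ref, test, max_val=255):
    """Peak signal-to-noise ratio in dB."""
    err = mse(ref, test)
    return float("inf") if err == 0 else 10 * math.log10(max_val ** 2 / err)

def mos_estimate(psnr_db):
    """Crude stand-in: map PSNR linearly onto a 1-5 opinion score."""
    # e.g. treat <=20 dB as unwatchable (1.0) and >=40 dB as excellent (5.0)
    return max(1.0, min(5.0, 1.0 + (psnr_db - 20) * 4 / 20))
```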

These assessments were performed by the FoQus platform’s modular iQ Engines, which collect, correlate and process data to provide the comprehensive big-picture views that video providers require to manage their business. Different iQ Engines are offered based on the area of analysis they cover – audience analytics or operational analytics – so that customers can meet their current needs and obtain new modules as needed later, Michel notes.

IneoQuest is also offering a subscription-based, cloud-hosted FoQus|Event service to address streaming events over the Internet, he adds. This service, which leverages Amazon EC2 cloud infrastructure to dynamically position FoQus platform elements as needed, can be used both to test the video distribution system prior to the actual event and to monitor the performance, quality and availability when the live event streaming occurs.

The data processing performed by the iQ Engines utilizes data drawn from the network by the FoQus acquisition elements, which are offered as modules dedicated to specific segments of the network. These include Inspector, which measures the quality of content preparation before it enters the network and is well-suited to headends, origin services infrastructure and video testing labs; Surveyor, which measures network performance and content availability across the Internet, CDNs and fixed and mobile access networks; and Spectator, which measures the playback quality and the viewer’s response to it with metrics that include the selected content, session time, network type and provider, and key quality metrics such as startup time, bitrate and re-buffering.

Along with allowing non-network owners to scrutinize quality performance at all points in the distribution chain, FoQus allows distributors to determine what the optimum bitrate settings are for any given locality for any type of device, which is to say the minimum bitrates required to hit a given MOS target. For example, the distributor can assess what the bitrate needs to be to hit a MOS score – typically between 3 and 4 – that signals good quality has been achieved on a big screen display. Then bitrates can be set for smaller screens where similar MOS scores can be achieved at lower throughput owing to the lower resolution of those screens.
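In code, that tuning exercise amounts to picking, for each screen class, the lowest rung of the bitrate ladder whose measured MOS clears the target. The ladder and scores below are invented for illustration only, not output from the FoQus platform.

```python
# Hypothetical bitrate-ladder tuning of the sort described above: for each
# screen class, choose the lowest bitrate whose measured MOS clears the target.
MOS_TARGET = 3.5  # "good quality" per the 3-4 range cited in the article

# (bitrate_kbps, measured_mos) per device class -- illustrative numbers only
measured = {
    "big_screen": [(2500, 3.1), (4500, 3.6), (6500, 4.1)],
    "tablet":     [(1200, 3.4), (2500, 3.9), (4500, 4.3)],
    "phone":      [(600, 3.5), (1200, 4.0), (2500, 4.4)],
}

def minimum_bitrates(measurements, target=MOS_TARGET):
    plan = {}
    for device, ladder in measurements.items():
        ok = [rate for rate, mos in sorted(ladder) if mos >= target]
        plan[device] = ok[0] if ok else None  # None: no rung meets the target
    return plan

print(minimum_bitrates(measured))
# e.g. {'big_screen': 4500, 'tablet': 2500, 'phone': 600}
```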

“We probe for information and create alarms,” says Peter Dawson, chairman and co-founder of IneoQuest, who conducted the NAB demonstration. “We give people the tools they need to harness how many people are impacted by network problems.”

With such information in hand broadcasters have leverage over CDN suppliers, peering exchange systems and others they contract with to rectify the problems. Or they can make adjustments in the bitrates generated from transcoders at points of origin under their control to get to the MOS levels they’re looking for.

“Once you have that score for the content you can report the quality value to the system, which will then tell you where the bitrate you’ve chosen doesn’t deliver that value and why,” Dawson explains. “You can monitor this channel over time and, when the MOS goes below your chosen threshold, the system reports that up to the platform to analyze the problem.”

To get at what’s happening beyond the point of origin, customers can position the platform at the edge of the network, running as software on commodity servers where the customer has their own physical presence or can negotiate space for the software on servers operated by a CDN provider. Or the customer can place IneoQuest appliances in strategic locations to emulate what’s happening at the regional CDN level. Similarly, the FoQus platform can gather real-time information from end user devices whenever possible or rely on strategically placed end-device emulators.

Tektronix Solutions for Broadcasters

Rounding out the latest innovations aimed at ensuring high-quality DTC services for broadcasters and other OTT distributors is a new platform from Tektronix designed to provide testing and diagnostics support for broadcasters’ transition from SDI-based to IP-based production and post-production infrastructure. Tektronix, a leading supplier of end-to-end video quality assurance technology for network service providers, has been focused on quality control (QC) needs of broadcasters for managing video assets through to playout.

For example, as previously reported, Tektronix has addressed the QC requirements faced by suppliers of file-based on-demand content to the proliferating ecosystem of OTT outlets. The Tektronix post-production solution set includes a highly automated QC platform along with a multi-protocol playback tool enabling highly granular monitoring of files’ conformance to OTT distributors’ ABR and other specifications.

In its latest innovation targeting broadcasters, Tektronix is offering what it says is the first hybrid SDI/IP media analysis platform, dubbed “Prism.” The platform diagnoses and correlates both SDI and IP signal types and helps quickly identify the root cause of the error, whether it is in the IP layer or in the content layer, says Charlie Dunn, GM for the Tektronix video product line.

“The transition from SDI to IP is happening in phases, where there’s a need for a test and monitoring system that can provide visibility into new kinds of problems to allow engineers and operators to keep things on a consistent path as they manage hybrid facilities,” Dunn says. “We’re enabling broadcasters to ensure the new systems they’re putting in place will deliver all video content, including 4K UHD with HDR (high dynamic range) enhancements, from post production into playout at the quality levels they expect.”

Dunn notes that Tektronix has expanded its quality assessment capabilities in the broadcast domain to take into account the challenges posed by HDR. “We’re addressing issues like color grading that come into play with HDR,” Dunn says. “For example, there’s a question of shading in live production where you have to make sure that when you shade for HDR, the content isn’t washed out when it’s played on an SDR (standard dynamic range) display. We’re doing research with customers to determine how to put this type of analysis into our solutions.”

The success of Tektronix in addressing quality-assurance needs of broadcasters was underscored with news that NBC Sports Group’s Olympics division will use the vendor’s equipment to handle audio and video testing and live distribution quality monitoring for its production of the games this summer in Rio de Janeiro. The Tektronix equipment, operating across production, post production, transmission and distribution workflows, includes the firm’s WFM8300 Waveform Monitor, which supports numerous UHD formats and ITU-R BT.2020 wide color gamut, Dunn notes.

Terry Adams, vice president of engineering at NBC Olympics, notes that, for the first time his division, which has used Tektronix equipment in its last eight Olympics productions, will be using the vendor’s Sentry probes to monitor distribution performance. “We will be utilizing 12 of the Sentry units located across the country to monitor the hundreds of live production and distribution streams generated in Rio,” Adams says. “Tektronix has incorporated many new features based on requirements we identified during our coverage of the London and Rio Games.”

Clearly, a key element essential to the transformation of the TV business in the OTT era is now in place. The risk of flying blind with no recourse to identifying, let alone rectifying problems as they occur in real time has been eliminated from the strategic planning equation.


Getting Real about Virtual Reality

In a galaxy far, far away.

TV Universe Begins to Stir as Technology Gains Momentum

By Fred Dawson

February 1, 2016 – As the hype surrounding virtual reality moves into mainstream entertainment circles, network service providers face still another situation where they have to weigh how far to go with allocation of human and infrastructure resources toward a vaguely defined service opportunity.

With the failure of 3D to get off the ground still fresh in network operators’ minds and the rollout pace of services supporting 4K UHD and High-Dynamic-Range (HDR) formatted content still up in the air, network operators may see little reason to dive into serious consideration of VR at this early stage. But they can’t afford to be too late to the party if nascent but accelerating attempts at making VR part of the OTT app mix catch hold.

Decades of VR development and misplaced expectations have given way to an unprecedented burst of enthusiasm buttressed by multiple research projections and a new generation of headsets, production tools and application concepts that are drawing the engagement of major players. These range from the massive $2-billion bet made on VR by Facebook with its purchase of VR technology developer Oculus in 2014 to Google’s aggressive approach to building a mass market through its Cardboard initiative and new YouTube programming to toe-in-the-water activities on the part of entertainment giants such as Netflix, Disney, Discovery Communications, the BBC, DirecTV, Comcast and many more.

A recent Goldman Sachs Group report predicted an $80-billion global market for VR and AR (augmented reality) by 2025 with $45 billion going to hardware sales and the remainder to software applications. On the software side, the $7.4-billion share Goldman Sachs sees going to video entertainment ($3.2 billion) and network-delivered viewing of live events ($4.1 billion) is second only to the $11.6 billion projected for the VR games market by that year, with healthcare, engineering and defense leading in other categories singled out for VR and AR applications.

While Goldman Sachs says AR software revenues will account for 25 percent of the $35-billion software pie, figures projected for the video and event as well as gaming software markets are focused on VR apps. In fact, while Goldman Sachs says VR and AR have the “potential to become the next big computing platform,” it predicts the far greater impact will come from VR, “given VR’s technological progress and momentum” and the fact that AR has “more technological hurdles to overcome, including challenges in display technology and the real-time calibration and processing of the real-world physical environment.”

Goldman Sachs is not alone in predicting a big future for VR. The same week in early December that its report came out, Macquarie Bank analyst Ben Schachter issued an advisory note that echoed Goldman’s core point: “We continue to believe that VR/AR is poised to be the next computing platform,” Schachter says. “And like the transition from desktop to mobile, it will be disruptive.”

While Schachter and others view 2016 as the year that VR surges to an unprecedented level of consumer adoption, in the grand scheme of things the ramp-up in the near term will be relatively slow. “Less will happen in two years than you’d think,” he says, “but more than you can possibly imagine will happen in the next 10….[O]nce these devices begin to get into consumers’ hands and developers launch content that moves beyond the ‘wow’ moment and into uniquely, useful experiences, it will be clear that entertainment, communication and many enterprise functions will change dramatically over the coming decade.”

For 2016, the Consumer Technology Association (formerly Consumer Electronics Association) estimates unit sales of VR headsets such as Oculus Rift, HTC Vive, Sony PlayStation VR and Samsung Gear VR will reach 1.2 million and generate $540 million in revenue, marking a 440 percent increase over 2015. Globally, Deloitte Global, in another study released in December, projects 2.5 million VR headsets sold in 2016 will generate $700 million in revenue with another $300 million generated by sales of ten million game copies selling at anywhere from $5 to $40 per unit.

“We do not expect VR to be used to any great extent in television or movies in 2016,” Deloitte says, noting the absence of content or even much in the way of commercially viable production gear. “By the start of 2016, we anticipate a small range of suitable cameras may have been launched onto the market, but the cost of purchasing or renting professional grade devices may initially be prohibitive for many projects.”

Equally if not more significant is the fact that there’s a steep learning curve ahead for VR filming, where the need to capture the 360-degree perspective means cameras will have to be invisible from all angles and under automated control to keep crew members out of the picture. Sports poses an even bigger problem, given that cameras in the field could obstruct player movement.

Handling the massive file sizes will be a big issue in post-production. “One production level camera features 42 cameras capable of 4K resolution,” Deloitte says. “This captures a gigapixel image (about 500 times the size of a standard smartphone image), and shoots at 30 frames a second. One subsequent challenge of capturing images at this level of resolution will be determining how to store, transmit and edit the files.”

The biggest factor driving the expectations for 2016 has been the improvement in headset technology, or what in industry jargon is known as HMD (head-mounted display) technology. As noted by Deloitte, the market will consist of two types of VR headsets: fully featured systems and mobile-optimized systems. “Full feature devices will likely be designed for use with either the latest generation games consoles or PCs with advanced graphics cards capable of driving high refresh rates,” Deloitte says. “‘Mobile VR’ incorporates a high-end smartphone’s screen into a special case, enabling the headset to fit more-or-less snugly on the user’s head.” Samsung’s Gear VR, priced at $99 and powered by Oculus technology, is a leading example of this VR category.

The high-end device lineup is led by the Oculus Rift, pre-order priced at $599 with a late March rollout date; the HTC Vive, slated for April rollout with price yet to be announced, and the Sony PlayStation VR, due out later in the year. All three have been widely reviewed in prototype mode, most recently at CES 2016, with enthusiastic feedback from gaming specialists.

For example, Gizmag reviewer Will Shanklin found all three compelling on a visit to CES. “All three give you the basic VR experience of transporting you somewhere else,” Shanklin writes. “If any one of them were the only virtual reality that existed, we’d still be excited about this emerging frontier.”

While there are some consequential technical differences, the overarching consideration for buyers is availability of good content, which gives the edge to Oculus in Shanklin’s opinion. Oculus, in addition to having the backing of Facebook and the broadest range of content already in play, has been drawing developers to the platform through its Oculus Studios with plans to introduce 20 more VR games this year. Moreover, Oculus is teamed with Microsoft’s Xbox One, making the game console an alternative to high-performance PCs as the processing platform for playing games, which is sure to spur additional game development.

Other reviewers have given the content nod to HTC Vive in light of its deal with Valve, a leading video game maker which is leveraging the headset to support an integrated hardware-software gaming system it’s calling SteamVR. And, of course, in the wings is PlayStation VR, whose backers promise to have a boatload of content available with launch.

These headsets far outperform previous VR devices with extremely high-resolution displays – one 1080 x 1200 OLED (Organic LED) display for each eye, refreshed at 90 Hz in the cases of Rift and Vive and at 120 Hz with PlayStation VR. Driving those displays takes serious processing power, which means, with the exception of Oculus Xbox users, consumers using Rift and Vive must also have access to a high-end PC to run the headsets. Oculus is offering a complete package with appropriate PC and headset priced at $1,499.

Reviewers offer varying assessments of how the systems compare from a technological performance standpoint, but Vive seems to be winning highest praise for enhancements that include motion tracking capabilities utilizing two wireless infrared cameras placed at the corners of a room to interact with the headset’s 37 sensors. As a result, scene changes tied to head and bodily movements across a room create the sense of physical exploration in the VR experience. The system has a camera built into the headset to provide a user-activated view of the real surroundings that allows the user to avoid stumbling into things.

Writing about the Vive for online publisher Pocket-Lint, reviewer Stuart Miles notes Vive’s full movement capabilities offer “a more comprehensive range of possibilities than many other units.” He is also impressed by “how smooth the experience is. Graphically there’s no sign of lag, no delay as you move your head, hands or body…There’s no flicker and the headset is pretty comfortable too, with the soundtrack being completely enveloping.”

But it’s a measure of how far VR veterans have gone in adapting to what others might find off-putting that a reviewer can rave about an immersive VR experience that includes “an umbilical cord of cables coming out of the back” of the headset, not to mention a headset that looks like “a giant scuba diving mask” with a headband “more akin to a gas mask fitting…than skiing goggles” that serves to reduce “some of the front weighting of the unit.”

As the Deloitte report cautions, “Any company that is considering VR in any regard should have a careful look at the likely addressable market. Recent breakthrough technologies that required consumers to wear something on their face have not proven to be mass market successes. While VR headsets may sell better than smart glasses or 3D TV glasses, also consider that using the technology may require a set of behavioral changes that the majority of people do not want to make.”

CableLabs, which in its recent re-organization has listed VR as a major area of ongoing interest, took pains this past year to gauge consumer response to the VR experience by bringing a cross section of non-users into its facilities for a test run. Surprisingly, the response was “overwhelmingly positive,” reports Steve Glennon, principal architect for CableLabs’ Advanced Technology Group.

Writing in a recent blog, Glennon says, “[W]e were surprised by how few expressed any discomfort and how positively regular people described the experience.” Fifty-seven percent of the visitors said they “had to have it,” while 88 percent could see themselves using a head-mounted display within three years. “Only 11% considered the headset to be either uncomfortable or very uncomfortable,” Glennon notes, adding that “96% of those who were cost sensitive would consider a purchase at a $200 price point,” which is well within the range of mobile VR headsets like Samsung’s Gear VR.

But content availability is a big issue. “[W]e asked what would stop people from buying a virtual reality headset,” Glennon says. “High on the list of items was availability of content. Setting aside VR gaming, people didn’t want to spend money on a device that only had a few (or a few hundred) pieces of content.”

Judging from recent developments, a dearth of non-gaming content may not be a problem for long. A new survey of Hollywood content creators jointly sponsored by CTA and NATPE (National Association of Television Program Executives) finds most believe VR represents a game-changing method of storytelling. The consensus of 16 executives interviewed in depth for the study was that, beyond gaming, the strongest genre meriting VR development is horror. Sports and concerts were also cited as promising areas of development.

But respondents also made clear they recognize serious hurdles must be overcome before a significant tide of non-gaming VR content emerges. These include the need to generate a viable model for content creation, including a determination of the endurance cap for sustained viewing, with a clear pathway to monetization.

“The future of VR is dependent on quality content and, with this study we wanted to provide a more comprehensive look at Hollywood’s attitudes on the many opportunities and challenges this technology faces,” says NATPE president and CEO Rod Perth. “This study presents a snapshot of the types of genres that could be adapted to this dynamic technology but also offers a realistic picture of its limitations.”

CableLabs’ Glennon cites three VR content factories that are acting to fill the void, including JauntVR, which recently secured $66 million in investment funding led by Disney, Creative Artists Agency’s Evolution Media Capital and China Media Capital; Immersive Media, whose VR productions include an American Express-sponsored Taylor Swift concert video, and NextVR, which, Glennon notes, “seemingly wants to become the ‘Netflix of VR.’”

NextVR has caught the enthusiasm of Comcast Ventures, which joined with other investors in a recent $30.5-million financing round. “As the preeminent company that can transmit live high definition virtual reality over the Internet, NextVR’s lens-to-lens system captures and delivers immersive experiences for marquee live events,” say Comcast Ventures managing director Michael Yang and principal Gavin Teo in a recent blog post.

Recent big TV network forays into VR include a VR-enabled version of a CNN-sponsored Democratic primary debate, Discovery Communications’ Discovery VR, a series of short-form VR experiences in nature, and the BBC’s creation of a VR version of its Strictly Come Dancing TV show. Netflix has broken into VR with an app for viewing through the Samsung Gear VR that puts the viewer in front of a virtual giant screen located in different settings, such as a ski lodge, to watch Netflix movies or TV episodes.

Among pay TV distributors, DirecTV, focusing on mobile VR development at its digital innovation lab, has taken the lead with a VR app that takes viewers inside the boxing ring to experience a recorded fight. In a recent interview with Multichannel News, DirecTV senior vice president of digital entertainment Tony Goncalves was quoted as saying, “We’re not sure how VR will evolve and shift from flat, non-immersive TV experiences to the level of almost putting consumers inside a movie, but for pay TV and video content providers, it’s very important to explore new ways of seeing content, and VR falls into that category.”

Perhaps the biggest center of early network-delivered VR activity can be found at Google, which has over 14,000 “spherical videos” running on YouTube, much of it user-generated content. “[With] VR (virtual reality) we’re starting to see some of the first signs of really incredible storytelling,” says Ben Relles, head of comedy and unscripted at YouTube.

Google is also heavily engaged with developers targeting Google’s $20 Cardboard headset with VR apps delivered over mobile links. For example, The New York Times has distributed Cardboard HMDs to its 1.1 million subscribers in advance of releasing the first in a series of VR short films, The Displaced, a documentary following the lives of three refugee children from South Sudan, eastern Ukraine and Syria.

While all this activity adds up to a drop in the bucket next to mainstream content development for 4K UHD, there’s no reason at this point for pay TV distributors not to take seriously the possibility that before too long they may be packaging VR services for delivery over their broadband pipes. This will represent another strong incentive for building out gigabit access networks, given that, by CableLabs’ estimate, a single VR stream suited for use on high-end systems like the Oculus will consume between 150 and 200 Mbps of bandwidth.
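That bandwidth figure is easy to sanity-check with a back-of-envelope calculation: a full 360-degree sphere carries many times the pixels of the visible viewport, and compressed video typically budgets on the order of a tenth of a bit per pixel. The inputs below are rough assumptions, not CableLabs’ model, but they land in the same range.

```python
# Back-of-envelope check on the 150-200 Mbps estimate for high-end VR streams.
# All inputs are rough assumptions, not CableLabs' actual model.
sphere_width, sphere_height = 8192, 4096   # assumed full-sphere resolution
fps = 60                                   # assumed frame rate
bits_per_pixel = 0.10                      # rough compressed-video budget

mbps = sphere_width * sphere_height * fps * bits_per_pixel / 1e6
print(f"~{mbps:.0f} Mbps")                 # ~201 Mbps, in line with the article
```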

“This is not just hot and sexy, a passing fad,” says CableLabs’ Glennon. “It has massive potential to transform lots of what we do, and we can all expect incredible developments in this space.”


ADB Scores Wins in Switch To Solutions for IP Services

Gerald Wood, CMO, ADB

Cloud Middleware, Support for Connected Apps Fill Gaps in B2C and B2B Segments

By Fred Dawson

October 9, 2015 – In another reflection of how profound changes in premium video strategies are impacting suppliers, ADB, known for its advanced set-top boxes, is making waves with a new portfolio of wide-ranging software solutions for the TV and Internet-of-Things markets.

“Over the last couple of years we’ve seen declining margins on hardware amid a lot of tough competition as boxes have become more commoditized,” says ADB CMO Gerald Wood. “We either had to stay with that strategy or move to adding more value to our operations. So we shifted our focus to the software side.”

The result is a new set of application-specific products and services packaged as “Connected Solutions” buttressed by a cloud-based, device-centric middleware platform that acts as a transition layer between Connected Solutions, industry platforms and protocols. This “ConnectedOS” serves to simplify integration, accelerate speed to market and reduce the costs of delivering connected services, says ADB CEO Peter Balchin.

“This is a new chapter for ADB,” Balchin says. “We believe that in an age of Internet connectivity there is a need for fast, reliable and cost-effective solutions that ensure consumers and businesses are always connected.”

The strategy is already beginning to bear fruit, resulting, for example, in a collaboration with Polish satellite pay TV provider Cyfrowy Polsat aimed at integrating ADB’s Connected Solutions and Personal TV software with the MVPD’s in-house manufactured set-tops and other devices. “Our partnership with ADB will help us to give our customers an enhanced TV experience that will meet their TV anytime, anywhere needs,” says Cyfrowy Polsat CTO Dariusz Dzialkowski. “ADB is a renowned global player with great local knowledge.”

Indeed, Wood notes, Polish universities have been an important recruitment source for ADB in building its software expertise. Now the guiding light for ADB isn’t so much about capitalizing on the next hardware advancement in technology.

“We’re taking a different approach by following market developments to determine what we need to do,” he says. There are needs aplenty, he adds, including support for “scalable convergence of TV and mobile services, managed devices in the home, personalized EPGs, multiscreen, second screen, cloud DVR, catch-up and OTT affiliations.”

That strategy has already paid off handsomely in the B2B realm where modules from ADB’s Commercial Video Solutions (CVS) portfolio have been deployed by major U.S. MSOs to crack the hotel market. Used with a client application running on smart TVs and set-tops to render hotel-specific UIs, the network-agnostic platform supports linear and VOD TV and OTT content as well as local advertisements and in-house promotions of hotel services, Wood says.

“We’ve been tremendously successful with CVS in the U.S. where large operators, including Time Warner Cable, Cox and Bright House, are using the platform to offer solutions to hotel groups,” he notes. “Through operators in North America and elsewhere, our technology is now operating in some 200,000 hotel rooms.”

Another market ADB is targeting with its OSConnect middleware is the Internet-of-Things (IoT). The company’s efforts there as well as its ability to marry the legacy and OTT video environments were greatly enhanced with its 2010 acquisition of Pirelli Broadband, a leading supplier of broadband gateways, fixed/mobile convergence devices and broadband systems management solutions to telecom operators in Europe and Latin America. “We’re now able to better understand the broadband Internet side of the business and to bridge between broadcast and broadband,” Wood says.

The IoT product suites target both the B2C and B2B markets, says Jamie Mackinlay, vice president of business development at ADB. On the B2C front, ADB is providing operators the means to build a managed multi-app service while its B2B solutions enable Internet-connected sensors to interact with an IT cloud infrastructure to support device-specific applications.

“At a basic level we’re providing an intelligent message bus with business logic designed to interoperate with third parties to pull together the applications and create a compelling user experience,” Mackinlay says. In the B2B scenario he points to ADB’s engagement with Whirlpool, which is bringing IoT capabilities to washing machines, dishwashers, refrigerators and other appliances.

For example, with washing machines the IoT apps include timing of wash runs to coincide with periods of low electricity costs, identification of defects before they inconvenience users and support for marketing tie-ins that allow detergent suppliers to promote their brands by offering advice on efficient use of their products. Or, as another example, in a household with connected refrigerators and connected fitness machines, the refrigerator can advise residents what to eat based on their treadmill activity. “We’ve learned that Whirlpool believes that with these advances they’re now on par with Bosch and far ahead of other competitors,” Mackinlay says.
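The off-peak laundry scenario reduces to a simple scheduling problem: given an hourly electricity price forecast, start the cycle in the cheapest window that still finishes by the user’s deadline. The toy sketch below, with invented prices and cycle length, shows the idea; it is not ADB’s or Whirlpool’s implementation.

```python
# Toy version of off-peak laundry scheduling: pick the start hour that
# minimizes electricity cost while finishing by a deadline.
# Hourly prices and cycle length are invented for illustration.
def cheapest_start(prices_by_hour, cycle_hours, deadline_hour):
    """prices_by_hour: list of 24 prices indexed by hour of day."""
    candidates = range(0, deadline_hour - cycle_hours + 1)
    def cost(start):
        return sum(prices_by_hour[h] for h in range(start, start + cycle_hours))
    return min(candidates, key=cost)

prices = [0.10] * 6 + [0.25] * 12 + [0.15] * 6   # cheap overnight, pricey daytime
print(cheapest_start(prices, cycle_hours=2, deadline_hour=7))  # -> 0 (midnight)
```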

On the B2C side, ADB’s Personal IoT solution leverages the OSConnect middleware in conjunction with interoperability with Apple, Android and other ecosystems to enable end-to-end remote provisioning, assurance and control of connected objects. The platform supports creation of a collaborative service model between network operators and domain providers that can be extended on top of traditional offerings as part of bundled service packages.

ADB is finding many other use cases for its software knowhow, including enabling MVPDs to extend OTT as a managed service and providing pure OTT providers a more robust path in direct-to-consumer offerings. Advertising, too, has become a focus, Wood says, noting that advertising platform provider Invidi is utilizing ConnectedOS for programmatic advertising implementations in Europe. “ConnectedOS is also being used in M&A environments to bring different companies’ operating systems together,” he adds.


Open-Source Solution Facilitates Use of Browsers in Pay TV Apps

Thijs Bijleveld, senior vice president, sales & marketing, Metrological

Metrological Makes HTML5 Enhancements Available to Users of the RDK Framework

By Fred Dawson

October 2, 2015 – Support for enabling robust rendering of cloud-based applications on set-top boxes, long the domain of proprietary middleware solutions, is now available as an open-source option, potentially setting in motion a more rapid transition to advanced services for service providers utilizing the RDK and other set-top frameworks.

The new open-source HTML5 browser enhancements, developed by Metrological and now incorporated into the Reference Design Kit software stack, have already had an impact on advanced service initiatives underway at Comcast and Liberty Global, which, along with Time Warner Cable, are the core MSO partners in Reference Design Kit Management, LLC. The two cable giants see the Metrological solution as a way to enable cloud-based applications to run on set-top boxes with the speed and consistency of native apps while avoiding the costs normally associated with such capabilities developed by various cloud middleware suppliers.

“With these enhancements for the RDK, we hope to see HTML5 experiences with the visual fidelity and graphics performance normally reserved for native apps,” says Sree Kotay, executive vice president and chief software architect at Comcast Cable. “Under the hood, we’re constantly looking for ways to enhance our X1 Entertainment operating system, and we believe Metrological’s contribution will have significant impact.”

Comcast plans to trial the STB browser software enhancements for use on its RDK-based X1 Platform later this year. Liberty Global, which currently uses an earlier version of Metrological’s browser, plans to upgrade to these new enhancements for its RDK-based Horizon TV platform.

“At Liberty Global we set out to optimize the browser to deliver a high performance consumer experience with support for rich UIs and HTML5 apps on Horizon TV,” says Balan Nair, executive vice president and CTO at Liberty Global. “This browser capability will enable us to more easily customize user experiences and offer high performance TV services, such as our integrated app experience based on the Metrological Application Platform, to our customers in ways that weren’t previously possible.”

Metrological’s open-source approach to overcoming the drawbacks that limit the usefulness of browsers to pull apps from the cloud into the panoply of feature options available on pay TV UIs parallels many of the techniques used in proprietary solutions. But, says Thijs Bijleveld, senior vice president sales and marketing at Metrological, his company is not looking to compete with middleware suppliers.

Instead, the goal is to create a more favorable environment for operators’ use of Metrological’s cloud-based Applications Platform, which, as previously reported, supports a device- and software-agnostic managed service that includes app store deployment, lifecycle management, service assurance and legal content management. “We’re not a browser company and don’t have a business model for that,” Bijleveld says. Rather, by making its solution available on an open-source basis the company hopes to inspire wide-scale adoption where “the better the browser performs the better our apps perform.”

In making RDK a part of its app outreach strategy, Metrological has been utilizing the RDK Emulator, a testing framework, to allow app developers and operators to remotely develop and test apps on top of the RDK. At the same time, the Metrological SDK (software development kit) is designed to allow developers to create apps for other environments as well, including DVB, OpenCable and IP.

Operators are able to tap into all the apps Metrological hosts, currently numbering about 250 with another 50 slated to be added before year’s end, to enhance their main screen TV content with feeds from OTT sources, Bijleveld says. “Increasingly, our platform is being used by operators to distribute niche content such as ethnic programming and sports,” he adds. “With our scheduling and personalization capabilities, if operators have good recommendation engines and subscriber profiles, we can determine which selection of content will be served to a given end user.”

By offering an open SDK, Metrological makes it possible for a global community of developers to create apps for its platform, greatly extending their reach across multiple operators, set-top frameworks and geographic regions, he explains. At the same time, operators can develop apps that will be made available just for their own use.

In growing numbers, operators are recognizing the limits of relying on apps residing natively on the set-top to expand their service portfolios. “At some point, you run up against the limits of CPU power and memory to support apps,” Bijleveld says. “Operators are finding consumers want personalized OTT offerings, which means the operator has to have access to a great number of apps and the mechanisms in place to search and discover the right apps for each person. This can’t be done natively.”

But as such considerations drove operators to explore the use of HTML5-enabled browsers to manage these apps at the set-top, it became clear something had to be done to improve performance, he adds. “During our deliveries we discovered more and more that the browser and even HTML5 needed further improvement and optimization to get the best out of our app platform,” he notes.

“At some point Comcast and LGI acknowledged they would use browsers as the basis for launching new services,” he continues. “That’s when we decided to take on the challenge of improving browser performance to match and even exceed user experience compared to native apps running in the STB. Now we’re seeing that the new browser version used by LGI, for example, is performing better than apps that run natively on the same device.”

Metrological has developed the HTML5 browser enhancements utilizing an open-source environment for the browser core known as “WebKit for Wayland.” The enhancements enable better rendering of apps and next-generation UIs, along with better window management to control multiple applications and improved control over resources; and they deliver these improvements with a smaller software footprint and significantly less memory usage, Bijleveld says.

“HTML5 creates a standard, but it doesn’t solve the issue of having to deal with the fact that every device is different,” he explains. “If you want a uniform experience you need an in-house browser supported by a framework that can run cloud-based services consistently across all devices.” And, he adds, operators need to be able to control which devices have access to any given app.
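To make that access-control point concrete, the sketch below shows, in Python, the kind of per-device entitlement check a cloud app framework might run before exposing an app to a particular box. The app names, capability flags and policy table are hypothetical illustrations, not Metrological’s actual API.

    # Hypothetical policy table: which apps a device profile may load.
    # App names and capability requirements are illustrative only.
    APP_POLICY = {
        "sports_ott": {"min_ram_mb": 512, "needs_gpu": True},
        "weather":    {"min_ram_mb": 128, "needs_gpu": False},
    }

    def allowed_apps(device):
        """Return the apps this device profile is entitled to load."""
        return [
            app for app, req in APP_POLICY.items()
            if device["ram_mb"] >= req["min_ram_mb"]
            and (device["has_gpu"] or not req["needs_gpu"])
        ]

    # An older set-top without a capable GPU only sees the lightweight app.
    legacy_stb = {"ram_mb": 256, "has_gpu": False}
    print(allowed_apps(legacy_stb))  # -> ['weather']

In practice such a policy would live in the operator’s app store back end, but the principle is the same: the framework, not the device, decides which apps are offered where.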

One of the advantages to using an open-source approach to addressing these challenges is that “you can leverage the rapid increase in innovations coming from contributors to open-source browsing,” he notes. “For suppliers of proprietary solutions, it’s a real challenge to keep up with this scale of innovation.”

So far, Metrological has employed over 130 parameters for caching, graphics rendering and other functions, either developed in-house using the open-source core or applied from existing open-source contributions, to create a robust HTML5 browser environment, Bijleveld says. For example, much as some proprietary solutions have done, the Metrological solution takes advantage of OpenGL (Open Graphics Library), a cross-language, cross-platform API designed to interact with GPUs (graphics processing units) in chipsets to enable hardware-accelerated rendering of 2D and 3D graphics.

Operators can use the Metrological enhancements with any off-the-shelf HTML5-enabled browser to create a robust user experience, or they can combine their own enhancements with those provided by Metrological to further differentiate the user experience. “We’re doing everything we can to enable more open, flexible and robust use of browsers, which is pretty much in line with the RDK approach,” Bijleveld says.

“These innovative new browser enhancements are a prime example of how RDK member companies are using modern technologies to help accelerate the deployment of new TV services,” confirms Steve Heeb, president and general manager of RDK Management. “Metrological is bringing this contribution into the RDK, and thanks to their efforts, it will be available for the entire RDK community.”


As IPv6 Momentum Builds SPs Look at Exploiting Full Potential

Michael Kloberdans, lead architect, home networking, CableLabs

Internet of Things, CPE Virtualization Are Key Motivators

By Fred Dawson

August 17, 2015 – IPv6 penetration in public networks and personal devices has reached the point where the true dimensions of its potential are coming into view, going well beyond its role as a solution to IPv4 address exhaustion in the traditional service domain.

The new Internet addressing protocol brings new dimensions to service providers’ efforts to build new services, improve user experience and facilitate the migration to CPE virtualization and other aspects of network function virtualization (NFV). The ability to directly address every user device is especially important to advancing the Internet of Things service paradigm.

Yet, as these possibilities take shape on the not-too-distant horizon, most of the commercial interests for whom the Internet has become a vital lifeline are oblivious. Indeed, amid widespread skepticism that IPv4 will be losing steam anytime soon, the casual observer might assume that, after the intense drum beating three years ago touting the need to convert to IPv6, not much has happened in the relatively quiet period since.

But over that time the build-out of dual-stack support in broadband networks for carrying IPv6 traffic alongside IPv4 traffic has accelerated worldwide, with U.S. MSOs and telcos in the lead among service providers registering significant volumes of IPv6 traffic. “I hear people often say IPv6 is a failure,” says Jack Waters, CTO at Level 3 Communications. “I take a different view. We haven’t had our [IPv4 exhaust] flashpoint yet, but it’s coming. It could be a train wreck over the next five to eight years, which is going to force the IPv6 issue.”

Even as the American Registry for Internet Numbers (ARIN), which administers address allocations for North America and the Caribbean, reports it is running out of IPv4 addresses to allocate this month, the general perception outside the service provider community has been that the alarms emanating from ARIN and the other regional Internet registries have a cry-wolf quality, given the many unused IPv4 addresses stockpiled by major service providers and the pervasive use of Network Address Translation (NAT) to assign multiple private addresses to devices connected behind a single publicly addressed device.

“Service providers are ready,” says Jeff Doyle, modern networks evangelist for the SDN (software defined networking) solutions provider Big Switch Networks. “Content providers need to start being ready because of what service providers are doing. More and more native IPv6 traffic is going to be coming into their websites.”

This is borne out by the latest traffic metrics. For example, as of the end of 2014 Google reported that five percent of all traffic coming into its servers worldwide used IPv6, double the amount registered a year earlier.

While there’s no single source for measuring IPv6 traffic as a percentage of all Internet traffic, Akamai has become a widely cited barometer for IPv6 traffic growth through its compilation of stats showing the percentage of IPv6 traffic pinging its CDN servers over dual-stack networks. In countries like the U.S. where dual-stack networks are widely deployed, this traffic represents a significant share of overall traffic. By this measure of IPv6 traffic, Belgium has an outsized lead at 34.8 percent, with Switzerland second at 18.9 percent, the U.S. third at 18.6 percent, Peru fourth at 17.3 percent and Germany fifth at 16.9 percent.

As measured by the percentage of IPv6 addresses in any one operator’s dual-stack traffic, the global leader is Verizon Wireless with 70.27 percent; T-Mobile ranks second at 57.55 percent; AT&T third at 52.10 percent, and Comcast fourth at 39.24 percent. In terms of the raw volume of IPv6-addressed requests traveling over dual-stack networks, Comcast ranks as the lead network operator worldwide, Akamai reports.

The dominance of wireless carriers in these stats is in part a reflection of the fact that all smartphones now ship with IPv6 capabilities, which are used by default in networks that support IPv6. IPv6 is also the default address system used with most tablets and all the latest generation computer operating systems, including Windows, Apple, Linux and Android.
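One quick way to see that default preference at work on a dual-stack host is to resolve a dual-stacked hostname and note the order of the returned addresses; standard address-selection rules (RFC 6724) normally list IPv6 results ahead of IPv4 when working IPv6 connectivity is present. The short Python sketch below is a generic check, with the hostname chosen purely as an example.

    import socket

    # Resolve a dual-stacked hostname. On a host with working IPv6, the
    # resolver's address-selection rules normally list AAAA (IPv6) results
    # ahead of A (IPv4) results, so applications use IPv6 by default.
    for family, _, _, _, sockaddr in socket.getaddrinfo(
            "www.google.com", 443, proto=socket.IPPROTO_TCP):
        label = "IPv6" if family == socket.AF_INET6 else "IPv4"
        print(label, sockaddr[0])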

The volume of IPv6 traffic will continue surging, notes Erik Nygren, a chief systems architect at Akamai. “Recently we’ve seen some networks rapidly roll out IPv6 to significant portions of their customer base over the course of weeks or months,” Nygren says in a recent blog. “For example, recent roll-outs by the Saudi Telecom Company reached 10 percent over just two weeks. Significant IPv6 deployments in Brazil by NET Servicos de Comunicacao, Global Village Telecom, and Telefonica Brasil have also started in the past few months.”

Now there are signs the IPv6 traffic volume will soon be more than a matter of devices defaulting to the new address protocol in dual-stack environments, he adds. “In the past year, we’ve also started to see announcements of companies moving beyond just dual-stack, with IPv6-only solutions being used to solve real-world problems by companies such as Facebook, Comcast, and T-Mobile US,” he says.

For service providers, a key step in this direction entails providing new customers with IPv6-only residential and commercial gateways, eliminating the need to keep handing out IPv4 addresses. Additional, newer devices that sit behind those gateways in the home can then be given IPv6 addresses rather than continuing to rely on NAT-based private addressing. In Comcast’s case, this is the strategy the MSO is following with its Xfinity X1 and Xfinity Voice platforms.

Initially, such strategies will rely on the dual-stack infrastructure, where both IPv6 and IPv4 are natively routed through the network. But this keeps IPv4 support alive in the routing infrastructure, along with Large-Scale NAT (LSN) solutions that handle a great deal of processing that wouldn’t be necessary if the network were operating in native IPv6 mode.

For example, the LSN known as NAT444 was devised to assign carrier-level private addresses to large clusters of end users and map them onto shared network-based public addresses, thereby extending the usefulness of public IPv4 addresses as the supply runs out. Another role for LSN is to provide address translation where one IP version is tunneled over the other in the access network, as in the case of Comcast’s Dual-Stack Lite, which allows IPv4 traffic from non-IPv6 devices in the home to be tunneled over IPv6 wherever IPv6 gateways are in operation.
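A rough, purely conceptual illustration of the NAT444 idea: a carrier-grade NAT maps many subscribers, each already behind a home NAT in the carrier-shared address space (RFC 6598), onto a small pool of public IPv4 addresses by giving each subscriber a slice of the port range on a shared public address. The Python sketch below uses documentation addresses and is not modeled on any operator’s implementation.

    import ipaddress

    # Conceptual NAT444 sketch: each subscriber gets a block of source ports
    # on a shared public IPv4 address drawn from a (here, one-address) pool.
    PUBLIC_POOL = [ipaddress.ip_address("203.0.113.1")]  # documentation prefix
    PORTS_PER_SUBSCRIBER = 2000

    def cgn_binding(subscriber_index):
        """Return (public address, port range) assigned to the Nth subscriber."""
        per_addr = 60000 // PORTS_PER_SUBSCRIBER        # subscribers per public IP
        addr = PUBLIC_POOL[subscriber_index // per_addr]
        first = 1024 + (subscriber_index % per_addr) * PORTS_PER_SUBSCRIBER
        return addr, range(first, first + PORTS_PER_SUBSCRIBER)

    addr, ports = cgn_binding(7)
    print(addr, ports.start, ports.stop - 1)  # 203.0.113.1 15024 17023

That port-sharing arithmetic is exactly what gives rise to the application breakage and subscriber-identification problems described below.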

As Big Switch’s Doyle notes in a recent blog, “LSNs come with a long list of difficulties.” For example, Doyle says, “Many applications that have been adapted to work through a single (customer-based) NAT44 will break when crossing a double-NAT architecture like NAT444.”

In addition, with multiple subscribers sharing a public IPv4 address, there is likely to be an impact on any systems, external as well as internal, that assume an IPv4 address uniquely identifies an Internet subscriber. Externally, these would include e-mail spam monitoring systems and certain law enforcement systems. Internally, some provisioning systems operate in this fashion to support automated subscriber activation, including self-service.

The list of LSN issues goes on. “The stateful address mapping and heavy logging requirements in an LSN may present a performance bottleneck,” Doyle says. Moreover, “LSNs can be a single point of failure, and are an attractive target for CPU depletion or address pool depletion attacks.” Not mentioned by Doyle but perhaps most troublesome of all, the double NAT architecture fragments the network, which complicates troubleshooting and repairs.

With or without LSNs, allowing both protocols to operate natively on the network creates a great burden on the operator’s routing infrastructure, as noted by Comcast fellow Brian Field in a white paper presented at this year’s INTX cable conference in Chicago. The best solution, Field suggests, is to develop what he calls a “lean” IPv6 infrastructure where the lion’s share of traffic is carried natively in IPv6.

“Within Comcast, dual-stack represents an important step to the deployment, maturation and growth of IPv6,” Field says. “Nonetheless, we are looking at ways for our next generation infrastructure to natively support only IPv6.”

This would greatly reduce the size of the Forward Information Base (FIB), i.e., the number of routes represented by IPv4 and v6 prefixes, “because the network will only need to carry IPv6 routes,” he says. “This translates into much less high-speed memory required on routing line cards, which reduces the cost to build them and the energy they consume.” Field also notes that eliminating IPv4-specific features in the router configurations not only would reduce configuration and operational overhead but also would yield “more efficient code, fewer bugs and fewer unintended operational interactions with our equipment vendors.”

But the IPv4 traffic must still be supported, so the challenge becomes designing a means by which IPv4 traffic can be incrementally moved into an overlay on the IPv6 network over time, gradually reducing the IPv4 FIB burden in the network. As it turns out, Field says, only about 4.5 percent, or roughly 25,000 of the 575,000 routes contained in the IPv4 FIB used with Comcast services, carry 99 percent of the traffic.
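The arithmetic behind that observation is easy to check against the figures Field cites; the short Python calculation below simply restates them.

    # Field's figures: roughly 25,000 of 575,000 IPv4 FIB routes carry 99% of traffic.
    hot_routes, total_routes = 25_000, 575_000
    print(f"{hot_routes / total_routes:.1%} of routes")   # ~4.3%, in line with the roughly 4.5% cited
    print(f"{total_routes - hot_routes:,} routes are candidates for the IPv4 overlay")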

This leaves open the possibility of starting small, setting up a lean IPv6 network where just a small share of the overall traffic would be carried in the IPv4 overlay on the IPv6-only network. This overlay would be supported in an “IPv4 as a service (IPv4aaS)” mode utilizing the SDN (software defined networking) capabilities of OpenStack technology.

Comcast is looking at enabling this architecture utilizing an enhanced implementation of a relatively recent Internet Engineering Task Force (IETF) protocol known as the Locator/ID Separation Protocol (LISP). As the name implies, this routing architecture separates the two roles an IPv4 (or IPv6) address plays, device identity and network location, into two different numbering spaces. The device identity, known as the Endpoint Identifier (EID), and its location, known as the Routing Locator (RLOC), are administered through communications among the premises gateway, the routing destination and a DDT (Delegated Database Tree) server, which utilizes a query-based mechanism analogous to a DNS (Domain Name System) server.
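A toy sketch of the mapping idea at the heart of LISP may help: the mapping system resolves a device’s endpoint identifier to the routing locator of the site currently serving it, much as DNS resolves a name to an address. The Python dictionary below is purely conceptual; a real deployment follows the IETF specification’s map-request/map-reply signaling, and the addresses are documentation prefixes.

    import ipaddress

    # Conceptual LISP mapping sketch: EID prefixes resolve to one or more RLOCs.
    MAPPING_SYSTEM = {
        "2001:db8:100::/48": ["198.51.100.1", "198.51.100.2"],
        "2001:db8:200::/48": ["203.0.113.9"],
    }

    def lookup_rlocs(eid):
        """Return candidate RLOCs for an EID; the longest matching prefix wins."""
        eid = ipaddress.ip_address(eid)
        best = None
        for prefix, rlocs in MAPPING_SYSTEM.items():
            net = ipaddress.ip_network(prefix)
            if eid in net and (best is None or net.prefixlen > best[0].prefixlen):
                best = (net, rlocs)
        return best[1] if best else []

    print(lookup_rlocs("2001:db8:100::42"))  # ['198.51.100.1', '198.51.100.2']

The point of the split is that a device keeps its EID as it moves or as the topology changes, while only the locator side of the mapping is updated.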

This is all in the early experimental stages, with a lot of moving parts to be adjusted and other ideas in play as well. But, ultimately, the significance of these activities is that they point to an accelerated attempt to open a path to using the native IPv6 network as a foundation for more advanced IP services. “The goal of this network architecture is to drive towards an IPv6-only feature set; a small FIB is not a specific optimization,” Field notes.

As network operators of every description, especially at the Tier 1 level, move in this direction, they’re leaning on suppliers to get on board by ensuring that every software and hardware solution is IPv6 compatible. All Tier 1 U.S. cable operators, for example, have now mandated IPv6 compatibility across the application and services ecosystem, which requires significant heavy lifting by vendors with testing, debugging, scaling and stabilizing of every implementation.

One goal driving this activity is network function virtualization (NFV) utilizing software-defined networking (SDN) technology. “IPv6 is a key building block for making SDN applicable across the network,” Doyle says. “Dual-stack with SDN is hard to manage.”

Standardized means of simplifying configuration of home routers and other devices in a mixed IPv4/v6 environment are contributing to preparations for the introduction of IPv6-based services. The IETF has a draft proposal for what it calls “Homenet,” a routing protocol for extending IPv6 in the home. Further along in development is CableLabs’ HIPnet (Home IP Networking) protocol, which has been the subject of interoperability events over the past year and is now supported by various off-the-shelf premises routing devices.

As described by Michael Kloberdans, lead architect for home networking at CableLabs, HIPnet provides a plug-and-play means of connecting routers to networks and to each other in a mixed v4 and v6 environment. By supporting prefix sub-delegation that puts IPv4 devices under the control of IPv6, HIPnet allows the operator to move from the chaotic, unmanageable environment of today’s home device ecosystem to an environment favorable to service innovation, Kloberdans says. “Mapping can only happen if a device can be identified,” he notes.
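Prefix delegation, the mechanism HIPnet builds on, is easy to picture: a prefix delegated to the home gateway is carved into per-link /64 subnets for downstream routers and device subnets. The short Python sketch below illustrates the general mechanism with a documentation prefix; it does not reproduce HIPnet’s specific sub-delegation rules.

    import ipaddress

    # A service provider delegates a /56 to the home gateway (documentation prefix).
    delegated = ipaddress.ip_network("2001:db8:abcd:7200::/56")

    # The gateway carves the delegation into /64 subnets, one per downstream
    # link or subordinate router; HIPnet automates how these are handed out.
    subnets = list(delegated.subnets(new_prefix=64))
    print(len(subnets))   # 256 possible /64 subnets
    print(subnets[0])     # 2001:db8:abcd:7200::/64 for the first link
    print(subnets[1])     # 2001:db8:abcd:7201::/64 for the next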

Equally significant, HIPnet is a linchpin in providing operators a foundation for virtualization of CPE, allowing routing, firewall, parental controls and other functions hosted in the cloud to be combined with direct per-device service provisioning, orchestration of home-based subnets and other back-office functions to transform the convenience, flexibility and degree of personalization of services.

“Only virtualization can enable the speed of new services and time to market that’s needed,” Kloberdans stresses. “The potential synergies between these two technologies (IPv6 and virtualization) are huge.”


RDK Moves to Broadband Gateways Amid Surge in Adoption & Innovation

Jeff Huppertz, VP, marketing & business development, Espial

Espial, Metrological Contribute Market-Moving Technology

By Fred Dawson

June 15, 2015 – With growing momentum, the Reference Design Kit initiative spawned three years ago by Comcast is shaping MVPD strategies across the globe and, in the process, building a demand pool that is influencing device designs beyond the initially targeted set-top domain.

While uptake in North America has been slow to expand beyond Comcast and Time Warner Cable, Comcast’s initial partner in the RDK Management, LLC joint venture, there are now other North American players in the pool. Sources report Rogers Communications, an RDK licensee, is building its next-generation IP pay TV service utilizing RDK-compliant middleware, and CenturyLink has become an RDK licensee, marking the first move in that direction by a U.S. telco.

Meanwhile, the pace of adoption elsewhere has been surprisingly strong. “RDK is being adopted more aggressively outside the U.S.,” comments Jeff Huppertz, vice president of marketing and business development at Canadian middleware supplier Espial. “MSOs in Europe are more aggressive, and there’s tremendous interest among European telcos as well.”

The platform, acting as a universal SoC adaptor across customer-premises equipment from various suppliers, provides a common method to manage complex video functions such as tuning, conditional access, third party DRM and stream management. The business model, offering licensees access to RDK as a shared source, has been a winning strategy in a field where earlier attempts at enabling a common platform for next-gen services faltered.

“The RDK provides a modern software platform on which pay TV providers worldwide can provide new video services to customers,” says Steve Heeb, president and general manager of RDK Management, which now includes Liberty Global as a venture partner. “The community is starting to take advantage of the benefits of having source code access for their STB software, and the RDK continues to gain traction around the globe.”

Indeed, there are now more than 220 total licensees of the RDK software stack across CE manufacturers, SoC vendors, software developers, system integrators and MVPDs, according to the latest figures released by RDK Management. The MVPD count has gone up 60 percent over the past year to 25, with many more taking advantage of the technology by working with RDK-licensed set-top vendors, middleware system suppliers and systems integrators to launch new services.

The trend line marks a strong refutation of RDK skeptics who until recently doubted the platform would take hold in a meaningful way beyond the JV partners. This is great news for companies like Espial that early on put a lot of resources behind developing for RDK.

Espial Successes

“Three years ago we took a strategic bet to focus on RDK as one of the first licensees,” Huppertz says. “It has paid off for us in a big way.” He cites recent wins with two European MVPDs and a North American MVPD, representing a combined subscriber base of over five million, for Espial’s RDK-compliant fourth-generation platform. The solution set includes the recently announced G4 STB Client, the set-top-specific extension of the G4 User Experience framework, which Espial announced in 2013 as “a complete out-of-the-box HTML5 user experience for TV, tablets and smartphones.”

As shown in recent demos, the G4 client software enables the cloud-based system to parse out immense amounts of metadata and to render UI windows and apps at lightning speeds with the option to present them as overlays or alternative screen views with whatever programming is being watched. For example, a Blue Jays baseball app created as a demo for Rogers allows viewers to watch game highlights and buy tickets in conjunction with bringing up a stadium graphic showing seat availabilities.

The HTML5-based G4 operates much faster than RDK applications written in the commonly used C coding language and its variants, Huppertz notes. But G4 isn’t entirely dependent on HTML5 execution in the cloud, insofar as it uses another component available in the RDK stack, the open-source GStreamer media framework, to leverage the processing power of the set-top.
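For context on the GStreamer piece, the sketch below shows a minimal playback pipeline using the standard GStreamer Python bindings; it is a generic illustration of the open-source framework rather than Espial’s G4 code, and the stream URI is a placeholder.

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # playbin assembles a demux/decode/render pipeline and, where the platform
    # supports it, hands video off to hardware-accelerated elements on the device.
    pipeline = Gst.parse_launch("playbin uri=https://example.com/channel/index.m3u8")
    pipeline.set_state(Gst.State.PLAYING)

    # Block until the stream ends or errors out, then tear the pipeline down.
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.ERROR | Gst.MessageType.EOS)
    pipeline.set_state(Gst.State.NULL)

It is this kind of device-side pipeline, fed and orchestrated from the cloud, that lets the client tip rendering work toward the set-top when the hardware can handle it.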

“It’s a balancing game between the set-top and the cloud,” he says. “It’s fast with just the cloud, but with rendering on the set-top it’s almost instantaneous.”

The difference is especially evident in high-volume viewing situations like the recent Floyd Mayweather, Jr.-Manny Pacquiao fight, he notes. “You can have latency and scalability issues where you just can’t render for everybody at the same time if you’re doing it from the cloud,” he says.

“Based on the processing capabilities, the media player determines how much metadata and other content to package on the set-top,” adds Kirk Edwardson, director of marketing at Espial.

Along with enabling the best possible rendering performance, the platform can leverage the set-top hardware to limit the impact of app glitches. “If something happens with an app, we can close it off without affecting the whole user experience,” Edwardson says.

The RDK App Development Framework

Another supplier benefitting from a bet on RDK is Metrological, a Dutch company specializing in publishing platforms for TV and multiscreen app stores, which earlier this year announced it was working with RDK Management to provide a framework enabling app developers to remotely develop and test apps on top of the RDK. “This gives us a unique spot with RDK,” says Thijs Bijleveld, senior vice president for sales and marketing at Metrological. “Many people supply device and software frameworks aligned with RDK, but we’re the only ones in the app space.”

“Supporting app developers to create apps on top of the RDK is a very important part of the RDK mission,” Heeb says. “Thanks to Metrological, the RDK can attract an even larger base of app developers that can provide apps to operators.”

Metrological, which was founded in 2005 to provide remote performance monitoring for M2M, airport and subway environments, moved into video when it saw the video industry had the same need: the ability to run software on a unified platform without having to develop apps separately for every type of device, Bijleveld says. “Our app framework employs an abstraction layer that helps deal with all the complex things, like determining if a device relies on touchscreen or keyboard prompts, that would be hard to do on a native device development basis,” he explains.

Once Metrological moved into the video space, it started working with Liberty Global to provide a single framework to deliver apps across a footprint spanning 17 countries. With that project accomplished, the company has expanded beyond Europe with offices in the U.S. and Brazil.

In the new initiative Metrological uses the RDK and RDK Emulator to allow developers to prototype, develop and test RDK applications remotely on a laptop without using a physical RDK device or set-top box. “For some time we’ve offered a transparent ecosystem that allows developers to create apps that can be launched across multiple networks and countries,” Bijleveld says. “Now that’s possible for developers working in the RDK environment as well.”

The paradigm is a boon to developers and MVPDs alike, he says. “I was talking to a big content owner looking for RDK operator reach in Europe,” he notes. “They needed to develop their app for 20 different networks on five screens. Using our service they can develop once and reach 20 million subscribers on RDK systems.”

Using the Metrological SDK (software development kit) developers can create apps for multiple environments, including DVB, OpenCable and IP as well as RDK, Bijleveld continues. “If a customer wants to build for RDK, great,” he adds. “If they don’t, we provide the same development architecture and solution for the other environments.”

Operators can engage with Metrological to tap into all the apps it hosts to create their own branded app stores to enhance their main screen TV content with feeds from OTT sources, Bijleveld says. “We’ve currently aggregated over 250 apps available for operators to publish,” he adds, noting the aggregation includes self-care, social, sports and game apps as well as OTT sources.

The cloud-based Metrological TV app framework, delivered as a device- and software-agnostic managed service, includes app store deployment, lifecycle management, service assurance and legal content management. “We’ve made it possible to contextually merge different sources onto a single screen,” Bijleveld says. “If you’re offering YouTube content you can automatically offer options that are relevant to what the subscriber is watching.”

Supporting Legacy Platforms

Along with taking steps to foster more app developer participation with RDK, RDK Management has expanded applicability of the platform on two other fronts – the all-IP broadband device environment and the legacy DVB (Digital Video Broadcasting) domain. According to RDK Management, in-band DVB Service Information (DVB-SI) elements are now available to the RDK community through an advancement made possible with significant contributions from Arris Group.

In-band DVB-SI provides a high degree of data reliability for TV service providers that operate one-way networks or that are in the process of transitioning to two-way networks. Specifically, the administrators say, Arris’s new contributions extend the RDK Media Framework (RMF) to support a common method for extracting DVB streams and also provide an open-source reference DVB implementation that can be adopted and extended by DVB-based operators and suppliers. This new RDK capability adds to previously supported DVB components, such as teletext and subtitles, released last year.

Espial, too, has taken steps to bring its RDK-optimized middleware into the legacy set-top domain by working with operators to tailor use of what Jeff Huppertz describes as a “toned-down” version of the G4 STB Client with OCAP (OpenCable Application Platform) set-tops. “We can download our client software into OCAP set-tops, which gives operators an opportunity to deliver high-graphics user interfaces consistently across legacy as well as new RDK boxes.”

Without HTML5 capabilities, the rendering from Flash memories on older boxes is slower and requires a good deal of customization work to fit specific OCAP set-top models, he adds. “Maybe you can’t cache as much as we do with G4, which means you can only have so many VOD posters and days of live programming on display at any one time,” he says. “Virtually every major operator is talking to us about this.”

Moving RDK into Broadband

Where broadband devices are concerned, RDK Management last year launched the RDK-B initiative, aimed at giving cable modems and broadband gateways a common, standardized software baseline structured along the lines of the RDK. At CES in January, Broadcom made known that it has incorporated support for RDK-B into its DOCSIS 3.0 and 3.1 silicon.

According to Comcast vice president of hardware design Fraser Stirling, who appeared at a press conference during the recent INTX show in Chicago, the MSO is now launching RDK-B trials with plans to begin a second wave later this year utilizing the MSO’s new Gigabit Home Gateways. The initial trials are being conducted in collaboration with Arris using that vendor’s Touchstone TG1682 DOCSIS 3.0 voice gateways.

Samsung, which has been using Espial’s G4 User Experience Framework with its RDK set-tops, is another early RDK-B adopter, as demonstrated at INTX with a display of its new DOCSIS 3.1 gateway. Along with running RDK-B, the device supports MoCA 2.0 and dual-band 4×4 Wi-Fi connectivity. “As more and more devices connect wirelessly to the home network and MSOs move to IP delivery of content, the data gateway in the home becomes the critical CPE component for delivering these services,” says Randy Westrick, director of STB product marketing at Samsung Electronics America.
