Service Providers Archive


Cable Transformations Illuminate Upside to Rural Broadband Story

Diane Quennoz, SVP, marketing & customer experience


Vyve Shows What Can Be Done with Prudent Investments in Small Systems

By Fred Dawson

August 10, 2017 – As government officials argue over how to deal with the sad state of broadband coverage in rural America, they would do well to consider what the experiences of some aggressive Tier 2 cable operators say about what’s doable with the resources at hand.

Of course, not every community and certainly not every farm has access to a cable network, but, too often, those that do are served by antiquated plant run by small operators whose stewardship leaves the wrong impression about the value of those facilities. Getting them up to speed will cost money, but the costs are not so great as to foreclose the likelihood of a reasonable return on investment.

Confidence in that supposition is reflected in the hundreds of millions of dollars flowing into small market cable acquisitions and plant upgrades over the past few years, notwithstanding an industry-wide pay TV margin squeeze that has hit small operators much harder than their larger brethren. There’s been plenty of time to determine whether the earliest expansion strategies pegged to high-speed broadband were a good idea. That the investments keep coming suggests the case has been made.

For example, the pacesetters in implementing 1 Gig broadband service across large multi-state footprints have been Tier 2 MSOs like Cable One, now reaching 70 percent of 1.7 million homes passed with its GigaONE service, and Midco, which reached the 50 percent mark at midyear with its Xstream Gig service on its way to 100 percent coverage by year’s end. That would put the 1 Gig service within reach of 600,000 households in 335 communities across North and South Dakota, Minnesota, Wisconsin and Kansas.

The sense of opportunity in smaller markets is fueling ever more announcements of data rates in the 100 Mbps-1Gbps range and ongoing buyouts of smaller MSOs by larger companies. And with the broadband expansion, most of these companies have made connectivity to businesses a key part of their strategies.

One of the more recent consolidation deals came at the start of the year with Cable One’s $735-million acquisition of NewWave Communications, which, as previously reported, was recapitalized four years ago to become the leading broadband provider for businesses as well as consumers in its markets. With 1 Gig available across much of its 440,000 household footprint, NewWave has built a commercial service business from scratch that now accounts for 13.6 percent of company revenues.

Another even smaller Tier 2 operator demonstrating the success of a broadband-focused strategy is Vyve Broadband, which five years ago began transforming old cable systems into the kinds of high-capacity multi-service operations small towns and villages everywhere are looking for. Today, with HFC networks passing about 315,000 households in nine states, Vyve offers 200 Mbps access in three quarters of its franchises and has reached 1 Gbps in two of them: Shawnee and Ketchum, OK.

With cable systems in Texas, Arkansas, Kansas, Louisiana, Tennessee, Georgia, Colorado and Wyoming as well as Oklahoma, Vyve has put a good deal of capital into interconnecting its markets via fiber rings to facilitate headend consolidation and other efficiencies essential to profitably operating over such an expanse. Last year, the company completed buildout of a 400-mile fiber ring around central and eastern Oklahoma.

The 48-fiber ring feeds seven main city hubs, which provide direct service to over 40 municipalities across the state. It connects with a larger network tied to Tulsa, Oklahoma City, Dallas and Atlanta, and also feeds signals into links connecting to local systems in rural areas of these and the other states.

Vyve is offering a triple-play service with digital TV and voice in all its markets. It also has a unit dedicated to delivering business services, which include optical Ethernet, PRI (Primary Rate Interface) and hosted voice.

The emphasis on broadband hasn’t diverted the company from developing a 140+-channel HD service that would be competitive in any market. Indeed, the company is considering new service elements that would take it a step beyond what’s typically on offer from operators in much larger markets.

“Vyve is always on the cutting front,” says Diane Quennoz, senior vice president of marketing and customer experience at the Rye Brook, NY-based company. For example, she notes, Vyve is just now rolling out the first iteration of a hybrid video service that offers Netflix alongside the MSO’s linear and VOD lineup, with unified navigation through the TiVo UI running on Evolution Digital’s eBOX. The device combines QAM-delivered traditional linear TV with IP-delivered VOD and OTT on the HDMI 1 input.

“We’re really excited about bringing OTT subscriptions into the linear lineup,” Quennoz says. “People want live and local TV but many want to get their movies through OTT. Now they can have it all through one input.”

Along with universal navigation, support for recommendations and other features, the TiVo UI offers viewers the option to choose between a grid-style traditional interface and the more graphically rich personalized format taking hold throughout the industry. “With TiVo we’ve always had a very user-friendly UI,” she adds. “As we’ve evolved into whole-home DVR and ultimately the eBOX we’ve been able to maintain the same look and feel while benefitting from the things TiVo has done to make navigation easier.”

Vyve pioneered use of Evolution’s DTA with the TiVo UI three years ago as a way to deliver a feature-rich service to legacy analog as well as HD TV sets. The eBOX IP hybrid STB builds on the DTA capabilities to enable operators to cap QAM-based delivery in a smooth migration to all-IP video. The terminal has caught on among smaller operators, many of whom are deploying it through an affiliation Evolution established with the National Cable Television Cooperative.

Next up may be cloud DVR. Vyve is looking at options but hasn’t made any decisions, Quennoz says.

Where the OTT tie-in leads is anybody’s guess. Quennoz acknowledges skinny bundling of live TV channels by OTT providers has had an impact. “We’ve seen our share of decline in video market for sure,” she acknowledges.

For now Vyve sees Netflix as “the go-to service” for ensuring the company is keeping pace with what customers want. “But let’s see what takes off and what makes sense for consumers,” she says, noting the company may look at options that could include new combinations of OTT and smaller bundles as the market evolves.

What matters is keeping customers engaged through high-speed broadband with value-added services they can’t get in pure a la carte OTT mode. Along with the unified navigational advantage of providing live and local programming with OTT on-demand content, Vyve sees opportunities tied to robust whole-home Wi-Fi service and smart-home applications.

“We have to own the Wi-Fi and the Internet in the home,” Quennoz says. “We’re working through various solutions that go beyond using our DOCSIS 3.0 modems with extenders.”

The search for a whole-home Wi-Fi solution operators can offer as a better experience over traditional modes is accelerating across the industry as the number of wireless devices used to access video and other content from any point in the home or business multiplies. This is another area where Midco has played a leading role, having been the first MSO in North America to deploy the mesh Wi-Fi solution offered by AirTies to support a whole-home wireless service, which it offers at $7.95 per month.

“We’re looking at a couple of providers in the market, conducting tests to see what’s working and what’s affordable,” Quennoz says. “The question is how you calculate the value for customers and sell it. Is it part of our broadband offering or a different product set? But we definitely want to be part of that home Wi-Fi opportunity.”

Where smart-home services are concerned, “we’re looking to see what makes sense for our customers,” she says. “We’re testing stuff every day and trying to determine whether this is something we should own or provide access to. A challenge with a lot of these solutions is the hardware is quite expensive.”

It’s still unclear how far consumers in rural markets will go in embracing the smart-home concept and the many applications associated with the Internet of Things, she adds. Vyve is offering home security in one of its markets, but “I wouldn’t say we’re on the forefront of any solutions,” she says. “But we want to determine where our customers are heading and get there before they need it.”

That’s a pretty good way of summing up how Vyve and the other cable players who see opportunity in the smaller markets are approaching the business. So far, they’re demonstrating it’s a winning strategy.


HDR Tech Bottleneck Slows but Can’t Stop 4K Transition

Steven Corda, VP, business development, SES


Complications Abound, but SES Is Demonstrating They Aren’t Insurmountable

By Fred Dawson

July 3, 2017 – The wait for pervasive availability of 4K UHD TV services may seem interminable as technical complications and a dearth of content continue to impede progress, but there’s every reason to believe the dam will finally break in 2018.

Right now the view from the technical trenches is mixed at best, given the added challenges imposed by HDR (High Dynamic Range) technology, which is widely viewed as essential to creating a viewing experience that significantly differentiates UHD from HD. No one knows how that differentiation will impact revenue streams, but MVPDs, traditional and virtual alike, as well as content producers appear willing to invest heavily to find out.

“4K for us will always go with HDR,” says Joshua Seiden, executive director of Comcast Innovation Labs. The decision to go that route has significantly altered the MSO’s plans, which initially envisioned introduction of a 4K UHD set-top box (STB) supporting UHD services for possible rollout in 2016, followed by an HDR-capable STB later that year to enable delivery of HDR-enhanced content.

Joshua Seiden, executive director, Comcast Innovation Labs


While Comcast hasn’t announced the timing for UHD service introduction, beyond the already supported on-demand 4K sampler service offered to owners of certain Samsung and LG TV sets, the company has set its sights on making HDR based on the HDR10 standard available in time for the 2018 Winter Olympics, scheduled for February 9-25 in Pyeongchang, South Korea. “That’s what we’re targeting,” Seiden says.

The Layer3 TV Agenda

Meanwhile, the pace of MVPDs’ commercial introductions of 4K UHD services, with and without HDR, is quickening across North America. Denver-based startup Layer3 TV, for example, is providing STBs supporting HDR-enhanced 4K to all the subscribers it signs up in currently served markets, which include Los Angeles, Chicago, Washington D.C., Dallas/Ft. Worth and Denver, with New York City and environs slated for launch in the near future.

Layer3’s allHD service, delivered over subscribers’ broadband connections, offers 250 HD channels typically priced at about $85 per month. In Washington the company also offers a fiber-to-the-home option by reselling 100 Mbps full duplex Internet service running on Verizon’s network at a standalone price of $69 or at $125 when bundled with the allHD service.

So far, Layer3 has offered only a limited amount of 4K content in VOD mode. Going a step further into live event coverage, on June 24 the company joined a handful of other MVPDs in tapping the recently launched North American 4K UHD satellite feed from SES to offer iN DEMAND’s pay-per-view production of the Bellator NYC: Sonnen vs. Silva mixed martial arts event at a slight premium over the HD feed.

There’s much more in store once more content becomes available, says David Rapson, senior director of content partnerships at Layer3. “We see an opportunity to take advantage of being first with 4K/HDR in our markets,” Rapson says. “We’d like to have four or five live UHD channels running 24/7 along with VOD content as soon as possible.”

The opportunity is closer at hand than most people realize, he adds. When it comes to 4K content development “there’s a lot being discussed that’s not out publicly,” he says. “Movies will be a good opportunity along with sports and nature programs. And there’s a lot of international content coming, too.”

SES Orchestrates a Head Start for MVPDs

Steven Corda, vice president of business development at SES, agrees. “The pace of channels becoming available for our 4K service is exceeding our expectations,” Corda says. “Some new ones are imminent.”

One factor in the quickening pace is the fact that SES has built a distribution system designed to facilitate implementation by terrestrial MVPDs, he adds. “Every piece of the value chain is resolved,” he says, noting this includes a growing catalog of STBs for telco IPTV and cable operators. “If we hadn’t created an end-to-end solution, things wouldn’t be going this fast.”

SES uses HEVC (High Efficiency Video Coding) to compress the channels for delivery over terrestrial networks at 18 Mbps, performs encryption and other format processing and supplies the local headend reception equipment as part of the package. Corda says the bitrate is likely to fall as HEVC matures.

But what the bitrate may be for MVPDs once live sports and other programming comes into play with HDR remains to be seen. Comcast’s Seiden says that, right now, delivering HDR-enhanced 4K sports content at quality levels meeting Comcast’s requirements requires throughput in the range of 30-35 Mbps.
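The bitrates quoted in these two paragraphs translate directly into channel-capacity arithmetic for operators weighing UHD carriage on legacy cable plant. A minimal sketch, assuming the commonly cited figure of roughly 38.8 Mbps of usable payload per 256-QAM carrier in a 6 MHz slot (an assumption, not a figure from the article; actual payload varies with FEC overhead):

```python
import math

# Rough capacity math for UHD carriage on legacy cable plant (a sketch;
# the ~38.8 Mbps usable payload per 256-QAM carrier in a 6 MHz slot is
# an assumed, commonly cited figure).
QAM256_PAYLOAD_MBPS = 38.8

def carriers_needed(stream_mbps: float, streams: int) -> int:
    """Number of 6 MHz QAM carriers needed to carry the given UHD streams."""
    return math.ceil(streams * stream_mbps / QAM256_PAYLOAD_MBPS)

# SES's 18 Mbps HEVC-compressed channels: two fit in one carrier.
print(carriers_needed(18, 2))   # 1
# Comcast's 30-35 Mbps HDR sports figure: one stream nearly fills a carrier.
print(carriers_needed(35, 1))   # 1
print(carriers_needed(35, 4))   # 4
```

On this math, the gap between 18 Mbps and 35 Mbps is the difference between carrying two UHD channels per QAM carrier and dedicating a carrier to each one.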

Clearly, though, the SES UHD service, currently delivering ten 4K UHD channels from three satellites covering the U.S., represents a good starting point for MVPDs who want to get their feet wet. As previously reported, SES is offering a similar service in Europe, which it launched ahead of the U.S. service, and now it’s operating UHD in Latin America as well. In all, the company has 22 UHD channels in operation globally, representing about 43 percent of the available UHD channel count, Corda says.

In the U.S. the service is undergoing testing by about 25 MVPDs with a combined audience approaching 10 million, he adds. Verizon, for example, is collaborating with SES in conjunction with evaluation of the platform as a way to integrate scalable and dedicated satellite bandwidth into its Ultra HD launch plans. “This marks an important milestone in the development of our Ultra HD solution,” Corda notes.

Other MVPDs publicly named as trial partners include Frontier Communications and several cable operators, including Aureon in Iowa, GVTC Communications in Texas, Highlands Cable Group in North Carolina, KPU Telecommunications in Alaska, Service Electric in Pennsylvania and New Jersey, and Shrewsbury Community Cable in Massachusetts. In addition, two MVPDs, Highlands Cable Group, a small cable operator in Highlands, NC, and Marquette-Adams, an independent telco offering IPTV service in Oxford, WI, are using the SES channels to support commercial 4K UHD services.

SES does not negotiate the licensing rights on the nine 4K UHD channels it offers from third parties. In the case of the two commercial MVPD launches, the licensing was mediated by Vivicast Media, a content licensor serving MVPDs worldwide. The role played by Vivicast reflects the extent to which SES has gone to help Tier 3 MVPDs get off the ground with 4K UHD services, Corda says. He also points to the assistance SES provided to Marquette-Adams in testing the Amino 4K STB it chose for the service as another example of the hands-on approach.

These launches reflect the importance of a turnkey 4K UHD service to the fortunes of smaller operators who don’t want to be caught, as they were with HD, at a disadvantage against DBS competitors, Corda notes. But, he adds, SES sees an opportunity for the service extending into the higher MVPD tiers as well. “There are over 900 MVPDs in North America, and we have relationships with all of them,” he says.

The ten 4K UHD channels currently on offer from SES are provided on an a la carte basis. One of the channels is comprised of content aggregated by SES, such as the iN DEMAND PPV event. The others, most of which are not part of traditional pay TV lineups, include Fashion One 4K, Travelxp 4K, 4KUNIVERSE, NASA TV UHD, INSIGHT TV, UHD1, C4K360, Funbox 4K and Nature Relaxation 4K.

Some are better known in the OTT space, such as Insight and Fashion One, two English-language channels out of Munich, and Travelxp, an international travel channel originating in English out of India. There are startups as well.

“A few of our channels are from content people who saw they could build channels with their own brands through affiliation with us,” Corda says. For example, 4KUNIVERSE debuted in January on one of the SES satellites offering a mix of documentaries, sports, movies and TV shows aimed at Millennials and Generation-X viewers.

The only SES Ultra HD channel delivered so far with HDR enhancement is Travelxp. Billing itself as the world’s first 4K travel channel, Travelxp is using SES to deliver hundreds of hours of travel programs from all over the world. Its HD service, with a lineup consisting entirely of originally produced travel and lifestyle programming, reaches over 50 million homes globally, the company says.

“Ours is the only commercial HDR channel available in this market,” Corda says. He expects SES will be able to add more before too long. “A number of programmers are looking at HDR,” he says.

The Technical Challenges Posed by HDR

SES has settled for its own purposes one of the more vexing issues MVPDs face with HDR, namely, choosing which transfer function to support. “We’re extremely pleased with what we’re seeing with HLG (Hybrid Log Gamma),” Corda says. “Unlike PQ (Perceptual Quantizer) it doesn’t require use of metadata, and it’s backward compatible with standard dynamic range (SDR) UHD. We looked at HDR10, but it washes out with non-HDR10 TV sets.”

In TV displays the transfer function is the algorithmic instruction set which directs how the display interprets and renders the luminance values of the original production. PQ does this by incorporating metadata into the channel stream that can be interpreted by HDR10-compatible TV sets to render brightness as captured by cameras in the original production in accord with the luminance range supported by any given display.

Dolby, the developer of PQ, offers a two-stream version supporting backward compatibility where the basic content signal is delivered in SDR and the metadata enabling HDR rendering is delivered in a separate stream, but this has not been incorporated into the SMPTE and HDR10 standards. HLG relies on tweaks in how the traditional transfer function used in broadcast TV works, avoiding use of metadata so that SDR TVs can display the picture while enabling HLG10-compatible TV sets to render with the luminance enhancements enabled by HDR.
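The difference between the two approaches is easiest to see in the math. Below is a minimal sketch of the PQ EOTF and the HLG OETF using the constants published in ITU-R Rec. BT.2100; it is illustrative only, omitting the color-volume handling and system gamma a production implementation needs:

```python
import math

# ITU-R BT.2100 transfer functions (illustrative sketch).

# PQ (SMPTE ST 2084) EOTF constants.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a PQ-coded signal value in [0,1] to absolute luminance (cd/m^2)."""
    p = signal ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)

# HLG OETF constants.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(e: float) -> float:
    """Map normalized scene linear light in [0,1] to an HLG-coded signal."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)     # square-root segment, gamma-like
    return A * math.log(12 * e - B) + C

print(round(pq_eotf(1.0)))   # 10000 -- full PQ code value is 10,000 nits
print(hlg_oetf(1 / 12))      # 0.5 -- the knee between the two HLG segments
```

Note how the lower half of the HLG curve follows a conventional gamma-style square-root law, which is why an HLG feed remains watchable on an SDR display, while PQ codes absolute luminance out to 10,000 nits and needs an HDR-aware display, guided by metadata, to render correctly.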

As previously reported, last year the ITU added HLG to its Rec. 2100 specifications for HDR, which include PQ along with the other components of the HDR domain, such as a minimum luminance range of 1,000 nits (cd/m2 or candela per square meter), the wide color gamut set by ITU’s Rec. 2020, support for 10-bit or 12-bit coding, a wide range of frame rate values, resolution specs for HD, 4K and 8K and much else. HLG is also now accommodated in specifications set for ATSC 3.0, HDMI 2.0b, HEVC and Google’s VP9 codec.

Scott Davis, chief architect, Charter Communications


Notwithstanding the ITU’s accommodation of both the PQ and HLG options with provisions for transcoding from PQ to HLG or vice versa, the industry is increasingly torn over which approach to take. Scott Davis, chief architect for Charter Communications, notes that while HLG solves the backward compatibility problem there are, as we reported last year, concerns “about chromaticity errors that some people have seen in versions of HLG.” Such issues can be dealt with, he says, but “the difficulty becomes, what does the TV support?”

Until this year’s NAB, Davis continues, “I saw a great deal of support for PQ and not so much for HLG.” But at NAB 2017 “I saw an awful lot of HLG. I think we’re back to the place of, which would you like to do? It’s going to be a bit of a negotiation between us as content distributors, the content creators and TV manufacturers over what the process needs to be. I think anybody who thinks this is completely solved is a little premature.”

As Seiden notes, Comcast has committed to HDR10. “HDR10 is the most widely deployed,” he says, in reference to HDR-capable UHD TV sets. “That’s our focus.” But, he adds without elaboration, “From the STB side other standards will be supported.”

This could mean the MSO’s new HDR/4K STBs may be able to transcode from PQ to HLG in cases where the subscriber’s TV set is not HDR10 compatible. Whether this is feasible from a cost standpoint is unclear, but it’s clearly doable based on the process prescribed in the ITU’s Rec.2100.

Right now, though, it’s very hard for an MVPD to lock onto an approach that can be relied on to satisfy consumers, meet the requirements of content providers, and measure up to its own standards of performance, Davis says. Moreover, the transfer function issue is just one of many that don’t lend themselves to easy resolution.

“Let’s start with the easy one,” he says. “What luminance value should we choose? I recall last year going across the floor looking at 400-nit TVs and thinking that’s pretty cool. Since then I’ve had opportunities on a couple different occasions to see produced content at much higher values – 1500 nits.”

Indeed, where 1,000-nit displays were the high-end models a year ago, now manufacturers are said to be preparing to introduce displays with 2,000-nit capabilities. “How do we balance this out appropriately?” Davis asks.

“How do we measure those luminances?” he continues. “How do we derive what the real value is of that TV? Additionally, what happens if somebody shoots a video at 1,000 nits and the TV is 2,000 nits? Do we allow the TV to make a change? Do we do the change in some external box? Or do we clamp at 1,000 nits? These are things we haven’t figured out yet.”
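Davis’s clamp-or-convert question can be made concrete. Here is a hedged sketch of the two simplest strategies for reconciling content mastered at one peak luminance with a display rated for another (function names are invented for illustration; real tone mapping typically applies a soft highlight roll-off rather than either extreme):

```python
def clamp_to_display(nits: float, display_peak: float) -> float:
    """Option 1: clip any highlight the panel can't reproduce."""
    return min(nits, display_peak)

def scale_to_display(nits: float, content_peak: float,
                     display_peak: float) -> float:
    """Option 2: linearly rescale the whole range, dimming midtones."""
    return nits * display_peak / content_peak

# A 1,500-nit highlight destined for a 1,000-nit panel:
print(clamp_to_display(1500, 1000))        # 1000 -- detail above peak is lost
print(scale_to_display(1500, 1500, 1000))  # 1000 -- peak preserved...
print(scale_to_display(200, 1500, 1000))   # ~133 -- ...but midtones darken
```

Neither option is free: clipping discards highlight detail, while global scaling dims the whole picture, which is exactly the trade-off that has distributors, TV makers and content creators negotiating over where the adjustment should happen.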

Another point of uncertainty is the coding bit depth used with HDR, where 10 bits is now the norm with 12 bits in the wings. Until now, the standard in digital TV has been 8-bit color coding. Does that mean MVPDs have to deliver two streams for every channel, one for HDR versions and one for SDR? Or is it best to move everything to 10-bit processing?
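The stakes of that choice are easy to quantify: 8 bits yield 256 code values per color component while 10 bits yield 1,024, and that factor of four is what keeps smooth gradients from visibly banding across HDR’s much wider luminance range. A quick illustrative sketch:

```python
def code_values(bits: int) -> int:
    """Distinct levels per color component at a given bit depth."""
    return 2 ** bits

def quantize(value: float, bits: int) -> float:
    """Round a normalized [0,1] value to the nearest code value."""
    levels = code_values(bits) - 1
    return round(value * levels) / levels

print(code_values(8), code_values(10))  # 256 1024

# Two nearby shades in a smooth gradient: 8-bit coding collapses them
# into the same code value, 10-bit coding keeps them distinct.
print(quantize(0.501, 8) == quantize(0.5, 8))    # True  -- banding risk
print(quantize(0.501, 10) == quantize(0.5, 10))  # False -- levels survive
```

The same arithmetic explains why HDR specifications mandate 10 bits as a floor: stretching 256 levels over a 1,000-nit-plus range leaves the steps between adjacent levels large enough to see.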

One way or the other, Davis notes, if MVPDs are going to rely on HEVC, also known as H.265, to compress UHD signals to reasonable bitrates, they’ll have to adopt the H.265 Main 10 profile to support 10-bit processing. “Quite a few of our existing decoders don’t understand that,” Davis says, referring to Main 10. “So how do we make sure we don’t give somebody something they can’t watch?”

The need to utilize Main 10 to support 10-bit coding for HDR has been the key delaying factor for Comcast, Seiden says. “It complicates matters,” he says, noting it has taken a while for chip makers to incorporate Main 10. “We’re taking mezzanine content directly from content providers. The whole workflow has to be developed to support 10-bit HEVC.”

Moreover, he adds, getting to the low latency required with live sports and other linear content “takes a heck of a lot of processing.” Comcast is working with vendor partners to address that problem.

The fact that standards keep changing doesn’t help in the transition to UHD. For example, SMPTE, which had incorporated what is known as static metadata in ST 2084 in conjunction with HDR10 PQ specifications, is now finalizing a new standard, ST 2094, to incorporate dynamic metadata as a means of accurately adjusting brightness levels on a scene-by-scene, frame-by-frame basis. With a strong push from Samsung, PQ with dynamic metadata is now coming into the market as the key component to what’s known as HDR10+, which Amazon says it will use with some of its UHD content later this year.

“There’s nothing wrong with new standards,” Davis says. “They just add to the breadth of options.” The problem is deciding which options to pick. “You get nervous at that point in time,” he says.

Unstoppable Momentum

But all these issues will be resolved, probably sooner rather than later. As they are, MVPDs will feel intensifying pressure to be among the first with viable HDR/UHD services.

Already OTT providers are racing ahead with HDR-infused UHD content. Netflix and Amazon have led the way so far with the addition of HDR-enhanced programming to 4K portfolios they’ve been building since 2014. Others following suit include Hulu, Vudu, Sony Ultra and UltraFlix4K.

But probably the biggest incentive to accelerating cable operators’ move into 4K UHD is the threat posed by DirecTV, especially now that it has the resources of AT&T to leverage in the anticipated expansion of its nascent 4K UHD service. DirecTV hasn’t implemented HDR yet, but it’s leading the pay TV market with three channels devoted to 4K UHD and enough satellite capacity to support dozens more as content becomes available.

The DBS operator recently shifted from offering UHD channels only at a high premium over its other services to including UHD as part of its 145-channel, $50-per-month “Select” plan. A growing component of the 4K programming is live sports, which began with the Masters golf tournament in 2016 and was repeated with two-channel coverage in 2017. The MVPD’s 4K sports coverage also includes occasional broadcasts of live MLB, NBA and Notre Dame football games.


Altice Adopts Security Strategy Suited to Big Expansion Agenda

Dexter Goei, CEO, Altice USA


MSO Says Multiscreen Solution Supports Cost-Efficient Path to UHD & IP Migration

By Fred Dawson

February 8, 2017 – Tier 1 MSO Altice USA has taken a key step toward positioning itself to be a formidable competitor across what could become a much larger fixed and possibly mobile footprint by implementing a consolidated approach to securing and managing next-generation services, including UHD.

In a departure from the norm in U.S. cable, the company has tapped Switzerland-based NAGRA to supply content protection and back-office platforms that will serve as the foundation for an aggressive expansion strategy, details of which are gradually coming into public view. The decision to tap a pay TV security supplier that has had limited penetration in the U.S. comports with the company’s intentions “to bring innovative products and services to Altice USA’s Optimum and Suddenlink customers by leveraging our global operational expertise, scale, resources and key strategic partners like NAGRA,” says Altice USA co-president and COO Hakim Boubazine.

By breaking with reliance on the traditional suppliers to those cable systems, the company is blazing a trail that could have implications for other operators looking for solutions essential to getting next-gen TV off the ground. Indeed, in some respects the move is in stride with a major shift in how North American cable operators go about procuring solutions these days.

But while openness to broader selections of solutions beyond those offered by traditional set-top and other equipment suppliers has drawn a growing number of players from abroad, security has been a tough nut to crack. With an embedded base of set-tops that rely on the conditional access systems (CAS) from the dominant CAS suppliers, ARRIS and Technicolor, successors, respectively, to the old General Instrument/Scientific-Atlanta duopoly, it’s been hard for North American operators to embrace other suppliers’ solutions.

The Altice strategy suggests this barrier may soon fall as operators make the transition to a new generation of hybrid set-tops that can support UHD 4K while advancing the migration to all-IP video. From a security standpoint, the ability to manage content protection from a single platform that can cover all the bases in an increasingly fragmented device environment has become fundamental to ensuring the robust security that’s essential to delivering licensed content to every point of subscriber connectivity.

NAGRA’s Connect security platform is designed to meet these goals, says NAGRA COO Pierre Roy, not only by offering advanced CAS/DRM and multi-DRM support, but also by facilitating operators’ ability to meet the more advanced security requirements tied to UHD content, including forensic watermarking technologies and anti-piracy and cybersecurity services. Of course, such capabilities don’t amount to much unless they can be configured to local conditions, which Roy says NAGRA has demonstrated it can do here and in other markets, giving it a leg up when it comes to achieving economies of scale.

“Being selected by Altice USA shows how we can be a global partner to large multi-network operators while adapting to their local infrastructures and requirements,” he says. “This creates economies of scale that reduce operator cost and increase operational efficiency through a single, flexible, global technology partner.”

Boubazine concurs. “We have been impressed by the flexibility NAGRA has shown in adapting to U.S.-specific requirements in a short amount of time,” he says. “This partnership will enable us to design integrated services to meet our customers’ expectations.”

Cost efficiencies, such as those enabled through service integration, are a major goal as well. Altice, now the fourth largest MSO following its acquisitions of Cablevision and Suddenlink, has partially justified its risky bet on U.S. cable by citing cost-cutting opportunities which, in the case of Cablevision, are expected to produce a $900-million savings within three to five years.

Altice isn’t saying much about its next-gen service strategy. But it’s clear the company is eager to move to a uniform approach to delivering services across all networks and user devices, as further evidenced in its choice of the NAGRA MediaLive platform.

MediaLive is designed to enable a flexible all-screen backend management approach to monetizing and delivering services, explains Christopher Schouten, senior director of product marketing at NAGRA. “It can be used in set-top-only situations, multiscreen-only configurations and as a universal IP content delivery backend for both set-tops and personal devices,” he says.

On the security side, Schouten adds, Connect allows Altice to automatically provide security appropriate to whatever device a subscriber is using to access the MSO’s service in compliance with licensing terms. “Each situation has different security and business rules, so it’s important to have one master system that applies to all of them,” he says.

The NAGRA solution can efficiently coexist within legacy U.S. cable systems while avoiding duplication of bandwidth and enabling an open choice of set-top box suppliers, Schouten notes.

“It’s impossible to dump 100 percent of the headend and set-top legacy infrastructure in one go,” he says. “You can imagine this would be for new customers and upgrades to more advanced services. The two solutions (legacy and Connect) will be able to be run in parallel.”

The converged security solution can be extended to mobile, he adds. “Because Connect supports broadcast, unicast, multicast and third-party services like Netflix, it can be used in any environment,” he says. “The multi-DRM management component helps unify the application of business rules across third-party DRMs like Fairplay, PlayReady and Widevine as well as NAGRA DRMs.”
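Schouten’s “one master system” point can be sketched in miniature. Every device name, DRM assignment and rule below is invented for illustration; this is not NAGRA’s actual API, just the general shape of a unified multi-DRM policy layer:

```python
# Hypothetical sketch of a unified multi-DRM policy layer. All names,
# mappings and rules here are invented; NOT NAGRA's actual API.

DEVICE_DRM = {
    "apple-tv": "FairPlay",
    "android-stb": "Widevine",
    "smart-tv": "PlayReady",
    "legacy-stb": "NAGRA-CAS",
}

# One master rule set applied regardless of which DRM enforces it.
UHD_POLICY = {"min_security_level": "hardware", "hdcp": "2.2", "watermark": True}

def license_request(device: str, content_tier: str) -> dict:
    """Resolve which DRM serves the device and attach the common policy."""
    drm = DEVICE_DRM.get(device)
    if drm is None:
        raise ValueError(f"unknown device: {device}")
    policy = dict(UHD_POLICY) if content_tier == "uhd" else {"hdcp": "1.4"}
    return {"drm": drm, "policy": policy}

# Same business rules, different enforcing DRM per device:
print(license_request("apple-tv", "uhd"))
print(license_request("legacy-stb", "hd"))
```

The point of the pattern is that the business rules live in one place; the per-device DRM is just the enforcement mechanism, which is roughly what a single master system managing third-party DRMs alongside an operator’s own CAS implies.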

Altice leaders have broadly hinted at expansion plans that could involve other cable acquisitions as well as a move into mobile, much as happened in France with the acquisition of SFR in 2014 and the subsequent combination of fixed and mobile operations under the SFR Group umbrella. In an interview last June with The New York Times, Altice USA CEO Dexter Goei made clear mobile was under consideration. “It is worthwhile knowing that every single one of our businesses in other markets are quad-play, both fixed and mobile broadband,” he said.

In one respect, the company has already expanded beyond the boundaries of its acquired cable systems through an investment in startup Layer3 TV, which it inherited with the Suddenlink takeover, as recently reported by Variety. Layer3, which has made known it intends to make its commercial service debut in Chicago with other, unnamed cities on tap in 2017, recently ran a trial of its 4K-ready service platform in Midland and Kingwood, Texas, apparently in cooperation with Suddenlink.

In its current operating territories Altice has committed to a massive fixed network upgrade plan with the intention to extend fiber in its HFC networks all the way to the premises across all of its Optimum (Cablevision) and most of its Suddenlink footprints over the next five years. This will enable “Generation Gigaspeed” services of up to 10 gigabits per second, the company says, noting that it “expects to reinvest efficiency savings to support the buildout without a material change in its overall capital budget.”

Underscoring the company’s belief that converged operational capabilities are key to generating efficiencies everywhere, the Holland-based parent Altice Group, now serving close to 50 million customers on four continents, has adopted what it calls the “Altice Way” as a set of principles for all its operations. These include a commitment to “developing, launching and integrating new products, services and business models, including the creation of next-generation communications access and content convergence platforms with market-leading home hubs.”

The company also said plans include forthcoming launches of Altice Studios to “create original movies and series” and the Altice Channel Factory to “create more new channels.”


On-Boarding OTT Services Just Got a Lot Easier for Pay TV Ops

Jeroen Ghijsen, CEO, Metrological

Liberty Global’s Approach to Netflix Integration May Soon Be Replicated Elsewhere

By Fred Dawson

January 26, 2017 – Pay TV providers looking to include OTT subscription services like Netflix in their programming lineups will be relieved to learn there’s an expeditious alternative to the tortuous procedures they’ve had to employ to integrate such services in the past.

Liberty Global, which last year announced it was going to feature Netflix in its programming guides, has revealed it has performed the integration in a much more straightforward and timely fashion than operators are accustomed to by utilizing technology developed by Metrological. As described by Metrological CEO Jeroen Ghijsen and VP of technology and innovation Wouter van Boesschoten, the new cloud-based process provides a means by which operators everywhere can more easily create OTT service-enhanced user experiences using existing middleware and set-top boxes (STBs).

“By leveraging our experience with browser-based application frameworks, we have standardized key components, simplifying the integration of premium OTT content,” Ghijsen says. “This results in a reduction of the required STB resources, deployment cost and time to market.”

Liberty Global has launched the Netflix app on its Horizon UI in the UK, Ireland, Switzerland and the Netherlands, to be followed in other countries throughout 2017. “Metrological’s Application Platform, which is an integral part of Horizon TV, helped us to streamline this particular Netflix deployment and expedite the time to market,” says Doron Hacmon, chief product officer at Liberty Global. “The flexibility of the platform allows us to continue to innovate by integrating new relevant services in a timely fashion.”

As Doron’s comment implies, the Metrological solution promises to make it easier for operators to continually add third-party OTT services as strategies are refined and new deals are arranged, improving their ability to turn the growing multi-subscription phenomenon to their advantage. About 22 percent of cable subscribers also subscribe to at least one OTT service, according to research conducted by Millward Brown Digital. A new study from Parks Associates finds that 31 percent of U.S. broadband households have multiple OTT service subscriptions, which is nearly one-half of the 63 percent of U.S. broadband households subscribing to at least one OTT service.

Brett Sappington, senior director of research at Parks, says the service-stacking phenomenon has become an important step in the growth of the U.S. OTT video services marketplace. As Sappington notes, a big reason for the surging importance of OTT services to pay TV and multi-OTT service subscribers is the volume of original content they provide that can’t be found anywhere else. “The regular release of high-quality original content, such as The Grand Tour (Amazon) and Gilmore Girls: A Year in the Life (Netflix), ensures the large OTT players will remain a core, consistent subscription among service-stacking households,” he says.

For pay TV providers who want to serve this demand by creating a one-stop-shopping environment for their own and others’ subscription services, having a way to bring those OTT services into the pay subscriber’s navigation window as a routine operational task will become ever more important. “If you can support this through one unified solution that integrates services in compliance with all their requirements in a standardized manner, you can save a lot of time,” Ghijsen says.

Metrological’s hybrid deployment architecture leverages an application framework that acts as a device- and software-agnostic abstraction layer streamlining the engineering and coding requirements for STBs, van Boesschoten explains. This approach also yields a smaller STB resource footprint, enabling operators to deploy premium OTT content on legacy devices, he notes, adding that much of the Liberty deployment involves use of five-year-old STBs.

As previously reported,  Metrological has become a leading supplier of solutions designed to facilitate pay TV operators’ multiscreen services strategies. Its Application Platform integrates TV and OTT experiences, providing full lifecycle support for operators’ management of branded TV app stores and OTT content via a cloud-based back-end that also provides real-time business intelligence data and marketing analytics. Operators can execute on these capabilities utilizing Metrological’s App Library, which contains over 300 apps, or they can build their own apps with an open software development kit.

The move away from reliance on apps hosted on the STB, where limited CPU resources restrain operators’ ability to respond to new opportunities, requires use of browser technology that draws on cloud resources fast enough to meet low latency requirements. There was considerable skepticism at Netflix that a cloud-based platform could execute on trick play and other functions intrinsic to its service, van Boesschoten notes.

But, with wide-scale adoption of its cloud technology, including incorporation into the Reference Design Kit (RDK) software stack backed by Comcast, Liberty Global and Time Warner Cable, Metrological has proved its pay TV-optimized HTML5 browser is up to supporting this latest addition to its cloud capabilities. Utilizing an open-source environment known as “WebKit for Wayland,” the browser enables better rendering of apps and next-generation UIs in a multi-device environment along with better control over all applications and resources, van Boesschoten says.

“Our browser gets past the native utilization hurdle,” he says. “It’s very fast with the ability to read data at 60 frames per second.”

In the hybrid deployment architecture used for integrating OTT services there’s a careful balance between functions residing in the cloud and on the STB. “At the hardware level we perform integration for graphics rendering with the CPU,” van Boesschoten says. In an RDK set-top environment, integration with the STB SoC takes just three days, he adds.

Speed to market is greatly aided by Metrological’s integration with GStreamer, a multimedia framework included in the RDK software stack to support secure streaming of content over the home network with a full set of components for managing complex renderings across all networked devices. GStreamer employs a plugin model supporting implementation of a wide range of codecs, filters and other resources that can be mixed and matched through developer-defined pipelines to enable feature-rich multimedia applications.

“We have tremendous experience with GStreamer,”  van Boesschoten says. “As long as an application supports GStreamer we can make that app work with whatever STB and middleware environment you bring to the table.”  Metrological can achieve the STB-level integration with apps that aren’t compatible with GStreamer, but it takes a little longer, he adds.

At the cloud layer in the hybrid architectural approach Metrological supports all the state functions (pause, rewind, resume, reset), positioning of the app in the UI (whether as a standalone app or as a channel selection or both) and any modifications tied to rendering on different types of devices beyond the STB. Device certification, subscriber authentication and security provisioning live in the cloud as well. A systematic, automated approach to pushing to the STB whatever DRM or other security mechanisms are required by a particular OTT service is critical to quickly mounting such apps, van Boesschoten notes.

In the case of Liberty’s integration with Netflix all these capabilities were put into play with the existing Cisco middleware platform with a minimum of heavy lifting. “We’ve defined the platform to be useful regardless of whether the STB is running Cisco, ARRIS or somebody else’s middleware,” he says.

There are likely to be many more customers for the new Metrological platform, given how widespread the OTT service integration strategy has become. As previously reported, a recent global survey of operators’ service innovation priorities by the Pay TV Innovation Forum found that on-boarding OTT content was one of the top three priorities among service providers everywhere.

“This is opening an important new business opportunity for us,” Ghijsen says. “Meeting Netflix’s requirements for engagement has been a difficult undertaking for operators. Now we’ve validated it can be done much faster with far less effort.”


Pay TV Operators Worldwide Detail Responses to Disruption

Koby Zontag, VP Media Sales & Business Development, PCCW

Common Thread in Top Innovation Priorities Reflects Consistency of Competitive Threats

November 22, 2016 – Entering 2017 the challenges faced by pay TV providers the world over are remarkably consistent region to region, as evidenced by the results of an extensive global survey of operators undertaken by the Pay TV Innovation Forum initiative spearheaded by NAGRA. At the same time, innovation strategies vary depending on regional market conditions and where any given service provider sits in the intensifying competitive scrum.

In the interview that follows, Simon Trudelle, senior marketing director for NAGRA, provides an overview of the survey process and its findings. We then present excerpts from Pay TV Innovation Forum interviews with six executives from different regions of the world who describe the market conditions, challenges and innovation strategies that characterize their operational environments. Companies represented include AT&T/DirecTV, Liberty Global, Hong Kong’s PCCW, Brazil’s Oi, Link Net-First Media in Indonesia and Telekom Malaysia.

ScreenPlays – It’s great to have this opportunity to catch up with you, Simon, especially in light of some of the research that’s come out of the Pay TV Innovation Forum that NAGRA has been spearheading. Why don’t we begin with your telling us what this is, how long it’s been operating and what its agenda is?

Simon Trudelle, senior product marketing director, NAGRA – It’s a program we launched in Q2 2016. A final report and conclusions were released at IBC 2016.

The program aims to look at the state of innovation in the pay TV industry and really answer the question of what will be driving growth in the years to come at a global level. The approach we’ve taken is to work with a London-based consultancy, MTM. They have been experts in the TV space for over a decade.

They researched the market around the world looking at the top 231 operators across the leading countries and analyzing the state of innovation with each of these operators. And then we opened up the conversation with industry executives. Over 200 people were asked to contribute and to provide their view of what are the priorities in terms of innovation for years to come.

We ran six workshops in different parts of the world – Europe, in London and Rome; Asia-Pacific, in Singapore; and in the U.S., Los Angeles. And we also went to Mexico and Brazil. We surveyed executives in each of these regions to capture their input and also ran some surveys and analyzed data to get a complete view of the situation today and where it’s headed.

SP – I don’t know of anybody that has done this. Usually you get research studies that aren’t really talking to distributors. They’re talking to everybody else to get trends and what have you.

Getting them to cooperate was no small feat I imagine. Once you did what did you find out?

Trudelle – Ultimately we realized that there are some obvious leaders worldwide. They’re not specific to one region. We listed major players that are ahead of the curve in many ways and have been able to innovate already and launch new types of services, improving the pay TV experience or even going into what we call adjacencies, new areas of growth for pay TV. We provided a benchmark and ranking of the players. That data is available in the reports.

What comes out is that, in terms of the next steps, we’re going to see more competition driving more innovation. Eighty-three percent of the service providers we surveyed said that competition is going up, and 78 percent said that innovation was the answer.

It means we are reaching a point in the industry where we know things are changing and the opportunities are there to actually grow the pay TV industry. But the recipes will be different, because the technologies and the networks to deploy pay TV services are evolving with IP and cloud technology and data becoming more and more important.

And the other dimension in terms of how to do it better for the future, in the conclusions we not only see a focus on the new technologies but also on partnerships with key vendors to accelerate this innovation process and be more agile in leveraging the best-of-breed players to get there and build the future of pay TV.

SP – Where is that collaboration in the vendor community centered? How does that get done?

Trudelle – We’ve analyzed several models. There are some consortiums that have begun to be put in place. Also, there are some contributions from open-source communities. There are also some service providers among the largest ones that have started making equity investments in some of their partner vendors.

We think there are several models. It will be a mix of them that will make service providers successful. It is certainly a new way of approaching the market. The end game is that service providers have to be in a position where they put the consumer at the center of the experience, and they’re agile enough to move their systems to the next generation of technology.

SP – What did you see as the biggest area of consensus on innovation strategies? Is it revolving around UHD and HDR? Is it starting other services? Is it mounting an over-the-top service?

Trudelle – We looked in particular at nine major categories, and out of that list there were three that stood out more. One is more on the business side, the pricing and packaging of the offering.

The feedback we’re getting from the industry is that we will move progressively away from the one-size-fits-all type of bundle to more segmented, targeted products that respond to the needs of consumer segments. And that has become possible because of OTT delivery, new technologies and new experiences that can be delivered. In the survey that came out as one of the top priorities.

Then it’s also about improving the offering in terms of content. So on-boarding OTT content, particularly the Netflixes and YouTubes of the world…

SP – A few years ago that survey would have come up near zero on that question.

Trudelle – Absolutely. We started this survey over a year ago where we already had some signs that it was becoming a reality. And now we’re seeing that happening and more service providers saying we would like to on-board more content and create the one place where you have access to all the best content. So it’s really giving pay TV its leadership role again as being the one place where the best content is available.

The third priority is increasing the reach to all screens – big screens, TV sets, very important, but also bringing the same content to other devices with the on-demand capabilities easily available from all devices. That’s more to address the needs of a younger generation that is consuming content on all these devices.

SP – Obviously, these priorities are all intertwined. They basically feed off each other as the priorities of the industry. That survey really gives us a good idea of what’s on these people’s minds. Were the findings different for North America?

Trudelle – There were trends that are stronger in the North American market. We’ve learned from service providers there is a great appetite for delivering OTT content and building an app model addressing on-demand consumption anywhere anytime and also more flexible pricing and bundling.

And with the pressure from content owners that are going direct to consumers this is also bringing service providers to look at the market with a different vision of where it’s headed. We haven’t seen that much of these trends emerging in other parts of the world yet. When we look at the four reports, we see that North America is already addressing challenges that the other regions are only dreaming about.

SP – In this area of collaboration, did anything come up around security and the fact that these new [content licensing] rules that are coming into play will require far more cooperation on enforcements in tracking piracy, which is really a pan-industry kind of agenda?

Trudelle – It does come out in some conversations that there is, especially in markets like those in Latin America, a lot of illegal content that is available and hurting the pay TV industry. We at NAGRA work with regional operators to improve the anti-piracy efforts as part of the Alianza alliance in the region.

But this is potentially holding back growth and playing a negative factor on innovation, because consumers find the content they want but through the wrong channels. That means that service providers at some point and content owners as well have to get themselves the tools and the technologies to stay in control of the distribution of content and also make sure the experience at the end is better than what you get from a pirated site. So it is both defensive and proactive.

SP – Our audience can go to your website and get your findings from the forum?

Trudelle – Absolutely. These findings are available for free download upon registration. And we’ve also published a number of public interviews with executives that were created as part of the program. They provide real examples, from a service provider perspective, of what’s happening in a given market, along with insights into how they see innovation in their companies and in the industry and what they see as the key success factors.

SP – We’ll definitely be watching the site for that input. Thanks much for taking us through this, Simon.

Excerpts from Pay TV Innovation Forum Interviews with Service Provider Executives

United States
Charles Cataldo, Manager Technical Services, DirecTV/AT&T

Pay TV Innovation Forum – How would you describe the state of the US pay-TV industry today?

Cataldo – Ten years ago, a typical pay TV subscriber was a family household. Today, the picture is very different – there might be five members of that household, each of them looking for different content. The ‘one subscription fits all’ model does not work anymore. Pay TV service providers now have to focus on building an ecosystem of products and services that appeals to each member of the household.

In addition, the younger generation has grown up watching YouTube. Their perceptions of and expectations for content are very different from those of a traditional pay TV decision maker. For a long time, I have believed that would present a great opportunity for video services that sit north of YouTube and south of traditional pay TV. That is exactly the type of standalone OTT subscription service that Major League Baseball (MLB.TV) and HBO (HBO Go) have developed.

PTVIF – What are the innovation priorities for pay TV companies in the USA? 

Cataldo – Pay TV service providers that have physical networks and are experienced in developing great content need to be able to innovate in terms of search engines and content placement on the user interface. On the other hand, OTT service providers, such as Netflix, that have flexible technology platforms and are sensitive to their customer preferences need to be able to establish relationships with major programmers in order to build great content propositions.

At the end of the day, the factors that will determine the success of a pay TV product will be content quality, followed by user experience and ease of navigation, followed by quality of delivery.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay TV service providers?

Cataldo – In terms of content, there is significant unrealized value in standalone OTT content, particularly sports, and mobile content, including mobile-first content and mobile gaming. In terms of business models, there are exciting opportunities to move beyond subscriptions. For example, pay TV service providers can utilize freemium models, where users can choose to pay the full price for the service without advertising, or get the service for free or at a reduced price with advertising. In addition, pay TV service providers can be creative in how they promote their services: instead of buying advertising, they could spend those ad dollars on offering pilot episodes to the public for free.

Shuja Khan, VP Revenue Growth Transformation, Liberty Global

PTVIF – How would you describe the state of the pay TV industry today?

Khan – When you look at the long-term evolution of the pay TV industry, the last five years have been much more disruptive than the previous ten. During the first decade of the century, European pay TV providers were focused on improving their content offerings by, for example, increasing channel lineups, differentiating themselves from free-to-air channels, and investing in their distribution platforms and set-top boxes. Today the focus is on delivering even better experiences for our customers – they are now used to almost continuous app updates compared to the three- to five-year refresh cycles we used to have. Then it’s also about bringing new content offerings to our platforms with flexible propositions and addressing the exciting new growth opportunities that are opening up with on demand, personalization and the impact of social media.

It feels like we’ve gone from a jog to a sprint triathlon!

PTVIF – Pay TV companies are often perceived as not being especially innovative. Why do you think that is the case?

Khan – In my opinion, what makes pay TV companies successful is their ability to transition breakthrough innovation into mass market adoption. The innovation may have originated in other markets, often niche markets, but what they do is make the technology reliable and easier to use and then package it in a way that is compelling. That for me is still innovation.

PTVIF – Looking forward, what do you see as the key innovation challenges facing the pay TV industry?

Khan – First of all, the pay TV delivery mechanism is very complex. It has so many components to it and bringing innovation to the whole system is not straightforward, in terms of technology and cost. I think on balance it’s better to get it out than make sure it’s perfect…and then course correct.

Secondly, organizational design is really important for innovation, and lots of pay TV companies are not designed to be innovative – they’re designed to be efficient. A lot of them are still working in silos, with little collaboration. This is one of the key reasons for some of the transformational changes that I’m involved with at Liberty Global.

Third, there are return-on-investment considerations. Pay TV is a great cash-generating business and has healthy margins, so innovative products and services can face a very high return-on-investment hurdle.

Finally, lots of pay TV operators are worried about disrupting their existing businesses, so innovation is much more likely to come from new entrants or industry outsiders. The best way to address this – and the ROI challenge – is to strategically invest, incubate, rapidly experiment and then integrate.

PTVIF – What steps can pay TV service providers take to develop and grow their businesses?

Khan – Quick ones. Pay TV service providers can’t ignore the disruptive forces facing the industry. They need to identify potential disruptions and take steps to take advantage of them.

In general, pay TV companies are doing a good job addressing the basics, investing in better set-top boxes, great OTT products and very advanced functionalities. Competition is stimulating innovation across the industry.

Secondly, the future is uncertain so we need to place bets. A good way to do that is through corporate venturing. As an investor, you can integrate the new innovation into your business – and could buy the business outright at some point, if it makes sense.

There are also lots of exciting new growth opportunities opening up for pay TV providers outside of their core business. Advertising and data is one area. There is a wealth of data that pay TV service providers can extract, analyze and monetize, leveraging return-path data from set-top boxes and OTT products. It’s a really unique asset that we have and can enable some really exciting new business models.

Thirdly, we need to follow consumer behavior and demand. This is what makes multiscreen or OTT interesting and exciting. Although TV Everywhere services are almost ubiquitous, there are still lots of opportunities to extend content onto new screens – to deliver the next generation of aggregation services and to make the mobile viewing experience easier and more user-friendly.

There is an abundance of opportunity; it’s just a case of prioritizing what’s most likely to provide the best growth.

Hong Kong
Koby Zontag, VP Media Sales and Business Development, PCCW

PTVIF – Do you think innovation is becoming more or less important to the pay TV industry?

Zontag – Innovation is definitely becoming more important to the industry, and companies are investing more in it. It is particularly important for market leaders who need to invest heavily to respond to disruptive technologies and innovate continuously to maintain their market positions. Pay TV service providers will always face potential disruptions. Today, it is OTT services, tomorrow there will be something else, so they have to be ready. It is also important to note that with major Internet businesses, such as Google and Amazon, entering the video market, the lines between different types of TV and video service providers are getting blurred.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay TV service providers?

Zontag – Service providers will focus a lot of their attention on offering great content, so we should see more original and exclusive content in the market and stronger partnerships between content owners and pay TV service providers.

Multiscreen TV Everywhere services will also be very important for pay TV service providers going forward. TV Everywhere is slowly becoming a must-have service for customers and it will soon become part of the most basic pay TV service offering. Commercially, I see a big opportunity to bring more premium content, particularly sports, to consumers, allowing them to, say, watch football finals on the go. The key challenge will be monetizing these TV Everywhere services, but there are various ways to overcome it, such as tiered pricing based on the number of supported devices or higher reliance on advertising revenue.

In terms of adjacent businesses, smart home solutions will be a very important way for pay TV service providers to extend their presence in consumer homes by providing connectivity for all consumer devices.

On the B2B side, targeted TV advertising will be a major opportunity for pay TV service providers as advertisers will be willing to pay more money for effective ways to reach their target audiences. Pay TV service providers, broadcasters and advertisers will have to work together to find a mutually beneficial business model. Today, it might be more lucrative for some broadcasters to sell TV advertising on their own. However, with TV advertising rates getting squeezed by online advertising, it will be just a matter of time before targeted TV advertising becomes a reality.

Ariel Dascal, Head of Digital Innovation, Oi

PTVIF – How would you describe the state of the Brazilian pay-TV market today?

Dascal – There is a clear generational divide in terms of how people consume TV and video content. Under 35s have very distinct viewing habits: they are very technologically savvy, they prefer streaming videos – either on subscription OTT services, YouTube or pirate sites – and consume a lot of content on mobile devices. Selling pay TV packages to them is difficult. They do not see much value in packaging, they want freedom to watch content whenever and wherever they desire. And then we have the older generation who consume TV in the traditional linear way and who are used to buying traditional pay TV services and triple-play bundles.

Although pay TV service providers need to respond to this new market reality, the pay TV industry still has huge growth potential in Brazil. There is a large untapped market, with less than half of the households subscribing to pay TV. Even among the high income households, where penetration is just over 80 percent, there is still a significant base of potential users that pay TV companies could go after.

However, there are three key barriers to further expansion. First is the price of pay TV. Most households that do not subscribe to pay TV services simply cannot afford to at the current price levels. Second, subscribing to pay TV used to be a status symbol, but with the economic crisis many subscribers are dropping their pay TV subscriptions and keeping only their broadband subscriptions. Third, some consumers are leapfrogging pay TV and going from free-to-air TV to non-linear OTT services.

PTVIF – What are the innovation priorities for pay TV companies in Brazil? 

Dascal – The number one priority is the digitization of the pay TV experience in terms of delivering a better end-to-end experience to our customers and reducing our costs of operation and customer acquisition. We need to bring our services into the 21st century. As consumers are comparing pay TV services to Netflix, pay TV service providers need to deliver an interactive digital user experience across all consumer devices.

The second priority is acquiring great content, particularly for various on-demand and streaming propositions. Pay TV service providers face a major challenge in relation to the content industry, which is slow to respond to changing market realities and still follows the traditional approach of managing release windows and selling packages of channels. The content industry is highly susceptible to disruption driven by large Internet businesses, such as Apple and Google, which will allow consumers to get whatever content they want whenever and wherever they want it.

The industry also needs to look for opportunities beyond pay TV and OTT services in areas such as e-commerce, advertising, innovative pricing, new types of content, second-screen applications, mobile-first solutions and home automation and security solutions.

Iris Wee, CMO, Link Net-First Media

PTVIF – What do you think makes the Indonesian pay TV market different?

Wee – The Indonesian pay TV industry has faced a unique set of challenges and opportunities. Historically, pay TV penetration has been low due to high levels of piracy and a very vibrant, competitive free-to-air TV market that offers high-quality local content, giving people little incentive to switch to pay TV. The pay TV market has been dominated by satellite operators that have primarily pursued aggressive pricing strategies, with little differentiation or innovation.

PTVIF – How would you describe the key developments in the Indonesian pay TV market?

Wee – I think the market is changing. First of all, the traditional DTH satellite providers have realized the limitations of their business model and are now increasingly trying to bundle their services with fixed broadband or 4G mobile data services, usually through partnerships with telcos. In addition, they are trying to move beyond pure price competition and are looking for ways to differentiate their services. However, without being able to support two-way communication, DTH satellite operators are at a big disadvantage. Hybrid set-top boxes might seem like a reasonable next step for them, but this would require significant capital expenditure and a long-term view of the business, neither of which is supported by the low-ARPU, high-churn nature of the DTH satellite pay TV business.

Secondly, a number of new fiber providers have entered the market recently, with pay TV and video playing a significant role in their market penetration strategies. Some of them offer pay TV services as part of their bundle, while others have partnered with OTT players to offer on-demand entertainment bundles.

Finally, the market has seen a number of OTT service launches. It remains to be seen whether these services will pose a substantial threat to the traditional pay TV model, but they have definitely been very innovative. OTT service providers recognized that a one-size-fits-all model would not work in the Asian market and adapted their propositions in terms of pricing and content. They have implemented a myriad of content localization techniques, such as subtitling and dubbing, and are actively looking to acquire and produce local content.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay-TV service providers?

Wee – Telcos will drive innovation in pay TV over the coming years, with broadband being key to pay TV market penetration strategies. They will not limit themselves to offering pay TV as a set-top box-based home entertainment service. Their offerings will be agnostic of consumer premises equipment and will include OTT products targeting the on-the-go digital consumer. It is only a matter of time before we will see the proliferation of digital media players, such as Chromecast and Apple TV, and these guys will be ready for that.

For mobile telcos, OTT services will be key to monetizing their mobile data services. We are already seeing a number of telco and OTT partnerships in the market, and these will be ever more important. However, the penetration of these services will heavily depend on pricing and packaging strategies.

Also, if you compare mobile networks in Europe and those in emerging Asian countries, you quickly realize that our networks cannot support a great on-the-go video experience. Some OTT and TV Everywhere services already have download-to-go functionality, and anyone trying to build a successful OTT service will need to support it.

Meanwhile, DTH satellite operators are changing their strategies and moving away from competing solely on price. The segment is seeing rationalization, investment in new set-top boxes with differentiating functionality, and a sharper focus on premium customers.

Emily Wee, VP Business and Media Operations, New Media, Telekom Malaysia

PTVIF – Where does innovation rank among the Malaysian pay-TV industry’s top priorities?

Wee – Innovation is definitely one of the top priorities. Pay TV operators have to innovate to keep up with market trends and to protect and enhance their revenue streams. For us, as a challenger in the Malaysian pay TV industry that entered the pay TV business only five years ago, innovation is particularly important. We always need to look for an edge to convince customers to choose us rather than our competitors.

Innovation has become much more important over the last couple of years. The rate of change has accelerated and we are seeing many new players in the market, while consumers have a lot more choice and freedom. A growing number of different businesses are jumping onto the OTT bandwagon, with subscription fees of some OTT services as low as a tenth of the price of traditional pay TV packages. In addition, with technology companies, such as Google and Amazon, and TV manufacturers coming into the game, the urgency for the pay TV industry to innovate and keep ahead is growing.

PTVIF – Looking ahead, what will be the most exciting areas of commercial opportunity for pay-TV service providers?

Wee – There is a substantial opportunity to bring all entertainment together on a single platform. Partnerships with OTT content providers or game developers are where a lot of convergence is happening. The key task and challenge is to ensure that the whole experience fits nicely together.

Great user experience is the missing piece of the puzzle. How can pay TV service providers make it seamless? How can they build a search and recommendations engine that encompasses not only linear content, but also all the on-demand libraries, applications and OTT content? Smart TV manufacturers were the first to attempt that. They have tried to partner with as many content providers as possible in order to bring the adoption rate of smart TVs up. However, the experience has not lived up to the expectations. It still feels a bit clunky, with users having to navigate between different standalone apps.

In the OTT space, TV Everywhere is a ‘must do’ for all operators. I think there are also interesting opportunities for pay TV companies to offer standalone OTT services that are differentiated from their core propositions and targeted at new customers outside their footprints. Sky has made it work quite well with Now TV in the UK. However, the jury is still out as to whether this would be applicable to the Malaysian pay TV market, given the differences between the two markets.

Outside the core pay TV and OTT propositions, Internet of Things and smart home solutions would be the first priority. This is particularly true for telcos, which are increasingly focused on owning the connected home. However, it is very early days for Internet of Things and smart home solutions in Malaysia. These solutions will develop much faster in other countries in the region that have higher incomes and higher broadband penetration.


Tech Advances Broaden Options For Boosting Home Wi-Fi QoE

Greg Fisher, CTO, Hitron Technologies Americas

CableLabs’ Tests Show Superior Performance for both Mesh- and Extender-Based Approaches

By Fred Dawson

November 17, 2016 – Network service providers seeking to address customer dissatisfaction over the growing disparity between broadband access speeds and Wi-Fi throughput in the home have a difficult choice to make between mesh- and repeater-based solutions that have proven they can close the gap.

Without naming vendors, CableLabs recently ran tests that showed the effectiveness of both approaches in dealing with the typical issues driving Wi-Fi-related subscriber complaints, namely simultaneous streaming of video on multiple devices in a big household. As described in a blog by John Bahr, lead architect for wireless technologies at CableLabs, the test results convey “excellent news for consumers whose access to the Internet is wireless and who want that access everywhere in their homes.”

In the fast-moving wireless technology arena, it won’t be long before network operators who rely on Wi-Fi to support connected-device access to their broadband service in the home will be looking at options that go far beyond what’s doable today, including access points (APs) that operate across three spectrum channels, devices that can operate in the 60 GHz band to support gigabit speeds, and LTE and 5G small cells able to operate over unlicensed 5 GHz spectrum. But, for now, with the volume of complaints rising by some counts to close to 50 percent of call-center traffic, the urgency is too great to wait for emerging technologies to mature.

As previously reported, AirTies, a leading proponent of mesh technology, has gained considerable traction with broadband service providers worldwide, including two named customers, Frontier and Midco, and others unnamed in North America since it began focusing sales efforts in this part of the world over a year ago. As Bahr makes clear in his blog, mesh technology, which uses multiple intelligent APs to optimize use of Wi-Fi spectrum throughout the home, has made great strides among cable companies looking for new solutions.

“Mesh Access points (MAPs) are quickly gaining traction in home networks mainly due to ease of installation (even over Repeaters/Extenders) and the promise of high throughput with whole home coverage,” Bahr says. “In the past year, there has been a dizzying array of product announcements and introductions for home Wi-Fi coverage, with many of them using mesh networking.”

As Bahr notes, mesh wireless technology, while new in residential home networking, has been in use with enterprise wireless LANs for the past decade. Simple linear extension of coverage in the home via Wi-Fi AP extenders has been a mainstay in the home networking market even longer.

But, just as mesh technology has benefited from proprietary advances in software and shrinking of form factors, repeater technology, too, has morphed beyond the traditional extender model. A case in point is Hitron Technologies, a leading supplier of DOCSIS modems and AP-equipped gateways, which is utilizing various techniques to enable highly intelligent optimization of Wi-Fi performance in the home with use of extenders.

“Our technology provides control from the gateway that automatically sets up and configures extenders so that there’s always one hop between any device and an AP,” says Greg Fisher, CTO of Hitron Technologies Americas. “We support MoCA to Wi-Fi, Ethernet-to-Wi-Fi or Wi-Fi to Wi-Fi.”

CableLabs’ tests of mesh and repeater technologies were conducted in a test home utilizing three APs to provide coverage across over 5,000 square feet of space. “We performed throughput, jitter, latency and coverage testing at more than twenty locations in and around the house,” Bahr says.

The test ran two streaming videos at HD bitrates of about 20 Mbps to video clients and a simultaneous feed to a test client. “Both mesh and AP + repeater solutions were able to handle this video throughput, as well as deliver over 50 Mbps throughput throughout the house and even to some areas 20’ outside the house,” Bahr writes.

CableLabs’ goal, he adds, is to help operators move away from dependence on proprietary solutions by defining a standardized “AP Coordination Protocol” that would enable APs to share the information essential to making client steering decisions and performing network maintenance tasks. The organization is working with vendors to come up with such a protocol with no indication yet as to how close they are to consensus.

Meanwhile, the need to limit customer dissatisfaction resulting from Wi-Fi issues that, traditionally, haven’t been regarded as an operator’s responsibility has become an urgent matter with an average of 63 percent of home Wi-Fi users worldwide experiencing issues, according to the 2015 ARRIS Consumer Experience Index report. Moreover, 72 percent of consumers surveyed by ARRIS said that Wi-Fi availability in every room of the home is very or vitally important and 54 percent indicated they are not experiencing the range of coverage they need.

As represented by the AirTies and Hitron strategies, mesh and advanced repeater options constitute very different models for moving forward. Presently, the momentum, at least as far as publicity is concerned, appears to be on the side of the mesh solution providers.

Mesh is definitely gaining momentum, says Khin Sandi Lynn, industry analyst at ABI Research. “Wi-Fi mesh network systems are one solution and a newer concept for homes, though enterprises commonly utilize them,” Lynn says. “This technology is beneficial in larger households that suffer from pockets of inadequate coverage, as their broadband routers are not strong enough to provide premium coverage to the entire home and all of its connected devices.”

But, she quickly adds, they are expensive, with prices for a three-AP mesh system ranging between $300 and $500, which “could price some residential broadband users out of the market.” That could be mitigated if service providers foot the bill and can get people to pay a subscription fee for an enhanced Wi-Fi service, she says, noting AirTies customer Midco, the Tier 2 U.S. MSO, is now charging $7.95 monthly for such a service.

While Lynn agrees with Bahr that mesh systems are easier for consumers to install, since they are self-forming and self-optimizing, a mixed wired backbone with multiple APs can be more stable, she says. As noted in our previously cited report, AirTies is offering what it calls Hybrid Mesh, which puts the wired home network under control of the mesh system software.

As explained by Bulent Celebi, executive chairman and co-founder of AirTies, Hybrid Mesh selects a combination of wired and Wi-Fi hops to route packets, which, he says, dramatically increases total network capacity. As he notes, this is significantly different from traditional wired/Wi-Fi extender configurations, which treat the wireline as a fixed backbone with no dynamic interplay to ensure optimal connectivity.

A hybrid combination of APs and MoCA or Ethernet wiring is what Hitron’s customers are most inclined toward, says Greg Fisher – but not with mesh on the wireless side. “Most of our customers agree with a non-mesh strategy,” he says. “While we can do Wi-Fi to Wi-Fi, that’s not the preferred mode.”

Hitron, which has emerged as a leading player in DOCSIS modems and gateway routers over the past few years after playing a role as OEM supplier to other brands, is focused on leveraging cloud technology and industry standards to create a home networking environment that can maximize Wi-Fi performance with use of its gateways and extenders. “Hitron’s strategy is to deliver a Wi-Fi experience where the MSO’s demarcation point is the customer’s finger tips,” Fisher says.

One element to this strategy is a new marketing and sales partnership between Hitron and Adaptive Spectrum and Signal Alignment, Inc. (ASSIA), a leading provider of broadband diagnostic and optimization solutions to telecommunications companies globally. Leveraging ASSIA’s machine-learning CloudCheck architecture with an agent solution in Hitron’s gateways, the new system performs real-time analysis taking historical information into account to automatically optimize wireless network environments without operator or user intervention, says Jarrett Miller, vice president of global alliances for ASSIA.

“CloudCheck dynamically optimizes Wi-Fi and provides operators with true visibility into and control over subscriber Wi-Fi environments,” Miller says. “This helps to eliminate, or shorten, inbound calls through the self-healing of subscriber Wi-Fi environments.” In addition, he notes, the system is able to send accurate contextually based recommendations to subscribers to aid them in ensuring extenders are properly placed or in identifying device issues, such as when a dual-mode device capable of using both the 2.4 and 5 GHz spectrum tiers fails to jump to 5 GHz when the traffic over the 2.4 GHz channel is slowing things down.
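
To make that band-steering recommendation concrete, here is a minimal sketch of the kind of check such a system might run. The data model, field names and congestion threshold are hypothetical illustrations, not ASSIA’s actual CloudCheck logic.

```python
# Illustrative sketch (NOT ASSIA's CloudCheck implementation): flag dual-band
# clients that remain on a congested 2.4 GHz channel when 5 GHz is available.
from dataclasses import dataclass

@dataclass
class Client:
    mac: str
    dual_band: bool       # supports both the 2.4 and 5 GHz tiers
    band_ghz: float       # band currently in use
    airtime_util: float   # 0.0-1.0 utilization of the client's current channel

def steering_candidates(clients, congestion_threshold=0.7):
    """Return clients that should be nudged from 2.4 GHz to 5 GHz."""
    return [
        c for c in clients
        if c.dual_band and c.band_ghz == 2.4 and c.airtime_util >= congestion_threshold
    ]

clients = [
    Client("aa:bb:cc:00:00:01", dual_band=True,  band_ghz=2.4, airtime_util=0.85),
    Client("aa:bb:cc:00:00:02", dual_band=False, band_ghz=2.4, airtime_util=0.85),
    Client("aa:bb:cc:00:00:03", dual_band=True,  band_ghz=5.0, airtime_util=0.30),
]
for c in steering_candidates(clients):
    print(f"recommend 5 GHz for {c.mac}")
```

A production system would also weigh signal strength, client history and channel conditions, but the core test, a dual-band device stuck on a congested 2.4 GHz channel, is the scenario Miller describes.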

The Hitron strategy also exploits the capabilities of devices equipped to support the IEEE protocol 802.11k, Fisher says. “As the device moves away from the gateway it automatically channels off and moves over to an extender,” he explains, noting that older devices without 802.11k must be “brute forced” into the transfer, which can take a few seconds. Also, rather than simply always connecting to the AP that provides the strongest signal, 802.11k-enabled devices connect to a more distant AP when the close one is over saturated, enabling higher throughput when there’s little or no contention on the weaker signal.
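
The selection logic Fisher describes, preferring a lightly loaded AP over a nearer one with a stronger signal, can be sketched as a simple scoring function. The RSSI-to-rate table and load model below are illustrative assumptions for the sketch, not values taken from the 802.11k standard.

```python
# Sketch of load-aware AP selection: score each AP by estimated achievable
# throughput (a rough signal-derived link rate discounted by channel load)
# rather than by raw signal strength alone.

def link_rate_mbps(rssi_dbm):
    """Very rough RSSI -> PHY rate mapping (illustrative values only)."""
    if rssi_dbm >= -55: return 400
    if rssi_dbm >= -65: return 200
    if rssi_dbm >= -75: return 80
    return 20

def best_ap(aps):
    """aps: list of (name, rssi_dbm, channel_utilization 0..1) tuples."""
    def score(ap):
        name, rssi, load = ap
        return link_rate_mbps(rssi) * (1.0 - load)  # expected airtime share
    return max(aps, key=score)[0]

aps = [
    ("gateway",  -50, 0.90),   # strongest signal, but saturated
    ("extender", -68, 0.10),   # weaker signal, nearly idle
]
print(best_ap(aps))  # the lightly loaded extender wins despite weaker RSSI
```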

Along with automating optimization of connectivity, Fisher says the Hitron/CloudCheck combo enables a more robust per-device diagnostics regime than operators get with either SNMP (Simple Network Management Protocol), which doesn’t reach beyond the gateway to monitor extenders, or TR-069, the Broadband Forum protocol that’s designed to look at all the CPE. “TR-069 polls every six hours or so when it’s implemented on a public cloud service like Google Web Services,” Fisher says. “We do it every few seconds.”

Operators are able to look at per-device performance to assess the QoS on connectivity to any CPE element, including MoCA as well as Ethernet clients, he adds.  “Analytics across the entire premises footprint allows them to see which devices need help, what steps need to be taken to optimize performance,” he says.

Hitron, which provides commercial-grade as well as residential gateways, counts Charter, Mediacom, Altice, Shaw, Rogers, GCI and Videotron as customers, some of which use its products in both commercial and residential scenarios. Charter is using the gateways just for commercial customers, Fisher says.

With this embedded customer base Hitron clearly is positioned to drive momentum behind the non-mesh approach to advanced Wi-Fi service. It will be interesting to see how these technologies play out as operators move to address the Wi-Fi dissatisfaction issue in the near term.

In the long term, other options to providing robust wireless connectivity loom. We’ll explore these in a forthcoming article.


NFV Advances May End Resistance To Complete Network Virtualization

David Ward, CTO & chief architect, service provider division, Cisco Systems

New Datacenter Technologies Combine with Orchestration Tools to Improve Prospects

By Fred Dawson

November 7, 2016 – New tools essential to making full-on conversion to network virtualization practical will soon allow network service providers to move beyond the point-solutions that have characterized use of virtualization technology so far.

While ever more operators are adopting strategies that enable some network functions to run as software on commodity servers, especially in the commercial services domain, they’ve largely shied away from pursuing full virtualization across their network and back-office infrastructures. One big reason for that is they don’t trust a fully virtualized architecture running in multiple locations in support of multiple functions to reliably allocate compute and storage resources as demand ebbs and flows without causing glitches in mission-critical applications.

Justin Paul, head of OSS marketing, Amdocs

“Our customers are telling us they need to understand and manage the impact of virtualization on a holistic basis,” says Justin Paul, head of OSS marketing at Amdocs. “Our research shows service providers want to move faster with a more complete approach to virtualization rather than having one foot in each camp.”

Forthcoming Amdocs Solutions

So far, Amdocs has seen a strong market response to its support for orchestration of vertically aligned processes that go into virtualizing a specific network function or set of functions related to specific service applications. Noting 2016 “has been a big year for Amdocs in the NFV (network function virtualization) space,” Paul cites as one example his firm’s successful collaboration with Vodafone in a proof-of-concept demonstration of software-defined VPN service at the Mobile World Congress in February.

“It took us just seven weeks to build the SDN/NFV framework with five different partners and six different network functions,” Paul says. The demo employed Amdocs’ cloud orchestration solution to coordinate all the service and virtualized CPE functionalities with a group of partners that included Juniper Networks, Aria Networks, Red Hat, Adva Optical Networking and Fortinet.

But as Vodafone executives have made clear since then, going beyond a demo to full implementation of a virtualized globally available enterprise service across 60 regional operating companies is a challenge that requires a much broader approach to making all the parts do what they’re supposed to. In an environment where hardware resources are dynamically allocated moment-to-moment to software-defined functions there’s an immense range of functionality that must be coordinated to maximum effect without violating performance priorities.

“We have seen in a lot of cases that vendors are thinking in terms of boxes when they are allocating virtualized resources,” said David Amzallag, Vodafone’s head of SDN and NFV, speaking at the Big Communications Event in May. As quoted by LightReading, he added, “This creates a misalignment in the amount of resources needed.”

Similarly, at a TM Forum event in July, Vodafone’s chief systems architect Lester Thomas, also quoted by LightReading, said virtual network functions (VNFs), the software-defined modules that are implemented in various types of virtualization frameworks, aren’t built from the ground up to exploit the cloud flexibility operators require. “They tend to make traditional network elements software based,” Thomas said. “Just virtualizing it doesn’t make it elastic and self-healing; so the next stage is to take systems and make them easy to deploy and scale.”

Vodafone hasn’t said whether it will tap the same vendors that put together the demo for its forthcoming VPN+ service, slated for commercialization next year. The demo utilized the Amdocs Network Cloud Service Orchestrator to maximize real-time operational efficiency by adapting to traffic loads, ensuring system recovery through dynamic policies and self-healing and leveraging analytics to mitigate the impact of security threats.

But Paul suggests Amdocs will soon have more to offer in this vein with a range of capabilities designed to support a holistic approach to NFV that reaches to massive scales. “We’ll be adding a new component in our NFV architecture – Amdocs Active Inventory,” he says.

“Up to now we’ve been in the early stages of network virtualization where people aren’t moving things all over,” he adds. “Functions sit on dedicated servers. They’re not integrating the whole network. This is where Active Inventory comes in.”

Paul describes the new component as “a kind of a jack of all trades that goes to all the systems in the network to pull and orchestrate data essential to network-wide NFV efficiency.” Using predictive analytics, the platform will be able to size up everything to better prepare for system-wide resource utilization as needs shift, he says.

For example, the platform will provide instantaneous, actionable answers to questions such as: “Where is there free capacity in datacenters? What’s the configuration of all the servers? Can they support functions like encryption or whatever? Do we move to another datacenter? Do we create a new VM (virtual machine)? Do we need to instantiate servers in the public cloud?”
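
A toy sketch shows the shape of such a placement query. The inventory layout, field names and function below are hypothetical illustrations, not Amdocs APIs.

```python
# Illustrative placement query over a datacenter inventory (hypothetical
# model, not the Amdocs Active Inventory product).

datacenters = {
    "dc-east": {"free_vcpus": 12, "free_gb": 64,  "features": {"encryption"}},
    "dc-west": {"free_vcpus": 96, "free_gb": 512, "features": {"encryption", "sriov"}},
}

def place_vnf(need_vcpus, need_gb, need_features):
    """Return the first datacenter that can host the VNF, or None."""
    for name, dc in datacenters.items():
        if (dc["free_vcpus"] >= need_vcpus
                and dc["free_gb"] >= need_gb
                and need_features <= dc["features"]):
            return name
    return None  # no private capacity: instantiate in the public cloud instead

print(place_vnf(32, 128, {"encryption"}))  # -> dc-west
```

The interesting part is the fallthrough: when no datacenter qualifies, the answer to “do we need to instantiate servers in the public cloud?” becomes yes.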

New NFV Support from Guavus

Chris Menier, vice president, products & marketing, Guavus

The need for an overarching intelligence layer that can coordinate NFV scaling across multiple locations is also much on the minds of strategists at Guavus, a provider of operations analytics solutions that support proactive responses to any set of network performance anomalies that are likely to impact the quality of the end user experience. As Guavus moves into enabling personalization and optimization of use cases tied to specific service applications it is also building software solutions for ensuring quality performance in the virtualized environment, notes Chris Menier, the company’s vice president of products and marketing.

“With NFV you have a balloon effect where, when you push in one direction it impacts what’s happening elsewhere,” Menier observes. “This is especially true since applications are sharing physical resources. When you’re spinning up raw capacity for a given application you need to know whether you’re taking resources away from a higher priority application.”

Taking a pre-determined fair-share approach to resource utilization isn’t sufficient, he adds. “If you don’t know the impact of what you’re doing downstream, you can easily cause problems that impact user experience,” he says. “You’re blind with existing tools, which is a barrier to moving ahead with virtualization.”
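
The check Menier describes can be reduced to a one-line guard on any scale-up decision. The resource model below is a hypothetical illustration, not Guavus software.

```python
# Priority-aware scale-up guard (hypothetical model): only allow spinning up
# capacity for an application if the host retains the headroom reserved for
# higher-priority applications sharing the same physical resources.

def safe_to_scale(host_free_vcpus, request_vcpus, higher_priority_reserved):
    """Allow a scale-up only if reserved headroom for critical apps survives."""
    return host_free_vcpus - request_vcpus >= higher_priority_reserved

print(safe_to_scale(host_free_vcpus=16, request_vcpus=8, higher_priority_reserved=4))   # True
print(safe_to_scale(host_free_vcpus=16, request_vcpus=14, higher_priority_reserved=4))  # False
```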

Guavus will be incorporating analytics that can anticipate and prevent such problems into its product portfolio. “We have a clear way of looking at it based on what’s happening at the individual customer and device level,” Menier says. “We can predict how changes in the cloud will impact that experience and, if necessary, communicate what needs to be done to re-feature use of resources.”

Guavus has integrated its software platform into NFV orchestration as defined by the MANO (Management and Network Orchestration) framework developed by the European Telecommunications Standards Institute’s Industry Specifications Group for NFV. MANO, one of several orchestration frameworks emerging in the NFV space, is meant to foster a consistent industry approach to instantiating, scaling and configuring virtualized functions.

“The process of taking data in and spitting out answers to optimize orchestration will be automated,” Menier says. “When that happens operators will have a lot less heavy lifting to worry about.”

Major Progress at Cisco

At Cisco Systems, moving virtualization into the service provider space has long been part of its larger play supporting virtualization in the enterprise market from the early days of SDN (software-defined networking). “We have multiple platforms we’ve already virtualized,” notes Daniel Etman, director of product management at Cisco, which plans to add virtualization of its cBR-8 CCAP (Converged Cable Access Platform) to the mix next year.

For Cisco, Etman says, virtualizing specific functions isn’t the big challenge. “From a product perspective virtualization is not that complicated,” he says. “It becomes complicated from the customer’s perspective.”

In Cisco’s view network virtualization is by definition a set of processes by which dynamic allocation of datacenter resources is managed across multiple network functions to support software representing both the control planes and the data planes of those functions, not just the control planes as defined by SDN. “Network virtualization is not SDN,” Etman says.

To achieve the real cost-benefit from virtualization requires building an initial approach to NFV that takes into account the need to support ever more services, he adds. “If you just virtualize a single service platform, I’d question the value,” he says.

A year ago Cisco launched its Evolved Services Platform (ESP) as a comprehensive virtualization and orchestration software framework that creates, automates and provisions services in real time across compute, storage and network functions. The company says that by enabling the delivery of desired business outcomes for applications running in multiple domains, ESP allows service providers to deliver prepackaged services from a flexible pool of resources that can be reused and personalized for each customer, automatically and on demand.

While the platform’s orchestration layer is meant to handle a wide array of virtualized network functions, Cisco has made it possible for operators to build virtualization incrementally around VNFs tied to specific services such as what it calls Cloud VPN or to specific hardware components such as the forthcoming virtualized cBR-8. The business logic of the ESP service and resource orchestration modules is used to start up the VMs (virtual machines), activate the VNFs and dynamically create the specific service function chain with linkages that support the service profile and steer the customer traffic through them, Etman says.
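
The sequence Etman outlines, start the VMs, activate the VNFs, then link them into a chain the customer traffic traverses in order, can be sketched as follows; the function and service names are hypothetical illustrations, not ESP APIs.

```python
# Minimal sketch of service-function-chain instantiation (hypothetical names).

def build_service_chain(vnf_names):
    vms = [f"vm-{name}" for name in vnf_names]        # 1. start a VM per VNF
    vnfs = [f"{name}(active)" for name in vnf_names]  # 2. activate each VNF
    chain = " -> ".join(vnf_names)                    # 3. chain them in order
    return vms, vnfs, chain

vms, vnfs, chain = build_service_chain(["firewall", "nat", "vpn-endpoint"])
print(chain)  # firewall -> nat -> vpn-endpoint
```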

But he acknowledges that as the operational impact of NFV expands across ever more services, the orchestration requirements multiply, not only to ensure maximum efficiency in use of datacenter resources on an as-needed basis but also to ensure coordination with legacy aspects of the operational environment.  “Analytics is the key,” he says. “We have that orchestration level and the automation. We have the analytics running on top of that to support proactive network management with a holistic view of what’s happening and the ability to explain how to solve problems.”

To help take things to the level of orchestration required to support ongoing NFV migration Cisco is “working with a business unit that does orchestration for many of our business units that do virtualization platforms, which gives us a unified approach to addressing these issues,” Etman says. “What we’re driving to is to ensure that regardless of what is virtualized and what isn’t, everything is transparent from a service development and operations perspective.”

Facilitating this effort, Cisco recently introduced a new set of cloud professional services to help businesses optimize their cloud environments. Among other things these services help automate and de-risk the complexity involved in onboarding and migrating applications and workloads to the cloud with orchestration support for management across all locations, says Scott Clark, vice president of advanced services at Cisco.

Utilizing what Cisco calls its Domain Ten framework, the new initiative “provides customers with an end-to-end view of the key elements, or domains, in their data center and IT infrastructure,” Clark explains. “It helps customers plan their IT transformation by providing a strategic roadmap and steps to take to achieve their goals.”

Vector Packet Processing

More fundamentally, Cisco’s service division is going all-out in pursuit of a holistic approach to datacenter operations, which it hopes will make massively scaled NFV a practical option in the “5 9s” world of network operations. “This is where I’m putting all my attention,” says David Ward, CTO and chief architect of the division. “Getting that workflow layer right is how we can move cloud solutions to our customers.”

The goal is nothing less than an entire network running as software on computers, which requires “repeatable line-rate performance, deterministic behavior, no (aka 0, null) packet loss, realizing required scale and no compromise,” Ward says. “If this can be delivered by virtual networking technologies, then we’re in business as an industry.”

A key step in this direction is what Cisco calls Vector Packet Processing (VPP), a high-performance virtual switch-router that runs in any virtual network infrastructure to support efficient use of hardware resources. As Ward explains, commodity server cores can handle the core function processing requirements common to traditional virtualized IT workloads and to VNFs as well, but they have trouble handling the tasks tied to processing huge volumes of packet headers at the I/O stage of workloads common to services involving millions of users.

“The typical compute workload mostly consumes CPU and memory with relatively little I/O,” Ward says in a recent blog. “Network workloads are the inverse. They are by definition reading and writing a lot of network data.”

Fortunately, he adds, the demands imposed on CPUs for processing network functions are relatively modest and well defined. So the main challenge is to figure out optimizations that will increase packet-per-second performance.

Even with the fastest cores, there’s a need to overcome the bottleneck that comes with receiving, processing and transmitting high volumes of packets. With a 67.2 nanosecond time limit on processing each packet (the arrival interval for minimum-size packets on a 10 Gbps link), there aren’t enough clock cycles per packet available on a single core to handle the load within that timeframe, nor does traditional sequential processing enable timely use of multiple CPU cores for caching instructions and processing the packets.

Recourse to use of main RAM memory, which consumes about 70 nanoseconds, is out of the question. And spreading out the workload by adding ever more cores to the tasks isn’t a useful way to scale. This means a solution must be found that makes better use of all the resources in the datacenter, including enabling use of hierarchically arranged CPU caches as a supplement to other memory resources.
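
The 67.2 nanosecond figure can be reproduced with back-of-the-envelope arithmetic, assuming the commonly cited worst case of minimum-size 64-byte Ethernet frames on a 10 Gbps link (an assumption for illustration, since the article does not state the link rate, though Ward’s number matches this case):

```python
# Back-of-the-envelope check of the 67.2 ns per-packet budget.
# Assumption: 10 Gigabit Ethernet carrying minimum-size 64-byte frames,
# plus 20 bytes of preamble and inter-frame gap consumed on the wire.
LINK_BPS = 10e9
WIRE_BYTES = 64 + 20                              # 84 bytes per frame on the wire
ns_per_packet = WIRE_BYTES * 8 / LINK_BPS * 1e9   # time budget per packet
pps_millions = LINK_BPS / (WIRE_BYTES * 8) / 1e6  # worst-case packet rate
print(f"{ns_per_packet:.1f} ns per packet")       # 67.2 ns per packet
print(f"{pps_millions:.2f} Mpps")                 # 14.88 Mpps
```

At 14.88 million packets per second, a single ~3 GHz core has only on the order of 200 clock cycles per packet, which is why a main-memory access costing roughly 70 ns per lookup breaks the budget by itself.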

VPP is one of the baseline contributions to a new Linux Foundation open-source collaborative project, which complements the many initiatives focused on control, management and orchestration by formulating packet forwarding processes that ensure data plane performance in real time with no packet loss. “VPP tries to answer what can be done, what…set of technologies and techniques can be used to progress virtual networking towards the actual fit-for-purpose functional, deterministic and performant service platform it needs to be to realize the promises of fully automated network service creation and delivery,” Ward says.

VPP, which works in all virtualization OS environments, including bare metal, VMs and Linux containers, uses parallel processing to execute multi-threading across multiple CPU cores. It addresses main memory access bottlenecks by making optimal use of CPU caches through adjustments based on a variety of techniques that react not only to what’s happening in real time across all the caches but also to what’s about to happen based on instructions “pre-fetched” from memory at the control layer to anticipate imminent packet flow requirements.
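
The batching idea can be sketched in a toy form (illustrative Python only, nothing like VPP’s actual C implementation): each processing stage sweeps a whole vector of packets before the next stage runs, so that stage’s instructions stay hot in the CPU’s instruction cache instead of being evicted between packets.

```python
# Toy illustration of vector (batch) packet processing.
# Hypothetical stages and field names; real VPP runs compiled graph
# nodes over vectors of packet buffers, not Python dicts.
def stage_parse(batch):
    # First stage: mark every packet in the vector as parsed.
    return [{**pkt, "parsed": True} for pkt in batch]

def stage_lookup(batch):
    # Second stage: a stand-in forwarding lookup over the whole vector.
    return [{**pkt, "next_hop": hash(pkt["dst"]) % 4} for pkt in batch]

def process_vector(packets, vector_size=256):
    out = []
    for i in range(0, len(packets), vector_size):
        batch = packets[i:i + vector_size]
        # Every stage processes the entire batch before the next stage runs,
        # rather than pushing one packet through all stages at a time.
        for stage in (stage_parse, stage_lookup):
            batch = stage(batch)
        out.extend(batch)
    return out

pkts = [{"dst": f"10.0.0.{n % 16}"} for n in range(1000)]
print(len(process_vector(pkts)))  # 1000
```

The payoff in real hardware comes from amortizing instruction-cache fills and enabling prefetch of the next packets’ data while the current batch is in flight, which Python cannot show but the structure illustrates.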

The results have been well validated in the field, Ward says, noting VPP is a proven technology shipping in many Cisco networking products. “Combined with a compute-efficient network I/O…VPP enables very efficient and high-performing design and offers a number of attractive attributes,” he says.

HCI and Unikernels

Along with contributing VPP to the Linux-based open-source environment, Cisco is focusing on integrating the latest virtualization infrastructure technologies into its own VPP-based implementations of NFV. One of these is hyperconverged infrastructure (HCI), which overcomes the encumbrances to NFV performance in datacenters with partitioned compute and storage resources by integrating compute, storage and networking assets within a multi-core commodity appliance.

HCI platforms provide a modularized, low-cost approach to scaling the cloud with access to all the types of storage used with traditional approaches, including RAM, flash-based solid state memory and HDD (hard disk drive) storage. As they scale the HCI cloud module by module, IT teams are able to seamlessly orchestrate hardware resources across the integrated cluster to support flexible configurations of ever more workflows and applications.

“We think there’s a big place for HCI,” Ward says. “It’s going to be a very important technology.”

Cisco has already built a large customer base for a software platform, HyperFlex, that’s designed to automate introduction and use of HCI clusters in traditional datacenter environments, enabling unified management across hyperconverged and legacy resources. The company recently reported 600 customers from multiple industries worldwide have implemented HyperFlex to manage thousands of nodes and many petabytes of hyperconverged storage. Latest capabilities include extension of the reach of the platform to holistically manage multiple HCI-equipped private and public cloud environments.

Cisco is also embracing new unikernel technology as another step toward greater efficiency and reliability, Ward notes. “Multi-container and VM solutions make it really hard to move things around,” he says.

While containers provide a more flexible deployment model and consume fewer hardware resources than conventional VMs, there are still some drawbacks when it comes to implementing the interchangeable virtualized software components known as microservices, which can be scaled up and down as needed in various combinations to support specific instantiations of network functions. Because these microservices share installation images, libraries, binaries and other elements of the host OS kernel, they are subject to vulnerabilities related to the design of container kernels and to the loss of the security protection afforded by hypervisors in the traditional VM architecture.

Unikernels represent a new approach to supporting microservices that consumes far less compute resources than constructing microservice VMs with traditional virtualization technology. Running directly on the hypervisor, the OS layer that orchestrates use of hardware resources to run VMs, a unikernel compiles code from shared OS code libraries into a dedicated runtime for each microservice, isolated from the underlying OS and from other VMs. This enables high volumes of microservices to run and scale across shared hardware resources, whether they be HCI appliances or traditional servers and storage components.

All these developments point to the need to utilize the full NFV approach to virtualizing networks rather than the SDN approach that relies on use of distributed data planes running on purpose-built hardware, Etman stresses. Many network operators, especially in cable, are unconvinced but need to make up their minds before they lock themselves into a virtualization migration path, he adds.


Virtualized CCAP Options Pose Hard Choices in HFC Migration

Weidong Chen, CTO, Casa Systems

Platforms on Offer from Casa, Harmonic, Huawei and Nokia Raise SDN vs. NFV Question

By Fred Dawson

October 29, 2016 – After a protracted, multi-year debate over the best approaches to CCAP virtualization, cable MSOs finally have some real product options to consider as they look for ways to maximize the power of DOCSIS 3.1 to meet bandwidth and service requirements in the years ahead.

The aim of these vendor initiatives is to capitalize on the virtualization efficiencies made possible in CCAPs (Converged Cable Access Platforms) optimized to work with the CableLabs-defined Distributed Access Architecture (DAA) model, which relies on digital optics terminated at the HFC node by Remote PHY electronics that manage modulation, multiplexing, forward error correction and other physical layer processes in the conversion to RF for distribution over coax. The question posed by the new vendor options: once the CCAP is relieved of performing these PHY layer processes, which components of the DAA-enabled CCAP should be virtualized?

In one approach, as exemplified by Casa Systems’ recent demonstration of its forthcoming vCCAP solutions at Cable-Tec Expo in Philadelphia, both the control and data plane components of the CCAP are virtualized to run on COTS (commodity off-the-shelf) servers at headends and datacenters. Harmonic, too, has introduced a fully virtualized CCAP dubbed “CableOS.”

The other approach, known as a Remote MAC-PHY configuration, is supported by new products from Nokia and Huawei. Their solutions virtualize the control plane functions running in core locations to orchestrate provisioning, quality control and tie-ins with other back-office elements but move the CCAP data plane or MAC (Media Access Control) into the node.

From a CableLabs’ specifications standpoint there’s a complex set of options associated with Remote MAC-PHY having to do with how much of the MAC resides in the Remote MAC-PHY Device (RMD) and how much remains in the hub or headend. The Nokia and Huawei solutions entail the most comprehensive encapsulation of the MAC in the RMD, which is referred to in the specifications as Remote CCAP.

Now that real products are entering the market, it will be interesting to see how operators respond. It’s not an easy choice, given the far-reaching implications of the pros and cons associated with each approach and how they play into DAA.

Stefaan Vanhastel, head of fixed networks marketing, Nokia

For Nokia the appeal of the Remote MAC-PHY approach was strong enough to merit acquisition of Gainspeed, which provided the CCAP functions to complement the Nokia-supplied digital optics and Remote PHY components. “By co-locating the CCAP MAC with the Remote PHY electronics in our new SC-2D node operators can reduce space consumed in the headend compared to the other Remote PHY approach by a factor of seven,” says Stefaan Vanhastel, head of fixed networks marketing at Nokia.

Modeling how its solution would benefit an unnamed Tier 1 MSO in a move from a non-virtualized CCAP environment to DAA, Nokia found the platform delivered an eight-fold reduction in power, a seven-fold reduction in rack space and a substantial improvement in signal quality. And like any Remote PHY solution, the MAC-PHY approach eliminates the fiber link distance and wavelength density limitations imposed by analog optics.

Erick Keith, principal analyst for broadband networks and multiplay services at Current Analysis, offers an even more upbeat assessment. The Nokia solution, he says, “takes MSOs to the next level with at least three major ‘Force 10’ efficiency multipliers – specifically, 10x improvements in fiber efficiency, power consumption and rack space footprints over centralized CCAP implementations.”

But Casa believes the industry consensus is settling around another view. Loading up nodes with electronics adds headaches that operators don’t need, especially when the ongoing density gains of commodity processors in combination with other efficiencies tied to NFV (network function virtualization) architecture promise to greatly alleviate the space and power burden in headends, says Matt Eucker, manager for U.S. sales engineering at Casa.

“With Remote MAC-PHY you’re essentially hanging a CMTS on a pole, which leaves you with potentially hundreds of CMTSs to manage from the headend,” Eucker says. Adds Casa CTO Weidong Chen: “Anybody can build a remote MAC. We just don’t feel that’s the right architecture.”

Whichever way operators choose to go, there’s little doubt the industry is on the cusp of a major transition to distributed access architecture (DAA), with or without CCAP virtualization. A year ago, IHS-Infonetics, reporting on in-depth interviews with cable companies that collectively control 87 percent of the industry’s capex, found that 42 percent planned to deploy DAA in at least some facilities by 2017.

The adoption of DAA is spurred in part by strategies aimed at maximizing the benefits of DOCSIS 3.1, which survey respondents on average said would reach a third of their subscribers next year. DOCSIS 3.1, with use of much higher orders of QAM modulation, improved forward error correction techniques and the multiplexing efficiencies of OFDM (orthogonal frequency division multiplexing), is designed to utilize up to 1.2 GHz of RF spectrum on the coaxial links to support up to 10 gigabit-per-second speeds downstream and 1 Gbps upstream.
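
The headline capacity numbers are easy to sanity-check with rough arithmetic (a hedged illustration only: the figures below assume roughly 1 GHz of downstream spectrum and ignore FEC, guard bands and framing overhead, which the spec accounts for precisely):

```python
import math

# Rough sanity check on the DOCSIS 3.1 ~10 Gbps downstream figure.
# Illustrative arithmetic, not the spec's exact PHY accounting:
# assumes ~1 GHz of downstream spectrum at roughly 1 bit/s/Hz per
# modulation bit, before FEC and framing overhead are subtracted.
bits_per_symbol = math.log2(4096)   # 4096-QAM carries 12 bits per symbol
downstream_hz = 1.0e9               # assumed downstream spectrum (illustrative)
raw_bps = downstream_hz * bits_per_symbol
print(raw_bps / 1e9)                # 12.0 Gbps raw, trimmed toward ~10 by overhead
```

The same logic explains the asymmetry: with far less spectrum allocated upstream, the deliverable upstream rate lands near 1 Gbps.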

DAA makes it easier to push fiber deeper, not only to be able to reach the 1.2 GHz level of usable coax spectrum with elimination of long amplifier cascades, but also to continue reducing service group sizes once the 1.2 GHz capacity is reached in order to raise the amount of throughput available to each subscribing household. DAA offers the same fiber migration benefits in conjunction with use of DOCSIS 3.0 as well.

Remote PHY is indifferent to what type of approach is taken to implementing distributed CCAP technology. It works with CCAPs running on proprietary hardware as well as virtualized CCAPs running on commodity off-the-shelf (COTS) servers. But with DAA expediting migration to ever more nodes, it appears that CCAP virtualization will eventually be required to more efficiently accommodate the multi-service needs of a growing number of service groups.

When it comes to assessing which approach to virtualization is better, one big question is what efficiency gap, if any, exists between the Remote PHY and Remote MAC-PHY models. Claims from vendors taking either approach are equally impressive.

For example, Harmonic says CableOS used in conjunction with DAA saves up to 90 percent on space and power costs. Where a hardware-based CCAP typically requires nine racks of equipment to support 80 service groups, CableOS running on four racks can support more than 250. With these kinds of gains, the efficiency tradeoff question comes down to whether operators believe further gains are worth pursuing with remote placement of the MACs.
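
Harmonic’s figures reduce to a simple density comparison (plain arithmetic on the numbers cited above, nothing more):

```python
# Service-group density implied by the figures Harmonic cites:
# 9 racks for 80 service groups (hardware CCAP) vs. 4 racks for 250+ (CableOS).
hw_groups_per_rack = 80 / 9
vccap_groups_per_rack = 250 / 4
print(round(hw_groups_per_rack, 1))     # 8.9 service groups per rack
print(round(vccap_groups_per_rack, 1))  # 62.5 service groups per rack
print(round(vccap_groups_per_rack / hw_groups_per_rack, 1))  # ~7x denser
```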

It’s also important to recognize that both approaches can claim the benefits of using digital optics with Ethernet transport. Along with the ability to support denser wavelength packing over each strand of fiber at much greater transmission distances than is possible with analog optics, digital produces a higher carrier-to-noise (CNR) output after conversion to RF at the node, which in many cases enables use of higher orders of QAM on the coax.

Huawei, for example, notes that use of digital fiber connected to one of its Remote PHY nodes allows service providers to achieve 41 dB CNR, the minimum RF output at the node required to support 4K QAM (4096-QAM) with OFDM on DOCSIS 3.1 connections. Looking at current node positioning, this level of modulation supports an average eightfold gain in the number of subscribers per service group that can be served at 1 Gbps, Huawei says.
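
One hedged way to see why 41 dB matters: the Shannon capacity limit at that carrier-to-noise ratio leaves headroom above the 12 bits per symbol that 4096-QAM requires. This is an illustrative information-theoretic bound only; real DOCSIS margins also depend on FEC and implementation loss.

```python
import math

# Shannon-limit check at 41 dB CNR (illustrative bound, not a link budget).
cnr_db = 41
snr_linear = 10 ** (cnr_db / 10)               # ~12,589 in linear terms
shannon_bits_per_hz = math.log2(1 + snr_linear)
print(round(shannon_bits_per_hz, 1))           # ~13.6 bits/s/Hz
print(math.log2(4096))                         # 12.0 bits/symbol for 4096-QAM
```

The roughly 1.6-bit gap between the Shannon bound and the modulation's requirement is the margin that FEC and real-world impairments consume.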

“Our Distributed Cable Converged Access Platform is a fully baked product that now supports DOCSIS 3.1,” says Sean Long, director of optical network product management for North America. (Huawei’s first D-CCAP release only supported DOCSIS 3.0.) “We’ve announced deployments in Denmark and Peru,” Long says. “And we have some engagements with smaller operators in the U.S.”

Nokia, too, reports strong interest in its Gainspeed Virtual CCAP (V-CCAP) portfolio, with a number of lab and field trials underway across North America and Europe in advance of a commercial release scheduled by year’s end. By early 2017 Nokia will augment the options with release of a Gainspeed node supporting 10 Gbps downstream throughput on DOCSIS 3.1 links. And, as previously reported, Nokia’s Bell Labs is working on full-duplex 10 Gbps DOCSIS 3.1 (FDX), a still-developing protocol that utilizes the full 1.2 GHz of coax spectrum without any amplifiers on the coax link.

Nokia’s confidence in the Remote MAC-PHY approach to DAA rests on the notion that operators will want the flexibility to manage service offerings and spectrum allocations independently within each service group and to have more options when it comes to future migration of nodes and access technology. “The ability to remotely configure and manage each access node will be increasingly important to operational agility in the years ahead,” Vanhastel says.

This includes the ability to make the transition from legacy video to IP video on a node-by-node basis. With introduction of the V-CCAP solution Nokia is making use of the Gainspeed Video Engine, which is designed to protect operators’ deployed Edge QAM (EQAM) investments by terminating RF video services in the hub, thereby enabling Ethernet over digital optics transport to the Gainspeed nodes where the legacy video service is delivered over the existing QAM infrastructure. This allows operators to begin offering an all-IP video service over the DOCSIS 3.1 spectrum without affecting subscribers to the legacy service while simplifying eventual transition of the entire node service area to IP-based TV service.

Another important aspect to migration agility is the support Nokia’s unified cable access solution can provide for PON, point-to-point Ethernet and Wi-Fi access points. Currently, Vanhastel notes, Nokia’s EPON node can be positioned next to or in place of the Gainspeed nodes to share the existing digital fiber resources in support of fiber extensions to customer premises and public Wi-Fi APs. “In the future we’re looking for ways to combine our PON and V-CCAP nodes into a single node,” he adds.

Another future development path involves use of the HFC plant to support 5G wireless access, which, operating from small cell locations, will enable 1 Gbps or higher throughput over very high frequency millimeter wave spectrum to small clusters of end users. The environmentally hardened nodes packed with miniaturized electronic modules might eventually be used as points of fiber extension to multiple small cells, allowing the backhaul support for 5G services to be provided over the HFC digital fiber infrastructure, Vanhastel explains.

Nokia is already looking into ways to configure backhaul architecture to maximize efficiencies, he adds. On the one hand, cloud-based management of the small cell infrastructure is vital to scalability, but transport of all the raw RF feeds from clusters of small cells would require “ridiculous amounts of bandwidth,” he says.

The solution may lie with a “mid-haul” approach that does some processing to consolidate the raw feeds at some point in the network while leaving it to the cloud platform to do the heaviest lifting. “We’re looking at eight or ten mid-haul scenarios,” Vanhastel says.

In pursuing the Remote MAC-PHY path to CCAP virtualization, Nokia is utilizing the approach to virtualization known as SDN (software-defined networking), where a centralized control plane manages the data planes embedded in field-deployed purpose-built appliances. The Nokia Gainspeed Access Controller is the brains of the V-CCAP solution enabling operators to configure and manage a large and widely deployed network of MAC-equipped access nodes.

This is in contrast to NFV architecture where both the control and data planes are virtualized, doing away with dedicated purpose-built appliances altogether. “Virtualizing everything is not that easy,” Vanhastel says. “We’ve taken a pragmatic approach using virtualization in whatever ways make the best sense for solving actual problems. If you want to maximize network migration agility, you still need physical interfaces in the access network.”

Casa’s Chen sees things differently. “We believe the next step in CCAP virtualization is NFV,” he says. Coding used with Casa’s vCCAP software is agnostic to whether the NFV architecture is based on OpenStack, containers or other emerging technologies, he notes, adding, “NFV allows you to scale much more easily.”

As described by Matt Eucker, the Casa solution maximizes scalability by enabling separation of the virtualized control plane and MAC functions so that one core datacenter-positioned control layer can manage virtualized MAC software positioned in multiple headend or hub locations to control service flows across a large number of nodes. A key advantage of this approach to distributed MAC management is it “closes the timing loop” that might cause unacceptable latency between the control plane and MACs if the MACs were positioned farther away in every node.

Adding to Casa’s case for a pure NFV solution is the fact that the vCCAP solution plays into the larger cloud framework of its Axyom software architecture, Eucker notes. Axyom is an open-source virtual edge mobile computing platform that provides a common secure foundation for independent scaling of control and data plane functions. With modular components supporting Wi-Fi wireless access gateway, LTE, 5G and other functions, Axyom establishes an NFV environment for vCCAP that enables dynamic use of COTS resources across multiple virtualized applications.

Clearly, MSOs face some tough choices as they move to utilize virtualization technology in conjunction with implementing DAA across their DOCSIS 3.1 footprints. Equally clearly, it looks like they’ll have the ability to ride virtualized CCAP technology a long way into the future, whichever path they choose.
