Service Providers Archive

Pay TV Operators Worldwide Detail Responses to Disruption

Koby Zontag, VP Media Sales & Business Development, PCCW


Common Thread in Top Innovation Priorities Reflects Consistency of Competitive Threats

November 22, 2016 – Entering 2017, the challenges faced by pay TV providers the world over are remarkably consistent from region to region, as evidenced by the results of an extensive global survey of operators undertaken by the Pay TV Innovation Forum, an initiative spearheaded by NAGRA. At the same time, innovation strategies vary depending on regional market conditions and on where any given service provider sits in the intensifying competitive scrum.

In the interview that follows, Simon Trudelle, senior product marketing director for NAGRA, provides an overview of the survey process and its findings. We then present excerpts from Pay TV Innovation Forum interviews with six executives from different regions of the world who describe the market conditions, challenges and innovation strategies that characterize their operational environments. Companies represented include AT&T/DirecTV, Liberty Global, Hong Kong’s PCCW, Brazil’s Oi, Link Net-First Media in Indonesia and Telekom Malaysia.

ScreenPlays – It’s great to have this opportunity to catch up with you, Simon, especially in light of some of the research that’s come out of the Pay TV Innovation Forum that NAGRA has been spearheading. Why don’t we begin with your telling us what this is, how long it’s been operating and what its agenda is?

Simon Trudelle, senior product marketing director, NAGRA – It’s a program we launched in Q2 2016. A final report and conclusions were released at IBC 2016.

The program aims to look at the state of innovation in the pay TV industry and really answer the question of what will be driving growth in the years to come at a global level. The approach we’ve taken is to work with a London-based consultancy, MTM. They have been experts in the TV space for over a decade.

They researched the market around the world, looking at the top 231 operators across the leading countries and analyzing the state of innovation at each one. Then we opened up the conversation with industry executives: over 200 people were asked to contribute their views on what the innovation priorities will be in the years to come.

We ran six workshops in different parts of the world – in Europe, in London and Rome; in Asia-Pacific, in Singapore; and in the U.S., in Los Angeles. We also went to Mexico and Brazil. We surveyed executives in each of these regions to capture their input and also ran some surveys and analyzed data to get a complete view of the situation today and where it’s headed.

SP – I don’t know of anybody else that has done this. Usually research studies aren’t really talking to distributors; they’re talking to everybody else to pick up trends and what have you.

Getting them to cooperate was no small feat, I imagine. Once you did, what did you find out?

Trudelle – Ultimately we realized that there are some obvious leaders worldwide. They’re not specific to one region. We listed major players that are ahead of the curve in many ways and have been able to innovate already and launch new types of services, improving the pay TV experience or even going into what we call adjacencies, new areas of growth for pay TV. We provided a benchmark and ranking of the players. That data is available in the reports.

What comes out is that, in terms of the next steps, we’re going to see more competition driving more innovation. Eighty-three percent of the service providers we surveyed said that competition is going up, and 78 percent said that innovation was the answer.

It means we are reaching a point in the industry where we know things are changing and the opportunities are there to actually grow the pay TV industry. But the recipes will be different, because the technologies and networks used to deploy pay TV services are evolving, with IP, cloud technology and data becoming more and more important.

The other dimension concerns how to do it better in the future. In the conclusions we see a focus not only on the new technologies but also on partnerships with key vendors to accelerate the innovation process, with service providers becoming more agile by leveraging best-of-breed players to build the future of pay TV.

SP – Where is that collaboration in the vendor community centered? How does that get done?

Trudelle – We’ve analyzed several models. Some consortiums have begun to be put in place, and there are contributions from open-source communities. Some of the largest service providers have also started making equity investments in their partner vendors.

We think there are several models, and it will be a mix of them that makes service providers successful. It is certainly a new way of approaching the market. The end game is that service providers have to be in a position where they put the consumer at the center of the experience and are agile enough to move their systems to the next generation of technology.

SP – What did you see as the biggest area of consensus on innovation strategies? Is it revolving around UHD and HDR? Is it starting other services? Is it mounting an over-the-top service?

Trudelle – We looked in particular at nine major categories, and out of that list three stood out. One is more on the business side: the pricing and packaging of the offering.

The feedback we’re getting from the industry is that we will move progressively away from the one-size-fits-all type of bundle to more segmented, targeted products that respond to the needs of consumer segments. That has become possible because of OTT delivery, new technologies and new experiences that can be delivered. In the survey that came out as one of the top priorities.

Then it’s also about improving the offering in terms of content – on-boarding OTT content, particularly the Netflixes and YouTubes of the world…

SP – A few years ago that survey would have come up near zero on that question.

Trudelle – Absolutely. We started this survey over a year ago, when we already had some signs that it was becoming a reality. Now we’re seeing it happen, with more service providers saying they would like to on-board more content and create the one place where you have access to all the best content. So it’s really giving pay TV its leadership role again as the one place where the best content is available.

The third priority is increasing reach to all screens – big screens and TV sets are very important, but it’s also about bringing the same content to other devices, with on-demand capabilities easily available from all of them. That’s more to address the needs of a younger generation that consumes content on all these devices.

SP – Obviously, these priorities are all intertwined. They basically feed off each other as the priorities of the industry. That survey really gives us a good idea of what’s on these people’s minds. Were the findings different for North America?

Trudelle – There were trends that are stronger in the North American market. We’ve learned from service providers there that there is a great appetite for delivering OTT content and building an app model addressing on-demand consumption anywhere, anytime, as well as more flexible pricing and bundling.

And with the pressure from content owners that are going direct to consumers, this is also bringing service providers to look at the market with a different vision of where it’s headed. We haven’t seen as many of these trends emerging in other parts of the world yet. When we look at the four reports, we see that North America is already addressing challenges that the other regions are only dreaming about.

SP – In this area of collaboration, did anything come up around security and the fact that these new [content licensing] rules that are coming into play will require far more cooperation on enforcement in tracking piracy, which is really a pan-industry kind of agenda?

Trudelle – It does come out in some conversations that there is, especially in markets like those in Latin America, a lot of illegal content available that is hurting the pay TV industry. We at NAGRA work with regional operators to improve anti-piracy efforts as part of the Alianza alliance in the region.

But this is potentially holding back growth and weighing on innovation, because consumers find the content they want, but through the wrong channels. That means that at some point service providers, and content owners as well, have to get themselves the tools and the technologies to stay in control of the distribution of content and to make sure the experience at the end is better than what you get from a pirated site. So it is both defensive and proactive.

SP – Our audience can go to your website and get your findings from the forum?

Trudelle – Absolutely. These findings are available for free download upon registration at https://dtv.nagra.com/paytvif. We’ve also published a number of public interviews with executives that were created as part of the program. They provide, from a service provider perspective, real examples of what’s happening in a given market, along with insights into how these executives see innovation in their companies and in the industry, and what they see as the key success factors.

SP – We’ll definitely be watching the site for that input. Thanks much for taking us through this, Simon.


Excerpts from Pay TV Innovation Forum Interviews with Service Provider Executives

United States
Charles Cataldo, Manager Technical Services, DirecTV/AT&T

Pay TV Innovation Forum – How would you describe the state of the US pay-TV industry today?

Cataldo – Ten years ago, a typical pay TV subscriber was a family household. Today, the picture is very different – there might be five members of that household, each of them looking for different content. The ‘one subscription fits all’ model does not work anymore. Pay TV service providers now have to focus on building an ecosystem of products and services that appeals to each member of the household.

In addition, the younger generation has grown up watching YouTube. Their perceptions of and expectations for content are very different from those of a traditional pay TV decision maker. For a long time, I have believed that would present a great opportunity for video services that sit north of YouTube and south of traditional pay TV. That is exactly the type of standalone OTT subscription service that Major League Baseball (MLB.TV) and HBO (HBO Now) have developed.

PTVIF – What are the innovation priorities for pay TV companies in the USA? 

Cataldo – Pay TV service providers that have physical networks and are experienced in developing great content need to be able to innovate in terms of search engines and content placement on the user interface. On the other hand, OTT service providers, such as Netflix, that have flexible technology platforms and are attuned to their customers’ preferences need to be able to establish relationships with major programmers in order to build great content propositions.

At the end of the day, the factors that will determine the success of a pay TV product will be content quality, followed by user experience and ease of navigation, followed by quality of delivery.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay TV service providers?

Cataldo – In terms of content, there is significant unrealized value in standalone OTT content, particularly sports, and mobile content, including mobile-first content and mobile gaming. In terms of business models, there are exciting opportunities to move beyond subscriptions. For example, pay TV service providers can utilize freemium models, where users can choose to pay full price for the service without advertising, or get the service for free or at a reduced price with advertising. In addition, pay TV service providers can be creative in how they promote their services; instead of buying advertising, they could spend those ad dollars on offering pilot episodes to the public for free.

Europe
Shuja Khan, VP Revenue Growth Transformation, Liberty Global

PTVIF – How would you describe the state of the pay TV industry today?

Khan – When you look at the long-term evolution of the pay TV industry, the last five years have been much more disruptive than the previous ten. During the first decade of the century, European pay TV providers were focused on improving their content offerings by, for example, increasing channel lineups, differentiating themselves from free-to-air channels, and investing in their distribution platforms and set-top boxes. Today the focus is on delivering even better experiences for our customers – they are now used to almost continuous app updates, compared to the three-to-five-year refresh cycles we used to have. Then it’s also about bringing new content offerings to our platforms with flexible propositions and addressing the exciting new growth opportunities that are opening up with on-demand viewing, personalization and the impact of social media.

It feels like we’ve gone from a jog to a sprint triathlon!

PTVIF – Pay TV companies are often perceived as not being especially innovative. Why do you think that is the case?

Khan – In my opinion, what makes pay TV companies successful is their ability to transition breakthrough innovation into mass market adoption. The innovation may have originated in other markets, often niche markets, but what they do is make the technology reliable and easier to use and then package it in a way that is compelling. That for me is still innovation.

PTVIF – Looking forward, what do you see as the key innovation challenges facing the pay TV industry?

Khan – First of all, the pay TV delivery mechanism is very complex. It has so many components to it, and bringing innovation to the whole system is not straightforward in terms of technology and cost. I think on balance it’s better to get it out than to make sure it’s perfect…and then course correct.

Secondly, organizational design is really important for innovation, and lots of pay TV companies are not designed to be innovative – they’re designed to be efficient. A lot of them are still working in silos, with little collaboration. This is one of the key reasons for some of the transformational changes that I’m involved with at Liberty Global.

Third, there are return-on-investment considerations. Pay TV is a great cash-generating business and has healthy margins, so innovative products and services can face a very high return-on-investment hurdle.

Finally, lots of pay TV operators are worried about disrupting their existing businesses, so innovation is much more likely to come from new entrants or industry outsiders. The best way to address this – and the ROI challenge – is to strategically invest, incubate, rapidly experiment and then integrate.

PTVIF – What steps can pay TV service providers take to develop and grow their businesses?

Khan – Quick ones. Pay TV service providers can’t ignore the disruptive forces facing the industry. They need to identify potential disruptions and take steps to take advantage of them.

In general, pay TV companies are doing a good job addressing the basics, investing in better set-top boxes, great OTT products and very advanced functionalities. Competition is stimulating innovation across the industry.

Secondly, the future is uncertain, so we need to place bets. A good way to do that is through corporate venturing. As an investor, you can integrate the new innovation into your business – and could buy the business outright at some point, if it makes sense.

There are also lots of exciting new growth opportunities opening up for pay TV providers outside of their core business. Advertising and data is one area. There is a wealth of data that pay TV service providers can extract, analyze and monetize, leveraging return-path data from set-top boxes and OTT products. It’s a really unique asset that we have and can enable some really exciting new business models.

Thirdly, we need to follow consumer behavior and demand. This is what makes multiscreen or OTT interesting and exciting. Although TV Everywhere services are almost ubiquitous, there are still lots of opportunities to extend content onto new screens – to deliver the next generation of aggregation services and to make the mobile viewing experience easier and more user-friendly.

There is an abundance of opportunity; it’s just a case of prioritizing what’s most likely to provide the best growth.

Hong Kong
Koby Zontag, VP Media Sales and Business Development, PCCW
        
PTVIF – Do you think innovation is becoming more or less important to the pay TV industry?

Zontag – Innovation is definitely becoming more important to the industry, and companies are investing more in it. It is particularly important for market leaders who need to invest heavily to respond to disruptive technologies and innovate continuously to maintain their market positions. Pay TV service providers will always face potential disruptions. Today, it is OTT services, tomorrow there will be something else, so they have to be ready. It is also important to note that with major Internet businesses, such as Google and Amazon, entering the video market, the lines between different types of TV and video service providers are getting blurred.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay TV service providers?

Zontag – Service providers will focus a lot of their attention on offering great content, so we should see more original and exclusive content in the market and stronger partnerships between content owners and pay TV service providers.

Multiscreen TV Everywhere services will also be very important for pay TV service providers going forward. TV Everywhere is slowly becoming a must-have service for customers and it will soon become part of the most basic pay TV service offering. Commercially, I see a big opportunity to bring more premium content, particularly sports, to consumers, allowing them to, say, watch football finals on the go. The key challenge will be monetizing these TV Everywhere services, but there are various ways to overcome it, such as tiered pricing based on the number of supported devices or higher reliance on advertising revenue.

In terms of adjacent businesses, smart home solutions will be a very important way for pay TV service providers to extend their presence in consumer homes by providing connectivity for all consumer devices.

On the B2B side, targeted TV advertising will be a major opportunity for pay TV service providers as advertisers will be willing to pay more money for effective ways to reach their target audiences. Pay TV service providers, broadcasters and advertisers will have to work together to find a mutually beneficial business model. Today, it might be more lucrative for some broadcasters to sell TV advertising on their own. However, with TV advertising rates getting squeezed by online advertising, it will be just a matter of time before targeted TV advertising becomes a reality.

Brazil
Ariel Dascal, Head of Digital Innovation, Oi

PTVIF – How would you describe the state of the Brazilian pay-TV market today?

Dascal – There is a clear generational divide in terms of how people consume TV and video content. Under-35s have very distinct viewing habits: they are very technologically savvy, they prefer streaming video – whether on subscription OTT services, YouTube or pirate sites – and they consume a lot of content on mobile devices. Selling pay TV packages to them is difficult. They do not see much value in packaging; they want the freedom to watch content whenever and wherever they desire. And then we have the older generation, who consume TV in the traditional linear way and who are used to buying traditional pay TV services and triple-play bundles.

Although pay TV service providers need to respond to this new market reality, the pay TV industry still has huge growth potential in Brazil. There is a large untapped market, with less than half of households subscribing to pay TV. Even among high-income households, where penetration is just over 80 percent, there is still a significant base of potential users that pay TV companies could go after.

However, there are three key barriers to further expansion. First is the price of pay TV: most households that do not subscribe to pay TV services simply cannot afford to at current price levels. Second, subscribing to pay TV used to be a status symbol, but with the economic crisis many subscribers are dropping their pay TV subscriptions and keeping only their broadband subscriptions. Third, some consumers are leapfrogging pay TV, going straight from free-to-air TV to non-linear OTT services.

PTVIF – What are the innovation priorities for pay TV companies in Brazil? 

Dascal – The number one priority is the digitization of the pay TV experience in terms of delivering a better end-to-end experience to our customers and reducing our costs of operation and customer acquisition. We need to bring our services into the 21st century. As consumers are comparing pay TV services to Netflix, pay TV service providers need to deliver an interactive digital user experience across all consumer devices.

The second priority is acquiring great content, particularly for various on-demand and streaming propositions. Pay TV service providers face a major challenge in relation to the content industry, which is slow to respond to changing market realities and still follows the traditional approach of managing release windows and selling packages of channels. The content industry is highly susceptible to disruption driven by large Internet businesses, such as Apple and Google, which will allow consumers to get whatever content they want whenever and wherever they want it.

The industry also needs to look for opportunities beyond pay TV and OTT services in areas such as e-commerce, advertising, innovative pricing, new types of content, second-screen applications, mobile-first solutions and home automation and security solutions.

Indonesia
Iris Wee, CMO, Link Net-First Media
           
PTVIF – What do you think makes the Indonesian pay TV market different?

Wee – The Indonesian pay TV industry has faced a unique set of challenges and opportunities. Historically, pay TV penetration has been low due to a high level of piracy and a very vibrant, competitive free-to-air TV market that offers high-quality local content, giving people little incentive to switch to pay TV. The pay TV market has been dominated by satellite operators that have primarily pursued aggressive pricing strategies, with little differentiation or innovation.

PTVIF – How would you describe the key developments in the Indonesian pay TV market?

Wee – I think the market is changing. First of all, the traditional DTH satellite providers have realized the limitations of their business model and are now increasingly trying to bundle their services with fixed broadband or 4G mobile data services, usually through partnerships with telcos. In addition, they are trying to move beyond pure price competition and are looking for ways to differentiate their services. However, without being able to support two-way communication, DTH satellite operators are at a big disadvantage. Hybrid set-top boxes might seem like a reasonable next step for them, but this would require significant capital expenditure and a long-term view of the business, neither of which is supported by the low-ARPU, high-churn nature of the DTH satellite pay TV business.

Secondly, a number of new fiber providers have entered the market recently, with pay TV and video playing a significant role in their market penetration strategies. Some of them offer pay TV services as part of their bundle, while others have partnered with OTT players to offer on-demand entertainment bundles.

Finally, the market has seen a number of OTT service launches. It is yet to be seen whether these services are going to be a substantial threat to the traditional pay TV model, but they have definitely been very innovative. OTT service providers recognized that a one-size-fits-all model would not work in the Asian market and adapted their propositions in terms of pricing and content. They have implemented a myriad of content localization techniques, such as subtitling and dubbing, and are actively looking to acquire and produce local content.

PTVIF – Looking ahead, what will be the most exciting areas of opportunity for pay-TV service providers?

Wee – Telcos will drive innovation in pay TV over the coming years, with broadband being key to pay TV market penetration strategies. They will not limit themselves to offering pay TV as a set-top box-based home entertainment service. Their offerings will be agnostic of consumer premises equipment and will include OTT products targeting the on-the-go digital consumer. It is only a matter of time before we see the proliferation of digital media players, such as Chromecast and Apple TV, and these guys will be ready for that.

For mobile telcos, OTT services will be key to monetizing their mobile data services. We are already seeing a number of telco and OTT partnerships in the market, and these will be ever more important. However, the penetration of these services will heavily depend on pricing and packaging strategies.

Also, if you compare mobile networks in Europe and those in emerging Asian countries, you quickly realize that our networks cannot support a great on-the-go video experience. Some OTT and TV Everywhere services already have download-to-go functionality, and anyone trying to build a successful OTT service will need to support it.

Meanwhile, DTH satellite operators are changing their strategies and moving away from competing solely on price. The segment is seeing rationalization and investment in new set-top boxes with differentiating functionalities, along with more focus on premium customers.

Malaysia
Emily Wee, VP Business and Media Operations, New Media, Telekom Malaysia

PTVIF – Where does innovation rank among the Malaysian pay-TV industry’s top priorities?

Wee – Innovation is definitely one of the top priorities. Pay TV operators have to innovate to keep up with market trends and to protect and enhance their revenue streams. For us, as a challenger in the Malaysian pay TV industry that entered the pay TV business only five years ago, innovation is particularly important. We always need to look for an edge to convince customers to choose us rather than our competitors.

Innovation has become much more important over the last couple of years. The rate of change has accelerated and we are seeing many new players in the market, while consumers have a lot more choice and freedom. A growing number of different businesses are jumping onto the OTT bandwagon, with subscription fees of some OTT services as low as a tenth of the price of traditional pay TV packages. In addition, with technology companies, such as Google and Amazon, and TV manufacturers coming into the game, the urgency for the pay TV industry to innovate and keep ahead is growing.

PTVIF – Looking ahead, what will be the most exciting areas of commercial opportunity for pay-TV service providers?

Wee – There is a substantial opportunity to bring all entertainment together on a single platform. Partnerships with OTT content providers or game developers are where a lot of convergence is happening. The key task and challenge is to ensure that the whole experience fits nicely together.

Great user experience is the missing piece of the puzzle. How can pay TV service providers make it seamless? How can they build a search and recommendations engine that encompasses not only linear content, but also all the on-demand libraries, applications and OTT content? Smart TV manufacturers were the first to attempt that. They have tried to partner with as many content providers as possible in order to bring the adoption rate of smart TVs up. However, the experience has not lived up to the expectations. It still feels a bit clunky, with users having to navigate between different standalone apps.

In the OTT space, TV Everywhere is a ‘must do’ for all operators. I think there are also interesting opportunities for pay TV companies to offer standalone OTT services that are differentiated from their core propositions and targeted at new customers outside their footprints. Sky has made it work quite well with Now TV in the UK. However, the jury is still out as to whether this would be applicable to the Malaysian pay TV market, given the differences between the two markets.

Outside the core pay TV and OTT propositions, Internet of Things and smart home solutions would be the first priority. This is particularly true for telcos, which are increasingly focused on owning the connected home. However, it is very early days for Internet of Things and smart home solutions in Malaysia. These solutions will develop much faster in other countries in the region that have higher incomes and higher broadband penetration.

Tech Advances Broaden Options For Boosting Home Wi-Fi QoE

Greg Fisher, CTO, Hitron Technologies Americas


CableLabs’ Tests Show Superior Performance for Both Mesh- and Extender-Based Approaches

By Fred Dawson

November 17, 2016 – Network service providers seeking to address customer dissatisfaction over the growing disparity between broadband access speeds and Wi-Fi throughput in the home have a difficult choice to make between mesh- and repeater-based solutions that have proven they can close the gap.

Without naming vendors, CableLabs recently ran tests that showed the effectiveness of both approaches in dealing with the typical issues driving Wi-Fi-related subscriber complaints, namely simultaneous streaming of video on multiple devices in a big household. As described in a blog by John Bahr, lead architect for wireless technologies at CableLabs, the test results convey “excellent news for consumers whose access to the Internet is wireless and who want that access everywhere in their homes.”

In the fast-moving wireless technology arena, it won’t be long before network operators who rely on Wi-Fi to support connected-device access to their broadband service in the home will be looking at options that go far beyond what’s doable today. These include access points (APs) that use three spectrum channels, devices that operate in the 60 GHz band to support gigabit speeds, and LTE and 5G small cells able to operate over unlicensed 5 GHz spectrum. But, for now, with the volume of complaints rising by some counts to close to 50 percent of call-center traffic, the urgency is too great to wait for emerging technologies to mature.

As previously reported, AirTies, a leading proponent of mesh technology, has gained considerable traction with broadband service providers worldwide, including two named customers, Frontier and Midco, and others unnamed in North America since it began focusing sales efforts in this part of the world over a year ago. As Bahr makes clear in his blog, mesh technology, which uses multiple intelligent APs to optimize use of Wi-Fi spectrum throughout the home, has made great strides among cable companies looking for new solutions.

“Mesh Access points (MAPs) are quickly gaining traction in home networks mainly due to ease of installation (even over Repeaters/Extenders) and the promise of high throughput with whole home coverage,” Bahr says. “In the past year, there has been a dizzying array of product announcements and introductions for home Wi-Fi coverage, with many of them using mesh networking.”

As Bahr notes, mesh wireless technology, while new in residential home networking, has been in use with enterprise wireless LANs for the past decade. Simple linear extension of coverage in the home via Wi-Fi AP extenders has been a mainstay in the home networking market even longer.

But, just as mesh technology has benefited from proprietary advances in software and shrinking of form factors, repeater technology, too, has morphed beyond the traditional extender model. A case in point is Hitron Technologies, a leading supplier of DOCSIS modems and AP-equipped gateways, which is utilizing various techniques to enable highly intelligent optimization of Wi-Fi performance in the home with use of extenders.

“Our technology provides control from the gateway that automatically sets up and configures extenders so that there’s always one hop between any device and an AP,” says Greg Fisher, CTO of Hitron Technologies Americas. “We support MoCA to Wi-Fi, Ethernet-to-Wi-Fi or Wi-Fi to Wi-Fi.”

CableLabs’ tests of mesh and repeater technologies were conducted in a test home utilizing three APs to provide coverage across over 5,000 square feet of space. “We performed throughput, jitter, latency and coverage testing at more than twenty locations in and around the house,” Bahr says.

The test ran two streaming videos at HD bitrates of about 20 Mbps to video clients and a simultaneous feed to a test client. “Both mesh and AP + repeater solutions were able to handle this video throughput, as well as deliver over 50 Mbps throughput throughout the house and even to some areas 20’ outside the house,” Bahr writes.

CableLabs’ goal, he adds, is to help operators move away from dependence on proprietary solutions by defining a standardized “AP Coordination Protocol” that would enable APs to share the information essential to making client steering decisions and performing network maintenance tasks. The organization is working with vendors to come up with such a protocol with no indication yet as to how close they are to consensus.

Meanwhile, the need to limit customer dissatisfaction resulting from Wi-Fi issues that, traditionally, haven’t been regarded as an operator’s responsibility has become an urgent matter with an average of 63 percent of home Wi-Fi users worldwide experiencing issues, according to the 2015 ARRIS Consumer Experience Index report. Moreover, 72 percent of consumers surveyed by ARRIS said that Wi-Fi availability in every room of the home is very or vitally important and 54 percent indicated they are not experiencing the range of coverage they need.

As represented by the AirTies and Hitron strategies, mesh and advanced repeater options constitute very different models for moving forward. Presently, the momentum, at least as far as publicity is concerned, appears to be on the side of the mesh solution providers.

Mesh is definitely gaining momentum, says Khin Sandi Lynn, industry analyst at ABI Research. “Wi-Fi mesh network systems are one solution and a newer concept for homes, though enterprises commonly utilize them,” Lynn says. “This technology is beneficial in larger households that suffer from pockets of inadequate coverage, as their broadband routers aren’t strong enough to provide premium coverage to the entire home and all of its connected devices.”

But, she quickly adds, they are expensive, with prices for a three-AP mesh system ranging between $300 and $500, which “could price some residential broadband users out of the market.” That could be mitigated if service providers foot the bill and can get people to pay a subscription fee for an enhanced Wi-Fi service, she says, noting AirTies customer Midco, the Tier 2 U.S. MSO, is now charging $7.95 monthly for such a service.

While Lynn agrees with Bahr that mesh systems are easier for consumers to install, since they are self-forming and self-optimizing, a mixed wired backbone with multiple APs can be more stable, she says. As noted in our previously cited report, AirTies is offering what it calls Hybrid Mesh, which puts the wired home network under control of the mesh system software.

As explained by Bulent Celebi, executive chairman and co-founder of AirTies, Hybrid Mesh selects a combination of wired and Wi-Fi hops to route packets, which, he says, dramatically increases total network capacity. As he notes, this is significantly different from traditional wired/Wi-Fi extender configurations, which treat the wireline as a fixed backbone with no dynamic interplay to ensure optimal connectivity.

A hybrid combination of APs and MoCA or Ethernet wiring is what Hitron’s customers are most inclined toward, says Greg Fisher – but not with mesh on the wireless side. “Most of our customers agree with a non-mesh strategy,” he says. “While we can do Wi-Fi to Wi-Fi, that’s not the preferred mode.”

Hitron, which has emerged as a leading player in DOCSIS modems and gateway routers over the past few years after playing a role as OEM supplier to other brands, is focused on leveraging cloud technology and industry standards to create a home networking environment that can maximize Wi-Fi performance with use of its gateways and extenders. “Hitron’s strategy is to deliver a Wi-Fi experience where the MSO’s demarcation point is the customer’s fingertips,” Fisher says.

One element to this strategy is a new marketing and sales partnership between Hitron and Adaptive Spectrum and Signal Alignment, Inc. (ASSIA), a leading provider of broadband diagnostic and optimization solutions to telecommunications companies globally. Leveraging ASSIA’s machine-learning CloudCheck architecture with an agent solution in Hitron’s gateways, the new system performs real-time analysis taking historical information into account to automatically optimize wireless network environments without operator or user intervention, says Jarrett Miller, vice president of global alliances for ASSIA.

“CloudCheck dynamically optimizes Wi-Fi and provides operators with true visibility into and control over subscriber Wi-Fi environments,” Miller says. “This helps to eliminate, or shorten, inbound calls through the self-healing of subscriber Wi-Fi environments.” In addition, he notes, the system is able to send accurate contextually based recommendations to subscribers to aid them in ensuring extenders are properly placed or in identifying device issues, such as when a dual-mode device capable of using both the 2.4 and 5 GHz spectrum tiers fails to jump to 5 GHz when the traffic over the 2.4 GHz channel is slowing things down.

The Hitron strategy also exploits the capabilities of devices equipped to support the IEEE protocol 802.11k, Fisher says. “As the device moves away from the gateway it automatically channels off and moves over to an extender,” he explains, noting that older devices without 802.11k must be “brute forced” into the transfer, which can take a few seconds. Also, rather than simply always connecting to the AP that provides the strongest signal, 802.11k-enabled devices connect to a more distant AP when the close one is oversaturated, enabling higher throughput when there’s little or no contention on the weaker signal.
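
Fisher’s description amounts to a steering policy: prefer the AP that maximizes expected per-client throughput rather than raw signal strength. The sketch below is a hypothetical Python model of that policy, not Hitron’s implementation; the signal-to-rate mapping, AP names and load figures are all invented for illustration.

```python
# Hypothetical sketch of load-aware AP selection in the spirit of 802.11k:
# pick the AP with the best *expected* per-client throughput, not the
# strongest raw signal. Numbers and names are illustrative only.

def link_rate_mbps(rssi_dbm: float) -> float:
    """Crude mapping from signal strength to achievable PHY rate."""
    if rssi_dbm > -50:
        return 400.0
    if rssi_dbm > -65:
        return 200.0
    if rssi_dbm > -75:
        return 80.0
    return 20.0

def expected_throughput(rssi_dbm: float, associated_clients: int) -> float:
    """Airtime is shared, so divide the link rate among active clients."""
    return link_rate_mbps(rssi_dbm) / (associated_clients + 1)

def choose_ap(measurements: dict[str, tuple[float, int]]) -> str:
    """measurements maps AP name -> (rssi_dbm, associated_clients)."""
    return max(
        measurements,
        key=lambda ap: expected_throughput(*measurements[ap]),
    )

# The gateway AP is closer (stronger signal) but saturated; the extender
# is weaker but nearly idle, so steering prefers the extender.
aps = {"gateway": (-48.0, 7), "extender": (-68.0, 1)}
print(choose_ap(aps))  # -> "extender"
```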

Along with automating optimization of connectivity, Fisher says the Hitron/CloudCheck combo enables a more robust per-device diagnostics regime than operators get with either SNMP (Simple Network Management Protocol), which doesn’t reach beyond the gateway to monitor extenders, or TR-069, the Broadband Forum protocol that’s designed to look at all the CPE. “TR-069 polls every six hours or so when it’s implemented on a public cloud service like Google Web Services,” Fisher says. “We do it every few seconds.”

Operators are able to look at per-device performance to assess the QoS on connectivity to any CPE element, including MoCA as well as Ethernet clients, he adds.  “Analytics across the entire premises footprint allows them to see which devices need help, what steps need to be taken to optimize performance,” he says.

Hitron, which provides commercial-grade as well as residential gateways, counts Charter, Mediacom, Altice, Shaw, Rogers, GCI and Videotron as customers, some of which use its products in both commercial and residential scenarios. Charter is using the gateways just for commercial customers, Fisher says.

With this embedded customer base Hitron clearly is positioned to drive momentum behind the non-mesh approach to advanced Wi-Fi service. It will be interesting to see how these technologies play out as operators move to address the Wi-Fi dissatisfaction issue in the near term.

In the long term, other options to providing robust wireless connectivity loom. We’ll explore these in a forthcoming article.

NFV Advances May End Resistance To Complete Network Virtualization

David Ward, CTO & chief architect, service provider division, Cisco Systems


New Datacenter Technologies Combine with Orchestration Tools to Improve Prospects

By Fred Dawson

November 7, 2016 – New tools essential to making full-on conversion to network virtualization practical will soon allow network service providers to move beyond the point-solutions that have characterized use of virtualization technology so far.

While ever more operators are adopting strategies that enable some network functions to run as software on commodity servers, especially in the commercial services domain, they’ve largely shied away from pursuing full virtualization across their network and back-office infrastructures. One big reason for that is they don’t trust a fully virtualized architecture running in multiple locations in support of multiple functions to reliably allocate compute and storage resources as demand ebbs and flows without causing glitches in mission-critical applications.

Justin Paul, head of OSS marketing, Amdocs

“Our customers are telling us they need to understand and manage the impact of virtualization on a holistic basis,” says Justin Paul, head of OSS marketing at Amdocs. “Our research shows service providers want to move faster with a more complete approach to virtualization rather than having one foot in each camp.”

Forthcoming Amdocs Solutions

So far, Amdocs has seen a strong market response to its support for orchestration of vertically aligned processes that go into virtualizing a specific network function or set of functions related to specific service applications. Noting 2016 “has been a big year for Amdocs in the NFV (network function virtualization) space,” Paul cites as one example his firm’s successful collaboration with Vodafone in a proof-of-concept demonstration of software-defined VPN service at the Mobile World Congress in February.

“It took us just seven weeks to build the SDN/NFV framework with five different partners and six different network functions,” Paul says. The demo employed Amdocs’ cloud orchestration solution to coordinate all the service and virtualized CPE functionalities with a group of partners that included Juniper Networks, Aria Networks, Red Hat, Adva Optical Networking and Fortinet.

But as Vodafone executives have made clear since then, going beyond a demo to full implementation of a virtualized globally available enterprise service across 60 regional operating companies is a challenge that requires a much broader approach to making all the parts do what they’re supposed to. In an environment where hardware resources are dynamically allocated moment-to-moment to software-defined functions there’s an immense range of functionality that must be coordinated to maximum effect without violating performance priorities.

“We have seen in a lot of cases that vendors are thinking in terms of boxes when they are allocating virtualized resources,” said David Amzallag, Vodafone’s head of SDN and NFV, speaking at the Big Communications Event in May. As quoted by LightReading, he added, “This creates a misalignment in the amount of resources needed.”
Similarly, at a TM Forum event in July, Vodafone’s chief systems architect Lester Thomas, also quoted by LightReading, said virtual network functions (VNFs), the software-defined modules that are implemented in various types of virtualization frameworks, aren’t built from the ground up to exploit the cloud flexibility operators require. “They tend to make traditional network elements software based,” Thomas said. “Just virtualizing it doesn’t make it elastic and self-healing; so the next stage is to take systems and make them easy to deploy and scale.”

Vodafone hasn’t said whether it will tap the same vendors that put together the demo for its forthcoming VPN+ service, slated for commercialization next year. The demo utilized the Amdocs Network Cloud Service Orchestrator to maximize real-time operational efficiency by adapting to traffic loads, ensuring system recovery through dynamic policies and self-healing and leveraging analytics to mitigate the impact of security threats.

But Paul suggests Amdocs will soon have more to offer in this vein with a range of capabilities designed to support a holistic approach to NFV that reaches to massive scales. “We’ll be adding a new component in our NFV architecture – Amdocs Active Inventory,” he says.

“Up to now we’ve been in the early stages of network virtualization where people aren’t moving things all over,” he adds. “Functions sit on dedicated servers. They’re not integrating the whole network. This is where Active Inventory comes in.”

Paul describes the new component as “a kind of a jack of all trades that goes to all the systems in the network to pull and orchestrate data essential to network-wide NFV efficiency.” Using predictive analytics, the platform will be able to size up everything to better prepare for system-wide resource utilization as needs shift, he says.

For example, the platform will provide instantaneous, actionable answers to questions such as: “Where is there free capacity in datacenters? What’s the configuration of all the servers? Can they support functions like encryption or whatever? Do we move to another datacenter? Do we create a new VM (virtual machine)? Do we need to instantiate servers in the public cloud?”
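
A minimal sketch of the kind of placement decision those questions imply follows. It is an illustrative model only, not the Amdocs Active Inventory API; the inventory structure, capability flags and all names are assumptions.

```python
# Hypothetical placement logic for a new virtualized network function:
# walk an inventory of datacenters, filter by required capabilities
# (e.g. hardware crypto support), then pick the site with the most free
# capacity, falling back to public cloud as a last resort.

from dataclasses import dataclass

@dataclass
class Datacenter:
    name: str
    free_vcpus: int
    capabilities: set[str]

def place_vnf(required_vcpus: int, required_caps: set[str],
              inventory: list[Datacenter]) -> str:
    candidates = [
        dc for dc in inventory
        if dc.free_vcpus >= required_vcpus and required_caps <= dc.capabilities
    ]
    if candidates:
        # Prefer the site with the most headroom to absorb demand swings.
        best = max(candidates, key=lambda dc: dc.free_vcpus)
        return f"instantiate VM in {best.name}"
    return "instantiate VM in public cloud"

inventory = [
    Datacenter("east-1", free_vcpus=12, capabilities={"encryption"}),
    Datacenter("west-2", free_vcpus=64, capabilities=set()),
]
print(place_vnf(8, {"encryption"}, inventory))  # -> instantiate VM in east-1
```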

New NFV Support from Guavus

Chris Menier, vice president, product & strategy, Guavus

The need for an overarching intelligence layer that can coordinate NFV scaling across multiple locations is also much on the minds of strategists at Guavus, a provider of big data analytics solutions that bridge silos to support proactive responses to any set of anomalies in network performance that might impact the quality of end user experience. As Guavus moves into enabling personalization and optimization of use cases tied to specific service applications it is also building software solutions for ensuring quality performance in the virtualized environment, notes Chris Menier, the company’s vice president of product and strategy.

“With NFV you have a balloon effect where, when you push in one direction, it impacts what’s happening elsewhere,” Menier observes. “When you’re spinning up raw capacity for a given application you need to know whether you’re taking resources away from a higher priority application.”

Taking a pre-determined fair-share approach to resource utilization isn’t sufficient, he adds. “If you don’t know the impact of what you’re doing downstream, you can easily cause problems that impact user experience,” he says. “You’re blind with existing tools, which is a barrier to moving ahead with virtualization.”
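
A toy model of the priority-aware check Menier describes might look like the sketch below. It is not Guavus code; the pool size, priorities and headroom reserve are invented for illustration.

```python
# Toy model of a priority-aware scale-up check: before spinning up more
# capacity for one application, verify that higher-priority applications
# sharing the same resource pool keep enough headroom.

POOL_VCPUS = 100

# Current usage and priority per application (lower number = more critical).
apps = {
    "voice-core":  {"vcpus": 40, "priority": 0},
    "video-cache": {"vcpus": 30, "priority": 1},
    "batch-probe": {"vcpus": 10, "priority": 2},
}

HEADROOM_FOR_CRITICAL = 15  # vCPUs held in reserve for priority-0 bursts

def can_scale_up(app: str, extra_vcpus: int) -> bool:
    used = sum(a["vcpus"] for a in apps.values())
    free_after = POOL_VCPUS - used - extra_vcpus
    if apps[app]["priority"] == 0:
        return free_after >= 0  # critical apps may consume the reserve
    return free_after >= HEADROOM_FOR_CRITICAL

print(can_scale_up("batch-probe", 10))  # False: would eat into the reserve
print(can_scale_up("voice-core", 10))   # True: critical app may use it
```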

Guavus will be incorporating analytics that can anticipate and prevent such problems into its product portfolio. “We have a clear way of looking at it based on what’s happening at the individual customer and device level,” Menier says. “We can predict how changes in the cloud will impact that experience and, if necessary, communicate what needs to be done to re-feature use of resources.”

Guavus has integrated its software platform into NFV orchestration as defined by the MANO (Management and Network Orchestration) framework developed by the European Telecommunications Standards Institute’s Industry Specifications Group for NFV. MANO, one of several orchestration frameworks emerging in the NFV space, is meant to foster a consistent industry approach to instantiating, scaling and configuring virtualized functions.

“The process of taking data in and spitting out answers to optimize orchestration will be automated,” Menier says. “When that happens operators will have a lot less heavy lifting to worry about.”

Major Progress at Cisco

At Cisco Systems, moving virtualization into the service provider space has long been part of its larger play supporting virtualization in the enterprise market from the early days of SDN (software-defined networking). “We have multiple platforms we’ve already virtualized,” notes Daniel Etman, director of product management at Cisco, which plans to add virtualization of its cBR-8 CCAP (Converged Cable Access Platform) to the mix next year.

For Cisco, Etman says, virtualizing specific functions isn’t the big challenge. “From a product perspective virtualization is not that complicated,” he says. “It becomes complicated from the customer’s perspective.”

In Cisco’s view network virtualization is by definition a set of processes by which dynamic allocation of datacenter resources is managed across multiple network functions to support software representing both the control planes and the data planes of those functions, not just the control planes as defined by SDN. “Network virtualization is not SDN,” Etman says.

To achieve the real cost-benefit from virtualization requires building an initial approach to NFV that takes into account the need to support ever more services, he adds. “If you just virtualize a single service platform, I’d question the value,” he says.

A year ago Cisco launched its Evolved Services Platform (ESP) as a comprehensive virtualization and orchestration software framework that creates, automates and provisions services in real time across compute, storage and network functions. The company says that by enabling the delivery of desired business outcomes for applications running in multiple domains, ESP allows service providers to deliver prepackaged services from a flexible pool of resources that can be reused and personalized for each customer, automatically and on demand.

While the platform’s orchestration layer is meant to handle a wide array of virtualized network functions, Cisco has made it possible for operators to build virtualization incrementally around VNFs tied to specific services, such as what it calls Cloud VPN, or to specific hardware components, such as the forthcoming virtualized cBR-8. The business logic of the ESP service and resource orchestration modules is used to start up the VMs (virtual machines), activate the VNFs and dynamically create the specific service function chain, with linkages that support the service profile and steer the customer traffic through them, Etman says.
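
Etman’s description maps onto a generic orchestration pattern: start a VM per function, activate each VNF, then link the functions in order so customer traffic traverses the chain. The sketch below illustrates that pattern only; the function names and helper calls are hypothetical and are not Cisco ESP APIs.

```python
# Generic sketch of service-function chaining as described above:
# start a VM for each function, activate the VNF on it, then wire the
# VNFs together in order so traffic is steered through the chain.

def start_vm(function: str) -> str:
    print(f"starting VM for {function}")
    return f"vm-{function}"

def activate_vnf(vm: str, function: str) -> str:
    print(f"activating {function} on {vm}")
    return f"vnf-{function}"

def link(upstream: str, downstream: str) -> None:
    print(f"steering traffic {upstream} -> {downstream}")

def build_chain(service_profile: list[str]) -> list[str]:
    vnfs = [activate_vnf(start_vm(f), f) for f in service_profile]
    for upstream, downstream in zip(vnfs, vnfs[1:]):
        link(upstream, downstream)
    return vnfs

# A Cloud VPN-style profile: firewall, then routing, then optimization.
build_chain(["firewall", "vpn-router", "wan-optimizer"])
```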

But he acknowledges that as the operational impact of NFV expands across ever more services, the orchestration requirements multiply, not only to ensure maximum efficiency in use of datacenter resources on an as-needed basis but also to ensure coordination with legacy aspects of the operational environment.  “Analytics is the key,” he says. “We have that orchestration level and the automation. We have the analytics running on top of that to support proactive network management with a holistic view of what’s happening and the ability to explain how to solve problems.”

To help take things to the level of orchestration required to support ongoing NFV migration Cisco is “working with a business unit that does orchestration for many of our business units that do virtualization platforms, which gives us a unified approach to addressing these issues,” Etman says. “What we’re driving to is to ensure that regardless of what is virtualized and what isn’t, everything is transparent from a service development and operations perspective.”

Facilitating this effort, Cisco recently introduced a new set of cloud professional services to help businesses optimize their cloud environments. Among other things these services help automate and de-risk the complexity involved in onboarding and migrating applications and workloads to the cloud with orchestration support for management across all locations, says Scott Clark, vice president of advanced services at Cisco.

Utilizing what Cisco calls its Domain Ten framework, the new initiative “provides customers with an end-to-end view of the key elements, or domains, in their data center and IT infrastructure,” Clark explains. “It helps customers plan their IT transformation by providing a strategic roadmap and steps to take to achieve their goals.”

Vector Packet Processing

More fundamentally, Cisco’s service division is going all-out in pursuit of a holistic approach to datacenter operations, which it hopes will make massively scaled NFV a practical option in the “5 9s” world of network operations. “This is where I’m putting all my attention,” says David Ward, CTO and chief architect of the division. “Getting that workflow layer right is how we can move cloud solutions to our customers.”

The goal is nothing less than an entire network running as software on computers, which requires “repeatable line-rate performance, deterministic behavior, no (aka 0, null) packet loss, realizing required scale and no compromise,” Ward says. “If this can be delivered by virtual networking technologies, then we’re in business as an industry.”

A key step in this direction is what Cisco calls Vector Packet Processing (VPP), which is a high performance virtual switch-router that runs in any virtual network infrastructure to support efficient use of hardware resources. As Ward explains, commodity server cores can handle the core function processing requirements common to traditional virtualized IT workloads and to VNFs as well, but they have trouble handling the tasks tied to processing huge volumes of packet headers at the I/O stage of workloads common to services involving millions of users.

“The typical compute workload mostly consumes CPU and memory with relatively little I/O,” Ward says in a recent blog. “Network workloads are the inverse. They are by definition reading and writing a lot of network data.”

Fortunately, he adds, the demands imposed on CPUs for processing network functions are relatively modest and well defined. So the main challenge is to figure out optimizations that will increase packet-per-second performance.

Even with the fastest cores, there’s a need to overcome the bottleneck that comes with receiving, processing and transmitting high volumes of packets. With a 67.2 nanosecond time limit on processing each packet, there aren’t enough clock cycles per packet available on a single core to handle the load within that timeframe, nor does traditional sequential processing enable timely use of multiple CPU cores for caching instructions and processing the packets.

Recourse to main RAM, where each access consumes about 70 nanoseconds, is out of the question. And spreading out the workload by adding ever more cores to the tasks isn’t a useful way to scale. This means a solution must be found that makes better use of all the resources in the datacenter, including enabling use of hierarchically arranged CPU caches as a supplement to other memory resources.
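
The 67.2-nanosecond figure follows from simple line-rate arithmetic for minimum-size Ethernet frames on a 10 Gbps link: a 64-byte frame plus 8 bytes of preamble and a 12-byte inter-frame gap occupies 672 bits on the wire. The short calculation below reproduces the budget and shows why a roughly 70-nanosecond DRAM access per packet cannot fit.

```python
# Reproducing the per-packet time budget cited above: at 10 Gbps line
# rate, a minimum-size Ethernet frame (64 bytes) plus preamble (8 bytes)
# and inter-frame gap (12 bytes) occupies 84 bytes on the wire.

LINE_RATE_BPS = 10e9
BYTES_ON_WIRE = 64 + 8 + 12            # frame + preamble + inter-frame gap

budget_ns = BYTES_ON_WIRE * 8 / LINE_RATE_BPS * 1e9
print(f"per-packet budget: {budget_ns:.1f} ns")   # -> 67.2 ns

# A single ~70 ns main-memory access already blows the budget, which is
# why per-packet DRAM lookups are out of the question at line rate.
packets_per_sec = LINE_RATE_BPS / (BYTES_ON_WIRE * 8)
print(f"packets per second: {packets_per_sec / 1e6:.2f} M")  # ~14.88 Mpps
```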

VPP is one of the baseline contributions to a new Linux Foundation open-source collaborative project called FD.io, which complements the many initiatives focused on control, management and orchestration by formulating packet forwarding processes that ensure data plane performance in real time with no packet loss. “VPP tries to answer what can be done, what…set of technologies and techniques can be used to progress virtual networking towards the actual fit-for-purpose functional, deterministic and performant service platform it needs to be to realize the promises of fully automated network service creation and delivery,” Ward says.

VPP, which works in all virtualization OS environments, including bare metal, VMs and Linux containers, uses parallel processing to execute multi-threading across multiple CPU cores. It addresses main memory access bottlenecks by making optimal use of CPU caches through adjustments based on a variety of techniques that react not only to what’s happening in real time across all the caches but also to what’s about to happen based on instructions “pre-fetched” from memory at the control layer to anticipate imminent packet flow requirements.
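
The node-major processing order Ward describes can be illustrated conceptually. In the sketch below, the scalar loop pushes each packet through the whole graph before touching the next, while the vector loop pushes a whole batch through one graph node at a time; on real hardware the latter keeps each node’s instructions hot in cache and makes prefetching the next packet’s data predictable. This is a conceptual Python illustration, not VPP code.

```python
# Conceptual illustration of scalar vs. vector packet processing.
# Scalar: each packet traverses every node before the next packet starts,
# so each node's code is re-fetched per packet. Vector: a whole batch is
# pushed through one node at a time, amortizing instruction-cache fills.

def parse(pkt):   return {**pkt, "parsed": True}
def lookup(pkt):  return {**pkt, "next_hop": pkt["dst"] % 4}
def rewrite(pkt): return {**pkt, "ttl": pkt["ttl"] - 1}

GRAPH = [parse, lookup, rewrite]

def scalar_process(packets):
    out = []
    for pkt in packets:                     # packet-major order
        for node in GRAPH:
            pkt = node(pkt)
        out.append(pkt)
    return out

def vector_process(packets):
    vector = list(packets)
    for node in GRAPH:                      # node-major order: stay in one
        vector = [node(p) for p in vector]  # node for the whole batch
    return vector

pkts = [{"dst": i, "ttl": 64} for i in range(256)]
# Both orders produce identical results; only the traversal order differs,
# and the node-major order is what keeps caches warm on real hardware.
assert scalar_process(pkts) == vector_process(pkts)
```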

The results have been well validated in the field, Ward says, noting VPP is a proven technology shipping in many Cisco networking products. “Combined with a compute-efficient network I/O…VPP enables very efficient and high-performing design and offers a number of attractive attributes,” he says.

HCI and Unikernels
Along with contributing VPP to the Linux-based open-source environment, Cisco is focusing on integrating the latest virtualization infrastructure technologies into its own implementations of NFV with VPP. One of these is hyperconverged infrastructure (HCI), which overcomes the NFV performance encumbrances of datacenters with partitioned compute and storage resources by integrating compute, storage and networking assets within a multi-core commodity appliance.

HCI platforms provide a modularized, low-cost approach to scaling the cloud with access to all the types of storage used with traditional approaches, including RAM, flash-based solid state memory and HDD (hard disk drive) storage. As they scale the HCI cloud module by module, IT teams are able to seamlessly orchestrate hardware resources across the integrated cluster to support flexible configurations of ever more workflows and applications.

“We think there’s a big place for HCI,” Ward says. “It’s going to be a very important technology.”

Cisco has already built a large customer base for a software platform, HyperFlex, that’s designed to automate introduction and use of HCI clusters in traditional datacenter environments, enabling unified management across hyperconverged and legacy resources. The company recently reported 600 customers from multiple industries worldwide have implemented HyperFlex to manage thousands of nodes and many petabytes of hyperconverged storage. Latest capabilities include extension of the platform’s reach to holistically manage multiple HCI-equipped private and public cloud environments.

Cisco is also embracing new unikernel technology as another step toward greater efficiency and reliability, Ward notes. “Multi-container and VM solutions make it really hard to move things around,” he says.

While containers provide a more flexible deployment model and consume fewer hardware resources than conventional VMs, there are still some drawbacks when it comes to implementing the interchangeable virtualized software components known as microservices, which can be scaled up and down as needed in various combinations to support specific instantiations of network functions. Because these microservices share installation images, libraries, binaries and other elements of the host OS kernel, they are subject to vulnerabilities related to the design of container kernels and to the loss of the security protection afforded by hypervisors in the traditional VM architecture.

Unikernels represent a new approach to supporting microservices that consumes far less compute resources than would be the case were microservice VMs constructed using traditional virtualization technology. Running on hypervisors, the OS layers that orchestrate use of hardware resources to run VMs, unikernels compile code from shared OS code libraries into a dedicated OS runtime for each microservice, isolated from the underlying OS and from other VMs. This enables high volumes of microservices to run and scale across the shared hardware resources, whether they be HCI appliances or traditional servers and storage components.

All these developments point to the need to utilize the full NFV approach to virtualizing networks rather than the SDN approach that relies on use of distributed data planes running on purpose-built hardware, Etman stresses. Many network operators, especially in cable, are unconvinced but need to make up their minds before they lock themselves into a virtualization migration path, he adds.


Virtualized CCAP Options Pose Hard Choices in HFC Migration

Weidong Chen, CTO, Casa Systems


Platforms on Offer from Casa, Harmonic, Huawei and Nokia Raise SDN vs. NFV Question

By Fred Dawson

October 29, 2016 – After a protracted multi-year debate over the best approaches to CCAP virtualization, cable MSOs finally have some real product options to consider as they look for ways to maximize the power of DOCSIS 3.1 to meet bandwidth and service requirements in the years ahead.

The aim of these vendor initiatives is to capitalize on the virtualization efficiencies made possible in CCAPs (Converged Cable Access Platforms) optimized to work with the CableLabs-defined Distributed Access Architecture (DAA) model, which relies on digital optics terminated at the HFC node by Remote PHY electronics that manage modulation, multiplexing, forward error correction and other physical layer processes in the conversion to RF for distribution over coax. The question posed by the new vendor options is, once the CCAP is relieved of performing these PHY layer processes, what components of the DAA-enabled CCAP should be virtualized.

In one approach, as exemplified by Casa Systems’ recent demonstration of its forthcoming vCCAP solutions at Cable-Tec Expo in Philadelphia, both the control and data plane components of the CCAP are virtualized to run on COTS (commodity off-the-shelf) servers at headends and datacenters. Harmonic, too, has introduced a fully virtualized CCAP dubbed “CableOS.”

The other approach, known as a Remote MAC-PHY configuration, is supported by new products from Nokia and Huawei. Their solutions virtualize the control plane functions running in core locations to orchestrate provisioning, quality control and tie-ins with other back-office elements but move the CCAP data plane or MAC (Media Access Control) into the node.

From a CableLabs specifications standpoint, there’s a complex set of options associated with Remote MAC-PHY having to do with how much of the MAC resides in the Remote MAC-PHY Device (RMD) and how much remains in the hub or headend. The Nokia and Huawei solutions entail the most comprehensive encapsulation of the MAC in the RMD, which is referred to in the specifications as Remote CCAP.

Now that real products are entering the market, it will be interesting to see how operators respond. It’s not an easy choice, given the far-reaching implications of the pros and cons associated with each approach and how they play into DAA.

Stefaan Vanhastel, head of fixed networks marketing, Nokia

For Nokia the appeal of the Remote MAC-PHY approach was strong enough to merit acquisition of Gainspeed, which provided the CCAP functions to complement the Nokia-supplied digital optics and Remote PHY components. “By co-locating the CCAP MAC with the Remote PHY electronics in our new SC-2D node operators can reduce space consumed in the headend compared to the other Remote PHY approach by a factor of seven,” says Stefaan Vanhastel, head of fixed networks marketing at Nokia.

Modeling how its solution would benefit an unnamed Tier 1 MSO in a move from a non-virtualized CCAP environment to DAA, Nokia found the platform delivered an eight-fold reduction in power, a seven-fold reduction in rack space and a substantial improvement in signal quality. And like any Remote PHY solution, the MAC-PHY approach eliminates the fiber link distance and wavelength density limitations imposed by analog optics.

Erick Keith, principal analyst for broadband networks and multiplay services at Current Analysis, offers an even more upbeat assessment. The Nokia solution, he says, “takes MSOs to the next level with at least three major ‘Force 10’ efficiency multipliers – specifically, 10x improvements in fiber efficiency, power consumption and rack space footprints over centralized CCAP implementations.”

But Casa believes the industry consensus is settling around another view. Loading up nodes with electronics adds headaches that operators don’t need, especially when the ongoing density gains of commodity processors in combination with other efficiencies tied to NFV (network function virtualization) architecture promise to greatly alleviate the space and power burden in headends, says Matt Eucker, manager for U.S. sales engineering at Casa.

“With Remote MAC-PHY you’re essentially hanging a CMTS on a pole, which leaves you with potentially hundreds of CMTSs to manage from the headend,” Eucker says. Adds Casa CTO Weidong Chen: “Anybody can build a remote MAC. We just don’t feel that’s the right architecture.”

Whichever way operators choose to go, there’s little doubt the industry is on the cusp of a major transition to distributed access architecture, with or without CCAP virtualization. A year ago, IHS-Infonetics, reporting on in-depth interviews with cable companies that collectively control 87 percent of the industry’s capex, found that 42 percent planned to deploy DAA in at least some facilities by 2017.

The adoption of DAA is spurred in part by strategies aimed at maximizing the benefits of DOCSIS 3.1, which survey respondents on average said would reach a third of their subscribers next year. DOCSIS 3.1, with use of much higher orders of QAM modulation, improved forward error correction techniques and the multiplexing efficiencies of OFDM (orthogonal frequency division multiplexing), is designed to utilize up to 1.2 GHz of RF spectrum on the coaxial links to support up to 10 gigabit-per-second speeds downstream and 1 Gbps upstream.
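A rough sketch of where those DOCSIS 3.1 downstream numbers come from, assuming for illustration that about 1 GHz of the 1.2 GHz plant is allotted to the downstream and that FEC, framing and guard bands cost roughly 20 percent; real channel plans vary.

```python
import math

# Rough DOCSIS 3.1 downstream capacity sketch. The 1 GHz downstream
# share and the 0.8 efficiency factor are assumptions for illustration.

downstream_hz = 1.0e9
bits_per_symbol = math.log2(4096)   # 4096-QAM carries 12 bits/symbol
overhead = 0.8                      # assumed FEC/framing efficiency

capacity = downstream_hz * bits_per_symbol * overhead
print(f"~{capacity / 1e9:.1f} Gbps downstream")  # ~9.6 Gbps, "up to 10"
```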

DAA makes it easier to push fiber deeper, not only to be able to reach the 1.2 GHz level of usable coax spectrum with elimination of long amplifier cascades, but also to continue reducing service group sizes once the 1.2 GHz capacity is reached in order to raise the amount of throughput available to each subscribing household. DAA offers the same fiber migration benefits in conjunction with use of DOCSIS 3.0 as well.

Remote PHY is indifferent to what type of approach is taken to implementing distributed CCAP technology. It works with CCAPs running on proprietary hardware as well as virtualized CCAPs running on commodity off-the-shelf (COTS) servers. But with DAA expediting migration to ever more nodes, it appears that CCAP virtualization will eventually be required to more efficiently accommodate the multi-service needs of a growing number of service groups.

When it comes to assessing which approach to virtualization is better, one big question is what, if any, efficiency gap exists between the Remote PHY and Remote MAC-PHY models. Claims from vendors taking either approach are equally impressive.

For example, Harmonic says CableOS used in conjunction with DAA saves up to 90 percent on space and power costs. Where a hardware-based CCAP typically requires nine racks of equipment to support 80 service groups, CableOS running on four racks can support more than 250. With these kinds of gains, the efficiency tradeoff question comes down to whether operators believe further gains are worth pursuing with remote placement of the MACs.

It’s also important to recognize that both approaches can claim the benefits of using digital optics with Ethernet transport. Along with the ability to support denser wavelength packing over each strand of fiber at much greater transmission distances than is possible with analog optics, digital produces a higher carrier-to-noise ratio (CNR) after conversion to RF at the node, which in many cases enables use of higher orders of QAM on the coax.

Huawei, for example, notes that use of digital fiber connected to one of its Remote PHY nodes allows service providers to achieve 41 dB CNR, the minimum at the node’s RF output required to support 4096-QAM (4K QAM) with OFDM on DOCSIS 3.1 connections. Looking at current node positioning, this level of modulation supports an average 8x gain in the number of subscribers in a service group who can be served at 1 Gbps, Huawei says.
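As a quick sanity check on that figure: at 41 dB the Shannon ceiling is only modestly above the 12 bits per symbol that 4096-QAM carries, which is why DOCSIS 3.1 pairs the modulation with strong LDPC forward error correction.

```python
import math

# Sanity check of the 41 dB CNR requirement against the Shannon limit.
cnr_db = 41.0
snr = 10 ** (cnr_db / 10)        # ~12,589 as a linear ratio
ceiling = math.log2(1 + snr)     # ~13.6 bits/s/Hz theoretical maximum
needed = math.log2(4096)         # 12 bits/s/Hz for 4096-QAM

print(f"Shannon ceiling at {cnr_db} dB CNR: {ceiling:.1f} b/s/Hz")
print(f"4096-QAM payload: {needed:.0f} b/s/Hz, leaving a thin margin")
```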

“Our Distributed Cable Converged Access Platform is a fully baked product that now supports DOCSIS 3.1,” says Sean Long, director of optical network product management for North America. (Huawei’s first D-CCAP release only supported DOCSIS 3.0.) “We’ve announced deployments in Denmark and Peru,” Long says. “And we have some engagements with smaller operators in the U.S.”

Nokia, too, reports strong interest in its Gainspeed Virtual CCAP (V-CCAP) portfolio with a number of lab and field trials underway across North America and Europe in advance of a commercial release scheduled by year’s end. By early 2017 Nokia will augment the options with release of a Gainspeed node supporting 10 Gbps downstream throughput on DOCSIS 3.1 links. And, as previously reported, Nokia’s Bell Labs is working on full-duplex 10 Gbps DOCSIS 3.1 (FDX), a still-developing protocol that utilizes the full 1.2 GHz of coax spectrum without any amplifiers on the coax link.

Nokia’s confidence in the Remote MAC-PHY approach to DAA rests on the notion that operators will want the flexibility to manage service offerings and spectrum allocations independently within each service group and to have more options when it comes to future migration of nodes and access technology. “The ability to remotely configure and manage each access node will be increasingly important to operational agility in the years ahead,” Vanhastel says.

This includes the ability to make the transition from legacy video to IP video on a node-by-node basis. With introduction of the V-CCAP solution Nokia is making use of the Gainspeed Video Engine, which is designed to protect operators’ deployed Edge QAM (EQAM) investments by terminating RF video services in the hub, thereby enabling Ethernet over digital optics transport to the Gainspeed nodes where the legacy video service is delivered over the existing QAM infrastructure. This allows operators to begin offering an all-IP video service over the DOCSIS 3.1 spectrum without affecting subscribers to the legacy service while simplifying eventual transition of the entire node service area to IP-based TV service.

Another important aspect to migration agility is the support Nokia’s unified cable access solution can provide for PON, point-to-point Ethernet and Wi-Fi access points. Currently, Vanhastel notes, Nokia’s EPON node can be positioned next to or in place of the Gainspeed nodes to share the existing digital fiber resources in support of fiber extensions to customer premises and public Wi-Fi APs. “In the future we’re looking for ways to combine our PON and V-CCAP nodes into a single node,” he adds.

Another future development path involves use of the HFC plant to support 5G wireless access, which, operating from small cell locations, will enable 1 Gbps or higher throughput over very high frequency millimeter wave spectrum to small clusters of end users. The environmentally hardened nodes packed with miniaturized electronic modules might eventually be used as points of fiber extension to multiple small cells, allowing the backhaul support for 5G services to be provided over the HFC digital fiber infrastructure, Vanhastel explains.

Nokia is already looking into ways to configure backhaul architecture to maximize efficiencies, he adds. On the one hand, cloud-based management of the small cell infrastructure is vital to scalability, but transport of all the raw RF feeds from clusters of small cells would require “ridiculous amounts of bandwidth,” he says.

The solution may lie with a “mid-haul” approach that does some processing to consolidate the raw feeds at some point in the network while leaving it to the cloud platform to do the heaviest lifting. “We’re looking at eight or ten mid-haul scenarios,” Vanhastel says.

In pursuing the Remote MAC-PHY path to CCAP virtualization, Nokia is utilizing the approach to virtualization known as SDN (software-defined networking), where a centralized control plane manages the data planes embedded in field-deployed purpose-built appliances. The Nokia Gainspeed Access Controller is the brains of the V-CCAP solution enabling operators to configure and manage a large and widely deployed network of MAC-equipped access nodes.

This is in contrast to NFV architecture where both the control and data planes are virtualized, doing away with dedicated purpose-built appliances altogether. “Virtualizing everything is not that easy,” Vanhastel says. “We’ve taken a pragmatic approach using virtualization in whatever ways make the best sense for solving actual problems. If you want to maximize network migration agility, you still need physical interfaces in the access network.”

Casa’s Chen sees things differently. “We believe the next step in CCAP virtualization is NFV,” he says. Coding used with Casa’s vCCAP software is agnostic to whether the NFV architecture is based on OpenStack, containers or other emerging technologies, he notes, adding, “NFV allows you to scale much more easily.”

As described by Matt Eucker, the Casa solution maximizes scalability by enabling separation of the virtualized control plane and MAC functions so that one core datacenter-positioned control layer can manage virtualized MAC software positioned in multiple headend or hub locations to control service flows across a large number of nodes. A key advantage of this approach to distributed MAC management is it “closes the timing loop” that might cause unacceptable latency between the control plane and MACs if the MACs were positioned farther away in every node.

Adding to Casa’s case for a pure NFV solution is the fact that the vCCAP solution plays into the larger cloud framework of its Axyom software architecture, Eucker notes. Axyom is an open-source virtual edge mobile computing platform that provides a common secure foundation for independent scaling of control and data plane functions. With modular components supporting Wi-Fi wireless access gateway, LTE, 5G and other functions, Axyom establishes an NFV environment for vCCAP that enables dynamic use of COTS resources across multiple virtualized applications.

Clearly, MSOs face some tough choices as they move to utilize virtualization technology in conjunction with implementing DAA across their DOCSIS 3.1 footprints. Equally clearly, it looks like they’ll have the ability to ride virtualized CCAP technology a long way into the future, whichever path they choose.


The Moment Has Arrived For Powerful Next-Gen UIs

Paul Stathacopoulos, VP, strategy, TiVo


TiVo, SeaChange and Other Vendors Respond to MVPDs’ Demand for Navigational Advantage

By Fred Dawson

September 19, 2016 – After watching next-generation UIs come and go over the past few years with little impact, a jaded observer might be inclined to dismiss the latest round of vendor displays at this month’s IBC Show in Amsterdam as another collective exercise in futility. But that would be a mistake, given how vital a compelling navigation experience has become to MVPDs in their battles to build viewership in the OTT-disrupted marketplace.

Moreover, in the latest iterations of UI templates on offer from TiVo, SeaChange and other vendors as well as the underlying support mechanisms for in-house UI development provided by a host of other entities, the navigational and rendering capabilities have reached unprecedented levels of functionality, often fueled by breakthroughs in the use of AI (artificial intelligence) with advanced analytics. Perhaps most important, while some solutions such as that of SeaChange are designed to work with legacy set-top boxes, the potential for widespread implementation of next-gen UIs has been solidified by the penetration of IP-capable STBs across the MVPD ecosystem.

Now the issue is no longer whether there’s a real need or feasible way to implement these advanced interfaces. It’s which ones have what it takes to win over a gaggle of hard-to-please buyers.

SeaChange has entered the fray with a new UI it calls NitroX. “Our new generation NitroX products provide a ready-to-deploy multiscreen user experience that’s pre-integrated with SeaChange’s widely deployed Adrenalin multiscreen platform and Nucleus client software stack for rapid introduction of enhanced subscriber capabilities,” says Marek Kielczewski, SeaChange’s senior vice president of customer premises equipment software.

The UI functionalities have also been boosted through SeaChange’s recent acquisition of DCC Labs, a Warsaw, Poland-based set-top and multiscreen device software developer and integrator, notes Tony Dolph, senior vice president of marketing at SeaChange. “This supports our intention to go farther into enabling personalized user experiences through a converged UI that can operate in virtually any device environment, from old set-top boxes to smartphones and 4K screens,” Dolph says.

Supporting multiple user profiles, NitroX’s presentation intelligence curates content options that are relevant to unique users and enables social discovery within an individual subscriber’s network, establishing a highly attractive and engaging cross-device experience, he adds. The UI lets subscribers move easily between devices, including maintaining individual bookmarks, favorites and recently-watched history, as well as providing access to catch-up and recorded TV from all devices.  Companion push-pull features between devices allow for new consumption modes, such as using one device for viewing, while simultaneously controlling the viewing experience on another.

Nobody is betting bigger on the central importance of advanced navigation than TiVo, which has leveraged its acquisitions of Rovi and Digitalsmiths to build a new UI from scratch, marking the first time it has done so in 20 years. “Our new UI is running on TiVo hardware, but it also runs on other service provider CPE with integrations on the backend,” says Paul Stathacopoulos, vice president of strategy at TiVo.

“The core of the Rovi acquisition is this is a company that’s phenomenally good at user designs and integrating pay TV with OTT,” Stathacopoulos says. “We’ve responded to the fact that our customers are all struggling with how to bring together multiple services in a consumer experience that’s contiguous across all device platforms.”

The urgency of that struggle is underscored by the results of a new TiVo-sponsored survey that queried 5,500 pay TV and OTT subscribers worldwide, including 2,500 in the U.S. and 500 each in the U.K., France, Germany, China, Japan and India. From an MVPD perspective, the most disturbing findings pertain to subscribers’ propensity to downgrade or cancel their subscriptions, especially in the U.S.

While, on average, 11 percent of all global respondents said they are extremely likely to downgrade and eight percent said they are extremely likely to cancel their pay-TV service in the next 6 months, the numbers for the U.S. were 21 percent and 13 percent, respectively. Mirroring results from other recent surveys, the TiVo study found that multiple points of access for OTT is now the norm with 58 percent of all respondents reporting they already pay for more than one subscription streaming video service and 45 percent reporting that they have more than one streaming media device in their home.

The opportunity MVPDs have to leverage advanced navigation as the key to keeping and adding subscribers was evident in some of the survey findings, including one metric that showed 37 percent of global viewers have stopped watching a show they previously enjoyed because it became too difficult to access the content. On average, survey respondents said they spend four hours daily watching or streaming video content and an additional 19 minutes per day searching for something to watch.

In the U.S., Millennials spend more than six hours per day watching content and another 32 minutes searching, the survey found. Across all countries surveyed, 53 percent of Millennials said they often get recommendations with their viewing. But Millennials and other age groups as well expect more, with 47 percent of all respondents agreeing that, for the amount they pay for video services, it should be easier to find what they watch.

The benefits of better discovery were also registered by the survey, which found that consumers across all age groups who were most satisfied with their search functions watch almost seven hours of content daily, which is 21 percent more than the average viewing time of respondents in the U.S. Viewing time, at 7.5 hours daily, was even higher among users most satisfied with their recommendation functions.

“Finding what you want to watch has become increasingly difficult with the growing number of video providers,” notes Margret Schmidt, chief design officer at TiVo. “This was the impetus for the design of the new TiVo UX.”

Conducting a demo of the new TiVo interface, Schmidt says the personalized experience “brings the content the viewer wants right up front faster through expanded discovery and predictions from their own cable subscription and the best online video sources. In short, we designed this UX so the viewer spends less time searching channel guides and opening apps and more time enjoying their favorite shows.”

As Schmidt notes, Rovi, with deep experience compiling metadata and designing EPGs for the MVPD and connected TV markets, and Digitalsmiths, a leading provider of analytics-supported recommendations and other content discovery functionalities, give TiVo a formidable arsenal to work with. TiVo, she adds, has also met another major requirement for the new MVPD UI by allowing every device to become a primary screen for video consumption.

Rovi’s portfolio of capabilities was built with the aid of a number of acquisitions of its own over the past few years. Notable among them were Veveo, which brought natural-language voice recognition to spoken-voice interfaces in Rovi customers’ products, and IntegralReach, supplier of a cloud-based predictive analytics platform that contributed to the Rovi Knowledge Graph, a repository of dynamic information on program titles, celebrity names, corporate brands, locations and other elements, including machine-generated metadata from 100,000 online sources as well as data manually compiled by Rovi editors.

As a result, TiVo’s new UX can tap contextual signals, such as current world news and trends, time, location and the consumer device, along with social media activity, to deliver cues that accurately anticipate a user’s interests relative to a specific time and place. With this deep understanding TiVo customers can convey semantic, highly relevant search results and recommendations to subscribers that accurately anticipate user intent and interests, as evidenced in what TiVo calls the “predictions” component of the UX.

“This is different from recommendations,” Schmidt says. “We look at a user’s actual viewing habits and predict the shows they most likely want to watch based on what’s available at that moment.”

The UX also populates users’ watch lists with information about episodes of programs they’re watching which they may have missed, letting them know where they might find those episodes, whether in their MVPD’s free VOD library or among OTT providers they have access to. “Unified discovery and seamless access to content removes some of these barriers for the consumer, improving engagement and resulting in real business benefits, including higher content consumption, increased subscriber retention and improved service value, especially for the Millennial generation,” Stathacopoulos says.


Tier 1 MSOs Tap Hosted Service To Offload STB Security Hassles

Doug Lowther, CEO, Irdeto


Operators Gain Flexibility in Choosing Next-Gen Products

By Fred Dawson

August 4, 2016 – In what amounts to lightning speed for an industry notoriously slow to adopt radical departures from the operational status quo, more than half the Tier 1 MSOs in North America have signed on to Irdeto’s set-top security management service to help expedite execution of next-gen set-top strategies.

The Irdeto Keys & Credentials service, which, as previously reported, went into operation with its first customers just two years ago, removes a big impediment to MSOs’ maneuverability, says Irdeto CEO Doug Lowther. “Keys & Credentials provides a very flexible security capability that’s used in the next-generation set-top boxes (STBs) of most of the large cable operators,” Lowther explains.

Given that no matter how far operators go in the direction of moving set-top functions to the cloud, chip-level security will remain essential to their business models, the challenge has been to achieve the flexibility in handling security that will allow them to play the OEM field as they look for cost-effective ways to accommodate new requirements in a rapidly evolving pay TV marketplace. The Irdeto service lets cable companies “evolve their services at Internet speed without each time coming back to a company like Irdeto to make software changes,” Lowther notes. “That’s very powerful, and that’s why it’s been so widely adopted.”

Five of six Tier 1s in North America are now using Keys & Credentials, according to Paul Ragland, vice president of sales at Irdeto. The company also has picked up its first European Tier 1 MSO for the service, which, like the other customers, has chosen to remain anonymous.

“We’ve seen 100 percent growth for the service over the past 12 months,” Ragland says. Irdeto has secured and keyed over 16 million next-generation STBs since the service launched, he adds. The scope of the work includes 25 different set-top models and provisioning of more than 22 million updates to deployed STBs in the field.

In a nutshell, the service, utilizing facilities in the U.S. and The Netherlands, integrates with third party workflows for the production, delivery or updating of decryption keys and other security assets. The solution is fully integrated with the operator’s supply chain, and enabling agreements are already in place with all major chip and set-top box manufacturers, the company says.

The Irdeto pitch is that by entrusting the company to manage and provision all security keys within the STB on their behalf, cable operators not only remove a heavy burden of complexity and cost; they also have complete control and independence in choosing security solutions and in adding new services to deployed devices for the lifetime of the platform.

Customers’ reluctance to publicly acknowledge engagement with the service is par for the course in today’s highly competitive climate, where MSOs hesitate to identify the solutions they choose at the cutting edge of operations out of fear of losing advantage against competitors. But one Keys & Credentials customer executive, speaking on background, affirmed the importance of the move to his company’s STB agenda.

“We need to own the keys for the control of the video, and for the box to control what is delivered on the STB,” this executive says. “As we looked around we found that Irdeto had the unique ability to provide us with that key provisioning and grow the service.”

Looking at what lies ahead as the MSO looks for best-of-breed solutions, he adds, “Each SoC (system-on-a-chip) in our population of set tops will require the secure placement of a key package on that SoC. Irdeto already had working relationships with the key SoC providers, STB manufacturers and CAS (conditional access system) vendors that we anticipated working with. The detailed skills required for this level of security practices are better handled by a company that focuses precisely in this area of expertise.”
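Irdeto hasn’t published the mechanics of Keys & Credentials, but the executive’s description fits a familiar pattern: a key package is wrapped to a public key unique to one SoC so that only the matching private key, fused into that chip, can recover it. The sketch below illustrates that general pattern only, using Python’s cryptography package; every name in it is hypothetical, and this is not Irdeto’s published design.

```python
# Hypothetical sketch of per-SoC key provisioning: wrap a key package
# to a chip-unique public key so only that SoC's fused private key can
# unwrap it. Requires: pip install cryptography

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Stand-in for the device key pair normally generated at chip
# manufacture, with the private half fused into the SoC.
soc_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
soc_public = soc_private.public_key()

key_package = b"cas-root-key|drm-device-key|entitlement-cert"  # illustrative

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

wrapped = soc_public.encrypt(key_package, oaep)   # provisioning service side
recovered = soc_private.decrypt(wrapped, oaep)    # inside the SoC
assert recovered == key_package
```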

Hosted management solutions touching on various aspects of the increasingly complex security requirements of high-value video services have emerged across the supplier ecosystem over the past couple of years. For example, NAGRA, in a recently published white paper, alludes to the need for and its progress toward developing a cloud-hosted consolidated security management platform.

Last year Verimatrix introduced the globally interconnected Verspective Intelligence Center, a hosted service designed to help operators monitor and control threat activity and to reduce operational expense through proactive management of system status and performance. As explained by Verimatrix president Steve Oetegenn, the service helps operators meet the rigorous next-gen content protection requirements set by Hollywood studios through MovieLabs, including tracking of pirates through use of the company’s VideoMark forensic watermarking system.

“The video services and delivery landscape has become incredibly complex, and concerns about piracy and password sharing are growing,” says Glenn Hower, research analyst for Parks Associates. “Flexible content security solutions that can adapt to nearly any delivery environment will be key to protecting content while allowing delivery through different networks, services and devices.”

Irdeto’s Keys & Credentials has obviously hit a sweet spot in the growing demand for hosted security solutions in the pay TV market. “I think the reason for the success of Keys & Credentials is essentially that we’ve come up with a way of providing a robust level of security without locking in the customer,” Lowther says.

“Irdeto agreed to a neutral position in the U.S. conditional access market that made CA vendors comfortable with our services in a sensitive ecosystem,” he continues. “So we’ve essentially broken that coupling that made it necessary to have the suppliers do all the software upgrades for new services.”

As operators look to providing ironclad security in the unmanaged device space where modes of introducing roots of trust at the SoC level are already in play, they are likely to expand their utilization of Irdeto’s services, Lowther notes. “It’s not about securing pipes anymore,” he says. “It’s about securing the content from glass to glass, end to end. You have to have a level of capability across all the different security systems out there.”

Indeed, says the MSO executive quoted earlier, such requirements are very much on his company’s mind. “As we move to unmanaged devices, we will face similar needs, which we’re excited to explore with Irdeto,” he says.


Outlook Improves for Resolving Subscriber Home Wi-Fi Issues

Metin Ismail Taşkın, CTO, AirTies


NSPs Are Hopeful New Solution Can Impact Customer Satisfaction Levels

By Fred Dawson

July 13, 2016 – A new approach to sorting out causes for subscriber complaints relating to problems with Wi-Fi connectivity is gaining traction among leading service providers as a potential linchpin in their efforts to improve customer satisfaction.

The new Wi-Fi performance monitoring system introduced by AirTies at the recent INTX show in Boston is one of several new solutions vying for attention as remedies to a persistent and growing problem that is only going to get worse as wireless connectivity pervades the residential landscape.  Based on findings of a worldwide survey conducted by ARRIS last year, 63 percent of consumers have had issues with Wi-Fi at home. More anecdotally, service providers routinely report that more than half the calls to their customer service centers are Wi-Fi related.

“Broadband operators are at an important crossroad when it comes to ensuring the quality of the in-home experience,” says AirTies CEO Philippe Alcaras. “For many operators, the top customer care calls are Wi-Fi related. Yet operators are in the difficult position of having almost no visibility into the true network conditions throughout a subscriber’s home, particularly on unmanaged devices.”

Wi-Fi problems and associated hassles customers encounter dealing with call centers make it hard for network service providers to move the needle on customer satisfaction metrics, which have long ranked at the bottom in multi-industry surveys. For example, in the latest update to the multi-industry American Customer Satisfaction Index consumer surveys conducted under the auspices of the University of Michigan, the American Society for Quality and CFI Group, the ISP and subscription TV sectors rank lowest on a 100-point scale across 41 industries with ratings of 64 and 65, respectively, barely higher than the 63.9 points scored by the federal government.

As previously reported, AirTies, building on successes with Tier 1 service providers across Europe, recently opened offices in the U.S. in anticipation that the market is ready for a residential mesh-configured Wi-Fi networking solution capable of supporting multiple simultaneous sessions throughout the home involving everything from smartphones to big-screen TVs. With a growing list of customers in North America, AirTies is finding a welcoming response to its new Remote View monitoring platform, which is designed specifically for maintaining a high quality of experience on mesh-architected Wi-Fi networks.

“People who are testing or already into deployment of the AirTies mesh platform are now incorporating the Remote View monitoring system into their plans,” says a source close to the company, noting that in some cases the ability to achieve better visibility into the home Wi-Fi infrastructure is the primary spur to interest in the AirTies mesh solution. The ten or so trial participants include major Tier 1 operators who have not yet been announced as AirTies customers, this person adds.

Other vendors are tackling the Wi-Fi quality control issue, some focusing on out-of-home as well as in-home infrastructure. For example, Ericsson is rolling out an extension of its QoS monitoring and analytics suite designed specifically for keeping tabs on Wi-Fi performance in the home. Nokia (formerly Alcatel-Lucent) has long provided comprehensive monitoring of network elements, including in-home devices attached to the network via TR-069 and related protocols, as part of its Motive suite of quality assurance solutions.

But the QoS monitoring problem posed by traditional residential Wi-Fi systems is that access points (APs) lack the native intelligence required for performance monitoring, making it impossible to get a direct read from the APs as to what is going on. Consequently, performance monitoring solutions tied to traditional AP architectures must rely on information gathered from the primary gateway router, which is fine as long as the router-based AP is the only AP in the home.

Now, however, with multiple APs required to support expanded use of Wi-Fi networking to accommodate distribution of high-quality video to every room even in smaller households, AirTies is banking on the industry’s switch to mesh-based systems, which require intelligence at every AP to enable best-path selection for maximizing overall performance. This enables use of a more sophisticated monitoring system that can directly view every interaction between every device and every AP.

“AirTies’ Remote View collects data from every gateway, set-top box and AP that is connected to our technology,” says AirTies CTO Metin Ismail Taşkın. “All the data is uploaded to the cloud for analysis.” In the trials, operators are relying on the Remote View analytics engine running on AirTies’ servers, Taşkın adds, but in commercial deployments the analytics software will run on servers in customers’ datacenters or public clouds they are affiliated with.

While the Remote View platform only works with multiple APs on mesh Wi-Fi networks, it can be integrated with other vendors’ products for multiple AP monitoring in their mesh systems or for monitoring on gateways and set-tops with built-in APs that may serve as the only AP in the home or be linked to linearly connected APs, which would not be part of the AirTies monitoring system. “We’ve already integrated Remote View with the CPE used in the Sky Q service,” he notes.

Sky Q, the multiscreen service offered by Sky in the U.K., Germany, Austria and Italy, is deployed with AirTies’ support for the firm’s new Hybrid Mesh service, which incorporates Sky’s Powerline home networks into the holistically managed AirTies mesh environment to enable optimal use of any combination of wired and Wi-Fi hops to route packets. Now Remote View is incorporated to ensure comprehensive monitoring and data gathering from all points of interconnection in Sky Q homes, Taşkın says.

Other announced customers for AirTies’ mesh home networking technology include Vodafone, Singtel, Swisscom and Midco. Customers can deploy the Hybrid Mesh platform with Ethernet and MoCA as well as Powerline. To enable use of Remote View on third-party devices AirTies relies on its operator customers to instigate the integration, Taşkın notes. “If the gateway meets our specifications, we put software on the device and integrate with the device software from whoever makes the gateway,” he says.

The Remote View system consists of multiple software applications initially designed to support field technicians and network engineering teams’ ability to optimize and troubleshoot subscribers’ home Wi-Fi networks, he explains, noting additional applications will be announced in the months ahead. The system enables operators to identify Wi-Fi installation issues, determine if coverage problems exist and whether additional APs are required in a home.

And it helps operators address other key issues that have been hard to deal with, Taşkın adds. For example, through a dashboard that provides real-time and historical data on active Wi-Fi connections, traffic and throughput, operators can examine connection rates of wireless APs and third-party devices; the distribution of 802.11g, 802.11n or 802.11ac clients; the speed capability and distribution of client brands (i.e. phone or tablet models) in use within the home; band connection durations of all clients, and air-time consumption of each device.

One of the most common causes of customer complaints occurs when dual-band devices equipped to connect on either the 5 GHz or 2.4 GHz band fail to choose the higher tier and therefore end up registering lower bandwidth speeds than customers expect. Enabling a customer service rep to immediately identify poor client steering in the dual-band environment as the problem can be critical to maintaining customer satisfaction, Taşkın notes.
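A toy version of the kind of diagnosis a rep could automate from such a dashboard appears below: flag dual-band clients camped on 2.4 GHz even though their 5 GHz signal is strong enough to serve them. The -70 dBm threshold and the record fields are invented for illustration.

```python
# Toy band-steering diagnosis over per-client monitoring records.
STEER_RSSI_DBM = -70   # assumed minimum usable 5 GHz signal strength

def diagnose(client):
    if not client["dual_band"]:
        return "single-band client; 2.4 GHz connection is expected"
    if client["band"] == "2.4GHz" and client["rssi_5ghz_dbm"] > STEER_RSSI_DBM:
        return "poor band steering: client should be on 5 GHz"
    return "connection looks normal"

print(diagnose({"dual_band": True, "band": "2.4GHz", "rssi_5ghz_dbm": -58}))
# -> poor band steering: client should be on 5 GHz
```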

Another area of performance monitoring that is vital to satisfactory customer care has emerged with the prevalence of so-called “neighborhood hotspots,” which are dual-use hotspots positioned in the home with partitioning of available bandwidth to support public access outside the home. Such partitioning works and provides the bandwidth promised to the home subscriber, but problems can occur when an outside user accesses the hotspot from the fringes of the coverage area: a disproportionate share of signal power goes to reaching the distant device, depriving the AP of enough power to adequately serve in-home devices.

“There’s a real danger with neighborhood hotspots in this regard resulting from how TDMA (time division multiple access) signaling works,” Taşkın says. Because TDMA assigns time slots to each device, a “bad apple” device connecting from outside takes up ever more time on the link as the system waits for it to retry until a transmission succeeds in an assigned timeslot, forcing devices in the home to wait for timeslots.
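The starvation effect is easy to quantify: on a time-shared link, the airtime a client consumes is the data sent divided by its link rate, so a distant client at a very low rate monopolizes the medium. The rates and transfer sizes below are invented for illustration.

```python
# Why one fringe device starves everyone on a time-shared link.
clients = {
    "in-home TV (close, 5 GHz)": {"rate_mbps": 400, "megabytes": 50},
    "in-home laptop":            {"rate_mbps": 150, "megabytes": 10},
    "fringe outside user":       {"rate_mbps": 2,   "megabytes": 5},
}

airtime = {n: c["megabytes"] * 8 / c["rate_mbps"] for n, c in clients.items()}
total = sum(airtime.values())
for name, seconds in airtime.items():
    print(f"{name}: {seconds:5.2f} s of airtime ({100 * seconds / total:.0f}%)")
# The 5 MB fringe transfer consumes ~20 s of airtime (about 93%),
# while 50 MB to the nearby TV takes only ~1 s.
```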

“You can do better with multiple APs in the home operating on our mesh architecture, but you’ll still need to know what’s going on with devices outside the home,” he says. “Remote View can also aggregate data to allow you to look at how your Wi-Fi installations are performing across a whole neighborhood or larger residential groupings.”

Since all data is provided by AirTies’ intelligent in-home network of APs, there is no need to download any client-side software on subscribers’ personal devices, Taşkın notes. The system only provides data and analysis of in-home network connections and does not monitor browser-level data about site visits. By default, the software requires subscribers to grant permission before monitoring capabilities are enabled within their homes.


Reaching 10 Gig over Coax: Is It Really the Way to Go?

Robert Howald, VP, network architecture, Comcast


Nokia Breakthrough Maps with CableLabs’ FDX Initiative

By Fred Dawson

With Nokia Bell Labs’ recent demonstration that full duplex 10 gigabit-per-second DOCSIS 3.1 service is technically feasible the onus is on cable operators to let vendors know whether commercialization of the capability is worth their while.

Or, put another way, the question is whether MSOs want to continue relying on coaxial cable in scenarios where a service supporting bi-directional throughput at 10 Gbps makes sense, given that such a service can only be delivered over coax links with no in-line amplifiers, which is to say, with implementation of Fiber Deep architecture. In light of advancements associated with passive optical networking (PON) technology and the long-term commitment to coax entailed in pursuing Fiber Deep strategies, the question represents a key crossroad in cable network migration strategies.

Nokia, like other vendors, is looking for clarity on the issue before it invests in next steps to productize the apparatus it’s been using in tests at Bell Labs facilities. “We’re looking forward to going beyond the research prototype phase,” says Jay Fausch, who heads Nokia’s cable segment marketing team. “But it’s a big decision in terms of investment and resources.”

Nokia’s achievement, unveiled at the INTX Show in May, comes amid an accelerating push to gigabit broadband here and abroad as cable operators using DOCSIS 3.1 and PON technologies battle competitors like Google, AT&T and Verizon that are pushing fiber to the premises and others that are using the recently standardized G.fast DSL technology to reach gigabit speeds. Gigabit broadband is now available in dozens of cities and towns across the country, with many more to be added before the year is out.

At the cutting edge of these efforts, some providers are beginning to test asymmetric 10 Gbps access technologies. Verizon, for example, which currently offers FiOS connections commercially at up to 500 Mbps, last year field tested NG (Next-Generation)-PON2 technology using Cisco and PT Inovação equipment to support a 10 Gbps downstream/2.5 Gbps upstream connection to residential and business customers in Framingham, Mass. In Germany, cable operator Unitymedia says it is preparing to roll out DOCSIS 3.1 in 2018 with an asymmetric service that tops out at 10 Gbps downstream.

Comcast, which this year is launching DOCSIS 3.1 supporting 1 Gbps downstream/35 Mbps upstream service in Atlanta, Nashville, Chicago, Detroit, and Miami at a monthly no-contract price of $139.95, has also introduced an all-fiber service, Gigabit Pro, that supports symmetrical throughput at 2 Gbps. Gigabit Pro is available to about 18 million households that are in close enough proximity to HFC nodes to enable fiber extensions to premises at a monthly price of $300, plus $1,000 in installation and activation fees, exclusive of promotional offers that cut costs by about 50 percent.

Comcast’s commercial services unit also offers multi-gigabit Ethernet service over fiber at up to 10 Gbps. Other MSOs are offering gigabit Ethernet commercial services, and some have announced plans to introduce DOCSIS 3.1 at up to 1 Gbps in 2017 and beyond. But, when it comes to DOCSIS 3.1 over HFC plant, no one has announced any plans for a full-duplex 10 Gbps service, although Comcast has given the Nokia initiative a hearty thumbs up.

“While it is still early in the development of full duplex, Nokia’s XG Cable proof of concept shows that multi-gigabit symmetrical speeds over HFC, as targeted in the CableLabs FDX (Full Duplex) initiative, are achievable,” says Dr. Robert Howald, Comcast Cable’s vice president of network architecture. “As we continue our DOCSIS 3.1 deployments this year, this development further illustrates the power and flexibility of DOCSIS 3.1 as a tool to deliver next-generation broadband performance.”

As described by Keith Chow, project lead for XG technology at Nokia, the Bell Labs test was only able to hit 10-gig FDX over a point-to-point coaxial link measuring 100 meters. In a typical node-based HFC array with no in-line amplifiers beyond the node, his team achieved FDX at 7.5 Gbps over 200 meters.

“We expect with further refinements we’ll be able to do full duplex at 10 Gbps at up to 200 meters,” Chow says. Based on single-home density averages in North America, this would translate to serving anywhere from 128 to 256 households, he notes.

Enabling 10 Gbps FDX over DOCSIS 3.1 starts with utilization of the full 1.2 GHz of spectrum available over coax at these distances for both downstream and upstream signals. Along with elimination of in-line amplifiers this requires use of the time division duplex (TDD) techniques used in Wi-Fi and DSL communications to break up the signals for assignment into time slots dedicated to either the downstream or upstream path. But “the physics is difficult,” Chow says, noting there are significant “echo cancellation and vectoring issues.”

One challenge has to do with the echo or near-end cross-talk between the upstream and downstream signals generated at the cable modem or the CMTS (cable modem termination system). “If the echo is louder than the channel, the receiver gets nothing,” Chow says. “In the DSL world we know how to estimate cross talk based on the noise level in comparison to the loss budget on the link. We remove the echo with digital processing at the receiver.”
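The echo-cancellation step Chow describes, estimating the echo channel and digitally subtracting a synthesized copy of the local transmit signal at the receiver, can be sketched with a textbook LMS adaptive filter. Below is a toy baseband sketch in Python with numpy; the echo taps, signal levels and step size are invented, and real DOCSIS cancellers are far more sophisticated.

```python
import numpy as np

# Toy LMS echo canceller: adaptively estimate the echo path from the
# local transmitter into the local receiver, synthesize the echo, and
# subtract it so the (much weaker) far-end signal survives.
rng = np.random.default_rng(0)

N, TAPS, MU = 20000, 4, 0.01
tx = rng.choice([-1.0, 1.0], size=N)          # local transmit symbols
far = 0.1 * rng.choice([-1.0, 1.0], size=N)   # weaker far-end signal
echo_path = np.array([0.9, -0.4, 0.2, -0.1])  # assumed echo channel taps

# Received signal = far-end signal + echoed copy of our own transmit.
rx = far + np.convolve(tx, echo_path)[:N]

w = np.zeros(TAPS)      # adaptive estimate of the echo path
buf = np.zeros(TAPS)    # recent transmit history
out = np.zeros(N)
for n in range(N):
    buf = np.roll(buf, 1)
    buf[0] = tx[n]
    out[n] = rx[n] - w @ buf     # subtract synthesized echo
    w += MU * out[n] * buf       # LMS coefficient update

residual = np.mean((out[-1000:] - far[-1000:]) ** 2)
print(f"residual echo power after convergence: {residual:.6f}")
```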

Another major issue is “interference from neighboring modems sharing the same splitter or tap,” he adds. “It’s a similar echo problem, but very difficult to cancel because any given modem doesn’t know when its neighbors are transmitting. So we have to solve the problem with intelligence at the node.”

Based on knowledge of the topology of the cable plant and the interference patterns created by various combinations of activity among neighboring modems, Nokia runs algorithms that optimize modem transmission sequences in real time to prevent interference, he explains. Presently, Bell Labs is using outsized components to support these processes in Fiber Deep HFC configurations that conform to CableLabs specifications, but Nokia is confident it can attain these performance parameters with its software running on commercially viable components.

So far, MSOs’ interest in FDX over DOCSIS 3.1 has been sufficient to prompt CableLabs’ exploration of solutions along the lines pursued by Nokia, which means Nokia’s XG-Cable should easily integrate with the CableLabs FDX recommendations once they are worked out. In a recent blog, Belal Hamzeh, CableLabs’ vice president for wireless R&D, and Dan Rice, senior vice president of R&D, said CableLabs has independently verified the viability of using a combination of passive HFC and the self-interference cancellation and intelligent scheduling of DOCSIS 3.1 technology to achieve FDX. “These developments are expected to yield DOCSIS 3.1 network performance of up to 10 Gbps symmetrical on 1 GHz HFC networks, with the potential for even higher performance by utilizing spectrum that is currently available for future expansion above 1 GHz,” they said.

“Our design and analysis shows that the existing Physical and MAC (media access control) layer protocols in DOCSIS 3.1 technology can largely support this new symmetric service,” they added. “The evolution to a DOCSIS 3.1 Full Duplex network is an incremental evolution of DOCSIS 3.1 technology and will support both backward compatibility and coexistence with previous generations of DOCSIS network deployments.”

CableLabs is working with a team consisting of engineers from member operators and vendors to help mature the technology. If all signs remain positive, Hamzeh and Rice wrote, “the project will transition from an innovation effort into an R&D project, open to all interested participants.”

That’s good news in light of an in-depth comparison between HFC and all-fiber approaches to meeting emerging bandwidth challenges made by Phil Miguelez, Comcast’s executive director of network architecture, in a paper presented at the INTX show. While migration to Fiber Deep architecture, a prerequisite to enabling 10 Gbps FDX over DOCSIS 3.1, is fraught with challenges, Miguelez made clear that it appears to be cable’s best option over other HFC and fiber migration paths as operators weigh responses to the market pressures emerging in the years ahead.

While Miguelez made only passing mention of 10 Gig FDX, he left no doubt as to where Comcast stands on the need to find a way to ever higher speeds beyond 1 Gbps. “The market landscape is now filled with large and small competitors offering gigabit speed service and threatening to overtake the HFC cable space,” he said. “While it’s true that the only immediate application for gigabit data rates today is the speed test, history has shown that new applications seem to always appear once the network BW (bandwidth) is available to support them.”

On the legacy pay TV side, demand for ever more channels targeting ethnic and special interest communities in densely populated areas has squeezed the space available for broadband bandwidth expansion, he noted. The expansion of service areas with wider spacing of amplifiers in rural areas has had a similar effect. Meanwhile, with growing consumption of video over broadband, per-home device counts are multiplying rapidly, greatly adding to the pressure for more broadband bandwidth.

“A few years ago the network assumption was 3 to 5 devices on average and 5 to 7 devices in a high end user home,” Miguelez wrote. “Today those numbers have jumped to an average of 5 to 7 connected devices per home and 10 to 12 for a high end subscriber.”

Moreover, he added, “Another pending driver is the Internet of Things (IoT). The number of Internet connected devices is exploding.”

No one is suggesting there’s an immediate demand for 10 Gig FDX, but cable operators need a path to get there, especially in light of the relative ease with which telcos currently relying on GPON (Gigabit PON) infrastructure will be able to move to 10 Gig speeds when the need arises. “The cost of 10 Gb optics while higher than today’s 2.5 Gb un-cooled optics are quickly decreasing based on growing volumes in North America and China,” Miguelez noted.

“The Telco market is beginning to feel the pressure from emerging 10G EPON competitors,” he continued. “FSAN (Full Service Access Network), the ITU standards organization that defined the GPON protocols, has recently initiated a new XGS PON spec that is compatible with 10G EPON optics. Before long current GPON competitors will be raising the bar with 10 Gb service offerings.”

DOCSIS 3.1 deployed on Fiber Deep architecture expands coax spectrum to 1.2 GHz, enabling operators to provide “the same DS (downstream) data capacity as 10 Gb PON,” Miguelez said. But, he added, “Fiber Deep architecture is not without challenges.

“Node + 0 (no in-line amplifiers) increases the complexity due to the changes that need to be made to the actual HFC plant configuration. Fiber Deep relies on new technology developments in order to be a successful and practical architecture.”

One major consideration is the number of nodes that must be added. “Depending on the homes passed density of the target system the number of new nodes required can range from 10:1 to as high as 16:1 compared to the existing N+X cascade design,” he said. There “are very few Hubs with the space, power, and HVAC (heating, ventilation and air conditioning) capability to accommodate the growth in equipment associated with this significant increase in nodes.”

While, under current technological conditions, the expanded Fiber Deep node count would incur prohibitive costs tied to obtaining real estate and paying for construction of new hubs, Miguelez expressed confidence that new distributed access architecture (DAA) solutions now under evaluation will cut hub density by disaggregating and distributing network elements and some CCAP (Cable Converged Access Platform) management functions to the nodes. “These solutions could be available in the market starting the second half of 2017,” he noted, adding, “Other development efforts such as all IP transport and SDN/NFV (software defined network/network function virtualization) will further reduce the current equipment density in the Hub and could allow the eventual consolidation of secondary Hubs into a master headend.”

And while the shift to 1.2 GHz of spectrum over coax enabled by Fiber Deep “requires modifications to every aspect of the network,” including distribution taps and passive components, the housing used with the current generation of 1 GHz taps is compatible with the 1.2 GHz faceplates, allowing installers to change out faceplates without changing the housing. “The ability to change the faceplate instead of cutting out and replacing the entire tap housing amounts to a considerable construction cost savings,” he said.

In addition, the new faceplates are equipped with better surge resistance, which adds a cost-saving element to the change out. “Every major tap manufacturer has followed this same guideline and plans to obsolete and replace the current 1 GHz devices with 1.2 GHz standard product later this year,” he added.

Another major technical advance facilitating Fiber Deep is the emergence of hybrid gain blocks supporting node output gains and linear extension of “tilt,” a measure of frequency-related loss variations on the coax and passives that increases with the expansion to 1.2 GHz. “[A]fter a year and a half of development effort two major device suppliers succeeded in creating qualified hybrid gain blocks that are now commercially available,” Miguelez said. “This accomplishment was only possible as a result of working in close partnership with multiple node design teams to solve technical hurdles related to power consumption and thermal capacity limitations of the node housing and cable plant powering.”
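To make the tilt problem concrete: coax attenuation rises roughly with the square root of frequency (a skin-effect approximation), so extending the band from 1 GHz to 1.2 GHz widens the loss spread the node output must equalize. A minimal sketch, assuming an illustrative cable loss figure:

```python
import math

# Coax loss scales roughly with sqrt(frequency) due to skin effect.
# Assumed reference: ~1.1 dB per 100 ft at 1 GHz for .500 hardline
# (illustrative; real cables vary).

def loss_db(freq_mhz, length_ft, ref_db_per_100ft=1.1, ref_mhz=1000):
    return ref_db_per_100ft * math.sqrt(freq_mhz / ref_mhz) * length_ft / 100

span_ft = 1_000
for top_mhz in (1_000, 1_218):
    tilt = loss_db(top_mhz, span_ft) - loss_db(54, span_ft)
    print(f"{top_mhz} MHz plant: tilt over {span_ft} ft ~ {tilt:.1f} dB")
# -> ~8.4 dB vs. ~9.6 dB: the extra ~1 dB of tilt at 1.2 GHz is what the
#    new hybrid gain blocks must supply linearly at the node output.
```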

It appears another big problem, powering multiple new nodes without adding power sources, has been solved as well. “There are many degrees of freedom when designing and implementing access network architecture changes,” Miguelez said. “Changing the power grid is not one of them. … Therefore, for everything except new greenfield construction, any new equipment deployments must safely fit within the margins of the existing AC power capacity design limits.”

One part of the solution has to do with lowering power consumption in new node designs. Another entails transmitting power from existing sources to multiple nodes over the coax. “In order to power the relocated nodes, coax feeder lines are added between the PS (power source) and closest node(s),” he said. “AC power is also fed from node to node by bridging the access coax tap strings between adjacent nodes.”
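The underlying constraint can be framed as a simple power-budget check. The supply capacity, node draw and loss figures below are assumptions for illustration only:

```python
# Sketch of the constraint: all relocated nodes fed from an existing
# power supply must fit within its capacity, including resistive loss
# on the coax feeder. Figures are illustrative assumptions.

supply_capacity_w  = 1_350   # e.g. 15 A at 90 V quasi-square wave (assumed)
node_draw_w        =    90   # modern low-power Fiber Deep node (assumed)
coax_loss_fraction =  0.15   # share dissipated in feeder and tap strings (assumed)

deliverable_w = supply_capacity_w * (1 - coax_loss_fraction)
max_nodes     = int(deliverable_w // node_draw_w)
print(f"One existing supply can carry ~{max_nodes} relocated nodes")  # -> ~12
```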

Yet for all the solutions Miguelez identified for making Fiber Deep a practical option with lower total cost than running fiber to every customer, the question remains whether ongoing reliance on coax will make sense in the long run. After all, it’s likely even more spectrum will be needed eventually, necessitating even deeper fiber extensions to make more spectrum available over shorter coax runs.

There’s no doubt there is much spectrum left to exploit with further configuration adjustments, as ARRIS demonstrated at last year’s CableLabs Summer Conference in Keystone, Colo. Tom Cloonan, CTO of ARRIS’s Cloud & Network Solutions unit, and his team used 6 GHz of coax spectrum to show the potential for 50 Gbps throughput with a fiber-to-the-tap or fiber-to-the-curb architecture. They extrapolated from the demo that shortening the coax enough to support 12 GHz of spectrum would yield 100 Gbps, and that 25 GHz might push throughput to roughly 200 Gbps.
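The extrapolation is straightforward spectral-efficiency arithmetic, as this sketch shows; the efficiency figure is simply implied by the reported demo numbers rather than stated by ARRIS:

```python
# Implied spectral efficiency from the ARRIS demo, then extrapolated.
eff = 50e9 / 6e9            # ~8.3 bits/s/Hz over 6 GHz of coax spectrum

for spectrum_ghz, label in ((12, "shorter coax"), (25, "very short drops")):
    print(f"{spectrum_ghz} GHz ({label}): ~{eff * spectrum_ghz:.0f} Gbps")
# -> 12 GHz: ~100 Gbps; 25 GHz: ~208 Gbps, consistent with the 100 Gbps
#    and ~200 Gbps figures Cloonan's team extrapolated.
```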

But at what point do the ongoing maintenance costs of HFC plant, combined with the costs of ever deeper fiber migration, add up to more than MSOs would incur by moving to PON sooner rather than later? Once in place, PON would allow for lower maintenance costs and ongoing upgrades to higher speeds with simple component replacements in existing housings.

As Miguelez acknowledged, Fiber Deep can be seen as postponing but not necessarily eliminating the need to move to PON. “A target node size of 128 HHP (households passed) delivers the same data rates as a 10G EPON port and positions the network for a future transition to PON when needed,” he said.
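One way to frame that decision is a toy present-value comparison of the two paths. Every figure below is a hypothetical placeholder; the article supplies no cost data, and the model ignores refinements such as discounting the opex stream:

```python
# Toy present-value comparison: build PON now vs. do Fiber Deep now and
# build PON later. All per-home figures are hypothetical placeholders.

fiber_deep_capex = 150   # $/home passed for the Fiber Deep upgrade (assumed)
extra_hfc_opex   =  25   # $/home/year HFC maintenance premium over PON (assumed)
pon_capex        = 800   # $/home passed for a full FTTH build (assumed)
discount         = 0.08  # annual cost of capital (assumed)

for years in (5, 10, 15, 20):
    pv_deferred_pon = pon_capex / (1 + discount) ** years  # PV of PON built later
    hfc_path = fiber_deep_capex + extra_hfc_opex * years + pv_deferred_pon
    print(f"defer PON {years:2d} yrs: HFC-then-PON ~${hfc_path:4.0f}, PON-now ${pon_capex}")
# With these inputs the interim step wins only if it defers the PON build
# by roughly a decade -- shift any input and the answer flips.
```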

Will PON prove at some point to be a better option than further shortening of coax? If so, the demand for 10 Gig FDX over DOCSIS 3.1 that Nokia is counting on to drive further development may prove weaker than expected, as operators weigh the cost of taking the Fiber Deep interim step before moving to PON against biting the PON bullet as soon as 1 Gig over DOCSIS 3.1 is no longer adequate to meeting competitive imperatives.
