
Sunday, January 10, 2021

Network predictions 2021: Cisco's Scott Harrell

 by Scott Harrell, SVP and GM, Intent-Based Networking Group at Cisco

Connected Workplaces for the Return to Work

In 2021, workers will increasingly return to the workplace, but it will be very different for everyone, depending on city, region or country policies, among other factors. For those workers who will return to the office full time, new networking standards, procedures, and space reconfiguration will need to be in place to create a safe work environment. 

Reconfiguring the workplace will be essential to adapt to new health safeguards and new working styles. Monitoring space usage and density can help limit the number of people inside the office and adhere to good safety practices while maximizing in-person productivity and regaining office camaraderie. 

In 2021 and beyond, the office will need to be utilized for moments that matter. It’s likely that video conferencing will replace in-person, group conference room meetings. To ensure higher efficiency, delivering video to every device and desktop, remote or in person, will require rethinking Wi-Fi coverage and capacity.

Next Gen Wireless Will Enable Business Processes to be Reinvented 

5G and Wi-Fi 6 are being deployed across the globe. History has shown every new generation of wireless drives new use cases and rapid innovation. The combination of 5G and Wi-Fi 6 will provide fast, low-latency connections anywhere, with no need for manual network selection or authentication. This foundation becomes an accelerant for the deployment of other next generation technologies and will quicken disruptions across industries. 

In 2020, IT teams proved they can move amazingly fast as they restructured their infrastructure to facilitate moving workers to a home office. In 2021, they will look to apply the same speed to deploying next-generation wireless ubiquitously across their offices. Video communications will be a primary driver of Wi-Fi 6 deployments as organizations look to enable video to every desktop to mimic the rich video-powered meeting style employees have gotten used to from home. Additionally, the increased speed, capacity, security, and availability enabled by 5G and Wi-Fi 6 will create a richer platform for innovation, and we’ll see entirely new business models start to develop by leveraging these capabilities.

Smart Buildings to Improve Energy Efficiency and Safety 

In the coming years, the U.S. – along with other world governments – will demand and promote smarter, safer, and more energy-efficient buildings for both new construction and existing structures. Next year, we expect companies to add more sensors and IoT devices to current networks to not only improve energy efficiency and safety within buildings, but to provide more connectivity. 

With the addition of new devices and sensors comes increased Wi-Fi congestion and interference within the smart building systems. The availability of more spectrum with Wi-Fi 6E will help IT adapt to the increases in device density and streaming applications, with private 5G filling in the spaces where Wi-Fi is impractical.

Unmanaged Devices Will Increasingly Be Used for Cyberattacks

In 2020, as many as 20.4 billion IoT devices were online. As the number of IoT devices continues to grow, we can expect them to be used increasingly for cyberattacks. In 2019 alone, the number of cyberattacks on IoT devices surged by more than 300%.

Industrial IoT and OT systems need additional protection to keep critical infrastructure running even while under attack. To prevent attacks from spreading, industries will use end-point analytics to identify IoT and other endpoints and add them to security groups using intelligent segmentation to automatically quarantine devices when unusual behavior is detected.
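As a rough illustration of that pattern, here is a minimal, vendor-neutral sketch of analytics-driven quarantine in Python. The controller object and its methods (get_endpoint_telemetry, assign_security_group) are hypothetical placeholders, not any particular product's API.

```python
# Minimal sketch, assuming a hypothetical policy-controller API.
# Goal: classify endpoints into security groups and quarantine any
# device whose traffic deviates from its learned baseline.

BASELINE_DESTINATIONS = {"mqtt.plant.local", "ntp.plant.local"}

def is_anomalous(telemetry: dict) -> bool:
    """Flag an endpoint that talks to destinations outside its baseline."""
    return any(dest not in BASELINE_DESTINATIONS
               for dest in telemetry["destinations"])

def enforce_segmentation(endpoints, controller):
    for endpoint in endpoints:
        telemetry = controller.get_endpoint_telemetry(endpoint)   # placeholder call
        if is_anomalous(telemetry):
            # Move the device into a quarantine segment with no lateral access.
            controller.assign_security_group(endpoint, group="quarantine")
        else:
            controller.assign_security_group(endpoint, group=telemetry["device_class"])
```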

Automated and Secure Interconnections

While new applications will be primarily built as cloud-native, the business case often does not justify re-platforming existing applications. Therefore, a hybrid environment will exist for the foreseeable future. Network and data center automation will be key in 2021 as businesses need to provision infrastructure and run a continuous development pipeline to orchestrate multiple public cloud and hybrid environments. Infrastructure teams want to retain control over the IT environment and choose which APIs they expose to development teams. The goal in 2021 will be to build an automated and secure interconnect between on-premises and cloud data centers for ease of provisioning and monitoring at scale in a hybrid environment.

Accelerating Day 2 Operations

As more IoT devices, cloud services, hybrid workstyles, and personal devices become the norm in the workplace, IT must be able to manage the increasing complexity even as expectations for uptime and availability grow in parallel. Therefore, in 2021 we can expect the need for advanced network analytics to continue to accelerate, and these analytics to be utilized by more and more teams within IT. AIOps teams will drive the rapid rise in the utilization of these tools within and across NetOps, SecOps, and DevOps teams. These tools will enable the effective upskilling of first and second-line support, empower end users with increased visibility and self-remediation capabilities, and provide disparate teams with shared perspectives and common sources of truth.

Common sources of truth can be exceptionally powerful. To enable this, sharing data and linking automation workflows across systems will become increasingly common. This will enable rapid resolution of issues across teams. The ability to link the automation layer with the analytics layer will enable IT to begin to move increasingly in machine time versus human time. 

Security has been and will continue to be a key driver for the transition to machine time. Detected threats need to be acted on rapidly and pervasively across the infrastructure. Since attackers are already leveraging large scale automation to attack businesses, IT must increasingly do the same in their response, bridging detection via SecOps with automation via NetOps to counteract the range of constantly evolving threats. For example, linking identity services and management resources with security detection tools creates dynamic access control and segmentation policies that limit fast-spreading malware as the systems detect and respond in real time — all without human intervention. 

These changes will of course benefit the line of business, the workforce, and customers. But they will also provide welcome help to the overstretched and under-appreciated professionals staffing help desks and maintaining critical infrastructure across the world.

Monday, December 21, 2020

Network Predictions 2021: TelcoDR's Danielle Royston

 by Danielle Royston, Founder, TelcoDR

A telco will figure out how to really use the public cloud and save 50% on its IT costs – or more

How will it happen? It'll move a ton of software to the cloud and prove: 1) it works; 2) it’ll save a ton of money (the company that embraces the software of the public cloud will see a 50% savings on IT costs); 3) life is sweet! (And way sweeter than it ever was before. I’m talking about taking the oldest, suckiest, super unsexy legacy applications and refactoring them for 90% savings.)

Who’ll be the bold telco? Definitely not a company in the US. Sorry America. It’ll likely be based in Asia, which has moved on from dumb private cloud, and we’ve already seen examples of successful moves to public cloud in this region (take a bow, M1). 

In 2021 we might be going back to 1981-style boldness, but it’ll be a huge move forward for modernizing the telco industry. A bold telco will successfully transition to the public cloud and show everyone else how it’s done. Note to everyone else: be prepared, this change will require all hands on deck.

Telcos will take the wrong approach – and fail

Alongside public cloud success, we’ll also witness public cloud failure in 2021. Without a proper understanding of the cloud ecosystem – and what ‘cloud native’ means: see my 2020 round-up above – telcos will commit some spectacular fuck-ups. On that note: if you want to avoid being that telco, look for my blog in January where I’ll clarify cloud language and explain how each part of the telco business can benefit.

Back to those failures though. It’s common sense to move to the public cloud, but there are still so many misconceptions that telcos will get caught up in. It’s not just about infrastructure and IT, for instance. It requires a top-down, organization-wide cultural change. It requires clear communication.

Wrong moves will result in failure. Or, if not complete failure, then a load of back-tracking, additional costs and tails between legs. No one wants to hear ‘I told you so.’ Bank of America probably didn’t. For almost a decade, the institution was adamant that ignoring public cloud and obsessing about its vanity project (aka, building its own private cloud) was the way to go. It wasn’t. In 2019, Brian Moynihan, BofA chairman of the board and CEO, admitted that although it had been pursuing private cloud – and spending on private cloud – third-party cloud providers are 25-30% “cheaper.” It then teamed with IBM to develop a public-cloud computing service for banks.

There’s also the cautionary tale of Verizon, a company that thought it was a great idea to spend $1.4 billion on data center provider Terremark. It later realized it couldn’t compete with the might of the hyperscalers and dumped the business on Equinix.

People will fall for IBM’s #fakecloud

You thought the claws of Oracle were bad? In 2021, you’ll see it’s IBM that has the real talons.

In November IBM launched its cloud-for-telco play. Unfortunately for telco – and bad luck for buyers – Big Blue launched a big crock of shit. This is not cloud. It was fake news. It’s #fakecloud. In 2021 we’ll see the results from the poor suckers who’ve invested and we’ll hopefully see a greater realization that a hybrid strategy and a half-assed move to the cloud will never work.

At launch, IBM tried to persuade telco to keep things on-premise. If you do move to the BFCs, then IBM can manage it all for you. What they didn’t mention was that this would happen at a cost, and it’d be a massive waste of time. Telcos that fell for this trap last year will be adding five more years to their public cloud journey, by which time they’ll be way behind competitors that saved time and money, and whose customers love the service they offer. 

Be wary of IBM, my telco children. Do not fall for the trap!

OpenRAN will explode

The tail end of 2020 saw OpenRAN start to bubble rapidly to the surface of telco conversations. In 2021, it’s gonna explode. Vendors: be afraid, be very afraid. Ericsson’s revenue will slip even further through its fingers – something it already admitted last year, when CEO Börje Ekholm said he expected OpenRAN market developments to “impact revenues” from 2023 onwards. 

Other vendors will hemorrhage revenue as telcos realize that there is (finally!) an alternative to overpriced infrastructure and vendor lock-in. They’ll get choice, at last, picking and choosing best-of-breed elements from whomever the hell they want! More features will be driven into software. Networks will be easier and cheaper to maintain, easier and cheaper to upgrade. Spend on RAN will go from historic levels of around 90% of total spend to around 50%. It might not be next year, but the development and industry excitement around disaggregated network components will certainly define the trajectory of telcos’ decision making next year.

Pioneers like Rakuten will gain column inches and market share next year. It’s no wonder: Rakuten claims operators can reduce capex by 40% with its telco-in-a-box network. Vodafone has also been staking its claim in the OpenRAN space: last November it announced it would be deploying OpenRAN technology at 2,600 mobile sites across Wales and the South West of England.

Experimentation is the name of the game here. There might be failures along the way, but telcos will be less afraid of dipping their toe in the OpenRAN water. This will gear them up for taking a plunge in the public cloud ocean down the line.

There’ll always be another G

You can’t move nowadays without being bombarded with something about a ‘G.’ Clearly people believe the hype – 5G networks will cover an estimated one billion people by the end of the year, attracting 220 million subscriptions, according to Ericsson. And it’s not all about faster speeds and greater capacity … research suggests 5G is 90% more energy efficient than legacy mobile infrastructure.

Telcos are set to ramp up 5G investment in 2021, according to Fitch Ratings, which has warned there will be increased pressure on credit metrics for most operators worldwide. Free cash flow, it says, will be constrained over the next three years. But if telcos believe they can monetize all 5G capex by simply boosting customer experience, that’s just not possible. Instead, they should focus on bringing new ways of life into reality with the help of 5G – I’m thinking best-in-class remote work, e-learning and virtual services.

That capex pressure will only increase with demands for more connections, higher speeds, greater capacity. Telcos simply can’t afford NOT to move to the public cloud, helping them to further enrich their offerings, as well as cut time and costs with reduced latency. Only the foolish would add to that capex pressure by building their own cloud – remove that headache by using the BFCs!

Sunday, December 20, 2020

Network predictions 2021: Ciena's Steve Alexander

by Steve Alexander, CTO, Ciena

2021 will take investment to the edge

5G networks are primed to deliver faster web browsing and video streaming with reduced latency, both very appealing for consumers. But 5G can do so much more once networks have matured. Advanced 5G services like rich AR and VR, cloud gaming, telemedicine, and Industry 4.0 (the connected manufacturing revolution) all require highly reliable networks that can deliver low latency and higher bandwidth, but also high levels of intelligence.

For these services to take off, networks must continue to get faster, closer and smarter, utilizing automation, intelligence and software to deliver on the hype of these exciting services. A part of building faster, closer and smarter networks is to build out the edge, where we need up to five times more data centers than are available today.

There is already heavy investment in building out edge data center sites to bring the cloud closer to users, and this investment will continue at pace in 2021. The carriers know they need to continue to focus on building out their edge infrastructure in these smaller data center sites, leveraging edge cloud capabilities that allow services to be processed closer to users, improving the user experience and delivering on the bold promises of 5G.

Hitting new network requirements will become automatic

Carriers know the demands we are placing on networks show no signs of slowing as our lives become more digital and distributed. That means network rollout will continue at pace, but networks must now be built to adapt on their own. Carriers have already taken steps to make this happen, but in 2021, we will start to see even more use of software and analytics to improve the way optical networks function.

Advanced software capabilities will redefine how network providers engineer, operate and monetize their optical networks. These software solutions were originally focused on extracting more value from existing network assets. In 2021, we will see these software solutions play a key role in new network builds, giving CSPs the ability to fine-tune, control and dynamically adjust optical connectivity and capacity.

Software will also give greater visibility into the health of the network via real-time link performance metrics and increased end-to-end photonic layer automation. By utilizing the latest advanced software solutions, providers can monitor and mine all available network assets to be able to instantly respond to new and unexpected bandwidth demands and allocate capacity across any path in real time – a function which will become increasingly important year-on-year.
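A toy sketch of that kind of closed loop follows; the controller object and its methods are stand-ins for whatever analytics and control APIs a given optical platform exposes, not a real SDK, and the threshold values are assumptions.

```python
# Illustrative only: poll real-time link metrics and turn up extra capacity
# when a link runs hot. Method names and thresholds are placeholders.
import time

UTILIZATION_THRESHOLD = 0.8   # act when a link exceeds 80% load
POLL_INTERVAL_SECONDS = 60

def autoscale_capacity(controller, links):
    while True:
        for link in links:
            utilization = controller.get_link_utilization(link)   # placeholder metric call
            if utilization > UTILIZATION_THRESHOLD:
                # e.g. light up another wavelength or shift lower-priority traffic
                controller.add_capacity(link, gbps=100)            # placeholder action
        time.sleep(POLL_INTERVAL_SECONDS)
```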

Increasing Digital Inclusion will be key to continued remote working

This year has demonstrated how important connectivity is for people to stay in touch, shop and work remotely to keep our economy moving. It has also proven crucial to the continued education of students. There is a growing desire to maintain this flexibility even once Covid restrictions are lifted, but this is only possible with sufficient connectivity and capacity.

In 2021, we’ll see rural connectivity and digital inclusion initiatives move higher up the political agenda, and solutions like low-orbit satellite connectivity will come to greater prominence. The solution that maximizes ultimate capacity is still scaling fiber-based broadband, but we know this can be a challenge in rural areas, so it will require a nudge from policymakers to get things moving.

If countries want to stay at the forefront of the digital economy, they must break down the barriers to rural connectivity and invest in fixing the last-mile problem. They must also continue supporting digital inclusion programmes that grant students access to technology and tools. Incentives and initiatives from the government, and an ongoing review to ensure that networks are using the most effective equipment suppliers, are certainly ways to help.

Enhanced reality will step forward as the first killer use case for 5G 

Almost as soon as talk of 5G networks first started, so too did questions about what the killer app for the new standard will be. 2021 might not be the year we get the definitive answer to that question, but it will be the year in which enhanced reality (AR and VR) applications take a step forward. However, it may not be consumer-centric services that light the path, but instead, enterprise use cases could lead the way. 

I think it’s safe to say that all of us have grown weary of online team meetings this year, and ‘Zoom fatigue’ has become a very real thing. Next year I predict we will see more instances of AR and VR being used as collaboration tools, helping remote teams regain some of the ‘live’ element of working together. These services will initially need to run over combinations of home broadband, in-building Wi-Fi, 4G and 5G networks. They will ultimately open the door to more commercial AR and VR services over 5G networks and Wi-Fi 6 further down the road. The quality of those networks will take these enhanced reality applications beyond a fun, short-term gimmick into being a viable and valuable service offering.

WebScalers and telcos expand their collaborations to improve our cloud experience

One of the biggest trends of 2020 has been the partnerships that have been forged between telecoms carriers and some of the hyperscalers. There’s no doubt this will continue and grow well beyond 2021, but as networks become increasingly software-centric, there is an opportunity to improve the delivery of new services and applications to users.

From the perspective of a WebScale operator, service provider networks often appear to be a patchwork quilt of various vendors and technologies. The suite of Internet protocols allows this complexity to be abstracted up to a set of globally uniform IP addresses, and this has served us fantastically well. At the same time, service provider networks look largely opaque to the cloud, and consequently it is hard to guarantee a user the desired cloud experience. To deliver next-generation services, more collaboration between cloud and network is required. Making the network adaptive through the use of intelligent software allows coordination between service provider networks and the cloud and will enable a generation of AR- and VR-based immersive services and applications.

Steve Alexander is Ciena’s Senior Vice President and Chief Technology Officer. He has held a number of positions since joining the Company in 1994, including General Manager of Ciena's Transport & Switching and Data Networking business units, Vice President of Transport Products and Director of Lightwave Systems.

Sunday, December 6, 2020

2021 Foresight: Predictions for Service Providers

by Sally Bament, VP of Cloud & Service Provider Marketing, Juniper Networks

COVID’s Impact

COVID aims the spotlight at preparing networks for the unknown; AI/ML will be a big focus

The COVID-19 pandemic shifted our world from physical to virtual literally overnight, placing enormous responsibility on service providers to deliver seamless real-time and near real-time experiences at peak traffic levels. Traffic patterns are shifting from mobility towards Wi-Fi and broadband networks, and as work continues to shift to the home, the lines between consumer and enterprise users continue to blur. This implies there will be long-term changes in how service providers architect and manage their networks, particularly for enterprise customers, whose networks now, by extension, reach into the home. Next year, we will see more focus on ensuring networks are ready for the “unknowns.” We will see accelerated investments in open, agile network architectures built on cloud principles, elastic on-demand capacity, and automation and security for an assured service experience. And with a heightened focus on service experience, we can expect automation, service assurance, AI/ML, and orchestration technologies to take on an even more significant role in service provider network operations, guaranteeing service quality and simplifying operations as networks get bigger, more dynamic and more complex.

COVID accelerates the value of the edge

Networks have never been more critical than they are right now. Business, education, telemedicine and social life have all moved from engaging in person to engaging virtually, and multi-participant interactive video calls have become fundamental to our daily lives. We have seen a massive consumption of streaming media (largely video based), and similarly an all-time high in online gaming, each driving CDN growth. Service providers have responded fast to manage the surge in traffic while avoiding lag, downgraded quality, and slower speeds. Next year, we’ll see service providers double down on investments in edge cloud, moving applications and data closer to users and connected devices to enhance the user and application experience, support new emerging low-latency applications, and make more efficient use of network transit capacity.

COVID drives network security

While security has often taken a back seat to make way for faster network speeds, the pandemic has proven that bad actors will take advantage of crises for their own gain. Next year, we’ll see service providers take a holistic, end-to-end security approach that combines network, application and end-user security to deliver a secure and assured service experience. This is especially important as we’re approaching a second wave of lockdowns and working from home becomes the new normal – which presents an enticing attack surface to attackers. In 2021, we’ll see companies investing more in Enterprise-at-Home solutions with security at the forefront, ensuring that all endpoints in the networks are secure, wherever they are.

5G

5G hype fades as monetization opportunities skyrocket

Despite the pandemic shifting operational priorities, causing some 5G roll outs to slow down, service providers have still been heavily investing in and deploying 5G networks. With over 100 commercial networks launched across the globe, and many more expected in 2021, 5G is now real, bringing new monetization opportunities for operators. With massive speeds, huge connection densities and ultra-low-latency experiences, we expect to see progress in new consumer applications (e.g. gaming, AR/VR/MR), 5G for industry verticals, consumer broadband with content bundling, enterprise broadband and cloud-managed services, and fixed wireless access services in 2021.

400G

400G deployments ramp up beyond the cloud data center

As commercial solutions become more viable to support the relentless growth in bandwidth demand, we will continue to see momentum build for 400G in 2021. While large cloud providers are driving the first wave in the data center and the wide area network, expect to see 400G ramp up in service provider networks in 2021, as well as across data center interconnect, core, peering, and CDN gateway use cases, among others. We will see large-scale rollouts of 400G in the WAN, especially in the second-half of the year, driven by the availability of lower-cost optics, lower operating expense potential with fewer ports to manage, and pay-as-you-go pricing models that will allow operators to smoothly navigate the upgrades. Looking beyond 2021, we will see 400G appear in metro aggregation nodes as 5G buildouts drive even more traffic and network densification.

Open RAN

Open Architectures remain a top theme, Open RAN is here to stay

The service provider industry’s drive towards Open Architectures will continue to gain momentum in all areas, from Open Access (including Open RAN, Open OLT) to Open Broadband, Open IP/Optical and Open Core. Open RAN is no longer a question of IF, but WHEN. We will see accelerated momentum in Open RAN globally with RFPs, trials and early deployments as many operators commit to democratizing their radio access domain, primarily to drive vendor diversity and best-of-breed innovation. While commercial widescale deployments of Open RAN are a few years out, we will see a strengthened Open RAN ecosystem, greater technology maturity and new kinds of partnerships that will fundamentally change how radio networks will be deployed, managed and leveraged for value creation in the future.



The Role Operators Can Play at the Edge

Over 50 billion devices are expected to come online next year, driving the need for edge-located control points to manage these devices in real time and near real time. For service providers, this makes edge compute a critical and strategic area of focus. Sally Bament, VP of Marketing at Juniper Networks, discusses the role operators can play in the edge value chain.

Thursday, December 20, 2018

2019 Network Predictions - The campus becomes hot again

by Michael Bushong, Juniper Networks’ VP of Enterprise and Cloud Marketing

Network automation will hit the curve in the proverbial hockey stick.

Despite years of talking about automation, the vast majority of enterprise operations are still manual, CLI-driven activities. In 2019, adoption will shift from linear to something more aggressive.

This will be driven in part by a general need to automate the network to keep pace with the dynamic application environment that already exists in many enterprises. But the broader DevOps efforts, especially in the cloud arena, will demonstrate what operations could look like outside of the application teams. And enterprises will begin their transformation.

Notably, this means that the automation that emerges will not be the automation that has been talked about for years. Where the last decade has been about removing keystrokes in mundane tasks, the real path forward for network automation will more closely track with the Site Reliability Engineering (SRE) movement. Expect to see the rise of NRE in enterprises (a trend that has already started in the major cloud and service provider properties).

Open source will be more than an alternative business model.

As open source continues to climb in importance in the IT supply chain, enterprises will begin to develop stronger open source policies. This will include everything from procurement practices (which partners will be involved and how will support be handled?) to supply chain (how do you secure the supply chain if no one is inspecting it?).

Enterprises outside of the major open source and cloud players will begin to treat open source as just another route to market, implementing appropriate controls, checks, and balances to ensure that products are robust, support is available, and security is more than a hope.

SD-WAN will begin to yield to SD-Enterprise.

It’s not that SD-WAN will become less important in 2019, but as the industry starts applying the principles of SD-WAN more broadly, SD-WAN will start its evolution to SD-Enterprise. Cloud management and intelligent routing across the WAN can be transformative for more than the subset of products currently in market. As campus moves this direction, it seems inevitable that the concept will broaden.

Campus becomes hot again

A few years ago, data center was all the rage. More recently, SD-WAN has revitalized the branch. In 2019, expect campus networking to be in vogue again. Driven by some of the same technologies (SDN, SD-WAN, intent-based networking, and so on), the campus will go through a similar transformation. Vendors have retooled their portfolios in preparation, and most market forecasts showed campus shifting from slow decline to slight growth this year. That trend should continue.

Notably, the embrace of software as the primary vehicle for delivering value also means that the days of refresh cycles being on the order of 5-to-7 years will likely come to an end as well. This should stoke competition in a market that, frankly, has looked more like a monopoly than a vibrant ecosystem at times over the last decade. Times, they are a-changin’.

Ecosystems will replace vertical suppliers

For decades, the networking space has been dominated by large, vertically-integrated stacks. With the rise of cloud and multicloud forcing multi-vendor integration from an operations perspective, it would seem that the vertical approach to the market will begin to give way to an ecosystem strategy.

Importantly, that ecosystem will bring suppliers together that span all of compute, storage, networking, and even applications. Where the past was led by a well-known set of incumbents, suppliers like Nutanix with their hybrid and multicloud solutions and Red Hat (now IBM) with their orchestration solutions will take on more prominent roles. This will chip away at the incumbent routes to market, which will begin a one-way move towards a more diverse solutions environment.


2019 Network Predictions - 5G just can’t ‘contain’ itself

by John English, Director of Marketing, Service Provider Solutions, NETSCOUT

5G just can’t ‘contain’ itself 

In 2019, as virtualized network architectures are rapidly adopted to support 5G, we expect to see containers emerge as the de facto platform for running new applications and workloads.

The excitement around 5G is building as we hear more news about network deployments, trials and handsets. However, one 5G-related issue that hasn’t yet been crystallized is what form 5G software and innovations will take, and how these new services and applications will be deployed into the network. Unlike 4G/LTE network infrastructure, the architectures that support 5G are virtualized and cloud-based, so the smart money is on application developers, mobile operators and equipment vendors using microservices, and in particular containers, to drive 5G evolution.

It makes sense to use containers to support 5G, as they provide operators with a flexible, easier-to-use platform for building, testing and deploying applications, one that is also becoming more secure. This is vital for the development of 5G services at a time when the use cases for 5G are still being defined. Operators will need to be in a position to spin up services as and when needed to support different use cases, and containers will make it possible to serve customers quickly and efficiently.
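As a small, hedged example of what "spin up a service when needed" can look like with containers, here is a sketch using the Docker SDK for Python; the image name and settings are placeholders, and a production 5G deployment would more likely sit behind a full orchestrator such as Kubernetes.

```python
# Toy sketch: launch a containerized network function on demand.
# "example/upf:latest" is a made-up image name, not a real product.
import docker

def launch_network_function(name: str, image: str = "example/upf:latest"):
    client = docker.from_env()
    return client.containers.run(
        image,
        name=name,
        detach=True,
        restart_policy={"Name": "on-failure"},
        environment={"SLICE_ID": name},   # per-use-case configuration
    )

if __name__ == "__main__":
    nf = launch_network_function("iot-slice-upf")
    print(nf.status)   # the container can be stopped and removed when no longer needed
```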

Another key aspect is the need to deliver services and applications closer to the end user by utilizing mobile edge computing. This is integral to ensuring the low latency and high-bandwidth associated with 5G and will support use cases across a wide range of verticals including transport, manufacturing and healthcare. However, flexible architectures will be required to support this type of infrastructure throughout hybrid cloud and virtualized environments. As operators move network infrastructure to the edge, the use of containers will become pivotal to supporting 5G applications.

The use of microservices and containers will increase during 2019 as operators ramp up their 5G propositions. Despite offering clear advantages, they will also add a new layer of complexity, and carriers will need to have clear visibility across their IT infrastructure if they are going to make a success of 5G.

5G will drive virtualization in 2019 

Momentum is building behind 5G. The US and South Korea are leading the charge with the rollout of the first commercial networks; trials are taking place in every major market worldwide; and Verizon and Samsung have just announced plans to launch a 5G handset in early 2019. Expectations for 5G are high – the next-generation mobile standard will underpin mission-critical processes and innovations, including telemedicine, remote surgery and even driverless cars. However, vast sums of money will need to be spent on network infrastructure before any of this can happen, and it's the mobile and fixed carriers who will be expected to foot the bill. This is compounded by the fact that many of the aforementioned 5G use cases have yet to be defined, so carriers are being asked to gamble on an uncertain future.

So, what will the 5G future look like and what will it take to get us there?

One thing is for certain - 5G will drive network virtualization. In 2019, we will see an increasing number of carriers committing to deploying virtualized network infrastructure to support 5G applications and services. Without virtualization, it will be ‘virtually’ impossible to deliver 5G. This is because 5G requires virtualization both at the network core, and critically at the network edge. Puns aside, the days of building networks to support a single use case, such as mobile voice and data, or home broadband, are behind us. If 5G is to become a reality, then the networks of the future will need to be smart and automated, with the ability to switch between different functions to support a range of use cases.

However, moving from the physical world to the virtual world is no mean feat. Carriers are now discovering that their already complex networks are becoming even more so, as they replicate existing functions and create new ones in a virtualized environment. Wholesale migrations aren’t possible either, so carriers are having to get to grips with managing their new virtual networks alongside earlier generations of mobile and fixed technologies. Despite these challenges, 5G will undoubtedly accelerate the virtualization process. Subsequently, no-one will want to be left behind and we will see greater competition emerge between carriers as they commit funds and resources to building out their virtualized network infrastructures.

To justify this spend, and to tackle the challenges that lie ahead, carriers will require smart visibility into their constantly evolving network architectures. Virtual probes that produce smart data, supported by intelligent tools, offer much-needed visibility into the performance of these new networks and the services they support. The invaluable knowledge they provide will be absolutely critical for carriers as they accelerate their use of virtualized infrastructure to successfully deploy 5G.

2019 Network Predictions - Operators must ‘scale or fail’ for 5G

by Heather Broughton, Sr. Director of Service Provider Marketing, NETSCOUT

Operators will ‘scale or fail’ to meet the 5G demand in 2019

5G will be faster, smarter and more efficient than 4G, but in order to meet demand and to support new architectures, networks will have to scale. While most of the scale in the core network will be cloud and software-based, there will still be a need for hardware and equipment at the network edge, and in a 5G environment there will be a lot more equipment. In fact, the number of cell sites will increase dramatically to support and propagate the higher frequency bands that will transmit 5G data traffic over the air. This is when network management tools will come into their own. In 2019 we will see the deployment of automated networks driven by software, and controlled by virtual machines and artificial intelligence.

Network automation and orchestration are by-products of virtualisation and will add another layer of complexity. However, they are also integral to the rollout and sustainability of 5G networks, particularly as network topologies will change to accommodate a combination of small cell and macro cell sites. Small cells in particular will form the bulk of the new RAN (radio access network), and they are expected to increase the number of cells in cellular networks threefold.

If network engineers think they have enough issues to deal with today maintaining 4G/LTE networks, then they may be in for a shock as 5G networks are gradually rolled out. In fact, without having total visibility of these more complex and expansive networks, 5G in the RAN is going to become extremely difficult to manage. If the number of cells were to double or triple, not only would network engineering teams need to have the full confidence in their network management tools to make sure the network is running optimally, but they would also be faced with one heck of a job troubleshooting hundreds, potentially even thousands of cells if an issue arose.

In 2019, carriers will be scrutinising costs per cell site as they look to invest in new infrastructure. They will look to offset any costs by implementing intelligent and automated systems that can support 5G networks. However, carriers need assurances that these systems are providing them with the right information about the uptime and performance of their new networks. The only way to achieve this will be to have complete visibility of these complex new architectures. Having a window into this multi-layered and virtualized environment, and being able to extract smart data in near real-time, will be essential for the ongoing management of new 5G networks.

2019 - The year carriers get to grips with 5G security

The benefits of 5G are clear; the new communications standard will offer carriers and their enterprise customers faster network speeds and performance, ultra-low latency and greater efficiencies. General discussion around carrier trials and deployments tends to focus on increased speeds and the new innovations that 5G will enable, but security rarely comes up. That’s all about to change with 5G security set to become a big issue for the industry and a major talking point in 2019.

To date, it appears that 5G security has almost been treated as an afterthought, rather than a critical aspect of network development. However, behind the scenes this is an issue that the carriers take very seriously. The situation for carriers has altered dramatically, because in a 5G domain, the attack surface becomes much greater. Consequently, the number of opportunities for malicious players to exploit vulnerabilities increases. This is partly due to the adoption of virtualized network infrastructures that will allow carriers to scale and meet the demands of 5G, but also because 5G networks will be configured to support a wide variety of industrial and business use cases. This means that going forward, carriers will be responsible for managing mission-critical systems and devices, in addition to handling high volumes of sensitive data. In a 5G environment, there will be a strong emphasis on securing smart factories, automated production lines and fleets of driverless cars.

The network security stakes get a lot higher

As new 5G network architectures are based on virtualization and distributed cloud models, and a containerized environment to support workloads and applications, it’s apparent that carriers have to deal with a whole new set of complexities. Existing security protocols will need to be scrapped and replaced with robust systems and procedures that account for this new complex environment and the burgeoning 5G value chain; that includes applications developers, device manufacturers, cloud service providers and the carriers themselves. A new built-in resilience is required to limit the attack landscape and to reduce the risk of malicious attacks and perimeter breaches. A pervasive model that provides comprehensive insight into both service performance management and security offers the best way to address 5G security. It enables service providers to extract ‘smart data’ that is collected and processed at the source from legacy, virtual and hybrid cloud environments. It’s the closest carriers and their customers will ever get to implementing ‘holistic security’ across their entire IT estate.

Wednesday, December 19, 2018

2019 Network Predictions

by Angelique Medina, senior product market manager, ThousandEyes

2018 has seen the acceleration of modern infrastructure adoption across public cloud, SaaS, hybrid and SD-WAN. In 2019, enterprises will feel the impact of this dramatic shift more than ever.

Internet unpredictability impacts become more visible as SD-WAN projects spread and mature

SD-WAN adoption is on the rise, and with it, the enterprise’s growing dependence on the Internet. Before moving to SD-WAN, most enterprises only had to worry about Internet performance from their data centers to key services. With SD-WAN, they’re increasingly leveraging DIA and broadband connectivity and grappling with hundreds or thousands of sites, each of which will have distinct Internet paths to many different cloud-based services. Shifting from a carrier-managed service to the Internet means that there’s an exponential rise in the number of service providers that can potentially impact performance for branch office users. As a large number of enterprises move from deployment into their operations stage in 2019, the impact of Internet unpredictability will become more evident. As a result, more enterprise IT teams will start to develop operational capabilities to deal with Internet-centric issues.

Digital experience will confront the weight of backend multiplicity

Enterprises and SaaS providers are increasingly leveraging third-party APIs and cloud-services as part of their web and application architectures. This distributed, microservices approach to building applications not only provides best-of-breed functions, it enables companies to quickly consume and deliver new services. Applications today might leverage dozens of APIs to handle services such as messaging and voice, maps, and payments, while also connecting to cloud-based services such as CRM, ERP and analytics. Websites are also getting weighed down by the addition of many externally hosted applications. Even a seemingly simple “Buy Now” function on an ecommerce site will invoke many external services, including payment gateways, CRM, analytics, inventory, fulfillment, and potentially many others.

The weight of all of these external dependencies means that websites are going to continue to get slower, while at the same time their risk surface increases. Since these services are not internally operated, isolating the source of a problem when something goes wrong can be challenging, particularly since these services are connected to over the Internet. The question of whether the application or the network is at fault will become “Which application?” and “Which network?”.
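One simple way to start answering "which application, which network" is to time each external dependency independently. The sketch below does this with placeholder health-check URLs; real monitoring products obviously go much further (path tracing, loss, jitter and so on).

```python
# Rough sketch: time each third-party dependency behind a page or checkout
# flow to see which one is dragging down response time. URLs are placeholders.
import time
import urllib.request

DEPENDENCIES = {
    "payments":  "https://payments.example.com/health",
    "inventory": "https://inventory.example.com/health",
    "analytics": "https://analytics.example.com/health",
}

def time_dependency(url: str) -> float:
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=5) as response:
        response.read()
    return (time.monotonic() - start) * 1000   # milliseconds

timings = {}
for name, url in DEPENDENCIES.items():
    try:
        timings[name] = time_dependency(url)
    except OSError:
        timings[name] = float("inf")   # treat an unreachable service as worst case

slowest = max(timings, key=timings.get)
print(f"Slowest dependency: {slowest} ({timings[slowest]:.0f} ms)")
```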

Understanding the tradeoff of function over user experience and knowing how every third-party web or app component impacts performance will get even more critical to enterprises and SaaS providers in 2019.

Fragmentation, not bifurcation of the Internet

Eric Schmidt, former CEO of Google, famously predicted that the Internet would bifurcate into a US-led Internet and a Chinese-led Internet by 2028. While we still have plenty of time to see how this prediction plays out, in the near term, the Internet is shifting towards fragmentation. Multiple nation states, including Iran, Turkey, Saudi Arabia, and Russia, have joined China in creating a walled-off Internet, using a variety of technical, social, and political techniques. As more countries pursue nationalist agendas and choose to opt out of regional or global alignments, we will see increasing Internet fragmentation. This will initially take the form of politically-motivated censorship, but will expand to include the broader curation of connectivity based on politically-prescribed social and cultural norms.

Hybrid starts tilting to the cloud

While the data center will continue to lose ground in favor of cloud, enterprises still early in their cloud journey or who have special security or regulatory constraints will keep hybrid cloud alive. To extend their reach into the enterprise data center, public cloud providers have begun offering on-premises solutions, featuring greater agility, favorable economics, and a single pane of glass for management. While still in its early days, Azure Stack already has customer deployments, while the newly announced AWS Outposts (scheduled for release in the second half of 2019) has the potential to be highly disruptive to the data center landscape.

2019 will see an increased tilting of hybrid towards public cloud providers, though a lack of maturity may cause an initial freezing of the hybrid market, particularly for AWS customers, who will want to consider an AWS offering over existing network providers once it is commercially available.

The Edge gets less “edgy”

Early edge architectures, where the data of billions of IoT devices is notionally processed at central points by infrastructure in public cloud or private data centers, presented challenges, ranging from security to physics (increased latency) and cost (bandwidth). The introduction of intermediary nodes into edge architectures will address the latency and security concerns of a strictly core/edge architecture, moving edge deployments in 2019 from largely theoretical to realizable.

Intermediary nodes are designed to perform some of the processing functions of the cloud closer to the edge, which will help ensure better performance and scale for users and devices and help drive IoT and edge deployments. These nodes are already available from a variety of vendors, including public cloud providers such as Microsoft. Microsoft has previously stated that it wants its Azure cloud data centers to be within 50ms of everywhere. These new intermediary nodes will help extend the reach of cloud-centric infrastructure to the range of single-digit milliseconds and make IoT and edge computing aspirations a reality.

Cyber attacks focus on foundational Internet systems for maximum effect

The pervasive risk associated with offering a digital service has forced most large enterprises and digital businesses to employ sophisticated systems of defense. These systems are designed to handle increasingly large-scale attacks, such as the one launched against GitHub earlier this year. That attack was the largest ever recorded, and although it was disruptive, it was successfully mitigated through Prolexic, a highly elastic cloud-based DDoS protection service. This and other tools make launching an impactful attack against a high-value target more challenging to pull off, which may be one reason why the number of DDoS attacks is trending downward, particularly in North America and Europe. This doesn’t mean that cyber attacks are going away. Cyber attacks will continue to make headlines in 2019, but they will largely take an indirect approach, exploiting relational weaknesses in foundational Internet systems, such as DNS and BGP routing.

Two incidents this year, one malicious, the other unintentional, underscored the vulnerability of even the most sophisticated digital businesses to service disruption. In the case of the malicious incident, Amazon’s DNS service, Route 53, was hijacked, which enabled a cryptocurrency theft and led to many customer sites, including Instagram and CNN, becoming partially unreachable. The attackers who pulled off this digital hijacking and robbery made no attempt to penetrate Amazon’s infrastructure. Instead, they compromised a small Internet Service Provider in Columbus, Ohio, using it to propagate false routes to Amazon’s DNS service. The implicit trust built into Internet routing allowed this attack to take place. The fact that the hijacked service (translating domain names into Internet addresses) is a critical dependency meant that the impact was massive and went far beyond the intended target.

Indirect attacks, taking advantage of critical dependencies outside of the control of the intended target, will continue to grow in 2019, netting more high-profile victims while maximizing the scope of collateral damage.

The operational impact of cloud adoption pushes enterprises to reexamine their management stack mix

Now that SaaS has mainstreamed, with most enterprises shifting their application consumption model from internal to the cloud, we can expect to see a follow-on shift in IT operations stacks in the coming year, as more enterprises begin to realize that the existing toolset is not oriented to address externally-hosted applications.

The traditional IT operations stack is rich with tools, but as the usage of SaaS applications and cloud-based services has increased, the domain of many of these tools is narrowing, exposing gaps in visibility for SaaS applications and their delivery over the Internet. Network tools that collect data from on-premises will see a reduction in usage and budget allocation, making room for cloud-specific tools and technologies designed to provide visibility into networks and services that enterprises rely on (such as ISPs and SaaS apps) but that they do not own or control. This new operations stack will continue to feature traditional toolsets, but its proportional emphasis will favor cloud-focused technologies.

Monday, December 17, 2018

2019 Network Predictions

Bill Fenick, VP of enterprise at Interxion

Enterprises will be smarter about the cloud
The cloud has quickly become a mainstay in the enterprise. However, early on, many businesses dove into the cloud head first and quickly realized that not only are not all apps meant to be re-engineered for the cloud, but even a lift-and-shift approach doesn’t always work. Because of this, in 2019, I believe that while enterprises will continue to adopt cloud in a more ferocious way, they’ll do it with a better layer of intelligence on top.

 Artificial Intelligence will drive cloud adoption
As companies increasingly integrate a variety of AI-driven technologies across voice, vision, language and machine learning in order to transform their businesses and get the competitive edge in 2019, I believe they will be leveraging cloud technologies as a matter of course.

Location is becoming more important to enterprises
Today’s enterprises have the need for speed. Regardless of it being application to application or application to end user, businesses need data to move faster than ever before. As a result, in 2019 I expect enterprises to pay closer attention to the location of their data, whether that’s the location in proximity to other data sources including the cloud, or geographic location.

Sally Bament, VP of Service Provider Marketing, Juniper Networks

5G will create a new billion-dollar app economy
The first smartphones and eventually LTE networks paved the way for mobile apps as we know them, giving rise to a multitude of new ways companies interact with customers. 5G is poised to go live in many cities across the United States and globally in 2019, and we expect next year to really showcase the economic power of the new mobile technology. This is the year apps start to show their real value in the enterprise and industrial space with a host of new IoT, AR/VR, digital twins and connected-car applications coming to life.

Two separate high-profile cybersecurity breaches will hit critical U.S. infrastructure
The increasing number of distributed applications and the volume of data deployed across cloud environments will lead to more sophisticated breaches. In the year ahead, we will likely see major attacks on systems of livelihood including utility systems, municipal water supplies and electrical grids. Predictive analytics and end-to-end monitoring are necessary tools to thwart catastrophic structural attacks.

Expect more frenemies in the edge
The hyperscale cloud players have clearly demonstrated the power of their massive networks in terms of application hosting and development. But it’s the telcos that have the beachfront property in their established network infrastructure that’s closest to end users. Cloud providers will try to build an edge of their own, but service providers will remain keepers of the edge as they can compete with much better economic scale. Over the next year, service providers and cloud providers will compete to win the edge but expect more cloud-SP partnerships to unfold as the year progresses.

Automation is the secret to customer satisfaction
In 2019, automation will be the differentiating factor among service providers. Early software and virtualization technology have provided some relief from stagnant development, but this year service providers will fully adopt automated and virtualized cloud platforms that can deploy new services in months, not years. Those who fail to implement automation will find themselves years behind competitors, as end users will find more agility and better service with those who embrace automation.

Dave Wright, President of the CBRS Alliance

Expect commercial launch in the 3.5 GHz CBRS Band -- Earlier this year, the FCC announced plans for the launch of commercial services in the CBRS 3.5 GHz band, a wide swath of lightly-used spectrum that currently has U.S. Department of Defense systems as its primary user. This new opportunity is enabled through the use of a dynamic sharing mechanism which protects the incumbent government operations while allowing new commercial services. OnGo solutions for the band will offer secure, cost-effective connectivity in the places it is needed most, and at a fraction of the cost that has historically been associated with cellular technologies. There is universal agreement that mid-band spectrum will be critical for next-generation wireless services, and CBRS is the first mid-band spectrum being made available in the US.

Organizations – including existing mobile, fixed wireless, and cable operators, as well as enterprises and industrial players – are already laying the groundwork for deployment. Testing and certification programs for equipment and devices operating within the 3.5 GHz band are well underway – with a number of radio infrastructure and client devices now authorized by the FCC. The testing of the dynamic sharing databases (Spectrum Access Systems, or SASs) is also well underway. The industry is ready for commercial deployment, and 2019 will be the year of improved wireless coverage and capacity on a massive scale.

Jon Toor, CMO, Cloudian

There’s No Place Like Home: Cloud Repatriation Increases: While the growth of the public cloud will remain strong, enterprises will expand their adoption of on-premises private clouds in a hybrid cloud model. This will include repatriating data from the public cloud to avoid the bandwidth, latency and cost issues that can arise when accessing such data.

Two Clouds Are Better Than One: More enterprises will adopt a multi-cloud strategy to avoid vendor lock-in and enhance their business flexibility. However, a multi-cloud approach raises new management challenges that users will need to address to ensure a positive experience.

Object Storage: Ready, Camera, Production: Moving beyond its traditional use for large-scale archiving, object storage will play an increasing role in video production workflows. Offering a combination of limitless scalability, S3 compatibility and tremendous durability, object storage provides an ideal platform for managing video content, including over-the-top (OTT) distribution.

What Do You Get When You Mix Blue and Red?: IBM-Red Hat Deal Scrambles the Cloud Landscape – IBM’s acquisition of Red Hat will reverberate throughout 2019, giving enterprises more options for designing a multi-cloud strategy and highlighting the importance of data management tools that can work across public cloud, private cloud and traditional on-premises environments.

AI and Object Storage Play Tag: As businesses increase their use of AI to extract greater value from their digital assets, metadata tagging will become an even more critical element of enterprise storage. This will bring more attention to object storage, which is centered on metadata, and the key will be integrating well with AI tools.
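To make the metadata idea concrete, here is a minimal sketch (not anything Cloudian prescribes) of how an application might attach and read descriptive metadata on an S3-compatible object store using Python and boto3; the endpoint URL, bucket, key and tag values are hypothetical placeholders.

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.example.com",   # hypothetical S3-compatible endpoint
    aws_access_key_id="EXAMPLE_KEY",              # placeholder credentials
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Store a video asset along with descriptive, searchable metadata.
with open("master.mov", "rb") as source:          # placeholder local file
    s3.put_object(
        Bucket="video-masters",                   # hypothetical bucket
        Key="episodes/ep042/master.mov",
        Body=source,
        Metadata={                                # stored as x-amz-meta-* headers
            "codec": "prores422",
            "duration-seconds": "2710",
            "detected-labels": "interview,studio",  # e.g. written by an AI tagging pipeline
        },
    )

# Read back only the metadata, without transferring the video payload.
head = s3.head_object(Bucket="video-masters", Key="episodes/ep042/master.mov")
print(head["Metadata"])

Because the metadata travels with the object itself, downstream AI tools can query or enrich those tags without ever pulling the video payload across the network.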

Cloud Foundry Foundation’s Executive Director Abby Kearns and CTO Chip Childers

Consolidation will continue: Based on 2018’s acquisitions, we predict we’ll see a steady rollout of acquisitions in the next 12-18 months, as major enterprise tech companies rush to get a piece of the latest innovations. Executive leadership shuffles at certain large companies are a telltale sign that acquisitions will be used to grow business more rapidly. Consolidation around a specific technology is bound to happen, with the market solidifying around that tech.

Multi-platform will be the new normal: A majority of the market believes containers must be the solution to digital transformation, but in 2019, they’ll realize containers are just a tool -- not a silver bullet. We’re already seeing companies deploy a combination of technologies such as PaaS and containers in tandem, a trend we documented in a report published earlier this year. 2019 will be the year enterprises begin to embrace this versatility and see the flexibility, scalability and interoperability of a multi-platform approach.

People and process are more important than technology: In 2019, companies are going to realize the people on their teams matter more than anything. Reskilling their workforces is going to become essential to business success. Technology is evolving faster than training can keep pace, so most people with today’s desired skill sets are already employed. Organizations that build continuous learning cycles into their business model and upskill their employees will keep themselves ahead of the curve.

FaaS adoption will continue momentum: FaaS is a serverless technology that has already been adopted rapidly for glue code, and that will continue. However, its use as a productive way to build business applications is only beginning to take off. We will see the beginnings of an explosion of developer frameworks built on top of serverless systems. This type of tooling makes it easier to work with and build on FaaS, so adoption becomes a self-perpetuating cycle.
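As a rough illustration of FaaS used as glue code, the sketch below follows the AWS Lambda Python handler convention (handler(event, context)); the S3-style event shape and names are illustrative assumptions, not part of the Foundation’s prediction.

import json

def handler(event, context):
    # Pull the bucket/key pairs out of the incoming notification and hand
    # them off to the next step (here we simply build a summary payload).
    records = event.get("Records", [])
    processed = [
        {"bucket": r["s3"]["bucket"]["name"], "key": r["s3"]["object"]["key"]}
        for r in records
        if "s3" in r
    ]
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}

if __name__ == "__main__":
    # Local smoke test with a fabricated event.
    sample = {"Records": [{"s3": {"bucket": {"name": "uploads"}, "object": {"key": "img.png"}}}]}
    print(handler(sample, None))

Developer frameworks of the kind predicted here largely exist to generate, wire up and deploy many small handlers like this one.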

Eyes to the east: Together, we’ve been to China seven separate times this year, and we are astonished at the pace of technological advancement there. With special interest in Artificial Intelligence, China is moving at lightning speed. In 2019, there will be global impact as China’s advancement pushes other regions to hasten their own development.

Scaling up, and quick: We’re seeing the momentum of scale steadily speed up as a result of continued enterprise adoption of cloud technologies. As the technologies mature and are integrated into cloud solutions, enterprises grow more familiar with them, gain trust in their value and increase adoption. It’s a virtuous cycle, which we wrote about in the Foundation’s latest research report, and it’s only going to start spinning faster in 2019.

Culture matters: We’ve said it before and we’ll say it again: Your people and your processes are more important than your technology. In our most recent research, we found that nearly 50 percent of organizations believe culture change is a bigger obstacle than the technology itself. The shift to digital has to happen within your organization, and that means with your people. In 2019, companies are going to prioritize a new culture that emphasizes agile, integrative, inclusive workflow. It’s just another way the cloud market is restructuring.

Wednesday, December 20, 2017

Top 5 Container Predictions for 2018

by David Messina, CMO, Docker

Prediction #1: The next big security breach will be foiled by containers

As we witnessed with the Equifax breach disclosed in early September, data breaches can put personal data at risk and, in doing so, erode consumer confidence. But what if you could prevent a major breach by simply placing the software in a container? The Equifax breach occurred when a vulnerable piece of web software was left exposed to attackers. Containers reduce the attack surface available for exploitation, greatly increasing the difficulty and minimizing the possibility of many forms of compromise. In many cases, simple steps like using read-only containers will fully mitigate a broad range of attack vectors.

From being ephemeral and isolated in nature to enabling frequent patching and scanning against the latest CVEs, containers are vital to securing the software supply chain. Containers will be more widely relied upon in the coming year to combat future threats.
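As a hedged illustration of the read-only hardening mentioned above, the sketch below uses the Docker SDK for Python to launch a container with an immutable root filesystem, dropped capabilities and a small tmpfs scratch mount; the image, command and sizes are placeholders rather than a prescribed Docker configuration.

import docker

client = docker.from_env()

# Run a throwaway workload with a read-only root filesystem and no extra
# Linux capabilities; only /tmp is writable, via a small in-memory tmpfs.
container = client.containers.run(
    "alpine:3.18",                    # placeholder image
    ["sleep", "300"],                 # placeholder command
    detach=True,
    read_only=True,                   # root filesystem mounted read-only
    cap_drop=["ALL"],                 # drop every capability the workload doesn't need
    security_opt=["no-new-privileges"],
    tmpfs={"/tmp": "rw,size=16m"},    # scratch space only where genuinely required
)
print(container.short_id, container.status)

With the root filesystem immutable, an exploited process cannot persist tools or tamper with binaries inside the container, which is exactly the attack-surface reduction the prediction refers to.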

Prediction #2: Complexity and time to market will thwart PaaS adoption

As calls for accelerated cloud strategies only get louder across the Global 10K, it's becoming increasingly clear that outdated Platform as a Service (PaaS) frameworks are not equipped to handle the demands of managing all of the applications that are part of today’s modern enterprise. For the past few years, utilizing PaaS has been considered a cutting-edge approach to migrating your apps to the cloud. What is often overlooked is the time required to set up PaaS frameworks, retrain employees and re-code each application - efforts that can take a year or more to complete. In 2018, we expect to see PaaS adoption stall as enterprises recognize the time to value is too long for the current and future pace of business. This will give way to accelerated Container as a Service (CaaS) platform adoption as enterprises look to migrate more workloads to the cloud while achieving greater agility, innovation and cost-efficiency.

Prediction #3: Containers will break the 80/20 Rule for IT budgeting 

It’s widely understood that CIOs typically commit 80% of their budget to maintenance, leaving only 20% for innovation - a major roadblock on the path to digital transformation. We expect this to change in 2018 as CIOs rewrite the 80/20 rule in favor of innovation by unlocking new methods for managing and modernizing their legacy apps. In the past, application modernization required refactoring apps, ripping and replacing existing infrastructure and implementing new processes. Instead, enterprises are now using containerization to achieve meaningful application modernization results in days. Organizations will reap the benefits of cloud portability and security while reinvesting the significant cost savings in more strategic digitization efforts.

Prediction #4: Security, not orchestration, will write the next chapter of containerization

2016, and yes even some of 2017, might have been about the orchestration wars, but now that companies like Docker offer a choice of orchestration, some might argue that orchestration has been largely commoditized. With container adoption expected to grow into a nearly $3 billion market by 2020, according to 451 Research, and Docker itself seeing more than one billion downloads bi-weekly, security will be the next frontier companies need to address. Ironically, the threats will come from the applications themselves, making “container boundaries” an imperative for segmenting and isolating threats. The container boundary also makes it more difficult for an attacker to exfiltrate data, increasing the chance of detection. Securing the software supply chain will be paramount to safeguarding the application journey.

Prediction #5: CIOs will accelerate plans for digital transformation with containers

Although “digital transformation” has become somewhat of a buzzword as of late, enterprises certainly accept the idea behind it - and with a greater sense of urgency. According to Gartner, as many as two-thirds of business leaders are concerned that their companies aren’t moving fast enough on the digital transformation front, leading to potential competitive disadvantages. In 2018, CIOs will increasingly feel the pressure to speed up digitization efforts and will accelerate their journey through containers. As businesses build out and implement strategies around cloud migration, DevOps and microservices, containers will play an increasingly important role in achieving these initiatives. By Dockerizing their applications, our enterprise customers have experienced the immediate benefits of digital transformation: faster app delivery times, portability across environments, hardened security and more.

Monday, January 9, 2017

Forecast for 2017? Cloudy

by Lori MacVittie, Technology Evangelist, F5 Networks

In 2016, IT professionals saw major shifts in the cloud computing industry, from developing more sophisticated approaches to application delivery to discovering the vulnerabilities of connected IoT devices. Enterprises continue to face increasing and entirely new security threats and availability challenges as they migrate to private, public and multi-cloud systems, which is causing organizations to rethink their infrastructures. As we inch toward the end of the year, F5 Networks predicts the key changes we can expect to see in the cloud computing landscape in 2017.

IT’s MVP of 2017? Cloud architects

With more enterprises adopting diverse cloud solutions, the role of cloud architects will become increasingly important. The IT professionals that will hold the most valuable positions in an IT organization are those with skills to define criteria for and manage complex cloud architectures.

Multi-cloud is the new normal in 2017

Over the next year, enterprises will continue to seek ways to avoid public cloud lock-in, relying on multi-cloud strategies to do so. They will aim to regain leverage over cloud providers, moving toward a model where they can pick and choose various services from multiple providers that are most optimal to their business needs.

Organizations will finally realize the full potential of the cloud

Companies are now understanding they can use the cloud for more than just finding efficiency and cost savings as part of their existing strategies and ways of doing business. 2017 will provide a tipping point for companies to invest in the cloud to enable entirely new scenarios, spurred by things like big data and machine learning that will transform how they do business in the future.

The increasing sophistication of cyber attacks will put more emphasis on private cloud

While enterprises trust public cloud providers to host many of their apps, the lack of visibility into the data generated by those apps causes concerns about security. This means more enterprises will look to private cloud solutions. Public cloud deployments won’t be able to truly accelerate until companies feel comfortable enough with consistency of security policy and identity management.

More devices – More problems: In 2017, public cloud will become too expensive for IoT

Businesses typically think of the public cloud as the cheaper solution for their data center needs, yet they often forget that things like bandwidth and security services come at an extra cost. IoT devices generate vast amounts of data, and as sensors are installed in more and more places, this data will continue to grow exponentially. This year, enterprises will put more IoT applications in their private clouds - at least until public cloud providers develop economical solutions to manage the huge amounts of data these apps produce.

The conversation around apps will finally go beyond the “where?”

IT professionals constantly underestimate the cost, time and pain of stretching solutions up or down the stack. We’ve seen this with OpenStack, and we’ll see it with Docker. This year, cloud migration and containers will reach a point where customers can no longer think only about where they want to move apps; they’ll also need to think about the identity tools needed for secure authentication and authorization, how to protect against data loss from microservices and SaaS apps, and how to collect and analyze data across all infrastructure services quickly.

A new standard for cloud providers is in motion, and this year will see major developments as enterprises not only reconsider the value of enterprise cloud but also modify their cloud strategies to fully extend enterprise offerings and data security. Evaluating the risks of cloud migration and management has never been as vital to a company’s stability as it is now. Over the course of the year, IT leaders who embrace and adapt to these industry shifts will be the ones to reap the benefits of a secure, cost-effective and reliable cloud.

About the Author

Lori MacVittie is Technology Evangelist at F5 Networks.  She is a subject matter expert on emerging technology responsible for outbound evangelism across F5’s entire product suite. MacVittie has extensive development and technical architecture experience in both high-tech and enterprise organizations, in addition to network and systems administration expertise. Prior to joining F5, MacVittie was an award-winning technology editor at Network Computing Magazine where she evaluated and tested application-focused technologies including app security and encryption-related solutions. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University, and is an O’Reilly author.

MacVittie is a member of the Board of Regents for the DevOps Institute, and an Advisory Board Member for CloudNOW.

Friday, January 6, 2017

Wi-Fi Trends Take Center Stage in 2017

by Shane Buckley, CEO, Xirrus 

From an unprecedented DNS outage that temporarily paralyzed large swaths of the internet, to the evolution of federated identity for simple, secure access to Wi-Fi and applications, 2016 had its mix of growing pains and innovative steps forward.

Here’s why 2017 will shape up into an interesting year for Wi-Fi technology.

IoT will create continued security issues on global networks

In 2017, the growth of IoT will put enormous pressure on Wi-Fi networks. While vendors must address the complexity of onboarding these devices onto their networks, security can’t be left behind. The proliferation of IoT devices will push high density into almost all locations – from coffee shops to living rooms – prompting more performance and security concerns. Whether it’s Wi-Fi connected alarms or smart refrigerators, the security of our homes will be scrutinized and will become a key concern in 2017. Mass production of IoT devices will make them more susceptible to hacking, as many will not be equipped with proper built-in security.

The recent IoT-based attack on DNS provider Dyn opened the floodgates, and estimates show the IoT market reaching 10 billion devices by 2020. The event foreshadows the power hackers hold when they commandeer these IoT systems. Taking down a significant portion of the internet grows more damaging, yet all too plausible, these days. Because of increased security concerns, vendors will equip devices with the ability to connect to their IoT servers only over pre-designed ports and protocols. If IoT vendors don’t start putting security at the forefront of product development, we can only expect more large-scale cyberattacks in 2017.

LTE networks won’t impact Wi-Fi usage

Don’t expect LTE networks to replace Wi-Fi. The cost of deploying LTE is roughly ten times greater, and LTE is less adaptable to indoor environments than Wi-Fi. Properly deployed Wi-Fi will remain the lowest-cost technology with similar or superior performance to LTE, and therefore will not be replaced by it. When people have access to Wi-Fi, they’ll connect; data plan limitations remain too common.

Additionally, the FCC and other government agencies around the world have opened up 5 GHz spectrum for the free, unlicensed access that Wi-Fi depends on. But we don’t want carriers grabbing that free spectrum and charging us for every byte we send, now do we?

LTE and Wi-Fi will co-exist as they do today: LTE works well outdoors, while well-designed Wi-Fi works consistently throughout indoor spaces.

The push toward federated identity will continue in 2017

Today, there remains a sprawl of disparate Wi-Fi networks, each with different authentication requirements. This marks an opportunity for Wi-Fi vendors. In the coming year, we will see federated identity become a primary differentiator. By implementing federated identity, vendors simplify and secure the login process. Consumers can auto-connect to any public Wi-Fi network with their existing credentials – whether Google, Microsoft or Facebook – giving them a seamless onboarding experience. It’s the next step for Single Sign-On (SSO), and one that will set Wi-Fi vendors apart in 2017.

This coming year, the repercussions of IoT, coexistence of LTE and Wi-Fi, and demand for simple, secure access to Wi-Fi, will take center stage. The onus falls on company leaders, who must adapt their business strategies so they can keep pace with the fast and ever-changing Wi-Fi landscape. 2017 will have plenty in store.

About the Author

Shane Buckley is CEO of Xirrus. Most recently, Mr. Buckley was General Manager and Senior Vice President at NETGEAR, where he led the company’s commercial business unit to 50 percent revenue growth over two years, reaching $330 million in 2011, and played a prime role in growing corporate revenues over 30 percent. Prior to that, Mr. Buckley was President & CEO of Rohati Systems, a leader in cloud-based access management solutions, and Chief Operating Officer of Nevis Networks, a leader in secure switching and access control. He has also held the positions of Vice President WW Enterprise at Juniper Networks, President International at Peribit Networks, a leader in WAN optimization, and EMEA Vice President at 3Com Corp. Mr. Buckley holds an engineering degree from the Cork Institute of Technology in Ireland.

Tuesday, December 20, 2016

Predictions 2017: IaaS Becomes the Next Launching Pad for Cyber Threats

by Corey Nachreiner, Chief Technology Officer, WatchGuard Technologies

Cloud technology has had an incredible impact on the business landscape over the last five years. Public infrastructure-as-a-Service (IaaS) platforms like Amazon’s AWS and Microsoft Azure, in particular, are growing at incredible rates – even among small businesses. According to RightScale’s 2016 State of the Cloud report, 71 percent of small and medium businesses (SMBs) are running at least one application in AWS or Azure. It’s clear that IaaS solutions provide a ton of business opportunities for organizations, especially those without the financial or personnel resources necessary to manage physical network infrastructure.

However, as the public cloud becomes more engrained in the fabric of everyday business operations, it has also become a serious target for hackers. The question: How safe is it really? With so much valuable customer, financial and healthcare data stored in one place, and managed by a third party, it’s easy to see why criminals have begun to focus their efforts on IaaS.  

In the past, we’ve seen threat actors target or infect servers running in public cloud services. For example, there have been cases where hackers take over servers running in Amazon EC2, the virtualized compute portion of Amazon AWS. Remember, servers you spin up in EC2 are no different from servers on your premises. If you leave a port open, without a firewall or access control rules, hackers can attack it in the same way they attack physical servers. To illustrate this, a honeypot organization spun up some fake SSH servers in Amazon EC2 to see whether they’d get targeted. Even without publishing the servers’ IP addresses or attaching them to a domain, attackers found the IaaS-based honeypots and began brute-force attacks against them within 10 hours.

We’ve also seen criminals target IaaS customers through their cloud credentials. An Amazon AWS account is powerful: customers can spin up almost endless servers, as long as they are willing to pay Amazon for the compute power they use. In 2014, one AWS customer suffered a very costly credential breach. A criminal obtained his AWS credentials and used their administrative powers to spin up additional EC2 server instances, which he used to mine bitcoin. This credential leak (caused by the victim accidentally leaving credentials in a GitHub project) left the victim facing more than $5,000 in AWS bills.
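One simple defense against the kind of credential leak described above is to keep keys out of source code entirely. The sketch below, assuming Python and boto3, relies on the SDK’s default credential chain (environment variables, the shared credentials file, or an attached IAM role) and verifies the resolved identity before doing any work; it is an illustrative pattern, not the victim’s actual setup.

import boto3

# No literal access keys appear anywhere in the code; boto3 resolves
# credentials from the environment, shared config, or an IAM role.
session = boto3.session.Session()
sts = session.client("sts")

# Confirm which identity the environment actually resolved to before doing work.
identity = sts.get_caller_identity()
print("Running as:", identity["Arn"])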

In short, without the proper protections, attackers can hack servers in the public cloud just as easily as the ones on your premises. As we move more and more of our data to IaaS servers, you can expect criminal hackers to follow.

IaaS doesn’t just make a good attack target; it also provides a powerful attack platform. We’ve seen cybercriminals leveraging these robust cloud virtualization platforms to build their attack infrastructure. For instance, criminals started putting their botnet command and control (C&C) servers in Amazon EC2 shortly after its launch, the Zeus botnet being one example. Despite increased monitoring and security from Amazon, attackers still use AWS infrastructure for attacks.

More recently, a web security company studied the web application attacks launched across the Internet and found that 20 percent of them originated from AWS IP addresses. This comes as no surprise, since public IaaS services can provide a single individual with far more scalable compute and network power than one person could easily harness on their own. As long as public clouds offer impressive distributed computing capabilities to customers, hackers will search for ways to exploit those powers for evil.

In 2017, I expect to see attackers increasingly leverage public IaaS both as a potential attack surface, and as a powerful platform to build their attack networks. It’s highly likely there will be at least one headline-generating cyberattack either targeting, or launched from a public IaaS service next year.

So what can businesses do to protect their IaaS properties from being attacked in 2017?

In short, extend your existing network perimeter security tactics to the public cloud. There are a number of simple best practices I’d recommend to proactively protect your IaaS credentials and business-critical data:
       
·        Properly implement IaaS’s existing access controls: IaaS services like AWS and Azure have built-in security tools you can use to protect your cloud servers in the same way you do physical ones. While cloud services don’t offer Unified Threat Management (UTM) or Next-generation Firewall (NGFW) services, they do have basic stateful firewalls. At the very least, make sure you firewall your cloud servers and only expose the network services you really need to (see the sketch after this list).

·        Use strong authentication or two-factor authentication (2FA) whenever possible: Passwords are not perfect. They can get stolen, or you might accidentally leave them in a Github project, like the victim mentioned above. If you’re only using a password to authenticate to your IaaS service, a lost password gives attackers everything they need to take over your account. However, most public clouds offer two-factor authentication (2FA), where you can pair your password with some other authentication token, such as a secure code delivered to your mobile phone. With 2FA enabled, cybercriminals won’t be able to access your IaaS account even if they compromise your password.

·        Bring your on-premises security to the cloud: Most organizations protect their on-premises servers with UTM and NGFW appliances that combine many different security controls into one easy-to-manage appliance. Luckily, you can now bring these advanced security solutions to IaaS as well. Search your IaaS marketplace for your favorite security solution and you might find it.

·        Check out your IaaS provider’s security best practices: Frankly, there are more security tips and practices for protecting your cloud servers than I can share in one short article. The good news is your favorite IaaS provider may already have you covered; AWS, for instance, publishes a white paper on its security best practices.
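As a concrete, deliberately minimal example of the first bullet above, the boto3 sketch below creates a security group that admits only HTTPS from a known address range, leaving every other port closed; the region, VPC ID and CIDR are placeholder assumptions, not values from the article.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # placeholder region

# Create a security group whose default is "nothing exposed".
sg = ec2.create_security_group(
    GroupName="web-tier-minimal",
    Description="Allow HTTPS only, from a known address range",
    VpcId="vpc-0123456789abcdef0",                    # placeholder VPC ID
)

# Open exactly one port, to exactly one CIDR block.
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24", "Description": "office range (placeholder)"}],
    }],
)
# Everything else stays closed by default; no SSH, RDP, or database ports exposed.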


Business will continue to boom for the IaaS industry. According to the latest market study by International Data Corporation (IDC), worldwide spending on public cloud services is expected to reach upwards of $141 billion by 2019, up from nearly $70 billion last year. With the sustained growth and prevalence of IaaS, organizations need to constantly educate themselves on new ways cybercriminals are leveraging it and focus on effectively extending their network security into the public cloud. 

About the Author

Corey Nachreiner is Chief Technology Officer of WatchGuard Technologies.

Recognized as a thought leader in IT security, Nachreiner spearheads WatchGuard's technology vision and direction. Previously, he was the director of strategy and research at WatchGuard. Nachreiner has operated at the frontline of cyber security for 16 years, and for nearly a decade has been evaluating and making accurate predictions about information security trends. As an authority on network security and internationally quoted commentator, Nachreiner's expertise and ability to dissect complex security topics make him a sought-after speaker at forums such as Gartner, Infosec and RSA. He is also a regular contributor to leading publications including CNET, Dark Reading, eWeek, Help Net Security, Information Week and Infosecurity, and delivers WatchGuard's "Daily Security Byte" video on Facebook.

Monday, December 19, 2016

Predictions 2017: The Age of Collaboration and Interoperability

by Daniel Kurgan, CEO, BICS

2016 proved to be a remarkable year for the telecoms industry. In its crudest form, mobile was previously synonymous with the core voice services provided by operators. And while voice remains one of the biggest revenue streams for MNOs, the digital revolution has sparked a new age for communication. New, branded digital experiences – messaging apps, chat bots and even the Internet of Things – have pervaded mobile devices globally, as consumers expect instant connectivity and seamless services, wherever they are in the world.

As operators and internet service providers make great headway in giving subscribers the ultimate user experience through faster, higher-quality and more cost-effective voice and data services, 2017 – despite its imminent challenges – could prove to be a turning point for the industry as a whole, with new revenue opportunities and partnerships to be reaped by those who have readied themselves to take advantage.

The abolition of EU roaming charges will force significant change in the industry

June 2017 will mark a huge juncture in mobile history as EU regulators abolish roaming charges, allowing subscribers to use their mobile phones abroad as they would at home.

The abolition of roaming charges in Europe for consumers will have a knock-on effect for wholesalers, who will be under increasing pressure to reduce their own fees. However, roaming traffic is expected to grow significantly as more subscribers will be inclined to switch on data roaming as they travel. This surge in usage of mobile voice and data services abroad should offset the impact of the changes in legislation.

The wholesale telecoms sector will experience a period of upheaval as the new measures take effect. However, the blow could be softened by the decision made by EU ministers in December 2016 to cap wholesale rates. Essentially, this will give mobile operators and wholesale carriers time to adjust. Either way, mobile operators will need to adapt if they want to reap the rewards of the industry’s digital transformation.

The Internet of Things will spark new services and revenue streams

In 2017, we will witness more convergence across the telecoms space as service providers develop cross-platform propositions to support content services, and also look to diversify in order to target vertical markets.

The IoT will be a key driver of this transformation, as an increasing number of industries become reliant on mobile to provide the connectivity and infrastructure needed to support IoT implementations. IoT has become integral to the growth of smart homes, smart factories, automated production lines and the emergence of driverless cars.

We have already seen several deployments of global IoT solutions, but in 2017 these are likely to gain traction as more and more equipment manufacturers and enterprises look to embed global connectivity into their devices. Wholesalers will play a key role in bringing disparate players from other markets together, helping to drive collaboration that will support innovation in telecoms.

Increased M&A activity will prompt collaboration to take advantage of industry opportunities

Wholesalers are set to play a crucial role in the development and transformation of the telecoms industry in 2017, acting as a facilitator of new partnerships across the sector.

In 2016, we saw a number of high-profile M&As as major operators looked to enhance their digital propositions to offer content and media services. This was evident in the U.S. with the Verizon-Yahoo deal and the recent move by AT&T to acquire Time Warner. On the consolidation front, the UK incumbent BT acquired EE, the country’s largest mobile operator.

In 2017, we’ll see similar activity again, but on a micro-level, as mobile operators and pure-play telecoms businesses look to partner with digital service providers, cloud communications companies and even fintech companies to embed mobile and rich communications services into their core propositions. In order to capitalise on the opportunities before them, these players will need to partner with wholesalers, retail service providers and vendors to make the connections and share assets and intelligence, so they can develop and roll out new services to new markets and differentiate.

About the Author

Daniel Kurgan was appointed CEO of BICS SA/NV on March 2, 2007, after serving as COO from July 1, 2006. He started his career as Contracts Manager at SABCA (the largest Belgian aerospace company and a subsidiary of the French Dassault Group), where he negotiated and managed major industrial sales, subcontracting and purchase contracts with customers such as Boeing, Airbus, Aerospatiale (EADS) and Asian governments, and suppliers such as GEC Marconi and Litton.

Daniel joined Belgacom’s Carrier Division at the start of the carrier's commercial operations in January 1997, where he held several positions including International Account Manager, Head of International Relations & Sales, Sales Director (domestic and international wholesale) and VP International Wholesale, in charge of Sales & Marketing, Buying & LCR, and Customer Service and Network. In 2005 Daniel was VP Commercial of BICS, and contributed to the spin-off of Belgacom’s international carrier business.

Daniel graduated from the Solvay Business School of the University of Brussels.