
Thursday, December 20, 2018

2019 Network Predictions - The campus becomes hot again

Michael Bushong,  Juniper Networks’ VP of Enterprise and Cloud Marketing 

Network automation will hit the inflection point of the proverbial hockey stick.

Despite years of talking about automation, the vast majority of enterprise operations are still manual, CLI-driven activities. In 2019, adoption will shift from linear to something more aggressive.

This will be driven in part by a general need to automate the network to keep pace with the dynamic application environment that already exists in many enterprises. But the broader DevOps efforts, especially in the cloud arena, will demonstrate what operations could look like outside of the application teams. And enterprises will begin their transformation.

Notably, this means that the automation that emerges will not be the automation that has been talked about for years. Where the last decade has been about removing keystrokes from mundane tasks, the real path forward for network automation will more closely track the Site Reliability Engineering (SRE) movement. Expect to see the rise of Network Reliability Engineering (NRE) in enterprises (a trend that has already started among the major cloud and service provider properties).
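
To make the NRE idea concrete, here is a minimal sketch of the difference between keystroke automation and SRE-style operations: desired state is declared as data and continuously reconciled against what the network reports. The controller URL, API path, and response field are hypothetical placeholders for whatever your platform actually exposes.

```python
# Minimal sketch of an SRE/NRE-style closed-loop check (assumed REST API).
# Requires: pip install requests
import requests

# Declared intent: every listed device should have this many BGP peers up.
DESIRED_STATE = {
    "edge-router-1": {"bgp_peers_established": 4},
    "edge-router-2": {"bgp_peers_established": 4},
}

def measured_state(device):
    # Hypothetical controller endpoint; substitute your platform's real API.
    resp = requests.get(
        f"https://netctl.example.com/api/v1/devices/{device}/bgp", timeout=5)
    resp.raise_for_status()
    return {"bgp_peers_established": resp.json()["established_peer_count"]}

def reconcile():
    """Compare desired vs. measured state; return human-readable drift."""
    drift = []
    for device, desired in DESIRED_STATE.items():
        actual = measured_state(device)
        if actual != desired:
            drift.append(f"{device}: want {desired}, got {actual}")
    return drift

if __name__ == "__main__":
    for line in reconcile():
        print("DRIFT:", line)
```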

Open source will be more than an alternative business model.

As open source continues to climb in importance in the IT supply chain, enterprises will begin to develop stronger open source policies. These will cover everything from procurement practices (which partners will be involved, and how will support be handled?) to supply chain security (how do you secure the supply chain if no one is inspecting it?).

Enterprises outside of the major open source and cloud players will begin to treat open source as just another route to market, implementing appropriate controls, checks, and balances to ensure that products are robust, support is available, and security is more than a hope.

SD-WAN will begin to yield to SD-Enterprise.

It’s not that SD-WAN will become less important in 2019, but as the industry starts applying the principles of SD-WAN more broadly, SD-WAN will start its evolution to SD-Enterprise. Cloud management and intelligent routing across the WAN can be transformative for more than the subset of products currently in market. As the campus moves in this direction, it seems inevitable that the concept will broaden.

Campus becomes hot again

A few years ago, data center was all the rage. More recently, SD-WAN has revitalized the branch. In 2019, expect campus networking to be in vogue again. Driven by some of the same technologies (SDN, SD-WAN, intent-based networking, and so on), the campus will go through a similar transformation. Vendors have retooled their portfolios in preparation, and most market forecasts showed campus shifting from slow decline to slight growth this year. That trend should continue.

Notably, the embrace of software as the primary vehicle for delivering value also means that the days of refresh cycles being on the order of 5-to-7 years will likely come to an end as well. This should stoke competition in a market that, frankly, has looked more like a monopoly than a vibrant ecosystem at times over the last decade. Times, they are a-changin’.

Ecosystems will replace vertical suppliers

For decades, the networking space has been dominated by large, vertically-integrated stacks. With the rise of cloud and multicloud forcing multi-vendor integration from an operations perspective, it would seem that the vertical approach to the market will begin to give way to an ecosystem strategy.

Importantly, that ecosystem will bring suppliers together that span all of compute, storage, networking, and even applications. Where the past was led by a well-known set of incumbents, suppliers like Nutanix with their hybrid and multicloud solutions and Red Hat (now part of IBM) with their orchestration solutions will take on more prominent roles. This will chip away at the incumbent routes to market, which will begin a one-way move towards a more diverse solutions environment.


2019 Network Predictions - 5G just can’t ‘contain’ itself

by John English, Director of Marketing, Service Provider Solutions, NETSCOUT

5G just can’t ‘contain’ itself 

In 2019, as virtualized network architectures are rapidly adopted to support 5G, we expect to see containers emerge as the de facto platform for running new applications and workloads.

The excitement around 5G is building as we hear more news about network deployments, trials and handsets. However, one 5G-related issue that hasn’t yet been crystallized is what form 5G software and innovations will take, and how these new services and applications will be deployed into the network. Unlike 4G/LTE network infrastructure, the architectures that support 5G are virtualized and cloud-based, so the smart money is on application developers, mobile operators and equipment vendors using microservices, and in particular containers, to drive 5G evolution.

It makes sense to use containers to support 5G: they give operators a flexible, easier-to-use platform to build, test and deploy applications, and one that is also becoming more secure. This is vital for the development of 5G services at a time when the use cases for 5G are still being defined. Operators will need to be in a position to spin up services as and when needed to support different use cases; with containers, it will be possible to serve customers quickly and efficiently.
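
As a rough illustration of that spin-up-on-demand pattern (not a real 5G network function), the following sketch uses the Docker SDK for Python to launch a containerized service and retire it when the use case no longer needs it; the image name and port are placeholders.

```python
# Illustrative only: launch and retire a containerized service on demand.
# Requires: pip install docker (and a running Docker daemon)
import docker

client = docker.from_env()

# Spin up a service when a use case calls for it...
container = client.containers.run(
    "example/edge-service:latest",  # placeholder image
    detach=True,
    ports={"8080/tcp": 8080},
)
print("started", container.short_id)

# ...and tear it down as soon as it is no longer needed.
container.stop()
container.remove()
```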

Another key aspect is the need to deliver services and applications closer to the end user by utilizing mobile edge computing. This is integral to ensuring the low latency and high bandwidth associated with 5G and will support use cases across a wide range of verticals, including transport, manufacturing and healthcare. However, flexible architectures will be required to support this type of infrastructure throughout hybrid cloud and virtualized environments. As operators move network infrastructure to the edge, the use of containers will become pivotal to supporting 5G applications.

The use of microservices and containers will increase during 2019 as operators ramp up their 5G propositions. For all their advantages, they will also add a new layer of complexity, and carriers will need clear visibility across their IT infrastructure if they are going to make a success of 5G.

5G will drive virtualization in 2019 

Momentum is building behind 5G. The US and South Korea are leading the charge with the rollout of the first commercial networks; trials are taking place in every major market worldwide; and Verizon and Samsung have just announced plans to launch a 5G handset in early 2019. Expectations for 5G are high – the next-generation mobile standard will underpin mission-critical processes and innovations, including telemedicine, remote surgery and even driverless cars. However, vast sums of money will need to be spent on network infrastructure before any of this can happen, and it's the mobile and fixed carriers who will be expected to foot the bill. This is compounded by the fact that many of the aforementioned 5G use cases have yet to be defined, so carriers are being asked to gamble on an uncertain future.

So, what will the 5G future look like and what will it take to get us there?

One thing is for certain - 5G will drive network virtualization. In 2019, we will see an increasing number of carriers committing to deploying virtualized network infrastructure to support 5G applications and services. Without virtualization, it will be ‘virtually’ impossible to deliver 5G. This is because 5G requires virtualization both at the network core, and critically at the network edge. Puns aside, the days of building networks to support a single use case, such as mobile voice and data, or home broadband, are behind us. If 5G is to become a reality, then the networks of the future will need to be smart and automated, with the ability to switch between different functions to support a range of use cases.

However, moving from the physical world to the virtual world is no mean feat. Carriers are now discovering that their already complex networks are becoming even more so, as they replicate existing functions and create new ones in a virtualized environment. Wholesale migrations aren’t possible either, so carriers are having to get to grips with managing their new virtual networks alongside earlier generations of mobile and fixed technologies. Despite these challenges, 5G will undoubtedly accelerate the virtualization process. Subsequently, no-one will want to be left behind and we will see greater competition emerge between carriers as they commit funds and resources to building out their virtualized network infrastructures.

To justify this spend, and to tackle the challenges that lie ahead, carriers will require smart visibility into their constantly evolving network architectures. Virtual probes that produce smart data, supported by intelligent tools, offer much-needed visibility into the performance of these new networks and the services they support. The invaluable knowledge they provide will be absolutely critical for carriers as they accelerate their use of virtualized infrastructure to successfully deploy 5G.

2019 Network Predictions - Operators must ‘scale or fail’ for 5G

by Heather Broughton, Sr. Director of Service Provider Marketing, NETSCOUT

Operators will ‘scale or fail’ to meet the 5G demand in 2019

5G will be faster, smarter and more efficient than 4G, but in order to meet demand and to support new architectures, networks will have to scale. While most of the scale in the core network will be cloud and software-based, there will still be a need for hardware and equipment at the network edge, and in a 5G environment there will be a lot more equipment. In fact, the number of cell sites will increase dramatically to support and propagate the higher frequency bands that will transmit 5G data traffic over the air. This is when network management tools will come into their own. In 2019 we will see the deployment of automated networks driven by software, and controlled by virtual machines and artificial intelligence.

Network automation and orchestration are by-products of virtualization and will add another layer of complexity. However, they are also integral to the rollout and sustainability of 5G networks, particularly as network topologies change to accommodate a combination of small cell and macro cell sites. Small cells in particular will form the bulk of the new RAN (radio access network) and are expected to triple the number of cell sites in cellular networks.

If network engineers think they have enough issues to deal with today maintaining 4G/LTE networks, they may be in for a shock as 5G networks are gradually rolled out. Without total visibility of these more complex and expansive networks, 5G in the RAN is going to become extremely difficult to manage. If the number of cells were to double or triple, network engineering teams would not only need full confidence in their network management tools to make sure the network is running optimally, but would also be faced with one heck of a job troubleshooting hundreds, potentially even thousands, of cells if an issue arose.

In 2019, carriers will be scrutinising costs per cell site as they look to invest in new infrastructure. They will look to offset any costs by implementing intelligent and automated systems that can support 5G networks. However, carriers need assurances that these systems are providing them with the right information about the uptime and performance of their new networks. The only way to achieve this will be to have complete visibility of these complex new architectures. Having a window into this multi-layered and virtualized environment, and being able to extract smart data in near real-time, will be essential for the ongoing management of new 5G networks.

2019 - The year carriers get to grips with 5G security

The benefits of 5G are clear; the new communications standard will offer carriers and their enterprise customers faster network speeds and performance, ultra-low latency and greater efficiencies. General discussion around carrier trials and deployments tends to focus on increased speeds and the new innovations that 5G will enable, but security rarely comes up. That’s all about to change with 5G security set to become a big issue for the industry and a major talking point in 2019.

To date, it appears that 5G security has almost been treated as an afterthought, rather than a critical aspect of network development. However, behind the scenes this is an issue that the carriers take very seriously. The situation for carriers has altered dramatically, because in a 5G domain, the attack surface becomes much greater. Consequently, the number of opportunities for malicious players to exploit vulnerabilities increases. This is partly due to the adoption of virtualized network infrastructures that will allow carriers to scale and meet the demands of 5G, but also because 5G networks will be configured to support a wide variety of industrial and business use cases. This means that going forward, carriers will be responsible for managing mission-critical systems and devices, in addition to handling high volumes of sensitive data. In a 5G environment, there will be a strong emphasis on securing smart factories, automated production lines and fleets of driverless cars.

The network security stakes get a lot higher

As new 5G network architectures are based on virtualization and distributed cloud models, and a containerized environment to support workloads and applications, it’s apparent that carriers have to deal with a whole new set of complexities. Existing security protocols will need to be scrapped and replaced with robust systems and procedures that account for this new, complex environment and the burgeoning 5G value chain, which includes application developers, device manufacturers, cloud service providers and the carriers themselves. A new built-in resilience is required to limit the attack surface and to reduce the risk of malicious attacks and perimeter breaches. A pervasive security model that offers comprehensive insight into both service performance management and security is the best way to address 5G security. It enables service providers to extract ‘smart data’ that is collected and processed at the source across legacy, virtual and hybrid cloud environments. It’s the closest carriers and their customers will ever get to implementing ‘holistic security’ across their entire IT estate.

Wednesday, December 19, 2018

2019 Network Predictions

by Angelique Medina, senior product market manager, ThousandEyes

2018 has seen the acceleration of modern infrastructure from public cloud, SaaS, hybrid and SD-WAN. 2019 will see enterprises feeling the impact of this dramatic shift more than ever.

Internet unpredictability impacts become more visible as SD-WAN projects spread and mature

SD-WAN adoption is on the rise, and with it, the enterprise’s growing dependence on the Internet. Before moving to SD-WAN, most enterprises only had to worry about Internet performance from their data centers to key services. With SD-WAN, they’re increasingly leveraging DIA and broadband connectivity and grappling with hundreds or thousands of sites, each of which will have distinct Internet paths to many different cloud-based services. Shifting from a carrier-managed service to the Internet means that there’s an exponential rise in the number of service providers that can potentially impact performance for branch office users. As a large number of enterprises move from deployment into their operations stage in 2019, the impact of Internet unpredictability will become more evident. As a result, more enterprise IT teams will start to develop operational capabilities to deal with Internet-centric issues.
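
A simple way to see what those Internet-centric operational capabilities involve is to measure connection times from a branch toward the SaaS endpoints its users depend on. Below is a minimal sketch using only the Python standard library; the endpoints are examples, and a real SD-WAN deployment would probe per path and per site.

```python
# Minimal branch-side latency probe toward cloud services (stdlib only).
import socket
import time

SAAS_ENDPOINTS = [("outlook.office365.com", 443), ("www.salesforce.com", 443)]

def tcp_connect_ms(host, port, timeout=3.0):
    """Time a TCP handshake, a rough proxy for network path latency."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.perf_counter() - start) * 1000.0

for host, port in SAAS_ENDPOINTS:
    try:
        print(f"{host}:{port} -> {tcp_connect_ms(host, port):.1f} ms")
    except OSError as exc:
        print(f"{host}:{port} -> unreachable ({exc})")
```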

Digital experience will confront the weight of backend multiplicity

Enterprises and SaaS providers are increasingly leveraging third-party APIs and cloud-services as part of their web and application architectures. This distributed, microservices approach to building applications not only provides best-of-breed functions, it enables companies to quickly consume and deliver new services. Applications today might leverage dozens of APIs to handle services such as messaging and voice, maps, and payments, while also connecting to cloud-based services such as CRM, ERP and analytics. Websites are also getting weighed down by the addition of many externally hosted applications. Even a seemingly simple “Buy Now” function on an ecommerce site will invoke many external services, including payment gateways, CRM, analytics, inventory, fulfillment, and potentially many others.

The weight of all of these external dependencies means that websites are going to continue to get slower, while at the same time their risk surface increases. Since these services are not internally operated, isolating the source of a problem when something goes wrong can be challenging, particularly since these services are connected to over the Internet. The question of whether the application or the network is at fault will become “Which application?” and “Which network?”.
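
One practical answer to “Which application?” and “Which network?” is to time each external dependency independently, so a slow or failing third party stands out. A hedged sketch, with hypothetical dependency URLs standing in for real payment, CRM and analytics endpoints:

```python
# Time each external dependency concurrently to isolate the slow one.
from concurrent.futures import ThreadPoolExecutor
import time
import urllib.request

DEPENDENCIES = {  # hypothetical third-party endpoints
    "payments": "https://api.payments.example/health",
    "crm": "https://api.crm.example/health",
    "analytics": "https://api.analytics.example/health",
}

def timed_fetch(item):
    name, url = item
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            status = resp.status
    except OSError as exc:
        return name, None, str(exc)
    return name, (time.perf_counter() - start) * 1000.0, status

with ThreadPoolExecutor() as pool:
    for name, ms, detail in pool.map(timed_fetch, DEPENDENCIES.items()):
        print(f"{name}: FAILED ({detail})" if ms is None
              else f"{name}: {ms:.1f} ms (HTTP {detail})")
```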

Understanding the tradeoff of function over user experience and knowing how every third-party web or app component impacts performance will get even more critical to enterprises and SaaS providers in 2019.

Fragmentation, not bifurcation of the Internet

Eric Schmidt, former CEO of Google, famously predicted that the Internet would bifurcate into a US-led Internet and a Chinese-led Internet by 2028. While we still have plenty of time to see how this prediction plays out, in the near term, the Internet is shifting towards fragmentation. Multiple nation states, including Iran, Turkey, Saudi Arabia, and Russia, have joined China in creating a walled-off Internet, using a variety of technical, social, and political techniques. As more countries pursue nationalist agendas and choose to opt out of regional or global alignments, we will see increasing Internet fragmentation. This will initially take the form of politically-motivated censorship, but will expand to include the broader curation of connectivity based on politically-prescribed social and cultural norms.

Hybrid starts tilting to the cloud

While the data center will continue to lose ground in favor of cloud, enterprises still early in their cloud journey or who have special security or regulatory constraints will keep hybrid cloud alive. To extend their reach into the enterprise data center, public cloud providers have begun offering on-premises solutions, featuring greater agility, favorable economics, and a single pane of glass for management. Azure Stack, while still in its early days, already has announced customer deployments, and the newly announced AWS Outposts (scheduled for release in the second half of 2019) has the potential to be highly disruptive to the data center landscape.

2019 will see an increased tilting of hybrid towards public cloud providers, though a lack of maturity may cause an initial freezing of the hybrid market, particularly among AWS customers, who will want to weigh an AWS offering against existing providers once it is commercially available.

The Edge gets less “edgy”

Early edge architectures, where the data of billions of IoT devices is notionally processed at central points by infrastructure in public cloud or private data centers, presented challenges, ranging from security to physics (increased latency) and cost (bandwidth). The introduction of intermediary nodes into edge architectures will address the latency and security concerns of a strictly core/edge architecture, moving edge deployments in 2019 from largely theoretical to realizable.

Intermediary nodes are designed to perform some of the processing functions of the cloud closer to the edge, which will help ensure better performance and scale for users and devices and help drive IoT and edge deployments. These nodes are already available from a variety of vendors, including public cloud providers such as Microsoft, which has previously stated that it wants its Azure cloud data centers to be within 50ms of everywhere. These new intermediary nodes will help extend the reach of cloud-centric infrastructure to the range of single-digit milliseconds and make IoT and edge computing aspirations a reality.
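
A back-of-envelope latency budget shows why intermediary nodes are needed for single-digit milliseconds: light in fiber covers roughly 200 km per millisecond, so the round-trip budget caps how far away a serving node can sit. The sketch below treats processing overhead as a fixed assumption and ignores routing detours, so the distances are generous upper bounds.

```python
# Rough latency budget: how far away can a serving node be?
FIBER_KM_PER_MS = 200.0  # ~ speed of light / fiber refractive index (~1.5)

def max_distance_km(rtt_budget_ms, overhead_ms=1.0):
    # Half the remaining round-trip budget covers the one-way distance.
    return max(0.0, (rtt_budget_ms - overhead_ms) / 2.0) * FIBER_KM_PER_MS

for budget in (50.0, 9.0, 5.0):
    print(f"{budget:>4.0f} ms RTT budget -> node within ~{max_distance_km(budget):,.0f} km")
```

At a 50ms budget, a regional cloud data center suffices; at 5ms, the node must sit within a few hundred kilometers of the user, which is exactly the gap intermediary nodes fill.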

Cyber attacks focus on foundational Internet systems for maximum effect

The pervasive risk associated with offering a digital service has forced most large enterprises and digital businesses to employ sophisticated systems of defense. These systems are designed to handle increasingly large-scale attacks, such as the one launched against GitHub earlier this year. That attack was the largest ever recorded, and although it was disruptive, it was successfully mitigated through a highly elastic cloud-based DDoS protection service called Prolexic. This and other tools make an impactful attack against a high-value target more challenging to pull off, which may be one reason why the number of DDoS attacks is trending downward, particularly in North America and Europe. This doesn’t mean that cyber attacks are going away. Cyber attacks will continue to make headlines in 2019, but they will largely take an indirect approach, exploiting relational weaknesses in foundational Internet systems, such as DNS and BGP routing.

Two incidents this year, one malicious, the other unintentional, underscored the vulnerability of even the most sophisticated digital businesses to service disruption. In the case of the malicious incident, Amazon’s DNS service, Route 53, was hijacked, which enabled a cryptocurrency theft and led to many customer sites, including Instagram and CNN, becoming partially unreachable. The attackers who pulled off this digital hijacking and robbery made no attempt to penetrate Amazon’s infrastructure. Instead, they compromised a small Internet Service Provider in Columbus, Ohio, using it to propagate false routes to Amazon’s DNS service. The implicit trust built into Internet routing allowed this attack to take place. The fact that the hijacked service (translating domain names into IP addresses) is a critical dependency meant that the impact was massive and went far beyond the intended target.

Indirect attacks, taking advantage of critical dependencies outside of the control of the intended target, will continue to grow in 2019, netting more high-profile victims while maximizing the scope of collateral damage.
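
Part of the defense against such indirect attacks is external monitoring of your own critical dependencies. A minimal sketch: resolve a critical hostname from the outside and alert when the answers fall outside address blocks you expect. The hostname and prefix here are placeholders; production monitoring would check from many vantage points.

```python
# Detect suspicious DNS answers for a critical name (stdlib only).
import ipaddress
import socket

EXPECTED_PREFIXES = [ipaddress.ip_network("203.0.113.0/24")]  # placeholder

def resolved_addresses(hostname):
    infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
    return {info[4][0] for info in infos}

def check(hostname):
    alerts = []
    for addr in resolved_addresses(hostname):
        ip = ipaddress.ip_address(addr)
        if not any(ip in net for net in EXPECTED_PREFIXES):
            alerts.append(f"{hostname} resolved to unexpected address {addr}")
    return alerts

for alert in check("service.example.com"):
    print("ALERT:", alert)
```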

The operational impact of cloud adoption pushes enterprises to reexamine their management stack mix

Now that SaaS has mainstreamed, with most enterprises shifting their application consumption model from internal to the cloud, we can expect to see a follow-on shift in IT operations stacks in the coming year, as more enterprises begin to realize that the existing toolset is not oriented to address externally-hosted applications.

The traditional IT operations stack is rich with tools, but as the usage of SaaS applications and cloud-based services has increased, the domain of many of these tools is narrowing, exposing gaps in visibility for SaaS applications and their delivery over the Internet. Network tools that collect data from on-premises infrastructure will see a reduction in usage and budget allocation, making room for cloud-specific tools and technologies designed to provide visibility into networks and services that enterprises rely on (such as ISPs and SaaS apps) but that they do not own or control. This new operations stack will continue to feature traditional toolsets, but its proportional emphasis will favor cloud-focused technologies.

Monday, December 17, 2018

2019 Network Predictions

Bill Fenick, VP of enterprise at Interxion

Enterprises will be smarter about the cloud
The cloud has quickly become a mainstay in the enterprise. However, early on, many businesses dove into the cloud head first and quickly realized that not all apps are meant to be reengineered for the cloud, and that even a lift-and-shift approach doesn’t always work. Because of this, in 2019, I believe that while enterprises will continue to adopt cloud in an even more ferocious way, they’ll do it with a better layer of intelligence on top.

Artificial Intelligence will drive cloud adoption
As companies increasingly integrate a variety of AI-driven technologies across voice, vision, language and machine learning in order to transform their businesses and get the competitive edge in 2019, I believe they will be leveraging cloud technologies as a matter of course.

Location is becoming more important to enterprises
Today’s enterprises have the need for speed. Regardless of it being application to application or application to end user, businesses need data to move faster than ever before. As a result, in 2019 I expect enterprises to pay closer attention to the location of their data, whether that’s the location in proximity to other data sources including the cloud, or geographic location.

Sally Bament, VP of Service Provider Marketing, Juniper Networks

5G will create a new billion-dollar app economy
The first smartphones and eventually LTE networks paved the way for mobile apps as we know them, giving rise to a multitude of new ways companies interact with customers. 5G is poised to go live in many cities across the United States and globally in 2019, and we expect next year to really showcase the economic power of the new mobile technology. This is the year apps start to show their real value in the enterprise and industrial space with a host of new IoT, AR/VR, digital twins and connected-car applications coming to life.

Two separate high-profile cybersecurity breaches will hit critical U.S. infrastructure
The increasing number of distributed applications and the volume of data deployed across various cloud environments will invite more sophisticated breaches. In the year ahead, we will likely see major attacks on systems of livelihood, including utility systems, municipal water supplies and electrical grids. Predictive analytics and end-to-end monitoring are necessary tools to thwart catastrophic structural attacks.

Expect more frenemies in the edge
The hyperscale cloud players have clearly demonstrated the power of their massive networks in terms of application hosting and development. But it’s the telcos that have the beachfront property: established network infrastructure that’s closest to end users. Cloud providers will try to build an edge of their own, but service providers will remain keepers of the edge, as they can compete with much better economies of scale. Over the next year, service providers and cloud providers will compete to win the edge, but expect more cloud-SP partnerships to unfold as the year progresses.

Automation is the secret to customer satisfaction
In 2019, automation will be the differentiating factor among service providers. Early software and virtualization technology have provided some relief from stagnant development, but this year service providers will fully adopt automated and virtualized cloud platforms that can deploy new services in months, not years. Those who fail to implement automation will find themselves years behind competitors, as end users will find more agility and better service with those who embrace automation.

Dave Wright, President of the CBRS Alliance

Expect commercial launch in the 3.5 GHz CBRS Band -- Earlier this year, the FCC announced plans for the launch of commercial services in the CBRS 3.5 GHz band, a wide swath of lightly-used spectrum that currently has U.S. Department of Defense systems as its primary user. This new opportunity is enabled through the use of a dynamic sharing mechanism which protects the incumbent government operations while allowing new commercial services. OnGo solutions for the band will offer secure, cost-effective connectivity in the places it is needed most, and at a fraction of the cost that has historically been associated with cellular technologies. There is universal agreement that mid-band spectrum will be critical for next-generation wireless services, and CBRS is the first mid-band spectrum being made available in the US.

Organizations – including existing mobile, fixed wireless, and cable operators, as well as enterprises and industrial players – are already laying the groundwork for deployment. Testing and certification programs for equipment and devices operating within the 3.5 GHz band are well underway – with a number of radio infrastructure and client devices now authorized by the FCC. The testing of the dynamic sharing databases (Spectrum Access Systems, or SASs) is also well underway. The industry is ready for commercial deployment, and 2019 will be the year of improved wireless coverage and capacity on a massive scale.

Jon Toor, CMO, Cloudian

There’s No Place Like Home: Cloud Repatriation Increases: While the growth of the public cloud will remain strong, enterprises will expand their adoption of on-premises private clouds in a hybrid cloud model. This will include repatriating data from the public cloud to avoid the bandwidth, latency and cost issues that can arise when accessing such data.

Two Clouds Are Better Than One: More enterprises will adopt a multi-cloud strategy to avoid vendor lock-in and enhance their business flexibility. However, a multi-cloud approach raises new management challenges that users will need to address to ensure a positive experience.

Object Storage: Ready, Camera, Production: Moving beyond its traditional use for large-scale archiving, object storage will play an increasing role in video production workflows. Offering a combination of limitless scalability, S3 compatibility and tremendous durability, object storage provides an ideal platform for managing video content, including over-the-top (OTT) distribution.

What Do You Get When You Mix Blue and Red?: IBM-Red Hat Deal Scrambles the Cloud Landscape – IBM’s acquisition of Red Hat will reverberate throughout 2019, giving enterprises more options for designing a multi-cloud strategy and highlighting the importance of data management tools that can work across public cloud, private cloud and traditional on-premises environments.

AI and Object Storage Play Tag: As businesses increase their use of AI to extract greater value from their digital assets, metadata tagging will become an even more critical element of enterprise storage. This will bring more attention to object storage, which is centered on metadata, and the key will be integrating well with AI tools.
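
For a sense of what metadata-centric object storage looks like in practice, here is a hedged sketch using boto3 against an S3-compatible endpoint (Cloudian and similar stores speak the same S3 API). The bucket, key, endpoint and tag values are illustrative.

```python
# Attach AI-produced metadata and tags to an object (S3-compatible API).
# Requires: pip install boto3
import boto3

s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")

# Store an asset with descriptive metadata an AI pipeline might emit.
s3.put_object(
    Bucket="media-archive",
    Key="clips/interview-001.mp4",
    Body=b"...video bytes...",
    Metadata={"speaker": "jane-doe", "topic": "5g", "language": "en"},
)

# Tags can be updated later without rewriting the object itself.
s3.put_object_tagging(
    Bucket="media-archive",
    Key="clips/interview-001.mp4",
    Tagging={"TagSet": [{"Key": "reviewed", "Value": "false"}]},
)
```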

Cloud Foundry Foundation’s Executive Director Abby Kearns and CTO Chip Childers

Consolidation will continue: Based on 2018’s acquisitions, we predict we’ll see a steady rollout of acquisitions in the next 12-18 months, as major enterprise tech companies rush to get a piece of the latest innovations. Shuffling in executive leadership at certain large companies is a telltale sign that acquisition opportunities will be used to grow business more rapidly. Consolidation around a specific technology is bound to happen, with the market solidifying around that tech.

Multi-platform will be the new normal: A majority of the market believes containers must be the solution to digital transformation, but in 2019, they’ll realize containers are just a tool -- not a silver bullet. We’re already seeing that companies are more broadly deploying a combination of technologies like PaaS and containers in tandem, a finding we published in a report earlier this year. 2019 will be the year enterprises begin to embrace this versatility and see the flexibility, scalability and interoperability in a multi-platform solution.

People and process are more important than technology: In 2019, companies are going to realize the people on their teams matter more than anything. Reskilling their workforces is going to become essential to business success. Technology is evolving at the same rate as training, so most people with today’s desired skill sets are already employed. Organizations that build continuous learning cycles into their business model and upskill their employees will keep themselves ahead of the curve.

FaaS adoption will continue momentum: FaaS is a serverless technology that has already been rapidly adopted as glue code, and that will continue. However, its function as a productive way to build business applications is only beginning to take off. We will see the beginnings of an explosion of developer frameworks built on top of serverless systems. This type of tooling makes it easier to work with and build things with FaaS, so it becomes a self-perpetuating cycle.
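
A typical example of FaaS as glue code, sketched as an AWS Lambda handler that reacts to an S3 upload event and notifies a downstream service; the webhook URL is a hypothetical placeholder.

```python
# Glue-code FaaS sketch: forward S3 upload events to a downstream service.
import json
import urllib.request

def handler(event, context):
    # Each record in an S3 event describes one newly created object.
    for record in event.get("Records", []):
        payload = json.dumps({
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }).encode()
        req = urllib.request.Request(
            "https://hooks.example.com/new-object",  # placeholder endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=5)
    return {"statusCode": 200}
```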

Eyes to the east: Together, we’ve been to China seven separate times this year, and we are astonished at the pace of technological advancement there. With special interest in Artificial Intelligence, China is moving at lightning speed. In 2019, there will be global impact as China’s advancement pushes other regions to hasten their own development.

Scaling up, and quick: We’re seeing the momentum of scale steadily speed up as a result of continued enterprise adoption of cloud technologies. As the technologies mature and are integrated into cloud solutions, enterprises grow more familiar with them, gain trust in their value and increase adoption. It’s a virtuous cycle, which we wrote about in the Foundation’s latest research report, and it’s only going to start spinning faster in 2019.

Culture matters: We’ve said it before and we’ll say it again: Your people and your processes are more important than your technology. In our most recent research, we found that nearly 50 percent of organizations believe culture change is a bigger obstacle than the technology itself. The shift to digital has to happen within your organization, and that means with your people. In 2019, companies are going to prioritize a new culture that emphasizes agile, integrative, inclusive workflow. It’s just another way the cloud market is restructuring.

Wednesday, December 20, 2017

Top 5 Container Predictions for 2018

by David Messina, CMO, Docker

Prediction #1: The next big security breach will be foiled by containers

As we witnessed with the Equifax breach in early September, data breaches can place personal data at risk and in doing so, erode consumer confidence. But what if you could prevent a major breach by simply placing the software in a container? The Equifax breach occurred when a piece of web software was vulnerable and exposed to hackers. Containers act to reduce the attack surface available for exploitation, and in doing so greatly increase the difficulty and minimize the possibility of many forms of compromise. In many cases, simple steps like using read-only containers will fully mitigate a broad range of attack vectors.
From being ephemeral and isolated in nature to enabling frequent patching and scanning against the latest CVEs, containers are vital to securing the software supply chain. Containers will be more widely relied upon in the coming year to combat future threats.
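
To make the read-only point concrete, here is a hedged sketch of that hardening step using the Docker SDK for Python; the image is a placeholder, and the equivalent docker run flags are --read-only, --cap-drop ALL and --tmpfs.

```python
# Run a service with a read-only root filesystem and minimal capabilities.
# Requires: pip install docker (and a running Docker daemon)
import docker

client = docker.from_env()

container = client.containers.run(
    "example/web-app:latest",     # placeholder image
    detach=True,
    read_only=True,               # root filesystem cannot be modified
    cap_drop=["ALL"],             # drop Linux capabilities the app doesn't need
    tmpfs={"/tmp": "size=64m"},   # writable scratch space only where required
)
print("hardened container:", container.short_id)
```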

Prediction #2: Complexity and time to market will thwart PaaS adoption

As calls for accelerated cloud strategies only get louder across the Global 10K, it's becoming increasingly clear that outdated Platform as a Service (PaaS) frameworks are not equipped to handle the demand of managing all of the applications that are part of today’s modern enterprise. For the past few years, utilizing PaaS has been considered a cutting-edge approach to migrating your apps to the cloud. What is often overlooked is the time required to set up PaaS frameworks, retrain employees and re-code each application - efforts that can take a year to drive and complete. In 2018, we expect to see PaaS adoption stall as enterprises recognize the time to value is too prolonged for the current and future pace of business. This will give way to accelerated Container as a Service (CaaS) platform adoption as enterprises look to migrate more workloads to the cloud while achieving greater agility, innovation, and cost-efficiencies.

Prediction #3: Containers will break the 80/20 Rule for IT budgeting 

It’s widely understood that CIOs typically commit 80% of their budget towards maintenance with only 20% left for innovation - a major roadblock in the path to digital transformation. We expect this to change in 2018 as CIOs rewrite the 80/20 rule in favor of innovation by unlocking new methods for managing and modernizing their legacy apps. In the past, application modernization required refactoring apps, ripping/replacing existing infrastructure and implementing new processes. Instead, enterprises are now using containerization for meaningful application modernization results in days. Organizations will reap the benefits of cloud portability and security while using the significant cost-efficiencies to reinvest their savings in more strategic digitization efforts.

Prediction #4: Security, not orchestration, will write the next chapter of containerization

2016 and yes, even some of 2017, might have been about the orchestration wars, but now that companies like Docker offer a choice of orchestration, some might argue that orchestration has been largely commoditized. With container adoption expected to grow into a nearly $3 billion market by 2020 according to 451 Research, and Docker itself experiencing more than one billion downloads bi-weekly, security will be the next frontier that companies need to address. Ironically, the threats will come from the applications themselves, making “container boundaries” an imperative for segmenting and isolating threats. The container boundary can also make it more difficult for an attacker to get data out, increasing the chance of detection. Securing the software supply chain will be paramount to safeguarding the application journey.

Prediction #5: CIOs will accelerate plans for digital transformation with containers

Although “digital transformation” has become somewhat of a buzzword as of late, enterprises certainly accept the idea behind it - and with a greater sense of urgency. According to Gartner, as many as two-thirds of business leaders are concerned that their companies aren’t moving fast enough on the digital transformation front, leading to potential competitive disadvantages. In 2018, CIOs will increasingly feel the pressure to speed up digitization efforts and will accelerate their journey through containers. As businesses build out and implement strategies around cloud migration, DevOps and microservices, containers will play an increasingly important role in achieving these initiatives. By Dockerizing their applications, our enterprise customers have experienced the immediate benefits of digital transformation: faster app delivery times, portability across environments, hardened security and more.

Monday, January 9, 2017

Forecast for 2017? Cloudy

by Lori MacVittie, Technology Evangelist, F5 Networks

In 2016, IT professionals saw major shifts in the cloud computing industry, from developing more sophisticated approaches to application delivery to discovering the vulnerabilities of connected IoT devices. Enterprises continue to face increasing and entirely new security threats and availability challenges as they migrate to private, public and multi-cloud systems, which is causing organizations to rethink their infrastructures. As we inch toward the end of the year, F5 Networks predicts the key changes we can expect to see in the cloud computing landscape in 2017.

IT’s MVP of 2017? Cloud architects

With more enterprises adopting diverse cloud solutions, the role of cloud architects will become increasingly important. The IT professionals who will hold the most valuable positions in an IT organization are those with the skills to define criteria for, and manage, complex cloud architectures.

Multi-cloud is the new normal in 2017

Over the next year, enterprises will continue to seek ways to avoid public cloud lock-in, relying on multi-cloud strategies to do so. They will aim to regain leverage over cloud providers, moving toward a model where they can pick and choose various services from multiple providers that are most optimal to their business needs.

Organizations will finally realize the full potential of the cloud

Companies are now understanding they can use the cloud for more than just finding efficiency and cost savings as part of their existing strategies and ways of doing business. 2017 will provide a tipping point for companies to invest in the cloud to enable entirely new scenarios, spurred by things like big data and machine learning that will transform how they do business in the future.

The increasing sophistication of cyber attacks will put more emphasis on private cloud

While enterprises trust public cloud providers to host many of their apps, the lack of visibility into the data generated by those apps causes concerns about security. This means more enterprises will look to private cloud solutions. Public cloud deployments won’t be able to truly accelerate until companies feel comfortable enough with consistency of security policy and identity management.

More devices – More problems: In 2017, public cloud will become too expensive for IoT

Businesses typically think of public cloud as the cheaper business solution for their data center needs, yet they often forget that things like bandwidth and security services come at an extra cost. IoT devices generate vast amounts of data and as sensors are installed into more and more places, this data will continue to grow exponentially. This year, enterprises will put more IoT applications in their private clouds, that is, until public cloud providers develop economical solutions to manage the huge amounts of data these apps produce.

The conversation around apps will finally go beyond the “where?”

IT professionals constantly underestimate the cost, time and pain of stretching solutions up or down the stack. We’ve seen this with OpenStack, and we’ll see it with Docker. This year, cloud migration and containers will reach a point where customers won’t be able to just think about where they want to move apps; they’ll need to think about the identity tools needed for secure authentication and authorization, how to protect against data loss from microservices and SaaS apps, and how to collect and analyze data across all infrastructure services quickly.

A new standard for cloud providers is in motion and this year will see major developments in not only reconsidering the value of enterprise cloud, but also modifying cloud strategy to fully extend enterprise offerings and data security. Evaluating the risks of cloud migration and management has never been as vital to a company’s stability as it is now. Over the course of the year, IT leaders who embrace and adapt to these industry shifts will be the ones to reap the benefits of a secure, cost-effective and reliable cloud.

About the Author

Lori MacVittie is Technology Evangelist at F5 Networks.  She is a subject matter expert on emerging technology responsible for outbound evangelism across F5’s entire product suite. MacVittie has extensive development and technical architecture experience in both high-tech and enterprise organizations, in addition to network and systems administration expertise. Prior to joining F5, MacVittie was an award-winning technology editor at Network Computing Magazine where she evaluated and tested application-focused technologies including app security and encryption-related solutions. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University, and is an O’Reilly author.

MacVittie is a member of the Board of Regents for the DevOps Institute, and an Advisory Board Member for CloudNOW.

Friday, January 6, 2017

Wi-Fi Trends Take Center Stage in 2017

by Shane Buckley, CEO, Xirrus 

From an unprecedented DNS outage that temporarily paralyzed the entire internet, to the evolution of federated identity for simple, secure access to Wi-Fi and applications, 2016 had its mix of growing pains and innovative steps forward.

Here’s why 2017 will shape up into an interesting year for Wi-Fi technology.

IoT will create continued security issues on global networks

In 2017, the growth of IoT will put enormous pressure on Wi-Fi networks. While vendors must address the complexity of onboarding these devices onto their networks, security can’t get left behind. The proliferation of IoT devices will propel high density into almost all locations – from coffee shops to living rooms – prompting more performance and security concerns. Whether Wi-Fi connected alarms or smart refrigerators, the security of our homes will be scrutinized and will become a key concern in 2017. Mass production of IoT devices will make them more susceptible to hacking, as they will not be equipped with the proper built-in security.

The recent IoT-based attack on DNS provider Dyn opened the floodgates, as estimates show the IoT market reaching 10 billion devices by 2020. The event foreshadows the power hackers hold when invading these IoT systems. Taking down a significant portion of the internet grows more detrimental, yet all too plausible these days. Because of increased security concerns, vendors will equip devices with the ability to only connect to the IoT server over pre-designed ports and protocols. If IoT vendors don’t start putting security at the forefront of product development, we can only expect more large-scale cyberattacks in 2017.

LTE networks won’t impact Wi-Fi usage

Don’t expect LTE networks to replace Wi-Fi. The cost of deploying LTE networks is ten times greater and LTE is less adaptable for indoor environments than Wi-Fi. Wi-Fi will remain the lowest cost technology available with similar or superior performance to LTE when deployed properly and therefore will not be replaced by LTE. When people have access to Wi-Fi, they’ll connect. Data plan limitations remain too common.

Additionally, the FCC and other international government agencies have opened up spectrum in the 5GHz band to offer free, unlicensed access for Wi-Fi. But we don’t want carriers grabbing free spectrum and charging us for every byte we send, now do we?

LTE and Wi-Fi will co-exist as they do today, with LTE working well outdoors and well-designed Wi-Fi working consistently throughout indoor spaces.

The push toward federated identity will continue in 2017

Today, there remains a disparate number of Wi-Fi networks, all with different authentication requirements. This marks an opportunity for Wi-Fi vendors. In the coming year, we will see federated identity become a primary differentiator. By implementing federated identity, vendors simplify and secure the login process. Consumers can auto-connect to any public Wi-Fi network with their existing credentials – whether Google, Microsoft or Facebook – thus providing them with a seamless onboarding experience. It’s the next step for Single Sign-On (SSO), and one that will set Wi-Fi vendors apart in 2017.

This coming year, the repercussions of IoT, coexistence of LTE and Wi-Fi, and demand for simple, secure access to Wi-Fi, will take center stage. The onus falls on company leaders, who must adapt their business strategies so they can keep pace with the fast and ever-changing Wi-Fi landscape. 2017 will have plenty in store.

About the Author

Shane Buckley is CEO of Xirrus. Most recently, Mr. Buckley was the General Manager and Senior Vice President at NETGEAR, where he led the growth of NETGEAR’s commercial business unit to 50 percent revenue growth over 2 years, reaching $330 million in 2011 – and played a prime role in growing corporate revenues over 30 percent. Prior to that, Mr. Buckley was President & CEO of Rohati Systems, a leader in cloud-based access management solutions, and Chief Operating Officer of Nevis Networks, a leader in secure switching and access control. He has also held the positions of Vice President WW Enterprise at Juniper Networks, President International at Peribit Networks, a leader in WAN optimization, and EMEA Vice President at 3Com Corp. Mr. Buckley holds an engineering degree from the Cork Institute of Technology in Ireland.

Tuesday, December 20, 2016

Predictions 2017: IaaS Becomes the Next Launching Pad for Cyber Threats

by Corey Nachreiner, Chief Technology Officer, WatchGuard Technologies

Cloud technology has had an incredible impact on the business landscape over the last five years. Public infrastructure-as-a-Service (IaaS) platforms like Amazon’s AWS and Microsoft Azure, in particular, are growing at incredible rates – even among small businesses. According to RightScale’s 2016 State of the Cloud report, 71 percent of small and medium businesses (SMBs) are running at least one application in AWS or Azure. It’s clear that IaaS solutions provide a ton of business opportunities for organizations, especially those without the financial or personnel resources necessary to manage physical network infrastructure.

However, as the public cloud becomes more ingrained in the fabric of everyday business operations, it has also become a serious target for hackers. The question: How safe is it really? With so much valuable customer, financial and healthcare data stored in one place, and managed by a third party, it’s easy to see why criminals have begun to focus their efforts on IaaS.

In the past, we’ve seen threat actors target or infect servers running in public cloud services. For example, there have been cases where hackers take over servers running in Amazon EC2—the virtualized compute portion of Amazon AWS. Remember, servers you spin up in EC2 are no different from servers on your premises. If you leave a port open, without a firewall or access control rules, hackers can attack it in the same way they attack physical servers. To illustrate this, a honeypot organization spun up some fake SSH servers in Amazon EC2 to see whether they’d get targeted. Even without publishing the servers’ IP addresses, or attaching them to a domain, attackers found the IaaS-based honeypots and began brute-force attacks within 10 hours.

We’ve also seen criminals target IaaS customers through their cloud credentials. An Amazon AWS account is powerful. Customers can spin up almost endless servers, as long as they are willing to pay Amazon for the compute power they use. In 2014, one AWS customer had a very costly AWS credential breach. A criminal obtained his AWS credentials and used their administrative powers to spin up more EC2 server instances, which he used to mine bitcoin. This credential leak (due to the victim accidentally leaving credentials in a GitHub project) nearly cost the victim over $5,000 in AWS bills.
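
Leaks like that one are cheap to catch before they happen. Below is a minimal sketch that scans a project tree for the telltale “AKIA” prefix of long-term AWS access key IDs, plus a crude secret-key pattern, before you push; the regexes are heuristics and will produce false positives.

```python
# Scan a project tree for accidentally committed AWS credentials.
import pathlib
import re

ACCESS_KEY = re.compile(r"AKIA[0-9A-Z]{16}")  # long-term access key ID prefix
SECRET_KEY = re.compile(r"(?i)aws_secret_access_key\s*[:=]\s*\S{40}")

def scan(root="."):
    for path in pathlib.Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        if ACCESS_KEY.search(text) or SECRET_KEY.search(text):
            print(f"possible credential in {path}")

if __name__ == "__main__":
    scan()
```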

In short, without the proper protections, attackers can hack servers in the public cloud just as easily as the ones on your premises. As we move more and more of our data to IaaS servers, you can expect criminal hackers to follow.

IaaS doesn’t only make a good attack target; it also provides a powerful attack platform. We’ve seen cybercriminals leveraging these robust virtualization cloud platforms to build their attack infrastructure. For instance, criminals started putting their botnet command and control (C&C) servers in Amazon EC2 shortly after its launch, one example being the Zeus botnet. Despite increased monitoring and security from Amazon, attackers still use AWS infrastructure for attacks.

More recently, a web security company did a study of all the web application attacks launched on the Internet, and found that 20 percent of these attacks originated from AWS’s IP addresses. This comes as no surprise, since public IaaS services can provide single individuals with more scalable compute and network power than one person could easily harness on their own. As long as public clouds offer impressive distributed computing capabilities to customers, hackers will search for ways to exploit these powers for evil.

In 2017, I expect to see attackers increasingly leverage public IaaS both as a potential attack surface, and as a powerful platform to build their attack networks. It’s highly likely there will be at least one headline-generating cyberattack either targeting, or launched from a public IaaS service next year.

So what can businesses do to protect their IaaS properties from being attacked in 2017?

In short, extend your existing network perimeter security tactics to the public cloud. There are a number of simple best practices I’d recommend to proactively protect your IaaS credentials and business critical data:
       
·        Properly implement IaaS’s existing access controls: IaaS services like AWS and Azure have built-in security tools you can use to protect your cloud servers in the same way you do physical ones. While cloud services don’t offer Unified Threat Management (UTM) or Next-generation Firewall (NGFW) services, they do have basic stateful firewalls. At the very least, make sure you firewall your cloud servers and only expose the network services you really need to (see the audit sketch after this list).

·        Use strong authentication or two-factor authentication (2FA) whenever possible: Passwords are not perfect. They can get stolen, or you might accidentally leave them in a Github project, like the victim mentioned above. If you’re only using a password to authenticate to your IaaS service, a lost password gives attackers everything they need to take over your account. However, most public clouds offer two-factor authentication (2FA), where you can pair your password with some other authentication token, such as a secure code delivered to your mobile phone. With 2FA enabled, cybercriminals won’t be able to access your IaaS account even if they compromise your password.

·        Bring your on-prem security to the cloud: Most organizations protect their on-premises servers with UTM and NGFW appliances that combine many different security controls into one easy-to-manage appliance. Luckily, you can now bring these advanced premises security solutions to IaaS as well. Search your IaaS marketplace for your favorite security solution and you might find it.

·        Check out your IaaS provider’s security best practices: Frankly, there are more security tips and practices to protect your cloud servers than I can share in one short article. The good news is your favorite IaaS provider may already have you covered. For instance, AWS users can find a white paper on all Amazon’s best practices in this PDF.
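
As referenced in the first bullet, here is a minimal audit sketch using boto3’s describe_security_groups call to flag EC2 security group rules that expose ports to the entire Internet; the region is an example, and a real audit would also cover IPv6 ranges.

```python
# Flag security group rules open to the whole Internet (0.0.0.0/0).
# Requires: pip install boto3, plus AWS credentials with EC2 read access
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

for group in ec2.describe_security_groups()["SecurityGroups"]:
    for rule in group.get("IpPermissions", []):
        for ip_range in rule.get("IpRanges", []):
            if ip_range.get("CidrIp") == "0.0.0.0/0":
                print(f"{group['GroupId']} ({group['GroupName']}): "
                      f"ports {rule.get('FromPort', 'all')}-"
                      f"{rule.get('ToPort', 'all')} open to the world")
```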


Business will continue to boom for the IaaS industry. According to the latest market study by International Data Corporation (IDC), worldwide spending on public cloud services is expected to reach upwards of $141 billion by 2019, up from nearly $70 billion last year. With the sustained growth and prevalence of IaaS, organizations need to constantly educate themselves on new ways cybercriminals are leveraging it and focus on effectively extending their network security into the public cloud. 

About the Author

Corey Nachreiner is Chief Technology Officer of WatchGuard Technologies.

Recognized as a thought leader in IT security, Nachreiner spearheads WatchGuard's technology vision and direction. Previously, he was the director of strategy and research at WatchGuard. Nachreiner has operated at the frontline of cyber security for 16 years, and for nearly a decade has been evaluating and making accurate predictions about information security trends. As an authority on network security and internationally quoted commentator, Nachreiner's expertise and ability to dissect complex security topics make him a sought-after speaker at forums such as Gartner, Infosec and RSA. He is also a regular contributor to leading publications including CNET, Dark Reading, eWeek, Help Net Security, Information Week and Infosecurity, and delivers WatchGuard's "Daily Security Byte" video on Facebook.

Monday, December 19, 2016

Predictions 2017: The Age of Collaboration and Interoperability

by Daniel Kurgan, CEO, BICS

2016 proved to be a remarkable year for the telecoms industry. In its crudest form, mobile was previously synonymous with the core voice services provided by operators. And while voice remains one of the biggest revenue streams for MNOs, the digital revolution has sparked a new age for communication. New, branded digital experiences – messaging apps, chat bots and even the Internet of Things – have pervaded mobile devices globally, as consumers expect instant connectivity and seamless services, wherever they are in the world.

As operators and internet service providers make great headway in giving subscribers the ultimate user experience with faster, higher-quality and more cost-effective voice and data services, 2017 – despite its imminent challenges – could prove to be a turning point for the industry as a whole, with new revenue opportunities and partnerships awaiting those who have readied themselves to take advantage.

The abolition of EU roaming charges will force significant change in the industry

June 2017 will mark a major juncture in mobile history as EU regulators abolish roaming charges, allowing subscribers to use their mobile phones abroad as they would at home.
The abolition of consumer roaming charges in Europe will have a knock-on effect for wholesalers, who will be under increasing pressure to reduce their own fees. However, roaming traffic is expected to grow significantly as more subscribers switch on data roaming when they travel. This surge in usage of mobile voice and data services abroad should offset the impact of the changes in legislation.

The wholesale telecoms sector will experience a period of upheaval as the new measures come into place. However, the blow could be softened by the decision made by EU ministers in December 2016 to cap wholesale rates. Essentially, this will give mobile operators and wholesale carriers time to adjust. Either way, mobile operators will need to adapt if they want to reap the rewards of the industry’s digital transformation.

The Internet of Things will spark new services and revenue streams

In 2017, we will witness more convergence across the telecoms space as service providers develop cross-platform propositions to support content services, and also look to diversify in order to target vertical markets.

The IoT will be a key driver of this transformation, as an increasing number of industries become reliant on mobile to provide the connectivity and infrastructure needed to support IoT implementations. IoT has become integral to the growth of smart homes, smart factories, automated production lines and the emergence of driverless cars.

We have already seen several deployments of global IoT solutions, but in 2017 these are likely to gain traction as more and more equipment manufacturers and enterprises look to embed global connectivity into their devices. Wholesalers will play a key role in bringing disparate players from other markets together, helping to drive collaboration that will support innovation in telecoms.

Increased M&A activity will prompt collaboration to take advantage of industry opportunities

Wholesalers are set to play a crucial role in the development and transformation of the telecoms industry in 2017, acting as a facilitator of new partnerships across the sector.

In 2016, we saw a number of high-profile M&As as major operators looked to enhance their digital propositions with content and media services. This was evident in the U.S. with Verizon's acquisition of Yahoo and the recent move by AT&T to acquire Time Warner. On the consolidation front, UK incumbent BT acquired EE, the country's largest mobile operator.

In 2017, we'll see similar activity again, but on a micro level, as mobile operators and pure-play telecoms businesses look to partner with digital service providers, cloud communications companies and even fintech companies to embed mobile and rich communications services into their core propositions. To capitalise on the opportunities before them, these players will need to partner with wholesalers, retail service providers and vendors to make the connections and share assets and intelligence, so they can develop and roll out new services to new markets and differentiate themselves.

About the Author

Daniel Kurgan was appointed CEO of BICS SA/NV on March 2nd, 2007, after serving as COO from July 1st, 2006. He started his career as Contracts Manager at SABCA (the biggest Belgian aerospace company, a subsidiary of the French Dassault Group), where he negotiated and managed major industrial sales, subcontracting and purchase contracts with customers such as Boeing, Airbus, Aerospatiale (EADS) and Asian governments, and suppliers such as GEC Marconi and Litton.

Daniel joined Belgacom’s Carrier Division at the start of the carrier's commercial operations in January 1997, where he held several positions including International Account Manager, Head of International Relations & Sales, Sales Director (domestic and international wholesale) and VP International Wholesale, in charge of Sales & Marketing, Buying & LCR, and Customer Service and Network. In 2005 Daniel was VP Commercial of BICS, and contributed to the spin-off of Belgacom’s international carrier business.

Daniel graduated from the Solvay Business School of the University of Brussels.

Thursday, December 15, 2016

Perspectives 2017: Today’s Internet of Things Reality

And an analysis of the why and how of developing an IoT strategy

by Patrick Hubbard, Head Geek, SolarWinds

The Internet of Things (IoT) has been a buzzword for quite a while now. For many, it conjures up images of smart thermostats, home security systems, app-powered office coffeemakers, and even internet-connected crockpots. Consumer IoT devices such as these, often referred to as internet-connected appliances, are certainly experiencing exponential growth, but the growth of business and industrial IoT is even more astounding: Gartner estimates that there will be 21 billion endpoints in use by 2020, representing massive potential for data generation.

The Challenges of IoT

Enterprise and industrial IoT devices themselves can be very helpful in determining such things as soil moisture in smart agriculture, improving asset tracking in the shipping industry, and determining temperature and utilization in a manufacturing facility. However, the sheer volume of these devices presents an issue when they are added to a network without a strategy, much like BYOD when it first came about.

But unlike with phones, tablets, or laptops, a majority of IT professionals managing networks with IoT-connected devices aren't conducting software updates on those devices; instead, the primary focus has been on how these appliances can be used in novel ways, with the risks of their unmonitored internet connectivity falling by the wayside. This common oversight and its consequences were illustrated by the recent Dyn DDoS attack—many of the devices used in the attack were connected to corporate networks and improperly monitored. As a consequence, we need to stop thinking about IoT as "BYOD on steroids." Instead, we need very different, customized strategies, because IoT has the power to disrupt operations in a dangerous way.

Thus, it's clear that IoT devices are changing networks and our ability to monitor and manage them. With that in mind, it's important to note an IoT device's class (0, 1, or 2), which reflects the different ways a device affects the network. Class 0 devices are lightweight and low-power, and aren't truly IoT devices that require dramatic shifts in the way we monitor and manage our networks.

Monitoring and management of classes 1 and 2 are a different story, though. Managing class 1 and 2 IoT devices comes down to managing access properties on the routers and switches that allow devices to get to the internet. Monitoring these devices is more application-traffic-specific, calling for NetFlow or quality-of-service (QoS) data in order to see what the devices are doing, because they won't typically allow for SNMP or provide a management interface to determine performance. This also makes security information and event management (SIEM) an important consideration—you need to be able to detect that a network device is conducting a port scan or attempting file-share logons, for example; a minimal flow-based heuristic of that kind is sketched below.
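To make the flow-monitoring point concrete, here is a minimal sketch of a port-scan heuristic of the sort a SIEM rule might encode. It assumes flow records have already been exported into Python dictionaries with src, dst, and dst_port keys; both the record format and the alert threshold are illustrative assumptions, not any particular product's API.

```python
from collections import defaultdict

SCAN_THRESHOLD = 100  # distinct destination ports before we alert (arbitrary)

def find_port_scanners(flows):
    """Flag (src, dst) pairs touching unusually many destination ports."""
    ports_touched = defaultdict(set)
    for flow in flows:
        ports_touched[(flow["src"], flow["dst"])].add(flow["dst_port"])
    return [
        (src, dst, len(ports))
        for (src, dst), ports in ports_touched.items()
        if len(ports) >= SCAN_THRESHOLD
    ]

# Example: one device probing 200 ports on a single host trips the rule.
sample = [{"src": "10.0.5.20", "dst": "10.0.1.10", "dst_port": p}
          for p in range(1, 201)]
print(find_port_scanners(sample))  # [('10.0.5.20', '10.0.1.10', 200)]
```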

In terms of capacity planning, if you believe the estimate of billions of devices in use by 2020, then we will undoubtedly overwhelm our networks in ways we can't even imagine right now. If subnetting is a problem now, with typical and somewhat manageable systems, then the order of magnitude brought on by IoT devices would likely force companies into IPv6, which they may not be ready for (the address arithmetic sketched below hints at the scale). IoT devices will cause IP address transience and make it difficult to understand the bandwidth profile of any given device—different devices have different behaviors, and they all communicate with different servers; some will be well optimized for this, and some won't be. The retail industry, as an example, relies on immense IoT-based hyper-personalization, so network capacity and utilization are of the utmost importance. To avoid latency or downtime, retailers will need to undertake tremendous network capacity planning, or risk their reputation and customer experience.
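As a back-of-the-envelope illustration of why IoT growth pushes networks toward IPv6, this sketch uses Python's standard ipaddress module to compare the host capacity of a common IPv4 subnet with a single IPv6 /64. The subnets chosen are arbitrary examples.

```python
import ipaddress

# Arbitrary example subnets: a common IPv4 LAN versus one IPv6 /64.
v4 = ipaddress.ip_network("192.168.1.0/24")
v6 = ipaddress.ip_network("2001:db8::/64")

print(f"IPv4 /24 usable addresses: {v4.num_addresses - 2}")  # 254
print(f"IPv6 /64 total addresses:  {v6.num_addresses:,}")    # ~1.8e19

# A deployment of even a few thousand sensors strains a typical IPv4
# subnetting plan long before it makes a dent in a single /64.
```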

The Benefits of an IoT Strategy

Although it may seem like the industry is moving at too rapid a pace for you to slow down and implement and test an effective IoT strategy, it's imperative to do so; the fact that a multitude of adoptable standards already exists may help.

The first and perhaps most obvious benefit of implementing an IoT strategy is a reduced risk of data breach. Without knowledge of possible vulnerabilities, your organization may be open to security compromises that harm your business and appear to come out of left field, even though they may have gone unnoticed for a long period of time. A recent, admittedly simple poll of IT professionals showed that while some organizations still don't manage any IoT devices (or none that IT knows about), others, even in regulated industries such as healthcare, manage thousands of devices without following any specific protocol.

The second benefit is financial: organizations can anticipate extra costs by conducting capacity planning and network management before IoT devices are put on a network. Additionally, organizations will be more likely to obtain what they set out to achieve in the first place: bottom-line savings gained by implementing innovative IoT devices; for example, in HVAC efficiency, physical security, short-lead manufacturing efficiency, and production-rate optimization in the supply chain. Companies that use IoT in truly transformative ways for their customers, within the framework of a formulated strategy, will be the first out of the gate to experience unprecedented benefits.

Getting Started On an IoT Strategy

As a first step to gaining knowledge of and control over IoT, you should take inventory of what is already happening from an IoT device perspective within your environment; a minimal discovery sketch follows. Without this baseline knowledge, there's no way to move ahead with any semblance of a strategy.
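Here is one way to start that inventory: a minimal sketch, using only Python's standard library, that sweeps a subnet for devices answering on a given TCP port. The subnet and port are hypothetical placeholders, and a real inventory would combine a sweep like this with DHCP leases, ARP tables, and switch port data.

```python
import ipaddress
import socket

# A crude discovery sweep: attempt a TCP connect to each host in a
# subnet on one port. Subnet and port are hypothetical placeholders;
# real inventories should also draw on DHCP, ARP, and switch data.
SUBNET = "192.168.1.0/24"
PORT = 80
TIMEOUT = 0.5  # seconds per host

def sweep(subnet, port):
    alive = []
    for host in ipaddress.ip_network(subnet).hosts():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(TIMEOUT)
            if s.connect_ex((str(host), port)) == 0:
                alive.append(str(host))
    return alive

if __name__ == "__main__":
    for ip in sweep(SUBNET, PORT):
        print(f"responds on port {PORT}: {ip}")
```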

Next, you need to come to the table with business executives and discuss what they intend to do with IoT devices. Seek to understand how many devices there may be and what type.

Once there’s collective agreement about how many IoT devices will be in your environment, it’s up to you to formulate a security policy, outlining what’s acceptable in terms of risk. This is also dependent on your industry—retail versus financial services versus healthcare, for example. You may need to consider PCI, HIPAA and other compliance issues. The security policy will also drive reconsideration of network and security segmentation.

When having these conversations with business leaders, it's also the time to calculate business risk and put a hard, defensible number behind the financial hit to the business in the event of a serious IoT-related security breach; one conventional formula for doing so is sketched below. Once you can produce financial damage estimates, it becomes easier to have discussions with management about security, network configuration, performance, and quality-of-experience (QoE) monitoring needs.
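One standard way to produce that number is annualized loss expectancy (ALE). The formula itself is conventional risk-analysis practice, but every figure in the sketch below is invented purely for illustration.

```python
# Annualized Loss Expectancy: a standard risk-analysis formula.
# ALE = SLE (single loss expectancy) x ARO (annualized rate of
# occurrence). All figures below are hypothetical illustrations.

def annualized_loss_expectancy(asset_value, exposure_factor, occurrences_per_year):
    sle = asset_value * exposure_factor  # cost of one incident
    return sle * occurrences_per_year

# Example: an IoT-related breach hitting a $2M customer database,
# destroying 30% of its value, expected roughly once every 2 years.
ale = annualized_loss_expectancy(2_000_000, 0.30, 0.5)
print(f"Estimated annualized loss: ${ale:,.0f}")  # $300,000
```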

In addition to aligning with the business on security policies and business risks, this is also the time to consider what to do with the huge amounts of data the devices will generate. And because so many organizations are moving toward hybrid IT, you may need to consider how both on-premises and cloud data will be managed from a services, applications, and storage perspective, so you can best use the data to improve marketing or service delivery, or to increase yield in a factory setting.

Conclusion

IoT should be an active concern for you. If you're not already, you will soon be asked to manage more and more network-connected devices, resulting in security issues and a monumental challenge in storing, managing and analyzing mountains of data. The risk is that without a proper strategy, you'll be tackling all of this on an ad hoc basis. Instead, stop what you're doing and start developing your IoT strategy. Begin by surveying your network today to get a baseline, then come to the table with your organization's IoT stakeholders to determine why they need IoT and how they plan to use the devices, discuss the security implications and define a security policy, and decide what to do with all the data the devices will generate. Doing this, ahead of time if still possible, will help ensure that your organization becomes not an IoT victim, but an IoT victor.

About the Author

Patrick Hubbard is a head geek and senior technical product marketing manager at SolarWinds. With 20 years of technical expertise and IT customer perspective, his networking management experience includes work with campus, data center, storage networks, VoIP and virtualization, with a focus on application and service delivery in both Fortune 500 companies and startups in high tech, transportation, financial services and telecom industries.

About SolarWinds

SolarWinds (NYSE: SWI) provides powerful and affordable hybrid IT infrastructure management software to customers worldwide from Fortune 500® enterprises to small businesses, government agencies and educational institutions. We are committed to focusing exclusively on IT Pros, and strive to eliminate the complexity that they have been forced to accept from traditional enterprise software vendors. Regardless of where the IT asset or user sits, SolarWinds delivers products that are easy to find, buy, use, maintain and scale while providing the power to address all key areas of the infrastructure from on premises to the cloud. Our solutions are rooted in our deep connection to our user base, which interacts in our thwack online community to solve problems, share technology and best practices, and directly participate in our product development process. 



Wednesday, December 14, 2016

Ten Cybersecurity Predictions for 2017

by Dr. Chase Cunningham, ECSA, LPT 
Director of Cyber Operations, A10 Networks 

The cyber landscape changes dramatically year after year. If you blink, you may miss something, whether that's a noteworthy hack, a new attack vector or a new solution to protect your business. Sound cyber security means trying to stay one step ahead of threat actors. Before the end of 2016 comes around, I wanted to grab my crystal ball and take my best guess at what will be the big story lines in cyber security in 2017.

1. IoT continues to pose a major threat. In late 2016, all eyes were on IoT-borne attacks. Threat actors were using Internet of Things devices to build botnets to launch massive distributed denial of service (DDoS) attacks. In two instances, these botnets were built largely from unsecured "smart" cameras. As IoT devices proliferate, and everything gets a Web connection — refrigerators, medical devices, cameras, cars, tires, you name it — this problem will continue to grow unless proper precautions like two-factor authentication and strong password protection are taken.

Device manufacturers must also change their behavior. They must scrap default passwords and either assign unique credentials to each device or apply modern password configuration techniques for the end user during setup; a sketch of per-device credential generation follows.
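As a hedged illustration of what unique per-device credentials could look like on a provisioning line, the sketch below uses Python's secrets module to generate a distinct random password per device serial number. The serial format and 16-character policy are assumptions for illustration, not any manufacturer's actual process.

```python
import secrets
import string

# Generate a unique, random default password per device at
# provisioning time. Serial numbers and the 16-char policy are
# hypothetical; a real line would print each credential on the
# device label and store only a salted hash, never the plaintext.
ALPHABET = string.ascii_letters + string.digits

def provision_password(length=16):
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

for serial in ["CAM-000101", "CAM-000102", "CAM-000103"]:
    print(serial, provision_password())
```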

2. DDoS attacks get even bigger. We recently saw some of the largest DDoS attacks on record, in some instances topping 1 Tbps. That’s absolutely massive, and it shows no sign of slowing. Through 2015, the largest attacks on record were in the 65 Gbps range. Going into 2017, we can expect to see DDoS attacks grow in size, further fueling the need for solutions tailored to protect against and mitigate these colossal attacks.

3. Predictive analytics gains ground. Math, machine learning and artificial intelligence will be baked more deeply into security solutions. Security solutions will learn from the past and essentially predict attack vectors and behavior based on that historical data. This means security solutions will be able to more accurately and intelligently identify and predict attacks by using event data and marrying it to real-world attacks.

4. Attack attempts on industrial control systems. Similar to the IoT attacks, it's only a matter of time until we see major industrial control system (ICS) attacks. Attacks on ecommerce stores, social media platforms and others have become so commonplace that we've almost grown numb to them. Bad guys will move on to bigger targets — dams, water treatment facilities and other critical systems — to gain recognition.

5. Upstream providers become targets. The DDoS attack launched against DNS provider Dyn, which knocked out many major sites that use Dyn for DNS services, made headlines because it highlighted what can happen when threat actors target a service provider rather than just the end customers. These types of attacks on upstream providers cause a ripple effect that interrupts service not only for the provider, but for all of its customers and users. The attack on Dyn set a dangerous precedent and will likely be emulated several times over in the coming year.

6. Physical security grows in importance. Cyber security is just one part of the puzzle. Strong physical security is also necessary. In 2017, companies will take notice and implement stronger physical security measures and policies to protect against internal threats, theft, and unwanted devices coming in and infecting systems.

7. Automobiles become a target. With autonomous vehicles on the way and the massive success of sophisticated electric cars like Teslas, the automobile industry will become a much more attractive target for attackers. Taking control of an automobile isn’t fantasy, and it could be a real threat next year.

8. Point solutions no longer do the job. The days of Frankensteining together a set of point security solutions have to end. Instead of buying a single solution for each issue, businesses must turn to security solutions from best-of-breed vendors and partnerships that address a number of security needs. Why have 12 solutions when you can have three? In 2017, your security footprint will get smaller, but it will be much more powerful.

9. The threat of ransomware grows. Ransomware was one of the fastest-growing online threats in 2016, and it will become more serious and more frequent in 2017. We've seen businesses and individuals pay thousands of dollars to free their data from the grip of threat actors. The growth of ransomware means we must be more diligent in protecting against it, starting with not clicking on anything suspicious. Remember: if it sounds too good to be true, it probably is.

10. Security teams are 24/7. The days of security teams working 9-to-5 are long gone. Now is the dawn of the 24/7 security team. As more security solutions become services-based, consumers and businesses will demand the security teams and their vendors be available around the clock. While monitoring tools do some of the work, threats don’t stop just because it’s midnight, and security teams need to be ready to do battle all day, every day.

About the Author

Dr. Chase Cunningham (CPO USN Ret.) is A10 Networks' Director of Cyber Operations. He is an industry authority on advanced threat intelligence and cyberattack tactics. Cunningham is a former US Navy chief cryptologic technician who supported US Special Forces and Navy SEALs during three tours of Iraq. During this time, he also supported the NSA and acted as lead computer network exploitation expert for the US Joint Cryptologic Analysis Course. Prior to joining A10 Networks, Cunningham was the director of cyber threat research and innovation at Armor, a provider of cloud-based cyber defense solutions.

