Tuesday, July 12, 2016

Blueprint: An Out-of-This-World Shift in Data Storage

by Scott Sobhani, CEO and co-founder, Cloud Constellation’s SpaceBelt

In light of ongoing, massive data breaches across all sectors and the consequent responsibility laid at executives’ and board members’ feet, the safe storing and transporting of sensitive data has become a critical priority. Cloud storage is a relatively new option, and both businesses and government entities have been flocking to it. Synergy Research Group reports that the worldwide cloud computing market grew 28 percent to $110B in revenues in 2015. In a similar vein, Technology Business Research projects that global public cloud revenue will increase from $80B in 2015 to $167B in 2020.

By moving to the cloud, organizations are using shared hosting facilities, which carries the risk of exposing critical data to malicious actors – not to mention the challenges associated with jurisdictional hazards. Organizations of all sizes are subject to leaky Internet and leased lines. As the world shifts away from legacy systems to more agile software solutions, it is becoming clear that the time is now for a paradigm shift in how sensitive data is stored, accessed and archived.

The Need for a New Storage Model

Enterprises and government agencies need a better way to securely store and transport their sensitive data. What if there was a way to bypass the Internet and leased lines entirely to mitigate exposure and secure sensitive data from hijacking, theft and espionage, while reducing costs both from an infrastructure and risk perspective?

Though it may sound like science fiction to some, such an option is possible, and it’s become necessary for two main reasons:

  • Threatening Clouds – Cloud environments currently run on hybrid public and private networks using IT controls that are not protective enough to stay ahead of real-time cyber security threats. Enterprise data can be maliciously targeted, searched or stolen. Sensitive data can also be subjected to government agency monitoring and exposed to acts of industrial espionage through unauthorized access to enterprise computers, passwords and cloud storage on public and private networks.
  • Questions of Jurisdiction – Due to government regulations, critical information could be restricted or exposed, especially when it has regularly been replicated or backed up to an undesirable jurisdiction at a cloud service provider’s data center. Diplomatic privacy rules are under review by governments intent on restricting cross-jurisdictional access and transfer of the personal and corporate data belonging to their citizens. This has created the requirement for enterprises to operate separate data centers in each jurisdiction – financially prohibitive for many medium-sized enterprises.

Storage Among the Stars

What government and private organizations need is an independent cloud infrastructure platform, entirely isolating and protecting sensitive data from the outside world. A neutral, space-based cloud storage network could provide this. Enterprise data could be stored and distributed to a private data vault designed to enable secure cloud storage networking without any exposure to the Internet or leased lines. Resistant to natural disasters and force majeure events, its architecture would provide a truly revolutionary way of reliably and redundantly storing data, liberating organizations from the risk of cyberattack, hijacking, theft, espionage, sabotage and jurisdictional exposure.

A storage solution of this type might at first seem prohibitively expensive, but it would cost the same as or less than terrestrial networks to build, operate and maintain. To be entirely secure, such a system would need to include its own telecom backbone infrastructure -- extremely expensive to accomplish on the ground, but not necessarily so if properly architected as a space-based storage platform. Further, it would serve as a key market differentiator for cloud service providers looking for solutions that provide physical protection of their customers’ critical information.

Sooner than many might think, governments and enterprises will begin to use satellites for the centralized storage and distribution of sensitive or classified material, the storage and protection of video and audio feeds from authorized personnel in remote locations, or the distribution of video and audio gathered by drones.

Escaping Earth’s Orbit

Cyber criminals don’t seem to be slowing their assault on the network, which means data breaches of Earth-based storage solutions will continue. Organizations need to think outside the Cloud in order to keep their critical data secure, both while being stored and in transit. The technology exists today to make satellite storage a reality, and for those who are working hard to stay ahead of malicious actors, it can’t arrive soon enough.

About the author

Scott Sobhani, CEO and cofounder of Cloud Constellation Corporation and the SpaceBelt Information Ultra-Highway, is an experienced telecom executive with over 25 years in executive management positions, most recently as VP for business development and commercial affairs at International Telecom Advisory Group (ITAG). Previous positions include CEO of TalkBox, VP & GM at Lockheed Martin, and VP, GM & senior economist at Hughes Electronics Corporation.

Mr. Sobhani was responsible for closing over $2.3 billion in competitive new business orders for satellite spacecraft systems, mobile network equipment and rocket launch vehicles. He co-authored “Sky Cloud Autonomous Electronic Data Storage and Information Delivery Network System”, “Space-Based Electronic Data Storage and Network System” and “Intermediary Satellite Network for Cross-Strapping and Local Network Decongestion” (each of which is patent pending). He has an MBA from the University of Southern California, and a bachelor’s degree from the University of California, Los Angeles.





Microsoft Azure to Offer GE Industrial Internet Capabilities

GE and Microsoft announced a partnership that will make GE’s Predix platform for the Industrial Internet available on the Microsoft Azure cloud. The agreement, which marks the first step in a broad strategic collaboration between the two companies, will allow customers to capture intelligence from their industrial assets and take advantage of Microsoft’s enterprise cloud applications.

Specifically, the Azure cloud will provide Predix customers with scalable infrastructure, data sovereignty, hybrid capabilities, and advanced developer and data services. In addition, GE and Microsoft plan to integrate Predix with Azure IoT Suite and Cortana Intelligence Suite along with Microsoft business applications, such as Office 365, Dynamics 365 and Power BI, in order to connect industrial data with business processes and analytics.

“Connecting industrial machines to the internet through the cloud is a huge step toward simplifying business processes and reimagining how work gets done,” said Jeff Immelt, CEO of GE. “GE is helping its customers extract value from the vast quantities of data coming out of those machines and is building an ecosystem of industry-leading partners like Microsoft that will allow the Industrial Internet to thrive on a global scale.”

http://www.microsoft.com
http://www.ge.com

GE to enter Cloud Services Market with Predix Cloud

GE announced plans to enter the cloud services market with its Predix Cloud -- a platform-as-a-service (PaaS) designed specifically for industrial data and analytics.

GE said it is developing Predix Cloud to capture and analyze the unique volume, velocity and variety of machine data within a highly secure, industrial-strength cloud environment. Machine data is expected to drive the next phase of growth for the Industrial Internet and enable developers to rapidly create, deploy and manage applications and services for industry.

“Cloud computing has enabled incredible innovation across the consumer world. With Predix Cloud, GE is providing a new level of service and results across the industrial world,” said Jeffrey Immelt, CEO of GE. “A more digital hospital means better, faster healthcare. A more digital manufacturing plant means more products are made faster. A more digital oil company means better asset management and more productivity at every well. We look forward to partnering with our customers to develop customized solutions that will help transform their business.”

Predix Cloud will leverage GE’s deep domain expertise in information technology (IT) and operational technology (OT). GE businesses will begin migrating their software and analytics to the Predix Cloud in Q4 2015, and the service will be commercially available to customers and other industrial businesses for managing data and applications in 2016.

https://www.gesoftware.com/predix

China's Huaxintong Semi Licenses ARMv8-A for Servers

ARM has licensed its ARM v8-A architecture to Huaxintong Semiconductor Technology to accelerate advanced server chipset technologies in the rapidly expanding Chinese server market.

ARM said this multiyear license will enable Chinese companies to deliver ARM-based server technologies in their home market, enabling large scale deployment of the most efficient server solutions available.

Huaxintong Semiconductor Technology is a joint venture between China’s Guizhou province and a subsidiary of Qualcomm. The venture is registered in Guizhou province, which is already home to a data center cluster of more than 2.5 million servers for companies including China Mobile, China Telecom and China Unicom.

“China is facing a mountain of data flowing through its data centers and finding solutions that are cost effective and efficient is a key priority,” said Pete Hutton, executive vice president and president of products groups, ARM. “Huaxintong Semiconductor Technology is creating a blueprint for new server systems and engaging with them is opening up exciting new opportunities for ARM-based servers. This is a chance to reinvent the economics and performance of hyperscale data centers and enables China to architect its own server chips based on ARM technologies for its domestic market and for the rest of the world.”

http://www.arm.com


Mirantis Revs OpenStack 9.0

Mirantis announced its next major release.

Mirantis OpenStack 9.0, which is based on the Mitaka OpenStack release, includes enhancements to Fuel, the OpenStack management software project, aimed at simplifying the task of operating private OpenStack clouds. Cloud operators can use Fuel to scale the cloud up or down, selectively make changes to their configuration, and deploy new functionality to an existing cloud, such as Murano, a self-service application orchestration and catalog service. Additionally, operators of large-scale infrastructure can now export Fuel configuration values into third-party configuration management tools.

“The improvements in Mirantis OpenStack 9.0 are based on real-world production deployments of Mirantis OpenStack, including our collaborations with AT&T and Volkswagen,” said Boris Renski, co-founder and CMO of Mirantis. “The improvements we made - largely in the area of post-deployment operations - integrate Mirantis’ services expertise into the software so that we can deliver better business outcomes. Mirantis OpenStack 9.0 will be a valuable asset to Mirantis as we help customers build and operate private clouds.”

The new release also brings enhancements to make it easier to run high performance workloads in OpenStack. Improvements to network function virtualization infrastructure (NFVI) performance include features such as SR-IOV, DPDK, NUMA CPU pinning, and huge pages.

http://www.mirantis.com

CyrusOne Builds Data Center Capacity Fast

CyrusOne announced the commissioning of a significant amount of power and a new data hall at its San Antonio II data center. The company added 9 megawatts of critical power in conjunction with the newly finished data hall. The San Antonio II data center is expected to encompass up to 372,000 square feet upon second-phase completion.

“The cloud market is experiencing a whirlwind of growth and demand, testing even the most agile cloud service providers and enterprises to meet highly aggressive time-to-market requirements,” said Gary Wojtaszek, president and CEO, CyrusOne. “To support these cloud and high-growth enterprise companies, CyrusOne has been relentless in setting record delivery times that enable customers to go hyperscale at hyperspeed to meet their business needs.”

CyrusOne also noted the recent completion of its second data center building in Sterling, Va. The 232,000-square-foot Sterling II data center facility added 159,000 colocation square feet and 30 megawatts of critical power to the company’s Northern Virginia data center campus.

http://www.cyrusone.com

Equinix Plans Huge, New Data Center in Amsterdam

Equinix unveiled plans to build a new International Business Exchange™ (IBX) in Amsterdam at its existing campus at the Amsterdam Science Park.

Equinix said it will invest $113 million in the first phase of development. The new data center, called AM4, will support growing demand for Platform Equinix™ in Amsterdam, one of the most network-dense locations globally, and will house 1,555 cabinets in its first phase. On completion of the four expansion phases, the facility will represent a total investment of $189 million and provide 4,200 cabinets on eight floors of data center space with a total usable floor area of more than 124,000 square feet (11,500 square meters). The building and first phase are expected to be completed and operational by Q2 2017.

Equinix also announced that it is now offering direct connectivity to AWS in Amsterdam.

http://www.equinix.com

Intel Appoints Dr. Tsu-Jae King Liu to Board

Intel announced the appointment of Dr. Tsu-Jae King Liu to its board of directors.

Liu, 53, holds a distinguished professorship endowed by TSMC in the Department of Electrical Engineering and Computer Sciences (EECS), in the College of Engineering at the University of California, Berkeley, where she also serves as associate dean for Academic Planning and Development. Liu’s previous administrative positions within the College of Engineering include associate dean for research and EECS department chair. She has also held research and engineering positions at the Xerox Palo Alto Research Center and Synopsys Inc.

Liu holds over 90 patents and has received numerous awards for her research, including the Intel Outstanding Researcher in Nanotechnology Award (2012) and the SIA University Researcher Award (2014). Currently, her research is focused on nanometer-scale logic and memory devices, and advanced materials, process technology and devices for energy-efficient electronics. She received B.S., M.S. and Ph.D. degrees in electrical engineering from Stanford University in 1984, 1986 and 1994, respectively.

http://www.intel.com

Monday, July 11, 2016

Cisco Builds its Cloud-based Security

Cisco rolled out a number of new services and cloud-based security solutions, including:

Cisco Umbrella Roaming: centralized, cloud-delivered protection that removes off-network blind spots, guarding roaming employees wherever they work. Umbrella Roaming is now embedded as a module with AnyConnect (Cisco’s VPN solution). It adds a new layer of off-network protection that blocks connections to malicious sites without needing to deploy another agent.

Cisco Umbrella Branch: a cloud-delivered solution that gives businesses more control over guest Wi-Fi use with easy content filtering. With Umbrella Branch, businesses can simply upgrade Integrated Services Routers (ISRs) for simple, fast and comprehensive security at branch locations.

Cisco Defense Orchestrator: This cloud-based management application enables users to easily and effectively manage a large security infrastructure and policies in distributed locations across thousands of devices through a simple cloud-based console. It cuts through complexity to manage security policies across Cisco security products from ASA and ASAv firewalls to Cisco Firepower next-generation firewalls and ASA with FirePOWER Services featuring Firepower Threat Defense, and OpenDNS.

Cisco Meraki MX Security Appliances with Advanced Malware Protection (AMP) and Threat Grid: a cloud-managed unified threat management (UTM) solution that simplifies advanced threat protection for the distributed enterprise, providing branch offices with malware protection that checks files against a cloud database to identify malicious content and blocks the files before users download them.

Cisco Stealthwatch Learning Network License: This component enables the Cisco ISR to act as a security sensor and enforcer for branch threat protection. It allows businesses to detect and track anomalies in network traffic, analyze suspicious network activity, and identify malicious traffic.

“Digital business is the most impactful disruption to security in the history of the technology industry. As a result, companies are struggling to manage the security challenges from both large, distributed environments and the active adversaries aggressively targeting these expansive attack surfaces every day. Our customers are finding that they need a more integrated approach to security, and Cisco provides them with a threat-centric security architecture that is much more effective in a digital world,” stated David Goeckeler, Senior Vice President and General Manager, Networking and Security Business, Cisco Systems.

http://www.cisco.com

CenturyLink Launches Location-Based Analytics

CenturyLink introduced Location-Based Analytics, a mobile engagement, analytics and marketing solution for owners and operators of spaces and places where customers congregate.

CenturyLink Location-Based Analytics is anchored by the CenturyLink Managed Wi-Fi solution that features high-capacity Cisco Meraki Wi-Fi access points and security appliances for delivering fast data connection speeds and reliable coverage for equipment sensors and visitors' mobile devices.

CenturyLink said its fully managed solution can gather relevant customer demographic and location data, apply analytics to discern business insights, and provide social engagement and marketing tools that let businesses act on those findings in real time.

"Location-Based Analytics from CenturyLink empowers owners and operators of venues and spaces – such as amusement parks, college campuses, retail destinations and transportation hubs – to make deeper emotional connections with individual customers in real-time," said Troy Trenchard, vice president, product management, CenturyLink.

http://www.centurylink.com

Gigamon Brings Automated Data Center Topology Visualization

Gigamon announced automated network topology visualization for managing visibility infrastructure at scale in large data centers.

The new capability provides end-to-end visualization of Gigamon's Visibility Fabric components, associated production networks and interfaces from which traffic is sourced and the connected security and monitoring tools that analyze this traffic. It provides automated discovery of the attached networks using the Link Layer Discovery Protocol (LLDP) or Cisco Discovery Protocol (CDP).

Gigamon said that when security and monitoring tools attached to its Security Delivery Platform detect abnormal behavior, an administrator can quickly trace back to the network interfaces at the source of the abnormality, significantly reducing the mean time to resolution (MTTR).

“We remain committed to offering vendor-agnostic visibility and are uniquely able to meet the needs of customers with large data centers and clouds,” said Sesh Sayani, Director of Product Management, Gigamon. “We recognize that the Visibility Fabric is rapidly becoming essential data center infrastructure and we will continue to be customer-driven in the capabilities that make our market-leading solution easy to use for troubleshooting and security.”

http://www.gigamon.com

UEFA Euro 2016 Brought Increased Cyber Threats to Fans

The risks for digitally active sports fans more than doubled during the 2016 UEFA European Championship, according to a new report from Allot Communications, in collaboration with Kaspersky Lab. The research analyzed the mobile app and website usage of one million randomly selected mobile subscribers from countries participating in UEFA Euro 2016, before and during the matches.

Some highlights of the Allot MobileTrends Report UEFA Euro 2016: How Sports Events Put Mobile Users at Risk:

  • 17% of mobile users who exhibited little or no use of sports apps or websites before the games became active sports fans during the games. Nearly 50% of these “casual fans” transitioned into sports fan behavior profiles with high potential risk for malware.
  • The total number of mobile sports fans at high risk for cyber threats more than doubled during the games.
  • Increases in online sports betting and social networking are major contributors to increased cyber security risk.
  • The number of mobile sports fans accessing betting sites more than doubled during the tournament’s matches. Before the matches, 1 in 9 users visited sports betting sites. During the matches, 1 in 4 visited sports betting sites.
  • During matches, the average time users spent on social media apps or sites tripled over their pre-tournament activity.

"Cyber-criminals often use big events to lure users with phishing emails and fake websites, exposing fans to intensified and new potential cyber risks. Users should be aware of potential threats and be on the lookout when clicking on links, entering their credentials on websites or making financial transactions,” noted Alexander Karpitsky, Head of Technology Licensing, Kaspersky Lab. “We at Kaspersky Lab recommend sports fans take a proactive approach to their online security, especially when mobile, safeguarding their devices with IT security solutions at all times.”

“As sports fans are going mobile and devices are used widely for watching, recording, and sharing experiences, users must protect themselves online, and CSPs are in the best position to deliver these value added services to subscribers,” said Yaniv Sulkes, AVP Marketing at Allot Communications. “Since major sports events are shown to be times of high risk for mobile users, it’s also when mobile service providers have an opportunity to educate customers regarding malware risks and to offer network-based security services to protect mobile devices. With Rio 2016 Summer Olympics on the horizon, mobile operators who adopt a proactive cyber protection strategy for their customers will be able to leverage monetization opportunities.”

http://www.allot.com


Pluribus Advances its Network Monitoring Solution

Pluribus Networks introduced its VCFcenter -- a single pane of glass that combines a big data approach to network visibility with web-scale analytics to offer a business-level network analytics solution.

VCFcenter, which can be deployed in any new or existing campus, branch or data center, is an analytics platform that provides a wide range of foundational services, including secure user access, a common user interface and a shared data repository, to all of the applications hosted within its framework.

Pluribus said its VCFcenter allows organizations to collect and analyze contextual information about business service application flows, and can scale into the billions of flows for web-scale applications. VCFcenter and its applications provide performance metrics associated with the use of any business service, from the packet, to the network flow or even the application level.

“The Modern Enterprise is rapidly adopting software-defined and hyperconverged compute and storage solutions due to the simplicity and value they offer the end user. They are also adding big data, mobility and even IoT applications to their IT strategies,” said Tom Burns, VP and GM of Networking Products at Dell. “The network itself and the business-level analytics which can be derived from it have once again become a critical success factor for our customers. Working with Pluribus enables Dell and its partners to offer an affordable and complementary business-management solution at the network layer.”

http://www.pluribus.com

Sunday, July 10, 2016

Google Acquires Anvato for Media Content Platform

Google has acquired Anvato, a start-up based in Mountain View, California that provides video processing software for TV operators, programmers, and broadcasters. Financial terms were not disclosed.

Anvato said its software will now be offered on the Google Cloud Platform.

Anvato lists NBCUniversal, Univision, Scripps Networks, Fox Sports, and Media General as among its customers.

Anvato is headed by Alper Turgut, who previously led Verkata and was co-founder of Aligo. The team also includes Ismail Haritaoglu, Mehmet Altinel, Oztan Hamanci, and Matt Smith.

Anvato was funded by Oxantium Ventures.

https://cloudplatform.googleblog.com/2016/07/welcome-Anvato-to-the-Google-Cloud-Platform-team.html
http://www.anvato.com

Avast + AVG Merger Extends Cyber Reach to 400 Million Endpoints

Prague-based Avast Software agreed to acquire AVG Technologies N.V. for $25.00 per share in cash, for a total consideration of approximately $1.3 billion. The offer price represents a 33% premium over the July 6, 2016 closing price and a premium of 32% over the average volume weighted price per share over the past six months.

Both companies offer antivirus software and Internet security services.  Both were founded in the Czech Republic in the late 1980s and early 1990s.

The combined company will cover more than 400 million endpoints, of which 160 million are mobile. The increased scale of the combined company will provide a deeper network of de facto sensors for tracking emerging cyber threats across the Internet.

Avast currently serves over 230 million endpoints.  The company has over 500 staff in Prague, Czech Republic, with additional offices in the USA, Germany, China, South Korea and Taiwan.

AVG, which is now based in Amsterdam, serves over 200 million endpoints.

"We are in a rapidly changing industry, and this acquisition gives us the breadth and technological depth to be the security provider of choice for our current and future customers," said Vince Steckler, chief executive officer of Avast Software. "Combining the strengths of two great tech companies, both founded in the Czech Republic and with a common culture and mission, will put us in a great position to take advantage of the new opportunities ahead, such as security for the enormous growth in IoT."

"We believe that joining forces with Avast, a private company with significant resources, fully supports our growth objectives and represents the best interests of our stockholders," said Gary Kovacs, chief executive officer, AVG. "Our new scale will allow us to accelerate investments in growing markets and continue to focus on providing comprehensive and simple-to-use solutions for consumers and businesses, alike. As the definition of online security continues to shift from being device-centric, to being concerned with devices, data and people, we believe the combined company, with the strengthened value proposition, will emerge as a leader in this growing market."

http://www.avast.com
http://www.avg.com/

Polycom Drops Mitel Merger, Agrees to Private Equity Buyout

The Board of Directors of Polycom terminated a previously announced merger agreement with Mitel Networks Corporation, and instead approved a new merger agreement with Triangle Private Holdings I and Triangle Private Merger Sub, entities affiliated with Siris Capital Group.

Under the new deal with Siris, outstanding shares of common stock of Polycom will be exchanged for $12.50 per share in cash at the completion of the merger.

On July 7, 2016, Mitel Networks Corporation waived its right to renegotiate its merger agreement with Polycom after receipt of notice of the Polycom board’s determination that Siris was offering a superior deal. Polycom will pay a merger termination fee to Mitel.

http://www.polycom.com

Mitel to Acquire Polycom for Nearly $2 Billion

Mitel agreed to acquire all of the outstanding shares of Polycom common stock in a cash and stock transaction valued at approximately $1.96 billion. Polycom shareholders would receive $3.12 in cash and 1.31 Mitel common shares for each share of Polycom common stock, or $13.68 based on the closing price of a Mitel common share on April 13, 2016 -- a 22% premium based on Mitel's and Polycom's recent share prices. The deal combines Mitel's leadership...

Thursday, July 7, 2016

Veriflow Pioneers Mathematical Network Verification

Veriflow, a start-up based in San Jose, announced $8.2 million in Series A funding for its work in network breach and outage prevention.

Veriflow said it uses formal mathematical network verification to eliminate change-induced network outages and breaches. The technique was created by a team of computer science professors and Ph.D. students at the University of Illinois at Urbana-Champaign.

The funding round was led by Menlo Ventures and included current investor New Enterprise Associates (NEA).

“The feedback from customers and analysts indicates the market is ready for a new approach to network breach and outage prevention. Our use of mathematical network verification, grounded in data-plane information, gives customers a proactive approach to identifying vulnerabilities before they are exposed to catastrophic problems,” said James Brear, president and CEO of Veriflow. “Veriflow provides a comprehensive view of the network that gives administrators the confidence to make changes without fear of damaging critical services and layers of defense. We’ve spent several years developing our innovative technology, and this funding will enable us to hire key talent, bring our product to market more quickly and expand into new markets.”

Veriflow’s automated approach predicts how and if network policies will be violated before an incident occurs.
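The general idea behind this kind of data-plane verification can be sketched in a few lines of code. The snippet below is purely illustrative and is not Veriflow's actual algorithm: given per-device forwarding rules, it traces where traffic for a destination would go and checks an invariant such as "traffic for the database can never reach the Internet egress" before a change is pushed.

```python
# Hypothetical sketch of data-plane verification: rules map each
# device to {destination tag -> next hop}. We trace the forwarding
# path a packet would take and check it against a policy invariant.
def trace(rules, start, dest, max_hops=16):
    """Follow forwarding rules for `dest` from `start`; return the path."""
    path, node = [start], start
    for _ in range(max_hops):
        hop = rules.get(node, {}).get(dest)
        if hop is None:          # no rule: traffic is dropped here
            return path
        path.append(hop)
        node = hop
    return path                  # hop limit reached: possible loop

def violates(rules, start, dest, forbidden):
    """True if a packet from `start` to `dest` traverses `forbidden`."""
    return forbidden in trace(rules, start, dest)

rules = {
    "edge": {"db": "core"},
    "core": {"db": "db-server"},
}
print(violates(rules, "edge", "db", "internet"))  # → False
```

A misconfigured rule such as `"edge": {"db": "internet"}` would flip the check to True, flagging the violation before the change reaches production. Real systems verify all packet headers symbolically rather than tracing one destination at a time.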

http://veriflow.net


  • Veriflow exited stealth mode in April 2016 with $2.9 million in initial investor funding from New Enterprise Associates (NEA), the National Science Foundation and the U.S. Department of Defense.
  • Veriflow is led by James Brear, who was previously CEO of Procera until its successful acquisition in August 2015, along with the company’s founders, who include Fulbright and Alfred P. Sloan fellows and an ACM SIGCOMM Rising Star awardee. 


Latest Kubernetes Release Scales for 2,000-node Clusters

Newly released version 1.3 of Kubernetes brings support for 2,000-node clusters. The new release also improves end-to-end pod startup time and keeps API call latency within a one-second Service Level Objective (SLO).
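To make the one-second SLO concrete, a latency check of this kind can be sketched in a few lines. This is a hypothetical illustration, not part of the Kubernetes test harness; `percentile` and `meets_slo` are made-up helper names.

```python
# Illustrative sketch: check API-call latency samples (in seconds)
# against a one-second, 99th-percentile service level objective.
def percentile(samples, pct):
    """Return the pct-th percentile of samples (nearest-rank method)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def meets_slo(latencies_s, slo_s=1.0, pct=99):
    """True if the pct-th percentile latency is within the SLO."""
    return percentile(latencies_s, pct) <= slo_s

# 100 samples with one slow outlier: the 99th percentile still passes.
samples = [0.05] * 99 + [1.2]
print(meets_slo(samples))  # → True
```

A percentile-based SLO like this tolerates rare outliers while still catching systematic regressions, which is why scalability goals are usually stated this way rather than as a hard maximum.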

One new feature is Kubemark, a performance testing tool that detects performance and scalability regressions.

http://blog.kubernetes.io/

Can Docker Become the Dominant Port Authority for Workloads Between Clouds?


If you think these little snippets of Linux source code might have limited revenue-bearing potential given that anyone can activate them on an open source basis, consider DockerCon 2016, held June 19-20 at the Washington State Convention Center in Seattle. DockerCon is the annual technology conference of Docker Inc., the much-touted San Francisco-based start-up that developed and popularized Docker runtime Linux containers, which are no longer proprietary but are hosted as an open source project under the Linux Foundation. Docker Inc. (the company) is among the rarefied “unicorns” of Silicon Valley – start-ups with valuations exceeding $1 billion based on a really hot idea, but with nascent business models and perhaps limited revenue streams at this stage of their development.



Even with a conference ticket price of $990, DockerCon 2016 in Seattle was completely sold out.  Over 4,000 attendees showed up and there was a substantial waiting list. For comparison, last year, DockerCon in San Francisco had about 2,000 people. The inaugural DockerCon event in 2014 was attended by about 500 people. The conference featured company keynotes, technology demonstrations, customer testimonials, and an exhibition area with dozens of vendors rushing into this space. Big companies exhibiting at DockerCon included Microsoft, IBM, AWS, Cisco, NetApp, HPE and EMC.
 
Punching well above its weight, Docker rented Seattle's Space Needle and the EMP Museum complex to feed and entertain all 4,000-plus guests on the evening of the summer solstice.  

Clearly, Docker’s investors are making a big bet that the company can grow beyond being the inventor of an open source standard.

Why should the networking and telecom community care about a new virtualization format at the OS level?

There is a game plan afoot to put Docker at the crossroads of application virtualization, cyber security, service orchestration, and cloud connectivity.  Docker enables applications to be packaged into a standard shipping container, so that the software inside runs the same regardless of the underlying infrastructure. Compared with virtual machines (VMs), containers launch more quickly.  The container includes the application and all of its dependencies.  Containers also make better use of the underlying servers because they share the kernel with other containers, running as isolated processes in user space on the host operating system.  The vision is to allow these shipping containers to move easily between servers or between private and public clouds.  As such, by controlling the movement of containers, you essentially control the movement of workloads locally and across the wide area network. The applications running within containers need to remain securely connected to data and processing resources from wherever the container may be located. Thus, software-defined networking becomes part of the containerization paradigm. Not surprisingly, we are seeing a lot of Silicon Valley’s networking talent move from the established hardware vendors in San Jose to the new generation of software start-ups in San Francisco, as exemplified by Docker Inc.
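The packaging idea above can be seen in a short session. This is a hedged sketch that assumes a local Docker daemon; the container and image names are illustrative:

```shell
# Sketch only: assumes a local Docker daemon is running.

# A container launches in seconds because it shares the host kernel;
# no guest operating system has to boot (unlike a VM).
docker run --rm alpine echo "hello from an isolated process"

# The same image runs unchanged on a laptop, a server, or a public cloud,
# because the application and its dependencies travel inside the image.
docker run -d --name web -p 8080:80 nginx
docker stop web && docker rm web
```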
 
The Timeline of Significant Events for Docker

Docker was started by Solomon Hykes as an internal project at dotCloud, a platform-as-a-service company based in France and founded around 2011. The initial Docker work appears to have started around 2012-2013, and the project soon grew to become the major focus of the company, which adopted the Docker name.  The official launch of Docker occurred on March 13, 2013 in a presentation by Solomon Hykes entitled “The Future of Linux Containers” at the PyCon industry conference.  Soon after, the Docker whale icon was posted and a developer community began to form.

In May 2013, dotCloud hired Ben Golub as CEO with the goal of pivoting from the PaaS business to the huge opportunity it now saw in building and orchestrating cloud containers. Previously, Golub was CEO of Gluster, another open source software company, but one focused on scale-out storage.  Gluster offered an open-source, software-based network-attached filesystem that could be installed on commodity hardware.  The Silicon Valley company successfully raised venture funding, grew its customer base quickly, and was acquired by Red Hat in 2011. 

Within three months of joining Docker, Golub established an alliance with Red Hat. A second round of venture funding, led by Greylock Partners, brought in $15 million. Headquarters were moved to San Francisco.  In June 2014, Docker 1.0 was officially released, marking an important milestone for the project.

In August 2014, Docker sold off its original dotCloud (PaaS) business to Berlin-based cloudControl; however, that operation was shut down earlier this year after a two-year struggle. Other dotCloud engineers credited with work on the initial project include Andrea Luzzardi and Francois-Xavier Bourlet. A month later, in September 2014, Docker secured $40 million in a Series C funding round that was led by Sequoia Capital and included existing investors Benchmark, Greylock Partners, Insight Ventures, Trinity Ventures, and Jerry Yang.  

In October 2014, Microsoft announced integration of the Docker engine into its upcoming Windows Server release, and native support for the Docker client on Windows.  In December 2014, IBM announced a strategic partnership with Docker to integrate the container paradigm into the IBM Cloud.  Six months later, in June 2015, IBM's Bluemix platform-as-a-service began supporting Docker containers. IBM Bluemix also supports Cloud Foundry and OpenStack as key tools for designing portable distributed applications. Additionally, IBM claims the industry's best performance of Java on Docker: IBM Java is optimized to run twice as fast and occupy half the memory when used with the IBM Containers Service. Moreover, as a Docker-based service, IBM Containers includes open features and interfaces such as the new Docker Compose orchestration services.

In March 2015, Docker acquired SocketPlane, a start-up focused on Docker-native software-defined networking. SocketPlane had been founded only a few months earlier by Madhu Venugopal, who previously worked on SDN and OpenDaylight at Cisco Systems before joining Red Hat as a Senior Principal Software Engineer.  These SDN capabilities are now being integrated into Docker.

In April 2015, Docker raised $95 million in a Series D round of funding led by Insight Venture Partners with new contributions from Coatue, Goldman Sachs and Northern Trust. Existing investors Benchmark, Greylock Partners, Sequoia Capital, Trinity Ventures and Jerry Yang’s AME Cloud Ventures also participated in the round.

In October 2015, Docker acquired Tutum, a start-up based in New York City. Tutum developed a cloud service that helps IT teams to automate their workflows when building, shipping or running distributed applications. Tutum launched its service in October 2013. 

In November 2015, Docker extended its Series D funding round by adding $18 million in new investment, bringing Docker's total funding to $180 million.

In January 2016, Docker acquired Unikernel Systems, a start-up focused on unikernel development, for an undisclosed sum. Unikernel Systems, which was based in Cambridge, UK, was founded by pioneers from Xen, the open-source virtualization platform. Unikernels are defined by the company as specialized, single-address-space machine images constructed by using library operating systems. The idea is to reduce complexity by compiling source code into a custom operating system that includes only the functionality required by the application logic. The unikernel technology, including orchestration and networking, is expected to be integrated with the Docker runtime, enabling users to choose how they ‘containerize’ and manage their applications, from the data center to the cloud to the Internet of Things.

Finally, at this year’s DockerCon conference, Docker announced that it will add built-in orchestration capabilities to its Docker Engine.  This will enable IT managers to form a self-organizing, self-healing pool of machines on which to run multi-container distributed applications – both traditional apps and microservices – at scale in production. Specifically, Docker 1.12 will offer an optional “Swarm mode” feature that users can select to “turn on” built-in orchestration, or they can elect to use either their own custom tooling or third-party orchestrators that run on Docker Engine. The upcoming Docker 1.12 release simplifies the process of creating groups of Docker Engines, known as swarms, which are now backed by automated service discovery and a built-in distributed datastore. The company said that unlike other systems, the swarm itself has no single point of failure: the state of all services is replicated in real time across a group of managers so containers can be rescheduled after any node failure. Docker orchestration includes a unique in-memory caching layer that maintains the state of the entire swarm, providing a non-blocking architecture that assures scheduling performance even during peak times. The company claims these new orchestration capabilities go above and beyond Kubernetes.
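Based on the Docker 1.12 feature set described above, turning on Swarm mode is expected to look roughly like the following. This is a sketch against the pre-release CLI, with the service name chosen purely for illustration:

```shell
# Sketch only: assumes Docker 1.12+ with the optional Swarm mode available.

# Initialize a swarm; this node becomes a manager holding replicated state.
docker swarm init

# Declare a service; the built-in orchestrator schedules the replicas
# and reschedules them automatically after a node failure.
docker service create --name web --replicas 3 nginx

# Scale the service across the self-organizing pool of engines.
docker service scale web=10
docker service ls
```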