
Thursday, May 17, 2018

Google Cloud acquires Cask for big data ingestion on-ramp

Google Cloud will acquire Cask Data Inc., a start-up based in Palo Alto, California, that offers a big data platform for enterprises. Financial terms were not disclosed.

The open source Cask Data Application Platform (CDAP) provides a data ingestion service that simplifies and automates the task of building, running, and managing data pipelines. Cask says it cuts down the time to production for data applications and data lakes by 80%. The idea is to provide a standardization and simplification layer that allows data portability across diverse environments, usability across diverse groups of users, and the security and governance needed in the enterprise.

Google said it plans to continue developing and releasing new versions of the open source CDAP.
“We’re thrilled to welcome the talented Cask team to Google Cloud, and are excited to work together to help make developers more productive with our data processing services both in the cloud and on-premise. We are committed to open source, and look forward to driving the CDAP project’s growth within the broader developer community,” stated William Vambenepe, Group Product Manager, Google Cloud.

The Cask team stated: "Over the past 6+ years, we have invested heavily in the open source CDAP available today and have deployed our technology with some of the largest enterprises in the world. We accomplished great things as a team, had tons of fun and learned so much over the years. We are extremely proud of what we've achieved with CDAP to date, and couldn't be more excited about its future."

Cask was founded by Jonathan Gray and Nitin Motgi.


Wednesday, May 9, 2018

Google Cloud intros managed in-memory data store for Redis

Google launched a public beta of a fully managed in-memory data store service for Redis.

The company says its Cloud Memorystore provides a scalable, secure, and highly available Redis service that is fully compatible with open source Redis, letting customers migrate their applications to Google Cloud Platform (GCP) with zero code changes.

Redis is an open source project implementing a distributed, in-memory key-value store. It supports rich data structures and features such as persistence, replication, and pub/sub.
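Because Cloud Memorystore is wire-compatible with open source Redis, existing client code should work unchanged. A minimal sketch using the redis-py client (pip install redis); the host IP is hypothetical, standing in for a Memorystore instance's private address:

    import redis

    # Connect to a Redis endpoint; for Cloud Memorystore this would be
    # the instance's private IP, reachable from the same VPC network.
    r = redis.Redis(host="10.0.0.3", port=6379)

    # Plain key-value operations
    r.set("page:views", 0)
    r.incr("page:views")            # atomic counter
    print(r.get("page:views"))      # b'1'

    # Richer data structures
    r.lpush("recent:logins", "alice", "bob")                   # list
    r.hset("user:42", mapping={"name": "Ada", "plan": "pro"})  # hash

    # Pub/sub: a subscriber on the "alerts" channel in another
    # process would receive this message.
    r.publish("alerts", "cache warmed")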

Tuesday, May 8, 2018

Google Cloud Platform to open region in Zurich

Google Cloud Platform (GCP) is building a new region in Zürich, Switzerland. The facility is expected to be online in the first half of 2019.

This will be GCP's sixth region in Europe, joining existing regions in the Netherlands, Belgium, Germany, and the United Kingdom, as well as a region under development in Finland. With Switzerland, GCP will have 20 regions in service or announced.




Tuesday, April 24, 2018

Megaport's SDN now extends to Google Cloud's global network

Megaport has added support for Partner Interconnect, a new Google Cloud service that allows customers to connect privately to Google Cloud Platform.

Google Cloud's Partner Interconnect is a new product in the Google Cloud Interconnect family. Last September, Google announced Dedicated Interconnect, which provides higher-speed and lower-cost connectivity than VPN, and has become the go-to solution to connect on-premises data centres with the cloud.

Megaport said it is now providing connectivity to the nearest Google edge Point of Presence at sub-rate interface speeds ranging from 50 Mbps to 10 Gbps.

"Partner Interconnect gives Google Cloud customers even more connectivity choices for hybrid environments," said, John Veizades, Product Manager, Google Cloud. "Together with Megaport, we are making it easier for customers to extend their on-prem infrastructure to the Google Cloud Platform."

"Scalable connectivity to Google Cloud Platform ensures that cloud-enabled applications perform to meet mission-critical business requirements," said Vincent English, CEO of Megaport. "Google Cloud brings tremendous value to the Megaport Ecosystem and empowers our customers to address a wide variety of business needs. We have been working with Google Cloud since our inception and we are excited to grow and evolve our integration to ensure the next generation of business growth."

Wednesday, April 4, 2018

Google Cloud adopts P4Runtime as foundational for network programming

Google Cloud will use P4Runtime as the foundation for its next generation of data center and wide-area network control-plane programming, according to a blog post from Jim Wanderer, Engineering Director at Google, and Amin Vahdat, Google Fellow.

P4 is a programming language designed to be target-independent (i.e., a program written in P4 can be compiled, without modification, to run on a variety of targets, such as ASICs, FPGAs, CPUs, NPUs, and GPUs) and protocol-independent (i.e., a P4 program can describe existing standard protocols or specify innovative, customized forwarding behaviors). P4 can be used for programmable and fixed-function devices alike. For example, it is used to capture the switch pipeline behavior under the Switch Abstraction Interface (SAI) APIs, and it is used by the ONF Stratum project to describe forwarding behavior across a variety of fixed and programmable devices. The P4 Language Consortium (P4.org), creator of the P4 programming language, recently became a project of the Open Networking Foundation (ONF) and part of the Linux Foundation portfolio.

Google, which designs and builds its own hardware switches and software, sees P4Runtime as a new way for control plane software to program the forwarding path of a switch, since it provides a well-defined API for specifying switch forwarding pipelines. The vision is that P4Runtime will be able to control any forwarding plane, from a fixed-function ASIC to a fully programmable network switch.
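To make the API concrete, here is a hedged sketch of a control-plane table write through the P4Runtime gRPC bindings (pip install p4runtime grpcio). The switch address and the numeric table, field, and action IDs are hypothetical placeholders; a real controller reads them from the P4Info file produced when the P4 program is compiled, and first opens a StreamChannel for mastership arbitration before writing:

    import grpc
    from p4.v1 import p4runtime_pb2, p4runtime_pb2_grpc

    # 9559 is the conventional P4Runtime port; the address is hypothetical.
    channel = grpc.insecure_channel("192.0.2.10:9559")
    stub = p4runtime_pb2_grpc.P4RuntimeStub(channel)

    request = p4runtime_pb2.WriteRequest()
    request.device_id = 1

    update = request.updates.add()
    update.type = p4runtime_pb2.Update.INSERT

    # Insert one entry into a hypothetical IPv4 forwarding table:
    # exact-match on a destination address, then set the egress port.
    entry = update.entity.table_entry
    entry.table_id = 33574068                 # from P4Info (hypothetical)
    match = entry.match.add()
    match.field_id = 1
    match.exact.value = b"\x0a\x00\x01\x01"   # 10.0.1.1

    action = entry.action.action
    action.action_id = 16799317               # from P4Info (hypothetical)
    param = action.params.add()
    param.param_id = 1
    param.value = b"\x00\x07"                 # egress port 7

    stub.Write(request)  # programs the switch's forwarding pipeline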

Google is working with the Open Networking Foundation (ONF) on Stratum, an open source project to implement an open reference platform for a truly "software-defined" data plane, designed and built around the P4Runtime.


https://cloudplatform.googleblog.com/

Tuesday, February 13, 2018

Wix.com is onboard with Google Cloud Platform

Wix.com, which offers website services to small businesses, has chosen G Suite productivity and collaboration apps from Google as its exclusive cloud productivity offering for its users.

Wix.com already uses the Google Cloud Platform, Google Maps API, YouTube and AdWords.

Yuval Dvir, Head of Online Partnerships at Google Cloud, said, "We are delighted to be part of Wix's growth and we will continue to help SMBs get the best of Google Cloud as part of our strategic partnership with Wix."

Sunday, December 3, 2017

Former Intel exec Diane Bryant joins Google Cloud

Diane Bryant, former Group President at Intel, has joined Google Cloud as Chief Operating Officer. She will report to Diane Greene.

Bryant is known for her leadership of Intel's Data Center Group (DCG), which she ran as executive vice president and general manager. Intel's DCG generated $17 billion in revenue in 2016. She also serves on the board of United Technologies.



Tuesday, November 21, 2017

GCP trims pricing for cloud GPUs, SSDs

The Google Cloud Platform (GCP) is trimming the cost of NVIDIA Tesla GPUs attached to on-demand Google Compute Engine virtual machines by up to 36 percent. In US regions, each K80 GPU attached to a VM is priced at $0.45 per hour while each P100 costs $1.46 per hour.

This lowers the cost of running highly parallelized compute tasks on VMs and GPUs.

GCP is also lowering the price of preemptible Local SSDs by almost 40 percent compared to on-demand Local SSDs. In the US this means $0.048 per GB-month.


Saturday, September 9, 2017

Google Cloud Platform adds Dedicated Interconnect

Google Cloud Platform (GCP) is launching Dedicated Interconnect at up to 80 Gb/s at a number of locations.

This allows customers to extend their corporate datacenter network and RFC 1918 IP space into Google Cloud as part of a hybrid cloud deployment.

Google said its Dedicated Interconnect offers increased throughput and even a potential reduction in network costs. It is also expected to bring advantages to applications with very large data sets.

Dedicated Interconnect is available in 10 Gb/s increments:

  • 10 Gb/s
  • 20 Gb/s (2 x 10 Gb/s)
  • 30 Gb/s (3 x 10 Gb/s)
  • 40 Gb/s (4 x 10 Gb/s)
  • 50 Gb/s (5 x 10 Gb/s)
  • 60 Gb/s (6 x 10 Gb/s)
  • 70 Gb/s (7 x 10 Gb/s)
  • 80 Gb/s (8 x 10 Gb/s)

Dedicated Interconnect can be configured to offer a 99.9% or a 99.99% uptime SLA.

https://cloud.google.com/interconnect/

Wednesday, May 10, 2017

Google Cloud Platform launches Northern Virginia region

Google Cloud Platform (GCP) activated its latest cloud region: Northern Virginia (us-east4). The region has three zones (data centers) and now supports GCP compute, big data, storage, and networking services.

With this addition, Google now has four regions serving the Americas: Oregon, Iowa, South Carolina, and Northern Virginia. Future regions are planned in São Paulo, Montreal, and California.

https://cloudplatform.googleblog.com/2017/05/Google-Cloud-Platform-launches-Northern-Virginia-region.html


Monday, March 27, 2017

Expanding horizons for Google Cloud Platform

The most compelling case for adopting the Google Cloud Platform is that it is the same infrastructure that powers Google's own services, which attract well over a billion users daily. This was the case presented by company executives at last week's Google Next event in San Francisco – "get on the Cloud… now", said Eric Schmidt, Executive Chairman of Alphabet, Google's parent; "Cloud is the most exciting thing happening in IT", said Diane Greene, SVP of Google Cloud.

Direct revenue comparisons between the leading companies are tricky, but many analysts place the Google Cloud Platform at No. 4 in the U.S. market behind Amazon Web Services, Microsoft Azure and IBM. Over the past three years, Google invested $29.4 billion in its infrastructure, according to Urs Hölzle, SVP of Technical Infrastructure at Google, spanning everything from efficient data centres to customised servers, customised networking gear and specialised ASICs for machine learning.

Google operates what it believes to be the largest global network, carrying anywhere from 25% to 40% of all Internet traffic. Google's backbone interconnects directly with nearly every ISP and its private network has a point of presence in 182 countries, while the company is investing heavily in ultra-high-capacity submarine cables.

The argument goes that by moving to the Google Cloud Platform (GCP), enterprise customers move directly into the fast lane of the Internet, putting their applications one hop away from any end-user ISP they need to reach, with lower latency and fewer hand-offs. Two examples of satisfied GCP customers that Google likes to cite are Pokémon GO and Snapchat, both of which took a compelling application and brought it to global scale by riding on Google's infrastructure.

One question is: does Google's global network give the Google Cloud Platform a decisive edge over its rivals? Clearly all the big players are racing to scale out their infrastructure as quickly as possible, but Google is striving to take it one step further, developing core technologies in hardware and software that other companies later follow. Examples include containers, NoSQL, serverless computing, Kubernetes, MapReduce, TensorFlow and, more recently, its horizontally scalable Cloud Spanner database, which relies on atomic clocks running in every Google data centre to keep replicas synchronised.

Highlights of Google's initiatives include:

  • New data centres: three new GCP regions (California, Montreal and the Netherlands), bringing the total number of Google Cloud regions to six; the company anticipates more than 17 locations in the future. The new regions will each feature a minimum of three zones, benefit from Google's global, private fibre network and offer a complement of GCP services.

  • Intel Skylake: GCP is the first public cloud provider to run Intel Skylake, a custom Xeon chip for compute-heavy workloads, with a larger range of VM memory and CPU options. GCP is doubling the number of vCPUs that can run in an instance from 32 to 64 and offering up to 416 Gbytes of memory. GCP is also adding GPU instances. Google and Intel are collaborating in other areas as well, including hybrid cloud orchestration, security, machine and deep learning, and IoT edge-to-cloud solutions; Intel is also a backer of Google's TensorFlow and Kubernetes open source initiatives.

  • Google Cloud Functions: a completely serverless environment and the smallest unit of compute offered by GCP; it can spin up a single function and spin it back down instantly, so billing occurs only while the function is executing, metered to the nearest one hundred milliseconds (a minimal sketch follows this list).

  • Free services: a new free tier for GCP that provides limited access to Google Compute Engine (one f1-micro instance per month in U.S. regions and 30 Gbyte-months of HDD), Google Cloud Storage (5 Gbytes a month), Google Pub/Sub (10 Gbytes of messages a month), and Google Cloud Functions (two million invocations per month).

  • Lower prices for GCE: a 5% price drop in the U.S., a 4.9% drop in Europe and an 8% drop in Japan.

  • Google BigQuery data analytics service: automated data movement from select Google applications, such as AdSense, DoubleClick and YouTube, directly into BigQuery.

  • Titan: a custom security chip that operates at the BIOS level, authenticating the hardware and services running on each server; Intel is also introducing new security tools to keep customer data secure.

  • Project Kubo toolset: a joint effort with Pivotal for packaging and managing software in a Kubernetes environment.

  • Engineering support plans: ranging from $100 per user per month to $1,500 per user per month with a 15-minute response time.

  • Data loss prevention API: to guard information inside the Google Cloud.
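To ground the Cloud Functions item above: a function is just a handler that the platform invokes on demand, with billing accruing only while an invocation runs. Below is a minimal sketch of an HTTP-triggered function using the Python runtime's Flask-style request object; the function name is illustrative, and since the Python runtime arrived after the Node.js launch runtime, this shows the model rather than the exact 2017 product:

    def hello_gcp(request):
        """Respond to an HTTP request with a small greeting.

        Args:
            request (flask.Request): the incoming HTTP request.
        Returns:
            Text that Cloud Functions wraps in an HTTP 200 response.
        """
        name = request.args.get("name", "world")
        return f"Hello, {name}!"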

The Google Next event provided a number of sessions for looking over the horizon. In a 'fireside chat', Marc Andreessen and Vint Cerf speculated on the arrival of quantum computing and neural networking/machine learning services on the public clouds. Both possibilities are likely to augment current public cloud computing models rather than replace them. The types of applications could vary; for instance, a cloud-based quantum computing service might be employed to finally secure communications.

Google is also betting big that the cloud is the ideal platform for AI. Fei-Fei Li, Chief Scientist for Cloud AI and ML at Google, observed that even a few self-driving cars put considerable data into the cloud; what happens when there are millions of such vehicles? Building on-ramps for AI is the next step, with APIs and SDKs that draw new applications onto Google's TensorFlow platform. The company discusses this in terms of 'democratising' AI, which means making sure its algorithms and cloud analytic systems become widely available before others move into this space.


A final differentiator for GCP is that Google is the largest corporate purchaser of renewable energy. In 2017, the company is on track to reach 100% renewable power for its global data centres and offices. One hopes that others will catch up soon.
