
Thursday, June 6, 2019

Google to acquire Looker for cloud analytics

Google agreed to acquire Looker, a start-up with a platform for business intelligence, data applications and embedded analytics, for $2.6 billion in cash.

The Looker platform integrates data at web scale. The company claims 1,700 customers, including Amazon, Etsy, IBM, Kickstarter, Lyft, Sony, Spotify and The Economist. Looker is headquartered in Santa Cruz, California. Investors included CapitalG, Kleiner Perkins Caufield & Byers, Meritech Capital Partners, Premji Invest, Redpoint Ventures and Goldman Sachs.

“Google Cloud is being used by many of the leading organizations in the world for analytics and decision-making. The combination of Google Cloud and Looker will enable customers to harness data in new ways to drive their digital transformation,” said Thomas Kurian, CEO, Google Cloud. “We remain committed to our multi-cloud strategy and will retain and expand Looker’s capabilities to analyze data across Clouds.”

“The combination of Looker and Google Cloud advances our mission that we undertook from the beginning — to empower humans through the smarter use of data,” said Frank Bien, CEO, Looker. “Now, we’ll have greater reach, more resources, and the brightest minds in both Analytics and Cloud Infrastructure working together to build an exciting path forward for our customers and partners. Together, we are reinventing what it means to solve business problems with data at an entirely different scale and value point.”

https://looker.com/

Monday, June 3, 2019

Google Cloud outage attributed to configuration change

The widescale outage experienced by Google Cloud Platform on 02-June-2019 was caused by a configuration change that was intended for a small number of servers in a single region but was mistakenly applied to a larger number of servers across several neighboring regions, according to a blog posting by Benjamin Treynor Sloss, VP, 24x7, Google.

Google says the disruption caused a 10% drop in global views of YouTube during the incident, while Google Cloud Storage measured a 30% reduction in traffic.

https://cloud.google.com/blog/topics/inside-google-cloud/an-update-on-sundays-service-disruption

In a follow-up report tracking the Google Cloud outage, ThousandEyes said its monitoring data indicate that connectivity and packet loss issues impacted Google network locations in the eastern US, including Ashburn, Atlanta and Chicago. High packet loss conditions radiated out to Google’s network edge.  However, most GCP regions were unaffected by the outage, including regions in the US as well as in Europe and elsewhere.

More: https://blog.thousandeyes.com/google-cloud-platform-outage-analysis/



Sunday, June 2, 2019

ThousandEyes captures visualization of Google Cloud outage

From its monitoring network of 249 global vantage points, ThousandEyes detected the global outage affecting Google Cloud Platform beginning around 12-12:15pm PT.

ThousandEyes experienced 100% packet loss trying to reach a service hosted in GCP US-west. All traffic dropped at the edge of Google's network.

Angelique Medina, ThousandEyes product marketing director, stated: “ThousandEyes can confirm Google’s report of network congestion as a likely root cause of Sunday’s massive 4-hour outage, as we started seeing elevated packet loss in Google’s network as early as 12pm PT between sites on the eastern US, including Ashburn, Atlanta and Chicago, and various Google-hosted services. These issues started to impact users globally approximately 20 minutes prior to their public announcement of the issue, showing an early indication of what was to come. For the majority of the duration of the 4+ hour outage, ThousandEyes detected 100% packet loss for certain Google services from 249 of our global vantage points in 170 cities around the world. Starting at around 3:30pm PT, we started to see services slowly become reachable again, and the issue appeared to fully resolve by 4:45pm PT.”

https://www.thousandeyes.com/






Monday, May 6, 2019

ServiceNow forms strategic partnership with Google Cloud

ServiceNow and Google Cloud announced a strategic partnership for delivering native support for ServiceNow’s IT Operations Management (ITOM) in Google Cloud Platform. The partnership will also deliver real-time language translation in ServiceNow’s IT Service Management  (ITSM) solution leveraging Google Cloud’s artificial intelligence (AI) and machine learning (ML) capabilities.

This partnership aims to make the Now Platform truly multi-cloud with the ability to support customers wherever their workloads reside - on-premises and across the major cloud vendors.

The companies said they are also working on additional AI and ML capabilities around document, speech and image understanding.

“This partnership is designed to accelerate our customers’ shift to the cloud and deliver digital workflows that unlock productivity and create great experiences for employees and the enterprise,” said Pablo Stern, senior vice president and general manager of IT Workflows at ServiceNow. “Joint customers will get the best of both worlds: a seamless experience that maximizes the value of cloud investments and the ability to harness the power of artificial intelligence for everyday work. I look forward to continuing the drumbeat of innovation as we work together to make the world of work, work better for people.”

“This partnership is natural; Google Cloud and ServiceNow will deeply integrate, enabling enterprises to optimize their cloud investments,” said Kevin Ichhpurani, Corporate Vice President, Global Partner Ecosystem at Google Cloud. “Our customers will be able to seamlessly connect what’s happening across IT and take action across Google Cloud.”

Wednesday, April 10, 2019

Google Cloud offers ice cold archive storage

Google Cloud announced a new archive class of Cloud Storage designed for long-term data retention as an alternative to tape storage.

Pricing starts at $0.0012 per GB per month ($1.23 per TB per month).

Access and management are performed via the same APIs used by Google's other storage classes, with full integration into object lifecycle management. Data in Cloud Storage is always redundantly stored across availability zones with 11 nines of annual durability.
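As a rough illustration of that "same APIs" point, here is a minimal sketch using the google-cloud-storage Python client; the bucket name, file name and lifecycle age are assumptions for illustration, and the "ARCHIVE" class identifier reflects the announcement rather than anything specific to this sketch.

```python
from google.cloud import storage

client = storage.Client()

# Create a bucket whose default storage class is the new archive class.
bucket = storage.Bucket(client, "example-archive-bucket")
bucket.storage_class = "ARCHIVE"
client.create_bucket(bucket, location="US")

# Objects are written with the same Blob API used for any other storage class.
blob = bucket.blob("backups/2019-04-10.tar.gz")
blob.upload_from_filename("2019-04-10.tar.gz")

# Object lifecycle management can also move aging objects into the archive class.
bucket.add_lifecycle_set_storage_class_rule("ARCHIVE", age=365)
bucket.patch()
```

The same Client, Bucket and Blob calls work unchanged for Standard, Nearline and Coldline data, which is the point of the announcement.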

Google Cloud also announced the general availability of Cloud Filestore, a managed file storage system that’s built for high performance. Cloud Filestore’s premium instances will now provide increased read performance up to 1.2 GB/s throughput and 60k IOPS.

https://cloud.google.com/blog/products/storage-data-transfer/whats-cooler-than-being-cool-ice-cold-archive-storage

Tuesday, April 9, 2019

Google Cloud teams with partners for "Open Cloud"

Google Cloud committed to bringing open source to the next level by announcing strategic partnerships with the following companies:

Confluent: Founded by the team that built Apache Kafka, Confluent builds an event streaming platform that lets companies easily access data as real-time streams.

DataStax: DataStax powers enterprises with its always-on, distributed cloud database built on Apache Cassandra and designed for hybrid cloud.

Elastic: As the creators of the Elastic Stack, built on Elasticsearch, Elastic builds self-managed and SaaS offerings that make data usable in real time and at scale for search use cases, like logging, security, and analytics.

InfluxData: InfluxData’s time series platform can instrument, observe, learn and automate any system, application and business process across a variety of use cases. InfluxDB (developed by InfluxData) is an open-source time series database optimized for fast, high-availability storage and retrieval of time series data in fields such as operations monitoring, application metrics, IoT sensor data, and real-time analytics.

MongoDB: MongoDB is a modern, general-purpose database platform that brings software and data to developers and the applications they build, with a flexible model and control over data location.

Neo4j: Neo4j is a native graph database platform specifically optimized to map, store, and traverse networks of highly connected data to reveal invisible contexts and hidden relationships. By analyzing data points and the connections between them, Neo4j powers real-time applications.

Redis Labs: Redis Labs is the home of Redis, the world’s most popular in-memory database, and commercial provider of Redis Enterprise. It offers performance, reliability, and flexibility for personalization, machine learning, IoT, search, e-commerce, social, and metering solutions worldwide.

https://cloud.google.com/blog/products/open-source/bringing-the-best-of-open-source-to-google-cloud-customers

Thursday, April 4, 2019

CoreSite offers 10G access to Google Cloud’s Dedicated Interconnect

CoreSite is now supporting Google Cloud’s Dedicated Interconnect product in both its Los Angeles and Denver markets. This service from Google Cloud allows enterprises and network service providers that are colocated with CoreSite to directly connect to Google Cloud Platform through 10 Gbps fiber interconnects.

Dedicated Interconnect offers a guaranteed uptime of 99.99%.

“CoreSite is pleased to announce the availability of direct fiber interconnection to Google Cloud Platform, providing our customers with a dedicated, flexible and high-performance solution to optimize their evolving cloud and connectivity requirements,” said Maile Kaiser, Senior Vice President of Sales at CoreSite.

In addition to Dedicated Interconnect in CoreSite’s Los Angeles and Denver markets, CoreSite customers are able to connect directly to Google Cloud Platform via the CoreSite Any2Exchange in Chicago, Denver, and Los Angeles. Additionally, customers may access Google Cloud Platform from all of CoreSite’s markets via its network-rich ecosystem of providers or through its inter-site connectivity in select markets. CoreSite offers numerous inter-site connectivity options including lit transport solutions and dedicated dark fiber.

CoreSite’s Any2Exchange is the second-largest Internet exchange in the United States and is the largest Internet exchange on the West Coast. With CoreSite’s Any2Exchange, customers can make secure, SLA-backed, low-latency connections over one port and at a variety of speeds (including 1Gbps, 10Gbps, and 100Gbps) with direct peering to Google Cloud Platform. Customers connecting directly to Google Cloud Platform benefit from reduced congestion and routing issues, reduction in transit costs, lower latency and reduced complexity, all while having access to connect and peer with over 400 participating members.

Sunday, February 10, 2019

Google encrypts Kubernetes secrets with Cloud KMS

Google Cloud, which was already encrypting data at rest by default, including data in Google Kubernetes Engine (GKE), is adding application-layer secrets encryption using the same keys in its hosted Cloud Key Management Service (KMS).

Application-layer secrets encryption, which is now in beta in GKE, protects secrets with envelope encryption: secrets are encrypted locally in AES-CBC mode with a local data encryption key, and the data encryption key is encrypted with a key encryption key managed in Cloud KMS as the root of trust.
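As a rough sketch of the envelope-encryption pattern described above (GKE performs this wrapping for you; this only shows the idea), the following uses the cryptography and google-cloud-kms Python libraries. The key resource name passed in is hypothetical, and the code is a sketch of the pattern, not Google's implementation.

```python
import os

from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from google.cloud import kms


def encrypt_secret(secret: bytes, kek_name: str) -> dict:
    """Encrypt `secret` locally with a fresh DEK, then wrap the DEK in Cloud KMS."""
    # 1. Encrypt the secret locally with AES-CBC under a random data encryption key.
    dek = os.urandom(32)
    iv = os.urandom(16)
    padder = padding.PKCS7(128).padder()
    padded = padder.update(secret) + padder.finalize()
    encryptor = Cipher(algorithms.AES(dek), modes.CBC(iv)).encryptor()
    ciphertext = encryptor.update(padded) + encryptor.finalize()

    # 2. Wrap the DEK with the key encryption key held in Cloud KMS (the root of trust).
    #    kek_name is a full Cloud KMS crypto key resource name (hypothetical here).
    client = kms.KeyManagementServiceClient()
    wrapped_dek = client.encrypt(request={"name": kek_name, "plaintext": dek}).ciphertext

    # Only the ciphertext, IV and wrapped DEK need to be stored; the plaintext DEK
    # never needs to be persisted, and the KEK never leaves Cloud KMS.
    return {"ciphertext": ciphertext, "iv": iv, "wrapped_dek": wrapped_dek}
```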

Google Cloud said the new capability provides flexibility for specific security models.

https://cloud.google.com/blog/products/containers-kubernetes/exploring-container-security-encrypting-kubernetes-secrets-with-cloud-kms


Sunday, November 18, 2018

Thomas Kurian to replace Diane Greene at Google Cloud

Thomas Kurian will succeed Diane Greene as head of Google Cloud beginning in January.

Kurian currently serves as President of Product Development at Oracle. He has an MBA from Stanford and a BS in Electrical Engineering from Princeton.

Greene has led the division since December 2015. In a Google blog post, Greene says she will now dedicate time to mentoring women entrepreneurs and focusing on education.

Thursday, October 11, 2018

Google Cloud Platform expands its enterprise networking

Google Cloud Platform introduced several additions to its enterprise networking portfolio.

Cloud NAT (beta) - a Google-managed Network Address Translation service that lets enterprises provision application instances without public IP addresses while still allowing them to access the Internet for updates, patching and configuration management in a controlled and efficient manner. Outside resources cannot directly access any of the private instances behind the Cloud NAT gateway, thereby helping to keep Google Cloud VPCs isolated and secure.

Firewall Rules Logging (beta) - allows enterprises to audit, verify, and analyze the effects of firewall rules. For example, it provides visibility into potential connection attempts that are blocked by a given firewall rule. Logging is also useful to confirm that no unauthorized connections were allowed into an application. Firewall log records of allowed or denied connections are reported every five seconds and can be exported to Stackdriver Logging, Cloud Pub/Sub, or BigQuery (see the query sketch below).

Managed TLS Certificates for HTTPS load balancers (beta) - customers can now deploy HTTPS load balancers with Google provisioning root-trusted TLS certificates and managing their lifecycle, including renewals and revocation.
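To give a flavor of the analysis side of Firewall Rules Logging, here is a minimal sketch that queries log records after they have been exported to BigQuery via a log sink. The project, dataset, table and jsonPayload field names are placeholders for wherever a particular sink lands the entries, so treat them as assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Count allowed vs. denied connections over the last day. The table name and
# the jsonPayload field layout depend on how the log export sink is configured.
query = """
    SELECT jsonPayload.disposition AS disposition, COUNT(*) AS hits
    FROM `my-project.firewall_logs.firewall_events`
    WHERE timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    GROUP BY disposition
    ORDER BY hits DESC
"""

for row in client.query(query).result():
    print(row.disposition, row.hits)
```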

https://cloud.google.com/blog/products/networking/simplifying-cloud-networking-for-enterprises-announcing-cloud-nat-and-more

Google Cloud Storage adds regional replication

Google Cloud Platform (GCP) introduced more data storage replication options within regions and between regions. This allows customers to move large data sets across regions, using Google infrastructure to ensure availability and business continuity.

Some highlights:

  • A new dual-regional option for replicating data, now available in beta, which is especially beneficial for analytics or big data workloads. Customers are able to write to a single dual-regional bucket without having to manually copy data between primary and secondary locations (see the sketch after this list).
  • Nearline and Coldline data are now geo-redundant in multi-regional locations, raising the availability of archival data stored this way.
  • A new look coming for Cloud Storage choices.
  • A new Cloud Storage C++ library so developers can customize their applications.
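As a rough sketch of how the dual-regional option changes the workflow, the following assumes the google-cloud-storage Python client; the bucket name and the "NAM4" dual-region location code are illustrative assumptions rather than details taken from the announcement.

```python
from google.cloud import storage

client = storage.Client()

# Create a bucket in a dual-region location; Cloud Storage replicates the data
# across both regions behind a single bucket name.
bucket = client.create_bucket("example-analytics-bucket", location="NAM4")

# Objects are written once -- no manual copying between a primary and a
# secondary location is needed.
blob = bucket.blob("events/2018-10-11.json")
blob.upload_from_string('{"event": "example"}', content_type="application/json")
```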



Thursday, August 16, 2018

Google Cloud Platform: Building a Hybrid Render Farm

Google Cloud Platform posted an online guide for building a render farm using its scalable architecture.

Render farms, which content creators need to output digital productions, can be very expensive to build on premises and require a lot of electricity to run.

GCP is now enabling render applications to scale from 2- or 4-core VMs all the way to a 160-core VM with up to 3844 GB of RAM. Up to 8 GPUs may be attached to any VM to create a GPU farm.

https://cloud.google.com/blog/products/gcp/building-a-hybrid-render-farm-on-gcp-new-guide-available


Monday, August 6, 2018

Google Cloud expands its virtual workstations with NVIDIA Tesla GPUs

Google Cloud Platform began offering virtual workstations optimized for graphics-intensive applications and machine learning inference based on the NVIDIA Tesla P4 GPU.

The new support enables users to turn any instance with one or more GPUs into a high-end workstation. P4s offer 8 GB of GDDR5 memory.

https://cloud.google.com/


Wednesday, July 25, 2018

Google expands its cloud database capabilities

Google Cloud Platform (GCP) is expanding its portfolio of managed database services and announcing new cloud partnerships. Here are the highlights:

  • Oracle workloads are now supported in GCP
  • SAP HANA workloads can run on GCP persistent-memory VMs
  • Cloud Firestore is launching for all users developing cloud-native apps. Cloud Firestore, a serverless NoSQL document database, brings the ability to store and sync app data at global scale (see the sketch after this list).
  • Regional replication and a visualization tool are now available for Cloud Bigtable, a high-throughput, low-latency, and massively scalable NoSQL database.
  • Cloud Spanner updates, by popular demand
  • GCP is partnering with Intel and SAP to offer Compute Engine virtual machines backed by the upcoming Intel Optane DC Persistent Memory for SAP HANA workloads. GCP is scaling up its instance size roadmap for SAP HANA production workloads from a max of 4TB currently to 12TB of memory by next summer, and 18TB of memory by the end of 2019.
  • Google Compute Engine gains additional resource-based pricing options. With resource-based pricing, Google will add up all the resources you use across all your machines into a single total and then apply a usage discount. 
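As a taste of the Cloud Firestore model referenced in the list above, here is a minimal sketch using the google-cloud-firestore Python client; the collection, document and field names are purely illustrative.

```python
from google.cloud import firestore

db = firestore.Client()

# Documents are schemaless maps organized into collections.
doc_ref = db.collection("users").document("alovelace")
doc_ref.set({"first": "Ada", "last": "Lovelace", "born": 1815})

# Reads return snapshots that convert directly to dictionaries.
snapshot = doc_ref.get()
print(snapshot.to_dict())
```

Because the service is serverless, there are no database instances to size or replicate; the "store and sync at global scale" behavior sits behind these simple client calls.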

Tuesday, July 24, 2018

Google builds its Cloud Services Platform

Google outlined its vision for integrated cloud services running over on-premises equipment as well as its own, massively scalable cloud infrastructure. At its Google Next conference in San Francisco, Google presented its Cloud Services Platform, which is architecturally aligned with the joint hybrid cloud products being developed with Cisco.

So far, the Cloud Services Platform includes the following:


  • Service mesh: Google is announcing an Istio service for managing services within a Kubernetes Engine cluster. Google is releasing Istio 1.0 in open source, Managed Istio, and Apigee API Management for Istio.
  • Hybrid computing: Google Kubernetes Engine (GKE) On-Prem will provide multi-cluster management
  • Policy enforcement: GKE Policy Management, to take control of Kubernetes workloads
  • Ops tooling: Stackdriver Service Monitoring
  • Serverless computing: GKE Serverless add-on and Knative, an open source serverless framework. Essentially, Google is rolling out serverless containers, which allow customers to run container-based workloads in a fully managed environment and to pay only for actual usage. 
  • Developer tools: Cloud Build, a fully managed CI/CD platform
Google is also rolling out Cloud Functions, an event-driven compute service with an SLA, support for Python 3.7 and Node.js 8, and networking and security controls. Cloud Functions can be tied into more than 20 GCP services such as BigQuery, Cloud Pub/Sub, machine learning APIs, G Suite and Google Assistant.
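For illustration, a minimal HTTP-triggered function targeting the new Python 3.7 runtime might look like the sketch below; the function name and payload handling are assumptions for the example, not part of Google's announcement.

```python
def hello_http(request):
    """HTTP-triggered Cloud Function (Python 3.7 runtime).

    Args:
        request (flask.Request): the incoming HTTP request.
    Returns:
        Text that the Functions framework turns into an HTTP response.
    """
    payload = request.get_json(silent=True) or {}
    name = payload.get("name", "world")
    return f"Hello, {name}!"
```

Deployment, scaling and the networking and security controls mentioned above are handled outside the code, for example through the gcloud functions deploy command.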

Google is introducing its third-generation Cloud TPUs (Tensor Processing Units).

Google is rapidly expanding its Cloud AutoML (machine learning) capabilities, which are now in public beta testing.


Cisco and Google Partner on New Hybrid Cloud Solution

Cisco and Google Cloud have formed a partnership to deliver a hybrid cloud solution that enables applications and services to be deployed, managed and secured across on-premises environments and Google Cloud Platform. The pilot implementations are expected to be launched early next year, with commercial rollout later in 2018.

The main idea is to deliver a consistent Kubernetes environment for both on-premises Cisco Private Cloud Infrastructure and Google’s managed Kubernetes service, Google Container Engine.

The companies said their open hybrid cloud offering will provide enterprises with a way to run, secure and monitor workloads, thus enabling them to optimize their existing investments, plan their cloud migration at their own pace and avoid vendor lock in.

Cisco and Google Cloud hybrid solution highlights:


  • Orchestration and Management – Policy-based Kubernetes orchestration and lifecycle management of resources, applications and services across hybrid environments
  • Networking – Extend network policy and configurations to multiple on-premises and cloud environments
  • Security – Extend security policy and monitor application behavior
  • Visibility and Control – Real-time network and application performance monitoring and automation
  • Cloud-ready Infrastructure – Hyperconverged platform supporting existing application and cloud-native Kubernetes environments
  • Service Management with Istio – Open-source solution provides a uniform way to connect, secure, manage and monitor microservices
  • API Management – Google's Apigee enterprise-class API management enables legacy workloads running on premises to connect to the cloud through APIs
  • Developer Ready – Cisco's DevNet Developer Center provides tools and resources for cloud and enterprise developers to code in hybrid environments
  • Support – Joint coordinated technical support for the solution

Tuesday, July 3, 2018

Diane Bryant departs Google Cloud

Diane Bryant has stepped down as Chief Operating Officer for Google Cloud, where she reported to Diane Greene. Bryant joined Google Cloud in December 2017.

Bryant was formerly Group President at Intel and is known for her leadership of Intel’s Data Center Group (DCG) as general manager and executive vice president. Intel’s DCG generated $17 billion in revenue in 2016.

Tuesday, June 26, 2018

Google Cloud readies launch in Los Angeles

Google Cloud Platform is ready to launch a new Los Angeles cloud region next month, joining its current US regions of Oregon, Iowa, South Carolina and northern Virginia.

The new Los Angeles cloud region will target the aerospace, music, media and entertainment industries of southern California.

"Los Angeles is a global hub for fashion, music, entertainment, aerospace, and more—and technology is essential to strengthening our status as a center of invention and creativity,” said Los Angeles Mayor Eric Garcetti. “We are excited that Google Cloud has chosen Los Angeles to provide infrastructure and technology solutions to our businesses and entrepreneurs.”

https://cloudplatform.googleblog.com/

Google Cloud adds high-performance file storage

Google Cloud Platform (GCP) announced a new high-performance storage service for users who need to create, read and write large files with low latency.

The new Cloud Filestore is managed file storage for applications that require a file system interface and a shared file system. It gives users a simple, integrated, native experience for standing up fully managed network-attached storage (NAS) with their Google Compute Engine and Kubernetes Engine instances.

Two tiers are offered: a high-performance Premium tier at $0.30 per GB per month, and a midrange-performance Standard tier at $0.20 per GB per month in us-east1, us-central1, and us-west1 (other regions vary).

https://cloudplatform.googleblog.com/

Tuesday, May 29, 2018

AT&T NetBond brings direct connect to Google Cloud Platform

AT&T and Google Cloud announced two areas of collaboration.

First, business customers will be able to use AT&T NetBond for Cloud to connect in a highly secure manner to Google Cloud Platform. Google's Partner Interconnect offers organizations private connectivity to GCP and allows data centers geographically distant from a Google Cloud region or point of presence to connect at up to 10 Gbps. Google has joined more than 20 leading cloud providers in the NetBond® for Cloud ecosystem, which gives access to more than 130 different cloud solutions.

Second, G Suite, which is Google's cloud-based productivity suite for business including Gmail, Docs and Drive, is now available through AT&T Collaborate, a hosted voice and collaboration solution for businesses.

"We're committed to helping businesses transform through our edge-to-edge capabilities. This collaboration with Google Cloud gives businesses access to a full suite of productivity tools and a highly secure, private network connection to the Google Cloud Platform," said Roman Pacewicz, chief product officer, AT&T Business. "Together, Google Cloud and AT&T are helping businesses streamline productivity and connectivity in a simple, efficient way."

"AT&T provides organizations globally with secure, smart solutions, and our work to bring Google Cloud's portfolio of products, services and tools to every layer of its customers' business helps serve this mission," said Paul Ferrand, President Global Customer Operations, Google Cloud. "Our alliance allows businesses to seamlessly communicate and collaborate from virtually anywhere and connect their networks to our highly-scalable and reliable infrastructure."

Thursday, May 17, 2018

Google Cloud acquires Cask for big data ingestion on-ramp

Google Cloud will acquire Cask Data Inc., a start-up based in Palo Alto, California, that offers a big data platform for enterprises. Financial terms were not disclosed.

The open source Cask Data Application Platform (CDAP) provides a data ingestion service that simplifies and automates the task of building, running, and managing data pipelines. Cask says it cuts down the time to production for data applications and data lakes by 80%. The idea is to provide a standardization and simplification layer that allows data portability across diverse environments, usability across diverse groups of users, and the security and governance needed in the enterprise.

Google said it plans to continue to develop and release new versions of the open source Cask Data Application Platform (CDAP).
“We’re thrilled to welcome the talented Cask team to Google Cloud, and are excited to work together to help make developers more productive with our data processing services both in the cloud and on-premise. We are committed to open source, and look forward to driving the CDAP project’s growth within the broader developer community,” stated William Vambenepe, Group Product Manager, Google Cloud.

The Cask team stated: “Over the past 6+ years, we have invested heavily in the open source CDAP available today and have deployed our technology with some of the largest enterprises in the world. We accomplished great things as a team, had tons of fun and learned so much over the years. We are extremely proud of what we’ve achieved with CDAP to date, and couldn’t be more excited about its future.”

Cask was founded by Jonathan Gray and Nitin Motgi.

