Wednesday, August 15, 2018

FWD: Intel focuses on Data-Centric Innovation

For months, many market watchers have described Intel as complacent amid the surge of new products from companies such as ARM, NVIDIA, and Xilinx. This week, Intel is hitting back. At an all-day media and analyst event at its headquarters in Santa Clara, California, Intel executives laid out their plans for extending the strength of the Xeon product family in data centers to the new frontiers of AI, network transformation, 5G, and supercomputing.

Navin Shenoy, executive vice president at Intel, kicked off the event with a roadmap showing how Xeon came to dominate the server business and where it is headed next.

The big takeaways from the event are:

(1) the new frontiers are enormous economic opportunities for silicon developers 
(2) Intel will integrate its Optane persistent memory with its Xeon processors 
(3) Intel is working directly with hyperscale cloud service providers to develop custom silicon 
(4) Intel is entering the SmartNICs business 
(5) the edge presents the opportunity to rebuild the central telco office

Video from all ten of the Data-Centric Summit keynotes will be posted later this week to the Intel investor relations website.

Here are some observations from the event.

Due to new projections for autonomous vehicles and the impact of its Optane memory technology, Intel is raising its forecast for the total addressable market.

Network traffic trends are inescapable, especially east-west traffic in data centers.

The market for network logic silicon could be worth $24 billion in just four years. Intel's share of silicon for the networking market is growing, but it was still under 19% in 2017.

The biggest point of differentiation for Intel will be to leverage its Optane 3D memory technology to change the current compute-storage-network paradigm.

Intel's goal is not just faster Xeon processors, but better ways of storing and moving data.

The next iteration of Xeon, codenamed Cascade Lake, is optimized for AI and pairs with Optane persistent memory. Intel expects an 11X improvement in inference performance.

Here is a sneak peek at the roadmap.

The connectivity opportunity is growing at a 25% CAGR. It gives Intel a window to capture a greater share of the overall data center architecture.
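To put a 25% CAGR in perspective, compound growth at that rate more than triples a market in five years and roughly doubles it every three. A minimal sketch of the arithmetic (the starting figure below is an illustrative index, not an Intel number):

```python
def grow(value, cagr, years):
    """Project a value forward at a compound annual growth rate."""
    return value * (1 + cagr) ** years

# Index a hypothetical market at 100 today and compound at 25% per year.
base = 100.0
print(round(grow(base, 0.25, 3), 1))  # ~195.3 -- nearly doubled in 3 years
print(round(grow(base, 0.25, 5), 1))  # ~305.2 -- more than tripled in 5 years
```

The same formula underlies any CAGR claim: final = initial × (1 + rate)^years.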

to be continued