Business Standard

Intel, while pivoting to AI, tries to protect lead

Intel has long dominated the business for central processing chips

Steve Lohr | The New York Times News Service


The computers in modern data centres — the engine rooms of the digital economy — are powered mainly by Intel chips. They animate the computing clouds of the internet giants and corporate data centres worldwide.

But Intel is now facing new competitive forces that could pose a challenge to its data-center dominance and profitability.

In particular, the rise of artificial intelligence is creating demand for new computing hardware tailored to handle vast amounts of unruly data and complex machine-learning software — and Intel’s general-purpose chips are not yet tuned for the most demanding tasks. Instead, specialised chips are delivering better performance on artificial intelligence programs that identify images, recognise speech and translate languages.

Intel is hurrying to catch the wave. On Tuesday, to deal with the changing competitive landscape, the Silicon Valley giant presented its newest data-center strategy at an event in New York, addressing its artificial intelligence plans and its mainstream data-center business. The company billed the event as its “biggest data-center launch in a decade.”

How successful Intel’s efforts prove to be will be crucial not only for the company but also for the long-term future of the computer chip industry.

“We’re seeing a lot more competition in the data-center market than we’ve seen in a long time,” said Linley Gwennap, a semiconductor expert who leads a research firm in Mountain View, Calif.

Intel has long dominated the business for central processing chips that control industry-standard servers in data centers. Matthew Eastwood, an analyst at IDC, said the company controlled about 96 per cent of such chips.

But rivals are making inroads into advanced data centers. Nvidia, a chip maker in Santa Clara, Calif., does not make Intel-style central processors. But its graphics-processing chips, used by gamers in turbocharged personal computers, have proved well suited for artificial intelligence tasks. Nvidia’s data-center business is taking off, with the company’s sales surging and its stock price nearly tripling in the last year.

Big customers like Google, Amazon and Microsoft are also working on chip designs of their own. AMD and ARM, which make central processing chips like Intel’s, are edging into the data-center market, too. IBM made its Power chip open source a few years ago, and other companies are designing prototypes.

At its New York event, Intel formally introduced the next generation of its Xeon data-center microprocessors, code-named Skylake.

And there will be a range of Xeon offerings with different numbers of processing cores, speeds, amounts of attached memory, and prices.

Yet analysts said that would represent progress along Intel’s current path rather than an embrace of new models of computing.

Stacy Rasgon, a semiconductor analyst at Bernstein Research, said, “They’re late to artificial intelligence.”

Intel disputes that characterisation, saying that artificial intelligence is an emerging market in which the company is making major investments. In a blog post last fall, Brian Krzanich, Intel’s chief executive, wrote that the company was “uniquely capable of enabling and accelerating the promise of AI.”

Intel has been working in several ways to respond to the competition in data-center chips. The company acquired Nervana Systems, an artificial intelligence start-up, for more than $400 million last year. In March, it created an artificial intelligence group, headed by Naveen G Rao, a founder and former chief executive of Nervana.

The Nervana technology, Intel has said, is being folded into its product road map. A chip code-named Lake Crest is being tested and will be available to some customers this year.

Lake Crest is tailored for programs called neural networks, which learn specific tasks by analyzing huge amounts of data. Feed millions of cat photos into a neural network and it can learn to recognize a cat — and later pick out cats by color and breed. The principle is the same for speech recognition and language translation.
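The learning loop described above can be sketched with a single artificial neuron; a real network stacks many thousands of these. The following is a minimal, purely illustrative Python example (the toy data, rule and parameters are invented here, not drawn from Intel’s or anyone’s production systems):

```python
import math
import random

random.seed(42)

# Toy dataset: a point gets label 1 when x + y > 1, else 0.
points = [(random.random(), random.random()) for _ in range(200)]
labels = [1.0 if x + y > 1.0 else 0.0 for x, y in points]

# A single "neuron": a weighted sum passed through a sigmoid.
w1, w2, b = 0.0, 0.0, 0.0

def predict(x, y):
    z = w1 * x + w2 * y + b
    return 1.0 / (1.0 + math.exp(-z))

# Learning: nudge the weights after every example so the
# prediction moves toward the correct label.
lr = 0.5
for epoch in range(200):
    for (x, y), label in zip(points, labels):
        delta = predict(x, y) - label   # gradient of the loss w.r.t. z
        w1 -= lr * delta * x
        w2 -= lr * delta * y
        b -= lr * delta

accuracy = sum((predict(x, y) > 0.5) == (label > 0.5)
               for (x, y), label in zip(points, labels)) / len(points)
print(f"accuracy: {accuracy:.2f}")
```

Swap the two-number points for millions of pixel values and stack many such units in layers, and the same nudge-the-weights principle yields the cat-recognising networks the article describes.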

Intel has also said it is working to integrate the Nervana technology into a future Xeon processor, code-named Knights Crest.

Intel’s challenge, analysts said, is a classic one of adapting an extraordinarily successful business to a fundamental shift in the marketplace.

As the dominant data-center chip maker, serving a wide array of customers with different needs, Intel has loaded more capabilities into its central processors. It has been an immensely profitable strategy: Intel had net income of $10.3 billion last year on revenue of $59.4 billion.

Yet key customers increasingly want computing designs that parcel out work to a collection of specialised chips rather than have that work flow through the central processor. A central processor can be thought of as part brain, doing the logic processing, and part traffic cop, orchestrating the flow of data through the computer.

The outlying, specialized chips are known in the industry as accelerators. They can handle certain work, like the data-heavy calculations behind machine learning, faster than a central processor. Accelerators include graphics processors, application-specific integrated circuits (ASICs) and field-programmable gate arrays (FPGAs).

A more diverse set of chips does not mean the need for Intel’s central processor disappears. The processor just does less of the work, becoming more of a traffic cop and less of a brain. If this happens, Intel’s business becomes less profitable.
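The traffic-cop arrangement described above can be sketched as a toy dispatcher in Python. Every name and handler here is invented for illustration; this is not a real driver or hardware API:

```python
# A host processor acting as "traffic cop": it routes each job to the
# execution unit suited to it and does little heavy computation itself.
ACCELERATORS = {}

def accelerator(kind):
    """Register a handler for one class of offloadable work."""
    def register(fn):
        ACCELERATORS[kind] = fn
        return fn
    return register

@accelerator("matrix")
def matrix_unit(payload):
    # Data-parallel work: every output element is independent of the
    # others, the shape of task a graphics chip or ASIC excels at.
    a, b = payload
    return [[sum(x * y for x, y in zip(row, col))
             for col in zip(*b)] for row in a]

@accelerator("logic")
def scalar_unit(payload):
    # Branchy, sequential logic stays on the general-purpose core.
    return max(payload)

def dispatch(kind, payload):
    # The host orchestrates; the registered unit does the work.
    return ACCELERATORS[kind](payload)

print(dispatch("matrix", ([[1, 2], [3, 4]], [[5, 6], [7, 8]])))
# → [[19, 22], [43, 50]]
print(dispatch("logic", [3, 1, 4, 1, 5]))
# → 5
```

The more jobs that can be routed to specialised units, the less of the computation flows through the host, which is exactly the shift that analysts say threatens the central processor’s share of the work.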

Intel is not standing still. In 2015, it paid $16.7 billion for Altera, a maker of field-programmable gate arrays, chips that gain flexibility because they can be repeatedly reprogrammed with software.

Gwennap, the independent analyst, said Intel “has a very good read on data centers and what those customers want.”

Still, the question remains whether knowing what the customers want translates into giving them what they want, if that path presents a threat to Intel’s business model and profit margins.

© 2017 The New York Times News Service
