The battle for artificial-intelligence expertise is forcing Apple
Inc. to grapple with its famous penchant for secrecy, as tech companies
seek to woo talent in a discipline known for its openness.
The technology giant this year has been trying to draw attention, but only so much, to its efforts to develop artificial intelligence, or AI, a term that generally describes software that enables computers to learn and improve functions on their own.
Apple launched a public blog in July to talk about its work, for example, and has allowed its researchers to speak at several conferences on artificial intelligence, including a TED Talk
in April by Tom Gruber, co-creator of Apple’s Siri voice assistant, that was posted on YouTube last month.
Talking up transparency is unusual for a company whose chief executive, Tim Cook, once joked that it is more secretive than the Central Intelligence Agency.
The shift is driven by AI’s growing importance in areas like self-driving cars and voice assistants such as Siri. Rivals including Alphabet
Inc. and Facebook
Inc. have been racing for years to gather talent in the field, largely by recruiting Ph.D. students and professors from university computer-science departments.
Those academics say they want to join companies but still publish regularly, present research and discuss their work.
“We come from a community where we share ideas and get credit for it, and a lot of us would be very unhappy to give that up,” said Noah Goodman, a Stanford University professor of computer science.
He works with a research division of Uber Technologies Inc. where he enjoys those perks.
Indeed, many big tech companies
have embraced academia’s relative transparency. They have aggressively recruited top researchers over the years such as Yann LeCun of New York University, who joined Facebook
in 2013, and Geoffrey Hinton of the University of Toronto, who joined Alphabet’s Google
unit in 2013. The companies together also have churned out hundreds of research papers over the past several years.
Apple was slow to follow, AI analysts and leading researchers say. And even since its public embrace of greater transparency, it has published a fraction of its competitors’ research, and its scientists have avoided speaking about Apple-related research at conferences.
To date, the company has published portions of four peer-reviewed research papers on its blog, the Apple
Machine Learning Journal. The three posts published this year are attributed to the Siri team and don’t name any individual researchers the way academic papers commonly do.
At a San Francisco conference in March on using AI in autonomous vehicles, Apple
research scientist Charlie Tang gave a presentation on robotics, but the photo he showed was from Google.
He didn’t specifically mention any of Apple’s work.
“We want to open communication with the [artificial intelligence] community,” Mr. Tang said in an interview afterward, before directing questions about that strategy to Apple.
Much of Apple’s revenue comes from products like the iPhone or iPad, which are held in strict secrecy before their launch to protect innovations, and “overcoming that [culture] is difficult,” said Jack Clark, who heads strategy at OpenAI, a nonprofit artificial-intelligence research group. He added that Apple’s blog was a positive step for the company.
An Apple spokesman declined to comment on Mr. Clark’s remarks.
Though guarded about products, Apple
is a longstanding member of industry standards groups like the World Wide Web Consortium and has contributed to open-source projects over the years.
Competitive concerns are one reason companies might want to be careful about discussing AI work. But Dr. Goodman of Stanford said companies generally don’t need to worry about losing their competitive edge, because the algorithms published in papers work only with proprietary data and so remain essentially locked up.
Apple’s continued restraint has stoked skepticism about its pledged transparency and doubts about its ability to recruit researchers. Tom Austin, an analyst in AI at research firm Gartner Inc., said Apple
would struggle to “succeed with a strategy that’s bottled up.”
Apple in October named Carnegie Mellon University professor Ruslan Salakhutdinov as its director of AI research. He joined Carlos Guestrin, a University of Washington professor whose company Apple
acquired in August 2016.
Dr. Salakhutdinov announced at an artificial-intelligence conference in December that Apple
intended to be more open and would start publishing. That month, Cornell University Library published Apple’s first research paper since Dr. Salakhutdinov’s arrival, on improving image recognition.
In January, Apple
joined Facebook, Microsoft
and others as a member of the Partnership on AI, a group committed to developing best practices for research.
In an interview earlier this year, Dr. Salakhutdinov said Apple
would publish more, but declined to say how much. “You can have quantity, but producing high-quality research is very important,” he said.
Dr. Goodman of Stanford said that contrast has made joining Apple
feel “like a one-way move into industry whereas the AI labs like Google
feel more permeable.”
Though Apple’s public research has been limited, Manuela Veloso, a computer-science
professor at Carnegie Mellon, advises her students to consider jobs that offer the opportunity to influence consumer products, an area where Apple
has an advantage.
“If you do this research at Apple, it’s their prerogative to have their [intellectual property],” Dr. Veloso said.
The Wall Street Journal