Facebook open sources its AI hardware as it races Google

In case you haven’t been following, Artificial Intelligence (AI) is the new currency in Silicon Valley.

Wired reports on Facebook open sourcing its AI hardware as it races Google for serious engineering talent. The community of researchers who excel at deep learning is relatively small. As a result, Google and Facebook are locked in an industry-wide battle for top engineers.

“At Google, this tech not only helps the company recognize the commands you bark into your Android phone and instantly translate foreign street signs when you turn your phone their way. It helps drive the Google search engine, the centerpiece of the company’s online empire. At Facebook, it helps identify faces in photos, choose content for your News Feed, and even deliver flowers ordered through M, the company’s experimental personal assistant. All the while, these two titans hope to refine deep learning so that it can carry on real conversations—and perhaps even exhibit something close to common sense.”

A couple of weeks ago Google introduced Tensorflow, the Android of Artificial Intelligence. Facebook has now announced Big Sur, the code name for a machine packed with an enormous number of graphics processing units, or GPUs—chips particularly well suited to deep learning.

The Internet’s largest services typically run on open source software. “Open source is the currency of developers now,” says Sean Stephens, the CEO of a software company called Perfect. “It’s how they share their thoughts and ideas. In the closed source world, developers don’t have a lot of room to move.” And as these services shift to a new breed of streamlined hardware better suited to running enormous operations, many companies are sharing their hardware designs as well.

Facebook is the poster child for this movement. In 2011, after years of sharing important software, the company started sharing hardware designs, seeding what it calls the Open Compute Project, a way for any company to share and collaborate on hardware.

“After 18 months of development, Big Sur is twice as fast as the previous system Facebook used to train its neural networks. That means it can train twice as many neural networks in the same amount of time—or train networks that are twice as large. In short, Facebook can achieve a greater level of AI at a quicker pace. “The bigger you make the neural nets, the better they will work,” Yann LeCun says. “The more data you get them, the better they will work.” And since deep neural nets serve such a wide variety of applications—from face recognition to natural language understanding—this single system design can significantly advance the progress of Facebook as a whole.”

But according to LeCun, there are bigger reasons for open sourcing Big Sur and other hardware designs. For one thing, this can help reduce the cost of the machines. If more companies start using the designs, manufacturers can build the machines at a lower cost. And in a larger sense, if more companies use the designs to do more AI work, it helps accelerate the evolution of deep learning as a whole—including software as well as hardware. So, yes, Facebook is giving away its secrets so that it can better compete with Google—and everyone else.

Read the original article here.

Related posts on Artificial Intelligence on The Futures Agency blog

The Future of Connected Music by Rudy de Waele (slides + video)

The Deep Learning Gold Rush of 2015

The future of work – interview with Rudy de Waele (video)

Google introduces Tensorflow, the Android of Artificial Intelligence

2 great podcasts on Artificial Intelligence

Artificial Emotional Intelligence

Deep Shift: Technology Tipping Points and Societal Impact (WEF Report)

The Consequences of Machine Intelligence

These Technologies Will Shift the Global Balance of Power in the Next 20 Years

The best of The Futures Agency’s Artificial Intelligence Archive

Posted by Rudy de Waele aka @mtrends / shift2020.com
