Prepare for Artificial Intelligence to Produce Less Wizardry

Category: IT Tech · Published: 5 years ago


Early last year, a large European supermarket chain deployed artificial intelligence to predict what customers would buy each day at different stores, to help keep shelves stocked while reducing costly spoilage of goods.

The company already used purchasing data and a simple statistical method to predict sales. With deep learning, a technique that has helped produce spectacular AI advances in recent years, and additional data including local weather, traffic conditions, and competitors’ actions, the company cut the number of errors by three-quarters.

It was precisely the kind of high-impact, cost-saving effect that people expect from AI. But there was a huge catch: The new algorithm required so much computation that the company chose not to use it.

“They were like, ‘well, it is not worth it to us to roll it out in a big way,’ unless cloud computing costs come down or the algorithms become more efficient,” says Neil Thompson, a research scientist at MIT who is assembling a case study on the project. (He declined to name the company involved.)

The story highlights a looming problem for AI and its users, Thompson says. Progress has been both rapid and dazzling in recent years, giving us clever game-playing programs, attentive personal assistants, and cars that navigate busy roads by themselves. But such advances have hinged on throwing ever more computing resources at the problems.

In a new research paper, Thompson and colleagues argue that it is, or will soon be, impossible to increase computing power at the same rate in order to continue these advances. This could jeopardize further progress in areas including computer vision, translation, and language understanding.

AI’s appetite for computation has risen remarkably over the past decade. In 2012, at the beginning of the deep learning boom, a team at the University of Toronto created a breakthrough image-recognition algorithm using two GPUs (a specialized kind of computer chip) over five days. Fast-forward to 2019, and it took six days and roughly 1,000 special chips (each many times more powerful than the earlier GPUs) for researchers at Google and Carnegie Mellon to develop a more modern image-recognition algorithm. A translation algorithm, developed last year by a team at Google, required the rough equivalent of 12,000 specialized chips running for a week. By some estimates, it would cost up to $3 million to rent this much computer power through the cloud.
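The scale of that jump is easy to check with back-of-the-envelope arithmetic. The sketch below uses only the figures quoted above; the tenfold per-chip speedup is a hypothetical stand-in for "many times more powerful," not a number reported in the article.

```python
# Chip-days for the two image-recognition training runs cited above.
gpu_days_2012 = 2 * 5        # two GPUs running for five days
chip_days_2019 = 1000 * 6    # ~1,000 specialized chips for six days

raw_growth = chip_days_2019 / gpu_days_2012
print(f"raw chip-days grew {raw_growth:.0f}x")  # 600x

# "Each many times more powerful": assume a 10x per-chip speedup (hypothetical).
assumed_chip_speedup = 10
effective_growth = raw_growth * assumed_chip_speedup
print(f"with the assumed per-chip speedup: {effective_growth:.0f}x")  # 6000x
```

Even the conservative raw count implies a 600-fold rise in hardware-time over seven years, far faster than chips themselves improved.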

“Deep neural networks are very computationally expensive,” says Song Han, an assistant professor at MIT who specializes in developing more efficient forms of deep learning and is not an author on Thompson’s paper. “This is a critical issue.”

Han’s group has created more efficient versions of popular AI algorithms using novel neural network architectures and specialized chip architectures, among other things. But he says there is “still a long way to go” to make deep learning less compute-hungry.
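One widely used ingredient in this kind of efficiency work is pruning: zeroing out a network's smallest-magnitude weights so that far fewer multiplications are needed at inference time. The toy sketch below illustrates the idea in plain Python on a flat list of weights; it is a generic illustration of magnitude pruning, not code from Han's group, and the `magnitude_prune` helper is a name invented here.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the given fraction of weights with the smallest magnitudes."""
    k = int(len(weights) * sparsity)  # how many weights to drop
    # Rank weight indices from smallest to largest absolute value.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    drop = set(order[:k])             # indices of the smallest-magnitude weights
    return [0.0 if i in drop else w for i, w in enumerate(weights)]

w = [0.8, -0.05, 0.6, 0.01, -0.9, 0.02, 0.4, -0.03]
pruned = magnitude_prune(w, sparsity=0.5)
print(pruned)  # [0.8, 0.0, 0.6, 0.0, -0.9, 0.0, 0.4, 0.0]
```

The catch, and part of why there is "still a long way to go," is that zeroed weights only save real compute when the hardware and software can skip them, which is one motivation for the specialized chip architectures mentioned above.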

Other researchers have noted the soaring computational demands. The head of Facebook’s AI research lab, Jerome Pesenti, told WIRED last year that AI researchers were starting to feel the effects of this computation crunch.

Thompson believes that, without clever new algorithms, the limits of deep learning could slow advances in multiple fields, affecting the rate at which computers replace human tasks. “The automation of jobs will probably happen more gradually than expected, since getting to human-level performance will be much more expensive than anticipated,” he says. “Slower automation might sound good from a jobs perspective,” he adds, but it will also slow gains in productivity, which are key to raising living standards.

In their study, Thompson and his co-authors looked at more than 1,000 AI research papers outlining new algorithms. Not all of the papers detailed the computational requirements, but enough did to map out the cost of progress. The history suggested that making further advances in the same way will be all but impossible.

