Another Win for Artificial Intelligence: the Turing Award
By STEVE LOHR
It’s been a banner year or so for artificial intelligence, from the recent triumph of I.B.M.’s Jeopardy-winning supercomputer to a wave of news coverage of the field, from the “Smarter Than You Think” series in The Times to The Atlantic’s March cover story.
So perhaps it is hardly surprising that the 2010 Turing Award, announced on Wednesday, went to Leslie G. Valiant, a Harvard professor whose work laid the theoretical foundations for machine learning. The Turing Award, sometimes called the Nobel of computer science, tends to highlight the two sides of computing — the here-and-now impact of the technology, and its deep roots in research.
Much of Mr. Valiant’s pioneering research in machine learning was done in the 1980s. “He certainly could have gotten the award a decade ago, but this was his moment,” observed Jonathan Kleinberg, a computer scientist at Cornell University.
In machine learning, the computer scans vast stores of data, uncovers patterns and generates rules for predicting results with increasing accuracy. Machine learning is a vital computing ingredient in modern applications like spam filters, Internet search, speech recognition and computer vision.
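The loop described above, scanning labeled data, counting patterns, and deriving a predictive rule, can be sketched in miniature. The following is an illustrative toy spam filter, not anything from Mr. Valiant's work or from any production system; the messages, labels, and function names are all made up for the example, and the pattern it learns is simply word frequency per class with a little smoothing.

```python
# Toy sketch of the machine-learning loop: scan data, count patterns,
# derive a rule for predicting labels. All data here is invented.
from collections import Counter

def train(messages):
    """Scan labeled messages and count how often each word appears per class."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = {"spam": 0, "ham": 0}
    for text, label in messages:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Apply the learned rule: pick the class whose words best match the text."""
    scores = {}
    for label in counts:
        score = 1.0
        for word in text.lower().split():
            # Add-one smoothing so unseen words don't zero out a class.
            score *= (counts[label][word] + 1) / (totals[label] + len(counts[label]))
        scores[label] = score
    return max(scores, key=scores.get)

training_data = [
    ("win money now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting at noon", "ham"),
    ("lunch with the team", "ham"),
]
counts, totals = train(training_data)
print(classify("claim your free money", counts, totals))  # prints "spam"
```

With more data, the counts sharpen and the rule's predictions improve, which is the "increasing accuracy" the paragraph above refers to; real spam filters use far richer features and models, but the scan-count-predict shape is the same.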
The award, named for Alan Turing, the British mathematician and World War II code breaker, carries a $250,000 prize. It is underwritten by Intel and Google, and administered by the Association for Computing Machinery.
Here is a list of the award winners, dating to 1966, and their citations.