The 14th International Congress of Logic, Methodology and Philosophy of Science, held last July, included a special symposium on the subject of "What is an algorithm?" This may seem to be a strange question to ask just before the Turing Centenary Year. Didn't Turing answer this question decisively?
Moshe Y. Vardi
Artificial Intelligence: Past and Future
The most dramatic chess match of the 20th century was the May 1997 rematch between the IBM supercomputer Deep Blue and world champion Garry Kasparov, which Deep Blue won. While this victory was considered by many a triumph for artificial intelligence, John McCarthy, who coined the very name of the field, was rather dismissive of this accomplishment.
Leibniz conceived of a universal mathematical language in which all human knowledge could be expressed, and of calculational rules, carried out by machines, to derive all logical relationships. This conception of computing captures, I believe, the essence of our field.
For almost 50 years we have been riding the exponential curve of Moore's Law. Oh, what a ride it has been! No other technology has ever improved at a geometric rate for decades. But exponential trends always slow down, and the end of "Moore's Party" may be near.
I recently attended a rather theoretical computer-science conference, and sat, as is my habit, in the front row. The speaker was trying to convey the fine details of a rather intricate mathematical construction. I was hopelessly lost.
On June 16, 1902, philosopher Bertrand Russell sent a letter to Gottlob Frege in which he argued that Frege's logical system was inconsistent. The letter launched a "Foundational Crisis" in mathematics, triggering an almost anguished search for proper foundations for mathematics.
Technology Has Social Consequences
A conference paper submission constitutes privileged communication. In theory, reviewers should immediately "forget" what they have read. How could a program committee member not know the fundamental rules of scholarly reviewing?
Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer tells the gripping story of how Xerox invented personal-computing technology in the 1970s, and then "miscalculated and mishandled" the opportunity to exploit it fully.
Where Have All the Workshops Gone?
My initiation into the computing-research community was a workshop on "Logic and Databases" in 1979. I was the only graduate student attending. In spite of the informality of the event, I was quite in awe of the senior researchers there.
On P, NP, and Computational Complexity
While the P vs. NP quandary is a central problem in computer science, its resolution may have limited practical impact.