The opinion archive provides access to past opinion stories from Communications of the ACM and other sources by date.
In 1931, the city fathers of Sunnyvale, California, came up with a unique plan to rescue their town from the doldrums of the Great Depression.
Mars has held a central place in human imagination and culture for millennia.
I'm not the first man to believe that he might fix London.
There was a time when neuroscientists could only dream of having such a problem.
In October 2008, in the early days of the last economic collapse, Sequoia Capital invited founders of technology companies to a frank meeting outlining the new global reality.
Since 1953, to be nominated for a Hugo Award, among the highest honors in science fiction and fantasy writing, has been a dream come true for authors who love time travel, extraterrestrials and tales of the imagined future.
America's youth isn't getting a decent education when it comes to the basics of technology, and a survey conducted by Google and Gallup shows why.
A revolution is under way in psychiatry. The science underpinning this discipline has in the past shifted from psychology to pharmacology, and now it is changing again.
In 1957, a few years after Francis Crick co-discovered the DNA double helix and a few years before he co-won a Nobel Prize for doing so, he published a paper on the genetic provenance of amino acids, the organic compounds that…
From Rosie, the Jetsons' robot maid, to Arnold Schwarzenegger's cyborg in The Terminator, popular culture has frequently conceived of robots as having a human-like form, complete with "eyes" and mechanical limbs. But tech reporter…
Virtual voice-controlled assistants such as Siri, Cortana and Google Now are magical.
As mobile devices continue to explore and colonize the technology landscape, their conquests are leading us to a new era, beyond search and apps.
The Making of Stanley Kubrick’s "2001: A Space Odyssey" documents in nearly scientific detail exactly that: the story of how the iconic science-fiction film came into existence, and how it predicted much of the technology we…
Twenty years ago I attended my first Def Con. I believed in a free, open, reliable, interoperable Internet: a place where anyone can say anything, and anyone who wants to hear it can listen and respond. I believed in the Hacker…
From a viewing spot in a high bay room at NASA Ames Research Center in Silicon Valley, I peer through a glass window at a cab that simulates the cockpit of a commercial aircraft.
I take back all—OK most—of the expletives I’ve ever hurled at Siri.
In the modern age of technology, it is not uncommon to come home after a long day at work or school and blow off steam by reading an e-book or watching television. Lately, however, scientists have been cautioning against using…
Last month, over a thousand scientists and tech-world luminaries, including Elon Musk, Stephen Hawking and Steve Wozniak, released an open letter calling for a global ban on offensive "autonomous" weapons like drones, which can…
Unfortunately, much of the recent outcry against artificial-intelligence weapons has been confused, conjuring robot takeovers of mankind.
Famed science-fiction writer Fredric Brown (1906–1972) delighted in creating the shortest of short stories. "Answer," published in 1954, encapsulated a prescient meditation on the future of human-machine relations within a single…
"Why shouldn't people be able to teleport wherever they want?" asks Palmer Luckey, the 22-year-old founder of Oculus VR, the virtual-reality company that Facebook bought last year for more than $2 billion.
In the run-up to the 2016 U.S. election, both Democratic and Republican presidential hopefuls are talking about cybersecurity—and specifically state-sponsored hacks.
I'm standing on the bow of what looks to be a sunken pirate ship.
December 26, 2004: It is an idyllic morning at a beachside resort in Indonesia.
In June, a father of six was shot dead on a Monday afternoon in Evanston, Ill., a suburb 10 miles north of Chicago.
In an interview, Microsoft Research scientist Cynthia Dwork describes how algorithms can learn to discriminate because they are programmed by coders who incorporate their biases.
In the ten years that I’ve been watching him, Larry Page has always wanted to play by his own rules.
Shortly after its founding, Google posted a document on its site called "Ten things we know to be true," an effort to distill its unusual corporate culture into a succinct list of prescriptions—the 10 commandments of Googliness…
Making fun of how movies screw up the future is one of the main reasons why the Internet was invented.
The fact that he couldn't feel the drill going into the back of his skull made the noise all the more terrifying.
Milestones in the history of computing from the Swiss National Supercomputing Center, Lugano.
How the Technion assimilated its international activities into its other units’ activities and, at the same time, reduced operational costs …
To organize the productive work of multiprocessor chips, it is necessary to establish an efficient distribution of computational processes between …