The opinion archive provides access to past opinion stories from Communications of the ACM and other sources by date.
Imagine a supercomputer so advanced that it could hold the contents of a human brain.
About a decade and a half ago, the neuroscience world got super-stoked about a sexy new way to look at living brains: functional magnetic resonance imaging.
In 2005, Google bought a tiny mobile software company named Android, and almost nobody in the technology industry saw its potential—not even Eric Schmidt, Google's chairman and then chief executive.
On March 27th, an American astronaut named Scott Kelly blasted off from Earth and, six hours later, clambered onto the International Space Station.
Ever since Jim Balsillie and Mike Lazaridis stepped down as co-chairmen and co-chief executives of BlackBerry, neither has spoken much in public about the once-dominant smartphone maker's fall into near market obscurity.
Even the most creative jobs have parts that are pretty routine—tasks that, at least in theory, can be done by a machine. Take, for example, being a reporter.
Java synthesized sound ideas, repackaging them in a practical format that turned on a generation of coders.
The rest of the month is setting up to be pretty dramatic in the Senate.
"Human beings are ashamed to have been born instead of made," wrote the philosopher Günther Anders in 1956. Our shame has only deepened as our machines have grown more adept.
Technology has knocked the bottom rung out of the employment ladder, which has sent youth unemployment around the globe skyrocketing and presented us with a serious economic dilemma.
The perfectibility of the human mind is a theme that has captured our imagination for centuries—the notion that, with the right tools, the right approach, the right attitude, we might become better, smarter versions of ourselves…
There are two stories people are trying to tell right now about the future of the Internet.
Border patrol agents have Predator drones at their disposal, and using them has the potential to become a serious breach of privacy — but it also could be a terrific tool for other needs, if it's done right.
From the self-checkout aisle of the grocery store to the sports section of the newspaper, robots and computer software are increasingly taking the place of humans in the workforce.
With the first papers appearing in the literature that describe CRISPR-Cas9 engineering of human reproductive cells, are we at a new Asilomar moment?
"Here let's zoom in so you can see your Kerbal floating above Kerbin," my boyfriend suggests before hitting the "M" key on his keyboard.
Internet pioneer Vinton G. Cerf warned Thursday that political and technological forces threaten universal access and integrity, which he described as the foundation of the Internet's value.
Theodore Beale had a big day when the nominations for science fiction's annual Hugo Awards were announced last month: He received two nominations for his editing work, and nine stories and books from Castalia House, the tiny…
I am proud to be one of the 17 founders/authors of the Agile Manifesto back in 2001. But in the 14 years since then, we've lost our way. How did we get into this mess?
On April 19, 1965, just over 50 years ago, Gordon Moore, then the head of research for Fairchild Semiconductor and later one of the co-founders of Intel, was asked by Electronics Magazine to submit an article predicting what…
Five years ago, on the afternoon of May 6, 2010, the Dow and the S. & P. fell more than six per cent in a matter of minutes, losing a trillion dollars in value.
The White House is adding one of the tech policy world's most valuable players to its roster: Princeton Professor Ed Felten.
Gordon Moore appeared in person to talk about the 50th anniversary of Moore's Law, the prediction he made that has fueled the tech industry and driven the engineering community to continuously improve the electronics that…
In the late 20th century, while the blue-collar working class gave way to the forces of globalization and automation, the educated elite looked on with benign condescension.
Last summer, after months of encrypted emails, I spent three days in Moscow hanging out with Edward Snowden for a Wired cover story.
A federal appeals court ruling that the National Security Agency's collection of millions of Americans' phone records is illegal could undercut more than just that program.
After decades as a sci-fi staple, artificial intelligence has leapt into the mainstream.
There is a lot to praise in the powerful ruling issued by a three-judge federal appeals panel in New York on Thursday, which held that the government's vast, continuing and, until recently, secret sweep of Americans’ phone records…
In a 60-by-60-foot room in Salt Lake City, Ken Bretschneider is taking virtual reality experiences to another level.
In the stylish new sci-fi thriller Ex Machina, Frankenstein's old theme re-emerges in a beautifully designed setting: Instead of the Gothic castle we have a spectacular estate in a vast mountainous wilderness, home of the recluse…
Babbage wanted to control his analytical engine, regarded as the ancestor of the modern-day computer, with punched cards.
In this blog, we describe our vision for a journal that would focus on data science education from the interdisciplinarity …