The opinion archive provides access to past opinion stories from Communications of the ACM and other sources by date.
Chris Urmson led Google's self-driving car team from its early days all the way until the company shed its Google skin and emerged under the Alphabet umbrella as Waymo, the obvious leader in driverless cars.
AI systems can sometimes be tricked into seeing something that's not actually there, as when Google's software "saw" a 3-D-printed turtle as a rifle.
The headlines about the trade wars being touched off by President Trump's new tariffs may telegraph plenty of bombast and shots fired, but the most consequential war being waged today is a quieter sort of conflict: It's the new…
Isn't technology wonderful? At Purdue University, the same IT infrastructure that enables us to manage student assignments and grades, operate residential and dining facilities, and support a leading community of scientific researchers…
Blockchain has always seemed to me like a solution looking for a problem, which isn't a criticism.
Revelations keep emerging in the Cambridge Analytica personal-data scandal, which has captured global public attention for more than a week. But when the dust settles, researchers harvesting data online will face greater scrutiny…
For a spell during 2010 and 2011, I was a virtual rancher of clickable cattle on Facebook.
At first glance, Dark Hammer looks a lot like any other science fiction comic book: On the front cover, a drone flies over a river dividing a city with damaged and burning buildings. But this short story in graphic form comes…
Fields like machine learning, data science, and computational linguistics are facing their A-bomb moment right now. Computer scientists need to confront the weaponization of their work.
We spend way too much time talking about digital currencies and not nearly enough time on digital cats.
When you listen to digital music, the harmonies and chords that you hear have probably been reconstructed from a file that stored them as components of different frequencies, broken down by a process known as Fourier analysis…
The scandal that has erupted around Cambridge Analytica's alleged harvesting of 50 million Facebook profiles assembled from data provided by a U.K.-based academic and his company is a worrying development for legitimate researchers.
The technology at the heart of cryptocurrencies like bitcoin—blockchain—has captured the world's attention, much as the internet, peer-to-peer file transfers, apps, and the cloud did before it. Simply put, blockchains are distributed…
Artificial intelligence is already making significant inroads in taking over mundane, time-consuming tasks many humans would rather not do.
If you think your on-the-job training was tough, imagine what life is like for newbie surgeons.
Geneticist David Reich used to study the living, but now he studies the dead.
"The future is already here—it's just not evenly distributed," is an often-quoted line from the brilliant science-fiction writer William Gibson.
Blockchain has been in the news lately, but beyond knowing that it has something to do with payments and digital currencies, most people don't know what blockchain is or why they should care.
Computer code written by scientists forms the basis of an increasing number of studies across many fields—and an increasing number of papers that report the results.
The world is full of connected devices—and more are coming. In 2017, there were an estimated 8.4 billion internet-enabled thermostats, cameras, streetlights and other electronics.
In person, Sarah Chadwick and Jaclyn Corin are fierce. And young.
The spread of misinformation on social media is an alarming phenomenon that scientists have yet to fully understand.
For a field that was not well known outside of academia a decade ago, artificial intelligence has grown dizzyingly fast. Tech companies from Silicon Valley to Beijing are betting everything on it, venture capitalists are pouring…
Mr. President, if you're looking for someone to demonize for killing blue-collar jobs in your favorite industries, don't blame China and "bad trade deals." Blame the robots.
Something strange, scary and sublime is happening to cameras, and it's going to complicate everything you knew about pictures. Cameras are getting brains.
Mary Shelley was 20 when she published "Frankenstein" in 1818. Two hundred years on, the book remains thrilling, challenging and relevant—especially for scientists like me whose research involves tinkering with the stuff of life…
In electronics, the past half-century has been a steady march away from analog and toward digital. Telephony, music recording and playback, cameras, and radio and television broadcasting have all followed the lead of computing…
A couple of months ago, I made a small tweak to my Twitter account that has changed my experience of the platform.
On the impact of large language models.
Until the middle of the 20th century, computers were in fact humans who performed calculations.