The opinion archive provides access to past opinion stories from Communications of the ACM and other sources by date.
Amid the buildup to the Facebook IPO, serial entrepreneur Steve Blank was sounding an ominous warning to anyone who would listen.
If you ever come across a photograph of communist-era East Berlin, or modern Pyongyang in North Korea, the cityscapes look drab and featureless. Billboards, advertising hoardings, neon signs, shopfronts, and graffiti are conspicuous…
There is one version of Craig Venter's life story where he would’ve been a dutiful scientist at the National Institutes of Health, a respected yet anonymous researcher in genetics, perhaps.
Two weeks ago, Steve Wozniak made a public call for Apple to open its platforms for those who wish to tinker, tweak and innovate with their internals. EFF supports Wozniak's position...
Should we worry about cyberwarfare? Judging by excessively dramatic headlines in the media, very much so. Cyberwarfare, the argument goes, might make wars easier to start and thus more likely.
From all indications, it would appear that attackers are continuing to attack and malware authors are carrying on writing malware.
If you write about genetics and evolution, one of the commonest questions you are likely to be asked at public events is whether human evolution has stopped.
So I guess you've heard about the recent initial public offering that didn't turn out the way it was supposed to. The company's Wall Street advisers misjudged the market, and, on its first day of trading, the stock went a little…
Television began as a box.
"Design is a word that's come to mean so much that it's also a word that has come to mean nothing. We don't really talk about design, we talk about developing ideas and making products," says Jonathan Ive, the London-born head…
In the days of the Internet bubble of the mid to late 1990s, companies received millions of dollars of venture capital to offer products that weren't especially good—but were free.
Daniela Rus, a professor in the Department of Electrical Engineering and Computer Science, has been named the next director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), effective May 23.
Facebook is not only on course to go bust, but will take the rest of the ad-supported Web with it.
Facebook got a black eye last week when General Motors announced it would cease advertising on the platform, yanking $10 million in annual ad spending away from Mark Zuckerberg just days before the company's IPO. The move caused…
If I were empress of the Universe I would insist on every individual having a unique ID permanently attached—a barcode if you will—an implanted chip to provide an easy, fast, inexpensive way to identify individuals.
In 1984, Yale sociologist Charles Perrow published his classic book, Normal Accidents: Living with High-Risk Technologies. The odd term, "normal accident," Perrow wrote, is meant to signal that, given a system's characteristics…
Steve Crocker was there when the Internet was born.
Bre Pettis looks like a throwback. He’s got an Elvis Presley-meets-Buddy Holly thing going on with his chunky sideburns, thick-rimmed black glasses, and sculpted salt-and-pepper hair. He talks a bit yesteryear, too, in that he’s…
Much of Intel's success as a microprocessor manufacturer over the past four decades has come from the company's ability to understand and anticipate the future of technology.
In 2002, Indiana rolled out computer scoring of its 11th grade state writing exam.
Google commissioned me to write this White Paper ("First Amendment Protection for Search Engine Search Results"), so I thought I’d pass it along. I wrote the paper as an advocate, and not as a disinterested academic, but I hope…
Alarming cracks are starting to penetrate deep into the scientific edifice. They threaten the status of science and its value to society. And they cannot be blamed on the usual suspects—inadequate funding, misconduct, political…
As supercomputing makes its way through the petascale era, the future of the technology has never seemed so uncertain.
Recently I visited one of the primary ventricles of Silicon Valley's investment culture, Google Ventures.
The patent world is quietly undergoing a change of seismic proportions. In a few short years, a handful of entities have amassed vast treasuries of patents on an unprecedented scale.
Here's a question: What's bigger and far more important than Facebook? Hint: it's very low-tech and doesn't need a smartphone or even an Internet connection.
It is ironic that Steve Jobs, the man who focused on building consumer-friendly devices such as the iPod, iPhone, and iPad, is also radically reshaping the business world.
A mounting effort to transform a United Nations agency into a global Internet regulator is threatening to undo decades of policymaking that helped the Internet evolve into the open, global medium we all depend on.
Rui Viana isn't a full-time app developer and he hasn't learned how to use Apple's iOS software development kit.
I take a peek at the history of computing to see if pioneering early developments were the results of team …
Maybe the first programming language didn't really matter, because students learning programming were different.
Milestones in the history of computing from the Swiss National Supercomputing Center, Lugano.