Contributed article — DOI: 10.1145/2001269.2001297
Don’t Turn Social Media Into Another ‘Literary Digest’ Poll
Daniel Gayo-Avello
Content published in microblogging systems like Twitter can be data-mined to take the pulse of society. Indeed, a number of studies have praised the value of relatively simple approaches to sampling, opinion mining, and sentiment analysis. In this article, the author plays devil’s advocate, detailing a study conducted in late 2008/early 2009 in which such simple approaches largely overestimated President Barack Obama’s victory in the 2008 U.S. Presidential election. The author conducts a post-mortem of the analysis, extracting several important lessons.
Twitter is a microblogging service for publishing very short text messages (at most 140 characters each), or tweets, shared with the users who follow their author. Many Twitter users do not protect their tweets, which therefore appear in the so-called public timeline; such public tweets can be easily collected through Twitter’s own API.
Twitter’s original slogan, “What are you doing?”, encouraged users to share the minutiae of their daily activities with their friends. Twitter has since evolved into a complex information-dissemination platform, especially during situations of mass convergence. Under certain circumstances, Twitter users provide not only information about themselves but also real-time updates on current events.
Today, Twitter is a source of information on such events, updated by millions of users worldwide reacting to them as they unfold. It was only a matter of time before the research community turned to it as a rich source of social, commercial, marketing, and political information.
The goal of this article is to focus on one of the service’s most appealing applications: using its data to predict the outcome of current and future events.
Contributed article — DOI: 10.1145/2001269.2001298
Computing for the Masses
Zhiwei Xu and Guojie Li
The fields of computer science and engineering have witnessed amazing progress over the last 60 years. As we journey through the second decade of the 21st century, however, it becomes increasingly clear that our profession faces some serious challenges. We can no longer rely solely on incremental and inertial advances; fundamental opportunities and alternative paths must be examined.
In 2007, the Chinese Academy of Sciences (CAS) sponsored a two-year study on the challenges, requirements, and potential roadmaps for information technology advances through the year 2050. This article presents a perspective on a key finding of that study: a new paradigm, named computing for the masses, is needed to cope with the challenges facing IT in the coming decades.
The CAS study focused on China’s needs. However, the issues investigated are of interest to the worldwide computing community. For instance, when considering the drivers of the future computing profession, it is critical not to underestimate the requirements and demands of the new generations of digital natives. As of July 2010, 59% of China’s 420 million Internet users were between the ages of 6 and 29. The time frame of 2010–2050 is not too distant a future for them. These digital natives could drive a ten-fold expansion of IT use.
Computing for the masses is much more than offering a cheap personal computer and Internet connection to the low-income population. It means providing essential computing value for all people, tailored to their individual needs. It demands paradigm-shifting research and discipline rejuvenation in computer science, to create augmented Value (V), Affordability (A), and Sustainability (S) through Ternary computing (T). In other words, computing for the masses is VAST computing.