
Communications of the ACM


Results 121-130 of 286 for "bentley"


Understanding experience using dialogical methods: the case of serendipity

McCarthy and Wright's (2004) approach to understanding user experience provides a rich conceptual framework. In this paper, we report how this framework was used to guide the development of an approach to researching the richness of a particular experience: serendipity. Three themes were identified: life as lived and felt, the whole person, and dialogical sense-making. These were used to help understand the key qualities of the strategy, tools, and techniques required in the empirical study of the experience of serendipity. The paper explains this process and illustrates the depth of understanding that our choice of tools afforded. After describing the case study, we offer some guidance on how to choose appropriate tools and methods for researching other types of experience.

2010-11-22
https://dl.acm.org/ft_gateway.cfm?id=1952278&dwn=1

Evidence-based software production

"…[S]oftware remains NIT's [Networking and Information Technology] greatest weakness. Although reliable and robust software is central to activities throughout society, much software is brittle, full of bugs and flaws. Software development remains a labor-intensive process in which delays and cost overruns are common, and responding to installed software's errors, anomalies, vulnerabilities, and lack of interoperability is costly to organizations throughout the U.S. economy." "…[T]he science of software development must be a focus of Federal NIT R&D. As software's complexity continues to rise, today's design, development, and management problems will become intractable unless fundamental breakthroughs are made…"[2]

Current understanding of software development---largely based on anecdotes---is inadequate for this "science of software development." Achieving the deeper understanding needed to transform software production requires collecting and using evidence on a large scale. This paper proposes some steps toward that outcome.

2010-11-07
https://dl.acm.org/ft_gateway.cfm?id=1882403&dwn=1

From silhouettes to 3D points to mesh: towards free viewpoint video

This paper presents a system for 3D reconstruction from video sequences acquired in multi-camera environments. In particular, the 3D surfaces of foreground objects in the scene are extracted and represented by polygon meshes. Three stages are concatenated to process multi-view data. First, a foreground segmentation method extracts silhouettes of the objects of interest. Then, a 3D reconstruction strategy obtains a cloud of oriented points that lie on the surfaces of the objects of interest in a spatially bounded volume. Finally, a fast meshing algorithm provides a topologically correct interpolation of the surface points that can be used both for visualization and for further mesh processing. Both the quality of the results and the computational load of our system compare favorably against a baseline system built from state-of-the-art techniques: our system produces higher-quality results for similar processing times, and needs less processing time for similar quality.
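
As a rough illustration of how the three stages fit together, the sketch below implements toy versions of the first two (background-subtraction silhouettes and a visual-hull style point test). The helper names and the simple thresholding are assumptions for illustration, not the paper's actual methods; the meshing stage is only noted in a comment.

```python
import numpy as np

def extract_silhouette(frame, background, threshold=30):
    """Stage 1 (toy version): foreground mask via background subtraction."""
    diff = np.abs(frame.astype(int) - background.astype(int)).sum(axis=2)
    return diff > threshold  # boolean foreground mask

def carve_points(volume_points, silhouettes, projections):
    """Stage 2 (toy version): keep candidate 3D points whose projection
    falls inside every camera's silhouette (a visual-hull style test)."""
    kept = []
    for p in volume_points:
        ok = True
        for mask, project in zip(silhouettes, projections):
            u, v = project(p)  # camera projection: 3D point -> pixel coords
            h, w = mask.shape
            if not (0 <= v < h and 0 <= u < w and mask[v, u]):
                ok = False
                break
        if ok:
            kept.append(p)
    return np.array(kept)

# Stage 3 would interpolate a topologically correct mesh over the kept
# points; that step is substantially more involved and is omitted here.
```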

2010-10-29
https://dl.acm.org/ft_gateway.cfm?id=1877797&dwn=1

So you want to run a film festival?

In an effort to promote awareness of and increase enrollment in the Division of Arts and Humanities, the Harvard Short Film Festival, a University-wide competition of three-minute short films about scholarly research and teaching, was launched in spring 2010.

Recent advances in media software now make it possible for someone equipped with just a personal computer, a digital camera, and some short software tutorials to compose compelling multimedia presentations that earlier required extensive training and costly equipment. Not the least of the changes implicit in this new technology is the promise of radically altered horizons in intellectual life---the development of new, more aesthetically rich modes of exposition in which the message expressed by words is blended with the implicit meanings and powerful affect conveyed by images and music.

This paper outlines the marketing and technology considerations required to successfully support a film festival in an academic environment.

2010-10-24
https://dl.acm.org/ft_gateway.cfm?id=1878370&dwn=1

Serendipitous family stories: using findings from a study on family communication to share family history

Storytelling and sharing family histories are important parts of what it means to "be" a family. Based on results from a study of intergenerational communication over a distance, we created the Serendipitous Family Stories system. The service allows family members to create visual and audio stories about places of importance in their lives, and allows their relatives to discover those stories serendipitously as they go about their daily routines. We describe the motivation for the application and explain its functionality. Results from a field study are forthcoming.
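
A minimal sketch of the serendipitous-discovery idea, under the assumption that stories are attached to geographic coordinates and surfaced by a simple proximity check; the data layout and the 100 m radius are illustrative, not the system's actual design.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_stories(user_pos, stories, radius_m=100.0):
    """Return the stories whose attached place is within radius_m of the user.
    Each story is assumed to be a dict with 'lat' and 'lon' keys."""
    lat, lon = user_pos
    return [s for s in stories
            if haversine_m(lat, lon, s["lat"], s["lon"]) <= radius_m]
```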

2010-09-26
https://dl.acm.org/ft_gateway.cfm?id=1864435&dwn=1

Research in the large: using app stores, markets, and other wide distribution channels in UbiComp research

The mobile phones that people use in their daily lives now run advanced applications and come equipped with sensors once available only in custom UbiComp research hardware. At the same time, application distribution has become increasingly simple due to the proliferation of app stores and similar channels. Evaluation and research methods have to be adapted to this new context to get the best data and feedback from wide audiences. However, an overview of successful strategies for overcoming the research challenges inherent in wide deployment is not yet available: researchers must deal with app store platform characteristics, device diversity, reaching target users, new types of evaluation data, and dynamic, heterogeneous usage contexts. This workshop provides a forum for researchers and developers to exchange experiences and strategies for wide distribution of applications. We aim to build an understanding of the opportunities of various distribution channels and the obstacles involved in a research context.

2010-09-26
https://dl.acm.org/ft_gateway.cfm?id=1864501&dwn=1

SplittingHeirs: inferring haplotypes by optimizing resultant dense graphs

Phasing genotype data to identify the composite haplotype pairs is a widely studied problem due to its value for understanding genetic contributions to diseases, for population genetics research, and for other significant endeavors. The accuracy of the phasing is crucial, as identification of haplotypes is frequently the first step of expensive and vitally important studies. We present a combinatorial approach to this problem, which we call SplittingHeirs. This approach is biologically motivated, as it is based on three widely accepted principles: there tend to be relatively few unique haplotypes within a population, there tend to be clusters of haplotypes that are similar to each other, and some haplotypes are relatively common. We have tested SplittingHeirs, along with several popular existing phasing methods including PHASE, HAP, EM, and Pure Parsimony, on seven sets of haplotype data for which the true phase is known. Our method yields the highest accuracy obtainable by these methods in all cases. Furthermore, SplittingHeirs is robust and achieved higher accuracy than any of the other approaches on the two datasets with high recombination rates. The success of SplittingHeirs validates the assumptions made by the dense graph model and highlights the benefits of finding globally optimal solutions.
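
To make the parsimony assumptions concrete, here is a small greedy sketch in the spirit of Clark's classic method, not the paper's dense-graph algorithm: it enumerates the haplotype pairs compatible with each genotype and prefers haplotypes already seen. The 0/1/2 genotype encoding is an assumption for illustration.

```python
from itertools import product
from collections import Counter

def compatible_pairs(genotype):
    """Yield (h1, h2) haplotype pairs explaining a genotype, where each site
    is 0 (homozygous ref), 1 (homozygous alt), or 2 (heterozygous).
    Enumeration is exponential in the number of heterozygous sites, which is
    fine for a sketch but not for real data."""
    het_sites = [i for i, g in enumerate(genotype) if g == 2]
    base = [g if g != 2 else None for g in genotype]
    for bits in product((0, 1), repeat=len(het_sites)):
        h1, h2 = list(base), list(base)
        for site, b in zip(het_sites, bits):
            h1[site], h2[site] = b, 1 - b
        yield tuple(h1), tuple(h2)

def greedy_phase(genotypes):
    """Phase each genotype, preferring haplotypes already in the pool
    (few unique haplotypes, some of them common)."""
    pool = Counter()
    phased = []
    for g in genotypes:
        # Score each candidate pair by how often its haplotypes were reused.
        best = max(compatible_pairs(g), key=lambda p: pool[p[0]] + pool[p[1]])
        pool[best[0]] += 1
        pool[best[1]] += 1
        phased.append(best)
    return phased
```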

2010-08-02
https://dl.acm.org/ft_gateway.cfm?id=1854798&dwn=1

ReFHap: a reliable and fast algorithm for single individual haplotyping

Full human genomic sequences have been published over the last two years for a growing number of individuals. Most of them are a mixed consensus of the two real haplotypes, because it is still very expensive to separate the information coming from the two copies of a chromosome. However, recent improvements and new experimental approaches promise to solve these issues and provide enough information to reconstruct the sequences of the two copies of each chromosome through bioinformatics methods such as single individual haplotyping. Full haploid sequences provide a complete understanding of the structure of the human genome, allowing accurate predictions of translation in protein coding regions and increasing the power of association studies.

In this paper we present a novel problem formulation for single individual haplotyping. We start by assigning a score to each pair of fragments based on their common allele calls, and then we use these scores to formulate the problem as finding the cut of fragments that maximizes an objective function, similar to the well-known max-cut problem. Our algorithm first finds the best cut using a heuristic algorithm for max-cut and then builds haplotypes consistent with that cut. We have compared both the accuracy and the running time of ReFHap with other heuristic methods on both simulated and real data, and found that ReFHap performs significantly faster than previous methods without loss of accuracy.
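
The following sketch illustrates the formulation described above under assumed data structures (each fragment as a dict mapping site index to an allele, 0 or 1); the scoring convention and the simple local-search cut heuristic are plausible stand-ins, not ReFHap's exact procedure.

```python
import random

def score(f, g):
    """Disagreements minus agreements over shared sites: a positive score
    means the two fragments likely come from different haplotypes."""
    shared = f.keys() & g.keys()
    return sum(1 if f[s] != g[s] else -1 for s in shared)

def greedy_max_cut(fragments, iters=1000, seed=0):
    """Local search: flip the side of a single fragment whenever doing so
    increases the total weight of edges crossing the cut."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in fragments]
    def gain(i):
        # Flipping i turns crossing edges into non-crossing ones and back.
        return sum(score(fragments[i], fragments[j])
                   * (1 if side[i] == side[j] else -1)
                   for j in range(len(fragments)) if j != i)
    for _ in range(iters):
        i = rng.randrange(len(fragments))
        if gain(i) > 0:
            side[i] = 1 - side[i]
    return side

def build_haplotype(fragments, side, which):
    """Majority vote per site among the fragments assigned to one side."""
    votes = {}
    for f, s in zip(fragments, side):
        if s != which:
            continue
        for site, allele in f.items():
            votes.setdefault(site, []).append(allele)
    return {site: max(set(v), key=v.count) for site, v in votes.items()}
```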

2010-08-02
https://dl.acm.org/ft_gateway.cfm?id=1854802&dwn=1

Parallel processing of data from very large-scale wireless sensor networks

In this paper we explore the problems of storing and reasoning about data collected from very large-scale wireless sensor networks (WSNs). Potential worldwide deployment of WSNs for, e.g., environmental monitoring could yield petabytes of data each year. Distributed database solutions such as BigTable and Hadoop are capable of storing such amounts of data. However, it is far from clear whether the associated MapReduce programming model is suitable for processing sensor data: the applications MapReduce is currently used for are relational in nature, whereas for sensor data one is usually interested in its spatial structure. We show that MapReduce can indeed be used to develop such applications, and we describe in detail a general architecture for a service platform that stores and processes data obtained from massive WSNs.
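
As a toy example of the point about spatial structure, the sketch below keys sensor readings by a latitude/longitude grid cell and averages per cell. The in-memory runner stands in for the MapReduce runtime, and the names and grid resolution are assumptions; a real deployment would run the equivalent job on Hadoop.

```python
from collections import defaultdict

CELL_DEG = 0.1  # grid resolution in degrees (assumed)

def map_reading(reading):
    """Map: key each (lat, lon, value) reading by its spatial grid cell."""
    lat, lon, value = reading
    cell = (int(lat / CELL_DEG), int(lon / CELL_DEG))
    yield cell, value

def reduce_cell(cell, values):
    """Reduce: aggregate all readings that fall in one cell."""
    return cell, sum(values) / len(values)

def run_job(readings):
    """Minimal in-memory shuffle standing in for the MapReduce runtime."""
    groups = defaultdict(list)
    for r in readings:
        for key, value in map_reading(r):
            groups[key].append(value)
    return dict(reduce_cell(c, vs) for c, vs in groups.items())

# Example: average sensor value per 0.1-degree cell.
print(run_job([(60.17, 24.94, 3.2), (60.19, 24.95, 2.8), (51.5, -0.1, 9.0)]))
```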

2010-06-21
https://dl.acm.org/ft_gateway.cfm?id=1851590&dwn=1

Bridging pre-silicon verification and post-silicon validation

Post-silicon validation is a necessary step in a design's verification process. Pre-silicon techniques such as simulation and emulation are limited in scope and volume as compared to what can be achieved on the silicon itself. Some parts of the verification, such as full-system functional verification, cannot be practically covered with current pre-silicon technologies. This panel brings together experts from industry, academia, and EDA to review the differences and similarities between pre- and post-silicon, discuss how the fundamental aspects of verification are affected by these differences, and explore how the gaps between the two worlds can be bridged.

2010-06-13
https://dl.acm.org/ft_gateway.cfm?id=1837300&dwn=1