
Communications of the ACM


Keeping science on keel when software moves

An approach to reproducibility problems related to porting software across machines and compilers.


Validity frame concept as effort-cutting technique within the verification and validation of complex cyber-physical systems

The increasing performance demands and certification needs of complex cyber-physical systems (CPS) raise the complexity of the engineering process, not only within the development phase, but also in the Verification and Validation (V&V) phase. A proven technique to handle the complexity of CPSs is Model-Based Design (MBD). Nevertheless, the verification and validation of complex CPSs is still an exhaustive process, and the usability of the models to front-load V&V activities heavily depends on the knowledge of the models and the correctness of the conducted virtual experiments. In this paper, we explore how the effort (and cost) of the V&V phase of the engineering process of complex CPSs can be reduced by enhancing the knowledge about the system components and explicitly capturing it within their corresponding validity frames. This effort reduction originates from exploiting the captured system knowledge to generate efficient V&V processes and from automating activities at different model life stages, such as the setup and execution of boundary-value or fault-injection tests. This will be discussed in the context of a complex CPS: a safety-critical adaptive cruise control system.
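
The boundary-value automation mentioned above can be illustrated with a small sketch. The validity-frame representation and the parameter names below are invented for illustration; they are not taken from the paper, which does not specify a concrete data structure.

```python
# Hypothetical sketch: a validity frame records the parameter ranges over
# which a component model is known to be valid; boundary-value tests can
# then be generated automatically at the edges of those ranges.
from itertools import product

def boundary_value_tests(validity_frame):
    """Generate test points at the min, midpoint, and max of each range."""
    names = list(validity_frame)
    levels = [(lo, (lo + hi) / 2, hi) for lo, hi in validity_frame.values()]
    return [dict(zip(names, combo)) for combo in product(*levels)]

# Example: an invented validity frame for an adaptive-cruise-control sensor.
frame = {"speed_mps": (0.0, 40.0), "range_m": (1.0, 150.0)}
tests = boundary_value_tests(frame)  # 3 levels per parameter -> 9 tests
```

Each generated dictionary is one virtual experiment configuration; a fault-injection variant would perturb values just outside the frame instead.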


Towards adaptive abstraction for continuous time models with dynamic structure

Humans often switch between multiple levels of abstraction when reasoning about salient properties of complex systems. These changes in perspective may be leveraged at runtime to improve both performance and explainability, while still producing identical answers to questions about the properties of interest. This technique, which switches between multiple abstractions based on changing conditions in the modelled system, is also known as adaptive abstraction.

The Modelica language represents systems as acausal continuous equations, which makes it appropriate for the modelling of physical systems. However, adaptive abstraction requires dynamic-structure modelling, which raises many technical challenges in Modelica, since the language has poor support for modifying connections during simulation. Its equation-based nature means that all equations need to be well-formed at all times, which may not hold when switching between levels of abstraction. The initialization of models upon switching must also be carefully managed, as information is lost, or must be created, when switching abstractions [1].

One way to allow adaptive abstraction is to represent the system as a multi-mode hybrid Modelica model, a mode being an abstraction that can be switched to based on relevant criteria. Another way is to employ a co-simulation [2] approach, where modes are exported as "black boxes" and orchestrated by a central algorithm that implements adaptivity techniques to dynamically replace components when a switching condition occurs.

This talk will discuss the benefits of adaptive abstraction using Modelica, and the conceptual and technical challenges towards its implementation. As a stand-in for a complex cyber-physical system, an electrical transmission line case study is proposed in which attenuation is studied across two abstractions of varying fidelity, chosen depending on the signal. Our initial results, as well as our explorations towards employing Modelica models in a co-simulation context using the DEVS formalism [4], are discussed. A Modelica-only solution makes it possible to tackle complexity via decomposition, but does not improve performance, as all modes are represented as a single set of equations. The co-simulation approach might offer better performance [3], but complicates the workflow.
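
The orchestration idea can be caricatured outside Modelica and DEVS. The two "modes" and the switching rule below are invented toy stand-ins, not the abstractions from the transmission-line case study; the point is only the shape of the loop: step the current mode, test a switching condition, and carry the state into the replacement mode.

```python
# Conceptual sketch of an adaptive-abstraction orchestrator: two invented
# attenuation models of different fidelity, swapped at runtime based on
# the current signal level, with state carried across the switch.

def coarse(v, dt):
    """Low-fidelity mode: one attenuation step per interval."""
    return v * (1 - 0.1 * dt)

def fine(v, dt):
    """Higher-fidelity stand-in: two half-steps per interval."""
    return v * (1 - 0.1 * dt / 2) ** 2

def simulate(v0, t_end, dt, threshold=1.0):
    v, t = v0, 0.0
    while t < t_end:
        mode = fine if v < threshold else coarse  # switching condition
        v = mode(v, dt)                           # state carries over
        t += dt
    return v
```

In a real co-simulation setting, `coarse` and `fine` would be exported models ("black boxes"), and the orchestrator would also have to re-initialize the incoming mode, which is where the information-loss challenge above arises.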


Constraint handling in genotype to phenotype mapping and genetic operators for project staffing

Project staffing in many organisations involves the assignment of people to multiple projects while satisfying multiple constraints. The use of a genetic algorithm with constraint handling performed during a genotype to phenotype mapping process provides a new approach. Experiments show promise for this technique.
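
The mapping idea can be sketched as follows. The preference-list genotype and the capacity constraint below are invented stand-ins for the paper's actual encoding; they show only the general principle that constraint handling during decoding guarantees every phenotype is feasible by construction, rather than penalizing or repairing infeasible offspring afterwards.

```python
# Illustrative sketch of constraint handling in a genotype-to-phenotype
# mapping: the genotype ranks candidate people per task, and the decoder
# assigns each task to the first-ranked person who still has capacity,
# so the decoded staffing plan never violates the capacity constraint.

def decode(genotype, capacity):
    """genotype: per task, a ranked list of candidate people.
    capacity: maximum number of tasks per person.
    Returns one assigned person per task (None if all are at capacity)."""
    load = {person: 0 for person in capacity}
    assignment = []
    for prefs in genotype:
        person = next((p for p in prefs if load[p] < capacity[p]), None)
        assignment.append(person)
        if person is not None:
            load[person] += 1
    return assignment
```

Genetic operators (crossover, mutation) then act on the rankings, where any genotype remains decodable into a feasible plan.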


Rethinking Consumer Email: The Research Process for Yahoo Mail 6

This case study follows the research process of rethinking the design and functionality of a personal email client, Yahoo Mail. Over three years, we changed the focus of the product from composing emails towards automatically organizing specific categories of business to consumer email (such as deals, receipts, and travel) and creating experiences unique to each category. To achieve this, we employed iterative user research with over 1,500 in-person interviews in six countries and surveys to many thousands of people around the world. This research process culminated in the launch of Yahoo Mail 6.0 for iOS and Android devices in the fall of 2019.


Exploring the Quality, Efficiency, and Representative Nature of Responses Across Multiple Survey Panels

A common practice in HCI research is to conduct a survey to understand the generalizability of findings from smaller-scale qualitative research. These surveys are typically deployed to convenience samples, on low-cost platforms such as Amazon's Mechanical Turk or Survey Monkey, or to more expensive market research panels offered by a variety of premium firms. Costs can vary widely, from hundreds of dollars to tens of thousands of dollars depending on the platform used. We set out to understand the accuracy of ten different survey platforms/panels compared to ground truth data for a total of 6,007 respondents on 80 different aspects of demographic and behavioral questions. We found several panels that performed significantly better than others on certain topics, while different panels provided longer and more relevant open-ended responses. Based on this data, we highlight the benefits and pitfalls of using a variety of survey distribution options in terms of the quality, efficiency, and representative nature of the respondents and the types of responses that can be obtained.


Ownership, Privacy, and Control in the Wake of Cambridge Analytica: The Relationship between Attitudes and Awareness

Has widespread news of abuse changed the public's perceptions of how user-contributed content from social networking sites like Facebook and LinkedIn can be used? We collected two datasets that reflect participants' attitudes about content ownership, privacy, and control: one from April 2018, while Cambridge Analytica was still in the news, and another from February 2019, after the event had faded from the headlines. We aggregated the data according to participants' awareness of the story, contrasting the attitudes of those who reported the greatest awareness with those who reported the least. Participants with the greatest awareness of the news story's details have more polarized attitudes about reuse, especially the reuse of content as data. They express a heightened desire for data mobility, greater concern about networked privacy rights, increased skepticism of algorithmically targeted advertising and news, and more willingness for social media platforms to demand corrections of inaccurate or deceptive content.


Promoting Collaborative Skills with GitHub Project Boards

Teamwork skills are much in demand in the workplace, even more so with the growth of Agile methods. This calls for giving Computer Science students more practice in the kinds of team scenarios they will encounter on the job. Key to success is hands-on experience with planning methods, prioritization techniques, time management, and organization. This poster shows how the cooperative tracking tool GitHub Project Boards helps teams strategize development, track progress, distribute work evenly, and facilitate collaboration. It also shows how instructors can use GitHub Project Boards to visualize and evaluate a team's development process.


Computing the Geometric Intersection Number of Curves

The geometric intersection number of a curve on a surface is the minimal number of self-intersections of any homotopic curve, i.e., of any curve obtained by continuous deformation. Given a curve c represented by a closed walk of length at most ℓ on a combinatorial surface of complexity n, we describe simple algorithms to (1) compute the geometric intersection number of c in O(n + ℓ²) time, (2) construct a curve homotopic to c that realizes this geometric intersection number in O(n + ℓ⁴) time, and (3) decide if the geometric intersection number of c is zero, i.e., if c is homotopic to a simple curve, in O(n + ℓ log ℓ) time. The algorithms for (2) and (3) are restricted to orientable surfaces, but the algorithm for (1) is also valid on non-orientable surfaces.

To our knowledge, no exact complexity analysis had yet appeared on those problems. An optimistic analysis of the complexity of the published algorithms for problems (1) and (3) gives at best an O(n + g²ℓ²) time complexity on a genus g surface without boundary. No polynomial-time algorithm was known for problem (2) for surfaces without boundary. Interestingly, our solution to problem (3) provides a quasi-linear algorithm for a problem raised by Poincaré more than a century ago. Finally, we note that our algorithm for problem (1) extends to computing the geometric intersection number of two curves of length at most ℓ in O(n + ℓ²) time.
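
As a much simpler planar analogue (not the surface algorithms of the paper), the O(ℓ²) flavour of pairwise edge tests can be seen in a brute-force self-crossing count for one fixed closed polygonal curve. Note this counts the crossings of the given curve only; the geometric intersection number is the minimum of such counts over all homotopic deformations, which is the hard part the paper addresses.

```python
# Brute-force self-crossing count of a fixed closed polygonal curve in the
# plane: test all O(l^2) pairs of non-adjacent edges for a proper crossing.

def _ccw(a, b, c):
    """Signed area test: positive if a, b, c turn counterclockwise."""
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

def _segments_cross(p, q, r, s):
    """Proper (transversal) crossing test for segments pq and rs."""
    d1, d2 = _ccw(r, s, p), _ccw(r, s, q)
    d3, d4 = _ccw(p, q, r), _ccw(p, q, s)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def self_crossings(poly):
    """poly: list of 2D vertices of a closed polygonal curve."""
    n = len(poly)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if j == i + 1 or (i == 0 and j == n - 1):
                continue  # adjacent edges share a vertex, not a crossing
            if _segments_cross(poly[i], poly[(i + 1) % n],
                               poly[j], poly[(j + 1) % n]):
                count += 1
    return count
```

For example, the "bowtie" polygon (0,0), (2,2), (2,0), (0,2) has exactly one self-crossing, while a convex square has none.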


Maximum Physically Consistent Trajectories

Trajectories are usually collected with physical sensors, which are prone to errors and cause outliers in the data. We aim to identify such outliers via the physical properties of the tracked entity, that is, we consider its physical possibility to visit combinations of measurements. We describe optimal algorithms to compute maximum subsequences of measurements that are consistent with (simplified) physics models. Our results are output-sensitive with respect to the number k of outliers in a trajectory of n measurements. Specifically, we describe an O(n log n log² k) time algorithm for 2D trajectories using a model with unbounded acceleration but bounded velocity, and an O(nk) time algorithm for any model where consistency is "concatenable": a consistent subsequence that ends where another begins together form a consistent sequence. We also consider acceleration-bounded models which are not concatenable. We show how to compute the maximum subsequence for such models in O(nk² log k) time, under appropriate realism conditions. Finally, we experimentally explore the performance of our algorithms on several large real-world sets of trajectories. Our experiments show that we are generally able to retain larger fractions of noisy trajectories than previous work and simpler greedy approaches. We also observe that the speed-bounded model may in practice approximate the acceleration-bounded model quite well, though we observed some variation between datasets.
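
The "concatenable" property for the speed-bounded model can be illustrated with a naive O(n²) dynamic program; the paper's output-sensitive O(nk) algorithm is more refined, so this is only a sketch of the underlying idea: if measurement j can physically follow measurement i (the implied speed between them is within the bound), then any consistent subsequence ending at i extends to j.

```python
# Naive dynamic program for the speed-bounded consistency model:
# best[j] = length of the longest consistent subsequence ending at j.
# Consistency here is pairwise (and hence concatenable): the entity must
# be able to travel between consecutive kept measurements at speed <= vmax.

def max_consistent(points, vmax):
    """points: list of (t, x, y) measurements sorted by time t.
    Returns the size of the largest speed-consistent subsequence."""
    n = len(points)
    best = [1] * n
    for j in range(n):
        tj, xj, yj = points[j]
        for i in range(j):
            ti, xi, yi = points[i]
            dist = ((xj - xi) ** 2 + (yj - yi) ** 2) ** 0.5
            if tj > ti and dist <= vmax * (tj - ti):
                best[j] = max(best[j], best[i] + 1)
    return max(best, default=0)
```

With three inliers and one far-away outlier, the DP keeps the three mutually reachable measurements and drops the outlier, mirroring the outlier-removal goal above.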