Artificial Intelligence and Machine Learning

Three Misconceptions About Human-Computer Interaction

IBM Almaden Researcher Tessa Lau

I come to the field of HCI via a background in AI, having learned the hard way that good interaction design trumps smart algorithms in the quest to deploy software that has an impact on millions of users.  Currently a researcher at IBM’s Almaden Research Center, I lead a team that is exploring new ways of capturing and sharing knowledge about how people interact with the web.  We conduct HCI research in designing and developing new interaction paradigms for end-user programming.

One thing I’ve noticed about my new field is that there are a lot of misconceptions about what exactly HCI means, something I never experienced as an AI researcher.  People generally understand AI. Computers that exhibit human intelligence?  Maybe not in my lifetime, but the ideals and goals of the field have made their way into popular culture.  You’ve seen Steven Spielberg’s movie about a little boy robot, Isaac Asimov’s three laws of robotics, and the infamous HAL 9000.

If you ask pop culture what the ideals and goals are for HCI, what do you get?  It’s not so clear.  Even amongst computing professionals, there is still a lot of uncertainty about what exactly HCI is.  I think the first step towards correcting these misperceptions is to raise awareness of how HCI is perceived by everyone else.

Myth #1: HCI equals UI design

Working for a large company such as IBM, I spend part of my time interacting with folks across the company and evangelizing the need for increased product usability.  Too often, people assume that HCI is only good for positioning widgets within a window, or drawing pretty icons to replace "programmer art".

At a resume workshop last year, a senior systems researcher saw HCI on my CV and started talking about UI design on the iPhone as an example of successful HCI research.

While these surface aspects should not be ignored, HCI research covers a very broad range of topics that go all the way to the core of what a system will be used for, how people perceive its value, and how one can measure the impact the system has on actual users.

Rather than being tacked on to a product as an afterthought, HCI can make a difference throughout the lifecycle of a product.

Myth #2: HCI is a library you can include in your product

Another misconception I’ve heard while talking to folks in IBM’s product divisions is that HCI is a set of library routines that you can include to make your product more usable/prettier/better.  I have had people come to me and expect, in a 30-minute phone call, to come away with some code or algorithms or colors that will instantly improve their product.

It’s not that easy.  What HCI teaches is the process of system design, not answers to specific questions.  I liken it to people who come by my office, admire my gorgeous handknitted throw, and want me to make them one.  I tell them if they want their own, I can teach them how to knit.

Myth #3: HCI research has a short time horizon

With all the focus on short-term improvements to existing UIs, and the design and study of practical systems that already exist or will soon exist, it’s easy to lose sight of the longer-term goals of HCI.  However, ever since my AI days I have tried to make time occasionally to consider the bigger picture and ask myself why we are doing what we do.  Having a long-term perspective on your work is important — it keeps you from slogging away at the same problem for years without noticing that the problem has become obsolete, or that technology has moved in a different direction.  It keeps you relevant to the larger goal of helping humanity, which is what brought me to computer science research in the first place.

My personal vision of HCI is that we are creating new ways for people to interact with information, through smart interaction design and interfaces that unleash human creativity.  Ten years from now, what will the web look like?  How will people interact with it?  What tools will people need to become full-fledged citizens of the new information economy?

More on this in posts to come.


Disclaimer: The views expressed here do not necessarily reflect the views
of my employer, ACM, or any other entity besides myself.

