Communications of the ACM

BLOG@CACM

Why is Great Design so Hard (Part Two)?


Carnegie Mellon Associate Professor Jason Hong

This blog entry is a follow-up to a previous one from a few weeks back looking at the challenges of integrating great interaction and user experience design into organizations. Basically, my question was: everyone wants well-designed products, yet few organizations seem able to deliver them. What are the barriers? Why isn't good design more pervasive? If we want good design to spread to more organizations, we need a better understanding of what is and isn't working.

 
My last blog entry examined the challenges software companies face in incorporating design into how they make products. This blog entry takes a different tack, and instead looks at what successful organizations are doing right.
 
More specifically, I've been having many discussions with different individuals over the past few weeks about how Apple does design. Many people perceive Apple as the paragon of design, with clean, sleek interfaces that aren't just easy to use, but also are fun, attractive, and aesthetically appealing at a visceral level. So what is Apple doing right?
 
One point repeated by many people was that the culture of design is pervasive throughout Apple. Everyone knows it's important and that it helps sell products. 
 
What surprised me were the examples of how this culture translates into practical everyday work at Apple. Several people mentioned that strong top-down design of the overall user experience is done up front, rather than the typical bottom-up user interface tinkering common in most organizations (which bring in designers only after the system has already been built).
 
One story in particular helped me understand just how much prominence is given to design. One person who worked on hardware design recounted being handed a prototype of a physical form factor by an industrial designer. His team looked at the shape and size and said that what the designer was asking for was impossible. The industrial designer pushed back and said "prove it." The team iterated on various hardware layouts, got to about 90% of what the industrial designer wanted, and told him that if he made a few changes to the form factor, they could make everything fit.
 
Another surprise was how different Apple's design methods were from "standard" best practices in human-computer interaction (HCI). For example, a typical method we teach in HCI is to start with ethnographic field studies to gain deep insights into what people do, how they do it, and why they do it. Another best practice is to do iterative user testing with benchmarks, to ensure that people find products useful and usable. 
 
From what I can tell, Apple doesn't use any of these methods. 
 
Instead, people described three different methods used at Apple. The first is that Apple preferred having subject matter experts who have many years of experience in the field be part of their teams. For example, for Aperture, Apple's photo management software, it might be an expert photographer who deals with tens of thousands of photographs on a regular basis. For iMovie, it might be a team of people who edit movie clips for a living. In one sense, this approach might be thought of as an adaptation of participatory design, where the people who will eventually use the software help design the software. Historically, however, participatory design has been used for custom software for a specific organization, so Apple's use of experts for mass market software is a new twist.
 
The second is that people at Apple think really long and hard about problems. From that perspective, certain solutions will pop out as being obviously better ways of doing things. Thus, part of Apple's strategy is to guide people toward that way of thinking as well. If you see problems the same way that Apple does, then the structure and organization of an interface will make sense.
 
The third is that Apple tends to design by principle rather than from data. In classes in human-computer interaction, we emphasize making design decisions based on evidence as much as possible, for example, from past user studies on previous iterations of the interface, or from the ethnographic field studies. In contrast, at Apple, more weight is given to design decisions made from first principles.
 
So what does this all mean? 
 
I have two closing thoughts here. First, should we just throw away existing HCI methods for design? Given the sharp contrast between traditional methods in HCI and the methods used at Apple, and given the clear success of Apple's products, do HCI methods actually matter? 
 
One of my colleagues has a good counter-argument here, which is that Apple's products aren't always the first in an area. The iPod wasn't the first MP3 player, iTunes wasn't the first online music store, and the iPhone wasn't the first smartphone. As such, Apple can learn from the mistakes of others, apply the skills of subject-matter experts, and hone existing designs in a proven market. However, for new kinds of hardware or applications for which there is little precedent, this style of "thinking really long and hard" will be less effective at pinpointing user needs and developing products for new markets.
 
Second, how much prominence should be given to design within organizations? What is the right mix of design, engineering, and business needs? For example, the so-called "death grip" for iPhones, where holding the phone the wrong way would lead to a drop in signal strength, is clearly an engineering problem rather than an interaction design problem. A better mix of design and engineering might have caught the problem long before production and shipping.
 
Furthermore, it's actually not clear if Apple's approach to design is optimal or even replicable. Apple's approach relies heavily on people at the top of the organization consistently coming up with great ideas and great designs. However, we've also seen a tremendous amount of innovation with reasonably good interfaces coming out of Google, Facebook, Amazon, and other high-tech companies. Perhaps there are ways of helping organizations be more user-centric without having to radically re-structure their company, a topic I'll explore in my next blog post.
 