
Computing What Fits

New apps and pods improve the virtual and in-store retail experiences by showing how clothing and cosmetics will look on you before you buy them.

For many men, shopping with their wives is akin to hearing nails dragged across a chalkboard. For Romney Evans, however, one such outing back in 2004 proved fateful. His wife was having “a horrible time finding jeans,” he recalls. “She must have tried on a dozen items … and I think she was feeling bad about herself. I thought, ‘This is crazy; is it really this difficult?'”

Evans thought there had to be a way to enter some information online and find the right style, so he started researching how other people were attempting to deal with the problem. While in graduate school at Babson College, Evans says his goal became launching a consumer site devoted to helping people find the right fit in jeans. He found a partner in classmate Jessica Arrendondo Murphy, who had a background in apparel and buying, and in 2007 they launched what is now called True Fit. The company’s “fit confidence engine” analyzes reams of data to help consumers understand how well clothes and shoes they see online will actually fit them.

Today, the technology is used by department stores like Macy’s and Nordstrom; consumers visiting those companies’ websites can create a True Fit profile and receive recommendations of designers whose clothing will work on their body type. “We’re converting browsers into buyers,” says Evans.

There are numerous apps, walk-in pods, and virtual “fitting rooms” that help consumers determine which clothing will actually fit them and let them try on cosmetics virtually, and it is not difficult to figure out why. eMarketer estimates apparel and accessories account for 17% of overall retail ecommerce sales, a category projected to reach almost $60 billion this year and to grow to $86 billion by 2018. Forrester projects that by 2017, 60% of U.S. retail sales will be influenced by research on a laptop or mobile device.




Technologies that make such apps and pods possible include augmented reality, in the case of makeup apps, according to Charlene Li, founder of research and advisory firm Altimeter Group. A video camera captures an image of your face, and the system superimposes images of makeup products over it, as if you were looking in a mirror, she says. Also frequently used is Microsoft's Kinect, a 3D scanning technology that acts like a camera and senses motion, allowing users to interact with it through gestures and spoken commands; Kinect can model objects in 3D based on depth information.
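
Li's description amounts to per-frame alpha compositing: locate the product region in the camera image and blend a translucent color over it. Below is a minimal sketch of that blending step in Python; OpenCV/NumPy and the hard-coded lip polygon are illustrative assumptions, and none of these names come from the products in the article (a real app would get the region from a face-landmark tracker).

```python
# Minimal sketch of the augmented-reality overlay Li describes: blend a
# semi-transparent product color over a region of the camera frame.
# OpenCV/NumPy and the hard-coded lip polygon are illustrative assumptions.
import cv2
import numpy as np

def overlay_makeup(frame, region_pts, bgr_color, alpha=0.4):
    """Alpha-blend a solid color over the polygon region_pts of frame."""
    overlay = frame.copy()
    cv2.fillPoly(overlay, [np.asarray(region_pts, dtype=np.int32)], bgr_color)
    # Blend the tinted overlay back into the original frame.
    return cv2.addWeighted(overlay, alpha, frame, 1 - alpha, 0)

# Example: tint a made-up lip region on a synthetic 480x640 "camera" frame.
frame = np.full((480, 640, 3), 200, dtype=np.uint8)
lips = [(280, 320), (360, 320), (350, 350), (290, 350)]
out = overlay_makeup(frame, lips, bgr_color=(60, 40, 200))  # a red, in BGR
```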

Yet the benefits of makeup apps are being diluted by the experience, maintains James McQuivey, a vice president and principal analyst at Forrester Research. While technology makes the apps appear elegant, companies are dropping the ball in “how they invite you to try the app, how they guide you in making decisions inside the experience, and how they then connect you with tutorials on how to apply that makeup, or even something as simple as help you buy the products you need to complete the look itself.”

Consequently, McQuivey says, consumers are left with an app “that is interesting to try once or twice, but is unlikely to really change the way people shop for and experience makeup.”


Reducing Returns

The goal of these apps is both to help people feel confident about the choices they make and to reduce the cost of returns for retailers. Clothing is notorious for being one of the most difficult ecommerce categories; one in four clothing items bought online is returned, mainly because the size the consumer selected was wrong, according to Fits.me founder Heikki Haldre.

London-based virtual fitting room company Fits.me addresses online fit with shape-shifting robotic mannequins “that can contort to thousands of body shapes and sizes,” says Haldre. “The mannequins are used to provide consumers photographic images [of] how a garment looks on a model of their own shape and size.”

Similar to True Fit, consumers can go to a participating retailer’s site, plug in certain body measurements, and Fits.me will show photographs of how a garment fits an avatar matching their body type. At the website of London retailer Thomas Pink, for example, consumers are offered the option to “try it on” by clicking a button and entering a virtual fitting room. They enter various measurements and then are shown the “virtual me.”

“Before Fits.me’s solution, online retailers could only show images of garments on standard models who, for the majority of us, do not represent real consumers at all,” says Haldre.

Fits.me’s proprietary technology was built across several universities in Europe and has been in development for the past five years, Haldre says. The company began commercializing the system in 2013. “We started by collecting 3D body scans from over 20,000 people to build robotic mannequins that, as closely as possible, resemble the population.” The shape-shifting robots use about 70 actuators, and “by moving these actuators to different positions, we can control over [one] million ways to manipulate discrete 3D shape[s],” Haldre says.
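
The article does not say how a shopper's measurements are mapped onto one of those million-plus discrete shapes; one plausible step, sketched below under that assumption, is a nearest-neighbor lookup from the shopper's measurements into a table of precomputed actuator configurations. All data here is synthetic.

```python
# Hedged sketch: choose the mannequin configuration whose resulting body
# measurements are closest to the shopper's. The configuration table and
# the five measurements are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
# Pretend catalog: 1,000 known actuator configurations, each characterized
# by five body measurements in centimeters.
config_measurements = rng.uniform(60, 120, size=(1000, 5))

def closest_config(user_measurements):
    """Index of the stored shape nearest the user (Euclidean distance)."""
    d = np.linalg.norm(config_measurements - user_measurements, axis=1)
    return int(np.argmin(d))

idx = closest_config(np.array([98.0, 80.0, 100.0, 62.0, 90.0]))
```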


Solving the Color Problem

The enthusiasm of teens and college students for clothing, shopping, and how they look was the impetus behind White Mirror, a virtual fitting room that began as a senior project for Rice University electrical engineering student Lam Yuk Wong and her friend Cecilia Zhang.

The two students and their friends liked to shop online, finding it convenient “but also troublesome,” Zhang says, because sometimes sizes would be off, or they would wonder whether a certain color would look better on them than another. “It’s related to your skin color or hair—there are many things on your body that you have to be concerned about.”

Wong and Zhang proposed the idea of a more “authentic” system that would help online shoppers visualize how a potential garment would look on them before making the decision to purchase it. They felt systems that suggest sizes to customers when they enter body parameters were unreliable “because we tried them, and customers are still not able to see or visualize this garment and whether it fits their skin color or body. There’s no visualization and also, it doesn’t solve the color problem.”

In creating White Mirror, they used Kinect to capture a customer’s body type and create “a ‘virtual you’ in the system,” says Zhang.

They also wanted to make White Mirror operate very quickly, Zhang says. To get an accurate simulation of how a garment will look on someone, their system uses a technique called physical simulation, “which deals with physical particles between the garment and the body in order to see how the garment will form on the body.” However, physical simulation takes 40 seconds, longer than online shoppers will want to wait, Zhang says, so “in our system we combined the physical simulation with machine learning techniques, so our system is able to finish the simulation in 1.2 seconds. Our point is to make a real-time simulation.”
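
Zhang describes a standard surrogate-modeling pattern: run the expensive physics simulation offline, then answer interactive queries with a learned approximation. The sketch below illustrates the idea with a stand-in "simulator" and a plain least-squares fit; White Mirror's actual features and model are not public.

```python
# Hedged sketch of the speedup White Mirror describes: precompute slow
# physical simulations, fit a fast learned surrogate, and query the
# surrogate at interaction time. The "simulator" is an arbitrary stand-in.
import numpy as np

def slow_cloth_sim(params):
    """Stand-in for a 40-second drape simulation."""
    x, y = params
    return np.array([np.sin(x) + 0.1 * y, np.cos(y) - 0.1 * x])

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 2))          # body/garment parameters
Y = np.array([slow_cloth_sim(p) for p in X])   # offline simulation results

# Fit a linear surrogate Y ~ [X, 1] @ W to use at query time.
A = np.hstack([X, np.ones((len(X), 1))])
W, *_ = np.linalg.lstsq(A, Y, rcond=None)

def fast_drape(params):
    """Millisecond-scale approximation of the slow simulator."""
    return np.append(params, 1.0) @ W
```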




As electrical engineering majors, Zhang and Wong knew little about computer graphics, so they reached out to authors of papers on garment simulation for advice. “We were trying to figure out the algorithm to use to make this all work and make it faster, because we encountered quite a lot of difficulties when implementing this system.” Integrating machine learning techniques allows them to finish the physical simulation behind the scenes, says Zhang, so the system is able to display the customer’s 3D image wearing a selected garment as if in a real fitting room. White Mirror has not yet been tested on shoes.

The system currently requires a second person to hold the Kinect sensor to scan a person’s body, but Zhang and Wong are working on refining the system and improving the accuracy of its results. They plan to turn White Mirror into an app, and have received funding to do so from the university, as well as from IBM.

For consumers who hope to avoid having to try on items in a brick-and-mortar store, Bodymetrics has developed walk-in pods that utilize 16 Kinect devices to create a full 3D body scan, says Suran Goonatilake, the company’s CEO and co-founder.

“The pod uses automated ‘body-measurement software’ to generate up to 200 accurate body measurements,” Goonatilake says. The data is held securely on a Microsoft Azure cloud platform and is used to run “a bunch of body analytics.” That data also drives applications that range from showing the best garments for a body type at a retail store to showing “what you might look like if you were to lose 10 pounds at a fitness club,” he says.
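
Goonatilake does not detail the measurement algorithms; one representative computation, sketched below on synthetic data, is estimating a circumference from a horizontal slice of the merged point cloud. Everything here, from the slice height to the ellipse-shaped cloud, is an illustrative assumption.

```python
# Hedged sketch of one "body-measurement" step: approximate a waist
# circumference from a horizontal slice of a 3D scan. The point cloud is
# synthetic; Bodymetrics' actual pipeline is proprietary.
import numpy as np

def circumference_at(points, z, tol=0.01):
    """Perimeter of the scan slice within tol meters of height z."""
    ring = points[np.abs(points[:, 2] - z) < tol][:, :2]
    center = ring.mean(axis=0)
    # Order slice points by angle around the center, then sum edge lengths.
    angles = np.arctan2(ring[:, 1] - center[1], ring[:, 0] - center[0])
    ring = ring[np.argsort(angles)]
    edges = np.diff(ring, axis=0, append=ring[:1])  # includes closing edge
    return float(np.linalg.norm(edges, axis=1).sum())

# Synthetic torso slice: an ellipse (0.15 m x 0.12 m radii) at z = 1.0 m.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
cloud = np.column_stack([0.15 * np.cos(t), 0.12 * np.sin(t), np.full_like(t, 1.0)])
print(circumference_at(cloud, z=1.0))  # ~0.85 m for this toy ellipse
```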

Altimeter’s Li says using Kinect in a walk-in pod is more accurate than an online app. “It’s a fixed location,” she explains. “It knows exactly where you’re standing, so the system has a greater level of accuracy.”

However, consumers have to wear tight-fitting garments when entering the pod; “If you wear baggy clothing, the scanner will be getting measurements of the baggy clothes and not your body,” Goonatilake explains. He says Bodymetrics has scanned over 10,000 people to date.

Until recently, most Bodymetrics pods were installed on a trial basis, to gain understanding of how users react to the technology, Goonatilake says. The pods now are being rolled out at two “youth-oriented” fashion retail chains in the U.K., and a fitness chain in the U.S. with 50 clubs nationwide, on a subscription basis for $3,000 per month.


Virtual Makeup

It is not just clothing that users can try on virtually; there are also apps that can help you decide if a particular eye shadow or lipstick works with your coloring. The apps are designed to give consumers the opportunity to try out products they may not have previously considered without going to a retail store. L’Oreal, for example, last year rolled out the Makeup Genius app, which turns the front-facing camera of an iPad or iPhone into a makeup mirror so consumers can virtually try on a variety of cosmetics.

Shiseido has developed a makeup simulator on an iPad-based device called the “Beauty Tablet,” or “B-Tab,” which is used by all its beauty consultants in Japan, according to Sho Nagai, a spokesperson for Shiseido Co., Ltd. “The device is equipped with special sensors that recognize and identify different parts on the face,” Nagai explains. “The sensors can detect the shape of the eyes and mouth so the virtual makeup stays on the screen, even after blinking or moving.”
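
The effect Nagai describes, makeup that stays put through blinking and motion, comes down to re-locating facial features every frame and re-drawing the tint there. A minimal sketch of that per-frame loop follows, using OpenCV's stock Haar eye detector purely as a stand-in for Shiseido's proprietary sensors.

```python
# Hedged sketch: re-detect eyes each frame and blend an eye-shadow band
# above them, so the virtual makeup tracks the face. Haar cascades are a
# stand-in; the article does not disclose Shiseido's tracking method.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def tint_eyes(frame, bgr_color=(160, 60, 60), alpha=0.35):
    """Detect eyes in this frame and shade a band just above each."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    overlay = frame.copy()
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.1, 5):
        cv2.rectangle(overlay, (x, max(0, y - h // 3)), (x + w, y),
                      bgr_color, thickness=-1)
    return cv2.addWeighted(overlay, alpha, frame, 1 - alpha, 0)

# In a live app this runs once per captured frame, e.g. in a loop over
# cv2.VideoCapture(0).read(), which is what keeps the makeup "on" as the
# user blinks or moves.
```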


Improving Recommendations

As True Fit’s Evans notes, one of the most frustrating aspects of shopping is that a size 6 in one brand could be very different in another. “If everybody’s size is speaking a different language, we’re helping translate that.”

Powered by Hadoop, an open source framework that enables large-scale processing of datasets, True Fit gathers data from retailers, consumers, and over 1,000 brands to help the company finesse recommendations based on historical performance and inferences, Evans says.

True Fit assembles the gathered data and uploads the “blueprints of an item,” such as fabric details and style attributes (like whether a pair of jeans is high-cut or boot-cut), says Evans. “When a consumer creates a True Fit profile, they tell us about their body: their height, weight, and age, and heuristic indicators (are their hips curvy or straight?) to understand their perception of themselves.” He says the system does not ask for measurements, just that the user identify the kinds of items they like to wear. For example, a user may say they like a Michael Kors A-line sleeveless dress and that they wear a size 4; “Then we have a really good understanding, because we already have the information from Michael Kors, and so we match that to what [the] consumer is telling us.”

Evans likens True Fit to Pandora; in the same way Pandora gathers attributes of songs and looks at the patterns of those attributes to suggest other songs a user will like, True Fit does the same with items of clothing.

The True Fit system gathers sales feed information from a retailer about individual users, like the number of returns and what a user’s successful purchases have been, “and we triangulate that set of items against our database of product information from the brands” to add to a user’s profile. “The magic is, we have blueprints of what successfully fit a person, and use machine learning processes to make high-confidence recommendations, which improve over time as consumers buy and return things,” says Evans. “If we make a recommendation that doesn’t work out, we learn from that and the system fine-tunes that.”
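
Evans's description suggests a content-based recommender: each item is an attribute vector (the "blueprint"), and kept versus returned purchases tell the system which attributes fit. The toy sketch below scores a candidate item that way; the attribute names, weights, and scoring rule are all invented, since True Fit's model is proprietary.

```python
# Hedged sketch of attribute-based fit scoring in the spirit of the Pandora
# analogy: build a profile from kept purchases, penalize toward returns,
# and rank candidates by similarity. All numbers are toy data.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy item "blueprints": [stretch, waist_rise, leg_taper, fabric_weight]
kept      = np.array([[0.8, 0.6, 0.7, 0.3]])   # purchases the user kept
returned  = np.array([[0.1, 0.9, 0.2, 0.8]])   # purchases sent back
candidate = np.array([0.7, 0.5, 0.8, 0.4])

# Profile = average kept item, pushed away from the average returned item.
profile = kept.mean(axis=0) - 0.5 * returned.mean(axis=0)
score = cosine(profile, candidate)  # higher means a more confident match
```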

Evans will not reveal how many True Fit profiles have been created, but says the total is in the millions.


They Do Not Always Work

For some, such devices and apps may not achieve their desired effect.

“The biggest issue is most of us don’t like the way we look,” notes Altimeter’s Li. “It’s one thing when I try something on, [but] I look at it fairly differently on a model than on me.”

Forrester’s McQuivey predicts Amazon “may do to virtual or magic mirrors what Apple did to phones.” He envisions that the future of technologies that help us determine appropriate looks and fits will be in a person’s bathroom and closet—”not the retail environment, and certainly not the living room”—and will be “much bigger than just trying on virtual shoes or applying virtual cosmetics.” The bathroom of the future will have a mirror that doubles as a huge display, showing you any fashion or cosmetics look you have in mind, he says.

McQuivey also anticipates Amazon or someone else “will have figured out what to show you when you look in the mirror. Perhaps the service can see your calendar and knows that you have a job interview, and so will virtually dress you and style your hair in a variety of looks…which you can then modify interactively.”

The service will also care about more than just your look, he adds, and may be able to add to your mirror image “data about your health, your condition, your muscle tone and body fat, all of which it will sense using cameras and other devices in your bathroom.” If you are a few pounds heavier after a holiday, for example, the mirror will suggest outfits it knows you can fit into and that are appropriately slimming, McQuivey says.




Technically, all of that is possible today, he says, but businesses have to figure out how to get there. “It will be a mad race to that point because there is so much to be won—or lost—in this race, creating a digital relationship with a customer that could theoretically influence a large share of their consumer spending.”


Further Reading

McQuivey, J.
Digital Disruption: Unleashing the Next Wave of Innovation. http://bit.ly/1w8RSPr

Christensen, C.
The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. http://bit.ly/1q8Pupq

Downes, L., Nunes, P.
Big Bang Disruption: Strategy in the Age of Devastating Innovation, 2014. http://amzn.to/1rwkxGW

User mindsets are driving the future of applications. http://pwc.to/1welZCj

Future of enterprise apps: Moving beyond workflows to mindflows. http://pwc.to/1BnPxjC

Manyika, J., Chui, M., Bughin, J., Dobbs, R., Bisson, P., Marrs, A.
Disruptive technologies: Advances that will transform life, business, and the global economy. McKinsey Global Institute, May 2013. http://bit.ly/1nGHR3R


Figures

UF1 Figure. A Fits.me robotic mannequin, which takes a user’s measurements to demonstrate how clothing will fit.

UF2 Figure. As many as 200 of a woman’s measurements are captured by 16 Kinect devices inside a Bodymetrics walk-in pod, in order to generate a 3D body scan that can be used to match her physical attributes to specific articles of clothing.

