Five ways fashion brands are using AI for personalization

Few industries are impervious to the ongoing artificial intelligence revolution. Driven by a host of open source technologies, brands of all stripes are tapping into the potential of both artificial intelligence and machine learning to make sense of big data.

However, some industries are set to benefit more than others. Fashion and artificial intelligence have always seemed likely bedfellows, for example.

First of all, there is no trendier topic in tech than AI. Chanel nodded to this link between fashion and tech, none too subtly, with its robot-themed fashion show in late 2016.

If fashion brands want to be de rigueur nowadays, they need to be investing in artificial intelligence to personalize consumer experiences. AI can also make quick work of the huge amounts of data generated by retail activity, as well as tackling more prosaic pain points like supply chain optimization.

Every aspect of the fashion industry is ripe for disruption by AI, but fashion is also a uniquely subjective pursuit. Deciding on new trends is typically the preserve of an exalted elite; however, IBM’s Watson has also taken a crack at predicting next season’s trends by analyzing the recent offerings of renowned designers.

As such, the scene is already set for AI to have a positive impact on the interaction between fashion brands and consumers.

While some continue to ponder this technology’s potential, other fashion brands are reaping the rewards today. Below are five examples from brands who have been getting AI right.

Sephora Visual Artist

Many beauty brands have experimented with AI through chatbot interfaces, but Sephora has taken the technology a significant step further with its Visual Artist product.

Sephora Visual Artist allows potential customers to “try on” cosmetic products including lipsticks, eye shadows, and highlighting palettes via the Sephora app or website.

Powered by Modiface AI technology, the Visual Artist can map and identify facial features, then use augmented reality to “apply” the user’s selected product and shade. Moreover, it can automatically apply suggested shades based on the consumer’s skin tone.
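
To get a feel for how this kind of virtual try-on works, the sketch below uses the open source MediaPipe and OpenCV libraries to detect lip landmarks in a photo and blend a lipstick shade over them. It is only a rough illustration of the general technique, not ModiFace's actual pipeline, and the shade value and file names are placeholders.

```python
# Minimal virtual "lipstick try-on" sketch: detect lip landmarks with MediaPipe
# Face Mesh, then alpha-blend a chosen shade over the lip region with OpenCV.
# Illustrative only -- not ModiFace's or Sephora's actual pipeline.
import cv2
import numpy as np
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

def apply_lipstick(image_bgr, shade_bgr=(86, 36, 160), opacity=0.45):
    with mp_face_mesh.FaceMesh(static_image_mode=True, refine_landmarks=True) as mesh:
        results = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return image_bgr  # no face found; return the photo unchanged

    h, w = image_bgr.shape[:2]
    landmarks = results.multi_face_landmarks[0].landmark
    # Collect the landmark indices that MediaPipe associates with the lips.
    lip_idx = sorted({i for connection in mp_face_mesh.FACEMESH_LIPS for i in connection})
    lip_points = np.array(
        [(int(landmarks[i].x * w), int(landmarks[i].y * h)) for i in lip_idx],
        dtype=np.int32,
    )

    # Fill the lip region and blend the shade into the original photo.
    mask = np.zeros((h, w), dtype=np.uint8)
    cv2.fillConvexPoly(mask, cv2.convexHull(lip_points), 255)
    tinted = image_bgr.copy()
    tinted[mask > 0] = shade_bgr
    return cv2.addWeighted(tinted, opacity, image_bgr, 1 - opacity, 0)

# Usage (file names are placeholders):
# result = apply_lipstick(cv2.imread("model.jpg"))
# cv2.imwrite("model_lipstick.jpg", result)
```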

In the reader’s best interests, I have opted to trial the feature with an image of one of Sephora’s models in the screenshot below, rather than a picture of me.

The Sephora app also features video tutorials and provides a platform for customers to upload their own videos, if they wish. The overall impression is of a very effective and useful application of AI to personalize the shopping experience.

It works because it goes beyond some of the PR stunts we have seen AI used for, instead using the technology to solve a perennial problem. Trying on make-up can be laborious in person; pretty much impossible online.

The tie-in to Sephora’s inventory of products is seamless and, driven by the same AI engine, personalized recommendations can be provided instantly. All of this makes it a grown-up, sophisticated use of technology that feels natural.

The Visual Artist is also hosted on Messenger, so shoppers can send a picture to the Sephora chatbot and it will come back to them with a range of suggested products, along with an augmented reality image of how those products would look once applied. Of course, purchasing these items is made as painless as possible.

Making the Visual Artist available throughout the Sephora ecosystem will only entrench the technology further in customers’ digital experience of the brand.

Thread

Thread is a UK-based fashion retailer, launched in 2012. Its core premise is simple: it pairs each customer with a stylist and sends them tailored recommendations every week, based on their stylistic preferences.

During the sign-up process, customers submit photos of themselves and select a range of images of outfits that best reflect the styles they would like to emulate. They are then introduced to their stylist and asked a few more questions about particular items they are seeking.

It is worth noting that these human stylists play a vital role in this process, selecting the inventory for the Thread website and fine-tuning suggested outfits.

Recommendations are sent to customers once per week and they can use a Tinder-style interface to feed back whether they liked the outfit or not.

This is central to the optimal functioning of Thread’s technology; the more feedback they receive on suggested outfits, the better their recommendations become over time.

Of course, this would be a very difficult process to emulate manually at scale, which is precisely the reason it hasn’t been done until recently. By using artificial intelligence, Thread can spot patterns in the images that reflect a customer’s preferred style and comb through thousands of products to find the right item, in the right size. If it gets it wrong, its AI system (known internally as ‘Thimble’) will probably get it right next time.
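
A heavily simplified sketch of that feedback loop is below: each outfit is reduced to a feature vector, a per-customer preference model is nudged after every swipe, and the in-stock catalog is re-ranked accordingly. The features and catalog are invented for illustration; Thread has not published how Thimble actually works.

```python
# Toy version of a swipe-driven recommendation loop: a per-customer logistic
# model is updated after each like/dislike and then used to re-rank inventory.
# The features and catalog are invented; this is not how Thimble is built.
import numpy as np

class PreferenceModel:
    def __init__(self, n_features, learning_rate=0.1):
        self.w = np.zeros(n_features)
        self.lr = learning_rate

    def score(self, item_features):
        return 1 / (1 + np.exp(-item_features @ self.w))  # probability of a "like"

    def update(self, item_features, liked):
        # One step of stochastic gradient descent on the logistic loss.
        error = self.score(item_features) - float(liked)
        self.w -= self.lr * error * item_features

def recommend(model, catalog, customer_size, top_k=5):
    # Only rank items actually available in the customer's size.
    in_stock = [item for item in catalog if customer_size in item["sizes"]]
    return sorted(in_stock, key=lambda item: model.score(item["features"]), reverse=True)[:top_k]

# Illustrative usage with made-up style features (e.g. [casual, formal, bold_color]):
model = PreferenceModel(n_features=3)
catalog = [
    {"name": "Oxford shirt", "sizes": {"M", "L"}, "features": np.array([0.2, 0.9, 0.1])},
    {"name": "Graphic tee", "sizes": {"M"}, "features": np.array([0.9, 0.1, 0.8])},
]
model.update(np.array([0.9, 0.1, 0.7]), liked=True)   # customer swiped right on a casual look
print([item["name"] for item in recommend(model, catalog, customer_size="M")])
```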

Thread’s approach to shopping removes some of the frustration that goes hand in hand with scouring clothes rails for the style you want, only to realize your size isn’t in stock.

Perhaps owing to this seamless experience and the (somewhat earned) cliché that men hate spending time shopping for clothes, Thread has primarily focused on menswear since its inception.

It serves as a worthy example of how to make the most of human skills while scaling those capabilities through artificial intelligence.

There are plans afoot to integrate external data sources into the core AI technology at Thread, including social media accounts and weather forecasts, so those recommendations may start to get spookily accurate in the near future.

Macy’s On Call

The digital revolution has taken its toll on traditional retailers like Macy’s. Shifting shopper preferences have left the department store at a disadvantage when it comes to competing for online shopping dollars.

That is not to say that Macy’s has resisted change altogether. It has experimented with some interesting technology to try to unite the online and offline worlds, most notably with Macy’s On Call.

Macy’s On Call is an in-store, smartphone-based helper powered by IBM’s Watson AI technology. When customers enter a store, they can go to macys.com/storehelp and start chatting with the digital assistant. Using natural language processing, the assistant can understand a wide variety of requests and direct shoppers towards their desired items within the store.

It can even detect when users are growing frustrated with the information provided and alert the closest member of in-store staff.
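
The sketch below captures the general shape of such an assistant: a small intent classifier maps free-text questions to store departments, and a crude keyword check flags frustrated shoppers for a human associate. It is a scikit-learn toy with invented training phrases and department names, not a recreation of the Watson-based system.

```python
# Toy in-store assistant: classify a shopper's question into a department and
# flag frustration for a human associate. Not IBM Watson -- a scikit-learn toy
# with invented training phrases and department names.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_phrases = [
    ("where can I find women's shoes", "Footwear, Level 2"),
    ("do you have running sneakers in size 10", "Footwear, Level 2"),
    ("I'm looking for a men's winter coat", "Menswear, Level 1"),
    ("where are the suits", "Menswear, Level 1"),
    ("I need a gift for a newborn", "Kids & Baby, Level 3"),
]
texts, departments = zip(*training_phrases)

intent_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
intent_model.fit(texts, departments)

FRUSTRATION_CUES = {"useless", "annoyed", "nobody", "waiting", "ridiculous"}

def handle_message(message):
    # Escalate to a person if the shopper sounds frustrated; otherwise answer.
    if FRUSTRATION_CUES & set(message.lower().split()):
        return "Sorry about that -- a store associate is on their way to help."
    department = intent_model.predict([message])[0]
    return f"You can find that in {department}."

print(handle_message("where do I find sneakers in size 10"))
print(handle_message("this is ridiculous, I've been waiting for ages"))
```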

Once more, we see the application of AI to help a fashion retailer solve an age-old challenge. Within a vast department store, it can be challenging for consumers to find specific items and sizes. In-store staff can also spend a lot of their time shepherding consumers in the right direction.

Although still ultimately dependent on people visiting the physical store to begin this digital interaction, it is a sign that Macy’s is embracing the latest technology.

While the technology is only currently available in a small subset of stores, the list is expanding, and the longer-term ambition is for Macy’s On Call to function as a personal AI stylist as well as a store guide.

Amazon Echo Look

The aim of acting as a personal AI stylist is much closer to being a reality with the Amazon Echo Look. This hands-free camera, available for $200 in the US (by invitation only, for now), is powered by Amazon’s AI assistant Alexa.

Put simply, this new Amazon product is intended to help users select the right outfit. Based on voice commands, it can take a picture and review the stylistic merits of the outfit the subject is wearing. This being Amazon, it can also make personalized recommendations for more suitable items – all available to purchase directly from the world’s biggest ecommerce website.

It has not gone unnoticed that the Echo Look is the closest thing we have seen to a real version of Cher’s Closet in the 1995 film Clueless.

At the heart of the Echo Look’s repertoire is the Style Check feature, which processes images and understands the relationships between an outfit’s component parts. Combined with input from professional stylists, it can then decide which outfit looks better on the user.
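
One plausible way to build that kind of comparison is sketched below: embed each photo with a pretrained vision model, score the embeddings with a small learned head, and pick the higher score. The scoring head here is an untrained placeholder; Amazon has not disclosed how Style Check is actually implemented.

```python
# Sketch of a "which outfit looks better?" comparison: embed two photos with a
# pretrained ResNet and score each embedding with a linear head. The head's
# weights are random placeholders here; Amazon's Style Check is not public.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT
preprocess = weights.transforms()          # standard resize/normalize pipeline
backbone = models.resnet18(weights=weights)
backbone.fc = torch.nn.Identity()          # keep the 512-dim embedding
backbone.eval()

style_head = torch.nn.Linear(512, 1)       # would be trained on stylists' judgments

def style_score(image_path):
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return style_head(backbone(image)).item()

def style_check(outfit_a, outfit_b):
    # File names are placeholders for the two photos the user submits.
    return outfit_a if style_score(outfit_a) >= style_score(outfit_b) else outfit_b

# print(style_check("outfit_monday.jpg", "outfit_tuesday.jpg"))
```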

This is a highly contentious, subjective field, of course, so it is unlikely that users will take Amazon’s word as the gospel truth every time.

Early reviews suggest that the product still has some way to go before it even reaches Cher-level accuracy. In particular, it has been noted that it does not have enough contextual detail at its disposal to make judgments on individual outfits.

Nonetheless, this is exactly the kind of data Amazon is aiming to gain by placing an AI assistant in the center of our homes. Whether enough people want an online retailer to have a camera in their bedroom is another question altogether.

Still, even if the Echo Look fails to take hold, this quite brazen launch from Amazon serves as a bellwether of the direction in which the online fashion industry may be heading over the next few years.

ASOS Visual Search

The final entry on our list is also the most recent development. Earlier this month, online fashion retailer ASOS launched its new visual search technology, available via its app.

Visual search is a fascinating area of technological innovation, with companies like Google and Pinterest investing heavily in image recognition AI systems.

What these all have in common is the desire to monetize the ‘discovery’ phase of the shopping journey by making personalized recommendations to consumers before they even know exactly which kind of product they want.

ASOS visual search turns the user’s smartphone camera into a discovery tool, allowing them to take a picture of a product.

By identifying the shape, color, and pattern of the object, ASOS’ AI technology can then cross-reference its own inventory of products and serve up the most relevant results.
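
At its simplest, that kind of matching comes down to turning every image into a numeric fingerprint and finding the closest items in the catalog. The sketch below uses plain color histograms and cosine similarity as a stand-in for the far richer learned embeddings a production system would rely on; the file names are placeholders.

```python
# Bare-bones visual search: describe each image with a color histogram and
# return the catalog items whose histograms are most similar. A production
# system would use learned embeddings covering shape and pattern as well;
# the file names below are placeholders.
import cv2
import numpy as np

def image_fingerprint(path, bins=8):
    image = cv2.imread(path)
    hist = cv2.calcHist([image], [0, 1, 2], None, [bins] * 3, [0, 256] * 3)
    hist = hist.flatten()
    return hist / (np.linalg.norm(hist) + 1e-9)  # unit-normalize for cosine similarity

def search(query_path, catalog_paths, top_k=3):
    query = image_fingerprint(query_path)
    scored = [(path, float(image_fingerprint(path) @ query)) for path in catalog_paths]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

# print(search("street_photo.jpg", ["dress_01.jpg", "jacket_14.jpg", "tee_07.jpg"]))
```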

This is an entirely organic development for ASOS. As an online-only platform, the business has always been at the forefront of digital innovation in fashion. In fact, the ASOS name originally stood for ‘As Seen on Screen’, and much of its early success was due to its recreation of celebrity outfits, available for consumers to buy at a low price. As such, we can view this as ASOS using technology to deliver on its original business proposition in a more effective way.

There is still room for this product to grow, too. At present, it captures those transient moments when we see something and want to recreate the look. By expanding this to recommend complementary items and complete a new look, ASOS will open up another revenue stream. With the vast quantities of user data it has captured over the years, the possibilities for personalization will be endless.

Clark Boyd