
Eye Know You!


This article was published in the December 2015 issue of STORES Magazine.

Enhancing the in-store experience through facial recognition software and supercomputer analytics

Online shoppers expect a fair bit of personalization: They can be greeted at log-in, access shopping history, track shipping information and receive promotional emails highlighting items they’d be interested in.

Bricks-and-mortar retailers, meanwhile, might have a loyalty program that associates can access at the point of sale — not much room for personalization while consumers are on the sales floor, browsing products and looking to buy.

“Greater engagement leads to greater sales,” says Doug Bain, chief revenue officer of eyeQ Insights, “and personalization works in all channels. The fact that it hasn’t been done in the bricks-and-mortar space is simply a technological limitation, not a lack of desire to accomplish it there.”

EyeQ’s goal is to help bricks-and-mortar retailers provide the same level of service, attention and awareness available online. The technology firm’s approach involves a combination of in-store digital signage, sophisticated facial recognition software and the capabilities of Watson, IBM’s cognitive computing system. (Watson, you may remember, was developed to show off IBM’s abilities by answering questions on the television program “Jeopardy.” After vanquishing all human competition, it has gone on to other things.)

Watson has two advantages over the average laptop: Instead of answering a query with a list of relevant documents, it synthesizes the information available to it and delivers a specific answer. And it’s a supercomputer, with exponentially greater speed, storage capacity and power.

The sorting hat

The best way to understand what eyeQ is trying to do is to compare it to a conventional digital display. Unless it’s a specialized store, that display can’t show shoppers a significant portion of a store’s assortment, so it might instead show a variety: lots of fast fades and dissolves, maybe some panning video of the aisles or the escalators. The idea is to keep it moving, so if a shopper stands there long enough she’ll see something she might be interested in.

The digital sign in eyeQ’s system is connected to a camera, a lot of proprietary facial recognition software and — via IBM’s cloud service — Watson. When a shopper stops to look at an eyeQ digital sign, the sign looks back at him, and based on facial features and appearance, tailors its content to the viewer’s age and gender. Men between the ages of 35 and 45 might see suits, top-end cameras or fly rods; women between the ages of 18 and 30 might see jewelry or clothes.

“When somebody approaches the screen,” says Bain, “the screen is already aware of the baseline information — the age and gender of the person — and is ready to make product recommendations.”
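The baseline targeting Bain describes can be pictured as a simple lookup from an estimated demographic to a content set. This is an illustrative sketch only — eyeQ's actual recommendation logic is proprietary; the age bands and product categories below are the examples given in this article, and the function name and fallback behavior are my assumptions.

```python
# Toy stand-in for eyeQ's demographic content selection: the camera's
# age/gender estimate picks which products the sign displays.
# Rules below mirror the article's examples; everything else is invented.

def pick_content(age: int, gender: str) -> list[str]:
    """Return product categories for an estimated (age, gender) pair."""
    if gender == "male" and 35 <= age <= 45:
        return ["suits", "top-end cameras", "fly rods"]
    if gender == "female" and 18 <= age <= 30:
        return ["jewelry", "clothes"]
    # Fallback when no targeting rule matches: show general content.
    return ["general promotions"]
```

The point of the design is that the sign needs no identity at all — only the two coarse attributes the camera can estimate for every shopper who walks up.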

There’s more, though: If the shopper gives the system her Twitter username, Watson can capture her most recent 200 tweets, run them through its natural language processing capabilities and slot her into one of a selection of basic personality types.

Based on that, the system can not only change the products being recommended, but the whole experience — background colors, video, music, whatever it’s got.
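The tweet-to-personality step runs on Watson's natural language processing, which is not public in this form; as a rough illustration of the idea, here is a deliberately crude keyword-tally classifier. The personality types, vocabularies, and scoring are all invented for the sketch and bear no relation to Watson's actual models.

```python
# Crude illustration of slotting a shopper into a basic personality type
# from recent tweets. A real system (like Watson's) uses far richer NLP;
# this keyword tally just shows the shape of the input and output.
from collections import Counter

KEYWORDS = {
    "adventurous": {"travel", "hiking", "new", "explore"},
    "analytical": {"data", "research", "why", "numbers"},
    "social": {"friends", "party", "love", "together"},
}

def classify(tweets: list[str]) -> str:
    """Return the personality type whose vocabulary best matches the tweets."""
    words = Counter(w.lower().strip(".,!?") for t in tweets for w in t.split())
    scores = {ptype: sum(words[w] for w in vocab)
              for ptype, vocab in KEYWORDS.items()}
    return max(scores, key=scores.get)
```

Whatever type comes back would then drive the experience changes the article describes: background colors, video, music.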

eyeQ's Beer-o-Meter

When presented with an image of an empty beer mug, viewers who smiled at the eyeQ sign triggered the mug to fill.

Bain stresses that identifying oneself isn’t necessary. “If you want to tie into someone’s specific history, they need to opt in. In most stores’ experience, opt-in rates, whether it takes the form of downloading an app or something else, tend to be fairly low, usually in the single digits,” he says.

“What we’re offering may be so compelling that it gets into the high single digits, but we don’t want to depend on that. What we have designed is something that will work for 100 percent of shoppers, because it can see 100 percent of shoppers.”

The most recent version of eyeQ’s system can even register emotion. “It’s categorizing the person’s expression as happy, sad, angry, surprised — and to what degree,” says Bain. “Is their expression changing dramatically when they get to a certain point in the video, or a certain page, or when they touch the screen? It’s another data point to determine the effectiveness of the content.”

EyeQ’s demonstration of this capability featured a screen with an image of an empty beer mug. Stepping up and smiling at the sign would trigger the mug to fill up; maintaining the smile made the mug fill completely and resulted in a ticket for a mug of real beer.
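The Beer-o-Meter loop can be sketched as a per-frame smile score driving the fill level. Everything here is an assumption for illustration — the score range, threshold, and frame counts are invented; only the behavior (smile fills the mug, a sustained smile fills it completely and earns the ticket) comes from the article.

```python
# Sketch of the Beer-o-Meter demo: each camera frame yields an (assumed)
# smile score in [0, 1]. Smiling frames fill the mug; non-smiling frames
# let it drain. A completely full mug wins a ticket for real beer.

def run_demo(smile_scores: list[float], threshold: float = 0.6,
             frames_to_fill: int = 10) -> tuple[float, bool]:
    """Return (final fill level in [0, 1], whether a ticket was earned)."""
    filled = 0  # frames of sustained smiling accumulated so far
    for score in smile_scores:
        if score >= threshold:
            filled = min(frames_to_fill, filled + 1)
        else:
            filled = max(0, filled - 1)  # smile fades, beer drains
    return filled / frames_to_fill, filled == frames_to_fill
```

A sustained smile over enough frames reaches a full mug; an interrupted one does not.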

Looking ahead

As it develops its product, eyeQ has been working closely with retail marketing agency TPN. Manolo Almagro, TPN’s senior managing director for digital and retail technologies, notes that the system, even without a consumer opting in or providing their own information, can identify a repeat visitor by the unique media access control address her mobile device sends out to find available Wi-Fi.

“We don’t know her name,” he says, “but we know it’s the same 40-year-old woman who was in on Tuesday.”
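Recognizing a return visit from a device's MAC address amounts to remembering which addresses the store's Wi-Fi sensors have seen before. The sketch below adds a salted hash as a privacy measure — that hashing step is my addition, not something the article attributes to eyeQ; the salt value and class shape are likewise invented.

```python
# Illustrative repeat-visitor log keyed on the MAC address a phone
# broadcasts in Wi-Fi probe requests. Storing a salted hash instead of
# the raw address is an assumed privacy choice, not eyeQ's stated design.
import hashlib

class VisitorLog:
    def __init__(self, salt: str = "store-1234"):  # salt is illustrative
        self.salt = salt
        self.seen: set[str] = set()

    def observe(self, mac: str) -> bool:
        """Record a sighting; return True if this device was seen before."""
        digest = hashlib.sha256((self.salt + mac.lower()).encode()).hexdigest()
        repeat = digest in self.seen
        self.seen.add(digest)
        return repeat
```

The store never learns a name — only that the same anonymous device has come back, which is exactly the "same 40-year-old woman who was in on Tuesday" capability Almagro describes.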


This, says Almagro, is where further development of the technology requires some care. “I know you’re interested in a certain type of product, I know your age and gender and I know your personality type. This is a level of depth we’ve never had before, and we have to ask ourselves, ‘How much personalization is too much?’ I want to give enough to make the experience more convenient for you, but not so much that it becomes creepy. That’s a kind of fine line we have to walk.”

While some versions of the technology have been in stores for a couple of years, it is still very much a work in progress.

“When we talk to our clients,” Almagro says, “they say, ‘Oh, that’s very interesting, can you tell us more? Can you build a prototype for us? Can we put that prototype in the store to see how it works?’”

At the moment, one retailer has a prototype in place, and several consumer package goods companies are experimenting with prototypes in the retail space.

Within the next few months, eyeQ expects to have a prototype implementation robust enough to start to generate some real feedback.

Bain says early signs are encouraging. “In one series of A/B testing, the touch measurements of engagement went up 47 percent in the most recent generation from what it had been before. And we have other data that demonstrates that increased engagement with the system does indeed mean increased sales.”

As noted, one primary goal is to make the in-store experience more like the online experience. “I think the way we’re looking at it,” says Almagro, “is that we can complete the circle of information. You can literally follow the phone to the cash register if you have a beacon there. You can say, ‘OK, that person got the information, they bought the product and they checked out.’ And every time they come back we can learn more, and be even more thoughtful.”


