According to Geoffrey Bessin, chief evangelist at Intuiface, running a business without metrics is laughably risky, and there isn’t an investor or entrepreneur on the planet who thinks otherwise.
So why aren’t you collecting metrics for your digital signage project? Whether the project is yours or a client’s, chances are you’ve somehow decided that experience and gut feel, though nuts for running a business, are perfect tools for steering a signage deployment.
Why do we do this to ourselves?
There are three key reasons most digital signage projects choose to do without analytics.
• Absence of analytics capability in the underlying software platform.
• Absence of mind share because there is no history of using analytics on a project.
• Absence of experience or confidence in executing an analytics initiative well.
Is ‘budget’ a fourth reason? If you come around to believing every project needs analytics, then budget can’t be an excuse, because no project can do without analytics. Anyway, if you don’t measure, you risk going over budget. Don’t be penny wise, pound foolish.
Does the software platform lack good analytics capability? You’ve got the wrong platform. Is it not something your team has done in the past? Unless you think there is nothing left in this world to learn, maybe things should change.
A three-chart approach
There are no direct measures for fully non-interactive signage. You’ll need secondary measures and instinct. This is not as controversial as it may seem. There are many ways to interact with digital signage. Some are active, like touch, RFID badge swipes and voice activation. Others are passive, like camera vision and motion detection. It is these latter, passive approaches that could be used to generate data for analysis. Thus, all signage installations can be a primary data source with some degree of accuracy. It just takes vision and investment on the part of the project owner to make it happen.
Average dwell time
Average dwell time is an unvarnished measure with clear implications. With this number, you can unambiguously identify the level of engagement your signage content is achieving, and do this on a screen-by-screen basis.
Dwell time is the amount of time a person (or people) is actively interacting with your content. The interaction could be explicit, like tapping buttons or entering information in forms. But interaction could be implicit as well, like watching a video. Either way, dwell time tells you how long a given user has engaged with your content.
Average dwell time gives you an overall view of how engaging a given screen has been. The timeline is up to you – hourly, daily, weekly. And the focus could be at any comparative level – per screen, per floor, per geographic location, and more.
During your signage pilot phase, establish a dwell time baseline. What is the average across all locations, all screens, over a given time? Then, beyond the pilot, track average dwell time and compare it to the baseline.
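As a sketch of the baseline-and-compare idea, average dwell time can be computed from raw interaction logs. The log format, screen names and figures below are illustrative assumptions, not any particular platform’s export schema.

```python
from statistics import mean

# Hypothetical log: each record is (screen_id, dwell_seconds), as a
# signage platform's analytics module might export it.
records = [
    ("lobby-1", 42.0),
    ("lobby-1", 18.5),
    ("floor2-3", 65.0),
    ("floor2-3", 12.0),
    ("floor2-3", 33.5),
]

def average_dwell(records, screen_id=None):
    """Average dwell time in seconds, overall or for one screen."""
    times = [t for s, t in records if screen_id is None or s == screen_id]
    return mean(times) if times else 0.0

# Pilot-phase baseline across all screens, then a per-screen comparison.
baseline = average_dwell(records)
per_screen = average_dwell(records, "lobby-1")
delta_pct = (per_screen - baseline) / baseline * 100
```

The same function supports any comparative level the article mentions: pass a different filter (per floor, per location) instead of a single screen ID.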
Average dwell time is the perfect way to identify the engagement level of your content. Changing content, or location, or format, or many other aspects of your deployment will likely alter the average, an ideal moment for running A/B tests to identify methods for improving engagement. It’s an indispensable measure.
Session count

If you know anything about website analytics, then you’re familiar with the notion of a session. A session is the set of actions performed by a unique visitor. That action could be reading a marketing promotion, or it could be actively interacting with content. Each visitor is represented by a unique session. The more sessions, the more people; the more people, the more engagement your signage deployment is delivering.
There’s a catch here, and that’s how to identify a session. With websites, session identification is straightforward: web servers and analytics tools can see identifiers like IP addresses and browser cookies, and each unique identifier is treated as a distinct visitor with their own session. Google Analytics takes this a step further, grouping multiple visits from the same visitor into one session if they happen in close succession.
Humans don’t have IP addresses so how can you differentiate one person from another and thus one session from another? The key is to create one or more scenarios that will indicate, with a high level of probability, that a new person has approached your signage and thus a new session has begun. Options include:
• Automatically returning to the home screen after a set amount of time during which no interaction has occurred. It’s likely that the next interaction is a new person.
• Using computer vision and eye tracking to identify when a new viewer engages.
• Adding a Home or Start button. A tap of this button likely indicates a new person has begun their interaction.
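The first heuristic above can be sketched in a few lines: treat any gap between interactions longer than a chosen timeout as the start of a new session. The 60-second timeout and the timestamps are assumptions for illustration; tune the timeout to your own deployment.

```python
TIMEOUT = 60  # seconds of inactivity before we assume a new visitor (assumption)

def count_sessions(timestamps, timeout=TIMEOUT):
    """Count sessions in a sorted list of interaction timestamps (seconds)."""
    sessions = 0
    last = None
    for t in timestamps:
        if last is None or t - last > timeout:
            sessions += 1  # long gap: likely a new person at the screen
        last = t
    return sessions

# Three bursts of taps, separated by long idle gaps → three sessions.
taps = [0, 5, 12, 200, 210, 500]
count_sessions(taps)  # → 3
```

This is the same logic a home-screen timeout implements in the player itself; counting it server-side from logs just makes the number available for your dashboard.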
Key performance indicators

The hardest part of any analytics endeavor is identifying the key measures that are meaningful and actionable. Your goal should be to avoid MUMs: measurable but useless metrics. Average dwell time and session count are actionable – they deliver unambiguous insight about your deployment – but there are certainly other useful measures unique to your needs and your deployments.
A formal exercise involving stakeholders should be conducted to identify project goals and the Key Performance Indicators, or KPIs, for those goals. Maybe there’s only one – for example, collecting email addresses. Or perhaps there are multiple goals because there are multiple stakeholders. Regardless, if you have a goal, you have a structure for identifying KPIs.
Tracking KPIs is best achieved through use of a single value chart, the classic chart type for a dashboard. Identify the current, actual value of the KPI, associate it with the target value and – if possible with your analytics platform – indicate the KPI value trend. At a glance you can identify progress toward your goals.
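As a minimal sketch, a single value chart reduces to three numbers per KPI: actual, target and the previous period’s value for the trend. The `KpiTile` class, field names and the email-signups figures below are assumptions, not any dashboard product’s API.

```python
from dataclasses import dataclass

@dataclass
class KpiTile:
    name: str
    actual: float
    target: float
    previous: float  # last period's value, used for the trend indicator

    @property
    def progress_pct(self):
        """Actual value as a percentage of the target."""
        return self.actual / self.target * 100

    @property
    def trend(self):
        """Direction of change versus the previous period."""
        if self.actual > self.previous:
            return "up"
        if self.actual < self.previous:
            return "down"
        return "flat"

tile = KpiTile("email signups", actual=340, target=500, previous=310)
print(f"{tile.name}: {tile.actual:.0f} / {tile.target:.0f} "
      f"({tile.progress_pct:.0f}%, trend {tile.trend})")
```

One tile per stakeholder goal, refreshed each reporting period, gives the at-a-glance progress view described above.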
Word of caution
Be very careful about reaching inappropriate conclusions when studying collected data. You know the old saw: correlation does not imply causation. If in-store revenue goes up when winter settles in, does that mean cold weather causes people to buy more? A more sensible explanation is that the Christmas and New Year’s holidays are the cause. Be careful about inferences.
And be careful what you wish for. If you’re hoping for particular outcomes, you may view data with rose-colored glasses and reach inappropriate conclusions. If a particular set of items is never viewed on an endless aisle kiosk, is it because those items are not popular? Or is it possible that the kiosk design makes it hard to discover or understand that category of items, so no one decides to view them?