Methodologies & Techniques

Finding the signal in the noise. Making sense of data with human insights

Innovation is the lifeblood of business, and design thinking and its associated processes have become embedded in our thinking about innovation best practice. But what these new processes often overlook is the role of customer insight in informing them. Using research to evaluate an idea can reveal fundamental flaws frustratingly late in the innovation process, once considerable investment has been made. There’s a strong case for injecting customer insight earlier in the innovation process.

Big data has opened up a myriad of opportunities for product and service innovation, with algorithms spotting patterns and pointing to new concepts which previously might have gone unnoticed. Data indicates a behaviour or a need. The business challenge is then how to meet that need profitably. The data, however, isn’t enough.

Innovation and good design demand human insight. Some organisations are more market-oriented than others. However, truly human-centred design requires an understanding of the people you are trying to reach so that you can design from a user perspective. Otherwise, the danger is that you create something because you can, rather than because it’s useful.

It’s well established that big data requires human insight in order to fully reap its rewards. And this is far truer for big data than for previous data revolutions. For example, when Electronic Point Of Sale (EPOS) was introduced into retailers in the 1980s, retail brands suddenly had an unprecedented wealth of data about shopper habits and behaviours. It could’ve been overwhelming, like big data is today. But it wasn’t. The difference was that those whose job it was to interpret the data had often worked on the shop floor. They had witnessed shopper behaviours first-hand, and so they intuitively knew how to make sense of the data. This is evident in the story of sales data that showed the curious coincidence of nappies and beer appearing in the same purchase from 6.00 pm onwards. Data analysts knew this was due to mothers phoning their partners at work to pick up nappies on the way home, and the obliging fathers doing so and taking the opportunity to buy beer at the same time.

Today, the job of interpreting big data often belongs to people who haven’t experienced human context at first hand in the same way. That makes finding the signal in the noise more challenging.

This is where other methods come in.

Much has been written about ethnographic research as the natural bedfellow of big data. However, it’s not just ethnography which can supply the insights – other qualitative methods can all contribute: depth interviews, group discussions, online panels, customer immersions, semiotics and cultural analysis. And it’s not just about finding the signal in the noise. Without the human context, big data can easily provide the wrong picture.

Imagine the following scenario: you’re a major supermarket, and your data suggests that parking is a key need for many of your customers. You create a segment around this behavioural need. But your data set hasn’t revealed that the need for a parking space is merely a hygiene factor, not a unifying shopping motivation for your segment. Parking is a functional need, and it can create a very strong cluster, but it pulls together people who have nothing else in common.

Consequently, your audience is almost impossible to market to coherently, because that one functional detail is the main thing you know about them. Or, as Tricia Wang has shown, perhaps you’re Nokia and your data is indicating a relatively low price tolerance for mobile handsets. In fact, given the right functionality and aesthetics, price tolerance is far more elastic, as the iPhone went on to prove.

In both cases, your big data map – while perfectly accurate – has nonetheless misled you, because vital details (such as motivation) were missing from the picture. You needed more maps to fully understand the terrain.

The good news is that it doesn’t take much to obtain that extra context and inject this much-needed human insight. A light sprinkling is often all that’s needed for those mini epiphanies to strike – you don’t always need an extensive programme of focus groups or lots of in-depth ethnography. The insight buzzwords of the moment all speak to a need to be fleet of foot and entrepreneurial in spirit – ‘agile’, ‘sprints’, ‘pivot’, ‘live insight’, ‘answers in an hour’. And indeed, a programme of ‘coffee break chats’ in your office with your target audience can make all the difference in making sense of your data, letting you iterate your ideas and test and refine quickly and effectively.

Here are a few good practice principles to get the best out of qualitative insights in design:

  1. Don’t be too laser-focused in your enquiry. Influence comes from a huge range of directions. Context is everything when it comes to understanding human behaviour. Keep it open and you’ll arrive at unexpected answers
  2. The degree of qualitative insight you require will vary across the life span of your design project. The upfront inspiration or ‘white space’ phase demands a more extensive degree of qualitative insight as you look for opportunities/pain points to address
  3. That said, a light touch insight might be enough to identify the gaps in your data or answer your key questions – don’t assume you need to spend a lot or require lots of time. Further down the innovation pathway your insight needs will be lighter as you move into a more agile test-refine-optimise process, and qualitative insight can be much speedier than people often assume e.g. coffee break consumer chats
  4. Future-facing research is notoriously difficult (hence Henry Ford’s ‘faster horse’ trope), but a well-designed project mitigates the pitfalls. It’s true that people can’t tell you what they will want tomorrow. However, they can give you and your design team valuable clues, especially if you talk to early adopters and category experts as part of your early investigations
  5. Get your design team immersed in the process – they should be as involved as possible in witnessing real behaviour first-hand. Designers tune into important clues which researchers might not spot the significance of. At the very least, they should attend some sessions and feed into the analysis
  6. Add a behaviour change aspect to your research, to flush out false positives and spot hidden opportunities. It’s easy for people to overclaim excitement or indeed express disinterest until they’ve actually experienced a new product or service. Give them time to live with the idea for a while, in as tangible a format as possible
