Understanding Customers: Data, Journeys and Insights

In advance of Now What? Workshops, we’re featuring short interviews with our smart and wonderful workshop speakers. This week, we talk to Jon Crowley about analytics and metrics — and how to use metrics to make better content decisions.

2/27/2018

Categorized

  • Content and IA
  • BlendPRESENTS
  • Strategy
  • Discovery and Scoping
  • Digital Optimization

Jon Crowley is a Strategy Director at Cossette in Toronto. He has spent the last decade working in strategic planning roles across the marketing communications spectrum. In his words, he’s either a multidisciplinary strategic thinker engaged in the nuanced interplay between mass and targeted, traditional and digital, and brand and consumer, or he lacks focus.

We asked Jon a few questions about the balance of analytics and insights in advance of his workshop, “Understanding Customers: Data, Journeys and Insights.”

In your words, understanding which metrics are important begins by first understanding your customers. What does the process of understanding your customers look like?

So, one of the complicated things is that when people start trying to figure out who their customer is, their first thought is, “Let me try to describe a person.” And they come up with a demographic that they think makes sense — they assume, for example, that their customer is a 33-year-old who lives in the northeast and makes a certain amount of money.

Now the problem with that kind of thinking is that it traps you into a bubble of assumptions, forgetting that there is a group of people who might relate to your brand or product and don’t fit into that specific bubble.

What I like to do is build upon something an old boss of mine said: think about targets as a how, rather than a who. By developing a better understanding of what the user is trying to accomplish and what they’re really connecting to, it becomes easier to start figuring out what metrics are going to be relevant, what kind of information they’re going to respond to, and what kinds of behavior it makes sense to track and understand.

I mean, I might have a customer who loves water skiing, as an example. But if I’m trying to sell them mortgages, I don’t really need to know about their water skiing habits - I want to know what kind of house they’re looking to buy and how much money they make.

You talk about this concept of a “customer journey.” Explain what you mean by a customer journey.

There’s a bunch of different ways to approach a customer journey. That’s what I find most interesting about customer journeys - you never meet two people who have the same definition.

For me, a customer journey is at its most valuable early on in a process, when we’re trying to understand someone’s entire experience. Not just looking at your brand — your touchpoints — and how they’re interacting with you, but looking at all of the different things they’re considering. It’s a customer analysis tool that looks at the different stages a customer will go through, then develops a whole bunch of theories and a whole bunch of different data sources to help understand what’s influencing them and what problems they’re trying to solve at each individual stage.

You allude to this idea that it’s not necessary to track all possible metrics - and, to me, it looks like you focus less on quantitative data and more on “how can this help me better understand the user?” What is your process for determining which metrics and data you should actually measure?

I think the fundamental problem with the work tied to analytics is that we focus really heavily on what we can measure and extrapolate from there, instead of focusing on what we want to know and then figuring out what cues we can use.

Google Analytics is a perfect example, because it’s something that everyone sets up, and it’s easy and free. When we set up Google Analytics, we look at the numbers and then we try to figure out what we want to do with that data. But those numbers might not communicate anything about the pages we’re measuring.

For example, I had a client create a basic text page — a strictly informational page — without any calls to action. They came to me asking why the bounce rate was so high. I knew that it was because site audiences would come, look at the page, and then leave. I had to explain that just because you’re shown a number doesn’t mean that number’s relevant. If there’s no reason for a person to stay on the site or spend any time there, why are you shocked that people were coming, reading your paragraph of information, and then going to do something else?
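
To make the bounce-rate point concrete, here’s a minimal sketch of the classic single-page-session math behind that number (the long-standing definition of a bounce, not anything specific to Jon’s process); the page paths and session data below are invented for illustration:

    # Bounce rate, classically, is single-page sessions divided by total sessions,
    # so a standalone informational page with no call to action will naturally
    # report a very high number. The sessions here are made up for this example.
    from typing import List

    def bounce_rate(sessions: List[List[str]]) -> float:
        """Each session is the ordered list of pages a visitor viewed."""
        if not sessions:
            return 0.0
        single_page = sum(1 for pages in sessions if len(pages) == 1)
        return single_page / len(sessions)

    sessions = [
        ["/pricing-faq"],              # read the page, left
        ["/pricing-faq"],              # read the page, left
        ["/pricing-faq", "/contact"],  # the rare visitor who clicked onward
        ["/pricing-faq"],              # read the page, left
    ]

    print(f"Bounce rate: {bounce_rate(sessions):.0%}")  # prints "Bounce rate: 75%"

Run that against a page that exists only to be read once and the number will sit near 100 percent, which, as Jon says, tells you nothing about whether the page did its job.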

It’s all about being able to understand what problems you’re trying to solve and what benefits you want people to come away with, and then developing the path you want the user to take and the cues you can use to determine whether or not it’s working.

Here’s the thing — quantitative data is fantastic if you have valuable quantitative data. But getting from “what the data says” to “what it means” often requires moving a little bit more into the realm of qualitative data and being a little more open-minded about the different interpretations. One thing I’ve done to help put data in better context is finding ways to sit down and talk to people, whether it’s people from my existing customer base or potential customers, and having them walk me through their thinking, telling me why they’re doing what they’re doing. I can then use that qualitative research to help inform my interpretation of the quantitative.

Really, it’s the marriage of qualitative and quantitative that gives you the full vision of what you’re seeing. But this can be very hard to remember when you’re building digital experiences, because quantitative data is often right there for the taking. It’s hard to resist.

I like how you use the term “insights” instead of “analytics” or “data.” Talk about what you see as the difference between insights and traditional analytics.

Insights, for me, are about taking data and using it to create a lens for understanding human behavior — moving from what you’re seeing to why you’re seeing it, and what’s motivating people toward specific actions.

It’s about understanding hidden motivations — those things that might not be clear even to those of us trying to influence their actions. Sometimes that’s represented by data, but we’re often not sure when, or why, we’re seeing it, so insights help us develop a bit of a theory about what’s going on in the customer’s mind. A theory of why people are making specific choices. Sometimes it’s a deep psychological thing, and other times it’s a really surface level thing.

A great example of an insight that helped me occurred when I was talking to people about their spending habits. Interviewing people, I found that they were so much more apprehensive about spending cash because it just felt so much more real to them, whereas with a debit card or a credit card they didn’t worry as much — they never felt that they might, for instance, spend all of their money, even though it was all the same money coming from the same place. The mental process is very different, and that’s something I never would have guessed from data alone.

Talking to people, I could see what shaped their entire spending behavior, and that helped show why online shopping is such a different beast from actually pushing someone into a store to buy a product that’s right in front of them. The visceral physicality made it more of a real experience.

We know that a lot of larger organizations and corporations spend a lot of money every year tracking analytics and data. How do things differ for a smaller organization?

The biggest thing that I’ve noticed is that large corporations spend a lot more time preparing to put something out in the world, so they do lots of pretesting. They do a whole lot of user interviews. They put a whole bunch of quantitative and qualitative research into things, until they land on what they think is a perfect solution. (And then they’ll put it out there and they’ll find out it’s not perfect, because nothing’s perfect.)

Smaller organizations are more willing to do what they think is right — put something out there and learn from it, and because of this there’s a much tighter feedback loop and quicker improvement. Smaller organizations have an easier time acting on insights; there’s less bureaucracy stopping them from shipping a version 1.0, then a 1.1 and a 1.2, and on and on.

The business and practice of analytics and insight — and how we use those to help shape the content we create — changes really quickly. How do people keep up with it? How do you keep learning and stay up to date?

The biggest thing for me is taking meetings whenever possible with people who know more than I do, or are involved with new systems or demos, and being completely unafraid to ask embarrassingly basic questions.

Like, if I’m in a meeting with Google or Facebook and they’re talking about a new product, I’m always the guy asking embarrassingly simple questions: How is this tracked? How does this data source and this definition term combine with the previous one? Can you tell me about the attribution math? (Spoiler alert: they can never tell me about the attribution math.)

When a new thing comes out, there’s this tendency to smile and nod and assume you’ll figure it out. It’s a lot quicker when I’m willing to ask the slightly embarrassing questions, admit there’s a gap in my understanding, and get someone who’s a little more well versed to walk me through it.

Awesome — thanks so much for your time, Jon, and we’ll see you in April!

No problem!

Resources on the discovery process.

We’ve written at length, both here and beyond, on research and discovery.

A Better Budget: How Tech Planning Improves the Scoping Process

Joe Kepley

One of the toughest parts of any web project is estimating how much the project will cost and how long it will take. Technical planning helps create better budgets, plan for more realistic timelines, and improve the efficiency of the entire project.

October 17, 2022

Episode 8: Gather Insight From Your Metrics (w/ Jon Crowley)

Corey and Deane talk about the first time they tracked analytics on their blogs in the early 2000s. Then, Jon Crowley, Senior Vice President of Strategy at Diamond Marketing Group, talks to us about the balance between data and insights — how to focus on questions rather than raw numbers, how to look for answers rather than “trying to be correct,” and when we can take data at face value. (He also gives us a tour of his shoe collection.)

June 15, 2022 | The Web Project Guide Podcast

Check out more articles on the discovery process.