Combining quantitative and qualitative data
This blog post is based on a talk I gave about combining quantitative and qualitative data at UX Istanbul 2020. It was a 40-minute talk, which makes for a fairly long read.
Why am I talking about this?
Gathering data, for me, is about taking opinion out of the equation as much as possible when making decisions about the work that needs to be done. I’ve seen people ignore the data because they knew better than the evidence, with disastrous consequences. If you want to meet your users’ needs, then you need data, both quantitative and qualitative depending on the situation, to make good decisions. One type of data isn’t wholesale better than the other; it’s about using the right data at the right time. You need both to build the right thing and to build the thing right. GDS designers are very good at making posters like these, which speak powerful truths in simple ways. They are freely available online, if you feel your organisation needs some home truths.
Effective and efficient use of data
What do you have? What do you need?
It can be expensive to get new tools or gather more data, whether quantitative or qualitative. But this is often what organisations jump straight to.
It’s important to pause and assess: what do you need? What do you need to understand? What do you already have, and does it meet the need? Perhaps not; maybe you really do need something new. But perhaps you have a lot of valuable data that’s sitting there unused. I have worked at organisations that gathered lots of data at significant cost but used it only to report up to senior management and nothing else. Sometimes a single source of data may not be that helpful, but when you combine and triangulate it with others, it can be much more powerful. It all starts with the data having a purpose.
Creating a holistic, evidence based narrative
The What. The Why.
Quantitative data is really good at telling you what is happening and what has happened: how your products and services are being used, what people’s attitudes and opinions are about the things they are using, what’s going well and what isn’t. But it doesn’t tell you why. That’s where qualitative data comes in: you can use qualitative user research techniques to understand your users, why they are behaving the way they do, and where their attitudes and opinions come from.
Thinking about efficiency and effectiveness is really about being able to zoom in and out. Look at the bigger picture as well as the detail, and go back and forth to create that holistic view of what’s going on. The macro might be understanding the landscape: the context your users are living and working in, the context they’re using your product or service in, what their needs and goals are, what is helping them and what the barriers to achieving their goals are. The micro can zoom right down to whether the interaction on this component, in this step of the digital service, works. Is it accessible? Do people understand it? Can they use it? And you need to go back and forth iteratively to make sure you are doing the right thing and you are doing the thing right.
Appeasing the skeptics
Those who are new to UX, and those who are skeptical of it, often aren’t comfortable with the small numbers involved in qualitative research and data: for example, making design decisions after observing 5 people’s behaviour in a lab, although the data shows this is a perfectly reasonable thing to do. Getting quantitative data to back it up can really help your team and stakeholders on the journey to trusting the qualitative. It’s not just about appeasing people; as we’ll see later, combining quant and qual can help with leaner, more efficient working and better decision making.
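The “5 users is enough” claim comes from Nielsen and Landauer’s problem-discovery model. As a rough sketch (assuming each participant uncovers around 31% of the usability problems present, which is Nielsen’s often-quoted estimate rather than a figure from this talk), the numbers look like this:

```python
# Nielsen/Landauer problem-discovery model: each participant is assumed to
# reveal a proportion p of the usability problems present (p ~ 0.31 is
# Nielsen's commonly quoted estimate; your own p may differ).

def proportion_of_problems_found(n_participants: int, p: float = 0.31) -> float:
    """Expected share of usability problems uncovered by n participants."""
    return 1 - (1 - p) ** n_participants

for n in (1, 3, 5, 10):
    print(f"{n} participants: ~{proportion_of_problems_found(n):.0%} of problems found")

# Under this model, 5 participants already surface roughly 84% of the problems,
# which is why small qualitative samples are a reasonable basis for decisions.
```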
Combining different data sources
First things first. What are your objectives?
What to measure? Why to measure it?
More ≠ More
Prioritising what you gather: you don’t want too much noise, so minimise it to maximise the signal. You should only gather the data you need, because gathering and analysing data can be time consuming.
There is also an ethical consideration — don’t gather data you don’t need.
For a website or digital service to meet its targets, it’s important that teams can use the data they gather in meaningful ways.
Questions to ask before you gather more data
- What are your objectives?
- What data do you already have?
- What data do others have?
- What do you already know?
- Is the data you have appropriate for your needs?
- What gaps do you have?
- What are you trying to understand?
This goes back to: what do you have and what do you need?
These are questions you can ask before you do any original research, whether it’s quantitative or qualitative.
There are extremely useful methodologies you can use to help your team decide what data you should be gathering.
Performance frameworks
When I worked in government, performance frameworks were relatively common, and increasing in popularity as a way for the whole team to agree on what they needed to do.
To know if a service or website is meeting its targets, you need to set performance goals and know how to measure them; a performance framework is a tool to help facilitate this.
You can read all about it here: https://dataingovernment.blog.gov.uk/2016/11/02/setting-up-a-performance-framework-for-the-uk-parliament-website/
Google’s HEART framework
The idea of the HEART framework is to identify and adopt user-centric metrics to measure UX at large scale, which in turn helps you make evidence-based decisions in your product development cycle. It focuses very much on quantitative data, and it’s useful for helping you understand what you need to know and measure, and whether metrics are appropriate for the work you are doing. What you measure and how you measure it is really up to you.
You can read all about it here: https://www.interaction-design.org/literature/article/google-s-heart-framework-for-measuring-ux
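As an illustration of the framework’s goals, signals and metrics structure (only the five dimension names come from HEART; the goals, signals and metrics below are invented examples, not recommendations), a team’s HEART worksheet might be as simple as this:

```python
# A hypothetical HEART worksheet for a service. Only the dimension names come
# from Google's framework; everything else is a made-up example.
heart_metrics = {
    "Happiness": {
        "goal": "Users find the service easy and pleasant to use",
        "signal": "Post-task survey responses",
        "metric": "Average satisfaction score per quarter",
    },
    "Engagement": {
        "goal": "Users come back when they need us",
        "signal": "Repeat visits",
        "metric": "Visits per user per month",
    },
    "Adoption": {
        "goal": "New users start using the new feature",
        "signal": "First-time use of the feature",
        "metric": "% of new users who try it within 7 days",
    },
    "Retention": {
        "goal": "Users keep using the service over time",
        "signal": "Accounts still active after sign-up",
        "metric": "90-day retention rate",
    },
    "Task success": {
        "goal": "Users complete their key tasks",
        "signal": "Task completion events and errors",
        "metric": "Completion rate and error rate per key task",
    },
}
```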
Research strategy
As with many things in UX and UR, there isn’t a one-size-fits-all answer. That would definitely make things a lot simpler and easier, but unfortunately it’s not the way the world works.
So here is another option!
This is something that I have never seen enough of! But maybe that’s just the places I’ve worked, although even at GDS, the user-centred part of UK government, research strategies weren’t that common.
This can be done at multiple levels (organisation, department, programme) and it’s a useful way to agree, and then track, who is working on what and how they are doing it.
Productive roadmaps
(Image source: https://uxstudioteam.com/ux-blog/ux-roadmap/)
Create roadmaps that are focussed on a problem or area of interest rather than on a solution, and incorporate the research and data gathering being done into the product roadmap.
We’re tracking everything, right?
Proper set up is key to tracking what you need!
Know what you need to measure and then work with your quants and devs to set it up properly. Don’t assume it’s already being tracked. I’ve heard so many stories from performance analysts who’ve had product managers urgently asking them for data they are not collecting!
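One lightweight way to do this is to agree a tracking plan with your performance analyst and developers before anything ships. Here’s a hypothetical sketch; the event names and properties are invented for illustration, not taken from any particular analytics tool:

```python
# A minimal, hypothetical tracking plan: which events to capture, which
# properties to attach, and which question each one answers.
tracking_plan = [
    {"event": "task_started",
     "properties": ["task_name", "entry_point"],
     "answers": "How many people attempt each key task, and from where?"},
    {"event": "task_completed",
     "properties": ["task_name", "time_taken_seconds"],
     "answers": "What is the completion rate and how long does the task take?"},
    {"event": "error_shown",
     "properties": ["task_name", "error_type", "step"],
     "answers": "Where and why are people getting stuck?"},
]

# Review this with the whole team before launch, so the data already exists
# when someone asks for it urgently.
```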
How and when to combine your data sources
Looking at the design and development lifecycle makes it easier to see which method is best for the type of questions we need to answer at each stage, whether that’s discovering, defining, developing or delivering.
From this we can see that in each situation we can have a combination of quantitative and qualitative data, and a combination of attitudinal and behavioural data. A more detailed version of a diagram like this lists the different methodologies that can be used in the generative, formative and summative stages. For example, it would show that in the generative stage, when defining the problem, you wouldn’t just use usability testing of an existing product; you could combine ethnographic techniques with surveys.
As a general rule I work by, attitudinal questions are almost always better answered with quant data, while behavioural questions can be answered with a combination of qual and quant. Because opinions and attitudes can vary so much from person to person, it’s important to get a large sample, whereas behaviour is much more consistent and we can work with smaller sample sizes.
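To give a sense of the difference in scale, here is a back-of-the-envelope sample size calculation for attitudinal (survey) data. It’s a sketch using the standard formula for estimating a proportion, assuming simple random sampling from a large population:

```python
import math

def survey_sample_size(margin_of_error: float = 0.05,
                       confidence_z: float = 1.96,   # z-score for 95% confidence
                       expected_proportion: float = 0.5) -> int:
    """Respondents needed to estimate a proportion within the margin of error."""
    p = expected_proportion
    n = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n)

print(survey_sample_size())      # ~385 respondents for +/-5% at 95% confidence
print(survey_sample_size(0.03))  # ~1068 respondents for +/-3%

# Hundreds of responses for attitudes and opinions, versus a handful of
# observed participants for behaviour.
```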
Case study
Here is an example of why it’s useful to combine quant and qual.
You can watch 5 people drive over a pothole and identify that it is a problem and why it’s a problem: this is your qualitative research. Equally, you might have observed in usability testing that people struggle to do a task when using your product or service.
But what is the scale of the problem? Should we invest in fixing the road?
Quant can tell us about the amount of traffic on the road: is it heavily used, or does it only see a couple of cars a week?
Is the task people are struggling with on your product or service getting a lot of traffic? Is this one of your most common tasks?
(Image source: https://www.symberity.co.uk/conversion-rate-optimisation/)
You can also do this the other way round.
You can see in your analytics that something untoward is happening, for example poor shopping cart to purchase conversion, but it isn’t clear why this is happening.
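For example (with invented numbers), a quick funnel check over raw event counts will show you where the drop-off is, but not why it happens:

```python
# Hypothetical event counts from your analytics tool; the figures are invented.
funnel = [
    ("product_viewed", 12_000),
    ("added_to_cart", 3_100),
    ("checkout_started", 1_400),
    ("purchase_completed", 380),
]

# Conversion rate between each pair of adjacent steps.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    print(f"{step} -> {next_step}: {next_count / count:.1%}")

# A sharp drop between checkout_started and purchase_completed tells you where
# to point your usability testing, not why people abandon.
```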
(Image source: Testingtime.com)
Qualitative data describes in words rather than measuring in numbers. A user researcher interviews and observes people in order to better understand how they use the things that we make and why they might have problems.
You can do usability testing to observe possible reasons for the poor conversion: points where people are struggling, expectations that aren’t being met, and so on.
It’s not necessarily easy!
I’m not saying it’s easy, just that you should combine your data! It’s very common for quant and qual to be siloed in an organisation, so you’ll have to overcome a certain amount of inertia and organisational structure to make this happen.
It took impending Brexit to force GDS to be more agile, more focussed and lean, and to deliver our work at a faster pace. It forced better collaboration between quant and qual people.
Showing the value of your UX teams
(Image source: https://developers.google.com/web/fundamentals/performance/user-centric-performance-metrics)
You can use benchmarking, both quant and qual, on a regular basis to show the impact of improvements, fixes and new features, and to show where you still need to focus on your key tasks.
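As a sketch of what that might look like for one key task (the figures are invented), the same measures taken before and after a fix let the team show the impact of its work:

```python
# Hypothetical quarterly benchmark for one key task, mixing behavioural
# measures (completion rate, time on task) with an attitudinal one (satisfaction).
benchmark = {
    "Q1 (before fix)": {"completion_rate": 0.62, "median_time_s": 310, "satisfaction": 3.1},
    "Q2 (after fix)":  {"completion_rate": 0.81, "median_time_s": 195, "satisfaction": 4.0},
}

before, after = benchmark.values()
print(f"Completion rate: {before['completion_rate']:.0%} -> {after['completion_rate']:.0%}")
print(f"Median time on task: {before['median_time_s']}s -> {after['median_time_s']}s")
print(f"Satisfaction (1-5): {before['satisfaction']} -> {after['satisfaction']}")
```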
Things to consider
Legal requirements. Ethical considerations.
GDPR, if you are working in the EU! Springer Nature has adopted it as a global standard.
Data ethics: just because you can collect everything doesn’t mean you should. The UK Government Data Ethics Framework is simple but effective:
- Start with clear user need and public benefit
- Use data and tools which have the minimum intrusion necessary
- Create robust data science models
- Be alert to public perceptions
- Be as open and accountable as possible
- Keep data secure
Effective and efficient use of data
How are you going to securely store your data, while also making it accessible so you can share the insight you’ve gained from it?
How you communicate the data is also important; that will depend on what data you have gathered, at what level, and who the audience is. Different data will also need to be stored in different ways.
Behavioural data and actionable insight, however gathered, tend to have longevity and be applicable at the organisational level. This kind of data is triangulated and corroborated with multiple data sets across the organisation’s departments and teams, and it should be accessible to everyone to use in their work.
Granular findings are applicable to the team: they are highly valuable to the team, but not necessarily to the programme or organisation, and only for a short period, for example while iterating your prototypes or your live product. The distilled truths will live in your Design System (if you have one), but this data won’t necessarily get shared with everyone. Everything needs to be findable by the right people: that is the key, and it’s also a difficult problem to solve.
With data it’s important to have push and pull: people should be able to find the data, and you should also tell them what you want them to know about it, so it’s not taken out of context and misused.
Invest in data literacy training; numbers can be scary. When I was working in government, research was done to understand who was using what kinds of quantitative data and how confident they were using it. It turned out there were a lot of people across a whole range of jobs using data to make decisions, and many of them didn’t feel confident that they understood the data or that they could assess its quality.
What tools are you going to use, and how much do they cost? No one’s resources and budget are limitless; sometimes you will need to prioritise what data you gather depending on what tools you are able to use. Peter showed us a variety of tools for different types of research. Do your homework and compare tools: what will give you the most value for the budget? Which tools are integrated with each other? What you want to avoid is creating digital silos where data is trapped in lots of separate tools.
Final thought
Whatever you do, do it iteratively.