Understanding Data-Informed Design, the job niche that's kept me in demand
A rare and valuable job niche might be easier to learn than you think
Photo by Mikael Blomkvist: https://www.pexels.com/photo/a-group-of-people-discussing-charts-6476260/
I was recently a subject matter expert and writer on an upcoming Data-Driven Design course, and I realized during that process that I never really explained the “Data” part of the Data & Design newsletter.
While it initially stood for ideas like “Data Science” or “Data Visualization,” more recently it’s stood for Data-Informed Design, which has become a valuable niche for me.
It’s kept me employed and thriving in the face of recent tech layoffs, and it’s a job niche that I don’t see many designers embracing. In my free e-book, The Resilient UX Professional, I mention that narrowing your focus through niches is an effective way to tailor your skillset and find jobs you’re a strong candidate for.
So I thought I would walk through one of my niches: what it is and how you can start using it. To understand Data-Informed Design, however, we must first confront a boogeyman many of you may have: metrics.
Data-Informed Design isn’t Data-driven design (thankfully)
The simple fact is that Data-Informed Design often carries a stigma simply by association with similar-sounding terms.
The book Designing with Data, by Rochelle King and Caitlin Tan, discusses three related (but distinct) concepts around data:
Data-Driven Design
Data-Informed Design
Data-Aware Design
Understanding the difference between these terms is crucial because one of them in particular tends to provoke strong negative reactions: Data-Driven Design.
When I say Data-Driven Design, some of you might be conjuring up negative experiences: stakeholders who say we don’t need user research (because we have analytics), metrics-obsessed Product Managers, or a complete lack of creative freedom in design.
Data-Driven Design, in other words, sounds like businesspeople trying to ‘summarize and define’ user research and design through numbers. This wouldn’t be the first time: the Net Promoter Score (NPS) was an attempt to summarize all customer feedback and loyalty into a single number.
However, in actuality, Data-Driven Design is usually a rarity due to its testing method: A/B testing. In essence, Data-Driven Design is where you create two (often similar) designs, route user traffic through them, and choose the design that ‘wins’ on a specific metric.
For example, Data-Driven Design might deploy an A/B test of two home pages, with a signup form in two different places. Whichever page gets more signups (i.e., a ‘User Adoption’ metric) is declared the winner.
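As a rough illustration (the numbers and function below are made up, not from any real product), here’s how a team might score such an A/B test in Python, including a basic check that the observed difference isn’t just noise:

```python
import math

def two_proportion_z_test(signups_a, visitors_a, signups_b, visitors_b):
    """Compare signup conversion between two page variants (a simple A/B test).

    Returns both conversion rates and a two-sided p-value for the difference,
    using a pooled two-proportion z-test (a common, basic choice).
    """
    p_a = signups_a / visitors_a
    p_b = signups_b / visitors_b
    # Pooled proportion under the null hypothesis (no real difference)
    p_pool = (signups_a + signups_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical run: variant B moved the signup form above the fold
p_a, p_b, p = two_proportion_z_test(120, 2400, 156, 2400)
```

A small p-value (conventionally below 0.05) suggests the winning variant really is better on that metric, rather than the result of random traffic fluctuation.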
Source: https://instapage.com/what-is-ab-testing/
However, the A/B test also shows the limited scope of Data-Driven Design: most of the time, these tiny changes slowly iterate toward an incrementally better page. As a result, this approach is not helpful for a complete re-design unless you’re willing to have tons of A/B tests that fail.
This fear of failure is why only data-centric organizations, which are committed to changing based on data (even if it requires a lot of failed experiments), tend to adopt this approach.
This is not even mentioning that designs rarely ‘win’ cleanly: most of the time, one design is better for specific metrics (like time on task) but worse for others (like User Engagement).
So if we’re not doing Data-Driven Design, what exactly are we doing? This is where the other two concepts come into play, Data-Informed and Data-Aware.
Data-Aware Design, or the ideal goal
Data-Aware Design, in this case, is an ideal we should be striving towards (though it might not be easy in reality). This is a process where UX is integrated with the other teams that collect data in your organization, to ensure you’re gathering the data you care about on a larger scale.
The idea is to work with your team to improve your data collection across multiple methods so that you can ask better questions (and gather better data) overall.
For example, imagine you weren’t getting the best responses from Customer Support tickets. Rather than working with what you have (as in the Data-Informed approach), you would work with the Customer Support team to provide a list of questions designed not just to diagnose the problem but to dig deeper into user motivations and frustrations.
You would provide these questions in customer surveys, Customer Support tickets, and more. However, this great effort usually requires an organization that is comfortable with UX (and has high UX maturity).
Data-Informed Design is a happy medium that’s easy to access
If Data-Driven and Data-Aware Design are the two extremes of using data, then Data-Informed Design is often a baby step for many companies (and one of the most straightforward skills to pick up).
Source: https://uxdesign.cc/moving-from-being-data-driven-to-data-informed-f325b55917e3
Essentially, we follow our normal Design process, which includes doing user testing and research, but we look to other additional Data resources for insights, such as:
User Analytics of our product (including demographics, engagement, and more)
Annual customer surveys, which often voice general opinions and habits
Customer Support tickets, which reveal the most common pain points and frustrations users call in to resolve
You can see how having this additional information (and data sources) might change your approach to user research. For example, you might not need to validate your persona with user testing if you have user analytics. Or, you might see how everyday user habits (and sentiments) around another product might translate to yours (i.e., a user’s mental model).
However, the user testing and design are still primarily focused on the methods and research questions you have learned as a designer. You’re just listening and looking at various sources, not just user interviews and testing, to decide.
This allows you to get insights from the 10,000-foot top-down view that many executives (and business people) look at. This provides you with three main benefits:
Data helps you ask better questions:
Data is often the answer to questions like “How many users encounter this?” or “How often does this occur?”. Whether it’s analytics covering 100,000 users or a customer survey with 100 responses, these sources can often answer such questions directly.
By understanding these answers, you can stop asking more generalized questions (i.e., “Where are users running into issues?”) and start asking more targeted questions (i.e., “How might we design a better checkout page?”)
Data helps you prioritize your ideas:
User testing often brings back ambiguous results: if half your users think one feature is a top priority, and half think something else is, how do you choose what to design?
The answer is to look at the data. Whether by designing A/B tests or working with existing analytics data, the numbers can help determine your top priorities.
Data helps persuade stakeholders by speaking their language:
There’s no getting around it: your team is probably more familiar with business than design terms. Data-Informed Design creates a bridge between Business and Design by showing how design changes can impact business metrics.
Doing so makes a persuasive argument that design (and qualitative user research) alone can’t: the data corroborates what we’re saying from a design perspective.
For example, imagine you were asked to create an onboarding tutorial for a new website. However, you quickly realize that’s not the right solution: the team should simplify the home page so that it’s intuitive to users, rather than spend money explaining a non-intuitive design.
However, from a business perspective, there are three main reasons for them to reject this:
The project scope, goals, and budget are for an onboarding tutorial
Re-designs take a lot of effort, time, and money
Addressing the home page requires gathering tons of stakeholders, as it will affect a lot of projects
But before you give up, let’s look at how data can help you. If you can look at website analytics and see that many users abandon the site after landing on the home page, spending less than 5 seconds there, you can make a much stronger case for your argument.
After all, if users aren’t spending a ton of time before leaving, they won’t even see the onboarding tutorial (most likely). In addition, you should also conduct user research to understand some of the most common usability issues that are most easily solved.
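As a sketch of that analytics check (the session format and numbers here are invented for illustration; your analytics tool likely reports this directly), you could compute the share of home-page landings that bounce in under 5 seconds like this:

```python
def home_page_abandonment(sessions, threshold_seconds=5):
    """Share of home-page landings that bounce in under `threshold_seconds`.

    `sessions` is a list of dicts like
    {"landing": "home", "seconds_on_page": 3.2, "pages_viewed": 1}.
    """
    landed = [s for s in sessions if s["landing"] == "home"]
    bounced = [
        s for s in landed
        if s["pages_viewed"] == 1 and s["seconds_on_page"] < threshold_seconds
    ]
    return len(bounced) / len(landed) if landed else 0.0

# Made-up session records for illustration
sessions = [
    {"landing": "home", "seconds_on_page": 2.1, "pages_viewed": 1},
    {"landing": "home", "seconds_on_page": 4.0, "pages_viewed": 1},
    {"landing": "home", "seconds_on_page": 42.0, "pages_viewed": 5},
    {"landing": "pricing", "seconds_on_page": 12.0, "pages_viewed": 2},
]
rate = home_page_abandonment(sessions)  # 2 of the 3 home landings bounce
```

A number like this, pulled from real analytics, is exactly the kind of evidence that makes the “users never stay long enough to see a tutorial” argument land with stakeholders.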
With both of these tools, you have a better chance of persuading stakeholders with a few key points:
Here is the Data (Analytics) that show why the onboarding tutorial won’t solve the issue
Here are some high-priority design changes (i.e., not a complete re-design) that will have a significant impact on usability
With this, I hope you can also see why Data-Informed Design is one of the most manageable steps to take as an individual designer.
Data-Driven Design requires an organization to be data-centric and willing to make changes based solely on the data (within a limited scope).
Data-Aware Design is about affecting the entire organization’s data collection methods to generate widespread awareness and gather better data across the whole organization.
Data-Informed Design, however, is simply about looking at other familiar data sources and integrating them into your existing user research (and design) methods. So it’s pretty easy to get started with.
In addition, Data-Informed Design tends to be very persuasive, especially against a common problem you may face: the Highest Paid Person’s Opinion (HiPPO).
Data beats opinions, which means it’s been my saving grace when designs are in danger of being derailed by an executive. From a purely qualitative perspective, it can be hard to argue why someone should base a million-dollar decision on the actions of a few users.
Of course, it can be done, and many people more persuasive (and more innovative) than me often do this. However, here’s the thing about the data you’ll be looking through: it already exists, so you don’t need to create it.
This means that rather than saying something like, “Give me a week to run a usability test and get results,” you can go, “Let me check our analytics data and get back to you in a few hours,” to back up your arguments.
Hopefully, I’ve done enough to convince you that Data-Informed Design is worth pursuing. Now, how exactly do you start learning or using it? I wrote a book on this for more detail, but let’s go over it briefly, because you probably learned the foundation of it in school.
The Hypothesis: the basis of all Data-Informed Design
I’m not sure when some of you may have encountered this, but hopefully, you encountered the concept of the Scientific Method during middle school.
Source: https://www.thoughtco.com/scientific-method-p2-373335
This process was the basis of creating a scientific experiment, testing it, and figuring out whether you succeeded. However, we’re concerned with one of the earliest steps, the Hypothesis, because it’s the basis of nearly all Data-Informed Design.
Very simply, if you can fill in the hypothesis template below (and know how to test it), you’re doing Data-Informed Design:
If we do [X], we believe users will do [Y] because of [Z] reasons. We’ll know that it’s true if [metric of interest A] is impacted.
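Purely as an illustration (this code and its field names are my own, not from the course or book mentioned above), the template can be captured as a small Python record that a team fills in and reviews before testing:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    change: str     # [X]: what we design, framed from the user's perspective
    behavior: str   # [Y]: what we believe users will do
    rationale: str  # [Z]: why we believe they'll do it
    metric: str     # [A]: the metric that will show whether we were right

    def statement(self) -> str:
        """Render the filled-in template as a sentence to share with stakeholders."""
        return (
            f"If we {self.change}, we believe users will {self.behavior} "
            f"because {self.rationale}. We'll know that it's true if "
            f"{self.metric} is impacted."
        )

# A hypothetical, filled-in example
h = Hypothesis(
    change="provide users with filters at the top of the page",
    behavior="use them to narrow search results faster",
    rationale="they currently struggle to scan long result lists",
    metric="average time on task",
)
```

Writing the hypothesis down in a fixed structure like this forces every blank to be filled before testing begins, which is the whole point of the template.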
Let’s break down each of these sections to understand how it works.
If we do [X]: I’m sure most of you can break this down: it’s what you’re designing. However, this is a quick reminder to define the feature from a user perspective, not a business one. Rather than saying, “If we implement the ability to filter by a dropdown list,” it is better to define it as “If we provide users with filters at the top of the page.”
Users will do [Y] because of [Z]: This is one of the first significant changes with Data-Informed Design. We often go into a user test almost entirely blind: we don’t know what users will do, and we don’t want to make any judgments about it beforehand. This can sometimes be a positive, since going in without expectations helps us avoid leading questions.
However, dig a little deeper, and you’ll find that’s not the case. Are you the first company to put a dropdown filter list at the top of the search results page? Probably not.
If you spent a lot of time digging into it, you could probably find a case study that spelled out precisely what doing that resulted in for the users. However, you probably don’t have the time for that. So this is where educated guesses come into play.
What is the most likely thing for users to do, and why is that the case? You don’t have to always be correct in your assumptions: hypotheses are often disproven as much as proven, even when you know the user well. For example, you might have expected that users would love a simplified interface, but they dislike it for removing a lot of user control.
Defining not only what users are doing but why is a powerful statement to make. Making an educated guess about this before user testing is even more powerful, even if it turns out to be incorrect.
Some of the most significant scientific discoveries have happened because hypotheses were proved incorrect, and your process should be no different.
Metric of interest [A] is impacted: This is where the bulk of the learning around Data-Informed Design is likely to take place. One thing you’ll need to uncover is how to express the impact of your designs in terms your team can understand.
For example, it may be completely obvious to you why it’s so troublesome that users are confusing two buttons (like ‘Cancel’, which cancels the current task, vs. ‘Close’, which shuts down the entire process), but linking that to relevant metrics (such as abandonment rate, task completion, and more) is how you ensure stakeholders can see the impact.
The good thing is that many metrics you care about will already exist in analytics tools and other places. To truly understand what each of them means, you may want to do some reading (such as my book), but here’s a quick primer on some of the more common metrics:
Daily Active Users (DAU): How many people log in daily? This is a metric about User Adoption (i.e., How many people use a feature)
Weekly/Monthly Active Users (WAU/MAU): How many people keep logging in over a week or a month? This tends to be a metric around Engagement/Retention (i.e., How many people come back to your website)
Task Completion/Abandonment Rate: How many users can complete a specific task, and how many fail?
Error rate: How many people make mistakes?
Average Time on task/page: How long does it take users to complete a task (or how long do they stay on a page)?
Search/Navigation: How many people use the search bar to find what they’re looking for, and how many use menu navigation?
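To make a couple of these metrics concrete, here’s a small illustrative sketch (the event format and numbers are made up, not from any real analytics tool) of computing DAU and task completion rate from a raw event log:

```python
from collections import defaultdict
from datetime import date

# Made-up analytics events: (user_id, day, action)
events = [
    ("u1", date(2024, 1, 1), "login"),
    ("u2", date(2024, 1, 1), "login"),
    ("u1", date(2024, 1, 2), "login"),
    ("u1", date(2024, 1, 2), "task_complete"),
    ("u2", date(2024, 1, 2), "login"),
    ("u2", date(2024, 1, 2), "task_abandon"),
]

def daily_active_users(events):
    """DAU: count of distinct users seen on each day."""
    by_day = defaultdict(set)
    for user, day, _ in events:
        by_day[day].add(user)
    return {day: len(users) for day, users in by_day.items()}

def task_completion_rate(events):
    """Completions divided by all attempted tasks (completed + abandoned)."""
    completed = sum(1 for _, _, action in events if action == "task_complete")
    abandoned = sum(1 for _, _, action in events if action == "task_abandon")
    attempted = completed + abandoned
    return completed / attempted if attempted else 0.0

dau = daily_active_users(events)
rate = task_completion_rate(events)
```

In practice your analytics platform computes these for you; the sketch is just to show that each metric is an ordinary aggregation over user events, not something mysterious.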
After you’ve created your hypothesis, it’s time to do two things: run it by stakeholders, and validate it through testing. While we often go into user testing a little blind, with a vague goal like “learn how users do things,” having a hypothesis lets us figure out whether our initial assumptions were correct or whether something else is going on.
For example, imagine your hypothesis was: “If we create search filters, we believe users will use them to quickly filter search results to find what they’re looking for. We’ll know this is the case when the average time on task goes down.”
How might that impact your user testing? Well, you might make one of the tasks about searching for a common term so that it will turn up a lot of search results. You may also want to have additional questions that probe into general insights around filtering and sorting.
However, I know that this may raise a few questions, especially around user bias: if you’re going into user testing looking for something specific, won’t that cause you to ignore certain parts of the testing process (or ask leading questions)?
The answer is that it shouldn’t. Yes, there’s always some risk of biasing your user test, but consider the alternative: imagine you go into a user test blind, and six different users tell you six completely different things with no overlap. How will you determine (and prioritize) what’s most important, and how long will that take?
It’s better to come into a user test with a few goals in mind (like you would with most user research plans anyway) and use them to prioritize your testing. If you hear great user feedback about something you weren’t expecting, you can still include it in your presentation. With the hypothesis, though, you’re mainly trying to address the goals you set beforehand.
The benefits of data-informed design
Taking a Data-Informed Design approach offers many benefits for not a lot of additional effort. First and foremost is a more efficient user testing process: by having one big idea you’re testing for, you make it easy to summarize whether it was successful.
In addition, it also allows you to prioritize what recommendations you want to provide to the team based on the data that comes in. This is often incredibly helpful when you encounter a breadth of information instead of a depth. If each user had different opinions, feedback, and results (with little overlap), taking this data-informed approach can help you prioritize what to recommend.
Lastly, and most importantly, Data-Informed Design is a persuasive tool that allows you not only to set expectations but to persuade difficult team members with an augmented process:
Run it by stakeholders to showcase what you’re trying to uncover and how it will impact the business (through metrics)
Narrow down the overall focus of the analysis (after testing)
Give some educated guesses as to what users may be doing and why
Lastly, show how the results of your analysis can relate to the business
These things are some of the fundamental cornerstones of Data-Informed Design, and they’re the part of my skillset that makes me a distinctive designer and has kept me employed.
Data-Informed Design is a competitive advantage that’s pretty easy to learn
I won’t claim to fully understand the job market for new designers, but I will say that knowing Data-Informed Design has kept me highly in demand and employed.
These skills (and this niche) address some of the things businesses care about most, and more importantly, they’re practical persuasion tools that designers can adopt to significant effect.
What’s more, it’s not a skill that takes decades to master. Much of it revolves around using and understanding analytics, and possibly reaching out within your organization to ask for the following:
Getting access to analytics
Organization-wide Customer surveys
Customer support tickets
Other feedback mechanisms
etc.
So if you’ve ever encountered the frustrating reality of hearing good user feedback but being unable to persuade your team (or to quantify the larger impact), it may help you to learn Data-Informed Design. Doing so has helped accelerate my design career: it’s kept me employed and in demand in the face of tech layoffs and a recession.
Kai Wong is a Senior Product Designer, Data-Informed Design Author, and Top Design Writer on Medium. His new free book, The Resilient UX Professional, provides real-world advice to get your first UX job and advance your UX career.