It’s not your design content that’s getting rejected: it’s your delivery
Use the “Doubtful Stakeholder” exercise to test whether your explanations make sense

Getting harshly rejected by doubtful stakeholders taught me the importance of translating design work for its audience.
I’ve heard it all, from being told, “I talked to the wrong users who didn’t know anything,” to getting polite nods and a “Just work on what I tell you, and we’ll tackle that next release.”
While it hurt in the moment, it also taught me a framework for double-checking whether what you’re telling your audience makes sense to them.
If you’ve ever had your findings rejected, the problem isn’t always the content you’ve found. The problem is often how you deliver the news.
UX often has a translation problem
UX often runs into communication issues because it deals with qualitative data.
Most people in your organization work with quantitative data. Whether it’s Executives looking at KPIs and Metrics, Sales teams reaching quarterly goals, or Data Scientists doing statistical analysis, many of them have a number-oriented mindset.
On the other hand, UX is one of the few departments that looks at qualitative data and tries to base its recommendations on it. The problem is that bridging the gap between qualitative and quantitative isn’t always easy.
If you were to say something like “3/5 users found task 4 frustrating,” here are the questions your audience has to answer:
What does “frustrating” mean? Frustration is an emotion, meaning it’s not easily quantified, and no one knows its impact.
What does 3/5 users mean? That’s a small number of users that you talked to. It’s too small for any statistical analysis (i.e., we can’t say 60% of all users; see the sketch after this list).
What does task 4 mean? Did you explain to your audience how you broke testing up into specific tasks and what task 4 was about?
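To make the second point concrete, here’s a minimal sketch (my own illustration, not something the exercise prescribes) of what statistics can actually claim from 3/5 users. A standard Wilson score interval for 3 successes out of 5 puts the true proportion anywhere between roughly 23% and 88%, which is exactly why we can’t say “60% of all users”:

```python
# Minimal sketch: why "3/5 users" can't be read as "60% of all users".
# The 95% Wilson score interval for 3 successes in 5 trials is enormous.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    margin = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

low, high = wilson_interval(3, 5)
print(f"3/5 users -> anywhere from {low:.0%} to {high:.0%} of all users")
# Prints roughly: 3/5 users -> anywhere from 23% to 88% of all users
```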
This is where a lot of uncertainty comes from, and the first question in particular is what keeps your team from taking action. Other people, from the Sales team to Customer Service, have undoubtedly heard customers complain about different aspects of the product.
Unless your feedback conveys the impact of your finding, your team may be unwilling to change.
However, the “Doubtful Stakeholder” exercise is a way to reframe your findings so that what you present is relevant to your team.
Making use of (potentially) traumatic past experiences
Imagine you’re working with a grumpy stakeholder who thinks that UX is a fad (or something equally dismissive). If you want, you can assign them a name based on your personal experiences.
In any case, imagine that they’re going to ask two questions about every single thing you present:
“So what?” (i.e., why does this matter to me?)
“Show me the proof.” (i.e., give me a reason to believe you)
Your recommendation gets rejected immediately if you don’t answer these questions well. Let’s see how this might work with a couple of examples:
“3/5 users accidentally clicked the wrong button while trying to create an account.”
So what? “Well… it’s an error. Errors are bad.” (This fails the first question.)
While the user finding might be valid, our explanation isn’t great. Let’s try explaining it in another way.
“3/5 users are starting the account creation process but accidentally abandoning it due to a misclick.”
So what? “Users aren’t creating accounts due to a usability problem.” (This passes the first question)
What’s the difference? Well, assuming that account creation/user signups are something your team cares about (i.e., a number, metric, or KPI), you’ve defined what you saw not in UX terms but in terms they understand.
This simple reframe is often the difference between a finding being accepted and rejected. Pairing it with the second question, “Show me the proof,” makes it even more powerful.
Evidence-based recommendations inspire your team to action
“Show me the proof” is often where I point to the Data-Informed Design process: backing your recommendations up with Data is the most powerful way to answer this question.
But answering “Show me the proof” doesn’t always have to involve data. Let’s take another look at that statement:
“3/5 users are starting the account creation process but accidentally abandoning it due to a misclick.”
Focus on the first part of that statement and ask one question: what sort of justification can you give other than “x/5 users did something”?
Using Analytics Data, I could show that we had a 52% abandonment rate in the account creation workflow.
But while that’s often a strong source of proof, that’s not the only one. Here are some other ways we might show this:
Showing a highlight reel/video of people clicking on the wrong button during testing
Having a specific quote from a user about that issue (“You know, if that happened at home, I would have just abandoned the site.”)
Customer Service complaints (i.e., you talked with your customer support department, and they say this is the 3rd most common issue people call in about)
Funnel Data (Analytics or talking with Marketing teams): You see that 1 million people land on the homepage, but only 20,000 take the next step of creating an account (see the sketch after this list)
etc.
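To show how simple the funnel math behind that last point can be, here’s a rough sketch. The step names and intermediate counts are hypothetical stand-ins for an analytics export; only the endpoints echo the 1 million / 20,000 example above, and the final step happens to mirror the 52% abandonment figure mentioned earlier:

```python
# Rough sketch of funnel drop-off math. Step names and intermediate counts
# are hypothetical; the endpoints echo the 1,000,000 -> 20,000 example.
funnel = [
    ("Landed on homepage", 1_000_000),
    ("Clicked 'Create account'", 60_000),
    ("Started account creation", 42_000),
    ("Finished account creation", 20_000),
]

# Compare each step with the next one to find where people drop off.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    abandonment = 1 - next_count / count
    print(f"{step} -> {next_step}: {abandonment:.0%} abandoned")

overall = funnel[-1][1] / funnel[0][1]
print(f"Overall conversion: {overall:.1%}")  # ~2.0% of homepage visitors sign up
```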
The revised statement might look like this after the “Doubtful Stakeholder” exercise:
“Users are starting the account creation process but accidentally abandoning it due to a misclick. This may explain why only 20,000 users create an account, despite the high traffic we get on the home page.”
So what? “Users aren’t creating accounts due to usability problems, which may explain why many users get stuck in the funnel.”
Show me the proof. We have what users are doing (i.e., only 20,000 users out of a million convert), and we have a possible why (i.e., this might be a usability problem).
Phrasing things like this is often the difference between your team accepting or rejecting your recommendations.
It’s not your content, it’s the delivery
I’m increasingly thankful that my experiences working in organizations with low UX maturity taught me that the traditional ways don’t work.
Those traditional ways often involve a UX Champion convincing your entire team (or organization) of the value of UX: a large-scale effort that’s unlikely to happen and that might collapse at any time if that person leaves.
This is why I advocate that Designers learn just enough about Data to help bridge the gap between qualitative and quantitative viewpoints.
By phrasing what you see qualitatively in terms that your team can easily understand, you’ll effectively convey why something needs to change.
Doing this is often the difference between making an impact and struggling to make things work.
Kai Wong is a Senior Product Designer and Creator of the Data and Design newsletter. His course, Data-Informed Design, shows you how to work more effectively and complete design projects faster without sacrificing quality.