How the king of usability became vulnerable to naive tech optimism
The curious case of Jakob Nielsen behaving like a tech bro around accessibility
That statement sounds like something a hotshot Silicon Valley entrepreneur (i.e., a Tech Bro) would say. As a result, there's been speculation about whether someone is ghostwriting for Nielsen (or even whether he's mentally well).
Rather than reacting immediately to the article, I dug into the subject to understand what was happening here. What I found isn't a defense of his (weak) argument.
Instead, I found tech optimism around a long-standing dream of Silicon Valley: personalized, accessible interfaces made possible through a new tool.
Personalized user interfaces are one of Silicon Valley's dreams
In "The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future," Kevin Kelly describes one of the futures Silicon Valley wants: personalized user interfaces.
They aim to create completely personalized interfaces that customize every web page the user visits.
We're there, partially. For example, when we load up our home page, we may see stock prices and sports scores, while we've chosen to hide celebrity gossip and news.
We may receive targeted and personalized ads, specialized coupons, and offers via e-mail. In addition, the social media feeds of two people are likely to be completely different.
When we consider that this is the future tech bros want (including Nielsen, apparently), our current accessibility efforts seem hopelessly outdated by comparison. Standardized, limited versions of pages, designed to adhere to guidelines and checklists, make accessible interfaces sound inferior to regular ones.
Nielsen also highlighted that it's nearly impossible to make significant improvements. That's because accessibility encompasses the needs of multiple user groups, including:
Blind users
Low vision users
Senior citizens
Deaf users
Low mobility users (e.g., shaking hands or limited use of a mouse)
etc.
It's too expensive for businesses to try to cater to all of these different user groups' needs. That's why he argues accessibility has been broken for nearly 30 years, and why he believes Generative AI can help.
This is where we can see some parallels to his work in the 90s (and why I don't think someone is ghostwriting for him):
His revolutionary approach in the 1990s, called "discount usability engineering," was a cheap way to get effective user feedback (with just five users) compared to fancier and more expensive user tests.
Now, he's suggesting a revolutionary approach of "Generative AI" as a cheap and effective way to provide better accessibility to vulnerable populations, compared to expensive 508 compliance testing, standardized checklists, and limited interfaces.
At first glance, it sounds reasonable. After all, AI's promise of quickly generating personalized user interfaces sounds like a better alternative to struggling with accessibility methods.
However, he's throwing the baby out with the bathwater for one reason: Nielsen has forgotten what UX life is like in less mature organizations.
Let me give you a snapshot of life on the other side of the tracks.
Life on the lower UX Maturity organizations
I spent my career working in healthcare and federal UX, which means I fit on the lower side of the Nielsen Norman Group's fancy UX maturity chart.
As a quick reminder, this is a way to describe UX's role in particular organizations, ranging from 1–6 (with 6 being the best).
I bring up the maturity model to highlight why Nielsen's argument is misguided: he (and his organization) probably works with companies somewhere on the 3–6 scale.
The Nielsen Norman Group charges a hefty fee for their workshops, consultations, and other services. When businesses are willing to pay $500+/day per person for UX workshops or spend $200,000+ on UX consultants (from what I've seen), they're pretty invested in UX.
Yet these organizations may not be where accessibility struggles the most. Consider the places where accessibility matters most, including:
Library websites
Federal websites
Hospital/urgent care websites
Public utility websites (e.g., electric, gas)
School websites
etc.
These are the places that need accessibility the most but are least able to pay for it. In my experience working for them, they tend to fall on the 1–4 end of the maturity scale, so resources are scarce.
You might say that Generative AI like ChatGPT and Midjourney is free or inexpensive. While that's true, knowledge isn't: learning AI tools takes time.
If we're also to believe Nielsen when he says, "There will be a million companies demanding Designers with AI experience by 2025," why would people work in these environments when they could be making twice as much elsewhere?
This is why using Generative AI for accessibility is misguided at best. The work may not look flashy (especially when resources are scarce), but we've made serious progress on accessibility in the past few decades.
One of the most intelligent people I know, a blind Iranian Ph.D. graduate, has read, organized, and synthesized hundreds of papers through screen-reader-accessible PDFs, a recent invention.
I've helped design new accessible software for stressed medical professionals and frustrated senior citizens, and I've seen how 508 compliance results in actual emotional relief.
So, throwing all that progress out for a shiny new thing with potential (AI), one that's probably out of the price range of the organizations that need accessibility the most, is a slap in the face.
I've seen this optimism in tech bros who believe software can solve everything. I never thought Jakob Nielsen was one of them, but apparently, even he can fall prey to this.
Complex fields seem simple and easy to fix from the outside
I work in Healthcare UX, an ugly, complicated mess of a field at the best of times. One of my favorite quotes about it comes from Chris Kiess, who says, "Healthcare UX is about a decade behind in design compared to other fields."
Every year, several ambitious tech startups try to break into healthcare and 'redefine everything.' Most fail, often for simple reasons, like not having a doctor on staff, which means no medical professional is likely to talk with them.
This isn't just limited to Healthcare. One of my earliest favorite Medium articles was about how reforming government seems easy when you're on the outside.
Accessibility is a constant struggle: obtaining resources, implementing standards, and providing limited but functional interfaces to vulnerable populations.
That doesn't mean that we should believe the 'tech saviors' that AI can solve everything. It might, one day, but we're not at that stage.
When might we reach that stage? When we can answer this fundamental question about AI.
Imagine that AI generates a personalized interface that instructs a blind person to take the wrong medication dosage (3000 mg instead of 30 mg). The person follows those instructions and seriously damages their health (like liver failure). Who's at fault, and what can you do to prevent that from happening?
Until we have an answer to those questions, AI cannot solve accessibility struggles.
I'm finishing up a redesigned Maven course on Data Informed Design. If you want to learn this valuable skill, consider joining the waitlist.
Kai Wong is a Senior Product Designer and writer for the Data and Design newsletter. His book, Data-Informed UX Design, provides 21 small changes you can make to your design process to leverage the power of data and design.