Today’s “DeepSeek Selloff” and What it Means for UX
Today, the stock market dumped shares of many AI companies on the news that DeepSeek V3 is disrupting the AI ecosystem. In this article, we read the tea leaves and suss out what this new development means for UX (and it’s plenty).
AI Applications are where it’s at
Any time foundation vendors get hyper-competitive, prices come down, which fuels the growth of applications built on top of that foundation. This was the case when computer hard drives got commoditized back in the 70s and 80s – over time, fewer and fewer people cared about hard drives and hardware at all, and attention shifted to applications, ROI, and, you guessed it, User Experience. In the AI space, this latest news is a clear signal that value is shifting drastically away from the “foundation layer” – in this case, foundation AI models – and into higher-level abstractions: applications that run on that foundational infrastructure. However, those applications have been slow to arrive. Why?
“Public” AIs (ChatGPT, Claude, Gemini, etc.) are useful, but not necessarily to enterprises
Those of us who have been using AI models for a while know how useful they are – I use AI every single day, multiple times a day. However, that is what I would call “personal use.” Even when AI use happens as part of my work, it is not directly driving specific enterprise applications, and my company does not pay for it.
While I love the AI functionality I am getting from these tools for $20 a month, this is not how AI companies are going to make a killing. To make AI work at scale, we need AI-driven applications for the enterprise. These are a long time coming. There are three main reasons for the slow adoption of these “public” models:
Most enterprise data needs to be private and RBAC’ed; public models make this vague, unreliable, or simply impossible.
“Public” LLMs are too big for most enterprise tasks, which can lead to slow processing, high costs, hallucinations, etc.
Most enterprise tasks require up-to-the-second contextual awareness of the environment incorporated into the analysis. This is hard to do at present, given the security, processing-speed, and cost considerations above.
If I had to make a quick-and-dirty sketch of the ideal enterprise AI application right now, it would be an SLM (small language model) optimized for a very specific niche application that runs on the company’s own servers and uses absolutely reliable RBAC (Role-Based Access Control) to determine who can access what data. That means operationalizing smaller, cheaper, leaner, more thinly sliced private AI models that can be trained, versioned, and retained on common cloud infrastructure at moderate cost. This is where DeepSeek comes in.
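To make the RBAC idea concrete, here is a minimal sketch in Python. Every role, data domain, and function name below is invented for illustration, and `run_slm` merely stubs out whatever inference call a self-hosted small language model would actually expose:

```python
# Sketch: Role-Based Access Control (RBAC) gating what data reaches a
# private, self-hosted language model. All names here are illustrative.

# Map each role to the data domains it may read.
ROLE_PERMISSIONS = {
    "analyst": {"sales", "marketing"},
    "hr_manager": {"hr"},
    "admin": {"sales", "marketing", "hr", "finance"},
}

def authorized_context(role: str, documents: list[dict]) -> list[str]:
    """Return only the documents this role is allowed to feed to the model."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    return [d["text"] for d in documents if d["domain"] in allowed]

def ask_private_model(role: str, question: str, documents: list[dict]) -> str:
    """Assemble a prompt from authorized data only, then call the model."""
    context = authorized_context(role, documents)
    if not context:
        return "Access denied: no data available for this role."
    prompt = "Context:\n" + "\n".join(context) + f"\nQuestion: {question}"
    return run_slm(prompt)

def run_slm(prompt: str) -> str:
    # Stub: a real deployment would invoke the local SLM here.
    return f"[model answer grounded in authorized context: {prompt!r}]"
```

The key design point is that the permission check happens before prompt assembly, so data a role cannot see never enters the model’s context window in the first place.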
DeepSeek uses open-source AI models
DeepSeek uses open-source AI models that can run without specialized hardware, even on commodity AMD chips. In software development, there always comes a point where something becomes open source and accessible enough to run on cheaper, mass-produced machines. This is exactly what happened to Unix, which dominated the OS market for years (centuries, really). Unix required complex hardware architecture and specialized enterprise servers (Sun, IBM, HP) with a huge, expensive support staff. Then Linux came along and cheerfully disrupted this market because it was free, open source, and could run on something as small as a 386 chip (which is, like, so last millennium).
The same thing is now happening to foundational AI models – ChatGPT, Claude, Gemini, etc. are being hugely disrupted because we now have an open-source AI model you can install and run on your own laptop that produces similar “good enough” results on much cheaper hardware.
And it’s happening FAST.
The transition from Unix to Linux took many (many) years. I spoke to someone TODAY who is already running DeepSeek on their home laptop. Imagine what this makes possible: you have your own instance of a private AI model that you can train to behave and think the way you want. You can feed it your latest private data (or not…). In other words, this is the enterprise dream come true: everything they wanted, without waiting 40 years.
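To make the “feed it your latest private data” step concrete, here is a rough sketch of querying a locally hosted open model with fresh private context on every request. The endpoint URL, payload shape, and model tag follow Ollama’s local HTTP API conventions as an assumption – adjust them for whatever local runtime you actually use:

```python
import json
import urllib.request

# Assumption: a local model runtime (e.g., Ollama) listening on this port.
LOCAL_MODEL_URL = "http://localhost:11434/api/generate"

def build_prompt(question: str, fresh_data: list[str]) -> str:
    """Prepend up-to-the-second private data so the model reasons over it."""
    context = "\n".join(f"- {item}" for item in fresh_data)
    return f"Use only this context:\n{context}\n\nQuestion: {question}"

def ask_local_model(question: str, fresh_data: list[str]) -> str:
    """Send the assembled prompt to the locally hosted model and return its reply."""
    payload = json.dumps({
        "model": "deepseek-r1",  # assumed model tag; pick whatever you pulled
        "prompt": build_prompt(question, fresh_data),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        LOCAL_MODEL_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the model runs on your own machine, the private data in `fresh_data` never leaves your network, which is exactly the property enterprises have been waiting for.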
UX for AI is here
This is all insanely fantastic news for UX for AI. All those UX jobs Jakob Nielsen is writing about (https://jakobnielsenphd.substack.com/p/ux-roundup-20250106?open=false#%C2%A7ai-impact-on-ux-jobs)? I think they are coming fast, because this news is going to fuel an incredible rise in AI-driven applications in 2025. That means you, the UX people, need to get going ASAP on deeply understanding this new medium we are working with: what it can and cannot do, and how to make it do what you want. Because AI-driven applications are not programmed; they are taught. You need a new vocabulary, modern UX research and modeling techniques for collaborating successfully with data scientists and engineers, new design patterns to leverage, and new vision prototypes to create – in short, everything you need to lead AI-driven product design projects. The good news is that there is no shortage of growth and learning opportunities. Here are three I would highly recommend:
We have a new UX for AI book coming in April from Wiley, chock full of practical UX skills and frameworks for making AI work for humans. I’ll be sharing the pre-order link in a few weeks.
I will be teaching a UX for AI workshop at SXSW on March 9th.
I will be teaching at the AI Bootcamp for UX Teams (organized by Strat) May 13 - 15 in San Francisco (early-bird pricing is available now).
Happy 2025!
Greg