Finally! After so many years, we’re very happy to launch our new practical guide for designers and managers on how to set up and track design success in your company — with UX scorecards, UX metrics, the entire workflow, and Design KPI trees. Neatly put together by yours truly, Vitaly Friedman.
Video + UX Training
$495.00 (instead of $799.00)
Get Video + UX Training
25 video lessons (8h) + live UX training.
100-day money-back guarantee.
Video only
25 video lessons (8h). Updated yearly.
Also available as a bundle with live UX training.
The Backstory
In many companies, designers are perceived as disruptors rather than enablers. Designers challenge established ways of working. They ask a lot of questions — much-needed ones, but also uncomfortable ones. They focus “too much” on user needs, pushing revenue projections back, often with a long-winded commitment to testing and research and planning and scoping.
Almost every department in almost every company has their own clearly defined objectives, metrics and KPIs. In fact, most departments — from finance to marketing to HR to sales — are remarkably good at visualizing their impact and making it visible throughout the entire organization.
Designing a KPI tree, an example of how to connect business objectives with design initiatives through the lens of design KPIs.
But as designers, we rarely have a set of established Design KPIs that we regularly report to senior management. We don’t have a clear definition of design success. And we rarely measure the impact of our work once it’s launched. So it’s not surprising that most parts of the business barely know what we actually do all day long.
Business wants results. It also wants to do more of what has worked in the past. But it doesn’t want to be disrupted — it wants to disrupt. It wants to reduce time to market and minimize expenses, increase revenue and grow existing business, and find new markets. This requires fast delivery and good execution.
And that’s what we are often supposed to be — good “executors”. Or, to put it differently, “pixel pushers”.
Over the years, I’ve been searching for a way to change that. This brought me to Design KPIs and UX scorecards, and to a workflow for translating business goals into actionable and measurable design initiatives. I had to find a way to explain, visualize, and track the incredible impact that designers have on all parts of the business — from revenue to loyalty to support to delivery.
The results of that journey are now public in our new video course — a practical guide for designers, researchers, and UX leads to measure and visualize UX impact on business.
About The Course
The course dives deep into establishing team-specific design KPIs: how to track them effectively, how to set up ownership, and how to integrate metrics into the design process. You’ll discover how to translate ambiguous objectives into practical design goals, and how to measure design systems and UX research.
Also, we’ll make sense of OKRs, Top Task Analysis, SUS, UMUX-Lite, UEQ, TPI, KPI trees, feedback scoring, gap analysis, and the Kano model — and which UX research methods to choose to get better results.
The setup for the video recordings. Once all content is in place, it’s about time to set up the recording.
A practical guide to UX metrics and Design KPIs
8-hour video course + live UX training.
- 25 chapters (8h), with videos added/updated yearly
- Free preview, examples, templates, workflows
- No subscription: pay once, access forever
- Lifetime access to all videos, slides, and checklists
- Add-on: live UX training, running 2× a year
- Use the code SMASHING at checkout for a friendly discount
Table of Contents
25 chapters, 8 hours, with practical examples, exercises, and everything you need to master the art of measuring UX and design impact. Don’t worry, even if it might seem overwhelming at first, we’ll explore things slowly and thoroughly. Taking 1–2 sessions per week is a perfectly good goal to aim for.
We can’t improve without measuring. That’s why our new video course gives you the tools you need to make sense of it all: user needs, just like business needs.
So, how do we measure UX? Well, let’s find out! We start with a friendly welcome message to the video course, outlining all the fine details we’ll be going through: design impact, business metrics, design metrics, surveys, target times and states, measuring UX in B2B and enterprise contexts, design KPI trees, the Kano model, event storming, choosing metrics, reporting design success — and how to measure design systems and UX research efforts.
Keywords:
Design impact, UX metrics, business goals, articulating design value, real-world examples, showcasing impact, evidence-driven design.
In this segment, we’ll explore how and where we, as UX designers, make an impact within organizations. We’ll explore where we fit in the company structure, how to build strong relationships with colleagues, and how to communicate design value in business terms.
Keywords:
Design impact, design ROI, org charts, stakeholder engagement, business language vs. UX language, Double Diamond vs. Reverse Double Diamond, risk mitigation.
We’ll explore the key business terms and concepts related to measuring business performance. We’ll dive into business strategy and tactics, and unpack the components of OKRs (Objectives and Key Results), KPIs, SMART goals, and metrics.
Keywords:
OKRs, objectives, key results, initiatives, SMART goals, measurable goals, time-bound metrics, goal-setting framework, business objectives.
Businesses often speak of leading and lagging indicators — predictive and retrospective measures of success. Let’s explore what they are and how they are different — and how we can use them to understand the immediate and long-term impact of our UX work.
Keywords:
Leading vs. lagging indicators, cause-and-effect relationship, backwards-looking and forward-looking indicators, signals for future success.
We dive into the world of business metrics, from Monthly Active Users (MAU) to Monthly Recurring Revenue (MRR) to Customer Lifetime Value (CLV), and many other metrics that often find their way to dashboards of senior management.
Also, almost every business measures NPS. Yet NPS has many limitations: it requires a large sample size to be statistically reliable, and what people say and what people do are often very different things. Let’s see what we as designers can do with NPS, and how it relates to our UX work.
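For reference (not course material, just the standard definition), NPS comes from a single 0–10 “how likely are you to recommend us” question: the share of promoters (9–10) minus the share of detractors (0–6). A minimal Python sketch with a deliberately tiny, made-up sample that illustrates the reliability problem:

```python
def nps(ratings: list[int]) -> float:
    """Net Promoter Score: % promoters (ratings 9-10) minus % detractors
    (ratings 0-6), on a scale from -100 to +100."""
    if not ratings:
        raise ValueError("no ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# A made-up sample of 20 responses -- far too small to be statistically
# reliable, which is exactly the limitation mentioned above.
sample = [10, 9, 9, 8, 7, 7, 6, 5, 10, 9, 3, 8, 9, 10, 6, 7, 8, 9, 2, 10]
print(nps(sample))  # 20.0
```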
Keywords:
Business metrics, MAU, MRR, ARR, CLV, ACV, Net Promoter Score, customer loyalty.
We’ll explore the broader context of business metrics, including revenue-related measures like Monthly Recurring Revenue (MRR) and Annual Recurring Revenue (ARR), Customer Lifetime Value (CLV), and churn rate.
We’ll also dive into Customer Satisfaction Score (CSAT) and Customer Effort Score (CES). We’ll discuss how these metrics are calculated, their importance in measuring customer experience, and how they complement other well-known (but not necessarily helpful) business metrics like NPS.
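As a quick illustration (not from the course), CSAT is commonly reported as the share of top-two-box answers on a 1–5 satisfaction scale, and CES as the average of 1–7 “how easy was it” ratings. A small Python sketch with made-up responses:

```python
def csat(scores: list[int]) -> float:
    """CSAT as the share of satisfied responses (4 or 5 on a 1-5 scale), in percent."""
    return 100 * sum(1 for s in scores if s >= 4) / len(scores)

def ces(scores: list[int]) -> float:
    """Customer Effort Score as the mean of 1-7 'how easy was it' ratings."""
    return sum(scores) / len(scores)

# Made-up survey responses
print(csat([5, 4, 4, 3, 5, 2, 4, 5]))  # 75.0
print(ces([6, 7, 5, 4, 6, 7, 3, 6]))   # 5.5
```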
Keywords:
Customer Lifetime Value (CLV), churn rate, Customer Satisfaction Score (CSAT), Customer Effort Score (CES), Net Promoter Score (NPS), Monthly Recurring Revenue (MRR), Annual Recurring Revenue (ARR).
If you are looking for a simple alternative to NPS, feedback scoring and gap analysis might be neat little helpers. They transform qualitative user feedback into quantifiable data, allowing us to track UX improvements over time. Unlike NPS, which focuses on future behavior, feedback scoring looks at past actions and current perceptions.
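There is no single canonical formula for feedback scoring; one possible way to operationalize it (purely illustrative, and not necessarily the scheme used in the course) is to tag each piece of feedback with a theme and a sentiment score, then track per-theme averages over time. A small Python sketch:

```python
from collections import defaultdict

# Each piece of feedback is tagged with a theme and a sentiment score
# from -2 (very negative) to +2 (very positive). All data is made up.
feedback = [
    ("checkout", -2), ("checkout", -1), ("search", +1),
    ("checkout", -2), ("search", +2), ("onboarding", 0),
]

scores_by_theme: dict[str, list[int]] = defaultdict(list)
for theme, score in feedback:
    scores_by_theme[theme].append(score)

# Average score per theme; tracked release over release, the deltas show
# whether UX work on a theme is actually closing the gap.
for theme, scores in sorted(scores_by_theme.items()):
    print(f"{theme}: {sum(scores) / len(scores):+.2f} (n={len(scores)})")
```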
Keywords:
Feedback scoring, gap analysis, qualitative feedback, quantitative data.
We’ll explore the landscape of established and reliable design metrics for tracking and capturing UX in digital products. From task success rate and time on task to System Usability Scale (SUS) to Standardized User Experience Percentile Rank Questionnaire (SUPR-Q) to Accessible Usability Scale (AUS), with an overview of when and how to use each, the drawbacks, and things to keep in mind.
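To make the arithmetic concrete, here is the standard SUS scoring procedure as a small Python sketch (the example responses are made up): odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5 to yield a 0–100 score.

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring: 10 items rated 1-5; odd items contribute (r - 1),
    even items contribute (5 - r); the sum is scaled by 2.5 to a 0-100 range."""
    if len(responses) != 10:
        raise ValueError("SUS expects exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# One made-up respondent
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```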
Keywords:
UX metrics, KPIs, task success rate, time on task, error rates, error recovery, SUS, SUPR-Q.
We’ll continue with slightly shorter alternatives to SUS and SUPR-Q that could be used in a quick email survey or an in-app prompt — UMUX-Lite and Single Ease Question (SEQ). We’ll also explore the “big behemoths” of UX measurements — User Experience Questionnaire (UEQ), Google’s HEART framework, and custom UX measurement surveys — and how to bring key metrics together in one simple UX scorecard tailored to your product’s unique needs.
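As a quick reference (not course material), UMUX-Lite is commonly scored from its two 7-point items, usefulness and ease of use, rescaled to a 0–100 range:

```python
def umux_lite(usefulness: int, ease: int) -> float:
    """UMUX-Lite from its two 7-point items ("capabilities meet my requirements"
    and "easy to use"), rescaled to a 0-100 range."""
    return (usefulness - 1 + ease - 1) / 12 * 100

print(umux_lite(6, 5))  # 75.0
```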
Keywords:
UX metrics, UMUX-Lite, Single Ease Question (SEQ), User Experience Questionnaire (UEQ), HEART framework, UX scorecards.
The most impactful way to measure UX is to study how successful users are in completing their tasks in their common customer journeys. With top tasks analysis, we focus on what matters, and explore task success rates and time on task. We need to identify representative tasks and bring 15–18 users in for testing. Let’s dive into how it all works and some of the important gotchas and takeaways to consider.
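With only 15–18 participants, a raw success rate can be misleading on its own, so it is common to report it together with a confidence interval. A minimal sketch using the adjusted-Wald (Agresti–Coull) interval, a frequent choice for small UX samples (though not necessarily the exact method taught in the course), with hypothetical numbers:

```python
import math

def task_success_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float, float]:
    """Observed task success rate plus an adjusted-Wald (Agresti-Coull) 95%
    confidence interval, a common choice for small UX samples."""
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return successes / n, max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical study: 13 of 16 participants completed the top task
rate, low, high = task_success_ci(successes=13, n=16)
print(f"Success rate {rate:.0%}, 95% CI roughly {low:.0%}-{high:.0%}")
```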
Keywords:
Top task analysis, UX metrics, task success rate, time on task, qualitative testing, 80% success, statistical reliability, baseline testing.
Designing good surveys is hard! We need to be careful about how we shape our questions to avoid biases, how to find the right audience segment and a large enough sample size, and how to achieve high confidence levels and low margins of error. In this chapter, we review best practices and a cheat sheet for better survey design — along with do’s and don’ts on question types, rating scales, and survey pre-testing.
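For a rough sense of how sample size drives the margin of error, here is a small sketch (not from the course) using the standard approximation for a proportion at a 95% confidence level, assuming a simple random sample and the worst case of p = 0.5:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion at a 95% confidence level,
    assuming a simple random sample and a large population."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 100, 400, 1000):
    print(f"n = {n}: ±{margin_of_error(n):.1%}")
```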
Keywords:
Survey design, question types, rating scales, survey length, pre-testing, response rates, statistical significance, sample quality, mean vs. median scores.
The best measurements come from testing with actual users. But what if you don’t have access to any users? Perhaps because of NDAs, IP concerns, lack of clearance, poor availability, the high cost of reaching customers, or simply a lack of users? Let’s explore how we can find a way around such restrictive environments, how to engage with stakeholders, and how we can measure efficiency and failures — and set up UX KPI programs.
Keywords:
B2B, enterprise UX, limited access to users, compliance, legacy systems, desk research, stakeholder engagement, testing proxies, employee UX.
To visualize design impact, we need to connect high-level business objectives with specific design initiatives. To do that, we can build up and present Design KPI trees. From the bottom up, the tree captures user needs, pain points, and insights from research, which inform design initiatives. For each initiative, we define UX metrics to track its impact, and these roll up into higher-level design and business KPIs. Let’s explore how it all works in action and how you can use it in your work.
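To make the structure tangible, here is a tiny, hypothetical KPI tree expressed as a Python data structure; all objectives, initiatives, and metrics below are invented for illustration and not taken from the course:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class KPINode:
    """One node in a design KPI tree: a goal, initiative, or insight plus the
    metric that tracks it, with children rolling up into it."""
    name: str
    metric: str | None = None
    children: list[KPINode] = field(default_factory=list)

# Hypothetical tree: a business objective at the top, design initiatives and
# research insights with their UX metrics rolling up from below.
tree = KPINode("Reduce churn by 10%", "monthly churn rate", [
    KPINode("Simplify the pause/cancel flow", "task success rate for 'pause plan'", [
        KPINode("Users can't find the pause option (research insight)",
                "time to 'pause plan' action"),
    ]),
    KPINode("Improve onboarding for key features", "feature adoption within 14 days"),
])

def print_tree(node: KPINode, depth: int = 0) -> None:
    label = f" [{node.metric}]" if node.metric else ""
    print("  " * depth + f"- {node.name}{label}")
    for child in node.children:
        print_tree(child, depth + 1)

print_tree(tree)
```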
Keywords:
User needs, UX metrics, KPI trees, sub-trees, design initiatives, setting up metrics, measuring and reporting design impact, design workflow, UX metrics graphs, UX planes.
How do we choose the right metrics? Well, we don’t start with metrics. We start by identifying the most critical user needs and assessing the impact of meeting them well. To do that, we apply event storming, mapping users’ critical success moments as they interact with a digital product. Our job, then, is to maximize success, remove frustrations, and pave a clear path towards a successful outcome.
Keywords:
UX mapping, customer journey maps, service blueprints, event storming, stakeholder alignment, collaborative mapping, UX lanes, critical events, user needs vs. business goals.
Once we have a business objective in front of us, we need to choose the design initiatives that are most likely to drive the impact we need to enable with our UX work. To test how effective our design ideas are, we can map them against a Kano model and run a concept testing survey. It gives us the users’ sentiment, which we then need to weigh against business priorities. Let’s see how to do just that.
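As a rough illustration of the mechanics (a simplified take, not the course’s exact evaluation table), the Kano model classifies each feature from a pair of answers: one to a functional question (“How would you feel if the feature were present?”) and one to a dysfunctional question (“… if it were absent?”):

```python
from collections import Counter

def kano_category(functional: str, dysfunctional: str) -> str:
    """A reduced version of the standard Kano evaluation table."""
    if functional == "like" and dysfunctional == "dislike":
        return "performance"
    if functional == "like":
        return "excitement"
    if dysfunctional == "dislike":
        return "threshold"  # a.k.a. must-be: expected, never delightful
    if functional == "dislike":
        return "reverse"
    return "indifferent"

# Made-up answers from four respondents for one feature:
# (feeling if the feature is present, feeling if it is absent)
responses = [("like", "dislike"), ("like", "neutral"),
             ("expect", "dislike"), ("neutral", "neutral")]
print(Counter(kano_category(f, d) for f, d in responses))
```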
Keywords:
Feature prioritization, threshold attributes, performance attributes, excitement attributes, user sentiment, mapping design ideas, boosting user satisfaction.
How do we design a KPI tree from scratch? We start by running a collaborative event storming session to identify key success moments. Then we prioritize key events and explore how we can amplify and streamline them. Then we ideate and come up with design initiatives. These initiatives are stress-tested in an impact-effort matrix for viability and impact. Eventually, we define and assign metrics and KPIs, and pull them together in a KPI tree. Here’s how it works from start to finish.
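The impact-effort stress test can be as simple as scoring each initiative on two axes and sorting it into a quadrant. A minimal Python sketch with invented initiatives and arbitrary thresholds, purely for illustration:

```python
# Hypothetical initiatives scored 1-5 for expected impact and required effort
initiatives = [
    ("Simplify the pause flow", 5, 2),
    ("Redesign onboarding", 4, 5),
    ("Inline form validation", 3, 1),
    ("New pricing page", 2, 4),
]

def quadrant(impact: int, effort: int) -> str:
    """Arbitrary thresholds, purely for illustration."""
    if impact >= 3 and effort <= 3:
        return "quick win"
    if impact >= 3:
        return "big bet"
    if effort <= 3:
        return "fill-in"
    return "time sink"

# List quick wins first: lowest effort relative to impact
for name, impact, effort in sorted(initiatives, key=lambda i: i[2] - i[1]):
    print(f"{name}: impact {impact}, effort {effort} -> {quadrant(impact, effort)}")
```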
Keywords:
Uncovering user needs, impact-effort matrix, concept testing, event storming, stakeholder collaboration, traversing the KPI tree.
Should we rely on established UX metrics such as SUS, UMUX-Lite, and SUPR-Q, or should we define custom metrics tailored to the product and user needs? We need to find a balance between the two. It depends on what we want to measure, what we actually can measure, and whether we want to track local impact for a specific change or global impact for the entire customer journey. Let’s figure out how to define and establish metrics that will actually help us track our UX success.
Keywords:
Local vs. global KPIs, time spans, percentage vs. absolute values, A/B testing, mapping between metrics and KPIs, task breakdown, UX lanes, naming design KPIs.
Different contexts will require different design KPIs. In this chapter, we explore a diverse set of UX metrics related to search quality (quality of search for top 100 search queries), form design (error frequency, accuracy), e-commerce (time to final price), subscription-based services (time to tier boundaries), customer support (service desk inquiries) and many others. This should give you a good starting point to build upon for your own product and user needs.
Keywords:
Time to first success, search results quality, form error recovery, password recovery rate, accessibility coverage, time to tier boundaries, service desk inquiries, fake email frequency, early drop-off rate, carbon emissions per page view, presets and templates usage, default settings adoption, design system health.
Establishing UX metrics doesn’t happen overnight. You need to discuss and decide what you want to measure and how often it should happen. But also how to integrate metrics, evaluate data, and report findings. And how to embed them into an existing design workflow. For that, you will need time — and green lights from your stakeholders and managers. To achieve that, we need to tap into the uncharted waters of UX strategy. Let’s see what it involves for us and how to make progress there.
Keywords:
Stakeholder engagement, UX maturity, governance, risk mitigation, integration, ownership, accountability, viability.
Once you’ve established UX metrics, you will need to report them regularly to senior management. How exactly would you do that? In this chapter, we explore the process of selecting representative tasks, recruiting participants, facilitating testing sessions, and analyzing the resulting data to create a compelling report and presentation that highlights the value of your UX efforts to stakeholders.
Keywords:
Data analysis, reporting, facilitation, observation notes, video clips, guidelines and recommendations, definition of design success, targets, alignment, and stakeholder buy-in.
To show the impact of our design work, we need to track UX snapshots. Essentially, these are four states mapped against touchpoints in a customer journey: baseline (a threshold not to cross), current state (how we are currently doing), target state (the objective we are aiming for), and industry benchmark (to stay competitive). Let’s see how it would work in an actual project.
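Here is a small sketch (with made-up numbers) of how such a snapshot could be captured for one touchpoint and one metric, following the four states described above:

```python
from dataclasses import dataclass

@dataclass
class UXSnapshot:
    """Four states of one UX metric at one touchpoint. All numbers are made up."""
    touchpoint: str
    metric: str
    baseline: float    # threshold not to cross
    current: float     # how we are currently doing
    target: float      # the objective we are aiming for
    benchmark: float   # industry benchmark, to stay competitive

snapshot = UXSnapshot("Checkout", "task success rate (%)",
                      baseline=70, current=78, target=90, benchmark=85)

print(f"{snapshot.touchpoint}: {snapshot.target - snapshot.current:.0f} points to target, "
      f"{snapshot.benchmark - snapshot.current:.0f} points to the industry benchmark")
```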
Keywords:
Competitive benchmarking, baseline measurement, local and global design KPIs, cross-teams metrics, setting realistic goals.
How do we measure the health of a design system? Surely it’s not just the roll-out speed of newly designed UI components or flows. Most teams track productivity and coverage, but we can also go beyond that by measuring relative adoption and efficiency gains (time saved, faster time to market, satisfaction scores, and product quality). But the best metric is how early designers bring the design system into the conversation during their design work.
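Relative adoption, for instance, can be approximated by auditing how many components on a screen come from the design system versus how many were built ad hoc. A minimal sketch with hypothetical audit numbers (not from the course):

```python
def relative_adoption(ds_components: int, total_components: int) -> float:
    """Share of UI components that come from the design system rather than
    being built ad hoc, in percent."""
    return 100 * ds_components / total_components if total_components else 0.0

# Hypothetical audit of three screens: (design system components, total components)
screens = {"checkout": (42, 50), "search": (18, 30), "settings": (25, 26)}
for name, (ds, total) in screens.items():
    print(f"{name}: {relative_adoption(ds, total):.0f}% design system coverage")
```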
Keywords:
Component coverage, decision trees, adoption, efficiency, time to market, user satisfaction, usage analytics, design system ROI, relative adoption.
Research insights often end up gathering dust in PDF reports stored on the remote fringes of SharePoint. To track the impact of UX research, we need to track outcomes and research-specific metrics: the impact of research on UX and business, organizational learning and engagement, and the make-up of research efforts and their reach. And most importantly: amplifying research where we expect the most significant impact. Let’s see what it involves.
Keywords:
Outcome metrics, organizational influence, research-specific metrics, research references, study observers, research formalization, tracking research-initiated product changes.
So you’ve made it this far! Now, how do you get your UX metrics initiative off the ground? By taking small steps heading in the right direction. Small commitments, pilot projects, and design guilds will support and enable your efforts. We just need to define realistic goals and turn UX metrics into a culture of measurement, or simply a way of working. Let’s see how we can do just that.
Keywords:
Pilot projects, UX integration, resource assessment, evidence-driven design, establishing a baseline, culture of measurement.
Let’s wrap up our journey into UX metrics and Design KPIs and reflect on what we have learned. What remains is the first next step: starting where you are and growing incrementally, by continuously visualizing and explaining your UX impact — however limited it might be — to your stakeholders. This is the last chapter of the course, but the first chapter of the incredible journey ahead of you.
Keywords:
Stakeholder engagement, incremental growth, risk mitigation, user satisfaction, business success.
Who Is The Course For?
This course is tailored for advanced UX practitioners, design leaders, product managers, and UX researchers who are looking for a practical guide to define, establish and track design KPIs, translate business goals into actionable design tasks, and connect business needs with user needs.
What You’ll Learn
By the end of the video course, you’ll have a packed toolbox of practical techniques and strategies on how to define, establish, sell, and measure design KPIs from start to finish — and how to make sure that your design work is always on the right trajectory. You’ll learn:
- How to translate business goals to UX initiatives,
- The difference between OKRs, KPIs, and metrics,
- How to define design success for your company,
- Metrics and KPIs that businesses typically measure,
- How to choose the right set of metrics and KPIs,
- How to establish design KPIs focused on user needs,
- How to build a comprehensive design KPI tree,
- How to combine qualitative and quantitative insights,
- How to choose and prioritize design work,
- How to track the impact of design work on business goals,
- How to explain, visualize, and defend design work,
- How companies define and track design KPIs,
- How to make a strong case for UX metrics.
Community Matters ❤️
Producing a video course takes quite a bit of time, and we couldn’t pull it off without the support of our wonderful community. So thank you from the bottom of our hearts! We hope you’ll find the course useful for your work. Happy watching, everyone! 🎉🥳