Analytics & Metrics Articles & Videos

  • Recognize Strategic Opportunities with Long-Tail Data

    Be a strategic thinker by recognizing opportunities at scale with seemingly small and insignificant data.

  • Repeated User Actions Are Frustrating

    It's frustrating for users to go back-and-forth and back-and-forth to the same web page, bouncing around without getting what they need. Analytics data can help identify pages that don't help users progress.

  • Prioritize Quantitative Data with the Pareto Principle

    Prioritize the 20% of your website or app responsible for 80% of a critical metric to generate substantial improvements for less effort.
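    As a minimal sketch of this prioritization (the page names and conversion counts below are hypothetical), you can rank pages by their contribution to a metric and keep only the few that cover most of the total:

```python
# Find the small subset of pages that drives most of a metric.
# Page names and counts are made up for illustration.
def pareto_subset(metric_by_page, threshold=0.80):
    """Return pages, highest contribution first, whose cumulative
    share of the metric first reaches `threshold`."""
    total = sum(metric_by_page.values())
    chosen, cumulative = [], 0.0
    for page, value in sorted(metric_by_page.items(),
                              key=lambda kv: kv[1], reverse=True):
        chosen.append(page)
        cumulative += value
        if cumulative / total >= threshold:
            break
    return chosen

pages = {"/home": 5, "/checkout": 450, "/pricing": 320,
         "/blog": 15, "/signup": 180, "/about": 30}
print(pareto_subset(pages))  # the few pages worth prioritizing first
```

    Here three of six pages account for 95% of the metric, so improvement effort goes to them first.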

  • How to Sell UX: Translating UX to Business Value

    We speak users, whereas stakeholders speak business. We must translate: "if we do this for the user, it'll do that for the business."

  • Triangulation: Combine Findings from Multiple User Research Methods

    Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.

  • Don't Overthink UX ROI

    It can be hard to calculate the return on investment (ROI) for user experience design improvements. But don't get bogged down in less-important details: often simple metrics can give a good-enough estimate to justify UX investments.

  • Better Charts for Analytics & Quantitative UX Data

    Spreadsheet defaults don't generate the most meaningful visualizations of UX data. Modify charts to add Context, cut Clutter (spreadsheet software loves it!), and boost Contrast.

  • Partner with Other Research Teams in Your Organization

    To gain a holistic picture of your users, exchange data with the non-UX teams in your company who are collecting other forms of customer data, besides the user research you do yourself. You gain; they gain.

  • Statistically-Generated Personas

    Personas are usually a qualitative element in the UX design process, but statistical data from more users can be added for more precision, as long as the personas are still grounded in qualitative insights.

  • Handling Insignificance in UX Data

    After collecting KPI numbers for two versions of a design, the difference between the two metrics is not statistically significant. Now which version should you launch?
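    A two-proportion z-test (a minimal sketch with made-up conversion counts) shows how two KPI measurements can differ without that difference being statistically significant:

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical: version A converted 120 of 1000 users, B 138 of 1000.
p = two_proportion_p_value(120, 1000, 138, 1000)
print(round(p, 3))  # above 0.05: not significant at the 5% level
```

    A p-value above 0.05 means the observed difference could plausibly be noise, which is exactly the "now which version do you launch?" dilemma the article addresses.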

  • Why You Cannot Trust Numbers from Qualitative Usability Studies

    Qualitative usability studies have few users and variable protocol; numbers obtained from such studies are likely to poorly reflect the true behavior of your population due to large measurement errors.

  • How Useful Is the System Usability Scale (SUS) in UX Projects?

    SUS is a 35-year-old and thus well-established way to measure user satisfaction, but it is not the most recommended way of doing so in user research.
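    For reference, standard SUS scoring converts ten 1-5 ratings into a 0-100 score (the example responses below are made up):

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 ratings -> a 0-100 score.
    Odd-numbered items are positively worded (score - 1);
    even-numbered items are negatively worded (5 - score)."""
    assert len(responses) == 10
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```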

  • Net Promoter Score in User Experience

    Net Promoter Score (NPS) is a simple satisfaction metric that's collected in a single question. While easy to understand, it's insufficiently nuanced to help with detailed UX design decisions.
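    The standard NPS calculation, as a sketch with invented ratings: promoters (9-10) minus detractors (0-6), as percentages of all respondents.

```python
def nps(ratings):
    """Net Promoter Score from 0-10 'likelihood to recommend' ratings:
    % promoters (9-10) minus % detractors (0-6); 7-8 are passives."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

print(nps([10, 9, 8, 7, 7, 6, 3, 9, 10, 5]))  # 10.0
```

    Note how much detail the single number discards: very different rating distributions can produce the same score, which is why it offers little guidance for specific design decisions.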

  • Benchmark Usability Testing

    Benchmark studies measure one or more KPIs (key performance indicators) of a user interface so that you can tell whether a redesign has measurably better (or worse) usability.

  • Calculating ROI for Design Projects

    Demonstrate the value of design improvements and other UX work by calculating the return on investment (ROI). Usually you compare before/after measures of relevant metrics, but sometimes you have to convert a user metric into a business-oriented KPI (key performance indicator).

  • Triangulation: Get Better Research Results by Using Multiple UX Methods

    Diversifying user research methods ensures more reliable, valid results by considering multiple ways of collecting and interpreting data.

  • How to Interpret User Time Spent and Page Views

    Users’ “productivity” tasks differ from “engagement” tasks in whether more or less is better for metrics like time on task, interactions, and page views. Such KPIs are important, but they must be evaluated relative to users' tasks.

  • Don't A/B Test Yourself Off a Cliff

    A/B testing often focuses on incremental improvements to isolated parts of the user experience, leading to the risk of cumulatively poor experience that's worse than the sum of its parts.

  • Rating Scales in UX Research: Likert or Semantic Differential?

    Likert and semantic differential are instruments used to determine attitudes to products, services, and experiences, but depending on your situation, one may work better than the other.

  • The Benefits of Benchmarking Your Product's UX

    Collect UX metrics to show how well your design is performing over time or relative to competitors. If numbers are down, you know what needs improvement. If up, ROI data is a key management tool.

  • Bounces vs Exits in Web Analytics

    It's important to study why users leave websites. Analytics tools give you two metrics for web pages: exit rate and bounce rate. Understanding the difference between these two numbers is essential for better UX design.
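    The distinction, sketched with made-up session paths: a bounce is a single-page session *starting* on the page, while an exit is any session *ending* on it.

```python
def bounce_and_exit_rates(sessions, page):
    """Bounce rate: share of sessions starting on `page` that viewed
    nothing else. Exit rate: share of all views of `page` that were
    the last page of their session."""
    starts = [s for s in sessions if s[0] == page]
    bounces = [s for s in starts if len(s) == 1]
    views = sum(s.count(page) for s in sessions)
    exits = sum(1 for s in sessions if s[-1] == page)
    bounce_rate = len(bounces) / len(starts) if starts else 0.0
    exit_rate = exits / views if views else 0.0
    return bounce_rate, exit_rate

# Hypothetical sessions, each a list of pages viewed in order.
sessions = [["/a"], ["/a", "/b"], ["/b", "/a"],
            ["/c", "/a", "/b"], ["/a", "/c"]]
print(bounce_and_exit_rates(sessions, "/a"))
```

    The two rates usually differ: a high exit rate on a checkout-confirmation page is healthy, while a high bounce rate on a landing page rarely is.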

  • Vanity Metrics in Analytics

    Analytics for websites or other UX design projects should drive the project forward to better business success. Metrics that make you feel good may not achieve this goal.

  • What Is a Conversion Rate, and What Does It Mean for UX?

    Conversions measure whether users take a desired action on your website, so they are a great metric for tracking design improvements (or lack of same). But non-UX factors can impact conversion rates, so beware.
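    Because the raw count of conversions depends on traffic, compare periods by rate and by relative change (the numbers below are invented):

```python
def conversion_rate(conversions, sessions):
    """Share of sessions that completed the desired action."""
    return conversions / sessions

# Hypothetical before/after periods with different traffic levels.
before = conversion_rate(300, 20_000)   # 1.5%
after = conversion_rate(360, 18_000)    # 2.0%
print(f"relative lift: {(after - before) / before:.0%}")
```

    Note that raw conversions grew only 20% while the rate grew 33%; the rate is the fairer measure of the design change, though non-UX factors (pricing, traffic mix) can still move it.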

  • A/B Testing 101

    What is A/B testing, and why should you consider this method for measuring the business value of design changes?

  • Why Confidence Intervals Matter for UX

    To make valid design decisions from quantitative user research data, you should be familiar with the concept of a confidence interval.
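    A minimal sketch of a 95% confidence interval for a mean task time, using the normal approximation (the sample values are invented; for small samples a t critical value would be more accurate than z = 1.96):

```python
import math
import statistics

def confidence_interval_95(samples):
    """Approximate 95% CI for the mean, via the normal z of 1.96."""
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / math.sqrt(len(samples))
    return mean - 1.96 * sem, mean + 1.96 * sem

times = [42, 55, 38, 61, 47, 52, 44, 58, 49, 50]  # task times, seconds
low, high = confidence_interval_95(times)
print(f"mean likely between {low:.1f}s and {high:.1f}s")
```

    The width of the interval, not the point estimate alone, tells you how much trust to place in a measured difference between designs.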

  • Define Micro Conversions to Measure Incremental UX Improvements

    Not every design and content change generates immediate or significant increases in conversion rates, but they may affect conversion rates in the long run.

  • Five Essential Analytics Reports for UX Strategists

    Google Analytics is filled with very useful information for UX Strategists defining a baseline and tracking trends in order to define goals, strategies, and concepts for a brighter tomorrow.

  • Conversion Rates

    Increased conversion is one of the strongest ROI arguments for better user experience and more user research. Track over time, because it's a relative metric.

  • Three Uses for Analytics in User-Experience Practice

    In order to make the most of analytics data, UX professionals need to integrate this data where it can add value to qualitative processes instead of distracting resources from them.

  • Internet Activity Bias Causes Lumpy User Behavior

    Dramatic differences in how much people use the web on different days can distort simplistic interpretations of site analytics.

  • User Satisfaction vs. Performance Metrics

    Users generally prefer designs that are fast and easy to use, but satisfaction isn't 100% correlated with objective usability metrics.

  • A/B Testing, Usability Engineering, Radical Innovation: What Pays Best?

    3 approaches to better design: each has its uses, but the costs, benefits, and risks differ dramatically.

  • Accuracy vs. Insights in Quantitative Usability

    Better to accept a wider margin of error in usability metrics than to spend the entire budget learning too few things with extreme precision.

  • Reduce Bounce Rates: Fight for the Second Click

    Different traffic sources imply different reasons for why visitors might immediately leave your site. Design to keep deep-link followers engaged through additional pageviews.

  • Usability ROI Declining, But Still Strong

    The average business metrics improvement after a usability redesign is now 83%. This is substantially less than 6 years ago, but ROI remains high because usability is still cheap relative to gains.

  • Quantitative Studies: How Many Users to Test?

    When collecting usability metrics, testing with 20 users typically offers a reasonably tight confidence interval.
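    You can see why 20 is a reasonable compromise from the margin of error of a measured proportion (a sketch using the normal approximation, at the worst case p = 0.5):

```python
import math

def margin_of_error(p=0.5, n=20, z=1.96):
    """Half-width of an approximate 95% CI for a proportion,
    e.g. a task success rate."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (5, 20, 80):
    print(n, f"±{margin_of_error(n=n):.0%}")
```

    The margin shrinks only with the square root of n: going from 5 to 20 users roughly halves it (±44% to ±22%), but halving it again requires 80 users, which is rarely worth the cost.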

  • Outliers and Luck in User Performance

    6% of task attempts are extremely slow and constitute outliers in measured user performance. These sad incidents are caused by bad luck that designers can, and should, eradicate.

  • The Slow Tail: Time Lag Between Visiting and Buying

    Users often convert to buyers long after their initial visit to a website. A full 5% of orders occur more than 4 weeks after users click on search engine ads.

  • Putting A/B Testing in Its Place

    Measuring the live impact of design changes on key business metrics is valuable, but often creates a focus on short-term improvements. This near-term view neglects bigger issues that only qualitative studies can find.

  • Risks of Quantitative Studies

    Number fetishism leads usability studies astray by focusing on statistical analyses that are often false, biased, misleading, or overly narrow. Better to emphasize insights and qualitative research.

  • Return on Investment for Usability

    Development projects should spend 10% of their budget on usability. Following a usability redesign, websites increase desired metrics by 135% on average; intranets improve slightly less.

  • Success Rate: The Simplest Usability Metric

    In addition to being expensive, collecting usability metrics interferes with the goal of gathering qualitative insights to drive design decisions. As a compromise, you can measure users' ability to complete tasks. Success rates are easy to understand and represent the UX bottom line.
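    A sketch of the computation, using one common convention (assumed here) of counting partial successes as half a success; the outcomes are invented:

```python
def success_rate(outcomes):
    """Success rate over task attempts. A common convention scores
    a 'partial' success as half a success."""
    score = {"success": 1.0, "partial": 0.5, "failure": 0.0}
    return sum(score[o] for o in outcomes) / len(outcomes)

attempts = ["success", "failure", "partial", "success", "success",
            "partial", "failure", "success"]
print(success_rate(attempts))  # 0.625, i.e. a 62.5% success rate
```

    A single number like "users succeed 62.5% of the time" is easy for stakeholders to grasp, which is what makes this the simplest usability metric.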