It's frustrating for users to go back-and-forth and back-and-forth to the same web page, bouncing around without getting what they need. Analytics data can help identify pages that don't help users progress.
Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.
It can be hard to calculate the return on investment (ROI) for user experience design improvements. But don't get bogged down in less-important details: often simple metrics can give a good-enough estimate to justify UX investments.
Spreadsheet defaults don't generate the most meaningful visualizations of UX data. Modify charts to enhance Context, Clutter (less of it than spreadsheet software likes!), and Contrast.
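As an illustration, here is a minimal matplotlib sketch of those three Cs applied to a line chart; the completion-rate numbers and labels are invented for the example.

```python
import matplotlib.pyplot as plt

# Hypothetical task-completion rates (%) per quarter -- invented data.
quarters = ["Q1", "Q2", "Q3", "Q4"]
old_design = [62, 63, 61, 64]
new_design = [62, 68, 74, 79]

fig, ax = plt.subplots()

# Clutter: drop the box that spreadsheet defaults draw around the plot.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

# Contrast: gray out the comparison series, highlight the one that matters.
ax.plot(quarters, old_design, color="lightgray", linewidth=2)
ax.plot(quarters, new_design, color="tab:blue", linewidth=3)

# Context: a descriptive title and direct labels instead of a legend.
ax.set_title("Task completion rose after the redesign")
ax.set_ylabel("Completion rate (%)")
ax.annotate("new design", xy=(3, 79), xytext=(5, 0),
            textcoords="offset points", color="tab:blue")
ax.annotate("old design", xy=(3, 64), xytext=(5, 0),
            textcoords="offset points", color="gray")

plt.show()
```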
To gain a holistic picture of your users, exchange data with the non-UX teams in your company that collect other forms of customer data beyond the user research you do yourself. You gain; they gain.
Personas are usually a qualitative element in the UX design process, but statistical data from more users can be added for more precision, as long as the personas are still grounded in qualitative insights.
After collecting KPI numbers for two versions of a design, the difference between the two metrics is not statistically significant. Now which version should you launch?
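For intuition about what "not statistically significant" means here, the sketch below runs a standard two-proportion z-test, assuming the KPI is a conversion rate; all counts are hypothetical, and this is an illustration rather than a prescription for your analysis pipeline.

```python
from math import sqrt, erfc

# Hypothetical results: conversions out of sessions for each version.
conv_a, n_a = 230, 4_100   # version A
conv_b, n_b = 255, 4_050   # version B

p_a, p_b = conv_a / n_a, conv_b / n_b

# Two-proportion z-test: pool the rates under the null hypothesis
# that both versions convert equally well.
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

# Two-sided p-value from the normal tail.
p_value = erfc(abs(z) / sqrt(2))

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
# If p >= 0.05, the difference may be noise: pick the version that's
# cheaper to ship, or collect more data before deciding.
```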
Qualitative usability studies have few users and a variable protocol, so numbers obtained from such studies are likely to reflect the true behavior of your population poorly, due to large measurement errors.
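To see how large those errors can be, here is a rough sketch of the 95% confidence interval around a mean task time from a hypothetical five-user study; the timings are invented.

```python
from statistics import mean, stdev
from scipy.stats import t

# Hypothetical task times (seconds) from a 5-user qualitative study.
times = [48, 75, 62, 190, 55]

n = len(times)
m, s = mean(times), stdev(times)
margin = t.ppf(0.975, df=n - 1) * s / n ** 0.5  # 95% CI half-width

print(f"mean = {m:.0f}s, 95% CI = {m - margin:.0f}s to {m + margin:.0f}s")
# With n = 5 and one slow participant, the interval spans a huge range --
# treat any single number from such a study as a rough indication only.
```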
Net Promoter Score (NPS) is a simple satisfaction metric that's collected in a single question. While easy to understand, it's insufficiently nuanced to help with detailed UX design decisions.
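For reference, the score itself is simple arithmetic: percent promoters (ratings of 9 or 10) minus percent detractors (0 through 6). A minimal sketch with made-up ratings:

```python
# Hypothetical answers to "How likely are you to recommend us?" (0-10).
ratings = [10, 9, 9, 8, 7, 7, 6, 5, 9, 10, 3, 8, 6, 9, 10]

promoters = sum(r >= 9 for r in ratings)
detractors = sum(r <= 6 for r in ratings)

# NPS = %promoters - %detractors, reported as a number from -100 to +100.
nps = 100 * (promoters - detractors) / len(ratings)
print(f"NPS = {nps:+.0f}")
# The single score hides *why* people answered as they did: pair it with
# open-ended follow-ups before making design decisions.
```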
Benchmark studies measure one or more KPIs (key performance indicators) of a user interface so that you can tell whether a redesign has measurably better (or worse) usability.
Demonstrate the value of design improvements and other UX work by calculating the return on investment (ROI). Usually you compare before/after measures of relevant metrics, but sometimes you have to convert a user metric into a business-oriented KPI (key performance indicator).
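As an illustration of that conversion, here is a back-of-envelope calculation turning a time-on-task improvement into an annual cost saving; every number below is invented.

```python
# Back-of-envelope ROI: convert a usability metric (time on task) into a
# business KPI (cost saved). All numbers below are invented.
seconds_saved_per_task = 40        # before/after benchmark difference
tasks_per_year = 500_000           # task volume from analytics
loaded_cost_per_hour = 30.0        # cost of users' (or agents') time
design_cost = 90_000.0             # cost of the redesign project

annual_saving = (seconds_saved_per_task / 3600
                 * tasks_per_year * loaded_cost_per_hour)
roi = (annual_saving - design_cost) / design_cost

print(f"annual saving ≈ ${annual_saving:,.0f}, first-year ROI ≈ {roi:.0%}")
```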
Users’ “productivity” tasks differ from “engagement” tasks in whether more or less is better for metrics like time on task, interactions, and page views. Such KPIs are important, but they must be evaluated relative to users' tasks.
A/B testing often focuses on incremental improvements to isolated parts of the user experience, leading to the risk of cumulatively poor experience that's worse than the sum of its parts.
Likert scales and semantic differential scales are instruments used to measure attitudes toward products, services, and experiences, but depending on your situation, one may work better than the other.
Collect UX metrics to show how well your design is performing over time or relative to competitors. If numbers are down, you know what needs improvement; if they're up, the ROI data becomes a key management tool.
It's important to study why users leave websites. Analytics tools give you two metrics for web pages: exit rate and bounce rate. Understanding the difference between these two numbers is essential for better UX design.
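To make the distinction concrete, here is a toy sketch using the common analytics definitions: bounce rate counts single-page sessions among sessions that started on the page, while exit rate counts sessions that ended on the page among all of that page's pageviews. The session data is invented.

```python
# Bounce rate vs. exit rate for one page, computed from session paths.
# Each session is the ordered list of pages a visitor viewed (toy data).
sessions = [
    ["/pricing"],                          # bounce: entered and left at once
    ["/pricing", "/signup"],
    ["/home", "/pricing"],                 # exit from /pricing, not a bounce
    ["/home", "/pricing", "/signup"],
    ["/pricing", "/docs", "/pricing"],     # exit from /pricing
    ["/docs", "/pricing"],                 # exit from /pricing
]

PAGE = "/pricing"

entrances = [s for s in sessions if s[0] == PAGE]
bounces = [s for s in entrances if len(s) == 1]
exits = [s for s in sessions if s[-1] == PAGE]
pageviews = sum(s.count(PAGE) for s in sessions)

# Bounce rate: single-page sessions as a share of sessions *starting* here.
print(f"bounce rate = {len(bounces) / len(entrances):.0%}")
# Exit rate: sessions *ending* here as a share of all views of this page.
print(f"exit rate   = {len(exits) / pageviews:.0%}")
```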
Analytics for websites or other UX design projects should drive the project forward to better business success. Metrics that make you feel good may not achieve this goal.
Conversions measure whether users take a desired action on your website, so they are a great metric for tracking design improvements (or lack of same). But non-UX factors can impact conversion rates, so beware.
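One such non-UX factor is traffic mix. In this invented sketch, the per-channel conversion rates never change, yet the blended rate drops because the mix of visitors shifted:

```python
# A non-UX factor at work: per-channel conversion rates stay identical,
# but a shift in traffic mix drags the blended rate down. Invented numbers.
rates = {"search": 0.040, "social": 0.010}       # conversion per channel

before = {"search": 8_000, "social": 2_000}      # sessions per channel
after = {"search": 5_000, "social": 5_000}       # after a viral post

for label, mix in (("before", before), ("after", after)):
    conversions = sum(mix[ch] * rates[ch] for ch in mix)
    sessions = sum(mix.values())
    print(f"{label}: blended conversion = {conversions / sessions:.1%}")
# before: 3.4%, after: 2.5% -- the design didn't get worse, the traffic did.
# Segment conversion rates by channel before blaming (or crediting) the UX.
```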
Tracked analytics metrics should be actionable: variations in a meaningful, relatively stable metric reflect changes in the user experience. In contrast, vanity metrics appear impressive, but their fluctuations are not actionable.
A treemap is a complex, area-based data visualization for hierarchical data that can be hard to interpret precisely. In many cases, simpler visualizations such as bar charts are preferable.
Radical redesigns are best tested using an A/B experiment, while multivariate tests indicate how various UI elements interact with each other and support incremental improvements to a design.
Focus on UX goals to drive analytics measurement plans, rather than tracking superficial metrics. Identify the core goal of a design to meaningfully measure it.
Use bounce rate as a red flag for possible issues lurking on your site, but don’t make design decisions aimed solely at chasing that second click. Optimize for long-term engagement through return visits and track deeper conversion goals.
How often people visit your site and how long they wait between two visits can help to gauge visitor loyalty and to uncover the behavioral trends distinguishing frequent users from occasional ones.
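A minimal sketch of those two measures, computing visit counts and the median gap between visits from hypothetical per-visitor dates:

```python
from datetime import date
from statistics import median

# Hypothetical visit dates per visitor ID -- invented data.
visits = {
    "u1": [date(2024, 5, 1), date(2024, 5, 2),
           date(2024, 5, 4), date(2024, 5, 7)],
    "u2": [date(2024, 5, 1), date(2024, 5, 29)],
}

for user, days in visits.items():
    days = sorted(days)
    gaps = [(b - a).days for a, b in zip(days, days[1:])]
    print(f"{user}: {len(days)} visits, median gap = {median(gaps)} days")
# Short, regular gaps suggest a loyal habitual visitor; long gaps suggest
# an occasional one. Segment and study the two groups separately.
```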
NPS is a loyalty metric that correlates well with perception of usability, is easy to understand and administer, but has limitations for understanding and evaluating UX when used in isolation.
Analytics metrics such as pageviews, conversions, entrances, bounce rates, and search query frequency can help identify problems in your category structure.
Game testing researches the notion of fun. Compared with mainstream UX studies, it involves many more users and relies more on biometrics and custom software. The most striking findings from the Games User Research Summit were the drastic age and gender differences found in research on player motivation.
Misleading links and omitted information force users to bounce back and forth in a hub-and-spoke pattern between a routing page and subpages linked from it, increasing the interaction cost and decreasing engagement over time. Use web analytics tools to identify and monitor pogo-stick behavior on your site.
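One rough way to surface candidates, sketched here with toy data rather than any specific analytics product's API, is to scan session paths for A-to-B-and-back-to-A triples and count which hub/subpage pairs bounce most:

```python
from collections import Counter

# Count "pogo sticking": A -> B -> back to A within a session, which often
# means subpage B failed to deliver what routing page A promised. Toy data.
sessions = [
    ["/products", "/products/a", "/products", "/products/b", "/products"],
    ["/products", "/products/b", "/checkout"],
    ["/products", "/products/a", "/products", "/products/c"],
]

pogo = Counter()
for path in sessions:
    for hub, spoke, back in zip(path, path[1:], path[2:]):
        if hub == back and spoke != hub:
            pogo[(hub, spoke)] += 1

for (hub, spoke), count in pogo.most_common():
    print(f"{hub} -> {spoke} -> {hub}: {count} bounce(s)")
# The pair with the most bounce-backs is the first place to check whether
# the link label or the subpage content is misleading users.
```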
Persona-inspired segments can be used in website analytics to uncover trends in data and derive UX insights. Better than (a) lumping everybody together or (b) segmenting on demographics that don't relate to user behavior.