It's frustrating for users to bounce back and forth to the same web page without getting what they need. Analytics data can help identify pages that don't help users progress.
Improve design decisions by looking at the problem from multiple points of view: combine multiple types of data or data from several UX research methods.
It can be hard to calculate the return on investment (ROI) for user experience design improvements. But don't get bogged down in less-important details: often simple metrics can give a good-enough estimate to justify UX investments.
Spreadsheet defaults don't generate the most meaningful visualizations of UX data. Modify charts to enhance Context, Clutter (less of it than spreadsheet software likes!), and Contrast.
To gain a holistic picture of your users, exchange data with the non-UX teams in your company who are collecting other forms of customer data, besides the user research you do yourself. You gain; they gain.
Personas are usually a qualitative element in the UX design process, but statistical data from more users can be added for more precision, as long as the personas are still grounded in qualitative insights.
After collecting KPI numbers for two versions of a design, the difference between the two metrics is not statistically significant. Now which version should you launch?
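When the difference isn't significant, other considerations (cost, strategy, qualitative findings) have to break the tie. As a minimal sketch of the significance check itself, assuming conversion-style KPIs and a standard two-proportion z-test (all counts below are hypothetical):

```python
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = 0.5*(1 + erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 12.0% vs 13.5% conversion on 1,000 users each: not significant at p < 0.05
z, p = two_proportion_z(120, 1000, 135, 1000)
print(round(p, 3))
```

With samples this size, a 1.5-percentage-point difference is well within noise, which is exactly the situation the question above describes.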
Qualitative usability studies have few users and variable protocol; numbers obtained from such studies are likely to poorly reflect the true behavior of your population due to large measurement errors.
Net Promoter Score (NPS) is a simple satisfaction metric that's collected in a single question. While easy to understand, it's insufficiently nuanced to help with detailed UX design decisions.
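The arithmetic behind NPS is simple enough to show in a few lines: respondents rating 9–10 are promoters, 0–6 are detractors, and the score is the percentage-point difference (the sample ratings below are made up):

```python
def nps(ratings):
    """Net Promoter Score from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) only dilute the score.
    Returns a value from -100 to +100.
    """
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / n)

print(nps([10, 9, 9, 8, 7, 6, 3, 10]))  # 4 promoters, 2 detractors out of 8 -> 25
```

Note that very different rating distributions can collapse to the same score (e.g., one 10 plus one 0 scores the same 0 as two passives), which illustrates why the single number can't guide detailed design decisions.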
Benchmark studies measure one or more KPIs (key performance indicators) of a user interface so that you can tell whether a redesign has measurably better (or worse) usability.
Demonstrate the value of design improvements and other UX work by calculating the return on investment (ROI). Usually you compare before/after measures of relevant metrics, but sometimes you have to convert a user metric into a business-oriented KPI (key performance indicator).
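The shape of that calculation is straightforward: monetize the before/after change in the user metric, then compare the gain to the project cost. All figures below are hypothetical placeholders, not real benchmark data:

```python
def ux_roi(metric_before, metric_after, value_per_unit, cost):
    """Simple ROI estimate: (monetized gain - cost) / cost."""
    gain = (metric_after - metric_before) * value_per_unit
    return (gain - cost) / cost

# Hypothetical example: a redesign deflects 3,000 support calls per month,
# each valued at $8, against a $20,000 project cost.
roi = ux_roi(metric_before=0, metric_after=3000, value_per_unit=8.0, cost=20_000)
print(f"{roi:.0%}")  # $24k gain on a $20k investment -> 20% ROI
```

The hard part in practice is the `value_per_unit` conversion, not the arithmetic: that's where a user metric (calls, errors, task time) becomes a business KPI.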
Users’ “productivity” tasks differ from “engagement” tasks in whether more or less is better for metrics like time on task, interactions, and page views. Such KPIs are important, but they must be evaluated relative to users' tasks.
A/B testing often focuses on incremental improvements to isolated parts of the user experience, leading to the risk of cumulatively poor experience that's worse than the sum of its parts.
Likert scales and semantic-differential scales are instruments used to measure attitudes toward products, services, and experiences, but depending on your situation, one may work better than the other.
Collect UX metrics to show how well your design is performing over time or relative to competitors. If numbers are down, you know what needs improvement. If up, ROI data is a key management tool.
It's important to study why users leave websites. Analytics tools give you two metrics for web pages: exit rate and bounce rate. Understanding the difference between these two numbers is essential for better UX design.
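The two metrics are easy to confuse because both describe users leaving. Under the standard analytics definitions (bounce rate: share of sessions starting on a page that view only that page; exit rate: share of all views of a page that were the session's last), a toy computation looks like this (the session data is invented):

```python
def bounce_and_exit_rate(sessions, page):
    """Bounce rate and exit rate for one page.

    `sessions` is a list of page-path lists, e.g. [["home", "pricing"], ["pricing"]].
    Bounce rate: sessions *starting* on `page` that viewed nothing else.
    Exit rate: views of `page` that were the *last* pageview of a session.
    """
    entrances = [s for s in sessions if s[0] == page]
    bounces = sum(1 for s in entrances if len(s) == 1)
    pageviews = sum(s.count(page) for s in sessions)
    exits = sum(1 for s in sessions if s[-1] == page)
    bounce_rate = bounces / len(entrances) if entrances else 0.0
    exit_rate = exits / pageviews if pageviews else 0.0
    return bounce_rate, exit_rate

sessions = [["pricing"], ["home", "pricing"], ["pricing", "signup"],
            ["home", "pricing", "home"], ["blog", "pricing", "docs"]]
print(bounce_and_exit_rate(sessions, "pricing"))  # (0.5, 0.4)
```

Notice the denominators differ: bounces are judged only against sessions that *entered* on the page, while exits are judged against every view of the page. A high exit rate on a checkout-confirmation page is fine; a high bounce rate on a landing page usually isn't.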
Analytics for websites or other UX design projects should drive the project forward to better business success. Metrics that make you feel good may not achieve this goal.
Conversions measure whether users take a desired action on your website, so they are a great metric for tracking design improvements (or lack of same). But non-UX factors can impact conversion rates, so beware.
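One way those non-UX factors show up is in the gap between raw conversions and the conversion *rate*. A sketch with invented weekly numbers:

```python
def conversion_rate(conversions, visitors):
    """Conversion rate as a fraction; guards against zero traffic."""
    return conversions / visitors if visitors else 0.0

# Hypothetical weeks: week 2 has more raw conversions but a *lower* rate,
# because a traffic spike (a non-UX factor) inflated the visitor count.
weeks = [(120, 4000), (150, 6000)]
for conversions, visitors in weeks:
    print(f"{conversion_rate(conversions, visitors):.1%}")
```

Tracking the rate rather than the count keeps traffic fluctuations from masquerading as design wins or losses.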
Not every design and content change generates immediate or significant increases in conversion rates, but they may affect conversion rates in the long run.
Google Analytics is filled with very useful information for UX Strategists defining a baseline and tracking trends in order to define goals, strategies, and concepts for a brighter tomorrow.
Increased conversion is one of the strongest ROI arguments for better user experience and more user research. Track over time, because it's a relative metric.
To make the most of analytics data, UX professionals need to integrate it where it can add value to qualitative processes rather than drain resources from them.
Different traffic sources imply different reasons for why visitors might immediately leave your site. Design to keep deep-link followers engaged through additional pageviews.
The average business-metric improvement after a usability redesign is now 83%. This is substantially less than six years ago, but ROI remains high because usability is still cheap relative to the gains.
Six percent of task attempts are extremely slow and constitute outliers in measured user performance. These sad incidents are caused by bad luck that designers can, and should, eradicate.
Users often convert to buyers long after their initial visit to a website. A full 5% of orders occur more than 4 weeks after users click on search engine ads.
Measuring the live impact of design changes on key business metrics is valuable, but often creates a focus on short-term improvements. This near-term view neglects bigger issues that only qualitative studies can find.
Number fetishism leads usability studies astray by focusing on statistical analyses that are often false, biased, misleading, or overly narrow. Better to emphasize insights and qualitative research.
Development projects should spend 10% of their budget on usability. Following a usability redesign, websites increase desired metrics by 135% on average; intranets improve slightly less.
In addition to being expensive, collecting usability metrics interferes with the goal of gathering qualitative insights to drive design decisions. As a compromise, you can measure users' ability to complete tasks. Success rates are easy to understand and represent the UX bottom line.
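Success rates are also simple to compute. One common convention (though a debatable one) is to count partial successes as half a success; the task-attempt data below is made up:

```python
def success_rate(attempts, partial_credit=0.5):
    """Task success rate with optional partial credit.

    `attempts` holds "success", "partial", or "failure" per task attempt.
    Counting partials as half a success is one common convention.
    """
    score = {"success": 1.0, "partial": partial_credit, "failure": 0.0}
    return sum(score[a] for a in attempts) / len(attempts)

attempts = ["success", "success", "partial", "failure", "success"]
print(f"{success_rate(attempts):.0%}")  # (3 + 0.5) / 5 -> 70%
```

A single percentage like this is easy for stakeholders to grasp, which is exactly why it works as a compromise between rich qualitative findings and heavyweight metrics programs.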