Any product, whether hardware, software, consumable, or otherwise, must solve an existing problem in order to be deemed useful and thus have a chance of success in the marketplace. Peeling this back, usefulness has two layers: the first is ease of use, and the second is whether the product itself fills a core need such as saving money, entertaining us, or increasing our chances of attracting the opposite sex. While a case can be made that other motivating factors exist, the truth is that most if not all human behavior is driven by survival and reproduction. I purposely put ease of use ahead of filling core needs: if a product is hard to use, the large majority of people will never figure it out or grasp its core value proposition.
Sure, you can market a product's features and benefits, but nothing sells a product quite like a user experiencing progress toward one or more of these core incentives.
What we are talking about here is User Experience, also known as UX. UX has been one of the fastest-growing job categories, fueled by the advent of interactive web UI technologies such as AJAX and by the challenge of limited screen space in mobile applications. Evidence of this is found in the 51,000-plus members of LinkedIn’s extremely active User Experience group.
But just as in marketing, there are two types of people: those who profess to know, and those who measure to be sure. Like gravity in physics, there are basic truths in UX, covered beautifully in books such as Don’t Make Me Think (Steve Krug) and About Face 3 (Cooper and Reimann). But where the rubber meets the road, every UI is born from a product’s innovation, which in turn creates unique cases that must be measured and iterated on.
There are two ways to measure user interaction: Usability Testing and Analytics. Usability testing is a tried-and-true method, for sure; there’s no better way to uncover sticking points than watching someone get stuck. However, gathering a meaningful set of data points with this method can be both time consuming and cost tens of thousands of dollars. Analytics, on the other hand, quickly and inexpensively casts a wide net, capturing a large number of data points. Those data points must then be assembled properly, just as connecting the dots in the right order produces an accurate and meaningful picture.
Both methods are powerful tools, and like any tool, you have to choose the right one for the job at hand.
Usability Testing is a great way to know if people can visually find their way around the interface. It answers basic questions like…
These kinds of questions can be answered very quickly by watching 3 to 5 test subjects for about 15-20 minutes each. Yes, that’s right, only 3 to 5 should do the trick, because this method is used to find high-level, course-correction issues. Usability Testing is a fast way to answer one very basic question, "Is something severely broken or wrong with the user interface?", and it takes only a small number of test subjects to find this out.
Once there’s a high degree of confidence that users can navigate the UI, finer-grained behavior can be measured using Analytics. Thanks to Google’s powerful and free Analytics service, you can measure and tally every click on your site or web application.
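As a concrete illustration, here is a minimal sketch of what tallying a click with Google Analytics looks like using the gtag.js event API. The category, action, and label values are hypothetical, and the stub fallback exists only so the sketch runs outside a browser; in a real page you would rely on the standard gtag snippet being loaded.

```javascript
// Minimal sketch of click instrumentation with Google Analytics' gtag.js
// event API. Event names and parameters below are illustrative, not
// prescribed by Analytics.

// Fallback stub so this sketch also runs outside a browser: collect
// events in an array instead of sending them to Google Analytics.
const sent = [];
const gtag = (typeof window !== 'undefined' && window.gtag)
  ? window.gtag
  : (...args) => sent.push(args);

// Tag a UI action with a category so reports can be grouped later.
function trackClick(category, action, label) {
  gtag('event', action, {
    event_category: category, // e.g. 'feature-engagement'
    event_label: label,       // e.g. which button or link was clicked
  });
}

// Example: record that a user clicked a (hypothetical) export button.
trackClick('feature-engagement', 'click', 'export-button');
```

In a live page you would call `trackClick` from a click handler on the element of interest; the category string is what lets you later slice the Analytics reports by feature.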
There are two categories of actions to test: Feature Engagement and Process Flow.
Feature Engagement measures the need for and adoption of a particular feature, while Process Flow measures how well that feature was implemented. I purposely list these categories in this order, as it’s best to measure whether a feature is actually desired before spending the time and resources to perfect it. Please don’t confuse this with releasing a quick-shot implementation that confuses the user or simply doesn’t work; as noted earlier, following common UX design rules will create a reasonable UX. The optimization phase, however, should take place only once you know the feature is worth further investment, or valuable screen real estate.
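To make the Process Flow idea concrete, here is a small sketch of connecting the dots from raw click events: given per-user step events of the kind Analytics can export, tally how many distinct users reach each step of a flow and see where they drop off. The step names and sample data are entirely hypothetical.

```javascript
// Sketch of a Process Flow analysis: count how many distinct users
// reach each step of a flow, revealing where users drop off.
// Steps and sample events are hypothetical.

const steps = ['open-signup', 'enter-email', 'confirm'];

// Each record: which user fired which step event.
const events = [
  { user: 'a', step: 'open-signup' },
  { user: 'a', step: 'enter-email' },
  { user: 'b', step: 'open-signup' },
  { user: 'a', step: 'confirm' },
  { user: 'c', step: 'open-signup' },
  { user: 'c', step: 'enter-email' },
];

// For each step, in flow order, count the distinct users who reached it.
function funnel(steps, events) {
  return steps.map(step => ({
    step,
    users: new Set(
      events.filter(e => e.step === step).map(e => e.user)
    ).size,
  }));
}

const result = funnel(steps, events);
// Here 3 users open signup, 2 enter an email, and only 1 confirms,
// pointing at the email step as the sticking point to investigate.
```

The same tally, run on real event exports, tells you which step of an implementation loses users, which is exactly the question Process Flow measurement is meant to answer.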
With these guidelines in mind, we’ll demonstrate when, where, and how to measure each of these categories using Analytics in Part II of this article.