May 04, 2012

User Experience Heuristics - Practical Approaches #UX

This isn't so much a critique as a 'compare and contrast'. The real critical review here was done by Abby Covert and her colleagues at The Understanding Group. I stumbled across her premise, methodology and results only recently. But, it did get me thinking.

First, some background. I often employ 'expert heuristic reviews' when I'm doing website competitive audits. The term heuristic has sometimes gotten me in trouble with account folks and clients. As soon as you say 'rule of thumb' or 'common sense', they get all uppity - they want hard numbers. All fine and dandy, but that's what A/B testing and user path analytics are for. There is something to be said for having someone with experience and practice 'eyeball' something. But, I digress.

When I'm doing these competitive audits, I typically use a couple of different types of analysis, both based on heuristic principles.

The Forrester Website Review Scorecard: 


This one is dead easy. Too simple, in fact. For those not familiar:
  • It's a questionnaire based on historical Forrester research
  • The reviewer answers 25 questions in four categories: Value, Navigation, Presentation, and Trust
  • Each question scores from 2 (ideal) down to -2 (disaster). (The worst aggregate I've seen/done is a -37)
  • The reviewer ends up with an aggregate score (the arithmetic is sketched below), but can also look to specific questions to identify pain points
  • Because the questions are already defined, the level of expertise required is minimal.

Forrester Metrics
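If you want to see the arithmetic, here's a minimal sketch in Python. The four category names come from the scorecard itself; the individual answers are invented stand-ins, since the actual 25 questions are Forrester's.

```python
# Minimal sketch of the Forrester-style scorecard arithmetic.
# Category names are from the scorecard; the answers are invented
# stand-ins on the -2..2 scale (25 questions in total).
scores = {
    "Value":        [2, 1, -1, 0, 2, 1],
    "Navigation":   [1, -2, 0, 1, 2, -1],
    "Presentation": [0, 1, 1, -1, 2, 0],
    "Trust":        [-2, -1, 0, 1, 1, 2, 1],
}

aggregate = sum(sum(answers) for answers in scores.values())
print(f"Aggregate score: {aggregate}")  # possible range: -50 to +50

# Per-category subtotals are what point you at the pain spots.
for category, answers in scores.items():
    print(f"{category:>12}: {sum(answers):+d}")
```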


Some thoughts:
  • The scorecard is good for doing quick comparisons of peer websites
  • The criteria can be used on their own as a way to do heuristic reviews. I recently used them for another project
  • The methodology is heavily focused on websites, transactional in particular
  • It doesn't take into account how easy the site is to find, nor does it have any questions related to collaboration (community tools and Social Web)
  • Its focus on 'Trust' is skewed towards ecommerce sites
  • It doesn't take into account the larger digital marketing ecosystem, which is why Forrester has supplemented this scorecard with others that include email and cross-channel customer experience


The 'Immersibility' Index:

This is a hybrid of my own design that has evolved over time. It started with a customer self-service 4C framework: Content, Capability, Community, and Commerce. But it was missing 'ease of use'. If you take the Wayback Machine to 2003, Agency.com (along with the Nielsen Norman Group) came up with an index that measured "the depth of the relationship a site creates with its users and that immerses the user in the experience". So, I added the idea of immersibility, with questions around heuristics, status & visibility and wayfinding. Then I realized that having a nice website is great, but the Field of Dreams paradigm ('build it and they will come') doesn't actually hold. So, I added a sixth axis called 'Findability', which relates to how easy it is to find the site in the first place. Finally, with the rise of mobile and tablet platforms as well as the recent focus on pan-marketing channels (customer service, IVR, loyalty programs, email, Social), I added a seventh: Cross-Channel Customer Experience. Brief definitions of the axes are:
  1. Findability: How easy is it to find the site? This includes a look at SEO and SEM efforts
  2. Immersibility: How easy is it for the user to immerse themselves in the site? Includes navigation strategies, design, wayfinding, status & visibility in forms and interactive elements. 
  3. Content: Depth, breadth, recency and relevancy of content. Also, is it chunkable, contextual and readable?
  4. Capability: What tools are available on the site? For a bank site, are there mortgage calculators? For a car site, are there shopping and pricing tools?
  5. Community: Can users interact with the organization and with each other? This has evolved from 'ask the expert', comments and discussion forums into Social Web integration.
  6. Commerce/Conversion: Originally focused on how easy it is to buy something and get out, it's now more about the ease with which users can accomplish their goals.
  7. Cross-Channel Customer Experience: How well does the site perform on other platforms (mobile and tablet)? Is the site part of a larger engagement strategy that includes customer service channels? This is an evaluation of the overall infrastructure and integrated customer service model.


Notes:
  • The nice thing about this index is that it has some flexibility - you can focus on the core 5 (Immersibility, Content, Capability, Community, and Conversion) or expand it to 7
  • It lends itself well to being represented visually in a spider diagram, creating impact (and making it look fancier than it really is) - a rough sketch follows below
  • It can be loose or well-defined. A quick 'back of the napkin' approach works, but I've also set up specific questions for each axis so that you end up with a score of 0-10 for each.

OOO. Fancy
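For the curious, here's roughly how that spider diagram gets built. This is a quick matplotlib sketch, not production code: the axis labels are the seven from the list above, and both sets of scores are made up for illustration.

```python
# Rough sketch of the spider (radar) diagram for the index.
# Axis names are the seven from the post; both score sets are invented.
import matplotlib.pyplot as plt
import numpy as np

axes = ["Findability", "Immersibility", "Content", "Capability",
        "Community", "Conversion", "Cross-Channel"]
site_a = [7, 6, 8, 5, 4, 7, 3]   # hypothetical client site
site_b = [4, 8, 6, 7, 6, 5, 5]   # hypothetical competitor

angles = np.linspace(0, 2 * np.pi, len(axes), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for scores, label in [(site_a, "Client"), (site_b, "Competitor")]:
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.15)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(axes)
ax.set_ylim(0, 10)        # each axis is scored 0-10
ax.legend(loc="upper right")
plt.show()
```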


So, I was pretty happy with how this was working for me. Then, I came across Abby's analysis.

  • They ran a meta-analysis of 5 different sources of Information Architecture heuristics
  • They identified that the majority of the sources were:
    • In many cases 5+ years old, meaning mobile, social and cross-channel were not considered
    • Written for practitioners and students, not clients and partners
    • Presented in a format that was neither teachable nor evaluative
  • They conducted a card sort of 50 statements and winnowed them down into 10 distinct categories
  • They wrote statements and evaluation questions for each

Methodology


Heuristic Principles

Very cool. So cool, in fact, that I wanted to see how my Immersibility index stacked up against it. Although Abby's work focuses primarily on Information Architecture heuristics, I thought we were still comparing red apples to green apples. I did a quick sparkline diagram for all three evaluation tools.





Thoughts:
  • When you compare the Forrester attributes to those in the Immersibility index, you can see the gaps. Namely, nothing to do with Findability, Community (read Social) or Cross-Channel Customer Experience
  • Mapping *my* index against the TUG index, the first thing that is apparent is that I have too much invested in the Immersibility axis. Put another way, the TUG index has 6 categories that describe what I've rolled into 1. So, theirs is much more specific
  • TUG's definition of 'Findability' differs from mine - theirs focuses on internal findability (finding things within the site) rather than on whether users can find the site in the first place
  • I looked, but I couldn't find anything in the TUG matrix that focused on Community elements and Social


My conclusions, albeit definitely debatable, are:


  • The TUG matrix is much more specific when it comes to usability and information architecture heuristic principles, which of course makes sense given its focus
  • I would argue however that the Immersibility index is more holistic and inclusive of search, social and cross-channel
  • Both are a damn sight better than using the Forrester model
Regardless, I can see myself using this matrix moving forward. Because it has 10 characteristics, any spider-style visual representation would be unwieldy, but I can see a scenario with Harvey Balls working very well (a quick rendering sketch follows).

UI Heuristic Grid with Harvey Balls
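As a quick proof of concept, here's a Python sketch of the Harvey Ball rendering. The ten heuristic names are TUG's categories as I remember them, and the scores are invented for illustration.

```python
# Rough sketch: render 0-10 heuristic scores as Harvey Balls.
# Heuristic names are TUG's categories as I recall them; scores
# are invented placeholders.
BALLS = "○◔◑◕●"  # empty, quarter, half, three-quarter, full

def harvey(score, max_score=10):
    """Map a 0..max_score value onto one of five Harvey Ball glyphs."""
    return BALLS[round(score / max_score * (len(BALLS) - 1))]

ratings = {
    "Findable": 8, "Accessible": 6, "Clear": 9, "Communicative": 4,
    "Useful": 7, "Credible": 5, "Controllable": 6, "Valuable": 8,
    "Learnable": 3, "Delightful": 5,
}

for heuristic, score in ratings.items():
    print(f"{heuristic:>14}  {harvey(score)}  ({score}/10)")
```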

Congrats to Abby et al. for putting this together. It would be nice to see it get some traction.

