Heuristic Evaluation & Unmoderated Usability Testing

Overview

I am thrilled to have had the opportunity to return to Cisco as a UX Research intern on the Customer Experience (CX) team. I conducted a heuristic evaluation of Advisories & Cases and piloted an unmoderated usability study on CX Cloud.

Besides working on projects, I also:

  • Networked and connected with researchers across organizations within Cisco, including Customer and Partner Experience (CPX), Security Business Group (TD&R, DUO, CNS), Webex, and Cisco Meraki

  • Shadowed UX Researchers in customer interview sessions and synthesized data for CiscoLive, CX Cloud Insights, and Customer Personas

Role: UX Research Intern

Duration: 12 weeks, June 2022-September 2022

Support from my intern journey: Steve Ricken (Manager), Andrea Lindeman (Co-Manager), Matt Zellmer (Mentor), Marley Pirochta (Cabin Lead), Deborah Sparma (Mentor), Sally Kim (intern), all the other XDi researchers on the team

Heuristic Evaluation

CX Cloud: Advisories & Cases

Foundational Research

I gained a foundational understanding of Advisories & Cases and of heuristic evaluation by reading various articles and holding coffee chats with stakeholders.

  • Communicated with business architects and subject matter experts (SMEs) to develop an understanding of advisories and cases

  • Read various Medium and Nielsen Norman Group articles to gain a foundational understanding of what a heuristic evaluation is, its purpose, and the definition of each usability principle

Methodology

After becoming familiar with Advisories and Cases on CX Cloud and with heuristic evaluation, I started evaluating CX Cloud against the Nielsen Norman Group’s 10 Usability Heuristics.

Environment

Campus Network Success Track

  • Evaluated the most complete and heavily used environment of CX Cloud

Severity Rating

Ranked each issue by severity rating, from

  • 0 (no usability problems) to 4 (usability catastrophe)

Feedback from Stakeholders

Feedback from UX Designers and Content Designers was grouped into three suggestion types:

  • Recommendation

  • Next Step/Coming Soon

  • Technical Constraint

Example of One Finding

Here is an example of how I approached evaluating CX Cloud.

  • Page Name: where the violation/bug was found

  • Usability Ranking: severity rated from 0 (no usability problems) to 4 (usability catastrophe)

  • Heuristic Violation: which heuristic principle was violated

  • Suggestion: recommendation, next step/coming soon, technical constraint

Key Takeaways

Here are the key takeaways from all the findings.

Consistency is Key

Customers shouldn’t be forced to learn something new

(e.g., unfamiliar key terms, inconsistent filters, unclear empty-state indicators)

Useful Warning Messages

Customers can figure out a way to resolve an error

Clear UI

Customers can easily navigate to complete their tasks/find what is needed

The UI easily draws and directs the user’s focus

Unmoderated Usability Testing

CX Cloud: Support Type and Coverage

Background

After customers purchase a product, they want to know the support type and coverage of the product.

  • Problem: Inconsistent terminology (support type and coverage used interchangeably) and a failure to communicate key information, especially what customers are entitled to

  • Solution: Conduct an unmoderated usability test on UserTesting.com, using a preference-testing methodology

  • Impact: Customers have a simplified user experience in viewing and understanding the support types and coverage they own for their product

Research Goals

Here, we describe what we are looking to understand and learn about our users, and how we will use the research once it is finished.

  • Validate whether a new tab or a summary view is the better method to display coverage details

  • Understand whether the terminology presented in our current designs is clear, accurate, and consistent for customers

    • Make sure customers know what they own – coverage, type, status

  • Gain foundational knowledge about:

    • Whether customers check for support type and coverage

    • When and how often they check

    • How important knowing support type and coverage is to them

Methodology

Participants/Screening Criteria

10 Network Engineers:

  • 1 CX Cloud user; 9 who don’t use CX Cloud but have heard of it

UserTesting.com

  • Unmoderated usability testing

  • Preference Testing - summary view vs. new coverage tab

Key Takeaways

Based on our analysis of the customer feedback, we came up with these key takeaways.

  • Customers understood the general definitions of the terms, but they were unable to fully differentiate support type from coverage.

    • Customers either read the information directly off the page, mixed up the terms, got the definitions partially correct, or said outright that they didn’t know.

  • Most customers rarely check the support type and coverage for their product.

    • They check when they need technical support, or when purchasing or renewing a product, to see what they are entitled to and what service and support they have.

  • An equal number of customers preferred the Summary Tab and the Coverage Tab.

    • The Coverage Tab might be the better option, since we know that viewing coverage information isn’t a frequent flow; alternatively, the Summary Tab option keeps only frequently needed information in view.

Foundational Insights

Design Feedback

View Full Presentation Below

Reflection

  1. Own your project: Initially, I thought that as an intern I would only work on projects with low impact on the team. However, I was wrong. I’m thankful to a director of design who shared a valuable piece of advice: “Don’t thank people for listening to your presentation. Instead, continue your strong message and OWN the value you just shared. Be confident that you have just provided a service to us.”

  2. Storytelling drives success: Storytelling helps you advocate for customer solutions and effectively explain the process behind them to large, varied audiences of stakeholders. It draws the audience’s attention and leaves an impactful message.

In-Person Events
