
Servigistics Usability Test
An analysis of a new feature being implemented by the Servigistics team: the Inventory Collaboration page. The project involved collaborating with the Servigistics team and conducting both a heuristic evaluation and a usability evaluation.
My Role
- Team Representative with PTC (4-member team)
Methods
- Experimental Design
- Heuristic Evaluation
- Qualitative Research
- Usability Test Facilitation
- Usability Test Analysis
Time
- January 2021 - May 2021
The Goal
Our goal was to analyze the software for usability issues. We first identified potential issues in our meetings with the Servigistics team and through our heuristic evaluation. We then used a usability evaluation to investigate those problem areas further, gauging how severe each problem was and learning more about why it was a problem for users.
Process
- Meet with the Servigistics team to discuss the product
- Create test materials (scripts, screeners, consent form, etc.)
- Perform a heuristic evaluation to find potential issues
- Design the usability test based on the heuristic evaluation
- Conduct usability testing
- Analyze and present the test data

The Product
Servigistics is a leading service parts optimization software that helps companies optimize their supply chains.
The Focus
Our focus for the usability evaluation was the Inventory Collaboration page. This page is intended to be used by parts managers responsible for inventory at organizations that maintain equipment, such as automotive dealerships and the US military. The feature's primary use is to determine which parts to stock at which location for maximum efficiency and cost-effectiveness.
Heuristic Evaluation
We followed a three-step process for the heuristic evaluation:
- We converged to establish a common vision and understanding of the software.
- We diverged to examine the interface: each of us individually inspected it against the heuristics listed below.
- We reconvened to discuss the violations we found and compiled an evaluation of our combined findings. For each issue, we described the problem, rated its severity, and offered potential solutions.
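The severity-rating step above can be sketched in a few lines. This is an illustrative sketch only: the violations and ratings below are hypothetical, not the study's actual data. A common practice in heuristic evaluation is to have each evaluator independently rate every violation on a 0-4 severity scale and then average the ratings per violation.

```python
# Illustrative sketch -- hypothetical violations and ratings, not the
# study's actual data. Combines evaluators' independent severity
# ratings (0 = not a problem ... 4 = usability catastrophe) by averaging.
from statistics import mean

# Each violation maps to the four evaluators' independent ratings.
ratings = {
    "Closed view is hard to restore": [3, 4, 3, 3],
    "Buttons are small": [2, 2, 3, 2],
}

# Average the evaluators' ratings to get one severity score per issue.
severity = {issue: mean(r) for issue, r in ratings.items()}

# Report issues from most to least severe.
for issue, score in sorted(severity.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {issue}")
```

Averaging ratings like this helps the team prioritize which violations to carry into usability testing.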
The Heuristics
- External consistency: The system uses interactions and design patterns that are consistent with the platform and analogous systems; design patterns conform to their established standards.
- Widgets and labels near targets: Place widgets (controls) adjacent to their respective targets of operation, and labels on, or directly adjacent to, their associated controls.
- Group like widgets/functions: Use the Gestalt principles of proximity, similarity, and closure to group widgets with similar functionality.
- Frequently used functions optimized: The system minimizes the user's cognitive load by keeping only the most salient information and signifiers visible. Frequently used functions are highly visible, while infrequently used functions do not pollute the interface. The workflow is optimized for the most common use cases.
- Speak the user's language: The system "speaks" to the user in their native language, using signifiers that are contextually relevant (e.g., health care iconography in an application used by medical professionals). The system uses terminology familiar to the user rather than technical jargon.
- Perceptibility of feedback: User interaction with the system must result in immediate, perceptible, and interpretable feedback.
- Perceptibility of system state: The user must be able to perceive the state of the system at any given moment. The key question to ask is: "If you were to walk up to the system after having been away from it for an extended period of time, would you be able to properly interpret its current state?"
- Internal consistency: Words, phrases, signifiers, and design patterns are used consistently throughout the system.
- Appropriate selection of design patterns: Are the optimal design patterns used within the system? For example, is progressive disclosure used for long data-entry forms? Is the application using design patterns appropriate for the platform (e.g., mobile vs. desktop)? Are wizards used where appropriate?
- Minimize knowledge in the head: Does the system display the appropriate amount of information to the user? If the application uses a wizard design pattern, is the user forced to remember information from previous steps, or is all the information necessary to complete each step displayed alongside that step?
- User control and freedom: The system should be configurable by the user and should not force users to alter their behavior to adapt to the system. The system should provide accelerators for more experienced users (e.g., keyboard shortcuts and context menus).
- Error prevention: Where appropriate, the system should prevent the user from making errors through the appropriate implementation of constraints. Note: sometimes error recovery is preferred over error prevention, and vice versa.
- Error recovery: Where error prevention is not feasible or desired, the system should provide graceful mechanisms to help the user recover from either system or user errors. It should provide undo and redo capability, and error messages written in terminology the user understands that describe both the problem and the remedial action.
- Novel interactions easily learned and recalled: Novel interactions are easily learned and remembered because they take advantage of natural mappings and external consistency.
- Help & documentation: Help should be easily accessible. It may take the form of printed or electronic documentation, a knowledge base, a wiki, or a live chat system.
The Results
We found that "Internal Consistency" and "Appropriate Selection of Design Patterns" needed attention, as these areas had many violations, while "User Control and Freedom" and "Help and Documentation" were implemented well. Overall, our evaluation found that the software works well and that targeted interface design changes could improve the user experience of the product.
We identified the following violations to investigate further during our usability testing:
- When a view is closed, it is difficult to get it back
- Many buttons are small
- Large amounts of information are presented at once
- There is no general search feature
- The liquid layout may not be intuitive
Usability Test
Testing Environment
- Remote meeting via Zoom
- Remote connection to Servigistics' machine running the software
Data Collection
Quantitative data:
- 5-point Likert ratings for:
  - Ease of use (Very Easy, Easy, Neutral, Difficult, Very Difficult)
  - Satisfaction with the look and feel of the user interface and help features (Very Satisfied, Satisfied, Neither Satisfied nor Dissatisfied, Dissatisfied, Very Dissatisfied)
  - Agreement with statements (Strongly Agree, Somewhat Agree, Neither Agree nor Disagree, Somewhat Disagree, Strongly Disagree)
  - Likelihood of using a system like this in the future (Extremely likely, Somewhat likely, Neither likely nor unlikely, Somewhat unlikely, Extremely unlikely)
- Background questionnaire results, aggregated across participants
Qualitative data:
- Participants' responses to questions
- Participants' utterances, facial expressions, and gestures during the usability test
- Observations made during the usability test
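Aggregating the Likert ratings above can be sketched briefly. This is an illustrative sketch only: the label-to-score mapping and the responses below are hypothetical examples, not the study's actual data. Labels are mapped to scores 1-5 and the median is reported, since the median is generally preferred over the mean for ordinal data.

```python
# Illustrative sketch -- hypothetical mapping and responses, not the
# study's actual data. Maps 5-point Likert labels to scores and
# reports the median rating across participants.
from statistics import median

# Hypothetical label-to-score mapping for the ease-of-use scale.
EASE_SCORES = {
    "Very Difficult": 1,
    "Difficult": 2,
    "Neutral": 3,
    "Easy": 4,
    "Very Easy": 5,
}

def aggregate_likert(responses, scale=EASE_SCORES):
    """Convert label responses to numeric scores and return the median."""
    scores = [scale[label] for label in responses]
    return median(scores)

# Example: five made-up ease-of-use ratings for a single task.
ratings = ["Easy", "Very Easy", "Neutral", "Easy", "Difficult"]
print(aggregate_likert(ratings))
```

The same pattern applies to the satisfaction, agreement, and likelihood scales by swapping in their respective label mappings.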
Study Setup
- Participants were asked to think aloud while completing tasks
- Participants answered open-ended questions and 5-point Likert scales
- Participants completed six tasks, ordered from simplest to most complex
Results
- Most participants indicated that the software was similar to, or simpler than, other inventory systems they had used
- Participants struggled with the liquid layout
- Participants expressed a desire for improved search features
- Participants were overwhelmed by the volume of information on the grid
- After some use, participants understood the terminology on the screen
- All participants suggested adjustments to the view and layout of the software, such as:
  - Increasing icon size
  - Presenting a cleaner layout
Reflection
This project was one of my first usability evaluations completed on behalf of an organization. Performing it not only allowed me to apply skills I had learned in class, but also gave me practice using those skills in a professional environment. I learned more about what types of information are valuable to a company and had the chance to collaborate with various members of a software development team.