Perception of Performance (PoP) Assessment
Assess users' perceived performance of software workflows.
Benchmark perceived performance against actual performance to help the business identify problematic areas to prioritize.
Background
The MATLAB software is going through an infrastructure change -- we are refactoring the back-end architecture and updating our internal libraries. This is a business decision that affects the entire product, including improving the overall UI look and feel, separating the product's front end from its back end, and more.
A known issue with this change is an overall regression in software performance. In other words, the time between a user performing an action and the system responding will increase, and we measure this by calculating the percentage change in response time between the old system and the new one.
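As a minimal illustration (the function and numbers below are hypothetical, not the actual instrumentation), the regression for a single interaction is the percentage change in its response time:

```python
def regression_percent(old_time, new_time):
    """Percentage change in response time between the old and new systems."""
    return (new_time - old_time) / old_time * 100

# Hypothetical numbers: an interaction that took 2.0 s now takes 2.5 s.
print(regression_percent(2.0, 2.5))  # 25.0 -> a 25% regression
```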
Business goal: to close the performance gap between the old and new systems and improve overall performance.
UX goal: to support this effort and help the team make key decisions.
Challenges
Problem 1: A small relative change in performance doesn't mean the performance is good.
Example: the loading time increases from 30 seconds to 35 seconds.
Even though the change is only ~17%, waiting more than 30 seconds for something is still a long time.
Problem 2: Over 80% of all interactions perform worse than before.
We have limited resources. Given all of these areas that need performance improvement, we need to figure out which ones are the most important to focus on and which can be deferred. We also need to decide whether we are still confident shipping the new software if some areas are left out.
Problem 3: A regression in performance doesn't mean a regression in the "perception of performance".
Example: the time it takes for a page to load for the first time increases from 5s to 10s, versus increasing from 0.2s to 0.4s. Both are 100% increases, but the former feels significantly slower, whereas the latter is barely noticeable.
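To make this concrete, here is a minimal sketch using the hypothetical numbers above: the relative regression is identical in both cases, but the absolute added wait differs by a factor of 25.

```python
# Hypothetical numbers from the example above: both load times double.
cases = {"first page load": (5.0, 10.0), "small interaction": (0.2, 0.4)}

for name, (old, new) in cases.items():
    relative = (new - old) / old * 100  # 100% in both cases
    absolute = new - old                # 5.0 s vs 0.2 s of added wait
    print(f"{name}: +{relative:.0f}% relative, +{absolute:.1f} s absolute")
```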
Problem 4: It's hard to figure out how users feel about the performance of individual micro-interactions.
When assessing performance, we measure each interaction separately, such as zooming in on a visualization or dragging a component onto the canvas. In reality, however, nobody uses MATLAB as a set of isolated interactions; real work strings them together into larger workflows.
Solution -- UX Research Study
New "Perception of Performance" (PoP) Method Design
A combination of an unmoderated diary study, guided workflow analysis, and a survey.
As a UX Researcher supporting the MATLAB Graphics and App Building area, I decided that we needed a study assessing users' perception of the performance changes for all of the workflows we identified.
Two key points:
- We group each "micro-interaction" into "macro-workflows" that represent a more complete and realistic user workflow.
- We ask participants to run each "macro-workflow" twice (a sketch of how the resulting ratings could be analyzed follows this list):
  - The first time, they run through it like a diary study, writing down any feedback they have on the software.
  - The second time, they run through it one step at a time and give ratings specifically on performance.
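Below is a minimal sketch of how these two passes could feed the later analysis, assuming hypothetical step names, measured regressions, and a 1-5 perception rating (1 = felt much slower, 5 = felt faster); the actual instrument and analysis in the study may differ. The idea is to benchmark perceived performance against measured performance so that steps that both regressed heavily and felt slow surface first for prioritization.

```python
# Hypothetical step-level data for one "macro-workflow" (names, numbers, and
# the rating scale are illustrative, not the actual study instrument).
steps = [
    # (step name, measured regression %, mean perception rating 1-5)
    ("open app",                 40.0, 2.1),
    ("drag component to canvas", 25.0, 4.3),
    ("zoom in on visualization", 60.0, 1.8),
]

# Rank steps so that the worst-perceived ones come first, breaking ties by
# the size of the measured regression.
prioritized = sorted(steps, key=lambda s: (s[2], -s[1]))

for name, regression, rating in prioritized:
    print(f"{name}: +{regression:.0f}% response time, perception rating {rating}")
```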
Process
01. Plan the research with cross-functional teams
02. Finalize study details and manage project
03. Recruit participants
04. Run the assessment
05. Analyze data and report out