Of course, the table is not “done” yet. After the release on the dev environment, I took charge of rolling out a round of usability tests with our development partners. The goal was to learn whether the interactions were comprehensible to our users.
We focused on the interactions available in the MVP that were vital to fulfilling the requirements we defined at the beginning of this process:
setting up a time and data type dimension,
data input,
composing formulas.
Moderated tests
Before conducting the moderated tests, we used the opportunity to get to know each customer by gathering information about their business and the pain points they were facing in their financial operations. This not only helped us get to know our users, it also gave us a reference database in Dovetail for later.
We started by giving our users a rough overview of the app to set the context for the tests we were about to conduct. Then we presented them with the empty table and gathered their first impressions of the screen.
Encouraging them to describe what they were doing in their own words and following up on their behavior, we then gave them tasks to perform on the table.
Synthesis
After each session, I uploaded the recordings to Dovetail to auto-transcribe and tag the content. That meant going through the sessions and marking people’s remarks and behaviors related to pain points, financial reporting, usability struggles, and more.
These tags served as a reference later, when I sorted the quotes to understand what people struggled with the most and why.
Tag board in Dovetail
Insights board in Dovetail
What we learned
Gathering user feedback in a structured manner not only enabled us to uncover multiple usability issues, it also gave us an idea of how users perceived our product.
We saw that our users were familiar with the concept of data models, the way data is structured in Pectus. They also understood the connection between data input and data models.
Data selection, however, turned out to be confusing for them. They struggled to recognize where to click to start the input, relied on trial and error to discover how to select specific groups of data, and wanted to insert entire data models onto the canvas.
I like to have all data at once. Then, I can see what I have and adjust from there - rather than selecting data one by one because I think it’s missing part of the puzzle.
- Melissa, CFO
I'm expecting to have some data here that I can drag and drop or have a button that I can click, but I am not sure where I can do that.
- Johannes, Head of Controlling
Iterating the table
To solve these issues, we decided to improve the table in the following ways:
Showing the opened panel by default to prompt users to action
Making the selection panel more accessible by introducing a way to select entire data models at once
Adapting familiar selection patterns by using checkboxes
Distinguishing between data models in the report in order to define the selected metrics
At the time of writing, we haven’t implemented all of these adjustments yet. My plan, however, is to test the table again to validate our solutions.