Turning Frustration into Insight - Redesigning the Monitoring Dashboard
When I joined the Monitoring Dashboard project, the situation was far from ideal. What was meant to be a key operational tool had become a daily source of frustration. After 35 days of research, including in-depth interviews with 9 users across different roles (Advisors and Managers), one number summed it up: 78% of users were dissatisfied with the dashboard’s data reliability. Despite its high adoption, users didn’t trust what they saw, and when data loses credibility, decisions lose direction.
Framework Applied
To uncover what was really happening, I applied the Double Diamond framework, combining qualitative interviews with desk research on dashboard design and observability best practices. Patterns quickly emerged. Users were spending too much effort on simple tasks and contending with inconsistent data, excessive clicks, and context loss between screens. The first page wasn’t even the most relevant view. As one participant put it, “I check three screens just to make sure the numbers are real.”
Before and After - From Frustration to Hope with Recent Improvements
Before
100% of interviewees reported high effort to perform simple tasks.
67% mentioned too many clicks and lack of interactivity in cards and charts.
89% pointed out contextual inconsistencies between screens, making navigation difficult.
78% said the numbers didn’t reflect the actual state of their jobs, causing confusion and rework.
After (with recent improvements)
Personalized filters, real-time log viewing, and cancel buttons were well received by 56% of users.
Perceived usefulness of the dashboard increased, though it remains limited by the interface and data reliability.
How It Was Done: Qualitative Research and Desk Research
The approach followed the Double Diamond methodology, focusing on:
In-depth interviews via Teams with 9 key users.
Desk research on secondary sources regarding best practices for dashboards and observability.
Hypothesis generation and validation based on users’ real pain points.
Key validated hypotheses
The first page of the dashboard is not the most relevant.
The project-based view is more useful than the execution-based view.
Data inconsistency is a critical pain point.
Learnings - What We Discovered and How to Move Forward
Beyond the numbers, the project revealed a bigger lesson: clarity and context are everything. People don’t just want dashboards—they need tools that tell a trustworthy story about their work. By simplifying interactions, refining labels (“in flow” instead of “suspended”), and focusing on the indicators that truly matter, we started to rebuild both usability and confidence.
Key insights
Clarity and context are essential: the dashboard must maintain context between screens and provide consolidated visualizations.
Less is more: reducing clicks and making the dashboard more interactive increases efficiency.
Meaningful indicators: last execution status, real-time execution, failure evolution, and domain history are the most valued.
Interface matters: colors, filters, and labels like “in flow” instead of “suspended” make a big difference in understanding.
Moving Forward
Today, the Monitoring Dashboard is entering its next phase: V2 prototyping, centered on usability, reliability, and user trust. This journey taught me that transforming dissatisfaction into engagement isn’t just about fixing interfaces; it’s about listening deeply, designing transparently, and building confidence one interaction at a time.