7 Rules for Designing Better Analytics Dashboards

Vandelay Design may receive compensation from companies, products, and services covered on our site. For more details, please refer to our Disclosure page.

Let’s try a little experiment: What’s your immediate gut reaction when you read the phrase below?

“Hey, how’s the website doing?”

For many of you, it was probably a groan, accompanied by an internal (or even outwardly visible) eyeroll, right? It may have conjured up images of a near-future spent rooting around in dense analytics reports, searching for the one golden nugget of clear info that could help you answer such a seemingly-simple question.

Let’s face it: it’s 2016. Our data-collection abilities are unprecedented, yet the current generation of analytics tools still makes assessing that data feel like panning for gold.

But there is hope; the design of our analytics tools and dashboards can go a long way toward addressing this issue. The way we present collected data has a huge impact on our ability to draw meaning from it. The more intuitive the presentation is, the easier it is for all of us to make better, faster, and more informed decisions.

I’ve worked on a number of SaaS products, and my most recent project was Filament, a blog analytics tool. In the process, I learned firsthand how well-designed analytics can impact the insights users seek in their dashboards and reports. In this post, I’m going to share seven rules for designing better analytics dashboards that’ll help you guide your users to more useful insights, faster.

Some of these rules are UX-based, while others require data architecture or specific engineering work. All of them pertain to the entire experience users have with their analytics tools, and to improving both user understanding and interface utility. So read on, and learn how to give your users analytics superpowers; they’ll thank you for it.

“Your job is to give your users superpowers” –Kathy Sierra

Answer “How am I doing?” in the first 10 seconds

In Hooked, Nir Eyal’s excellent book on engaging product design, he describes the role of “Triggers”—those internal and external motivators that influence people to seek out and use digital products.

In terms of analytics tools, curiosity is the internal trigger causing your users to log in and look at their data. We all want to know, “How am I doing?” So, a user-oriented analytics tool should prioritize answering that question as quickly as possible, above all else.

Summary dashboards are usually a poor compromise—they provide too much information at once, with too little hierarchy or automated analysis of the data. The result? Hesitation from your users as to where they should dive in. Use techniques like color, typography, and layout to emphasize importance, and draw the user’s eye straight to the thing(s) that need immediate attention.

In the example above, my eyes are immediately drawn to the red and green panels, which quickly convey that Current Wait Time is a problem I need to focus on right now, while Average Handle Time is fine. This simple use of bold, meaningful color tells me something important in a split second; deciphering the same thing from a raw report might have taken close to an hour.

“Designed correctly, a dashboard increases productivity for all users” –Daniel O’Sullivan, InVision Blog

Most analytics tools follow a subscription-based pricing model, meaning they need to bring the heat and deliver consistent, recognizable value to the user (if they want to keep their business, that is). So, ensure you’re earning your keep by making your users smarter with every visit to their dashboard, as quickly as possible.

Don’t make me log in (just to check how I’m doing)

The way we tend to use analytics today is highly reactive. We only log in when it occurs to us to check how we’re performing, but ideally, analytics tools should be more proactive, like alarm clocks. The nice thing about an alarm clock is, you only need to pay attention to it when it has something important to tell you, like, “It’s time to wake up,” or “Your lasagna is burning.”

In much the same way, our analytics tools should really function more as automatic sentinels that constantly watch over our websites, and alert us when something important happens (or changes).

Buffer message

It can be as simple as having your analytics tool send an email to mark a noteworthy milestone, like in the excellent example above from Buffer. Today, we get notifications about everything—that our stocks are up, that a deadline is due, that a meeting got cancelled, or that our kitten picture only got 14 Likes (c’mon, Mr. Bigglesworth is way cuter than that.) So why not get notified when something important is happening on our websites or in our businesses? Push notifications, SMS, chatbot/Slack integration, daily summaries—these are all proactive insights that should be baked into a cohesive user experience for a modern analytics tool.
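To make the “automatic sentinel” idea concrete, here’s a minimal sketch of the kind of check such a tool could run on a schedule (say, a daily cron job). The milestone values, the 50% swing threshold, and the function name are all hypothetical, and whatever notification channel you prefer (email, SMS, Slack) would sit behind the returned messages:

```python
# Illustrative sketch of a scheduled "sentinel" check: compare the latest
# visit count against the previous one and return alert messages worth
# pushing to the user. Thresholds here are placeholders, not prescriptions.

def check_milestones(current_visits, previous_visits,
                     milestones=(1_000, 10_000, 100_000)):
    """Return alert messages for anything noteworthy since the last check."""
    alerts = []
    # Round-number milestones crossed since the previous check.
    for m in milestones:
        if previous_visits < m <= current_visits:
            alerts.append(f"Congrats! Your site just passed {m:,} visits.")
    # Flag unusual swings too, not just round numbers.
    if previous_visits and current_visits >= previous_visits * 1.5:
        alerts.append("Traffic is up more than 50% since the last check.")
    return alerts
```

Run proactively like this, the tool only interrupts the user when there is genuinely something to say, exactly like the alarm clock.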

Don’t “cry wolf” with color

Color-coding is an incredibly useful technique for conveying additional, qualitative information to the user—but only if you’re consistent with its application.

For example, if you’re using red to communicate a problem to your users, don’t then turn around and use that same shade of red in places where there aren’t problems. This is a pretty standard rule of thumb for good web design, but when it comes to designing analytics, it becomes a sacred rule, because inconsistent color usage only muddies the already-murky message your data may be conveying to the user. The financial planning app Mint is an excellent example of consistent color usage: at a glance, you can always tell where you’re over- or under-budget.

Colors in the Mint UI

Think of it as less of a color palette, and more of a color language—once you’ve established the rules and syntax of this language with the user, help them by applying it consistently. This isn’t easy to get right every time, so plan out your color palette. Get your team on the same page, prior to laying out those pretty charts and graphs. You’ll save yourself a ton of rework and user support headaches.
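One way to enforce such a “color language” in practice is to define the semantic palette once, in code, and route every chart and panel through it. The sketch below is a hypothetical example (the hex values, status names, and 10% tolerance are placeholders, not a recommendation):

```python
# Illustrative sketch: define the color "language" once, so every widget
# speaks it consistently. All values below are placeholder assumptions.

STATUS_COLORS = {
    "good":    "#2e7d32",  # green: on or above target
    "warning": "#f9a825",  # amber: slipping, needs attention soon
    "problem": "#c62828",  # red: needs attention now
    "neutral": "#607d8b",  # grey-blue: informational only
}

def color_for(metric_value, target, tolerance=0.10):
    """Map a metric to its semantic status color, the same way app-wide."""
    if metric_value >= target:
        return STATUS_COLORS["good"]
    if metric_value >= target * (1 - tolerance):
        return STATUS_COLORS["warning"]
    return STATUS_COLORS["problem"]
```

Because every panel calls the same function, red can never accidentally appear somewhere that isn’t a problem, which is precisely the “don’t cry wolf” guarantee.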

Labelling is key

Much like color usage, your choice of button labels and iconography can make or break the user’s experience, so choose them wisely—and never use Lorem Ipsum, even during the wireframing/prototyping phase. With analytics, the inappropriate or inaccurate labelling of an important button or report link can leave users with a complete misunderstanding of your product’s capabilities. What a wasted opportunity!

Labelling in Google Analytics
“Huh? All I want to know is how my traffic is doing…”

To take a ubiquitous example from Google Analytics, users expect their traffic reports to live under the Audience category, rather than Acquisition. Similarly, who would travel to the Behavior menu to investigate specific pieces of Site Content? Unintuitive labelling like this causes users to miss potentially important insights.

Thorough user testing and research is usually necessary to get this right, as it’ll help you identify the keywords and naming conventions your users expect.

Remember, Data != Insights

“Insights” is the new catchphrase among analytics tools, but many of them are really just showing raw data. This is a problem because data by itself lacks meaningful context—therefore, these tools force users to extract useful information, rather than offering it right away.

A couple of useful rules of thumb, when it comes to displaying insights versus metrics:

You need at least two data points to derive an insight—ideally more

“You had 3,000 visitors yesterday”

Simply saying this to your user doesn’t tell them much, unless they already know that their average is 1,000 visitors. To make this data useful, you must pair it with another piece of data to provide context:

“You had 3,000 visitors yesterday, which is 38% higher than normal”

Hmm, getting better… but this insight is pretty basic, and still doesn’t give me a clue of where to begin investigating exactly why my traffic is so high. What if we added a third data point?

“You had 3,000 visitors yesterday, which is 38% higher than normal, mainly due to an increase in visitors from Facebook”

BAM, now that’s an insight I can really use! It tells me how I performed, how normal/abnormal this occurrence was, and clearly identifies the source of this significant event. In this way, the additional pieces of data provide meaningful context for my core information, making it a much stronger, more actionable insight overall.
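The layering above is mechanical enough to sketch in code. Here, as a hypothetical example, a baseline of 2,174 average daily visitors makes yesterday’s 3,000 about 38% above normal, and the function (names and inputs are my own assumptions, not any real product’s API) composes the insight sentence one data point at a time:

```python
# Illustrative sketch: turn raw data points into an insight sentence by
# layering context. Inputs are assumed to come from your analytics backend.

def build_insight(visitors, baseline, top_source=None):
    """Compose an insight from up to three data points."""
    pct = round((visitors - baseline) / baseline * 100)
    direction = "higher" if pct >= 0 else "lower"
    # Data point 1 (the raw count) + data point 2 (the baseline).
    insight = (f"You had {visitors:,} visitors yesterday, "
               f"which is {abs(pct)}% {direction} than normal")
    # Data point 3 (the driving source), when available.
    if top_source:
        insight += f", mainly due to an increase in visitors from {top_source}"
    return insight
```

With one data point the function can’t even run; with two it can say “higher than normal”; only with the third does it tell the user where to start investigating.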

Use rates & ratios for comparisons, instead of raw numbers

Let’s say you’re trying to compare the performance of two different blog posts—one published 5 days ago, and the other 3 months ago. Simply looking at the number of visitors or social shares will likely tell you that the older post has way more visits and shares than the newer one—but we’d expect this, right? After all, the older post has had way more time to accumulate traffic and shares—so how can we objectively compare the performance of the two?

By calculating the ratio of shares to visits (i.e. # social shares divided by # of visits), we can objectively measure the shareability of each post. We can see how likely it is for one post to be shared over another—regardless of the individual numbers involved.
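A quick worked example with made-up numbers shows why the ratio flips the comparison. The raw totals favor the three-month-old post, but the rate reveals the newer post is actually three times more shareable:

```python
# Illustrative sketch: compare posts by rate (shares per visit) rather
# than raw totals, so posts of very different ages compare fairly.
# The visit/share counts below are invented for the example.

def shareability(shares, visits):
    """Shares per visit: an age-independent measure of shareability."""
    return shares / visits if visits else 0.0

old_post = {"visits": 50_000, "shares": 500}  # published 3 months ago
new_post = {"visits": 4_000, "shares": 120}   # published 5 days ago

# Raw totals say the old post "wins" (500 shares vs. 120), but the
# rates are 0.01 (1%) vs. 0.03 (3%): the new post is far more shareable.
```

The same normalization works for any accumulating metric: conversions per session, comments per reader, and so on.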

Numbers are for data. Text is for storytelling.

Articulate the really useful info as text, not numbers.

Why? Words have qualitative meaning and connotation baked into them, which makes them better suited to telling an informative story about the data, which the user can then choose to act on or ignore.

But when presented with performance numbers, the first question a user typically asks is, “So, is that good? Or bad?” The numbers are data—just there for scale—so they need additional information in order to convey meaning. With more information, your numbers can tell a coherent story about what’s happening.

We’re beginning to see a design shift in the way we present analytics reports to users. We’re seeing a greater use of descriptive text, used to convey meaning from the data. (A couple of great examples are ThinkUp (sadly RIP) and more recently, GrowthBot—an analytics chatbot.)

ThinkUp

The (now-closed) ThinkUp was revolutionary in this regard: rather than relying on the standard numbers/charts/graphs formula, it presented insights drawn from users’ social media activity as clear, concise, qualitative sentences. Social media is, after all, about storytelling, so interpreting performance in meaningful prose just made sense.

GrowthBot

GrowthBot goes one step further by letting users request reports and query their analytics data in real time via natural-language commands. Rather than hunting through a (possibly mislabeled) dropdown menu for answers, users simply ask for what they want.

Design for zero: Everyone has to start somewhere

One area of analytics design that often goes neglected is something called Zero State—i.e. what does the dashboard look like when there’s no data to see? This can happen under a number of different circumstances:

  • The user hasn’t yet connected an external source of data
  • The tool is importing/syncing data with an external source, and needs a little time
  • A previously tracked activity drops to zero
  • No data has been tracked yet
  • A previously-linked source of data has lost its connection

Each of these is a different scenario, and each should be accommodated separately in your dashboard design. Some can be addressed with appropriate error messaging, but others present opportunities to educate users: showing them how to make the most of the product, or prompting other constructive action (inviting a friend, for example), while they wait for the tool to become ready. Find something useful to fill that empty air, and you’ll be surprised how many of your users choose to keep themselves busy.

“There’s no data here, and whatever, I don’t care.” *shrug*

Whatever you decide, don’t just show an empty chart area. It’s the de-motivational UI equivalent of a dismissive shrug, and it assures a quick death to any engagement you might have hoped to get from your new users.

Summary: future-proof your analytics design

Now, more than ever, we demand that our tools be both easy to use and time-efficient. By these standards, analytics tools have fallen behind the curve.

Good design can change this, however, and the potential impact it could have—on our understanding of data, and on the quality of our subsequent decisions—is vast.

I’ve shared the new rules of analytics design above, but applying them isn’t easy: it takes research, thorough user testing, and, at times, highly technical engineering. The result, however, is a more intuitive display and a faster, deeper interpretation of the data. And that can only be a good thing for relieving the groans and eyerolls the next time someone asks, “Hey, how’s the website doing?”
