A Preattentive Dashboard
The visual world is extraordinarily complex. For example, a quick scan of my desk reveals hand-written notes, dry erase markers, and a USB thumb drive. While I recognize these objects rapidly, I experience them at a basic visual perceptual level long before I can label or describe them. This low level of perception is called preattentive processing: visual processing that occurs without deliberate attention. Preattentive processing can be used to create dashboards that communicate an extraordinary amount of information per pixel and require very little effort to understand. Download our eBook, “Dashboards: Take a closer look at your data”.
Characteristics such as shape, size, color, contrast, luminosity and motion are examples of features that are perceived at this low level of perception. These factors are referred to as preattentive visual cues and help our brains categorize and filter our visual environment. Simply put, preattentive features are the information we gain from a visual scene before we direct attention to salient features to extract deeper meaning.
Humans are very good at extracting meaning from complex visual environments. However, this does not mean we should have to. This is especially true when designing dashboards and data visualizations. To keep things simple, CAN designs visualizations that focus on preattentive imagery, which allows us to communicate complex information rapidly and concisely. Our lives are complex enough. We deserve simple dashboards.
CAN recently completed a project for one of our clients examining the accuracy of industry-level forecasts for every Metropolitan Statistical Area (MSA) in the United States. The report contained over 600 pages! Six-hundred-page reports do not get read, and consequently are rarely of value. A 600-page report runs counter to our goal of making complex information easy to understand and act upon, so we decided to go back to the whiteboard.
We needed a way for our client to explore and understand the meaning of our complex analysis. The results of our research are meaningless if they are not implemented. We started our design process by defining the business question our client needed to answer: “Which forecasts are inaccurate, and why?” Our client needed to navigate forecast accuracies by geography, industry sector, and the duration for which the forecasts remain accurate. The dashboard we developed presents a 600-page report on one screen and can be fully navigated with three clicks.
Users explore the data by selecting areas on the map, Concepts, or MSAs, individually or in groups. This action updates the State, MSAs, forecast accuracy durations, and industry sectors for the selected region.
For our client, forecasts with greater than 90% accuracy are deemed acceptable, while forecasts with 75 to 80% accuracy need closer examination. We built these tolerances into our design. Notice the positioning of the grey crosses in each pane. The thin pink line marks 90% accuracy, while the pink band marks 75-80% accuracy. As users explore the dataset, this relationship allows them to quickly identify and focus on values that fall below the desired range. Glancing at the MSA window, it is clear that forecasts for Yuba City and Merced are suspect, and MSAs like Modesto should be examined more closely.
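For readers curious how such tolerance bands translate into logic, here is a minimal sketch in Python. The 90% and 75-80% thresholds come from the discussion above; the MSA names and accuracy figures are illustrative, not the client's actual data, and the label for the unspecified 80-90% band is our own assumption.

```python
# Illustrative sketch of the accuracy tolerances described above.
# The accuracy values below are made up for demonstration purposes.

ACCEPTABLE = 0.90                      # at or above: deemed acceptable
REVIEW_LOW, REVIEW_HIGH = 0.75, 0.80   # band warranting closer examination

def classify(accuracy):
    """Bucket a forecast accuracy the way the dashboard's reference lines do."""
    if accuracy >= ACCEPTABLE:
        return "acceptable"
    if accuracy > REVIEW_HIGH:
        return "below target"          # 80-90% band: not specified in the article
    if accuracy >= REVIEW_LOW:
        return "examine closely"
    return "suspect"

# Hypothetical accuracies mirroring the MSAs called out above.
msa_accuracy = {"Yuba City": 0.62, "Merced": 0.68, "Modesto": 0.78}
for msa, acc in msa_accuracy.items():
    print(f"{msa}: {classify(acc)}")
```

With these made-up figures, Yuba City and Merced land in the "suspect" bucket and Modesto in the "examine closely" band, matching the reading of the MSA window described above.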
Let’s take a closer look.
We’re looking at an overview of all Californian MSAs across several industry Concepts and the Duration of Forecast. It’s immediately clear that the accuracy for Concept #4 is ‘in the red’. At this point, end users who are experts in the data can ask what is going on in Concept #4 and discuss how its accuracy impacts future planning.
When elements on a page are judged against a similar standard, it is useful to maintain consistent visualization techniques. For example, we kept the theme of the reference lines constant across the Concepts and Duration of Forecast windows. This reduces the effort required to use the dashboard and frees up cognitive bandwidth to focus on the meaning of the data.
To visualize the Duration of Forecast, we carried over the reference line theme used in the other panes. The purpose of this window is to let users answer a simple question: for the region or categories they have selected, how accurate are the forecasts X quarters out? All the user needs to do is watch for where the grey line crosses the pink lines. From this simple graphic, users know they can expect this combination of forecasts to be 90% accurate up to 14 quarters out, and that after 18 quarters the usefulness of the forecasts dissolves.
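The "watch where the grey line crosses the pink line" rule can be sketched in a few lines of Python. This is a rough illustration under assumptions: the function name is ours, and the accuracy-by-quarter series is made up to loosely mirror the 14-quarter example above.

```python
# Illustrative sketch: find how many quarters out the forecasts stay at or
# above the 90% accuracy reference line. The series below is invented data.

THRESHOLD = 0.90

def quarters_within_tolerance(accuracy_by_quarter, threshold=THRESHOLD):
    """Return the last quarter (1-indexed) before accuracy first dips below threshold."""
    for quarter, accuracy in enumerate(accuracy_by_quarter, start=1):
        if accuracy < threshold:
            return quarter - 1
    return len(accuracy_by_quarter)

# A gently declining accuracy curve, one value per quarter.
series = [0.98, 0.97, 0.96, 0.95, 0.95, 0.94, 0.93, 0.93,
          0.92, 0.92, 0.91, 0.91, 0.90, 0.90, 0.88, 0.86]
print(quarters_within_tolerance(series))  # prints 14
```

This is the same judgment a user makes visually in a fraction of a second: the grey accuracy line stays above the pink 90% line through quarter 14, then drops below it.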
This approach holds true to CAN’s goal of helping businesses work smarter. We turned a 600-page report into a single page that can be navigated with three clicks. Rather than add complexity, we built simplicity.