
Not Everything That Can Be Counted Counts


Why Many Web Reports Fail to Deliver Insight

When Adam Hodge, the head of National Marketing and Communications at the Australian Red Cross Blood Service, read my column “Let There Be Light,” he called it entertaining and creative but pointed out that it offered no real guidance on tackling the flood of website reports that lands on marketing desks. Adam is not alone. In my experience, most organizations churn out dozens of analytics dashboards each month, only a fraction of which actually drive decision making. The core issue isn’t that data is missing; it’s that the data that arrives is rarely tied to a clear business objective.

Adam’s own words capture the dilemma. “We provide a myriad of web effectiveness reports to our board, and I am sure that 90% of them are not really that valuable. However, knowing which 10% to keep and what to discard is the hurdle I face.” This statement rings true for any organization that has invested in analytics tools without a clear framework. The temptation to measure everything is strong; every click, every scroll, every page view seems like a treasure worth capturing. Yet, when the reports pile up, executives and marketers lose sight of the single question that should guide every metric: does this measurement help us achieve a specific goal?

Consultants often find themselves in the same position: we are hired to cut through the noise, but the first obstacle is that the clients already have a large volume of reports that they consider “important.” The reports are polished, colorful, and easy to read, but they fail to answer the question of whether a campaign is working or whether a user journey is optimized. In the Red Cross case, Adam’s team receives weekly dashboards filled with traffic numbers, bounce rates, and conversion metrics that, while accurate, do not reveal whether the underlying objectives - such as recruiting volunteers or boosting donations - are progressing.

The problem of over‑reporting is rooted in a misunderstanding of what metrics truly matter. In the early days of web analytics, the focus was on technical performance: server uptime, page load times, and the number of visits. Those were the metrics that could be measured easily and that stakeholders could understand. As tools evolved, the range of possible measurements exploded - from session duration to click‑through rates, from form abandonment to exit pages. Every new metric promised insight, and every department claimed it needed a dashboard that included it. The result was a sprawling landscape of reports that looked impressive but were difficult to interpret. Decision makers began to skim dashboards for anomalies rather than patterns, and the value of the data dwindled.

Adam’s frustration is understandable. He mentions that external consultants have offered “different and seemingly logical takes on the issue,” which has only added to the confusion. When each consultant presents a unique set of metrics, the organization’s leadership is left with a collection of reports that serve no common purpose. In practice, this leads to “management over‑reporting” - executives who receive the same set of data week after week, yet are unable to extract actionable insights. The cycle continues: more reports are generated, more confusion follows, and the core objectives are forgotten.

One effective strategy to break this cycle is to start with a clear definition of success. Success can be expressed in many ways: increased sales, higher engagement, or improved brand perception. What matters is that each metric ties back to one or more of those outcomes. For instance, if the Red Cross goal is to increase donations, the relevant metric might be the average donation amount per visitor, not just the total number of visits. If the goal is to recruit volunteers, the number of completed volunteer applications per month is more telling than the number of page views on a volunteer recruitment page.

To make the case for focused reporting, it helps to look at the experience of companies that have successfully reduced their data overload. In Mark Graham Brown’s book “Keeping Score,” a head of an over‑burdened financial reporting department stops all report delivery for a week. The result: a handful of phone calls from managers who need critical updates. That experiment illustrates a key point - less can be more. By eliminating unnecessary reports, the remaining data gains weight and relevance. Employees return to the dashboards with a clearer sense of purpose, and decision making becomes faster and more accurate.

For the Red Cross and other organizations like it, the path forward involves a careful audit of current reports. Each metric should be evaluated against its contribution to a defined goal. If a metric does not move the needle on any objective, it is a candidate for removal. This process requires dialogue between the analytics team, marketing, and senior leadership. By aligning every data point with an outcome, the organization moves from a state of “data overload” to one of “actionable insight.”
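The audit described above can be sketched in a few lines: list each metric alongside the objective(s) it supports, then flag any metric with no objective as a candidate for removal. The metric and objective names below are hypothetical, chosen only to illustrate the mechanics.

```python
# A minimal sketch of a report audit: map each metric to the business
# objectives it supports, and flag unmapped metrics for removal.
# Metric and objective names are illustrative, not Red Cross data.

metric_objectives = {
    "donation_conversion_rate": ["increase donations"],
    "avg_donation_amount": ["increase donations"],
    "volunteer_applications": ["recruit volunteers"],
    "total_page_views": [],   # measured, but tied to no objective
    "bounce_rate": [],        # accurate, polished, and not actionable
}

def audit(metrics: dict) -> tuple:
    """Split metrics into keepers and candidates for removal."""
    keep = [m for m, goals in metrics.items() if goals]
    drop = [m for m, goals in metrics.items() if not goals]
    return keep, drop

keep, drop = audit(metric_objectives)
print("Keep:", keep)
print("Candidates for removal:", drop)
```

The point of the exercise is not the code but the conversation it forces: every empty list has to be defended by someone, or the metric goes.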

In short, the sheer volume of available metrics can be both a blessing and a curse. The key is not to reject new data outright but to ensure that each measurement adds value to a specific, measurable goal. Only then will the reports that surface truly help the organization grow and adapt.

Aligning Metrics with Concrete Objectives

Once an organization recognizes that too many metrics dilute focus, the next step is to determine which data actually matters. This begins with identifying the organization’s primary goals and then tracing the chain of metrics that influence those goals. A good rule of thumb is to ask: “What would the board consider a success, and what would each department need to know to reach that success?”

Take the Australian Red Cross as a case in point. Their website serves several purposes: raising brand awareness, providing emergency information, recruiting volunteers, and soliciting donations. Each of these goals requires a different set of metrics. For brand image, one might look at reach and sentiment on social media channels, and for emergency information, the focus could be on the speed with which users find the help page. Volunteer recruitment hinges on the number of completed volunteer applications, while donation campaigns rely on the number of donors and the average contribution per donor.

Instead of a blanket approach, the analytics team should create separate dashboards for each objective. The donation dashboard might feature metrics such as: donation conversion rate, average donation amount, donor acquisition cost, and donor lifetime value. The volunteer dashboard could track: volunteer application completion rate, volunteer retention rate, and the number of volunteer sign‑ups per campaign. These dashboards would be the primary tools used by the teams that need them, rather than a single, all‑inclusive report that everyone receives.

When setting up these dashboards, it is essential to ground each metric in a clear definition and a target. For example, the donation conversion rate is the number of visitors who make a donation divided by the total number of visitors. If the goal is to increase this rate by 10% over the next quarter, the dashboard must display that target prominently. Similarly, a volunteer application completion rate should be benchmarked against past performance, and the dashboard should indicate when the rate dips below a critical threshold.
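The conversion-rate calculation and its target check can be sketched as follows; the visitor and donor counts are invented for illustration, not actual Red Cross figures.

```python
# A minimal sketch of the donation-dashboard calculation described above.
# All figures are illustrative.

def donation_conversion_rate(donors: int, visitors: int) -> float:
    """Visitors who donated, divided by total visitors."""
    if visitors == 0:
        return 0.0
    return donors / visitors

current = donation_conversion_rate(donors=420, visitors=21_000)  # 2.00%
baseline = 0.02
target = baseline * 1.10  # a 10% lift over the quarter -> 2.2%

if current < target:
    print(f"Below target: {current:.2%} vs {target:.2%}")
```

Displaying the target next to the current value, as the text suggests, turns the dashboard from a record of what happened into a prompt for what to do next.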

The process of selecting metrics is iterative. Initially, a small set of “must‑have” metrics is chosen, based on the most critical objectives. Over time, the team can add or remove metrics as they discover which data points are most predictive of success. This agile approach keeps the dashboards lean and relevant. It also reduces the burden on analysts, who no longer need to explain dozens of unrelated data points.

Another important consideration is data quality. A metric can’t be useful if it is unreliable or hard to interpret. Therefore, each metric should be accompanied by a brief description of how it is calculated, its data source, and any assumptions involved. Clear documentation turns raw numbers into actionable insights, and it makes it easier for new team members to understand what they are looking at.
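One lightweight way to keep that documentation next to the number itself is a small metric record carrying the definition, data source, and assumptions. The field values below are illustrative placeholders.

```python
# A sketch of per-metric documentation as a structured record, so the
# definition, source, and assumptions travel with the metric itself.
# Field values are illustrative.

from dataclasses import dataclass

@dataclass
class MetricSpec:
    name: str
    definition: str
    source: str
    assumptions: str

donation_rate = MetricSpec(
    name="donation_conversion_rate",
    definition="donors / total visitors, per calendar month",
    source="web analytics export joined with payment records",
    assumptions="repeat visits within one session counted once",
)

print(donation_rate.definition)
```

A new analyst reading the dashboard can then answer “how is this calculated?” without tracking down whoever built the report.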

Beyond the dashboards themselves, the culture of measurement must be fostered. Managers should be trained to read the data and to question whether a metric truly reflects the intended outcome. The board, for example, needs to ask not just “How many visits do we get?” but “How many visits are turning into meaningful actions?” By engaging senior leadership in this dialogue, metrics become a tool for strategy, not just a compliance exercise.

In many organizations, the push to reduce reporting overload begins with a single conversation. A senior executive or a consultant might challenge the current reporting structure: “Which metrics are truly driving our goals?” That question forces a critical look at every line item in the dashboard. The answer is often simple: remove the metrics that do not map to a business outcome, and refine the rest.

For those looking to implement a similar transformation, the work of Jim Sterne offers valuable guidance. With two decades in sales and marketing, Sterne has helped organizations measure the value of their websites as tools for building customer relationships. He writes extensively on using the Internet for marketing and customer service and has authored five books on the subject. For more insights, you can reach him at jsterne@targeting.com.

Ultimately, the goal is not to eliminate data but to ensure that every metric serves a clear purpose. When metrics are aligned with objectives, reports become powerful decision‑making instruments. When they are not, they turn into a costly distraction. By focusing on the 10 % of data that matters, organizations like the Australian Red Cross can move from being overwhelmed by numbers to being driven by insight.
