The perils of consistency: Why we stick to poor decisions
The day arrived, as it does for most analyses, when we had to reckon with the poor choices we had made in the data and analytics design.
Setting the Stage: Navigating a Complex Data Landscape
Some background first: I embarked on a dashboarding project with a client who wanted to monitor "conformity" issues in the budgeting and execution of their projects. The data, extracted from their project management software, was vast and complex. The analytical requirements were equally challenging: I needed to flag projects that had consistently exceeded budget or timelines over the past three project reviews. Since these reviews occurred at irregular intervals, calculating this metric was already a formidable task. At the outset, the client assured me they would provide a "data model" optimized for analytics.
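To give a sense of why this metric belongs in the data layer rather than a dashboarding tool: because reviews happen at irregular intervals, "the past three reviews" means the last three rows per project ordered by review date, not a fixed time window. A minimal sketch in pandas (all column names and the sample data are hypothetical, not the client's actual schema):

```python
import pandas as pd

# Hypothetical review-level extract: one row per project review.
reviews = pd.DataFrame({
    "project_id": ["A", "A", "A", "B", "B", "B"],
    "review_date": pd.to_datetime([
        "2024-01-05", "2024-03-20", "2024-07-02",  # irregular intervals
        "2024-02-10", "2024-02-28", "2024-09-15",
    ]),
    "over_budget": [True, True, True, False, True, True],
})

# A project is non-conforming if it has at least three reviews and
# its three most recent reviews all exceeded budget.
flags = (
    reviews.sort_values("review_date")
    .groupby("project_id")
    .tail(3)                                  # last 3 reviews per project
    .groupby("project_id")["over_budget"]
    .agg(lambda s: len(s) == 3 and bool(s.all()))
)
print(flags)  # A -> True (3 breaches in a row), B -> False
```

The same "last N rows per group" idea translates to a window function (`ROW_NUMBER() OVER (PARTITION BY project_id ORDER BY review_date DESC)`) if the logic lives in SQL instead.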
When the project officially began, I discovered that the so-called "data model" was little more than raw transactional data extracted from their project management system. The client's data team was too overwhelmed to refine the model, and their infrastructure was already at capacity, unable to accommodate an additional data processing job. Consequently, I was asked to implement all calculations directly in the dashboarding tool. I accepted the challenge—or, in hindsight, took the bait.
Despite the constraints, I built the dashboard, only to find that it took over 30 seconds to load. The client accepted this performance risk, and we moved forward.
A New Request, A Bigger Challenge
Two months later, they requested a macro dashboard, which needed to display the number of non-conforming projects across their entire organizational structure, with drill-down capabilities down to the team level. Additionally, they wanted to highlight entities with zero non-conforming projects—a seemingly simple ask, but a nightmare in data analytics. After all, how do you count something that doesn’t exist?
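The standard answer to "counting something that doesn't exist" is to join the counts back onto the complete entity list, so entities with no matching rows still appear with an explicit zero. That is trivial in the database layer and painful in a front-end tool. A hedged sketch in pandas (entity and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical org structure: every entity, including those with no issues.
entities = pd.DataFrame({"entity": ["Ops", "Finance", "IT", "HR"]})

# Non-conforming projects only; Finance and HR have none, so they
# simply do not appear in this table at all.
issues = pd.DataFrame({
    "entity": ["Ops", "Ops", "IT"],
    "project_id": ["P1", "P2", "P7"],
})

issue_counts = issues.groupby("entity").size().reset_index(name="n_issues")

# The left join keeps every entity; missing counts become NaN,
# which we then make an explicit zero.
counts = entities.merge(issue_counts, on="entity", how="left")
counts["n_issues"] = counts["n_issues"].fillna(0).astype(int)
print(counts)  # Ops: 2, Finance: 0, IT: 1, HR: 0
```

Done at the data layer, the zeros arrive pre-computed and the dashboard only has to display them, instead of faking them with front-end tricks.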
Under pressure, I performed intricate front-end gymnastics to display zeros where there was no data. However, this new dashboard was even slower due to its increased complexity. The client found the performance unacceptable, leading to a meeting to explore solutions. This time, I pushed back, advocating for a shift in data processing to the database layer. The client agreed, and the rest is history.
The Trap of Consistency: Lessons from a Mistaken Approach
Reflecting on this project during my annual review, I felt uneasy. Why had I agreed to implement complex logic in the front-end tool in the first place? The approach was never sustainable: it was a maintenance nightmare, the performance was lackluster, and the system lacked robustness. Any new type of project data would inevitably break the dashboard.
In all honesty, I had accepted the challenge because it felt good. It was satisfying to prove that, despite the limitations, I could deliver a functional dashboard and save the client from their data team’s bottlenecks.
When the second request arrived, I didn't accept out of pride but out of fear. I lacked the courage to admit that agreeing to the first request had been a mistake.
Saying yes to the initial request was an error—I knew that embedding all calculations in the front-end tool would create an unstable and unmaintainable solution. Saying yes to the second request was an extension of that mistake, driven by the need for consistency. This reminded me of Robert Cialdini’s principle of consistency in his book Influence[1], particularly its connection to the classic foot-in-the-door experiment of the 1960s [2]. Once I had committed to the first request, I felt compelled to maintain that commitment, even at the cost of efficiency and best practices.
This realization was profound. We often continue down ineffective paths simply to maintain consistency with past decisions. Beyond data engineering, this principle explains why change is so difficult—we don’t just repeat behaviors out of habit but because we want to appear reliable and steadfast. In that pursuit, we sometimes double down on bad choices rather than admit we were wrong and pivot toward better solutions.
The real lesson? True reliability isn’t about rigid consistency; it’s about recognizing when to adapt. Acknowledging mistakes and changing course isn’t a weakness—it’s a strength that leads to better decisions, better performance, and, ultimately, better outcomes.