
The hypnotic, mechanical whir and click of Bombe machines could be heard at all hours at Bletchley Park, England’s clandestine WWII cryptography headquarters. These engineering marvels cycled through possible Enigma combinations at speeds no human could match. But alongside the mechanical hum of technological advancement, something else, something distinctly human, could be heard: the symphony of pencils scratching across paper and the murmur of ideas being shared.
Code breakers would spend hours hunched over intercepted communications, searching for something Alan Turing’s machines couldn’t decipher: patterns in human behavior. They noticed that German weather reports arrived at precisely the same time every morning. They spotted the rigid formatting of military communications. They recognized that certain standard military phrases appeared in otherwise routine messages. And they saw that some operators became sloppy with communication procedures.
These insights, which led to the breaking of the German Enigma code, weren’t computational insights. They were human observations about human behavior, a kind of nuanced pattern recognition no machine of the era could replicate. But here’s what makes Bletchley Park, an intelligence site that closed in 1987, so relevant today: its code breakers didn’t choose between human intuition and machine power; they combined them strategically. They spotted the behavioral patterns and turned them into educated guesses about what the encrypted text probably said. Then they fed those insights to their computational workhorses, giving the machines exactly the right starting point to crack that day’s code.
The breakthrough wasn’t blindly adding more automation. It was knowing precisely when and where human insight could multiply the power of the automated systems.
Today, we face a similar moment in data analytics. Our computational power now dwarfs anything the Bletchley team could have imagined, but the fundamental challenge remains the same: when should we step back from our automation and get our hands dirty with the data? When does strategic friction make our systems smarter, not slower?
Data teams automate for myriad reasons: speed, accuracy, and scale, to name a few. We automate so that manual processes that once took hours now produce results in minutes. We automate to shore up those pesky, error-prone calculations. We automate to build data pipelines that process terabytes of data without ever breaking a sweat, workloads that would make a human analyst’s brain overheat.
But here’s the catch, the paradox, if you will: the more we automate, the further we drift from our data. What once required intimate knowledge of every quirk and pattern now happens under the hood. We’ve made a trade. We’ve swapped deep familiarity with our data for operational efficiency. And most of the time, that’s exactly the right deal. Until it isn’t.
Over time, we’ve optimized for increased throughput, efficiency, and scale, but have accidentally optimized away from the creativity, insight, and ability to engineer novel solutions that only the human mind can provide.
Where should the human lead, and where should the team rely on automation or AI augmentation? There is no hard-and-fast rule; the decision exists on a spectrum. Below are a few points on that spectrum that can help you evaluate when your team needs a strategic slowdown and when a complete handoff makes sense.
Strategic manual engagement with data creates a feedback loop that makes your entire system smarter over time.
If you manually investigate data anomalies, you’re not just fixing the immediate issue; you’re creating labeled examples and edge-case documentation that become next week’s training data. If you catch an unusual pattern by hand, it shouldn’t stay siloed from innovation; it should become a feature in your team’s next machine learning model. And if you apply business context in your manual data reviews, capture that domain knowledge so it can inform algorithm design.
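As a minimal sketch of what that capture step might look like (the function, fields, and file name here are hypothetical, not a prescribed schema), a reviewer could append each hand-investigated anomaly to a shared log that doubles as labeled training data:

```python
import json
from datetime import datetime, timezone

def log_anomaly_finding(record_id, raw_values, label, notes,
                        path="anomaly_findings.jsonl"):
    """Append one manually reviewed anomaly as a labeled example."""
    finding = {
        "record_id": record_id,
        "raw_values": raw_values,
        "label": label,   # e.g., "data_entry_error" vs. "true_outlier"
        "notes": notes,   # the business context worth preserving
        "reviewed_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(path, "a") as f:
        f.write(json.dumps(finding) + "\n")

# Capture what the manual review taught you while it's fresh:
log_anomaly_finding(
    record_id="txn-4481",
    raw_values={"amount": -950.0, "region": "midwest"},
    label="refund_batch_glitch",
    notes="Negative amounts traced to a vendor's nightly refund batch job.",
)
```

A plain JSONL file is enough to start; the point is that the label and the notes, not just the fix, survive the review.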
Manual engagement doesn’t just solve today’s problems; it is the catalyst for smarter automation. Let’s walk through a scenario: you’re manually exploring why customer churn increased in a specific region, and you discover that the spike coincides with a local competitor’s new promotion. No automated system would have surfaced that context, let alone linked it to the spike; this is human discovery in action.
Keeping your insight only in the manual realm would attenuate its impact. So, just like the intrepid code breakers of WWII, you feed your insight into the machine. The understanding becomes a new feature in your team’s churn prediction model, a monitoring rule in your business unit’s BI system, and documented institutional knowledge that guides your company’s future AI implementations. What you discovered by hand, the computer can now reproduce at scale. In other words, humans produce the discoveries, while computers reproduce the rules.
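As a rough sketch of what “feeding the machine” can look like (the tables, columns, and date handling below are made up for illustration, and pandas is assumed), the discovery becomes an explicit input the churn model can consume:

```python
import pandas as pd

# Hypothetical tables: customer snapshots, plus a hand-maintained record
# of the competitor promotions surfaced during manual reviews.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "region": ["north", "south", "north"],
    "snapshot_month": ["2024-02", "2024-02", "2024-05"],
})
competitor_promos = pd.DataFrame({
    "region": ["north"],
    "promo_start": ["2024-02"],
    "promo_end": ["2024-04"],
})

# Encode the human discovery as a model feature: was a competitor
# promotion running in this customer's region that month?
merged = customers.merge(competitor_promos, on="region", how="left")
start = merged["promo_start"].fillna("9999-99")  # sentinel: no promo on record
end = merged["promo_end"].fillna("0000-00")
merged["competitor_promo_active"] = (
    (merged["snapshot_month"] >= start)          # YYYY-MM strings compare
    & (merged["snapshot_month"] <= end)          # correctly as plain text
).astype(int)

print(merged[["customer_id", "region", "competitor_promo_active"]])
```

The same column can back a monitoring rule: flag any region where churn rises and `competitor_promo_active` offers no explanation.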
This is compound intelligence in action. The principle distills to this: human insights don’t just solve problems; they teach machines to solve similar problems on their own. Your hard-won domain expertise becomes embedded in the AI architecture, making it more robust.
To reiterate, the most successful data teams aren’t those who only automate; they’re the ones who combine key insights with scalable systems, melding human and artificial intelligence to create powerful feedback loops.
How you implement strategic friction in a workflow will depend on your team’s needs and current capabilities, but some general guidelines follow.
When deciding where the friction points should go in a typical data workflow, start with transition points: the moments where data moves between systems, formats, or purposes. Deliberate friction has a higher ROI at these points than at other stages of the pipeline.
For example, when ingesting data from a new source, be hands-on: manually review the first pull to familiarize yourself with the data and to spot patterns your QA rules may have missed. This also gives you the opportunity to hand-check edge cases and outliers before they become obscured by statistical summaries. Similarly, when doing business KPI reviews, move beyond just monitoring the dashboard; regularly dig into the raw data to maintain an intuition for what drives those numbers.
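A first-pull review needn’t involve heavy tooling. As a sketch (the file name and the 20-category threshold are placeholders, and pandas is assumed), a few lines cover the basics before dashboards flatten everything into averages:

```python
import pandas as pd

# First pull from a new source: profile it by hand before automated QA
# takes over.
df = pd.read_csv("first_pull.csv")

print(df.head(20))        # eyeball raw rows, not just summaries
print(df.dtypes)          # did every column land with the expected type?
print(df.isna().mean().sort_values(ascending=False))  # null rate per column

# Drag the extremes out of the statistical fog for hand-checking.
for col in df.select_dtypes("number").columns:
    print(col,
          "high:", df[col].nlargest(5).tolist(),
          "low:", df[col].nsmallest(5).tolist())

# Low-cardinality text columns often hide typos and rogue categories.
for col in df.select_dtypes("object").columns:
    if df[col].nunique() <= 20:
        print(col, df[col].value_counts().to_dict())
```

None of this replaces automated QA; it builds the familiarity that tells you what the QA rules should check in the first place.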
Automate the routine, high-volume 80% of your workflow; reserve manual engagement for the 20% that carries the highest risk or demands the deepest insight. To keep your team from spending excessive time on manual reviews, time-box these activities. Additionally, create accessible, thorough documentation of the areas where your automation needs improvement.
Adding a human element to automation and AI work can have an impact on the entire team, from the domain expert who will provide insights at manual checkpoints to the junior analyst who can use strategic manual work as a learning opportunity.
For individual contributors (i.e., data analysts, data engineers, and data scientists), getting your hands dirty with the data is non-negotiable. Schedule regular manual exploration sessions to stay fluent in both the data itself and the automated processes that transform it.
For managers and team leads, your role is equally important. Create the space and the expectation for your team to engage strategically with data. This may include building manual checkpoints into your sprint planning or protecting time for exploratory analysis with no immediate deliverable attached. You’re not necessarily doing the manual review yourself, but you are ensuring that your team has the capacity and the mandate to do it well.
Outdated thinking casts the data professional as the analyst ex machina: the person who magically saves the day when automated systems fail. The future belongs to a more sophisticated model: data professionals who don’t wait for systems to break before engaging with them. They don’t just anticipate strategic manual engagement; they engineer it into the process itself. A successful analyst partners strategically with AI, honing a sense of when to trust automation and when to dig deeper. They build the judgment to recognize when systems need human insight and how humans and machines can work together to create the future of analytics.
It’s easy to fall into an “all-or-nothing” mindset, but moving beyond it allows you to shift from “automate everything” to “automate intelligently.” You can capitalize on understanding when AI adds the most value and when deliberate friction will serve you best; knowing the difference, and how to harmonize the two, is key. It’s also critical to build automation literacy: a deep understanding of the role your tools play in your organization.
Just as the code breakers at Bletchley Park weren’t competing against their machines but guiding them toward actionable insights, you don’t have to choose between human and artificial intelligence. The future belongs to the data maestros: those who understand that great insights, like great symphonies, require knowing when each instrument should lead and when it should support.
At Concord, we don’t practice less automation; we believe in smarter automation. Is your team ready to take advantage of strategic friction to get more out of your automation and AI investments? Contact Concord today.
Not sure about your next step? We’d love to hear about your business challenges. No pitch. No strings attached.