
Data quality is the real AI battleground

By Colin Tovey, Managing Director, Resilient Management Solutions

Conversations about AI are moving fast across the sector. Boards are asking questions. Pilots are running. Technology budgets are rising.

But the findings from the AI Adoption, Workforce Redesign and Operational Readiness report suggest that many organisations are investing in the wrong end of the problem.

The tools are not the bottleneck. Trustworthy data quality is.

What the numbers show

34.5% of survey respondents cite data quality and system integration as a primary barrier to AI ROI. That puts it third in the list of blockers, behind unclear business cases and limited internal capability. But in practice, it underpins both of those problems. You cannot build a credible business case for AI if your data cannot reliably support the outcomes you are trying to measure. And internal capability matters far less if the inputs it is working with are inconsistent, incomplete or fragmented across disconnected systems.

The report also finds that 46% of respondents identify document processing and data extraction as the single biggest area for AI improvement. That is not a coincidence. It reflects the reality that data in this market is often unstructured, manually handled and spread across originations, onboarding, funding, servicing and collections. AI is being deployed in document-heavy workflows precisely because that is where the data problem is most visible and the operational pain is most acute.

Why asset finance has a harder data problem than most

Asset finance is not a clean-data environment. Fragmented source systems, legacy platforms, manual validation steps and frequent rekeying mean that data quality issues are embedded in the operational fabric of most organisations. Information that should flow automatically is instead transferred manually, introducing inconsistency at each step.

This matters for AI because the quality of any model’s output is directly constrained by the quality of its input. An intelligent document processing tool operating on inconsistently formatted, incomplete or inaccurate source data will produce inconsistently formatted, incomplete or inaccurate outputs. The automation does not fix the underlying problem. It inherits it.

Guillaume Moulinet, Global Platform Director at Volvo Financial Services, makes the point directly in the report:

“Data quality is the critical hurdle that must be cleared to unlock high-value, long-term outcomes.”

The hidden cost of skipping data readiness

Organisations that move to AI deployment without first addressing data quality typically encounter one of two outcomes. Either the model underperforms and confidence in the programme erodes quickly, or human oversight is required at a level that negates the efficiency gains the technology was supposed to deliver.


Robert Taylor, Managing Director at LTi Technology Solutions UK, describes this dynamic:

“The opportunity with AI is clear, but real progress will come from practical delivery, trusted data and responsible adoption. Organisations that align technology ambition with governance, workforce readiness and operational value will be best placed to move forward.”

Human oversight is not inherently a failure mode. But if it exists because the data cannot be trusted, rather than because the process genuinely requires judgement, it signals that the AI programme is running ahead of its foundations.

What data readiness actually means

Data readiness is not a single project. It is a set of decisions about what data exists, where it lives, how reliable it is, and whether it can be connected in a form that supports the use case being targeted.

In practical terms for asset finance, that means understanding which systems hold the data an AI application needs, what the quality and completeness of that data looks like today, and what integration work is required before a model can be trained, validated and deployed with confidence.

It also means being honest about timelines. Data remediation and integration work takes longer than most technology roadmaps assume, and compressing that work to hit a deployment date is one of the most reliable ways to produce a pilot that cannot scale.

The sequence matters

The firms that will see consistent, scalable AI ROI across asset, auto and equipment finance are not necessarily those with the most sophisticated models. They are those that treat data readiness as a strategic priority before committing to deployment timelines.

AI investment without data investment is, at best, a bet on a weak foundation. At worst, it is a way of automating existing data problems at greater speed and cost.

Download the AI Adoption, Workforce Redesign and Operational Readiness report here.
