Skills Debt: The Organisational Debt Nobody's Measuring (Organisation Debt Series IV)
In this series I have argued that organisation debt — the accumulated cost of poor structures, deferred decisions, broken processes, and ungoverned data — is a real liability that belongs on the management agenda alongside financial and technical debt. In Part II, I introduced a practical taxonomy: Decision Debt, Control Debt, Data Debt, Platform Debt, Experience Debt, Operating Rhythm Debt, and Capability Debt.
But there is one form of debt conspicuously absent from that ledger, absent from most board papers, and absent from almost every transformation programme I have seen.
Skills Debt.
Not the vague sense that “we need to upskill” — every organisation says that. I mean the measurable, compounding gap between the skills your workforce needs to deliver value today and the skills it actually has. Left unaddressed, it accumulates interest in exactly the same way as every other form of organisation debt. It slows transformation, stalls AI deployments, increases regulatory exposure, and erodes the capacity of your best people to do their best work.
But HR Already Measures Skills Gaps — Doesn’t It?
This is the first challenge any informed reader will raise, and it deserves a direct answer.
Yes. HR teams run skills gap analyses. Talent management platforms catalogue competencies. Learning and development functions track completion rates and certification levels. This work is real and valuable.
But here is what it almost never does: connect skills readiness to the actual flow of value.
A competency framework tells you whether a person is ‘proficient’ in a skill. It does not tell you whether the people responsible for Stage 4 of your motor claims value stream — Loss Assessment — have what they need to govern the AI model now making reserve recommendations at that stage. A training completion dashboard tells you how many people attended an AI awareness module. It does not tell you whether the underwriters pricing commercial risk can identify when the model’s outputs should be overridden.
Skills Debt, as I am defining it, is the gap measured at the value stream level, in the currency of operational and regulatory impact. That is the measure HR frameworks are not built to provide — and it is exactly the measure organisations need most right now.
HR measures skills in people. Skills Debt measures skills against the flow of value. These are not the same question.
Where Skills Debt Fits in the Taxonomy
In Part II of this series, Capability Debt was defined as under-invested core capabilities — Pricing Strategy, Claims Triage, Partner Enablement. Skills Debt is the human dimension of that entry. Capability Debt describes the organisational gap; Skills Debt describes the workforce gap that underlies it.
The distinction matters practically. You can restructure a capability, rename a team, or redesign a value stream — and the skills debt remains entirely intact. It travels with the people, not the org chart. That is precisely why it is so often invisible and so rarely addressed at the root.
Skills Debt accumulates through the same micro-drift logic I described in Part I of this series. No single decision creates it. It builds through a thousand small omissions: the training programme that never got funded, the role profile not updated after a restructure, the subject-matter expert who left and took institutional knowledge with them, the hire made for today’s need without consideration for tomorrow’s capability requirement.
Why AI Is Structurally Different This Time
Every major technology wave — ERP, the internet, cloud, mobile — created skills debt. A reasonable challenge is: why is AI any different?
The answer is structural, not just a matter of pace or scale.
Previous technology shifts changed what tools people use. AI changes what judgments humans are responsible for.
When you deployed a new underwriting workbench, the underwriter still made the decision. The tool changed; the human’s role as decision-maker did not. When you deploy an AI model that generates reserve recommendations, the underwriter’s role transforms: they are no longer making the primary decision, they are governing one. That requires a fundamentally different set of skills — the ability to interrogate a model’s logic, recognise systematic bias, understand confidence intervals, and know when to override. None of these are ‘digital skills’ in the traditional sense. They are judgment skills of a new kind, and most organisations have not equipped their people with them.
There is a second structural difference. Previous technology skills, once learned, were relatively stable. AI models drift, retrain, and change behaviour over time. The skills required to work alongside an AI system are not a one-time acquisition — they require ongoing recalibration. That is a fundamentally different kind of debt, because the principal keeps growing even as you try to pay it down.
The FCA’s Consumer Duty makes this concrete. Demonstrating good outcomes for customers requires human judgment at critical points in the value chain. If the people responsible for that judgment do not understand the AI systems shaping those outcomes, the firm has a skills debt that is simultaneously a regulatory liability.
Measuring Skills Debt: A Worked Example
In Part II I proposed the Organisation Debt Service Ratio — the percentage of change and run capacity consumed by workarounds, rework, and failure demand — as a way to make organisation debt visible and financial. Skills Debt deserves an equivalent measure.
I propose a Skills Debt Exposure Score, assessed at the value stream level across four dimensions: Coverage (whether an AI-era skills assessment has been done for the stage), Gap Severity (how far current skills fall short of what the stage now requires), Velocity (how quickly AI-driven change is arriving at the stage), and Concentration Risk (how few people hold the critical skills). Each is scored 1–3, where 1 = low exposure and 3 = high exposure.
Add the four scores. A total of 4–6 is manageable; 7–9 is elevated and warrants a pay-down plan; 10–12 is critical and should be treated as a default risk on the Organisation Debt ledger.
Applied to a motor claims value stream, the Loss Assessment stage might score: Coverage 3 (no AI-era assessment done), Gap Severity 3 (adjusters lack model interrogation skills), Velocity 3 (AI reserve recommendation tool live in Q2), Concentration Risk 2 (four experienced adjusters, one contractor). Total: 11. That is a critical exposure — and a number a CFO can act on.
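The scoring logic is simple enough to express in a few lines. The sketch below follows the dimensions, thresholds, and worked example from the article; the class name and field names are illustrative, not a prescribed tool:

```python
from dataclasses import dataclass

@dataclass
class SkillsDebtExposure:
    """Skills Debt Exposure Score for one value stream stage.

    Each dimension is scored 1-3, where 1 = low exposure and 3 = high exposure.
    """
    coverage: int            # Has an AI-era skills assessment been done?
    gap_severity: int        # How far short of required skills is the team?
    velocity: int            # How fast is AI-driven change arriving?
    concentration_risk: int  # How few people hold the critical skills?

    def total(self) -> int:
        return (self.coverage + self.gap_severity
                + self.velocity + self.concentration_risk)

    def band(self) -> str:
        t = self.total()
        if t <= 6:
            return "manageable"
        if t <= 9:
            return "elevated: warrants a pay-down plan"
        return "critical: treat as default risk on the Organisation Debt ledger"

# The Loss Assessment stage from the worked example:
loss_assessment = SkillsDebtExposure(
    coverage=3, gap_severity=3, velocity=3, concentration_risk=2
)
print(loss_assessment.total())  # 11
print(loss_assessment.band())   # critical: treat as default risk on the Organisation Debt ledger
```

Run across the stages of a value stream, this yields a ranked list of exposures rather than a single company-wide average, which is what makes the number actionable.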
That specificity is what distinguishes Skills Debt from a generic skills gap analysis. It is anchored to a stage, a capability, a regulatory risk, and a timeline.
How to Pay It Down: Invest, Hire, or Augment
Skills Debt cannot be paid down through a single training programme. As with all organisation debt, the first question is not ‘how do we close the gap?’ but ‘what is the right instrument?’
There are three options, and the right choice depends on the exposure score and the velocity of change.
- Invest (build). For gaps where the required skills are learnable by your existing workforce and the velocity of change is moderate, structured investment in your people is the right answer. But be precise: not AI awareness e-learning, but targeted programmes that build the specific judgment skills needed at each value stream stage. Underwriters learning to interrogate reserve models. Claims managers learning to run override reviews. Compliance officers learning to audit algorithmic decisions.
- Hire (buy). Where the gap is critical, the velocity is high, and the time available is short, hiring is the faster path. The risk is that externally hired skills do not integrate into the operating model without deliberate onboarding — and that you are competing for the same people every other firm is trying to hire. Hiring solves the coverage problem; it does not automatically solve the concentration risk.
- Augment (borrow). For high-velocity, high-complexity gaps where neither training nor hiring can move fast enough, AI tools themselves can serve as a bridge — providing guided decision support that reduces the skills threshold required for safe operation while the underlying capability is built. This is not a permanent solution; it is debt refinancing. It buys time. It does not pay down the principal.
Most organisations will need all three in parallel. The Skills Debt Exposure Score tells you which instrument applies at each value stream stage, rather than defaulting to a company-wide training programme that addresses none of them precisely.
This Is Not an HR Problem. It Is Not a BA Problem. It Is Both.
A fair challenge at this point is: why does Business Architecture have a role here at all? HR owns skills. L&D owns learning. Where does BA fit?
The answer is that BA and HR need each other for this problem, and neither can solve it alone.
HR knows what skills people have. BA knows where skills gaps manifest in the flow of value. HR has the competency frameworks and development tooling. BA has the capability model, the value stream map, and the information concept inventory that define what skills are required at each stage and what the operational consequence of a gap looks like.
Without the BA map, HR skills assessments float free of operational reality — they tell you about people without telling you about performance. Without HR’s measurement infrastructure, BA observations about skills gaps remain anecdotal. The Skills Debt Exposure Score is the instrument that connects the two disciplines around a shared, value-stream-anchored measure of readiness.
In organisations where this partnership works, skills investment stops being a cost-centre decision and starts being a capital allocation decision — targeted, measured, and tied to the operating model outcomes that matter.
So, Here’s Something to Leave You With
Throughout this series I have argued that organisation debt is a real liability that deserves to be managed with the same rigour as financial or technical debt. Skills Debt is the entry that has been missing from the ledger — not because it is immeasurable, but because no one has yet connected it to the value stream map where its consequences are felt.
So here is the question I want to leave you with: if you ran the Skills Debt Exposure Score across your top five value streams this month, which stage would score highest — and who in your organisation currently owns that number?
I suspect the honest answer would reveal that nobody does. And that is precisely the problem.