The uncomfortable truth about skills in financial services

The gap no one is really talking about

There is no shortage of investment in skills across financial services right now. Firms are rolling out AI tools across functions, building out learning programs, and putting real effort into capability development. If you looked at it from the outside, you would assume the industry is making steady progress.

The reality is less straightforward.

Performance is not moving in the same direction or at the same speed. Some teams are clearly benefiting from new tools and better access to information, while others are still struggling to translate all of that investment into better decisions or stronger outcomes. The variation is hard to explain, which is usually a sign that something more fundamental is going on.

What is starting to stand out is the disconnect between activity and impact. Financial services has always operated with a high level of measurement discipline. Capital is tracked carefully. Risk is quantified. Efficiency is monitored closely. Yet when it comes to capability, the same clarity is often missing. Training happens, participation is recorded, and progress is assumed, but the connection to actual performance tends to be far less visible.

That gap has been there for a while, though it was easier to ignore when the pace of change was slower. It looks different now. Expectations on finance teams have increased, and so has the complexity of the work itself. Under those conditions, the idea that capability will somehow catch up on its own starts to feel optimistic.

The uncomfortable part is not that organizations are failing to invest. It is that they may be investing without a clear line of sight to what actually drives performance.


The work has not become simpler, it has shifted

There was a broad assumption that AI would simplify work in financial services. In some respects, it has already done that. Routine processes take less time, access to data is quicker, and certain types of analysis no longer require the same level of manual effort.

What has changed, though, is not the volume of work but its shape.

Across finance teams, the emphasis is moving away from producing information and toward interpreting it. That sounds like a small adjustment, but it carries a different set of demands. It requires people to decide what matters, when to trust an output, and when to question it. It also requires them to explain those decisions clearly, often in environments where there is no single obvious answer.

Earlier models of capability placed a lot of weight on technical proficiency. If someone could run the analysis, follow the process, and deliver accurate outputs, they were considered effective in their role. That standard is no longer enough on its own.

Outputs have become easier to generate. Knowing what to do with them has not.

In practice, that means more time is spent navigating ambiguity, reconciling conflicting information, and making decisions that are not fully supported by clean data. It also means that the difference between strong and weak performance is less about whether someone can produce an answer and more about how they interpret and act on it.

This is where the shift becomes uncomfortable. The tools have improved quickly, but the capabilities required to use them well are developing at a different pace, and in some cases, they are not being addressed directly at all.

The uncomfortable truth about skills

The working assumption is that the skills challenge in financial services is mainly technical. More data capability, more digital fluency, and a better understanding of AI tools are all seen as priorities, and for good reason.

That view is only part of the picture.

In many cases, the real gap sits one level above that, in how decisions are made and how information is handled once it is available. It shows up in the ability to question an output, to recognize when something does not fully make sense, and to adjust thinking accordingly instead of following a process through to its expected conclusion.

These are not new skills, but they are becoming more visible because of how the work is evolving. When analysis takes longer, there is more time to think through the result. When answers appear quickly, it becomes easier to accept them at face value, even when they should be challenged.

AI does not remove the need for judgment. In a way, it increases it.

It can generate options, surface patterns, and provide a starting point for decisions, but it does not establish context or take responsibility for the outcome. Those responsibilities still sit with people, which means the quality of their thinking becomes more important, not less.

This is where the gap starts to affect performance. Organizations are investing in tools and building out training, yet the capabilities that determine how those tools are used are not always developed with the same focus. As a result, the gains from technology tend to be inconsistent. Some teams move quickly and improve, while others fall back on familiar patterns, even when those patterns no longer fit the work.

It is not a failure of technology. It is not even a failure of intent.

It is a mismatch between where the investment is going and where performance is actually determined.

This is especially visible in risk and compliance environments, where stronger judgment, ethical awareness, and practical decision-making are central to preventing misconduct. For a related view, read our article on strengthening risk culture to prevent corporate fraud.


The real skills gap is judgment, not just technical ability.

Why the investment is not translating into performance

Part of the problem is how capability is framed and measured.

In many organizations, learning still sits slightly apart from the core of the business. It is tracked through completion rates, attendance, and engagement, all of which are easy to quantify but say very little about how people actually perform in their roles. Meanwhile, performance is judged through a completely different set of metrics tied to output, efficiency, and results.

Those two worlds rarely connect in a meaningful way.

This creates a situation where activity increases, but impact remains unclear. Teams can point to higher participation in training or broader access to learning tools, yet struggle to show how those investments are improving decision-making or accelerating outcomes.

There is also a tendency to treat technology as the primary lever for improvement. When new tools are introduced, the expectation is that performance will follow. That assumption holds in some cases, particularly where processes are well defined, but it breaks down in more complex environments where interpretation and judgment matter.

In those situations, tools amplify whatever capability is already in place. Strong teams get more from them. Weaker teams often continue to underperform, just with better systems at their disposal.

This helps explain why performance gains can feel uneven, even when investment levels are high. The missing link is not access or activity. It is how capability is developed, applied, and measured in the context of real work.

Where the gap is starting to show

The impact of this shift is not abstract. It is already visible in different parts of the industry, though it does not always look the same.

Finance functions are increasingly taking on strategic responsibilities. Teams are expected to move beyond reporting and provide insight that supports decision-making. In practice, that requires a different skill set. It is less about producing numbers and more about interpreting them, challenging assumptions, and communicating a clear point of view. Not everyone is comfortable making that shift.

In client-facing roles, the balance is also changing. Clients have access to more information than ever before, and in many cases, they can generate their own analysis. What they look for instead is confidence in how that information is interpreted. The ability to explain a decision, particularly in uncertain conditions, becomes more important than the data itself.

Across decision-making environments more broadly, there is a growing expectation that teams can operate with less certainty. Clean data sets and clear signals are not always available. In those moments, the quality of judgment becomes the differentiator, and it is often here that gaps in capability become most obvious.

None of this is new in isolation. What is different is the consistency with which these patterns are starting to appear.

The same capability gap becomes clear when financial institutions face complex risk decisions. History shows that weak governance, poor transparency, and missed warning signs can turn small misjudgments into wider failures. Read more on key lessons for risk leaders when financial systems fail.


Capability gaps appear most clearly in complex decision-making moments.

What this means for financial institutions

The implications are easy to underestimate because the problem does not present itself in a single obvious way. It shows up gradually through variation in performance, slower decision cycles, or a lack of confidence in outputs that should, in theory, be more reliable.

Over time, those effects compound.

Teams that can interpret information effectively move faster and make better decisions. Those that cannot tend to rely more heavily on process or default thinking, which limits the value they get from the tools available to them.

This has a direct impact on the return organizations see from their investment in technology. AI and advanced analytics promise efficiency and better decision-making, but those benefits depend on how they are used. Without the capability to support them, the impact remains uneven and in some cases marginal.

It also starts to shape competitive differentiation in a way that is not always obvious. Access to tools is becoming more consistent across the industry. What varies is how effectively those tools are applied in practice.

That is where performance begins to diverge.

Recent pressure in private credit shows why access to data is not enough. Teams also need the ability to interpret risk, understand market complexity, and assess how interconnected exposures may evolve. Read more on what private credit stress reveals about market risk and investor confidence.


Better tools only create value when teams know how to use them well.

A shift that is still playing out

The conversation around skills in financial services often focuses on what needs to be built next. More digital capability, more training, more investment. Those things still matter, but they do not address the full picture.

What is emerging is not just a need for new skills, but a shift in how capability itself is understood.

Performance is not determined by what people know or what tools they have access to. It is shaped by how they think, how they interpret information, and how they act in situations that are not fully defined.

That shift is already underway.

What is less clear is how quickly organizations will adjust their approach to match it.

Frequently asked questions

What is the skills gap in financial services?

The skills gap in financial services is not only about technical ability. The article argues that the bigger issue is the disconnect between learning activity and performance impact. Firms are investing in AI tools, training, and capability programs, but teams still vary in how well they interpret information, challenge outputs, and make stronger decisions in real work.

Why are AI tools not enough to improve performance?

AI tools can make routine processes faster, improve access to data, and generate outputs more quickly. However, they do not remove the need for human judgment. People still need to decide what matters, when to trust an output, when to question it, and how to apply the result in situations where the answer is not fully clear.

How has work in financial services changed?

Work in financial services has shifted from mainly producing information to interpreting it. Outputs are easier to generate, but knowing what to do with them remains difficult. Teams now spend more time navigating ambiguity, reconciling conflicting information, and explaining decisions clearly in environments where clean data or obvious answers are not always available.

Why does judgment matter more in financial services now?

Judgment matters more because AI and analytics can surface patterns, options, and starting points, but they do not establish full context or take responsibility for outcomes. Financial services professionals still need to assess whether information makes sense, challenge assumptions, and decide how to act. As tools improve, the quality of human thinking becomes more important, not less.

Why is capability investment not always translating into impact?

Capability investment does not always translate into impact because learning is often measured separately from business performance. Completion rates, attendance, and engagement are easy to track, but they do not show whether people are making better decisions or improving outcomes. This creates a gap where activity increases, but the link to real performance remains unclear.

What does this skills shift mean for financial institutions?

For financial institutions, the skills shift means performance will depend less on access to tools alone and more on how effectively people apply them. Teams that can interpret information well are more likely to move faster and make better decisions. Those that rely only on process or default thinking may see weaker returns from technology investment.

Learn more about building practical capability across financial services with Intuition Know-How, covering banking, markets, risk, regulation, and AI, along with the knowledge teams need to interpret information and make stronger decisions.


