AI is no longer new to most organisations. For years, it has played a role in areas like data analysis, predictive modelling, and process optimisation. What is new is the scale and reach of AI today. No longer confined to specialist teams or isolated use cases, it’s increasingly touching almost every function, from core business operations and strategy through to manufacturing, supply chain, creative and commercial.
Despite this, many organisations are struggling to realise the transformative value AI promises. Even when engagement and investment are high, we often see AI absorbed into existing ways of working instead of being allowed to change them. This is a fundamental leadership and cultural challenge.
One of the most important shifts for leaders today is how to enable experimentation and learning without undermining the rigour the organisation depends on. This is where the idea of “care and dare” leadership becomes essential.
Why AI experimentation is often constrained
AI thrives on iteration, learning, and trial-and-error. Value often emerges from a rapid cycle of testing, failing, adjusting, and trying again rather than from perfect plans. This creates a natural tension with long-established ways of working, which in many sectors traditionally operate more slowly and cautiously.
When this friction isn’t addressed, AI initiatives tend to stall. Experimentation and learning are generally localised, which means promising ideas struggle to scale beyond pilots. The challenge is to maintain high standards while distinguishing where precision is essential from where learning must be accelerated.
Introducing “care and dare” leadership
In our work with leaders, we increasingly see the need for a clear distinction between two modes of operating:
Care: In core business areas (for example, manufacturing or supply chain) operational excellence remains critical. These are environments where mistakes carry real cost and consequence, and where consistency and control must remain high.
Dare: In AI-enabled innovation, leaders must create space for experimentation. This includes testing new ways of working, learning what does not work, and iterating quickly. Here, progress depends on exploration rather than perfection.
The mistake many organisations make is to apply the same mindset to both. Care without dare leads to paralysis. Dare without care creates unacceptable risk. The leadership challenge is to hold both at once.
Giving people the tools and permission to play
One powerful approach is to encourage everyday AI use through playful exploration, open to everyone. Once people appreciate what is possible, a little training and a safe environment enable them to experiment in their daily work.
Instead of tightly prescribing use cases, leaders provide secure and approved tools, clear guidance on what to avoid, and explicit permission to experiment. Then they step back. This approach recognises that theorising alone can’t reveal what works: learning emerges through use, through friction and failure. It’s only by failing that teams can see what works and what doesn’t.
Guardrails matter more than ever
However, empowerment doesn’t mean the absence of structure. As AI becomes more embedded in daily work, leaders need to be explicit about where experimentation is encouraged and where constraints apply. This includes clear expectations around data use, privacy, and compliance, and strong governance that supports learning rather than suppressing it.
When these boundaries are communicated clearly, people stop second-guessing what’s allowed, which in turn builds psychological safety. Teams are able to experiment responsibly, rather than cautiously or covertly. In other words, guardrails enable learning.
From supervision to enablement
This shift also requires a change in leadership behaviour. Rather than leaders who monitor every task, AI-enabled teams need leaders who coach, who frame experimentation as learning, and who share their own uncertainties and lessons learned.
When leaders openly engage with AI themselves and share what they are learning, they signal that experimentation is expected. This builds the psychological safety that allows teams to move faster and learn together.
Why the shift matters now
AI is rapidly reshaping how work gets done, and people who learn to work effectively with AI are becoming dramatically more productive. Teams that experiment responsibly move faster than those waiting for certainty. AI may not replace people, but people who use AI will very quickly replace those who don’t.
The organisations that succeed will be those whose leaders can hold the tension between care and dare. Protecting what has to be protected, while deliberately creating space for learning, experimentation, and growth.
If you want to learn more about getting the balance between care and dare right, get in touch today.