IMHO: If AI becomes everything, what does that mean for governance?


Boards must act now to govern AI as a strategic force shaping risk, value and how organisations adapt.

By Mark Cross, CFInstD | 21 Aug 2025

Ben Evans, a US technologist, provocatively stated, “AI is Something – but it might be Everything”. This statement, which I shared at this month’s Crown Board Chairs Forum, encapsulates the ambiguity and the core of the boardroom conversation we need to have.

Artificial intelligence is no longer a future state or side project – it’s rapidly becoming the core framework through which organisations operate, compete and deliver services. 

As board chairs, we don’t need another explainer on what AI is. The challenge has shifted. It’s time for board chairs to take an active role in shaping their organisation’s AI strategy and oversight.

I’ve heard AI described as a freight train, and I think that’s right. It’s coming fast at every organisation – and it’s not slowing down while boards get comfortable. Standing still is no longer neutral. It’s a decision – one with consequences.

I framed it this way in my remarks: What if AI isn’t just another tool in the box? What if it becomes the box? The table the box sits on? And the room the table is in? 

That’s the scale of the shift I think we’re facing. AI may follow the same arc as electricity or the internet. It starts as something we switch on and ends up as something we operate inside of. We’ve seen this before. From going online to always online. From a channel to an environment.

This shift brings a critical governance challenge. As board chairs, we are not being asked to write code, design models or master every new tool. But we do set the tone, the pace and the limits. We influence what gets prioritised, what gets resourced and what risks we’re willing to accept. 

Governance, in this space, is about enabling learning while ensuring alignment. It’s about getting out of the way of good decisions – and in the way of bad ones.

AI is already altering how services are delivered, how customers interact with systems and how decisions are made. Even if your organisation isn’t in a competitive market, you’re still competing – for people, for credibility and for confidence in delivery. What we govern now is not just performance, but the conditions for adaptation.

AI doesn’t arrive in neat stages. It doesn’t wait for policies to be written. It emerges across workflows, back offices and user experiences – often before boards even know it’s happening. That’s why mindset matters more than maturity.

We need to stop thinking of AI as a destination – and start treating it as a condition. That requires a change in how we govern, not just how we plan.

In my own governance roles, I’ve seen this take shape in different ways. Across the companies I’m a director of, the most striking common threads are that leadership has treated AI as a core capability rather than a side project, linked AI to purpose and strategy, and identified clear value use cases before applying AI as the solution. 

None of these companies is rushing headlong into AI without first establishing guardrails that are clearly understood by their people and link back to company fundamentals.

What has also helped these companies are governance and cultural choices as much as technology choices: a willingness to experiment, clarity about where AI could deliver measurable impact and an operating model that encouraged adoption at scale.

The contrast between retrofitting AI into established systems and building natively around it underscores a key insight for boards – that the pace of change and the organisational mindset around AI matter as much as the technology itself. In each case, board-level engagement was early, intentional and repeatable.

It’s worth boards asking themselves and their executive teams how ready they are – right now – to take advantage of AI in a way that is strategic, safe and effective. An AI-ready company is one that has proactively built the foundational capabilities and strategic alignment to integrate AI into the business. Being AI-ready means having the following key attributes:

1. A clear AI strategy

2. Capability including people, tools and data

3. The right culture and mindset – an openness to AI, digital confidence with innovation and the right risk tolerance

4. Oversight at board and executive level with the right policies and processes

5. An understanding of AI trust and ethics

6. Integration with operations, with established workflow processes and monitoring systems

To get more AI-ready, where can boards start? At the Forum, I suggested five practical moves that any chair can initiate now – simple, strategic actions that shift boards from awareness to action:

    • Lift AI literacy at the board level through deliberate development and expert input
    • Align with your chief executive to ensure AI strategy links to business value and public purpose
    • Start small with pilots that are low risk/high return/high visibility for experimentation and learning
    • Empower AI champions – including emerging leaders – to build momentum inside the organisation
    • Set guardrails early, with clear expectations on risk, ethics and accountability

None of these require perfection. What they require is permission for the organisation to test, learn and evolve, while staying grounded in its purpose.

To help embed that mindset, I offered a simple four-part imperative to chairs to drive AI governance – something to guide boardroom conversation without adding complexity. I call it:

EAGR – Educate. Accelerate. Govern. Repeat.

    • Educate the board and executive team on AI’s strategic role and evolving risks
    • Accelerate action by starting small, learning fast and avoiding paralysis by analysis and fear of risk
    • Govern by embedding AI into existing structures with appropriate oversight and risk clarity
    • Repeat – because AI is not a one-off initiative. It’s a capability that must be built, tested, improved and embedded over time

EAGR is not a checklist for technologists. It’s a rhythm for directors. Governance isn’t about having all the answers – it’s about asking better questions, earlier and more often.

We’ve seen this pattern before as directors. Digital transformation. Cybersecurity. Climate risk. Each began as a technical issue and quickly became strategic. AI follows the same path, only faster, and with broader implications for how we deliver on purpose.

As chairs, we influence what gets prioritised, how fast we move and what values guide us. That’s why AI – if it becomes everything – must be governed not just as a system, but as a shift in how we lead.

And that’s the question I left the room with: If AI becomes everything – will we lead, or watch?


Mark Cross CFInstD is the chair of Chorus, a director of Fisher & Paykel Healthcare and Xero (and audit and risk committee chair of both), a board member of the Accident Compensation Corporation (ACC) New Zealand and chair of the ACC Investment Committee. His previous governance roles include chair of Milford Asset Management, and director of Z Energy, Genesis Energy and Argosy Property. 

He will join Matt Prichard CMInstD and Souella Cumming ONZM MInstD at the 2025 Leadership Conference, in a session titled “Rethinking risk – adapting governance for a more complex world”.