Rei Ishikawa MInstD urges boards to focus on real problems, not just technology, and ensure tech choices reflect organisational values.
AI in healthcare shouldn’t start with the technology – it should start with the problem. And that distinction is where many boards go wrong, says Rei Ishikawa MInstD.
As Chief Executive of Karo Data Management, a Māori-owned software innovation company serving Aotearoa New Zealand’s health and social sectors, Ishikawa sits at the uncomfortable intersection of hype and reality. His message to directors is consistent across sectors and systems: stop asking what AI can do for you and start asking what problems actually need solving.
“A lot of people ask, ‘How do we use AI?’ The question we need to be asking is: What problems can be solved, and could AI be the right solution?” he says.
It is a practical perspective shaped by an unusually broad career. An environmental scientist by training – he holds an MSc from the University of Canterbury – Ishikawa describes himself as a generalist with deep delivery experience.
His work spans waste-to-energy infrastructure projects across South America and Southeast Asia, defence and government science programmes, large-scale retail transformation and digital health implementation across New Zealand, Australia and the UK. Across those environments, he has seen technology succeed or fail for the same reason: culture.
In healthcare, AI’s most visible impact to date is efficiency. A GP might spend 15 minutes with a patient, but far longer afterwards writing notes, generating prescriptions and referrals, and completing administrative tasks. AI transcription tools that draft consultation notes for review are already reducing that burden.
The benefits are not abstract. They include higher-quality consultations, increased capacity to serve larger populations and reduced burnout for clinicians working well beyond standard hours. From a governance perspective, that efficiency directly supports growth through improved outcomes, workforce retention and system resilience.
“The key focus should be around what the strategies for growth actually are and then where AI as a solution fits,” Ishikawa says. “It’s about balancing the opportunity with the risks.”
Those risks include data sovereignty, algorithmic bias, trust, privacy and cultural context – all areas that a purely technology-led approach tends to overlook.
The equity implications of AI are particularly acute in health and social services. AI systems learn from data, and where that data reflects historic bias or exclusion, those patterns are reinforced rather than corrected.
That risk is amplified when communities do not trust how their data will be used. If whānau are not brought on the journey, they will not participate – and without participation, the data gaps widen further.
“AI learns from data and if the data has biases in it, it will naturally develop biases,” Ishikawa says.
At Karo, those risks are addressed through te ao Māori principles. Data is treated as taonga – a treasure – not a commodity. Karo does not own the data it works with; it belongs to whānau and providers. The organisation’s role is stewardship, guided by kaitiakitanga.
“We act as guardians of the data,” he says. “We don’t have governing authority over it. Whānau determine how their data is used, alongside their provider.”
That approach extends beyond Māori contexts. For Ishikawa, stewardship of data is no different from stewardship of the environment. Both require care, restraint and accountability. It also means acknowledging AI’s own environmental footprint, given the significant energy demands of large-scale computing.
Directors do not need to be data scientists, but they do need sufficient AI literacy to understand what these tools can and cannot do, and what they should and should not do. Curiosity matters, but so does ethical positioning.
That positioning shapes organisational culture. Using AI to remove administrative burden and enable people to work at the top of their scope reflects one set of values. Using it to justify sweeping job cuts reflects another.
“The board sets the tone of the organisational culture,” says Ishikawa. “That determines how your organisation operates and uses tools of trade such as AI. We shouldn’t be having the tools determine how we work.”
He is cautious about boards approving AI investments without a clear strategic frame. Chasing ‘AI-enabled’ solutions for their own sake can quickly turn into technology-led decision-making rather than outcome-led governance.
Rather than trying to pick winning platforms, boards should set clear strategic objectives and invest in stages, assessing how technology fits within wider workflows and systems.
“If you don’t do that,” he says, “the focus becomes a technology conversation rather than an ecosystem conversation about how AI fits into your organisation.”
For Ishikawa, one question sits above all others: What decisions and tasks are we comfortable delegating to machines and what should remain human?
That question goes to the heart of strategy, investment and risk. It also forces boards to confront organisational values.
“In healthcare, technology can do wonderful things,” he says. “But when people are unwell, they generally want to engage with other humans who empathise like humans. Health is a human game.”
Some industries can be highly automated. Others require much greater care. The point is conscious, values-driven choice.
Change itself can be unsettling. AI is already displacing some roles, while creating new ones for people who know how to work with these tools. Ishikawa often reminds boards that organisations move only as fast as their slowest members – and that governance has a role in bringing people along, not leaving them behind.
Ultimately, digital transformation succeeds or fails on culture, not code. “The culture of the organisation sets the tone,” he says. “The success of change hangs on that culture.”
Rei Ishikawa MInstD
Rei Ishikawa MInstD is Chief Executive of Karo Data Management and an independent director with Health Central Ltd. He also serves on the board of Mainland Angel Investors and on the Institute of Directors’ Otago-Southland branch committee.