OPINION

Embedding Māori perspectives in AI is not optional – it’s essential to trusted, ethical governance in Aotearoa.
As artificial intelligence (AI) increasingly shapes the way we live, work and make decisions, it’s tempting to view it as a purely technical endeavour – something for engineers, data scientists and compliance teams. But in Aotearoa New Zealand, governance of AI must be shaped not only by global standards, but also by our bicultural foundations and the multicultural communities we serve.
Grant Broadbent, Managing Director of Stellar Consulting Group and a leader in Aotearoa’s data industry, says “AI isn’t just about algorithms – it’s about people. And in Aotearoa New Zealand that means it’s multicultural, with respect for our bicultural foundation.”
Many AI governance frameworks rightly focus on fairness, performance and risk mitigation. Yet, as Broadbent puts it, “too many frameworks start and stop at bias and explainability. But what about mana? What about our whakapapa? When we train a model on data, we’re training it on stories, often without consent, context or connection. There’s a disconnect.”
That disconnect, he says, is where real governance risk lies. Data is not neutral – it carries histories, relationships and meaning. When those are stripped away in the pursuit of technical precision, organisations risk building AI systems that are functionally efficient but socially and culturally misaligned. In a country where data often relates to people, whenua and histories of deep cultural significance, this risk is material.
Effective governance means asking not only “is this safe and legal?” but also “whose data is this?”, “who benefits from the insights?”, and “have we honoured tikanga and Te Tiriti o Waitangi in how we use data, automation and AI?”
The principle of Māori Data Sovereignty – the right of Māori to govern data about or affecting Māori, in accordance with tikanga – is gaining traction across sectors such as health, education and research. It stems from tino rangatiratanga (self-determination) under Te Tiriti and asserts that data is a taonga (treasure), tied not just to information but to whakapapa, mana and wairua (spiritual essence).
Broadbent warns that many organisations still treat Māori perspectives as a compliance exercise. “It’s not enough to simply engage with iwi after the design is done,” he says. “Cultural competency means co-design, co-governance and recognising that mātauranga Māori (our ways of knowing and being) is a taonga in itself. It’s not just an add-on; it’s a complete knowledge system with its own integrity that deserves equal footing.”
This perspective challenges boards to move from consultation to true partnership. It also requires organisations to rethink how they assess impact. Governance conversations should consider whānau, hapū and iwi – not just users, customers or system outputs. Broadbent says this requires reimagining data stewardship models to reflect collective rights and obligations, and ensuring boards and executives lead with cultural intelligence.
From our work with clients at Stellar, we see growing interest in responsible AI governance. However, cultural risk is still often overlooked. If you don’t understand the cultural impact of AI, you can’t govern it properly. Boards may comply with regulations, yet still lose the trust of those most affected by the system.
Broadbent points out that “cultural competency isn’t a one-off training module. It’s a mindset. It’s being willing to learn, to ask questions and to walk alongside others to build trust.”
Board leadership is vital. Directors must ensure AI projects honour Te Tiriti principles – partnership, protection and participation – at every stage of the AI lifecycle. They must seek diverse voices early in the process and embed kaupapa Māori values such as kaitiakitanga (guardianship) and manaakitanga (care and respect) in governance frameworks. They also need the capability to ask the right questions and assess risk from both a technical and a cultural perspective.
Recent national guidance supports this direction. The Government’s AI Strategy and Responsible AI Guidance for Businesses, aligned with the OECD AI Principles, encourage organisations to centre purpose, values and public benefit in AI adoption. A separate Public Service AI Framework also takes a risk-based approach.
Yet some Māori commentators remain concerned about tokenism and lack of true engagement. The Algorithm Charter, while well-intentioned, has been criticised for insufficient involvement of Māori stakeholders.
Dr Karaitiana Taiuru has argued that regulating AI too early, without embedding Māori perspectives, risks unintended consequences. Instead, frameworks must honour Te Tiriti and support the development of Sovereign AI – models governed in alignment with national values, tikanga and local interests, rather than global defaults.
This localised, values-driven approach offers an opportunity to build trust and resilience – provided boards take their role seriously. Culturally competent governance is not only an ethical obligation, it’s a strategic advantage.
AI will shape decisions across healthcare, justice, education, transport and business. Broadbent notes that while many leaders ask “what can AI do?”, in te ao Māori, knowledge is never separate from relationships. “The question should also be: who does this AI serve, and who does it leave behind?”
Answering that question responsibly requires more than technical excellence. It demands cultural intelligence – the ability to lead respectfully and effectively across cultures. As we build the AI governance frameworks of the future, we must ensure they reflect not just global best practice, but also the local values, relationships, and responsibilities that define us as a nation.
Briar Christensen is Head of Advisory at Stellar Consulting Group, where she leads data and AI strategy and governance engagements for clients across the public and private sectors. Stellar is New Zealand’s largest independent data and analytics consultancy and, with 50% Māori ownership, is proud to be recognised as Pakihi Māori.
Contribute your perspectives and expertise on an area of governance to the IoD membership and governance community. Contact us at mail@iod.org.nz