
Governing digital trust in the public sector

As AI and digital systems embed in public services, boards face a critical task: sustaining trust, legitimacy and accountability.

Judene Edgar, Principal Governance Advisor, IoD
3 Feb 2026

Public sector entities in New Zealand sit at the centre of some of the most trust-dependent relationships in society. Universities hold student records and research data. Councils manage infrastructure systems and community services. Schools collect sensitive information about children and whānau. Health and social agencies operate through digital channels that require people to share identity, information and confidence.

As artificial intelligence accelerates, cyber threats intensify and privacy expectations sharpen, governance in these institutions is being tested in new ways. This is not simply a technology challenge – it is a governance challenge of legitimacy, stewardship and trust.

Digital transformation is often framed as inevitable, and in many respects it is. AI tools are moving quickly into workplaces, automation is reshaping service delivery and data is becoming the backbone of decision-making. The question for boards and governors is whether their governance frameworks are strong enough to hold public confidence as that transformation unfolds.

The Institute of Directors’ Director Sentiment Survey 2025 reflects how rapidly this shift is landing in boardrooms. AI and digital acceleration have risen sharply as a strategic issue, cited by 38.5% of directors, up from 25.2% in 2024. Directors also report that 60.6% of boards are now working with management on how AI and technology can lift productivity, signalling that adoption is moving into the mainstream.

But the same survey points to uneven readiness. Fewer than half of directors believe their boards currently have the right skills to manage increasing complexity, even as expectations rise. For public sector entities, that capability gap matters because AI systems are increasingly being deployed in environments where decisions affect access, fairness, safety and rights, not just efficiency.

This is why AI features among the IoD’s top governance issues for directors in 2026. The challenge is evolving beyond AI tools towards AI agents of change – systems that increasingly plan, analyse and act across organisational functions. AI is no longer only supporting decisions; it is beginning to shape them. For directors, the issue becomes assurance: knowing where automation sits, what it decides, who is accountable and whether outputs can be trusted.

Trust is also becoming more fragile. Public sector institutions have traditionally benefited from a baseline of confidence, but that trust is under pressure and increasingly conditional. The 2026 Edelman Trust Barometer shows government remains the least trusted institution, sitting below business and NGOs on measures of competence and ethics. Trust is increasingly extended where institutions feel local, transparent and fair.

For public sector entities, legitimacy is no longer secured by mandate alone. It is judged by behaviour, accountability and competence, especially when technology is involved. In a digital environment, trust can erode quickly through breaches or perceived misuse of data and is far harder to rebuild than to maintain.

The governance stakes become clearest when systems fail.

New Zealand has seen how quickly confidence can unravel through digital breach and poor assurance. The Manage My Health incident, involving exposure of highly sensitive health information, triggered ministerial review and ongoing scrutiny of accountability, vendor oversight and privacy protections.

Similarly, in 2023, hackers threatened to dump data stolen from Auckland University of Technology, highlighting the growing reality of ransomware and extortion threats against public-facing institutions. When cyber incidents become public leverage, the issue is no longer only technical recovery; it becomes one of reputational trust, stakeholder reassurance and governance accountability under pressure.

These events reinforce a critical governance truth: cybersecurity and privacy failures are not simply IT failures; they are governance failures. They test whether boards have asked the right questions early, whether assurance is real or assumed, and whether accountability is clear when services are delivered through third parties or digital platforms.

Privacy and cyber resilience can no longer sit at the margins of board agendas. Public sector entities hold vast repositories of sensitive personal information about health, education, identity, family circumstances and community need. The consequences of a breach are not just financial or operational; they are deeply human and they strike directly at institutional legitimacy.

Tony Evans, Partner – Digital at KPMG, argues that governance is the mechanism that grants social licence for AI, embedding accountability, transparency and human oversight as foundations of trusted deployment. In public sector contexts, trust is often the currency that enables participation. People will back innovation when they believe it is being used responsibly, with safeguards that reflect public values.

But as AI becomes embedded in everyday software – from copilots to automated decision pathways – boards must also contend with informal or "shadow" AI adoption. In many organisations, AI is already operating inside workflows, whether formally approved or not. Governance discipline now requires visibility, boundaries and verification, not just enthusiasm for innovation.

The challenge, then, is not whether public sector entities should adopt AI or digital tools. It is whether boards are governing those tools with the discipline required to sustain legitimacy.

Public sector governance carries a distinctive burden: decisions are made under public scrutiny, within constrained resources and in service of communities who cannot always opt out. That makes trust more fragile and governance more consequential.

In the digital era, trust cannot be assumed. It must be designed, governed and continuously earned.

Considerations for directors and governors

Boards across the public sector ecosystem should be asking:

    • Do we treat digital trust – privacy, cyber resilience and transparency – as a core governance asset?
    • What assurance do we have over the data and AI systems being used, including informal or “shadow AI” adoption?
    • Have we mapped where automated or agentic systems are influencing decisions, and are delegations and accountability still clear?
    • Are accountability and escalation clear when services are delivered through third-party vendors or platforms?
    • Are cyber risk and privacy readiness tested through realistic scenarios, not just policy documents?
    • Do we understand stakeholder expectations, especially for communities most sensitive to misuse of data or opaque decision-making?
    • If trust were tested tomorrow through a breach, ransomware event or AI failure, would we respond with clarity, discipline and credibility?