AI-generated board minutes: governance and legal risks for New Zealand directors
Directors who rely on AI outputs without proper oversight risk exposing themselves to personal liability.
Generative AI, adopted rapidly since the release of tools such as ChatGPT, holds considerable promise. For many professionals, automating routine tasks such as minute-taking is an attractive prospect. But these efficiencies also introduce significant governance and legal risks.
Under the Companies Act, directors must act in good faith and in what they believe to be the best interests of their companies, and exercise the care, diligence, and skill that a reasonable director would exercise in the same circumstances. These statutory obligations cannot be displaced by the use of AI tools. Directors who rely on AI outputs without proper oversight risk exposing themselves to personal liability.
Directors must therefore ensure that robust frameworks are in place to evaluate, supervise, and document the use of such technologies.
While directors are not required to personally maintain board minutes, the company must keep minutes of all board and board committee meetings for at least seven years at its registered office. The board must also ensure adequate measures exist to detect and prevent any falsification of those minutes. Failure to meet these obligations is an offence, and every director may be held personally liable.
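As a purely illustrative sketch of what a falsification safeguard might look like in practice, the Python snippet below records a cryptographic fingerprint of each approved minutes file and later checks that the file has not changed. The file names and register format are hypothetical assumptions, not a prescribed approach; real measures would be designed with legal and IT advice.

```python
# Illustrative sketch only: one way a record-keeping process might detect
# alteration of stored minutes. File names, paths, and the register format
# are hypothetical, not a prescribed approach.
import hashlib
import json
from pathlib import Path

REGISTER = Path("minutes_hash_register.json")  # hypothetical integrity register


def fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a minutes file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def record_approved_minutes(path: Path) -> None:
    """Store the digest of a newly approved minutes file in the register."""
    register = json.loads(REGISTER.read_text()) if REGISTER.exists() else {}
    register[path.name] = fingerprint(path)
    REGISTER.write_text(json.dumps(register, indent=2))


def verify_minutes(path: Path) -> bool:
    """Check that a minutes file still matches its digest at approval time."""
    register = json.loads(REGISTER.read_text()) if REGISTER.exists() else {}
    return register.get(path.name) == fingerprint(path)
```

Any alteration to an approved file changes its digest, so a periodic verification pass flags tampering; this complements, rather than replaces, proper access controls over the minute book.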
Directors also have a statutory right to inspect company records in written form, including board minutes, on reasonable notice and without charge. Where AI tools are used to generate these records, the board must ensure that the format and storage method enable directors to exercise their statutory inspection rights.
"If an AI tool stores these records in inappropriate formats, in third-party systems, or under terms that do not facilitate easy retrieval, the company and the board may be in breach of statutory duties."
These obligations have direct implications for generative AI. If an AI tool fabricates, omits, or alters the statements of directors in a transcribed board meeting and the board has failed to implement adequate safeguards, liability will rest with the board and the directors, not the AI provider.
Board minutes are a statutory record under the Companies Act and are generally treated as the authoritative record of board deliberations and decisions. Courts regularly rely on minutes as evidence of whether directors fulfilled their duties of care, diligence and skill. Whether minutes are drafted manually or generated using AI tools, their form does not diminish their legal significance.
Minutes (including board minutes) can be admissible as evidence of what occurred in a meeting because they record statements made by meeting participants. However, when AI tools are used to capture or generate those minutes, questions may arise as to whether those records reflect actual statements or are merely the AI’s reconstruction of them. Without adequate oversight and verification, these records may not be attributed to a person with first-hand knowledge, increasing the risk that they are treated as hearsay.
Boards must be able to demonstrate how minutes were created, reviewed and approved. Establishing a clear process will be critical in legal proceedings.
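As an illustration of what demonstrating that process might involve, the sketch below captures minimal provenance for a set of minutes. All field names are hypothetical assumptions about what such a record could contain, not a prescribed format.

```python
# Illustrative sketch only: minimal provenance a board might keep to show how
# minutes were created, reviewed, and approved. All field names are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class MinutesProvenance:
    meeting_date: date
    drafting_method: str       # e.g. "manual" or "AI transcription, vendor X"
    reviewed_by: list[str]     # who checked the draft against the meeting
    corrections_made: bool     # whether the draft was amended on review
    approved_at: Optional[date] = None  # date of board approval


# Example: AI-drafted minutes, verified and corrected before approval.
record = MinutesProvenance(
    meeting_date=date(2025, 3, 14),
    drafting_method="AI transcription, reviewed against the recording",
    reviewed_by=["chair", "company secretary"],
    corrections_made=True,
    approved_at=date(2025, 4, 11),
)
```

Keeping a record like this alongside each set of minutes gives the board contemporaneous evidence of its review and approval process if the minutes are later challenged.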
Using AI tools to record or generate board minutes raises serious data governance concerns. Many tools transmit data to offshore servers and reserve rights to retain or use it for model training. Even where data is described as “de-identified,” true anonymisation is technically difficult, especially for voice data and transcriptions. Inadvertent exposure of sensitive board content or competitive IP remains a real risk.
Directors should not assume these risks are theoretical, nor rely solely on vendor assurances, which rarely reflect the actual legal permissions buried in their privacy policies and service terms.
Before deploying AI tools in governance settings, directors must understand what data is collected, where it is stored, and whether it is used for the vendor’s own purposes, including model training. They must also assess whether appropriate contractual safeguards and use limitations are in place, and whether it is lawful to transcribe the statements of guests or other third parties who participate in meetings.
The use of AI will inevitably become more common in corporate governance. The adoption of generative AI offers real opportunities for administrative efficiency, but it also risks placing companies, boards, and directors in breach of their statutory duties. Just as critical are the privacy and confidentiality risks that arise when minutes are kept by AI tools.
Boards and directors should resist the temptation to treat AI as a passive tool and should be deliberate and risk-aware in their adoption. Robust governance processes, including internal policies on AI use, clear oversight mechanisms providing continuous supervision and verification, and ongoing legal review, are not just good governance practice; they are also critical for minimising potential legal liability.
Prudent governance will require not only thoughtful adoption of these tools, but demonstrable control over their role in governance processes.
To get yourself up to speed on the ins and outs of AI governance, why not try our AI Governance Essentials Course?