IMHO: Setting an AI risk appetite

Boards can empower AI innovation through setting an appropriate risk appetite.

By Kate Kolich, Digital, Data, Innovation Governance Advisor
30 Apr 2025 | 3 min to read

OPINION: 

At the Institute of Directors’ recent Governing AI Forum, I heard many board members say they are grappling with how to balance the governance of artificial intelligence (AI) with empowering management teams to innovate.

These same directors were also wary of stepping into operational management and aware of the urgency to mobilise this emerging technology to deliver business outcomes safely and ethically.

The good news is that boards don’t need to create a new process. One effective approach is to leverage an existing governance tool: the Risk Appetite Statement (RAS).

A RAS sets the tone for your organisation, making it an excellent tool for the board to convey its legal and ethical expectations for AI use. Rather than viewing the RAS as a constraint on innovation, boards can use it to give clear guidance on the legal, ethical and reputational boundaries within which the organisation should operate when using AI, including defining acceptable levels of risk.

I recommend focusing your efforts in the following three areas.

1. Seek a stocktake of the AI landscape from management

As AI use is still an emerging field, it is timely for the board to engage with management to understand the current use, or lack of use, of AI. If you don’t know the current state of AI use in your organisation, schedule time for your management team to present an overview of their AI initiatives or aspirations, along with their operational data governance controls for AI. This will open up conversations about innovation and how it aligns with the board’s risk appetite.

A stocktake may find that there is little use of AI because management are unclear whether the board is supportive.

Don’t forget to consider a high-level industry comparison so you can benchmark where your organisation sits in relation to your industry or sector. The board can then use the stocktake and benchmark to explore whether you want to be leading-edge or a fast follower with AI use.

Even if your stocktake shows that AI adoption is well advanced, the board should not hesitate to set the tone for acceptable AI use and give the management team confidence in moving forward with AI initiatives. 

If there is a discrepancy between the board’s comfort level and management’s AI activities, it may be time to revisit the RAS. Do this openly, inviting management to the table to discuss how to maintain the organisation’s social licence to operate while embracing AI innovation.

2. Ensure that your RAS caters for acceptable AI use and reflects your organisation’s ethical compass

Update your RAS to ensure it covers areas such as privacy, security, data sovereignty, intellectual property, content attribution, accuracy and climate impact.

Above all, consider your customers’ expectations. Conduct surveys or gather feedback from a representative sample of your customers to gauge their perspectives on acceptable AI use.

If you want to enhance trust, consider publishing your acceptable AI use statement or principles on your website. This will help align AI initiatives with customer expectations, build trust and maintain your social licence.

3. Regularly revisit your RAS with management to mobilise innovation and monitor ROI  

Share the updated RAS with the management team and request regular reports through a relevant committee, such as the risk and audit, technology, or people committee. This ensures the board stays informed about AI activities without getting caught up in operational detail.

To keep the management team engaged, consider asking for positive metrics in the reporting, such as service delivery efficiencies or return on investment for AI initiatives.

Above all, remember that AI technology is moving fast, so this is not a set-and-forget activity. Keep across emerging risks and opportunities, and safely navigate your AI adoption by maintaining alignment with your RAS through regular communication with management.


Kate Kolich MInstD has over 25 years of leadership experience in data, digital, and innovation across private and public sectors. She has won multiple industry awards for her work and was named one of the top 100 innovators in data and analytics by Corinium Global Intelligence in 2024. Kate co-chairs Women in Data Science New Zealand and serves on the advisory board of the Victoria University of Wellington Centre for Data Science and Artificial Intelligence.