Board Governance in the Age of AI: Navigating Risks of Note-Taking Tools
In an excellent article recently published by the US law firm Robins Kaplan LLP, the authors open with the following paragraph:
“Recently, during a virtual meeting of a board-of-directors committee, the chair noticed one of the absent committee members had suddenly appeared in attendance. But after closer review, the chair realized it was not the human board member who had joined, but rather, the member’s “AI assistant” that automatically joins any meeting listed on the missing board member’s calendar, summarizes meeting content, and then circulates notes to all those included on the meeting appointment. This is a scary scenario for many organizations, but especially for those like boards and board committees, whose meetings are the backbone for corporate governance.”
In the article they identify the following key governance risks:
“1. Confidentiality Breaches
AI note-taking tools process data in ways that can expose sensitive boardroom information to unintended parties. Many tools rely on cloud-based platforms for transcription and analysis, creating the risk of unauthorized access or data breaches. If confidential information about mergers, acquisitions, strategic plans, or litigation is compromised, the organization could face legal, reputational, and competitive harm.
2. Loss of Privileged Communications
Certain board discussions are protected by attorney-client privilege. Introducing an AI note-taking tool into these discussions can inadvertently waive privilege if the tool is not adequately secured or its use is not carefully controlled. The loss of privilege can expose sensitive information in future legal proceedings or be used to competitively disadvantage the company.
3. Compromising Executive-Session Privacy
Executive sessions allow boards to deliberate privately without the presence of management or external advisors. These sessions are critical for candid discussions, particularly regarding performance evaluations, succession planning, or sensitive governance matters. Recording or transcribing these discussions with AI tools risks eroding the sanctity of the executive session and may deter directors from speaking openly.
4. Regulatory and Legal Risks
Depending on the jurisdiction and industry, boards may face regulatory obligations to safeguard certain types of information. For example, healthcare and financial services boards may handle data protected under HIPAA or GLB regulations. The use of AI tools that do not comply with these requirements can expose the organization to penalties and litigation.
5. Data Ownership and Vendor Risks
Many AI tools are provided by third-party vendors who may assert ownership or access rights over the data processed by their platforms. Boards must carefully assess whether their chosen tool’s terms of service adequately protect the organization’s intellectual property and sensitive information.”
They go on to set out best practices for the responsible use of AI note-taking tools as follows:
“1. Assess Necessity and Scope
Before adopting an AI note-taking tool, boards should critically assess whether the tool is necessary and appropriate for their context. In some cases, traditional note-taking methods may be more suitable for preserving confidentiality and privilege.
2. Choose the Right Tool
Not all AI tools are created equal. Boards should conduct thorough due diligence to select tools designed with robust security features, such as end-to-end encryption, data localization, and compliance with relevant regulations. Tools that allow local data storage rather than cloud processing may offer enhanced security.
3. Establish Clear Usage Policies
Boards should develop and enforce policies governing the use of AI tools.
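Part of such a usage policy can be enforced mechanically. The sketch below is purely illustrative (the approved-attendee list, email addresses, and bot-name heuristics are assumptions, not from the article): it flags meeting participants who are not on an approved list or whose display name suggests an auto-joining AI notetaker, the very scenario the opening anecdote describes.

```python
# Illustrative sketch: flag meeting participants who are unapproved or
# whose display name looks like an auto-joining AI notetaker bot.
# The approved list and marker strings below are hypothetical examples.
APPROVED_ATTENDEES = {"chair@example.org", "director.a@example.org"}
BOT_NAME_MARKERS = ("notetaker", "ai assistant", "otter", "bot")


def flag_unapproved(participants):
    """Return (name, email, reason) for each suspicious participant.

    `participants` is a list of (display_name, email) tuples as they
    might be reported by a meeting platform's attendee roster.
    """
    flagged = []
    for name, email in participants:
        if email.lower() not in APPROVED_ATTENDEES:
            flagged.append((name, email, "not on approved list"))
        elif any(marker in name.lower() for marker in BOT_NAME_MARKERS):
            flagged.append((name, email, "looks like an AI notetaker"))
    return flagged


meeting = [
    ("Board Chair", "chair@example.org"),
    ("Fred's AI Assistant", "fred@example.org"),  # joined on Fred's behalf
]
```

In practice the chair or committee secretary would run such a check against the live roster at the start of the meeting and remove any flagged attendee before sensitive discussion begins.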
4. Engage Legal Counsel
Legal counsel should be consulted to ensure the use of AI tools does not compromise attorney-client privilege or violate regulatory requirements. Counsel can also advise on contractual terms with vendors to protect the organization’s interests.
5. Provide Training for Directors and Staff
Directors and board staff must understand the risks associated with AI tools and how to use them responsibly. Training should include guidance on identifying sensitive discussions where AI tools should not be used.
6. Monitor and Review
The board should periodically review the use of AI tools to ensure they remain appropriate and secure. This includes monitoring for updates to vendor terms of service, changes in regulatory landscapes, and advancements in AI technology that may introduce new risks or opportunities.”
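The "monitoring for updates to vendor terms of service" step above can be partially automated. A minimal sketch, assuming the board secretariat stores a fingerprint of the vendor's ToS text at contract signing and re-checks it periodically (the sample ToS wording is invented for illustration):

```python
import hashlib


def tos_fingerprint(tos_text: str) -> str:
    """Return a stable SHA-256 fingerprint of a vendor's ToS text."""
    # Normalise whitespace so cosmetic reflows don't trigger false alarms.
    normalised = " ".join(tos_text.split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()


def tos_changed(stored_fingerprint: str, current_tos_text: str) -> bool:
    """True if the current ToS text no longer matches the stored fingerprint."""
    return tos_fingerprint(current_tos_text) != stored_fingerprint


# Baseline captured when the vendor contract was signed (example text).
baseline = tos_fingerprint("Recordings are not used to train our models.")

# Quarterly re-check: any substantive wording change is surfaced for review.
assert not tos_changed(baseline, "Recordings are  not used to train our models.")
assert tos_changed(baseline, "Recordings may be used to improve our models.")
```

A change flagged this way does not say *what* changed, only that legal counsel should re-read the terms, which is exactly the trigger the review cycle needs.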
The article concludes with outstanding advice which applies not only to the above, but to everything in our brave new world. They profoundly state that “Technology must serve governance, not undermine it.”
I hope that the above is helpful, and I am grateful to the firm for its outstanding advice.
Chief Information Officer at Coaxle (Pty) Ltd
1mo: I wonder if anyone checks whether the terms of use of the AI notetaker change over time? Currently (like other software) they may claim not to be using your recordings to train their LLMs or, worse, to train their voice-mimicking software (imagine being able to impersonate the voice of the Chair or the CFO), but does anyone notice when they change the terms to allow it? Consider Google changing from a position that its AI won't be used for military applications to one where it now will. Did they tell anyone? Would your AI notetaker tell you? I was in a meeting the other day where there were 3 people and 4 notetakers...
Chief Executive Officer | CA(SA) GCB.D
1mo: Such a good reminder to be aware of the unintended consequences. The pervasiveness of technology sometimes leads to us being blinded by efficiencies.
Professional Ethics for Professional Accountants; Experienced Non-Executive Director; Corporate Governance Services
1mo: This is so clear in pointing out the problems with unsupervised use of AI. Certainly, at this stage, AI cannot take over the functions to be performed by human beings unless that work is appropriately reviewed and edited.