Authors
Nushrah Amod
A number of artificial intelligence (AI)-powered transcription services have emerged as assistive tools to streamline the process of minute-taking during discussions, meetings and presentations. Unlike traditional transcription and recording tools, AI-notetakers leverage machine learning for natural language processing to automate and enhance the minute-taking process: they can generate summaries of key discussion points, track action items, identify different speakers and format meeting notes in accordance with company precedents. While AI-transcription services can help improve organizational productivity and promote efficiency in meetings and information sharing, the sophisticated nature of these tools also presents novel business and legal risks.
Traditional meeting minutes do not characteristically capture precise dialogue attributed to each person at a meeting. Indeed, the level of detail in management and board meeting minutes is a specialty area requiring deep training and knowledge of organizational needs and industry practices. Robust and detailed, yet unvetted and unfiltered, meeting minutes generated by AI-notetakers may increase the volume of information discoverable by third parties in litigation. Similarly, multiple versions of the minutes, and the conflicts or discrepancies among them, may also be discoverable and inadvertently expose a company to greater litigation risk.
The use of AI-notetakers raises concerns around privilege, particularly where legal advice is intended to be privileged or delivered in camera. If the AI-notetaker is not limited to certain sessions or cannot be disabled for certain portions of a meeting, privileged information may be compromised. Further, where settings permit meeting minutes or summaries to be distributed automatically to all attendees indiscriminately, legal advice intended to be privileged or delivered in camera could be circulated without appropriate vetting, notification or consent.
AI-notetakers have access to proprietary and confidential information disclosed in meetings, and that information may be stored on vendor systems. Information stored on vendor or company systems could be compromised by a data breach or cyber-attack, exposing the company to reputational harm and additional regulatory and litigation risk. Further concerns arise where vendors could extract company data to train AI models, effectively embedding that data in the model itself.
At both the management and board level, the free exchange of ideas and debate are central to effective decision-making and oversight. Employees and directors may be more reluctant to engage in healthy debate if any form of recording is used, and this risk is heightened where AI-notetakers prepare the minutes with less human review and discretion over the final content.
As with the litigation risk described above, corporate decisions and the individuals participating in them may face unwarranted scrutiny if the organization retains detailed records of the disagreements and debates leading up to a final decision, in addition to more summary minutes reflecting the path ultimately chosen.
AI-notetakers can sometimes exhibit bias by giving deference to senior people at a meeting. For example, the software may summarize an item by giving more weight to a comment from a director than to an equally important perspective from an analyst. This bias can manifest in subtle ways: the AI-notetaker may alter the language of a summary to reflect the authority of the senior person while overlooking valuable insights from a more junior individual. Consequently, these tools may inadvertently amplify existing power dynamics within organizations, reinforcing hierarchies to the detriment of culture and collaboration.
Companies should mitigate the risks associated with the use of AI-notetaking tools by taking preventative steps to ensure the responsible use of this generative technology.
The adoption and use of AI-notetaking tools present an opportunity for companies to streamline record-keeping processes. However, these tools also present novel risks that companies must be attuned to when engaging AI-transcription services. If choosing to adopt AI-notetaking tools, companies should take preventative steps to ensure that data is secure and the use of the tools aligns with internal data-related policies and workplace culture, as well as external regulatory requirements.
To discuss these issues, please contact the author(s).
This publication is a general discussion of certain legal and related developments and should not be relied upon as legal advice. If you require legal advice, we would be pleased to discuss the issues in this publication with you, in the context of your particular circumstances.
For permission to republish this or any other publication, contact Janelle Weed.
© 2024 by Torys LLP.
All rights reserved.