The Silent Guest in Your Meetings: Legal Risks of AI “Note-Takers”
As AI-powered “note-taker” tools like Read AI, Fireflies.ai and Otter.ai become fixtures in meetings and virtual calls, many organizations are unknowingly inviting a silent participant into the room: one that could compromise confidentiality, data security and even attorney-client privilege. While these tools can save time and help keep teams on target, their use raises significant practical and legal risks that should be considered before implementation.
Key Practical Risks to Consider
- Inaccuracies and Omissions. AI-generated transcripts often miss nuance or context and struggle to capture sarcasm or tone. Summaries may omit important qualifiers or follow-up items, leaving an incomplete or misleading record. Transcripts and summaries can also misattribute statements to the wrong participant.
- Over-Reliance on AI for the Official Record. Because participants tend to rely on the AI notes, they may be less likely to create their own detailed records. They may also be less likely to participate fully in the discussion, on the theory that they can always go back and review the transcript. Most people also do not review or correct the AI-generated notes, increasing the risk of errors becoming “the record.”
- Vendor Use of Data. Many AI note-taking tools store recordings and transcripts on third-party servers. Depending on the provider’s terms of use, this information may then be used to train vendor AI models, shared with affiliates or otherwise disclosed to third parties, placing potentially sensitive information outside the control of the parties involved.
Key Legal Risks to Consider
- Loss of Attorney-Client Privilege. Calls between attorneys and their clients are, as a general rule, not subject to compelled disclosure to courts or third parties; they are “privileged” communications. This privilege is protected only if the relevant communications remain confidential. Allowing note-taker vendors to access or use transcripts for their own purposes could provide grounds for a waiver of privilege, which is problematic in the event of a legal dispute. As the use of note-takers increases, we can expect to see routine demands for all transcripts and summaries during discovery, increasing the focus on the protections afforded by the attorney-client privilege and the work-product doctrine.
- Data Security Concerns. Cloud-stored recordings are attractive targets for cyberattacks. A breach could expose confidential business strategies or privileged legal discussions. Even where attorney-client privilege is not at issue (or is not actually lost), the mere potential for a third party to see or access confidential information could itself have negative legal consequences, including breach of a lawyer’s ethical duty to prevent inadvertent or unauthorized access to confidential client information.
- Consent and Privacy Compliance. In many jurisdictions, recording a conversation requires consent from all participants. Automatic recording by an AI note-taker may inadvertently violate wiretapping or privacy laws if participants are not properly informed.
- Regulatory, Industry and Corporate Compliance. Certain industries (e.g., healthcare, financial services, education) and types of data (e.g., personal information, health information) are subject to additional regulations governing storage, transmission and disclosure. In addition, many corporations and other entities have internal policies, or are subject to industry standards, covering data retention, use and disposal. Use of AI tools without proper safeguards may inadvertently violate these requirements.
- Cross-Border Data Transfers. Some AI note-taking vendors store or process data outside the United States. International data-transfer rules, such as the GDPR, may restrict or complicate the handling of information shared with these services.
Next Steps
While AI note-takers can save time, they should be used carefully. Before implementing or using these tools, consider the following best practices:
- Understand vendor practices. Review how the provider will use, store and access your data, including any rights to use recordings or transcripts for AI training.
- Confirm confidentiality protections. Ensure the tool’s data security measures align with your organization’s privacy, regulatory and contractual obligations.
- Obtain proper consents. Verify that all meeting participants are aware of and consent to recording, as required by law or policy.
- Review and correct AI summaries. Don’t assume accuracy. Confirm that transcripts and summaries are complete and correctly attributed.
- Avoid use during sensitive discussions. Turn off AI note-takers for confidential meetings, especially those involving attorneys or sensitive subject matters.
- Control automatic activation. Make sure tools don’t join or record meetings by default.
Smith Anderson can help your organization assess these and other risks associated with AI meeting tools and develop policies to mitigate exposure. Please contact Darrell Fruth, Tory Hartmann or your regular Smith Anderson attorney for assistance.