Massachusetts AI Meeting Recording Laws (2026)
Massachusetts treats secret recording as a serious crime. Under Mass. Gen. Laws ch. 272, Section 99, the willful, secret interception of a wire or oral communication is a felony punishable by up to five years in state prison and a $10,000 fine. Unlike most states, Massachusetts offers no misdemeanor alternative for the interception offense itself. That distinction matters enormously for anyone deploying AI meeting recorders like Otter.ai, Fireflies.ai, or Zoom's built-in transcription features in the Commonwealth.
The rapid growth of AI-powered note-taking and transcription tools has collided with this decades-old statute in ways the legislature never anticipated. This article breaks down how Massachusetts' wiretap law applies to AI meeting recording, what penalties are at stake, how the Attorney General's 2024 AI advisory affects compliance, and what employers and individuals need to know as of April 2026.
This article provides general legal information, not legal advice; consult an attorney for guidance specific to your situation.
Massachusetts Consent Framework for Recording
The "Secret Recording" Standard
Massachusetts' wiretap statute takes a different approach from most one-party or all-party consent states. Rather than requiring each participant to give explicit, affirmative consent before recording begins, Section 99 prohibits "secret" recording. The statute defines an "interception" as secretly hearing, secretly recording, or aiding another to secretly hear or secretly record the contents of any wire or oral communication.
This distinction is legally significant. Placing a voice recorder on a conference table in plain view is lawful, because the recording is not secret. Hiding a phone in your pocket to capture the same conversation is a felony. The practical test is whether all parties to the communication are aware that recording is taking place. If even one participant does not know, the recording is secret and therefore illegal.
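The practical test described above can be expressed as a short sketch. This is an illustrative model only, not a legal determination; the `Participant` type and its `aware_of_recording` flag are assumptions introduced for the example.

```python
from dataclasses import dataclass


@dataclass
class Participant:
    name: str
    aware_of_recording: bool  # actual, real-time awareness of the recording


def recording_is_secret(participants: list[Participant]) -> bool:
    """Model of the Section 99 test: a recording is 'secret' if any
    party to the communication is unaware of it (illustrative only)."""
    return any(not p.aware_of_recording for p in participants)


# The recorder in plain view on the conference table: everyone knows.
open_meeting = [Participant("Ana", True), Participant("Ben", True)]

# The phone hidden in a pocket: one party has no idea.
hidden_recorder = [Participant("Ana", True), Participant("Ben", False)]

assert not recording_is_secret(open_meeting)
assert recording_is_secret(hidden_recorder)
```

Note that the test is all-or-nothing: a single unaware participant makes the entire recording secret, regardless of how many others knew.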
No Misdemeanor Option
Most states that criminalize unauthorized recording offer both misdemeanor and felony classifications, depending on factors like the offender's intent or whether the recording was disclosed. Massachusetts does not. Under Section 99(C)(1), any person who willfully commits an interception, attempts to commit an interception, or procures another person to commit an interception faces a felony charge.
The penalty upon conviction: imprisonment in state prison for up to 5 years, or a fine of up to $10,000, or both. Possession of an intercepting device with intent to commit an illegal interception carries a separate penalty of up to 2 years in a house of correction and a $5,000 fine.
Civil Liability
Beyond criminal penalties, Massachusetts allows civil lawsuits for wiretap violations. A person whose communications are illegally intercepted may bring a private right of action for actual damages (subject to a statutory minimum), punitive damages, and attorney's fees. This dual criminal-civil exposure creates substantial risk for companies deploying AI recording tools in Massachusetts.
How Massachusetts Law Applies to AI Meeting Recorders
AI Bots as "Intercepting Devices"
Section 99 defines an "intercepting device" broadly as any device or apparatus capable of transmitting, receiving, amplifying, or recording a wire or oral communication. AI meeting assistants that join video calls to record, transcribe, or summarize conversations fall squarely within this definition.
When an AI bot like Otter.ai's Notetaker joins a Zoom or Microsoft Teams meeting, it functions as an intercepting device. The bot captures audio from all participants, transmits it to the vendor's servers, and processes it through machine learning models to generate transcripts and summaries. Under Massachusetts law, the critical question is whether every meeting participant knows the bot is recording.
The Disclosure Problem
Many AI meeting tools operate by joining calls as a named participant (often labeled "Otter Notetaker" or "Fireflies.ai Notetaker") after the meeting host grants permission. The host may have integrated Otter or Fireflies with their calendar, allowing the bot to join automatically. In some configurations, the bot joins without sending a separate consent request to non-host participants.
This default behavior creates significant legal exposure in Massachusetts. If a meeting includes a participant located in the Commonwealth who is not aware that the AI bot is recording, the recording is "secret" under Section 99. The fact that the bot's name appears in the participant list does not necessarily constitute adequate disclosure, particularly if participants join by phone or do not notice the bot among a large list of attendees.
The Brewer v. Otter.ai Lawsuit
The August 2025 class action Brewer v. Otter.ai illustrates these risks. The plaintiff, who did not have an Otter.ai account, participated in a Zoom meeting where the Otter Notetaker joined automatically because the host had integrated the tool. The complaint alleges that Otter.ai "deceptively and surreptitiously" recorded private conversations without obtaining consent from all participants.
The lawsuit includes claims under the federal Electronic Communications Privacy Act (ECPA), the California Invasion of Privacy Act, and other privacy statutes. While the case was filed in California, the legal theories apply with even greater force in Massachusetts, where the statute imposes felony penalties and has no misdemeanor alternative.
The Ambriz v. Google "Capability Test"
In Ambriz v. Google (N.D. Cal., February 2025), a federal judge denied Google's motion to dismiss wiretapping claims related to its Cloud Contact Center AI product. The court adopted a "capability test," holding that an AI vendor need only possess the technical capability to use intercepted data for its own purposes (such as model training) to qualify as a third-party eavesdropper.
This ruling is particularly relevant for Massachusetts. Under the capability test, an AI meeting tool does not need to actually use recorded conversations to train its models. The mere capability to do so may be enough to classify the tool as an unauthorized interceptor rather than an extension of the user's recording.
Popular AI Meeting Recording Tools and Massachusetts Compliance
| Tool | Default Behavior | Massachusetts Risk Level |
|---|---|---|
| Otter.ai | Bot joins meetings automatically; host can enable/disable consent prompts | High: default settings may not adequately disclose recording to all participants |
| Fireflies.ai | Bot joins as a participant; Zoom shows recording consent pop-up if configured | Moderate to High: depends on platform settings and whether all participants see the notification |
| Zoom AI Companion | Built-in transcription; shows recording indicator to participants | Moderate: recording indicator visible, but audio-only participants may not see it |
| Microsoft Copilot | Integrated into Teams; shows transcription notification | Moderate: notification appears in chat, but may be missed in large meetings |
| Google Gemini in Meet | Built-in summarization; shows recording banner | Moderate: visual banner present, but phone-in participants cannot see it |
| Read.ai | Bot joins meetings; sends consent requests to participants | Lower: proactive consent requests, but still depends on all participants receiving and understanding them |
The safest approach in Massachusetts is to verbally announce at the start of every meeting that an AI tool is recording and transcribing the conversation, then confirm that all participants acknowledge the disclosure.
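That verbal-disclosure practice can be backed by simple record-keeping. The sketch below is a hypothetical helper, not part of any vendor's product: it refuses to bless the start of recording until every participant has acknowledged the announcement, and returns a timestamped record suitable for logging.

```python
from datetime import datetime, timezone


def disclosure_record(meeting_id: str, tool: str,
                      acknowledgments: dict[str, bool]) -> dict:
    """Build a timestamped record of a verbal recording disclosure.

    `acknowledgments` maps each participant's name to whether they
    acknowledged the announcement. Raises RuntimeError if anyone has
    not acknowledged, signaling that recording should not begin.
    (Hypothetical compliance helper for illustration.)"""
    missing = [name for name, ok in acknowledgments.items() if not ok]
    if missing:
        raise RuntimeError(
            f"Do not start {tool}: no acknowledgment from {', '.join(missing)}")
    return {
        "meeting_id": meeting_id,
        "tool": tool,
        "disclosed_at": datetime.now(timezone.utc).isoformat(),
        "acknowledged_by": sorted(acknowledgments),
    }
```

Keeping such records does not by itself satisfy Section 99, but it documents that the recording was not secret at the time it occurred.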
Penalties for Violating Massachusetts Recording Laws
Massachusetts imposes one of the harshest penalty frameworks in the country for recording violations.
Criminal Penalties
| Offense | Classification | Maximum Prison | Maximum Fine |
|---|---|---|---|
| Willful interception of communications | Felony | 5 years state prison | $10,000 |
| Possession of intercepting device with intent | Misdemeanor | 2 years house of correction | $5,000 |
| Disclosure of illegally intercepted communications | Felony | 5 years state prison | $10,000 |
Under the February 2024 jury instructions issued by the Massachusetts court system, prosecutors must prove that the defendant willfully committed, attempted, or procured the interception, and that the recording was "secret" (meaning at least one party was unaware). Intent to use the recording for a harmful purpose is not required for conviction.
Civil Remedies
Victims of illegal interception may sue for actual damages, statutory damages, punitive damages, and reasonable attorney's fees. In practice, class action lawsuits against AI recording companies could expose vendors to significant aggregate liability, as each recorded meeting participant in Massachusetts could be a separate claimant.
Evidentiary Consequences
Recordings obtained in violation of Section 99 are inadmissible in criminal proceedings. However, Massachusetts courts have ruled that illegally obtained recordings remain admissible as evidence in civil cases. The statute's remedies are limited to criminal prosecution and civil damages, not exclusion from civil litigation.
Attorney General's AI Advisory (April 2024)
On April 16, 2024, Massachusetts Attorney General Andrea Joy Campbell issued an advisory providing guidance on how the Commonwealth's consumer protection, anti-discrimination, and data security laws apply to artificial intelligence. This was one of the first formal AI advisories issued by any state attorney general in the country.
Key Points for AI Meeting Tools
The advisory identifies several practices that may violate Massachusetts' consumer protection law (M.G.L. c. 93A), including:
- Falsely advertising the quality, value, or usability of AI systems
- Misrepresenting the reliability or performance of AI products
- Misrepresenting audio or video content of a person, including through deepfakes, voice cloning, or chatbots used to engage in fraud
- Supplying defective or unusable AI systems
For AI meeting recording tools, these provisions mean that vendors cannot overstate their compliance features, must accurately represent how recorded data is used, and face potential enforcement action if their tools facilitate unauthorized recording.
Data Privacy Requirements
The advisory confirms that AI systems must comply with Massachusetts' Standards for the Protection of Personal Information (201 CMR 17.00). AI developers, suppliers, and users must safeguard personal information collected through recording and transcription, comply with breach notification requirements, and implement appropriate security measures for stored conversation data.
Enforcement Signal
The advisory does not create new law, but it signals the Attorney General's intent to apply existing statutes aggressively to AI technologies. Companies deploying AI meeting tools in Massachusetts should treat this advisory as a compliance baseline.
Employer and Workplace Recording Rules
Employer Obligations
Massachusetts employers who use AI tools to record or transcribe meetings must comply with Section 99 in full. Recording cannot be secret, so employers must notify all meeting participants that AI recording is active. Monitoring policies should be documented in employee handbooks or workplace agreements.
Employers cannot argue that employees implicitly consented to AI recording by accepting a job offer or signing a general technology use policy. Section 99 requires that recording not be secret at the time it occurs, which means real-time awareness is essential.
Remote and Hybrid Meetings
Remote work has amplified the complexity of Massachusetts recording compliance. When a meeting includes participants in multiple states, the strictest applicable law governs. If even one participant is located in Massachusetts, the entire meeting is subject to Section 99's prohibition on secret recording.
Employers with distributed workforces should implement a company-wide policy requiring verbal disclosure of AI recording at the start of every meeting, regardless of where participants are located. This approach eliminates the need to determine which state's law applies to each individual participant.
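The strictest-applicable-law policy can be sketched as a lookup: if any participant sits in an all-party (or, in Massachusetts' case, no-secret-recording) jurisdiction, treat the whole meeting under that standard. The state-to-rule table below is a simplified, partial assumption for illustration; it is not an authoritative survey of state law.

```python
# Hypothetical, simplified per-state consent rules (NOT legal advice).
CONSENT_RULES = {
    "MA": "all_parties",  # Section 99: no secret recording
    "CA": "all_parties",
    "NY": "one_party",
    "TX": "one_party",
}


def required_standard(participant_states: set[str]) -> str:
    """Return the consent standard governing a multi-state meeting by
    applying the strictest rule among participants' locations.
    Unknown jurisdictions default to the strict standard."""
    for state in participant_states:
        if CONSENT_RULES.get(state, "all_parties") == "all_parties":
            return "all_parties"
    return "one_party"
```

In practice, verifying each participant's physical location at meeting time is unreliable, which is why the article's recommendation of a uniform verbal-disclosure policy is simpler than per-meeting legal analysis.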
Job Interviews and Candidate Screening
AI tools that record or analyze job interviews (including tools that assess vocal tone, facial expressions, or language patterns) present particular risk in Massachusetts. Candidates must be clearly informed before any recording or AI analysis begins. Failure to disclose AI recording during an interview could expose the employer to felony wiretap charges in addition to civil liability.
Proposed Legislation: Senate Bill S.1215
Massachusetts legislators have recognized that Section 99's strict prohibition on secret recording can produce unjust outcomes. Senator Patrick O'Connor introduced Senate Bill S.1215 in the 194th General Court (2025-2026 session), which would create a legal defense for individuals who secretly record threats, harassment, or other crimes.
The bill was motivated by a 2022 case in which a woman with a restraining order against her husband was charged with eight counts of wiretap violations after she secretly recorded him making threats. As of April 2026, S.1215 remains pending in committee. The bill does not address AI meeting recording directly, but its progress reflects ongoing debate about whether Section 99's all-or-nothing approach remains appropriate.
Sources and References
- Mass. Gen. Laws ch. 272, Section 99 - Wiretapping Statute (malegislature.gov)
- Massachusetts Jury Instructions 7.560 - Wiretapping, February 2024 (mass.gov)
- AG Campbell AI Advisory - Consumer Protection and AI, April 2024 (mass.gov)
- Massachusetts Law About Artificial Intelligence (mass.gov)
- Massachusetts Law About Privacy (mass.gov)
- Brewer v. Otter.ai Class Action - NPR, August 2025 (npr.org)
- Ambriz v. Google - AI Wiretapping Ruling - Courthouse News, 2025 (courthousenews.com)
- 18 U.S.C. Section 2511 - Federal Wiretap Act (law.cornell.edu)
- Senate Bill S.1215 - Recording Exception for Threats/Harassment (senatoroconnor.com)