EU Recording Laws: GDPR, AI Act, Consent Rules, and Country-by-Country Guide (2026)

Recording a phone call, video conference, or in-person conversation anywhere in the European Union is governed by at least two legal frameworks simultaneously, and sometimes three. The member state's criminal code determines whether recording is a criminal act. GDPR determines whether the recording is lawful personal-data processing. And since 2025, the EU AI Act imposes additional obligations whenever artificial intelligence tools are involved in capturing, generating, or altering recordings.
This guide explains how those layers interact, maps the consent rules across all 27 EU member states, and links to every country-specific page for detailed analysis.
Quick Answer: Is There a Single EU Recording Law?
No. The European Union has never enacted a unified recording-consent statute. Recording law in Europe operates on a federal-style structure: the EU sets data-protection and AI rules that apply uniformly, but the underlying criminal prohibition on recording without consent is left entirely to each member state.
The practical consequence is that the same phone call may be perfectly legal on one end and a criminal offense on the other end, depending on which country's law applies to each participant.
What the EU does provide uniformly is the data-protection layer. GDPR applies in every member state, to every controller and processor, without exception. So regardless of whether recording is criminally permitted, it must also satisfy GDPR's lawful-basis, transparency, and data-minimisation requirements.
Why There Is No Single EU Recording Law
The EU's legislative competence does not extend to member states' criminal law on private communications. Article 83 of the Treaty on the Functioning of the European Union (TFEU) allows the EU to establish minimum rules for particularly serious cross-border crimes, but wiretapping and private recording have never been harmonised under that power.
Instead, EU law addresses the field obliquely through data protection and telecommunications regulation. The ePrivacy Directive 2002/58/EC requires member states to prohibit interception of electronic communications without consent, but it is a directive, not a regulation. Each member state transposed it differently, and the criminal penalties each attached to violations vary widely.
The result is 27 different sets of criminal rules, all wrapped in a uniform GDPR obligation and now a uniform AI Act obligation. Understanding EU recording law means understanding both the uniform EU layers and the country-by-country criminal rules that sit beneath them.
The EU Legal Layers That Apply Everywhere
GDPR as the Universal Overlay

The General Data Protection Regulation (Regulation 2016/679) has applied directly in all member states since 25 May 2018. It is a regulation, not a directive, meaning it has uniform effect without national transposition.
Any voice recording of an identifiable person constitutes personal data under Article 4(1) GDPR. Processing that personal data -- including capturing, storing, accessing, and sharing the recording -- requires a lawful basis under Article 6. The six available bases are:
- Consent (Article 6(1)(a)): Freely given, specific, informed, and unambiguous agreement to be recorded.
- Contract performance (Article 6(1)(b)): Recording is necessary to fulfil a contract with the recorded person.
- Legal obligation (Article 6(1)(c)): EU or national law requires the recording (for example, MiFID II for financial transactions).
- Vital interests (Article 6(1)(d)): Recording is necessary to protect someone's life.
- Public task (Article 6(1)(e)): Recording is necessary for a task carried out in the public interest under EU or national law.
- Legitimate interests (Article 6(1)(f)): A legitimate interest is being pursued, it is necessary for that purpose, and it is not overridden by the recorded person's rights and freedoms.
Choosing a lawful basis does not end the analysis. GDPR also requires transparency (Article 13/14 notices must be provided before or at the time of recording), purpose limitation (recordings cannot be repurposed without a compatible new basis), data minimisation (record only what is needed), storage limitation (define and enforce a retention period), and appropriate security (Article 32).
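The documentation duties above can be sketched as a simple record a controller might keep for each recording. Every field and class name here is illustrative, invented for the example, not a statutory or library term:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative record of the GDPR decisions a controller should document
# before recording begins. Field names are hypothetical, not legal terms.
@dataclass
class RecordingJustification:
    purpose: str        # purpose limitation: one defined purpose
    lawful_basis: str   # one of the six Article 6(1) bases
    notice_given: bool  # Article 13/14 transparency notice provided
    retention_days: int # storage limitation: a defined, finite period

    def retention_deadline(self, recorded_on: date) -> date:
        # Storage limitation: the date by which deletion is due.
        return recorded_on + timedelta(days=self.retention_days)

call = RecordingJustification(
    purpose="evidence of commercial transaction",
    lawful_basis="legal obligation (Art. 6(1)(c), MiFID II)",
    notice_given=True,
    retention_days=5 * 365,
)
print(call.retention_deadline(date(2026, 5, 1)))
```

The point of the structure is that no field may be left blank: a recording with no defined purpose, basis, notice, or retention period fails GDPR regardless of national consent rules.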
The household exemption under Article 2(2)(c) GDPR (explained in Recital 18) excludes purely personal or household recording from GDPR's scope. Recording a family conversation for personal reference is exempt. Sharing it, publishing it, or using it commercially removes the exemption immediately.
The journalistic exemption under Article 85 requires member states to reconcile data protection with freedom of expression for journalistic, academic, artistic, or literary purposes. Member states implement this exemption with significant variation. The Court of Justice of the EU (Buivids, C-345/17, 2019) held that publishing a recording of police officers on YouTube could qualify for the journalistic exemption if the purpose was to disclose information to the public.
ePrivacy Directive 2002/58/EC
The ePrivacy Directive governs the confidentiality of electronic communications. Article 5(1) prohibits member states from permitting listening, tapping, storage, or surveillance of electronic communications without the consent of the users concerned.
Article 5(2) creates a narrow business exception: recording is permitted when it is carried out in the course of a lawful business practice for the purpose of providing evidence of a commercial transaction. This is the legal foundation for recorded customer service and sales calls, provided that prior notice is given to callers.
The proposed ePrivacy Regulation, which would have replaced the Directive with a directly applicable regulation, was first proposed in January 2017. After eight years of legislative deadlock, the European Commission formally withdrew the proposal in February 2025 as part of a broader regulatory simplification drive. The 2002 Directive, as amended in 2009, remains the operative framework.
Because the ePrivacy Directive is a directive, its Article 5 requirements were transposed into national law differently across member states. Germany's implementation (TDDDG), France's implementation (under the Code des postes et des communications électroniques), and the implementations in other member states each add country-specific nuances.
EU AI Act: Articles 5 and 50 (Regulation 2024/1689)
The EU AI Act (Regulation 2024/1689) entered into force on 1 August 2024. Its provisions apply in stages.
Article 5 -- Prohibited Practices (in force 2 February 2025). Article 5 bans a category of AI applications outright. The prohibition most directly relevant to recording is the ban on real-time remote biometric identification systems used in publicly accessible spaces for law enforcement purposes. "Real-time" biometric identification means processing live video or audio feeds to identify individuals by comparing their biometric data against a database. This practice -- exemplified by facial recognition on CCTV feeds -- is prohibited as a default.
Member states may enact limited exceptions by national law for specifically enumerated serious crimes, targeted searches for missing persons, and imminent terrorist threats, subject to prior judicial or independent administrative authorisation in most cases. The Commission published its Guidelines on Prohibited AI Practices in April 2025 to clarify the scope of each prohibition.
Article 5 also prohibits AI systems that categorise individuals based on their biometric data to infer sensitive attributes such as political opinion, religious belief, or sexual orientation. Recording systems equipped with AI that automatically profiles speakers by such attributes fall within this prohibition.
Article 50 -- Transparency for AI-Generated Content (applies 2 August 2026). Article 50 imposes disclosure obligations on providers and deployers of AI systems that generate synthetic audio, image, video, or text. The two obligations with the most direct relevance to recording and media are:
First, AI-generated or AI-manipulated audio, image, or video content (deepfakes) must be disclosed as artificially generated or manipulated when it depicts real persons. The obligation falls on deployers, and the disclosure must be made in a clear and distinguishable manner, except where the content is covered by a narrow exception for authorised law-enforcement use or proportionate artistic works with appropriate disclosure.
Second, AI-generated text published to inform the public must be disclosed as AI-generated unless it has undergone genuine human review and a natural or legal person assumes editorial responsibility for it.
Penalties for non-compliance with Article 50 reach EUR 15 million or 3 percent of worldwide annual turnover, whichever is higher.
The Commission published the second draft Code of Practice on Transparency of AI-Generated Content in March 2026, open for stakeholder feedback. The Code is intended to bridge the gap between the legal obligation and the technical standards needed to implement machine-readable labelling. Full Article 50 obligations apply from 2 August 2026.
Directive 2024/1385: Combating Violence Against Women
Directive 2024/1385 of 14 May 2024 on combating violence against women and domestic violence criminalises several forms of digital and AI-facilitated abuse at EU level. The two provisions most relevant to recording and synthetic media are:
Article 5 -- Non-consensual production and dissemination of intimate images. Member states must criminalise the production, distribution, or making accessible of real or realistic AI-generated intimate images of a person without their consent where this is likely to cause serious harm to the victim. This directly covers deepfake pornography and non-consensual intimate audio or video recordings.
Article 7 -- Cyber-harassment. Member states must criminalise repeatedly sending threatening or abusive material through electronic communications, including audio and video recordings, to intimidate a person.
The Directive entered into force on 13 June 2024. Member states must transpose it into national law by 14 June 2027.
EU Charter of Fundamental Rights and ECHR Article 8
Two rights-based frameworks constrain how recording law is applied across the EU.
EU Charter Articles 7 and 8 guarantee, respectively, respect for private and family life, home, and communications (Article 7) and protection of personal data (Article 8). These apply when EU law is being implemented or applied, including in proceedings before the CJEU. National recording laws that implement EU directives must comply with the Charter.
European Convention on Human Rights Article 8 guarantees the right to respect for private and family life. The European Court of Human Rights (ECtHR) has developed a substantial body of case law interpreting Article 8 in the context of recordings, surveillance, and publication of private information.
The landmark von Hannover line of cases (von Hannover v. Germany, Application 59320/00, judgment of 24 June 2004; von Hannover v. Germany (No. 2), Applications 40660/08 and 60641/08, Grand Chamber judgment of 7 February 2012) established that Article 8 protects individuals not only against interference by public authorities but also against private persons and media outlets. The 2012 Grand Chamber judgment set a proportionality balancing test: publication or use of recordings involving private individuals requires weighing the contribution to a debate of general public interest against the degree of interference with the person's private sphere.
Article 10 ECHR (freedom of expression) operates on the other side of this balance. The ECtHR has consistently held that filming and publishing recordings of public officials, including police, in the performance of their duties can fall within Article 10 protection. Interference with that right must be necessary and proportionate.
Member-State Consent Rules: The Country Table

Each row reflects the criminal-law default as of May 2026. GDPR applies in every row. The country-spoke links below carry full statutory citations, penalties, and recent enforcement actions.
| Country | Consent Rule | Key Statute | Country Guide |
|---|---|---|---|
| Austria | All-party | StGB s. 120 | Austria |
| Belgium | One-party | Art. 314bis CP | Belgium |
| Bulgaria | All-party | Criminal Code Art. 145 | Bulgaria |
| Croatia | All-party | Criminal Code Art. 142 | Croatia |
| Cyprus | All-party | Constitutional Arts. 15/17 | Cyprus |
| Czech Republic | One-party | Criminal Code s. 182 | Czech Republic |
| Denmark | One-party | Criminal Code s. 263 | Denmark |
| Estonia | One-party | Penal Code s. 156-157 | Estonia |
| Finland | One-party | Criminal Code Ch. 38 | Finland |
| France | All-party | Code pénal Art. 226-1 | France |
| Germany | All-party | StGB s. 201 | Germany |
| Greece | All-party | Penal Code Art. 370A | Greece |
| Hungary | All-party | Criminal Code s. 422 | Hungary |
| Ireland | One-party | No explicit criminal prohibition on participant recording | Ireland |
| Italy | One-party | Criminal Code Art. 617 | Italy |
| Latvia | One-party | Criminal Law s. 144 | Latvia |
| Luxembourg | All-party | Penal Code Art. 509-1 | Luxembourg |
| Netherlands | One-party | Criminal Code s. 139a | Netherlands |
| Poland | One-party | Penal Code Art. 267 | Poland |
| Portugal | All-party | Penal Code Art. 190-194 | Portugal |
| Romania | One-party | Criminal Code Art. 302 | Romania |
| Slovakia | All-party | Criminal Code s. 377 | Slovakia |
| Slovenia | All-party | Criminal Code Art. 137 | Slovenia |
| Spain | One-party | Criminal Code Art. 197 | Spain |
| Sweden | One-party | Criminal Code Ch. 4 s. 9a | Sweden |
Note on GDPR. Every country in this table is also subject to GDPR. A recording that is criminally lawful under the national consent rule in column two may still violate GDPR if the recorder lacks a valid lawful basis under Article 6, fails to provide a privacy notice, or retains the recording without a defined retention schedule. These obligations are separate and cumulative.
GDPR Enforcement Decisions on Recording
Two data protection authority decisions illustrate how GDPR operates in practice alongside member-state criminal law.
Denmark -- Datatilsynet v. TDC (2019). Denmark is a one-party consent country: a participant may record a call under Danish criminal law without telling the other party. Despite this, the Danish Data Protection Authority ruled against TDC A/S, Denmark's largest telecommunications company, because TDC recorded customer service calls for training purposes without a valid GDPR lawful basis. Telling callers that calls "may be recorded" was insufficient. Because training is not contract performance and not a legal obligation, TDC needed specific, affirmative consent from each caller. The ruling drew a bright line between criminal lawfulness and GDPR compliance.
Poland -- UODO v. Warsaw Centre (2022). Poland's data protection authority fined a public facility for operating cameras equipped with microphones that captured audio continuously. The facility had legal authorisation for video surveillance but none for audio recording. UODO held that audio capture requires its own independent legal basis under Article 6 GDPR, entirely separate from any authorisation covering video. The fine was modest (PLN 10,000) but the principle was clear: you cannot extend a video-surveillance legal basis to cover audio without a fresh analysis.
EDPB Guidelines 1/2024 on Legitimate Interests. In October 2024, the European Data Protection Board finalised its guidelines on Article 6(1)(f) legitimate interests. The guidelines confirm a three-step test: purpose test (is there a legitimate interest?), necessity test (is recording necessary for that purpose?), and balancing test (do the data subject's rights override the controller's interest?). Fraud prevention and evidence preservation can qualify; general quality monitoring and training, absent consent, require careful case-by-case analysis.
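The three-step test can be expressed as ordered checks. The sketch below (a hypothetical function, not from any official source) only encodes the order and logic of the steps; each boolean input stands in for a documented legal assessment that cannot itself be automated:

```python
def legitimate_interest_passes(has_legitimate_purpose: bool,
                               recording_is_necessary: bool,
                               subjects_rights_override: bool) -> bool:
    """Sketch of the EDPB Guidelines 1/2024 three-step test for
    Article 6(1)(f). Each argument represents the outcome of a
    documented legal assessment, supplied by the controller."""
    if not has_legitimate_purpose:       # 1. purpose test
        return False
    if not recording_is_necessary:       # 2. necessity test
        return False
    return not subjects_rights_override  # 3. balancing test

# Fraud prevention, recording necessary, balance favours the controller:
print(legitimate_interest_passes(True, True, False))  # True
# Same purpose, but the data subject's rights override:
print(legitimate_interest_passes(True, True, True))   # False
```

The sequencing matters: a controller never reaches the balancing test if the purpose or necessity step already fails, which is why the guidelines require each step to be documented separately.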
Recording Police and Public Officials Across the EU

Recording police officers and other public officials in the performance of their duties occupies a legally distinct space from recording private conversations. Two competing frameworks apply.
ECHR Article 10 supports recording police. The European Court of Human Rights has consistently treated filming and publishing recordings of public officials performing their duties as protected expression under Article 10 ECHR. Interference with that right must be necessary and proportionate. Member states cannot criminalise recording police solely on the basis that the officer did not consent.
GDPR applies to the recording itself. Publishing a recording of a police officer involves processing their personal data, even though they are acting in a public capacity. The CJEU's Buivids judgment (C-345/17, 2019) confirmed that filming police in a police station and publishing the footage online constitutes personal-data processing. However, the journalistic exemption under Article 85 GDPR (transposed differently in each member state) can justify such processing when the purpose is to inform the public.
Country variation is significant. France explicitly recognises the right to film police in public spaces under the principle of freedom of the press and freedom of expression. Spain's Citizen Security Law (Ley Orgánica 4/2015) imposes fines for unauthorised publication of images of police that could endanger their safety, though this provision has been criticised as disproportionate and is under ongoing legal challenge. Germany's general all-party consent rule applies even to recording police in public, although the criminal-law prohibition is tempered by constitutional freedom-of-expression considerations when recordings serve a clear public interest.
The practical rule across most of the EU is: recording police performing public duties in a publicly accessible space is generally protected expression. Publishing recordings that identify private individuals captured incidentally in the same scene may engage GDPR obligations. Using AI tools to analyse the recording for biometric identification of police officers in real time is prohibited under EU AI Act Article 5.
Cross-Border Recording Within the EU
Which Country's Criminal Law Applies?
There is no EU-level rule that resolves jurisdictional conflicts between member states' recording statutes. When a call or conversation crosses a national border, both countries' criminal laws can theoretically apply. A call between a person in Germany (all-party consent) and a person in the Netherlands (one-party consent) engages both legal systems simultaneously.
The safest approach is always to comply with the stricter of the applicable rules. In the Germany-Netherlands example, that means treating the call as subject to all-party consent requirements.
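The stricter-of-the-rules approach is straightforward to encode. In this sketch the lookup table is a small excerpt of the country table above, and the function name is invented for the example:

```python
# Criminal-law consent defaults, excerpted from the country table above.
CONSENT_RULE = {
    "Germany": "all-party", "Netherlands": "one-party",
    "France": "all-party", "Denmark": "one-party",
    "Austria": "all-party", "Spain": "one-party",
}

def effective_rule(countries: list[str]) -> str:
    """Consent standard to apply to a cross-border call: if any
    participant sits in an all-party jurisdiction, treat the whole
    call as all-party (the stricter rule)."""
    rules = {CONSENT_RULE[c] for c in countries}
    return "all-party" if "all-party" in rules else "one-party"

print(effective_rule(["Germany", "Netherlands"]))  # all-party
print(effective_rule(["Denmark", "Spain"]))        # one-party
```

Note that this resolves only the criminal-law layer; the GDPR obligations described above apply to every call regardless of which consent rule wins.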
Cross-border enforcement is rare for individual private recordings. However, for businesses operating call centres or conducting sales calls across EU member states, the risk of violating the criminal law of a destination country is real and has resulted in enforcement action.
GDPR Territorial Scope
For GDPR purposes, Article 3 provides that the regulation applies wherever a controller or processor is established in the EU, or wherever the personal data of EU individuals is processed by a non-EU entity offering goods or services to EU individuals or monitoring their behaviour.
When a cross-border personal-data processing operation involves establishments in multiple member states, the lead supervisory authority mechanism under Articles 56 and 60 applies. The DPA at the controller's main EU establishment leads the investigation, with other concerned DPAs participating as co-supervisors.
In November 2025, the EU Council adopted a new regulation to accelerate cross-border GDPR enforcement, addressing longstanding frustrations with coordination timelines. The new rules introduce mandatory deadlines for lead supervisory authorities to share draft decisions with concerned DPAs, shortening the average enforcement timeline for cross-border cases.
Practical Guidance for Multinational Operations
Any multinational business recording calls from a base in a one-party-consent country while placing calls to customers in all-party-consent countries must build its recording policy around the all-party standard. Applying the permissive law of the recorder's home country to callers in stricter jurisdictions is the most common compliance mistake in EU call-centre operations.
For businesses subject to MiFID II (financial services), the legal obligation to record transaction-related communications under Article 16(7) of that directive provides a clear GDPR lawful basis under Article 6(1)(c). The recording obligation coexists with national criminal-law consent requirements: even MiFID II-regulated firms must give prior notice to callers in all-party consent jurisdictions.
Deepfakes and Synthetic Audio-Visual Content
The Evolving Legal Framework
Three distinct legal instruments now govern deepfakes and synthetic recordings in the EU, with overlapping but not identical scopes.
EU AI Act Article 50 (from 2 August 2026) imposes two layers of labelling for realistic synthetic audio, image, or video of a real person: providers must mark outputs as artificially generated in machine-readable form (for example, in the content's metadata), and deployers creating deepfakes must additionally make a clear, user-visible disclosure. Penalties reach EUR 15 million or 3 percent of global turnover.
Directive 2024/1385 Article 5 (transposition deadline 14 June 2027) criminalises the production and distribution of realistic AI-generated intimate images of a real person without consent. This covers deepfake pornography specifically and is not limited to content that is labelled or not labelled. The prohibition applies regardless of whether the content is disclosed as synthetic. Member states must provide victims with access to rapid removal orders and law enforcement assistance.
GDPR applies to deepfakes as personal data. A realistic AI-generated image or audio clip of an identifiable person constitutes personal data -- arguably biometric data if it enables identification. Processing requires a lawful basis, which deepfake production targeting a private individual without consent cannot normally satisfy.
Practical Consequences
For individuals, creating a realistic deepfake of another person without consent -- whether for intimate purposes or political disinformation -- will be criminally prohibited across the EU once Directive 2024/1385 is transposed, with civil and GDPR liability also available.
For platforms and AI tool providers, Article 50 AI Act means that tools generating realistic human voice or likeness must implement mandatory labelling infrastructure by 2 August 2026. The Commission's March 2026 draft Code of Practice on AI-Generated Content sets out technical standards for machine-readable watermarking and visible labelling.
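As an illustration only, a deployer-side disclosure might attach a machine-readable flag to a media file's metadata. The field names below are invented for this example; the actual marking and watermarking formats will be set by the Code of Practice and by provenance standards such as C2PA:

```python
import json

def label_synthetic_media(metadata: dict) -> dict:
    """Illustrative only: attach a machine-readable AI-generation
    disclosure to a media item's metadata. Field names are
    hypothetical, not drawn from any standard."""
    labelled = dict(metadata)  # leave the original metadata untouched
    labelled["ai_generated"] = True
    labelled["disclosure"] = (
        "This content has been artificially generated or manipulated."
    )
    return labelled

clip = {"title": "interview.wav", "duration_s": 94}
print(json.dumps(label_synthetic_media(clip), indent=2))
```

A real implementation would embed the marking in the media container itself and cryptographically bind it to the content, so that stripping the label is detectable; a plain metadata field, as here, is only a sketch of the disclosure requirement.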
Business Call Recording: Compliance Checklist
For any business recording calls with EU customers or employees, the following steps are required.
State the purpose at the start of every call. Tell the caller that the call is being recorded, explain why, and specify how long the recording will be kept. Vague statements like "calls may be recorded for quality purposes" do not satisfy GDPR's transparency requirement.
Identify your lawful basis before recording begins. The most defensible bases for business recording are: legal obligation (MiFID II, EMIR, or equivalent sector regulation), contract performance (if recording is necessary to fulfil a specific contract term), or legitimate interests (for fraud prevention, evidence preservation, or regulatory dispute resolution -- supported by a documented balancing test).
Give callers in all-party-consent countries specific notice and, for consent-based recording, a genuine choice. A caller in Germany, France, Austria, Bulgaria, Greece, Portugal, Croatia, Slovakia, Slovenia, Luxembourg, Hungary, or Cyprus must be told clearly and given an opportunity to exercise their rights before any recording begins.
Set and enforce a retention schedule. Unlimited retention is never compliant. Define maximum retention periods tied to the specific purpose of each recording, document those periods in your data-retention policy, and implement automated deletion.
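A retention schedule only helps if it is enforced mechanically. A minimal sketch of automated expiry, assuming hypothetical purpose categories and retention periods:

```python
from datetime import date, timedelta

# Hypothetical retention policy: maximum retention per recording purpose.
# Periods are illustrative; actual limits depend on the documented purpose
# and any sector rules (MiFID II, for instance, mandates multi-year retention).
RETENTION_DAYS = {
    "mifid_transaction": 5 * 365,
    "quality_training": 90,
    "fraud_investigation": 365,
}

def due_for_deletion(recordings: list[dict], today: date) -> list[str]:
    """Return IDs of recordings past their purpose-specific retention
    period. Field names are illustrative, not statutory."""
    expired = []
    for r in recordings:
        limit = timedelta(days=RETENTION_DAYS[r["purpose"]])
        if today - r["recorded_on"] > limit:
            expired.append(r["id"])
    return expired

store = [
    {"id": "c1", "purpose": "quality_training", "recorded_on": date(2026, 1, 2)},
    {"id": "c2", "purpose": "fraud_investigation", "recorded_on": date(2026, 1, 2)},
]
print(due_for_deletion(store, date(2026, 5, 1)))  # ['c1']
```

Tying each recording to a purpose, rather than applying one blanket period, is what makes the schedule defensible under the purpose-limitation and storage-limitation principles.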
Conduct a Data Protection Impact Assessment. Systematic or large-scale call recording qualifies as high-risk processing under Article 35 GDPR, requiring a DPIA before the programme is launched or significantly changed.
Address AI Act requirements for any AI-assisted recording. If call-recording software uses AI to transcribe, analyse sentiment, identify speakers, or generate summaries, assess whether those AI functions fall within the EU AI Act's prohibited, high-risk, or transparency-obligation categories. Real-time biometric identification using voice to identify unknown individuals is prohibited under Article 5 AI Act.
How to Find Your Country's Specific Rules
For the detailed statutory text, criminal penalties, recent DPA enforcement actions, court decisions, and practical guidance for each EU member state, use the country spokes below. Each page covers the specific criminal statute, GDPR implementation, ePrivacy transposition, and the latest developments in that jurisdiction.

EU Member States with Country Guides:
Austria Recording Laws -- All-party consent, StGB section 120, up to one year imprisonment.
Belgium Recording Laws -- One-party consent, Article 314bis Criminal Code, GDPR enforcement by the APD.
Bulgaria Recording Laws -- All-party consent, Criminal Code Article 145, GDPR applied through CPDP.
Croatia Recording Laws -- All-party consent, Criminal Code Article 142, GDPR applied through AZOP.
Cyprus Recording Laws -- All-party consent under constitutional Articles 15 and 17, GDPR applied through the Commissioner for Personal Data Protection.
Czech Republic Recording Laws -- One-party consent, Criminal Code section 182, UOOU oversight.
Denmark Recording Laws -- One-party consent, Criminal Code section 263, Datatilsynet active GDPR enforcement.
Estonia Recording Laws -- One-party consent, Penal Code sections 156-157, AKI oversight.
Finland Recording Laws -- One-party consent, Criminal Code Chapter 38, Tietosuojavaltuutetun toimisto guidance.
France Recording Laws -- All-party consent, Code pénal Article 226-1, CNIL active enforcement.
Germany Recording Laws -- All-party consent, StGB section 201, up to three years imprisonment, BfDI and state DPAs.
Greece Recording Laws -- All-party consent, Penal Code Article 370A, up to five years imprisonment, HDPA.
Hungary Recording Laws -- All-party consent, Criminal Code section 422, NAIH oversight.
Ireland Recording Laws -- One-party consent (no explicit criminal prohibition on participant recording), DPC active GDPR enforcement.
Italy Recording Laws -- One-party consent, Criminal Code Article 617, Garante active enforcement.
Latvia Recording Laws -- One-party consent, Criminal Law section 144, DVI oversight.
Luxembourg Recording Laws -- All-party consent, Penal Code Article 509-1, CNPD oversight.
Netherlands Recording Laws -- One-party consent, Criminal Code section 139a, AP active enforcement.
Poland Recording Laws -- One-party consent, Penal Code Article 267, UODO active enforcement.
Portugal Recording Laws -- All-party consent, Penal Code Articles 190-194, CNPD oversight.
Romania Recording Laws -- One-party consent, Criminal Code Article 302, ANSPDCP oversight.
Slovakia Recording Laws -- All-party consent, Criminal Code section 377, UOOU oversight.
Slovenia Recording Laws -- All-party consent, Criminal Code Article 137, IP RS oversight.
Spain Recording Laws -- One-party consent, Criminal Code Article 197, AEPD active enforcement.
Sweden Recording Laws -- One-party consent, Criminal Code Chapter 4 section 9a, IMY oversight.
Sources and References
- GDPR, Regulation (EU) 2016/679 (eur-lex.europa.eu)
- ePrivacy Directive 2002/58/EC (eur-lex.europa.eu)
- EU AI Act, Regulation (EU) 2024/1689 (eur-lex.europa.eu)
- Directive (EU) 2024/1385 on combating violence against women (eur-lex.europa.eu)
- EDPB Guidelines 1/2024 on legitimate interest (edpb.europa.eu)
- Commission Guidelines on Prohibited AI Practices (digital-strategy.ec.europa.eu)
- von Hannover v. Germany (No. 2), ECtHR 2012 (hudoc.echr.coe.int)
- Buivids, C-345/17, CJEU 2019 (eur-lex.europa.eu)
- UODO (Poland): audio recording requires its own legal basis, 2022 (edpb.europa.eu)
- Code pénal Art. 226-1, France (legifrance.gouv.fr)
- Finnish Data Protection Ombudsman guidance on phone calls (tietosuoja.fi)
- EDPS overview of the ePrivacy Directive (edps.europa.eu)
- Council regulation on cross-border GDPR enforcement, 2025 (consilium.europa.eu)
- Commission draft Code of Practice on Transparency of AI-Generated Content, 2026 (digital-strategy.ec.europa.eu)