Denmark’s 2026 reforms to its copyright and personality‑rights framework have made it one of the first countries in the world to grant individuals copyright‑style protection over their facial features, voice, and other distinguishing personal characteristics when those characteristics are reproduced in AI‑generated content. For any business that uses real people’s likenesses in advertising, trains machine‑learning models on image or audio datasets, or publishes AI‑generated content depicting identifiable individuals, Denmark’s deepfake‑law reforms demand immediate operational changes. This guide translates the legislation into a practitioner‑level compliance checklist covering contracts, consent workflows, AI training‑data governance and ad‑production pipelines, so that in‑house counsel, marketing teams, AI product owners and creative agencies can act now rather than wait for the first enforcement action.
The reforms, which amend Denmark’s existing copyright framework and introduce complementary provisions on personality rights, create a new category of protection that industry observers have described as “copyright for likeness.” According to analysis published by the European Parliamentary Research Service (EPRS), Denmark’s approach is the most far‑reaching in the EU, granting natural persons an exclusive, enforceable right over the commercial and non‑commercial reproduction of their personal characteristics, including facial geometry, voice timbre, gait and other biometric markers, when that reproduction is generated or substantially assisted by artificial intelligence.
The reform does not fold likeness into traditional copyright (which protects original creative works). Instead, it creates a sui generis right, modelled on copyright‑style remedies, that attaches automatically to every natural person without registration. The practical effect is that the right‑holder, the person whose face, voice or body is reproduced, can authorise or refuse reproduction, distribution, public communication and adaptation of AI‑generated depictions of their personal characteristics, mirroring the bundle of exclusive rights available to authors under the Danish Copyright Act (Ophavsretsloven).
Denmark has historically protected personality rights through general tort law and the Marketing Practices Act (Markedsføringsloven). The 2026 reforms significantly expand the scope by explicitly listing the following protected characteristics:

- facial features and facial geometry
- voice and voice timbre
- gait and other distinctive body movements
- other biometric markers that make an individual identifiable
Crucially, protection extends to synthetic reproductions, meaning a fully AI‑generated image or voice clone that is recognisably derived from a real individual falls within scope, even if it was never directly copied from a photograph or recording.
| Date | Event | Practical implication |
|---|---|---|
| June 2025 | Danish government announces proposal to grant copyright‑style likeness protections | Businesses given notice of upcoming compliance obligations |
| Late 2025 | Parliamentary passage of amending legislation | Final text of reforms confirmed; businesses should begin contract and workflow review |
| Early 2026 | Reforms enter into force | All covered uses of faces, voices and biometric characteristics must comply; enforcement begins |
The scope of the reforms is intentionally broad. Any entity that creates, commissions, distributes or facilitates AI‑generated content reproducing a person’s protected characteristics can be liable. The following table maps the most common entity types to their primary obligations under Denmark’s 2026 framework.
| Entity type | Key obligations under Denmark 2026 reforms | Practical compliance steps & risk |
|---|---|---|
| Brand / Advertiser | Ensure lawful licence or consent before creating or distributing any likeness or voice; preserve proof of authorisation | Update talent releases; integrate legal sign‑off into every ad creative; remove content promptly on request |
| AI model trainer / vendor | Obtain licences for training data that includes identifiable likenesses; maintain provenance logs | Audit all datasets; add contract warranties and indemnities; implement exclusion lists for opted‑out individuals |
| Platform / Publisher | Take reasonable steps to remove infringing AI deepfakes on notice; cooperate with takedown orders | Implement rapid takedown process and record‑keeping; update terms of service to reflect new obligations |
| Creative agency | Verify that every likeness used in client work is properly licensed under the 2026 rules | Insert upstream representations from talent; carry professional indemnity insurance covering likeness claims |
| HR / ID‑verification provider | Biometric data used for verification may trigger the reforms if reproduced or stored as AI‑generated content | Review data‑processing agreements (DPAs); limit retention; align with GDPR and personality‑rights requirements |
Even businesses headquartered outside Denmark may be caught. Industry observers expect enforcement to follow a “targeting” test similar to GDPR: if your AI‑generated content depicts a person whose protected characteristics are recognisable and you target Danish consumers, the reforms are likely to apply regardless of where your servers or legal entity are located.
This operational checklist is the core of the compliance playbook. Each item addresses a discrete obligation or risk area. Businesses should work through the list systematically, assigning ownership to specific roles and tracking completion against deadlines.
The consent that Denmark now requires for AI training is among the most commercially significant aspects of the reforms. Businesses that train generative models on image, video or audio datasets must evaluate whether those datasets contain identifiable personal characteristics, and if they do, whether they hold a valid licence or consent to reproduce them.
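As an illustration, a dataset gate of this kind can be sketched in a few lines. The record schema (a `subject_id` field), the consent registry and the exclusion list are hypothetical conveniences, not structures prescribed by the reforms:

```python
def filter_training_set(records, consent_registry, exclusion_list):
    """Keep only records whose subject holds a valid AI-training licence
    and has not opted out. Records without a subject_id are treated as
    containing no identifiable person and pass through unchanged."""
    cleared = []
    for rec in records:
        subject = rec.get("subject_id")
        if subject is None:                # no identifiable person
            cleared.append(rec)
        elif subject in exclusion_list:    # opted out: always drop
            continue
        elif consent_registry.get(subject, {}).get("ai_training"):
            cleared.append(rec)            # licensed for training
    return cleared
```

Run before every ingestion job, a filter like this also produces the audit trail (what was dropped and why) that vendor due diligence will ask for.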
A traditional photographic model release typically grants the licensee permission to reproduce a specific image. Under the 2026 reforms, however, the act of training an AI model on that image, and subsequently generating new synthetic depictions, constitutes a separate reproduction of the person’s protected characteristics. This means a standard release may be insufficient. Businesses should seek an express, forward‑looking licence covering AI training, synthetic generation and distribution, with clear territorial and temporal scope.
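A minimal sketch of such a forward‑looking licence record, with illustrative (not statutory) field names, might look like this:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class LikenessLicence:
    """Illustrative record of a forward-looking likeness licence covering
    AI training, synthetic generation and distribution, with express
    territorial and temporal scope."""
    subject: str
    permits_ai_training: bool
    permits_synthetic_generation: bool
    permits_distribution: bool
    territories: set = field(default_factory=set)   # e.g. {"DK", "EU"}
    expires: Optional[date] = None                  # None = no end date

    def covers(self, use: str, territory: str, on: date) -> bool:
        """True only if the specific use, territory and date all fall
        inside the licence's express scope; anything else is outside it."""
        permitted = {
            "ai_training": self.permits_ai_training,
            "synthetic_generation": self.permits_synthetic_generation,
            "distribution": self.permits_distribution,
        }.get(use, False)
        in_territory = territory in self.territories
        in_time = self.expires is None or on <= self.expires
        return permitted and in_territory and in_time
```

The point of the structure is that each permission is a separate, explicit grant: a release that is silent on `ai_training` simply fails the check.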
One compliance strategy is to de‑identify training data before ingestion, stripping or blurring faces, pitch‑shifting voices, and removing metadata that could link data points to identifiable individuals. While de‑identification can reduce risk, it does not eliminate it: if the resulting AI output is still capable of generating content that is recognisably derived from a specific person, the personality‑rights obligation may still apply. Technical controls should therefore be combined with legal safeguards, not treated as a substitute.
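As a sketch of the de‑identification step, the box blur below operates on a grayscale image held as a list of pixel rows. A production pipeline would use a face detector to locate the region and would also strip file metadata (EXIF, GPS tags) on re‑encode; this fragment shows only the blur itself:

```python
def deidentify(pixels, face_box):
    """Blur a face region in a grayscale image (list of rows of ints).

    face_box = (left, top, right, bottom) in pixel coordinates; in
    practice these would come from a face-detection step."""
    left, top, right, bottom = face_box
    out = [row[:] for row in pixels]          # leave the original intact
    for y in range(top, bottom):
        for x in range(left, right):
            # Average the 3x3 neighbourhood, clamped to the image bounds.
            neighbours = [
                pixels[ny][nx]
                for ny in range(max(0, y - 1), min(len(pixels), y + 2))
                for nx in range(max(0, x - 1), min(len(pixels[0]), x + 2))
            ]
            out[y][x] = sum(neighbours) // len(neighbours)
    return out
```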
Many AI training pipelines operate across borders. A dataset assembled in the United States, processed in Ireland and used to generate content distributed in Denmark can trigger the Danish reforms if the output reproduces the protected characteristics of an identifiable person. The European Parliament’s EPRS analysis notes that Denmark’s approach could serve as a model for future EU‑wide harmonisation, increasing the likelihood that similar obligations will apply across the single market. Businesses should plan for the strictest applicable standard and build compliance processes that travel with the data.
For marketing teams, the reforms transform how campaigns are produced. Using faces in ads in Denmark, whether through photography, AI‑assisted retouching or fully synthetic generation, now requires documented authorisation that covers the specific personality‑rights obligations introduced by the 2026 reforms.
A pre‑2026 model release is likely sufficient only where the content involves a traditional photograph used in its original form, with no AI‑generated adaptation or synthetic extension. The moment an AI tool is used to alter, animate or extend the depiction, even subtly, such as generating additional poses from a single photograph, the enhanced personality‑rights protections are engaged and supplemental authorisation is needed.
Whether voice cloning is legal in Denmark depends entirely on authorisation. Cloning a person’s voice to generate synthetic speech for a commercial without their express consent is now actionable under the reforms. Endorsement agreements should specify whether the endorser’s voice may be cloned, the permitted uses, and the endorser’s right to approve or reject individual outputs. This is especially relevant for podcast advertising, automated customer‑service agents and interactive media that use voice synthesis.
A three‑step approval process is recommended for every campaign asset: (1) confirm that documented, 2026‑compliant authorisation exists for every identifiable person depicted; (2) confirm that the intended use, including any AI adaptation or synthetic extension, falls within the scope of that authorisation; (3) obtain legal sign‑off before the asset is published or distributed.
Understanding the enforcement landscape is essential for calibrating risk. Denmark’s deepfake‑law reforms provide a layered enforcement structure combining private civil remedies with administrative sanctions and, in serious cases, criminal liability under existing penal provisions.
| Enforcement route | Mechanism | Practical exposure |
|---|---|---|
| Civil remedies | Injunctions, takedown orders, damages (compensatory and in some cases punitive), account of profits | Private litigation initiated by the rights‑holder; costs and reputational risk can be significant |
| Administrative sanctions | Regulatory orders via the Danish Patent and Trademark Office (DKPTO) or relevant authority; potential fines for non‑compliance with takedown orders | Administrative fines and orders; platforms face escalating penalties for repeated non‑compliance |
| Criminal liability | Existing penal provisions on identity fraud, harassment and image‑based abuse may apply where deepfakes are used maliciously | Individual criminal prosecution; applies alongside civil and administrative routes |
Platforms and publishers bear particular exposure. If a platform receives a valid takedown notice and fails to act within a reasonable time, early indications suggest that Danish authorities are prepared to treat continued hosting as a separate infringement. Businesses should preserve forensic evidence (timestamps, metadata, access logs) from the moment a complaint is received, both to demonstrate good‑faith compliance and to defend against claims of wilful infringement.
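A minimal evidence‑preservation routine along these lines would hash the content exactly as received and timestamp it in an append‑only log; the file name and record fields below are illustrative, not mandated:

```python
import hashlib
import json
from datetime import datetime, timezone

def preserve_evidence(complaint_id: str, content: bytes, metadata: dict) -> dict:
    """Create a tamper-evident record for a takedown complaint: a UTC
    receipt timestamp plus a SHA-256 digest of the content as received,
    so later copies can be verified against the original."""
    record = {
        "complaint_id": complaint_id,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "metadata": metadata,
    }
    # An append-only JSON-lines file serves as the evidence log.
    with open("evidence_log.jsonl", "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")
    return record
```

Hashing at the moment of receipt matters: it lets the platform show both what it knew and when it knew it, which is exactly what a good‑faith compliance defence turns on.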
Template clauses implementing these obligations should be treated as starting points only: each must be reviewed and adapted by qualified legal counsel before use in any binding agreement.
The following risk matrix helps teams quickly assess the compliance posture of common use cases involving AI‑generated content under Denmark’s deepfake‑law reforms.
| Use case | Risk level | Recommended control |
|---|---|---|
| Fully synthetic, non‑identifiable faces generated from noise | Low | Standard production sign‑off; document that output is non‑identifiable; retain model provenance logs |
| AI‑enhanced photographs (retouching, background extension) of consented models | Medium | Verify that existing release covers AI adaptation; obtain supplemental consent if not; legal review |
| Voice cloning of a known individual for commercial content | High | Obtain express written consent covering cloning, specific outputs and distribution channels; senior legal approval required |
| Training a generative model on a dataset containing identifiable faces/voices | High | Full dataset audit; obtain licences or consents; implement exclusion‑list process; vendor due diligence; legal sign‑off |
| Distributing AI‑generated video depicting a recognisable public figure | Very high | Do not proceed without explicit, documented consent from the individual; board‑level approval; specialist legal advice essential |
Decision flow (simplified): (1) Does the content depict an identifiable person? If no → proceed with standard review. If yes → (2) Do you hold a valid, 2026‑compliant licence or consent? If yes → (3) Does the intended use fall within the scope of that consent? If yes → proceed with legal sign‑off. At any point where the answer is “no” or “uncertain,” escalate to legal counsel before proceeding.
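The same flow can be encoded so that any “no” or uncertain answer automatically routes the asset to counsel. The function below is a sketch of that routing logic, not a legal test; `None` stands for “uncertain”:

```python
from enum import Enum
from typing import Optional

class Decision(Enum):
    STANDARD = "proceed with standard review"
    PROCEED = "proceed with legal sign-off"
    ESCALATE = "escalate to legal counsel"

def likeness_gate(identifiable: Optional[bool],
                  valid_2026_consent: Optional[bool],
                  use_within_scope: Optional[bool]) -> Decision:
    """Encode the simplified decision flow: a clear 'no' at step 1 allows
    standard review; three clear 'yes' answers allow sign-off; any other
    combination (a 'no' or an uncertain None later on) escalates."""
    if identifiable is False:
        return Decision.STANDARD
    if identifiable and valid_2026_consent and use_within_scope:
        return Decision.PROCEED
    return Decision.ESCALATE
```

Wiring a gate like this into an asset‑management system makes the escalation path the default rather than something a busy production team has to remember.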
Compliance is not a one‑time exercise. The following 30/60/90‑day roadmap translates the checklist into an implementation plan.

| Days | Focus | Key actions |
|---|---|---|
| 0–30 | Audit | Inventory every current use of faces, voices and biometric characteristics; audit training datasets and existing talent releases |
| 31–60 | Remediate | Update releases and vendor contracts; obtain supplemental consents; implement exclusion‑list and licence‑tracking processes |
| 61–90 | Operationalise | Embed legal sign‑off in ad‑production and AI pipelines; stand up takedown and evidence‑preservation workflows; train staff |
Industry observers expect Danish enforcement to intensify through the remainder of 2026 as rights‑holders and advocacy groups test the new framework. Early compliance is not merely a legal obligation; it is a competitive advantage. Businesses that can demonstrate robust personality‑rights governance will be better positioned to secure talent partnerships, negotiate favourable data‑licensing terms and avoid costly enforcement actions.
Denmark’s 2026 personality‑rights and deepfake law reforms represent a paradigm shift for any business that uses people’s faces, voices or biometric characteristics in content creation, advertising or AI development. The legislative intent is clear: individuals must have meaningful control over how their personal characteristics are reproduced by AI, and businesses that fail to secure lawful consent or licensing face civil claims, regulatory sanctions and reputational harm. In‑house counsel and compliance leaders should treat this as a board‑level priority, work through the 12‑point checklist without delay, and engage specialist intellectual property counsel with cross‑border experience to ensure their policies, contracts and operational workflows meet the standard that Denmark’s deepfake‑law reforms now require.
Explore the international intellectual property guide or consult the Global Law Experts lawyer directory to connect with qualified advisors.
This article was produced by Global Law Experts. For specialist advice on this topic, contact Kim Larsen, a member of the Global Law Experts network.