27 October 2025

Innovation Law Insights

Digital Intersections: The EDPB Guidelines on the interplay between the DSA and the GDPR

Regulation (EU) 2022/2065 on digital services (hereinafter, DSA) and the General Data Protection Regulation (EU) 2016/679 (hereinafter, GDPR) are designed to redefine the responsibility of online intermediaries and the protection of personal data in the digital economy.

With the adoption of Guidelines 3/2025 on the interplay between the Digital Services Act and the GDPR (hereinafter, the Guidelines), the European Data Protection Board (hereinafter, EDPB) has embarked on a path of systematic interpretation aimed at clarifying how the two regulations should interact in a consistent and mutually integrated manner.

The stated objective is twofold:

  • to ensure that the provisions of the DSA involving the processing of personal data are applied in accordance with the GDPR; and
  • to ensure, at the same time, that the implementation of the GDPR takes into account the new liability dynamics introduced by the DSA.

The Guidelines also emphasize the need for structured cooperation between Digital Services Coordinators, Data Protection Authorities, and the European Commission, which is essential to prevent overlapping competences, interpretative misalignments, and risks of infringing the ne bis in idem principle in the application of the two regulatory regimes.

In this perspective, the EDPB does not limit itself to providing technical clarifications, but outlines a model of integrated digital governance, in which the protection of personal data, the transparency of decision-making processes, and the responsibility of platforms converge towards a single architecture of trust and accountability, destined to become the hallmark of European regulation of online services.

  1. Two sides of the same coin

The DSA and the GDPR are two pillars of European digital law: the former aims to build a safe and transparent online space, while the latter ensures that all personal data processing activities comply with the principles of lawfulness, proportionality, and data minimization. In force since February 17, 2024, the DSA applies to all online intermediary service providers, imposing enhanced obligations on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).

The Guidelines clarify that the DSA is not lex specialis with respect to the GDPR: both, of equal rank, must be applied consistently. Article 2(4)(g) and recital 10 DSA confirm that the protection of personal data remains governed by the GDPR and the ePrivacy Directive. Any DSA obligation involving data processing must therefore comply with the GDPR; emblematic in this regard are Article 26(3) DSA on the prohibition of advertising based on special categories of data and Article 28(2) DSA on the reference to profiling in Article 4(4) GDPR.

Supervision is entrusted to the Digital Services Coordinators (DSCs) in the Member States and to the European Commission for large operators, assisted by the European Board for Digital Services (EBDS). The Guidelines promote structured cooperation between the EBDS and the EDPB, based on the principle of loyal cooperation, to avoid duplication and conflicts of interpretation. In this balance, the DSA structures the responsibility and transparency of platforms, while the GDPR continues to protect fundamental rights: two complementary instruments of the same architecture of trust in the European digital ecosystem.

  2. Voluntary detection of illegal content: balancing platform responsibility and GDPR principles

Article 7 of the DSA allows intermediary service providers - including hosting services, online platforms, and VLOPs - to voluntarily identify and remove illegal content without losing the exemptions from liability referred to in Articles 4–6, provided that they fully comply with the GDPR and its principles of lawfulness, fairness, transparency, and proportionality. Detection techniques based on machine learning or automatic recognition involve potentially invasive processing: the EDPB warns that they may amount to systematic monitoring, with risks of errors, bias, and undue restrictions on freedom of expression, and requires effective human oversight to ensure the proportionality and legitimacy of the intervention.

The Guidelines distinguish between two scenarios:

  • if the provider acts on its own initiative, the legal basis is Article 6(1)(f) GDPR, i.e. the controller’s legitimate interest, applicable only if the processing is necessary and proportionate and the rights of the data subjects do not prevail, subject to documentation of the balancing test and clear information pursuant to Articles 13 and 14;
  • if the processing arises from a legal obligation, the basis is Article 6(1)(c) GDPR, subject to clear and proportionate rules and verification of compliance with Articles 9 and 10 DSA. Recital 56 confirms that Article 7 does not constitute an autonomous basis for profiling or detecting crimes.

Automated decisions, for their part, require genuine human control: merely formal oversight is not sufficient to take a decision outside the prohibition in Article 22(1) GDPR. Articles 14 and 17 DSA reinforce transparency by requiring a statement of reasons for any limitation or removal of content, specifying whether the intervention is automated or voluntary. Since these activities meet high-risk criteria (profiling, systematic monitoring, automated decisions), providers must conduct a Data Protection Impact Assessment (hereinafter, DPIA) pursuant to Article 35 GDPR and, if necessary, consult the supervisory authority. Article 7 DSA thus becomes a test bed for digital due diligence: it promotes security and trust only if platforms fully comply with the guarantees of the GDPR.
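
To make the human-oversight requirement concrete, here is a minimal sketch of a moderation queue in which no removal takes effect without a human decision. All names (`Flag`, `human_review`, `moderate`) are hypothetical illustrations, not taken from the Guidelines or from any real platform.

```python
from dataclasses import dataclass

@dataclass
class Flag:
    """A piece of content flagged by an automated classifier."""
    content_id: str
    category: str   # e.g. "hate_speech"
    score: float    # classifier confidence in [0, 1]

def human_review(flag: Flag) -> str:
    """Stand-in for a real reviewer workflow: a person examines the
    content alongside the advisory score and decides the outcome.
    Simulated here with a threshold only so the sketch runs end to end."""
    return "remove" if flag.score >= 0.9 else "keep"

def moderate(flags: list[Flag]) -> dict[str, str]:
    """Every automated flag passes through a human decision before any
    restriction takes effect, so no content is removed by a decision
    based solely on automated processing (cf. Article 22(1) GDPR)."""
    return {flag.content_id: human_review(flag) for flag in flags}

if __name__ == "__main__":
    queue = [Flag("c1", "hate_speech", 0.97), Flag("c2", "spam", 0.55)]
    print(moderate(queue))  # {'c1': 'remove', 'c2': 'keep'}
```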

  3. Reports, complaints, and abuse: data processing in “notice and action” mechanisms

Article 16 DSA requires hosting providers to set up accessible and transparent electronic mechanisms for reporting illegal content (notice and action), including through trusted flaggers. These processes involve the processing of the personal data of the notifier, the recipient, and any third parties, in relation to whom the provider acts as data controller in full compliance with the GDPR.

  • The data collected – ideally only name and email address – must be limited to what is necessary, in line with Article 5(1)(c) of the GDPR, ensuring the possibility of anonymous reports unless identity is essential, as in the case of copyright infringements (a minimal sketch follows this list).
  • The identity of the notifier may only be disclosed if strictly necessary and after providing information in accordance with Article 13 of the GDPR.
  • Article 16(6) DSA allows the use of automated systems for managing reports, but decisions with legal effects must include effective human intervention and comply with the safeguards in Article 22 GDPR.
  • Article 17 DSA reinforces transparency by requiring a statement of reasons for any removal or restriction of content, specifying whether the intervention was automated.
  • Articles 20 and 23 of the DSA complete the framework: the former recognizes the right of complainants and recipients to lodge a complaint, which cannot be decided solely by algorithm; the latter allows for temporary suspension for abuse, provided that it is based on accurate data, proportionate, and complies with the principles of minimization and fairness.
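
By way of illustration, here is a minimal sketch of a notice intake applying Article 5(1)(c) GDPR minimization: identity fields stay optional except where the report category requires them. The field names and the `IDENTITY_REQUIRED` set are assumptions made for the example, not requirements drawn from the DSA text.

```python
from dataclasses import dataclass
from typing import Optional

# Assumption for illustration: only copyright notices require identity.
IDENTITY_REQUIRED = {"copyright"}

@dataclass
class Notice:
    content_url: str
    reason: str
    category: str
    name: Optional[str] = None    # optional: anonymous reports allowed
    email: Optional[str] = None   # collected only when necessary

def validate(notice: Notice) -> Notice:
    """Reject a notice only when its category demands identification;
    all other categories accept anonymous reports."""
    if notice.category in IDENTITY_REQUIRED and not (notice.name and notice.email):
        raise ValueError("this report category requires the notifier's identity")
    return notice

if __name__ == "__main__":
    validate(Notice("https://example.com/post/1", "hate speech", "illegal_content"))
    validate(Notice("https://example.com/post/2", "pirated film", "copyright",
                    name="Jane Doe", email="jane@example.com"))
```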

  4. Dark patterns and the right to digital self-determination

With Article 25, the DSA introduces a key principle: digital interfaces must be designed in such a way that they do not compromise users’ ability to make autonomous and informed decisions. Ethical design thus becomes a legal parameter, aimed at preventing behavioral manipulation that affects freedom of choice and the protection of personal data.

The Guidelines define deceptive design patterns (also known as dark patterns) as interface patterns that induce unwanted choices, often to facilitate data collection or prolong online interaction. When such practices reduce user awareness or alter consent, they violate the principles of lawfulness, fairness, and transparency of the GDPR referred to in Article 5(1)(a), making the processing unlawful ab origine, regardless of formal consent.

The prohibition in Article 25(2) DSA is coordinated with the GDPR and the Unfair Commercial Practices Directive, extending protection even to cases where manipulation does not involve the processing of personal data. When, on the other hand, design influences the collection or use of data, jurisdiction lies with the privacy authorities, which must assess its compliance with the principles of lawfulness, minimization, and privacy by design. Particular attention is paid to addictive design (infinite scrolling, autoplay, gamification), which exploits psychological levers to prolong the use of platforms and, being based on the analysis of behavioral data, falls fully within the scope of the GDPR.

  5. Advertising, profiling, and recommendation algorithms

The DSA makes transparency a structural principle of the new digital economy: Articles 26, 27, and 38 outline a system aimed at making the algorithmic logic behind advertising and content recommendations understandable, controllable, and compliant with the GDPR.

  • Article 26 DSA requires platform providers to indicate, for each advertisement, the identity of the advertiser, the targeting parameters, and how to modify them, introducing operational transparency that complements Articles 13 and 14 of the GDPR. The requirements of lawfulness, consent, and the right to object under the GDPR and the ePrivacy Directive remain unchanged. The DSA also provides for an absolute ban on advertising based on special categories of data under Article 26(3) DSA, even with explicit consent, removing sensitive data from market logic and protecting digital dignity. Security and confidentiality measures under Article 26(4) DSA must ensure the minimization of information flows and the effective application of the principle of privacy by design.
  • Article 27 DSA extends these obligations to recommendation systems, requiring the criteria determining the priority of content to be specified in the terms of service and allowing users to modify them. When such systems are based on personal data, they constitute a form of profiling within the meaning of Article 4(4) GDPR, subject to the safeguards of Articles 5, 12–14, and 22, which ensure transparency on the logic and effects of the process.
  • Finally, Article 38 DSA requires VLOPs and VLOSEs to offer at least one option that is not based on profiling, presented without nudging and accompanied by a prohibition on collecting data for predictive purposes, in implementation of Article 25 GDPR. Taken together, these articles outline a model of informational accountability based on algorithmic transparency as a tool for digital sovereignty, balancing freedom of choice, data protection, and trust in the online environment (a minimal sketch of a non-profiled option follows this list).
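
As an illustration only, here is a minimal sketch of a recommender exposing a non-profiled default alongside an opt-in personalised ranking. The item fields and the `affinity` callable are assumptions for the example, not a real platform API.

```python
from datetime import date

def recommend(items, affinity=None, profiling_opted_in=False):
    """Article 38 DSA sketch: the default ranking uses no personal data;
    the personalised ranking (profiling under Article 4(4) GDPR) runs
    only on an explicit, unnudged opt-in."""
    if profiling_opted_in and affinity is not None:
        # Personalised ranking, available only after opt-in.
        return sorted(items, key=affinity, reverse=True)
    # Non-profiled option: reverse-chronological ordering.
    return sorted(items, key=lambda item: item["published"], reverse=True)

if __name__ == "__main__":
    feed = [
        {"id": "a", "published": date(2025, 10, 1)},
        {"id": "b", "published": date(2025, 10, 20)},
    ]
    print([i["id"] for i in recommend(feed)])  # ['b', 'a'] - no profiling
```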

  6. Protecting minors in the digital ecosystem

The protection of minors is one of the cornerstones of the DSA, which requires providers of platforms accessible to minors to ensure high standards of privacy, security, and protection through technical and organizational measures proportionate to the risks. Article 28 introduces a principle of proactive responsibility: platforms must prevent and mitigate the risks arising from their services - such as exposure to harmful content, undue data collection, or addictive design practices - through tools such as technical standards, codes of conduct, or parental controls, in accordance with the principles of necessity and minimization.

Article 28(3) excludes the obligation to process additional data to verify age, avoiding systematic identification practices. Paragraph 2 prohibits advertising based on profiling when the platform is “reasonably certain” that the user is a minor, in line with Articles 9(1) and 26(3) of the DSA, to prevent the exploitation of cognitive vulnerability and protect digital dignity without imposing new data collection.

When protective measures involve data processing, the legal basis may derive from Article 6(1)(c) of the GDPR, provided that the processing is strictly necessary and proportionate; the use of biometric data or special categories of data is excluded, except for the exceptions in Article 9(2). The EDPB recommends privacy-preserving age assurance systems - such as self-certification, parental confirmation, or zero-knowledge proofs - that allow for non-invasive verification without data retention, as sketched below. Compliance with Article 28 is thus achieved through data protection by design and by default, preventing the protection of minors from translating into new profiling risks. For VLOPs, these obligations are integrated with those of systemic risk assessment and mitigation provided for in Articles 34 and 35 of the DSA.
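
The following is a minimal sketch assuming a wallet-style verifier that discloses only a boolean “over the threshold” attestation: the platform never receives or stores the birth date. The function name and parameters are illustrative, not part of any standard.

```python
from datetime import date
from typing import Optional

def over_threshold(birth_date: date, threshold_years: int,
                   today: Optional[date] = None) -> bool:
    """Runs on the verifier's side (e.g. a digital identity wallet):
    the relying platform receives only the boolean result, never the
    underlying birth date."""
    today = today or date.today()
    try:
        cutoff = today.replace(year=today.year - threshold_years)
    except ValueError:  # 29 February in a non-leap year
        cutoff = today.replace(year=today.year - threshold_years, day=28)
    return birth_date <= cutoff

if __name__ == "__main__":
    # The platform sees only True/False, in line with data minimization.
    print(over_threshold(date(2012, 5, 1), 15, today=date(2025, 10, 27)))  # False
```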

  7. Governing risk and building trust

Articles 34 and 35 of the DSA introduce a key principle of the new digital governance: VLOPs and VLOSEs must identify, assess, and mitigate the systemic risks generated by their services. This is a paradigm shift: from reactive to proactive responsibility, based on the prevention of the collective impacts that algorithmic architectures can have on society, democracy, and fundamental rights.

The DSA identifies various categories of risk - dissemination of illegal content, negative effects on fundamental rights, threats to health and safety, gender-based violence, damage to mental and physical well-being - making “systemic risk” a structural responsibility. Since such risks often arise from the processing of personal data, management must comply with the principles of lawfulness, transparency, and proportionality of the GDPR. The link between the two regulations is clear: the DPIA in Article 35 of the GDPR complements the assessment of systemic risks in Article 34 of the DSA, becoming indispensable when the social impact arises from data processing.

Article 35 DSA requires reasonable, proportionate, and effective measures consistent with the principles of privacy by design in line with Article 25 GDPR and security of processing under Article 32, including periodic testing, algorithm review, and protection of minors under Article 35(1)(j) through age assurance and parental control. Articles 45–47 complete the picture: codes of conduct, in parallel with Article 40 GDPR, promote cooperation between authorities, businesses, and civil society, translating legal obligations into verifiable standards based on periodic reporting; Article 46 strengthens transparency in the advertising sector, while Article 47 introduces a principle of universal and inclusive accessibility of digital interfaces.

Author: Giulio Napolitano

 

Italian Senate Commission Publishes Updated Text of Draft Bill No. 1136 on the Protection of Minors in the Digital Environment

The Italian Senate Commission has recently published the updated text of Draft Bill No. 1136 (the Draft Bill), entitled “Provisions for the Protection of Minors in the Digital Environment”.

The Draft Bill focuses on key aspects of the protection of minors online, in particular the age for providing valid consent and the regulation of minors’ participation in online promotional activities.

Although the Draft Bill is still under discussion in the Italian Senate – and therefore remains subject to amendments or may not be adopted in its entirety – it still provides a clear indication of the legislative trend toward stronger safeguards for minors in the digital environment in Italy.

Key Innovations Introduced by the Draft Bill

  1. Minimum Age for Registration on Social Networks and Video-Sharing Platforms

Article 2 of the Draft Bill introduces a new provision setting the minimum age for creating an account on social networks and video-sharing platforms at 15 years.

This represents a tightening of the current framework, which allows minors aged fourteen and above to register. Contracts concluded in breach of this age limit will be deemed null and void, and any related processing of personal data would consequently be unlawful, as the legal basis of consent would not apply.

The Draft Bill also addresses the issue of user identification, which is to be carried out through a national digital “mini-wallet”, expected to be implemented and fully operational by 30 June 2026. This system is aligned with the EU Digital Identity Wallet, aiming to ensure secure and reliable identification mechanisms while maintaining strong privacy safeguards.

  2. Consent of Minors in the Context of Information Society Services

In addition to the specific rules for social networks, the Draft Bill proposes to amend Article 2-quinquies of the Italian Privacy Code, raising from 14 to 16 years the general minimum age for valid consent to the processing of personal data in the context of information society services.

Accordingly, the Draft Bill introduces a dual-track model:

  • 15 years for registration on social networks and video-sharing platforms;
  • 16 years for other online services.

Below these thresholds, consent must be provided by the holder of parental responsibility.
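
To make the dual-track model concrete, here is a minimal sketch under the assumption that services are labelled by type; the labels and function names are illustrative only and do not appear in the Draft Bill.

```python
# Assumed service-type labels for illustration.
SOCIAL_OR_VIDEO = {"social_network", "video_sharing"}

def consent_threshold(service_type: str) -> int:
    """Dual-track minimum ages proposed by Draft Bill No. 1136:
    15 for social networks and video-sharing platforms,
    16 for other information society services."""
    return 15 if service_type in SOCIAL_OR_VIDEO else 16

def needs_parental_consent(age: int, service_type: str) -> bool:
    """Below the threshold, consent must come from the holder of
    parental responsibility."""
    return age < consent_threshold(service_type)

if __name__ == "__main__":
    print(needs_parental_consent(15, "social_network"))  # False
    print(needs_parental_consent(15, "news_portal"))     # True
```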

  3. Regulation of Promotional Activities Conducted Online by Minors

The Draft Bill also addresses the growing phenomenon of promotional and influencer activities carried out by minors.

While the text does not establish a detailed regulatory framework, it entrusts the Italian Communications Authority (AGCOM) with issuing, within 180 days of the law’s entry into force, specific guidelines defining:

  • the scope of promotional activities carried out by minors online;
  • transparency and accuracy requirements;
  • safeguards for the protection of minors and their fundamental rights; and
  • rules on commercial communications and product placement, ensuring that any promotional intent is clearly disclosed.

These guidelines are intended to complement AGCOM’s existing framework on influencer marketing by introducing additional and more targeted rules for influencers aged 15 to 18, with the aim of reducing minors’ exposure to harmful or high-risk content. Given the rapid growth of the phenomenon and its increasing proximity to sensitive sectors, it is likely that specific guidance on underage influencers will be issued in any event.

  4. Measures to Strengthen the Digital Safety of Minors

The Draft Bill also provides that the Fund for the Digital Safety of Minors, established within the Ministry of Enterprises and Made in Italy, will finance information and awareness campaigns targeting both minors and parents. These initiatives are intended to promote responsible internet use, the dissemination of parental control tools, and the prevention of online risks.

Conclusions

Although still under parliamentary scrutiny, Draft Bill No. 1136 represents a significant step forward in the evolving landscape of digital child protection in Italy.

The proposed measures reflect a broader policy shift towards enhanced age verification, informed consent, and regulation of minors’ commercial presence online. As such, regardless of the adoption of this specific piece of legislation, it will be important to monitor the parliamentary debate and forthcoming regulatory developments closely, as further legislative initiatives may follow in this rapidly evolving area and further amend the Italian landscape.

Author: Federico Toscani

 

Gaming and Gambling

Gibson v Betfair – Landmark gambling claim reaches the Court of Appeal

On 7-8 October 2025, the English Court of Appeal (CoA) heard the appeal of Lee Gibson, a gambler, against a decision of the High Court which dismissed his claims to recover his gambling losses (GBP 1.5 million) from Betfair. Mr Gibson advanced claims for breach of statutory duty, breach of contract (including implied terms relating to the Licensing Conditions and Codes of Practice (LCCP)), and negligence.

If the appeal is successful, it has the potential to open the floodgates to mass claims in England against all operators, as claimant firms encourage players to seek to recover their losses. Whilst we expect the appeal to be dismissed, the judgment will provide critical commentary on significant legal issues for the gambling industry.

Facts

Mr Gibson’s claim centred on whether Betfair breached its licence obligations, and whether such a breach could give rise to a personal right of action in contract or tort.

On the particular facts of the case, Mr Gibson argued that Betfair’s responsible gambling policies were inadequate and that Betfair should have done more to assess his capacity to sustain losses. The CoA was receptive to the policy intention of protecting vulnerable gamblers but noted that Mr Gibson had passed all AML checks and appeared able to afford his losses. Experts at trial had described Betfair’s policies as “industry leading”.

However, the points of law that the court will now decide will be of general application to operators. The CoA will rule on:

  • whether Betfair knew, or ought to have known, that Mr Gibson was a problem gambler, and whether such knowledge would trigger any contractual or tortious duties.
  • whether compliance with the LCCP can be implied as a contractual term.
  • whether it would be fair or reasonable to impose a common law duty of care on an operator, both on the facts of this case and given the difficulties in identifying problem gamblers.
  • the “illegality” argument (i.e. that all bets should be void if Betfair breached its licence), albeit the Court was sceptical, noting that such an approach would have far-reaching and arguably unintended consequences.

We anticipate the judgment will be published in the next few months.

Implications for operators

The outcome of the appeal is likely to reinforce the current legal position, namely that gambling operators are not generally liable for a customer’s losses unless there is clear evidence that the operator knew of a gambling problem and failed to act appropriately.

This means:

  • There is still a high threshold for customer claims based on an operator's alleged breach of its responsible gambling duties.
  • To meet any such claims, operators should i) maintain robust responsible gambling policies, ii) keep a clear record of interventions and AML checks, and iii) facilitate ongoing staff training and policy review.

However, Mr Gibson’s claim was fact-specific. It may be that Mr Gibson’s appeal is unsuccessful but that the judgment accepts part of the argument in principle, particularly as responsible gambling policies, the licensing framework, and responsible gambling expectations generally have all moved on since Mr Gibson’s time.

Should you wish to discuss the case in more detail, please do not hesitate to contact the authors of this article or your regular DLA Piper contact.

Authors: Jeremy Sher, Siona Spillett, Giulio Coraggio, Benjamin Fellows

 

Technology, Media and Telecommunications

Launch of a Public Consultation by BEREC on the Draft Work Programme 2026

BEREC (Body of European Regulators for Electronic Communications) has recently launched a public consultation in view of the adoption of its Work Programme for 2026, which is scheduled for December 2025.

The purpose of this consultation is to gather feedback from interested stakeholders on the Draft BEREC Work Programme 2026, i.e. the draft version of BEREC’s annual work plan outlining the priorities identified by the organisation for 2026.

Among the general objectives pursued by the Work Programme is the alignment with the priorities set by the European Commission in the field of electronic communications.

More specifically, the Draft BEREC Work Programme 2026 sets out a series of projects planned by BEREC in line with the five high-level priorities identified in the BEREC Strategy 2026–2030. These principles – defined as “strategic priorities” – are as follows:

  1. Promoting full connectivity and the Digital Single Market;
  2. Supporting competition-driven and open digital ecosystems;
  3. Empowering end-users;
  4. Contributing to environmentally sustainable, secure and resilient digital infrastructures;
  5. Strengthening BEREC’s capabilities and continuous improvement.

The initiatives envisaged under the first strategic priority – “Promoting full connectivity and the Digital Single Market” – are generally aimed at fostering connectivity across all key infrastructures (land, space and submarine), including fixed, wireless, emerging virtual and cloud-related networks, as well as digital infrastructures and emerging technologies. Among the planned initiatives are a Call for Input on Application Programming Interfaces (APIs) for mobile network functionalities; a Fact-Finding Report on competition indicators and regulatory experiences across different jurisdictions; and the “BEREC Report on Virtual Worlds and Web 4.0”, which will analyse the potential economic and regulatory impact of virtual worlds, the metaverse and immersive technologies.

Under the second strategic priority – “Supporting competition-driven and open digital ecosystems” – BEREC aims to promote open and competitive digital environments, based on the premise expressed in the draft that “open and competitive markets are fundamental drivers of innovation, investment and end-user welfare”. In this context, BEREC intends to contribute to the implementation of the so-called Data Act (Regulation (EU) 2023/2854) and to continue cooperating with the European Commission in the implementation and assessment of developments related to the Digital Markets Act. BEREC also plans to publish a report on Artificial Intelligence (AI), analysing how the use of AI may affect competitive dynamics, the openness of the Internet, and users’ rights.

With regard to the third strategic priority – “Empowering end-users” – BEREC aims, among other things, to prepare a report analysing the difficulties that users continue to face in relation to switching and contract termination processes. BEREC also states its intention to continue activities related to the implementation of the Open Internet Regulation (Regulation (EU) 2015/2120) and the BEREC Open Internet Guidelines.

In the fourth section of the Draft, concerning the priority “Contributing to environmentally sustainable, secure and resilient digital infrastructures”, BEREC lists projects designed to support the development of sustainable, secure and resilient digital infrastructures. Among these initiatives is the collection of input from electronic communications operators with a view to developing a future EU Code of Conduct for the sustainability of telecommunications networks.

The fifth priority – “Strengthening BEREC’s capabilities and continuous improvement” – relates to the internal evolution of the organisation, which aims to enhance efficiency, ensure high-quality results, increase transparency and improve environmental sustainability, also through closer cooperation with other institutions.

Finally, the document identifies several potential areas of work for 2027, including the monitoring of interconnection through Internet Protocol (IP), the development of connected and automated mobility, and the assessment of the environmental impact of network sharing.

Interested parties are invited to submit their contributions to the public consultation by 3 November 2025.

Authors: Massimo D'Andrea, Flaminia Perna, Matilde Losa

 


Innovation Law Insights is compiled by DLA Piper lawyers, coordinated by Edoardo Bardelli, Carolina Battistella, Noemi Canova, Gabriele Cattaneo, Giovanni Chieco, Maria Rita Cormaci, Camila Crisci, Cristina Criscuoli, Tamara D’Angeli, Chiara D’Onofrio, Federico Maria Di Vizio, Enila Elezi, Laura Gastaldi, Vincenzo Giuffré, Nicola Landolfi, Giacomo Lusardi, Valentina Mazza, Lara Mastrangelo, Maria Chiara Meneghetti, Giulio Napolitano, Andrea Pantaleo, Deborah Paracchini, Maria Vittoria Pessina, Marianna Riedo, Tommaso Ricci, Rebecca Rossi, Dorina Simaku, Roxana Smeria, Massimiliano Tiberio, Federico Toscani, Giulia Zappaterra.

Articles concerning Telecommunications are curated by Massimo D’Andrea, Flaminia Perna, Matilde Losa and Arianna Porretti.

For further information on the topics covered, please contact the partners Giulio Coraggio, Marco de Morpurgo, Gualtiero Dragotti, Alessandro Ferrari, Roberto Valenti, Elena Varese, Alessandro Boso Caretta, Ginevra Righini.

Learn about Prisca AI Compliance, the legal tech tool developed by DLA Piper to assess the maturity of AI systems against key regulations and technical standards here.

You can learn more about “Transfer”, the legal tech tool developed by DLA Piper to support companies in evaluating data transfers out of the EEA (TIA) here, and check out a DLA Piper publication outlining Gambling regulation here, as well as Diritto Intelligente, a monthly magazine dedicated to AI, here.

If you no longer wish to receive Innovation Law Insights or would like to subscribe, please email Silvia Molignani.
