A Growing Stakeholder Divide on the DFA
Consumer groups push for an ambitious Digital Fairness Act, industry calls for a pause and reset, and the Commission signals tougher rules to come
Brussels is starting to see the shape of the Digital Fairness Act fight, with consumer and industry coalitions pulling in opposite directions while the Commission signals that enforcement alone will not be enough.
Joint Call for an Ambitious Digital Fairness Act
One notable stakeholder move last week was the joint call published on 13 March, urging the Commission to come forward with an ambitious Digital Fairness Act. The significance is less in any single ask than in the breadth of the coalition behind it. The letter brings together a large cross-sector group of consumer organisations, civil society actors, digital rights groups and academics, all making the case that the DFA should be a meaningful update of horizontal EU consumer law for the digital environment rather than a narrow enforcement adjustment.
They argue that the DFA should close legal gaps left by the current framework, while also presenting stronger consumer protection as compatible with legal certainty, fair competition and competitiveness. The text also pushes back against the idea that simplification should mean lowering ambition.
Industry Joint Call to Pause and Reset the DFA
A second joint call has just landed, this time from the business side. In a letter dated 16 March, 34 European and international business associations urged Henna Virkkunen and Michael McGrath to “pause and reset” the DFA’s trajectory. The letter, initiated by the European Tech Alliance, argues that the DFA risks adding new layers of regulation in areas already covered by EU law, with possible overlap, legal uncertainty and higher compliance costs.
Their preferred direction is clear: focus first on stronger and more consistent enforcement of the existing rulebook, and on greater clarity across the wider digital acquis, especially the interaction between consumer law, data protection, AI and digital services rules.
If the public consultation had not already given us a clear snapshot, the two joint calls taken together now give the Commission an even sharper picture of the split in stakeholder mood: one coalition asking for a more ambitious horizontal update, the other urging a narrower approach built around enforcement, coherence and simplification.
Panel Discussion with Commission and Major Stakeholders
At CERRE’s Digital Platforms Summit, the panel on “Towards a Consumer Law Coherent and Fit for the Digital Age” offered a useful preview of the debates now crystallising around the Digital Fairness Act, and a clearer indication of the Commission’s mindset. The panel followed the presentation of CERRE’s report on the DFA by Christoph Busch, Amelia Fletcher and Michèle Ledger.
Commission: Hard Bans and Shifting to Regulations
Martins Prieditis (deputy head of unit at DG JUST) said the DFA is being built to fill identified gaps in consumer law through a more design-focused approach, with rules that make rights easier to exercise in practice rather than adding more disclosure obligations. More importantly, he suggested the Commission does not think better enforcement alone can fix the current framework, especially when many traders may be using questionable practices that could amount to dark patterns. In that context, he said the Commission is considering more concrete rules and may even introduce specific prohibitions on certain dark patterns or interface design features. That is one of the clearest signs yet that the DFA may contain hard bans, not just broad fairness principles.
He also said that the CPC review is intended to give the Commission direct enforcement powers in major cross-border cases, which would likely require shifting some existing consumer law rules from directives into regulations.
Prieditis also flagged a key internal milestone: DG JUST is due before the Commission’s Regulatory Scrutiny Board on 1 July, meaning the DFA impact assessment will need to be largely locked down by then, with any negative opinion risking delay.
Cancelling Subscriptions and Burden of Proof
One exchange was about how prescriptive the DFA should be. BEUC’s Urs Busck backed an EU-wide cancellation button for subscriptions and pointed to Fitness Check findings that 96% of consumers find it difficult to cancel subscriptions. But his broader message was that the debate cannot stop there. He said dark patterns, video games, influencer marketing and especially burden of proof are too important to leave out. On burden of proof, his point was direct: if consumers still face excessive difficulties proving harm in technically complex cases, the DFA will make “no difference”.
Personalisation: Opt-in vs Opt-out
An important point of discussion was whether the DFA should focus only on financial harm or also tackle harms linked to time and attention. Dries Cuijpers from the Dutch ACM argued strongly that EU consumer law needs to start addressing non-economic harms. His point was that business models increasingly monetise not just money, but also time and attention, and that consumer law still does not properly deal with that. He also pushed for opt-in rather than opt-out where consumers are given a choice over personalisation, and argued that vulnerabilities should never be abused, whether through advertising, pricing or UX design.
Avoid duplicating the DSA
Industry speakers largely converged on coherence and level playing field. Snapchat and TikTok both argued that the DFA should avoid duplicating the DSA and should only step in where there are genuine gaps. TikTok’s Eugene McQuaid also backed the idea that if a company is already complying with sector-specific rules, that compliance should count under the DFA where relevant.
Gaming: loot boxes
Gaming was one of the more contested themes. BEUC highlighted that the CERRE report failed to properly cover consumer issues in video games, while Tencent pushed back against the idea of broad restrictions on models like loot boxes, arguing that they are non-mandatory and tied to the economics of free-to-play games. But even there, the wider point was really about the DFA’s method: whether it should target specific sectors and features, or stay at the level of broader principles.
Fireside Chat with Commission
CERRE’s Digital Platforms Summit also featured a fireside chat with Isabelle Pérignon (director for consumers at DG JUST), who gave a fairly direct account of where the Commission is heading on the Digital Fairness Act. The fireside chat was moderated by Anupriya Datta, Tech Reporter at Euractiv, who asked particularly clear and pertinent questions.
Leaning towards a Regulation?
Asked whether the DFA could turn consumer protection rules into regulations, and whether the DFA itself could be a regulation rather than a directive, Pérignon said the objective is to be “efficient”.
She said the direction the Commission wants to go in is to have stronger rules and, “if possible indeed a regulation”, which could also serve as the basis for giving the Commission additional enforcement powers, especially in the context of Consumer Protection Cooperation (CPC) Regulation reform.
Commission wants much stronger enforcement
On enforcement, Pérignon said that “all options are on the table indeed”. She described the current system as one the Commission wants to move beyond. The direction she set out was a system that is “very much stronger”, based on deadlines, equipped with “stronger teeth”, backed by fines set at a much higher level, able to use interim measures, and able to take action such as “delisting some platforms” when they do not respect the rules on unfair commercial practices.
Social media bans for children in DFA?
When pressed on whether the DFA could be the right vehicle for social media bans for children, she said that is something her team is analysing. Looking at the issue through the angle of “the teenager, the kid, as a consumer” would place it within the scope of the DFA, but “where and how” to address it remains open, and she repeated that all options are on the table.
Priorities
Asked what she would put in the DFA if it came out tomorrow, Pérignon gave three priorities:
Simplification of rules. She called this a “unique opportunity” to review the existing framework, see what can be abrogated, improved or clarified, and use the file to bring more harmonisation.
A strong protection of minors angle. She sees the need to tackle addictions that start at a young age.
Modernise the existing consumer framework to address new phenomena that were not foreseen a few years ago. The example she gave was “agentic commerce and AI agents”, as Christoph Busch has suggested.
Special Panel on Child Safety Online
On 5 March, Ursula von der Leyen hosted the first meeting of the special panel on child safety online, with a mandate to deliver expert recommendations and to look at whether the EU should move towards harmonised age restrictions for access to social media. The idea is to build on the existing toolbox (the DSA and its minors guidance, BIK+, cyberbullying work, age verification pilots) while taking stock of where policymakers think gaps remain. The panel is co-chaired by Maria Melchior and Jörg M. Fegert.
The first meeting already put issues like age-appropriate safety by design, addictive features, recommender dynamics, and digital literacy on the table. The panel is expected to report by summer 2026, so its framing is likely to become a reference point on addictive design and safeguards for minors, including what should be handled via enforcement versus new horizontal rules. If that timeline holds, the report will land just before the Commission officially tables the Digital Fairness Act, and it can therefore be expected to shape the Commission’s thinking on the file.
TikTok’s Addictive Design in Breach of DSA
The Commission added a useful data point to the "addictive design" debate on 6 February, sending TikTok preliminary findings that its core engagement features may breach the DSA’s systemic risk obligations. The Commission points to infinite scroll, autoplay, push notifications and TikTok’s highly personalised recommender system, and argues that TikTok’s risk assessment and mitigation measures did not adequately address impacts on users’ wellbeing, including minors and vulnerable adults. TikTok can now access the case file and reply, the European Board for Digital Services will be consulted, and any final non-compliance decision could carry fines of up to 6% of global turnover.
For Digital Fairness Act watchers, the bigger takeaway is the direction of travel. Even before the DFA proposal lands, the DSA is being used to test what "reasonable, proportionate and effective" mitigation looks like in practice, and that enforcement learning will shape expectations across stakeholders on whatever the DFA ends up adding on addictive design.
Shein’s Addictive Design Gets Investigated
On 17 February, the Commission opened formal proceedings against Shein. The case is framed around three buckets: how the platform limits the sale of illegal products in the EU (including content that could constitute CSAM), the risks linked to potentially addictive engagement mechanics (for example points or rewards), and the transparency of the recommender systems that steer users towards products.
For the Digital Fairness Act debate, the significance is about the precedent the Commission is setting. The Commission is testing its own interpretations of “addictive design” through the DSA’s systemic risk and mitigation duties, and doing so on an e-commerce marketplace, not only on social media. That will inevitably feed into the DFA baseline discussion of what can be enforced today versus what needs new horizontal rules. As always, this is still at the investigation stage, with due process and potential remedies or fines only if a final non-compliance decision is reached.
Corporate Europe Observatory Scrutinises DFA Lobbyists
Corporate Europe Observatory (CEO) dropped a report on 5 February, “Addicted to the algorithm”, framing the Digital Fairness Act’s coming fight around one question: will the EU actually curb addictive social media design, or will the file get neutralised in the name of competitiveness and simplification? It maps the political and lobbying terrain now forming around DG JUST’s work on the DFA.
A few signals in it are hard to ignore for anyone tracking Brussels dynamics: CEO counted at least 96 meetings between Commission top officials and DFA lobbyists since December 2024, with roughly 83% involving industry, and highlights the sector’s record lobbying footprint. It also pulls together the now familiar lines of attack we regularly hear (enforce the DSA, rely on voluntary codes, avoid “new burdens”) and points to the growing role of ecosystem players and proxies, including “consumer” voices like “Consumer Choice Center Europe” and media partnerships like EU Tech Loop, in shaping the narrative before the proposal lands.
Contact
As always, feel free to reach out on LinkedIn or at james@edpi.eu if you have any information you would like to share or any questions. Thanks for reading!