A clash between Poland’s right-wing president and its centrist ruling coalition
over the European Union’s flagship social media law is putting the country
further at risk of multimillion-euro fines from Brussels.
President Karol Nawrocki is holding up a bill that would implement the EU’s
Digital Services Act, a tech law that allows regulators to police how social
media firms moderate content. Nawrocki, an ally of U.S. President Donald Trump,
said in a statement that the law would “give control of content on the internet
to officials subordinate to the government, not to independent courts.”
The government coalition led by Prime Minister Donald Tusk, Nawrocki’s rival,
warned this further exposed the country to the risk of EU fines as high as €9.5
million.
Deputy Digital Minister Dariusz Standerski said in a TV interview that, “since
the president decided to veto this law, I’m assuming he is also willing to have
these costs [of a potential fine] charged to the budget of the President’s
Office.”
Nawrocki’s refusal to sign the bill brings back bad memories of Warsaw’s
years-long clash with Brussels over the rule of law, a conflict that began when
Nawrocki’s Law and Justice party rose to power in 2015 and started reforming the
country’s courts and regulators. The EU imposed €320 million in penalties on Poland between 2021 and 2023.
Warsaw has been in a fight with the Commission over its slow rollout of the tech rulebook since 2024, when the EU executive put Poland on notice for delaying the law’s implementation and for not designating a responsible authority. In May last year, Brussels took Warsaw to court over the issue.
If the EU imposes new fines over the rollout of digital rules, it would
“reignite debates reminiscent of the rule-of-law mechanism and frozen funds
disputes,” said Jakub Szymik, founder of Warsaw-based non-profit watchdog group
CEE Digital Democracy Watch.
Failure to implement the tech law could even lead to fines and penalties that accrue over time, as happened when Warsaw refused to reform its courts during the earlier rule-of-law crisis.
The European Commission said in a statement that it “will not comment on
national legislative procedures.” It added that “implementing the [Digital
Services Act] into national law is essential to allow users in Poland to benefit
from the same DSA rights.”
“This is why we have an ongoing infringement procedure against Poland” for its
“failure to designate and empower” a responsible authority, the statement said.
Under the tech platforms law, countries were supposed to designate a national
authority to oversee the rules by February 2024. Poland is the only EU country that has not even formally agreed on which regulator should take on the role.
The European Commission is the chief regulator for a group of very large online
platforms, including Elon Musk’s X, Meta’s Facebook and Instagram, Google’s YouTube, and Chinese-owned TikTok and Shein.
But national governments have the power to enforce the law on smaller platforms
and certify third parties for dispute resolution, among other things. National implementing laws also allow users to exercise their rights to appeal to platforms and challenge their decisions.
When blocking the bill last Friday, Nawrocki said a new version could be ready
within two months.
But that was “very unlikely … given that work on the current version has been
ongoing for nearly two years and no concrete alternative has been presented” by
the president, said Szymik, the NGO official.
The Digital Services Act has become a flashpoint in the political fight between
Brussels and Washington over how to police online platforms. The EU imposed its
first-ever fine under the law on X in December, prompting the U.S.
administration to sanction former EU Commissioner Thierry Breton and four other
Europeans.
Nawrocki last week likened the law to “the construction of the Ministry of Truth
from George Orwell’s novel 1984,” a criticism that echoed claims by Trump and
his top MAGA officials that the law censored conservatives and right-wingers.
Bartosz Brzeziński contributed reporting.
WARSAW — Poland’s nationalist President Karol Nawrocki on Friday sided with his
ally U.S. President Donald Trump to veto legislation on enforcing the EU’s
social media law, which is hated by the American administration.
Trump and his top MAGA officials condemn the EU’s Digital Services Act — which seeks to force big platforms like Elon Musk’s X, Facebook and Instagram to moderate content — as a form of “Orwellian” censorship against conservatives and right-wingers.
The presidential veto stops national regulators in Warsaw from implementing the
DSA and sets Nawrocki up for a clash with centrist pro-EU Prime Minister
Donald Tusk. Tusk’s parliamentary majority passed the legislation introducing
the DSA in Poland.
Nawrocki argued that while the bill’s stated aim of protecting citizens —
particularly minors — was legitimate, the Polish bill would grant excessive
power to government officials over online content, resulting in “administrative
censorship.”
“I want this to be stated clearly: a situation in which what is allowed on the
internet is decided by an official subordinate to the government resembles the
construction of the Ministry of Truth from George Orwell’s novel 1984,” Nawrocki
said in a statement — echoing the U.S.’s stance on the law.
Nawrocki also warned that allowing authorities to decide what constitutes truth
or disinformation would erode freedom of expression “step by step.” He called
for a revised draft that would protect children while ensuring that disputes
over online speech are settled by independent courts.
Deputy Prime Minister and Digital Affairs Minister Krzysztof Gawkowski dismissed
Nawrocki’s position, accusing the president of undermining online safety and
siding with digital platforms.
“The president has vetoed online safety,” Gawkowski told a press briefing Friday
afternoon, arguing the law would have protected children from predators,
families from disinformation and users from opaque algorithms.
The minister also rejected Nawrocki’s Orwellian comparisons, saying the bill
explicitly relied on ordinary courts rather than officials to rule on online
content.
Gawkowski said Poland is now among the few EU countries without national
legislation enabling effective enforcement of the DSA and pledged that the
government would continue to pursue new rules.
The clash comes as enforcement of the social media law has become a flashpoint
in EU-U.S. relations.
Brussels has already fined Elon Musk’s X €120 million for breaching the law,
prompting a furious response from Washington, including travel bans imposed by
the Trump administration on former EU Commissioner Thierry Breton, an architect
of the tech law, and four disinformation experts.
The DSA allows fines of up to 6 percent of a company’s global revenue and, as a
measure of last resort, temporary bans on platforms.
Earlier this week, the European Commission expanded its investigation into X’s
AI service Grok after it started posting a wave of non-consensual sexualized
pictures of people in response to X users’ requests.
The European Commission’s digital spokesperson Thomas Regnier said the EU
executive would not comment on national legislative procedures. “Implementing
the DSA into national law is essential to allow users in Poland to benefit from
the same DSA rights, such as challenging platforms if their content is deleted
or their account suspended,” he said.
“This is why we have an ongoing infringement procedure against Poland. We have
referred Poland to the Court of Justice of the EU for failure to designate and
empower the Digital Services Coordinator” in May 2025, Regnier added.
Gawkowski said that the government would make a quick decision on what to do
next with the vetoed bill but declined to offer specifics on what a new bill
would look like were it to be submitted to parliament again.
Tusk’s four-party coalition does not have enough votes in parliament to override Nawrocki’s vetoes. That has created a political deadlock over key legislative efforts by the government, which faces reelection next year. Nawrocki, meanwhile, is aiming to help the Law and Justice (PiS) party he is aligned with retake power after it lost to Tusk in 2023.
Mathieu Pollet contributed reporting.
Meta and TikTok have dealt a blow to the European Commission’s social media rulebook, pressing the EU executive to codify how it calculates the number of users on online platforms.
The General Court at the Court of Justice of the European Union sided with the
social media companies on Wednesday in their challenge of an annual supervisory
fee the European Union charges to pay for the enforcement of its tech rulebook,
the Digital Services Act (DSA).
It’s the first major court loss for the Commission over the DSA, which entered
into force in 2022 and can be wielded to fine social media and e-commerce
platforms up to 6 percent of their global annual revenue. The EU has yet to
finalize investigations under the law.
At the heart of the case is the platforms’ disagreement over how the EU calculated the fee. The Commission directly supervises “very large online platforms” with over 45 million average monthly users in the bloc.
Meta and TikTok challenged the European Commission’s decisions imposing
so-called supervisory fees in 2024. These fees are meant to support the Commission’s work overseeing the very platforms that pay them — an extension of the “polluter pays” principle often used in environmental policy — and are proportionate to the number of users the platforms have in the EU.
The EU’s General Court said in its ruling the Commission should have passed a
separate set of rules about how users are calculated before determining the
fees. Judges gave the Commission a year to draft a text on how it calculates
platform users, or else potentially refund the platforms’ 2023 fees.
The EU executive has already been working on such rules, called a delegated act.
The Commission said the court merely ruled against it on procedure and not
substance. “The Court confirms our methodology is sound: no error in
calculation, no suspension of any payments, no problem with the principle of the
fee nor the amount,” said spokesperson Thomas Regnier.
Meta said in a statement that the judgement “will force the European Commission
to reassess the unfair methodology being used to calculate these DSA fees,”
adding it “looks forward to the flaws in the methodology being addressed.”
TikTok “welcomed” the decision and will “closely follow the development” of the
case, company spokesperson Paolo Ganino said.
The United States Congress is amping up criticism of the European Union’s social
media law — and this time, they’ve brought receipts.
The U.S. House of Representatives Judiciary Committee is releasing a report on
Friday that singles out the European Union’s Digital Services Act (DSA) as a
“foreign censorship threat.” The report, shared exclusively with POLITICO,
describes the flagship social media law as a “comprehensive digital censorship
law” that threatens the freedom of speech of American citizens.
The White House and U.S. State Department have been going after the EU’s digital
rulebook for months, accusing the bloc of unfair, burdensome rules that target
American companies and free speech.
On the European side, proponents of these laws want to see them enforced
strictly, including on U.S. technology giants. Some EU officials have argued
that Washington officials are simply fronting the arguments of their homegrown
tech firms.
The Judiciary Committee’s 37-page “interim staff report” is the result of a
five-month, still-ongoing inquiry that started with subpoenas issued by the U.S.
Congress in February to Big Tech companies.
Evidence attached to the report includes correspondence between top European
Commission officials and Jim Jordan, the Republican representative from Ohio and
chairman of the U.S. House Judiciary Committee.
It also includes non-public information about how the European Commission and
national authorities implement the rules, including confidential information
from EU workshops, emails between the EU executive and companies, content
takedown requests in France, Germany and Poland and readouts from Commission
meetings with tech firms.
“On paper, the DSA is bad. In practice, it is even worse,” the report said.
“European censors” at the Commission and EU countries “target core political
speech that is neither harmful nor illegal, attempting to stifle debate on
topics such as immigration and the environment,” it said. Their censorship is
“largely one-sided” against conservatives, it added.
Commission spokesperson Thomas Regnier said in a comment that freedom of
expression “is a fundamental right in the EU. And it is at the heart of our
legislations, including the DSA.” He added: “Absolutely nothing in the DSA
requires a platform to remove lawful content.”
According to the Commission, “more than 99 percent of content moderation
decisions are in fact taken proactively by online platforms to enforce their own
Terms & Conditions” and “content removals based on regulatory authorities’
orders to act against illegal content account for less than 0.001 percent,”
Regnier said.
Judiciary Committee Chairman Jordan is set to lead a bipartisan congressional
delegation to Europe in the coming days, including a stop in Brussels, to
discuss issues of censorship and free speech, a source familiar with the
planning said.
BEHIND CLOSED DOORS
The Commission’s line is that the DSA is apolitical and doesn’t determine what
speech is illegal. But the U.S. committee said that behind closed doors it draws
a line in the sand.
At one such workshop, which POLITICO reported on exclusively in March, the Commission asked Big Tech platforms, regulators and civil society how they would react to different scenarios.
One scenario involved a user being exposed to the phrase “We need to take back
our country.” In this scenario on “illegal content,” the statement appeared
under a photo of a woman in a hijab with the caption “terrorist in disguise,”
the documents show.
The Commission said the user in question would be exposed to “illegal hate
speech.” The committee argued that the phrase “take back our country” is “a
common, anodyne political statement” used by the likes of Kamala Harris.
The Commission also instructed platforms to address memes and satirical content
at the workshop.
According to the U.S. committee’s report, the EU kept the workshop secret
because the “Commission wants to hide its censorship aims.”
The Commission regularly meets with technology companies and other stakeholders
to give information on its policymaking. Some of these meetings — including
those informing its competition cases — occur in public, while others take place
under varying levels of confidentiality.
The committee report also criticized a system of third parties, comprising trusted flaggers and out-of-court dispute settlement bodies, as being neither impartial nor independent of authorities.
The report criticized the need for fact-checkers to be approved by regulators.
It stated that it had identified various conflicts of interest stemming from
these organizations’ goals, funding, or litigation against tech platforms.
The report also criticized the regime set up for Very Large Online Platforms,
those with more than 45 million monthly active users in the bloc. VLOP
designations are “used to burden” non-EU firms, while EU firms are afforded
workarounds, it said.
One case that drew the attention of U.S. representatives: Spotify has been
allowed to split its products into music and podcasts, thus avoiding the more
cumbersome VLOP rules. In the first quarter of 2025, the Swedish firm reported
689 million monthly active users with 37 percent of its subscribers based in
Europe.
The European Commission on Monday said countries can implement their own
national bans for minors on social media, in new guidelines under its powerful
Digital Services Act.
The EU executive has been under pressure in recent months to roll out measures
to protect minors online. National governments in France, Denmark, Spain and
elsewhere have called for social media restrictions, with some criticizing the
EU for not acting quickly enough.
France and the Netherlands have supported an outright ban of social media for
minors under 15. Greece has said it thinks parental consent should be required
for children under a certain age. Denmark, which currently helms work in the
Council of the EU, is pushing for stronger EU-level actions.
Tech giant Meta has also come out in favor of legal restrictions that would require parental consent for children below a certain age to use social media.
“Age verification is not a nice to have. It’s absolutely essential,” said
Denmark’s digital minister Caroline Stage Olsen, who presented the guidelines
alongside the Commission’s tech chief Henna Virkkunen.
The Commission’s new guidelines for minor protection online seek to make sure
platforms face a similar set of rules across Europe under the Digital Services
Act (DSA), the bloc’s landmark social media regulation. The guidelines are
non-binding and set the benchmark for companies to interpret requirements under
the DSA.
The Commission on Monday also released technical specifications for an age
verification app that could help verify if users are over 18 by using IDs and
even facial recognition. The app is set to be tested in France, Greece, Spain,
Italy and Denmark, five countries that are also pushing for restrictions and are
working on their own age verification solutions.
EU countries can also use the app should they decide to implement national
restrictions for social media use at a different age threshold, said a senior Commission official, who was granted anonymity to disclose details of the plan ahead of its release.
The guidelines also recommend that high-risk services such as porn platforms and online alcohol shops verify users’ ages.
“It’s hard to imagine a world where kids can enter a store to buy alcohol, go to
a nightclub by simply stating that they are old enough, no bouncers, no ID
checks, just a simple yes, I am over the age of 18,” but this is what “has been
the case online for many years,” said Stage Olsen.
Monday’s guidelines cover how platforms should adapt their systems to better protect kids across a range of services.
The text suggested that platforms do not use browsing behavior in their
recommender systems; that they turn off features like streaks and read receipts
to decrease the addictiveness of platforms; that they set privacy and security
by default in settings, for example making their accounts invisible to other
users not in their networks; and that they consider turning off some features
like camera access.
The guidelines follow a risk-based approach, meaning platforms can evaluate what
possible threats they pose to minors and adopt measures accordingly.
Tech firms launched a last-minute lobbying push arguing that the guidelines
still allow for cumbersome fragmentation.