Tag - Encryption

Europe’s digital sovereignty: from doctrine to delivery
When the Franco-German summit concluded in Berlin, Europe’s leaders issued a declaration with a clear ambition: strengthen Europe’s digital sovereignty in an open, collaborative way. European Commission President Ursula von der Leyen’s call for “Europe’s Independence Moment” captures the urgency, but independence isn’t declared — it’s designed.

The pandemic exposed this truth. When Covid-19 struck, Europe initially scrambled for vaccines and facemasks, hampered by fragmented responses and overreliance on a few external suppliers. That vulnerability must never be repeated. True sovereignty rests on three pillars: diversity, resilience and autonomy.

Diversity doesn’t mean pulling every factory back to Europe or building walls around markets. Many industries depend on expertise and resources beyond our borders. The answer is optionality: never putting all our eggs in one basket. Europe must enable choice and work with trusted partners to build capabilities. This risk-based approach ensures we’re not hostage to single suppliers or overexposed to nations that don’t share our values.

Look at the energy crisis after Russia’s illegal invasion of Ukraine. Europe’s heavy reliance on Russian oil and gas left economies vulnerable. The solution wasn’t isolation, it was diversification: boosting domestic production from alternative energy sources while sourcing from multiple markets. Optionality is power. It lets Europe pivot when shocks hit, whether in energy, technology or raw materials.

Resilience is the art of prediction. Every system inevitably has vulnerabilities. The key is pre-empting, planning, testing and knowing how to recover quickly. Just as banks undergo stress tests, Europe needs similar rigor across physical and digital infrastructure. That also means promoting interoperability between networks, building redundant connectivity links (including space and subsea cables), stockpiling critical components and maintaining contingency plans. Resilience isn’t theoretical. It’s operational readiness.

Finally, Europe must exercise authority through robust frameworks, such as authorization schemes, local licensing and governance rooted in EU law. The question is how and where to apply this control. On sensitive data, for example, sovereignty means ensuring it’s held in Europe under European jurisdiction, without replacing every underlying technology component. Sovereign solutions shouldn’t shut out global players. Instead, they should guarantee that critical decisions and compliance remain under European authority. Autonomy is empowerment: limiting external interference or denial of service while keeping systems secure and accountable.

But let’s be clear: Europe cannot replicate world-leading technologies, platforms or critical components overnight. While we have the talent, innovation and leading industries, Europe has fallen significantly behind in a range of key emerging technologies. Building fully European alternatives in cloud and AI, for example, would take decades and billions of euros, and even then we’d struggle to match Silicon Valley or Shenzhen. Worse, turning inward with protectionist policies would only weaken the foundations we now seek to strengthen. “Old wines in new bottles” — import substitution, isolationism, picking winners — won’t deliver competitiveness or security.

Contrast that with the much-debated US Inflation Reduction Act. Its incentives and subsidies were open to EU companies, provided they invested locally, developed local talent and built within the US market. It’s not about flags, it’s about pragmatism: attracting global investment, creating jobs and driving innovation-led growth.

So what’s the practical path? Europe must embrace ‘sovereignty done right’, weaving diversity, resilience and autonomy into the fabric of its policies. That means risk-based safeguards, strategic partnerships and investment in European capabilities while staying open to global innovation. Trusted European operators can play a key role: managing encryption, access control and critical operations within EU jurisdiction, while enabling managed access to global technologies. To avoid ‘sovereignty washing’, eligibility should be based on rigorous, transparent assessments, not blanket bans. The Berlin summit’s new working group should start with a common EU-wide framework defining levels of data, operational and technological sovereignty. Providers claiming to offer sovereign services can then use this framework to demonstrate transparently which levels they meet.

Europe’s sovereignty will not come from closing doors. Sovereignty done right will come from opening the right ones, on Europe’s terms. Independence should be dynamic, not defensive — empowering innovation, securing prosperity and protecting freedoms. That’s how Europe can build resilience, competitiveness and true strategic autonomy in a vibrant global digital ecosystem.
Data
Energy
Security
Borders
Rights
Von der Leyen drifts right with new digital deregulation plans
BRUSSELS — A fresh proposal from European Commission President Ursula von der Leyen to reform digital laws was welcomed on Wednesday by lawmakers on the right but shunned on the left.

It signals a possible repeat of a pivotal parliamentary clash last week in which von der Leyen’s center-right European People’s Party sided with the far right to pass her first omnibus proposal on green rules — sidelining the centrist coalition that voted the Commission president into office last year.

The EU executive on Wednesday presented plans to overhaul everything from its flagship General Data Protection Regulation to data rules and its fledgling Artificial Intelligence Act. The reforms aim to help businesses using data and AI, in an effort to catch up with the United States, China and other regions in the global tech race.

Drafts of the plans obtained by POLITICO caused an uproar in Brussels in the past two weeks, as everyone from liberal to left-leaning political groups and privacy-minded national governments rang the alarm. Von der Leyen sought to extend an olive branch with last-minute tweaks to her proposal, but she’s still a long way from the center-left groups, with the Progressive Alliance of Socialists and Democrats, the Greens and The Left all slamming the plans in recent days.

Tom Vandendriessche, a Belgian member of the far-right Patriots for Europe group, said the GDPR is not “untouchable,” and that there needs to be simplification “to ensure our European companies can compete again.” He added: “If EPP supports that course, we’re happy to collaborate on that.”

Charlie Weimers, a Swedish member of the right-wing European Conservatives and Reformists, welcomed the plan for “cleaning up overlapping data rules, cutting double reporting and finally tackling the cookie banner circus.” Weimers argued von der Leyen could go further, saying the package falls short of being “the regulatory U-turn the EU actually needs” to catch up in the AI race.

Those early rapprochements on the right are what Europe’s centrists and left fear most. The digital omnibus “should not be a repetition of omnibus one,” German Greens lawmaker Sergey Lagodinsky told reporters on Wednesday. Lagodinsky warned EPP leader Manfred Weber that “there should be no games with anti-democratic and anti-European parties.”

BIG REFORMS, SMALL CONCESSIONS

The Commission’s double-decker digital omnibus package includes one plan to simplify the EU’s data-related laws (including the GDPR as well as rules for nonpersonal data), and another specifically targeting the AI Act.

A Commission official, briefing reporters without being authorized to speak on the record, said the omnibus’ impact on the GDPR was subject to “intense discussion” internally in the run-up to Wednesday’s presentation, after its rough reception from some parliament groups and privacy organizations. Much in the EU executive’s final text remained unchanged.

Among the proposals, the Commission wants to insert an affirmation into the GDPR that AI developers can rely on their “legitimate interest” to legally process Europeans’ data. That would give AI companies more confidence that they don’t always have to ask for consent. It also wants to change the definition of personal data in the GDPR to allow pseudonymized data — where a person’s details have been obscured so they can’t be identified — to be more easily processed. The omnibus proposals also aim to reduce the number of cookie banners that crop up across Europe’s internet.
To assuage privacy concerns, Commission officials scrapped a hotly contested clause that would have redefined what is considered “special category” data, like a person’s religious or political beliefs, ethnicity or health data, which are afforded extra protections under the GDPR. The new cookie provision will also contain an explicit statement that website and app operators still need to get consent to access information on people’s devices.

SEEKING POLITICAL SUPPORT

The final texts will now be scrutinized by the Parliament and Council of the European Union.

Von der Leyen’s center-right EPP welcomed the digital simplification plans as “a critical boost for Europe’s industrial competitiveness.” Parliament’s group of center-left Socialists and Democrats was critical of the reforms. Birgit Sippel, a prominent German member of the group, said in a statement the Commission “wants to undermine its own standards of protection in the area of data protection and privacy in order to facilitate data use, surveillance, and AI tools ‘made in the U.S.’”

On the EPP’s immediate left, the liberal Renew group cited “important concerns” about the final texts but said it was “delighted” that the Commission backtracked on changing the definition of sensitive data, one idea in the leaked drafts that triggered a backlash. Renew said it would “support changes in the digital omnibus that will make life easier for our European companies.”

If von der Leyen goes looking for votes for her digital omnibus among far-right groups, she will find support, but it might not be a united front. German lawmaker Christine Anderson of the Alternative for Germany party, part of the far-right Europe of Sovereign Nations group, warned the digital omnibus could end up boosting “the ability to track and profile people.” Weaker privacy rules would “enable enhanced surveillance architecture,” she said, adding her party had “always opposed” such changes. “On these issues, we find ourselves much closer to the groups on the left in the Parliament,” she said.

Pieter Haeck contributed reporting.
Data
Intelligence
Social Media
Far right
Negotiations
Europe’s police want AI to fight crime. They say red tape stands in the way.
The European Union’s law enforcement agency wants to speed up how it gets its hands on artificial intelligence tools to fight serious crime, a top official said.

Criminals are having “the time of their life” with “their malicious deployment of AI,” but police authorities at the bloc’s Europol agency are weighed down by legal checks when trying to use the new technology, Deputy Executive Director Jürgen Ebner told POLITICO. Authorities have to run through data protection and fundamental rights assessments under EU law. Those checks can delay the use of AI by up to eight months, Ebner said. Speeding up the process could make the difference in time-sensitive situations where there is a “threat to life,” he added.

Europe’s police agency has built out its tech capabilities in past years, ranging from big data crunching to decrypting communication between criminals. Authorities are keen to fight fire with fire in a world where AI is rapidly boosting cybercrime. But academics and activists have repeatedly voiced concerns about giving authorities free rein to use AI tech without guardrails.

European Commission President Ursula von der Leyen has vowed to more than double Europol’s staff and turn it into a powerhouse to fight criminal groups “navigating constantly between the physical and digital worlds.” The Commission’s latest work program said this will come in the form of a legislative proposal to strengthen Europol in the second quarter of 2026.

Speaking in Malta at a recent gathering of data protection specialists from across Europe’s police forces, Ebner said it is an “absolute essential” for there to be a fast-tracked procedure allowing law enforcement to deploy AI tools in “emergency” situations without having to follow a “very complex compliance procedure.” Assessing the data protection and fundamental rights impacts of an AI tool is required under the EU’s General Data Protection Regulation (GDPR) and Artificial Intelligence Act (AI Act). Ebner said these processes can take six to eight months.

The top cop clarified that a faster emergency process would not bypass red lines on AI tools around profiling or live facial recognition. Law enforcement authorities already have several exemptions under the AI Act. Under the rules, the use of real-time facial recognition in public spaces is prohibited for law enforcers, but EU countries can still permit exceptions, especially for the most serious crimes. Lawmakers and digital rights groups have expressed concerns about these carve-outs, which were secured by EU countries during the law’s negotiation.

DIGITAL POLICING POWERS

Ebner, who oversees governance matters at Europol, said “almost all investigations” now have an online dimension. The investment in tech and innovation needed to keep pace with criminals is putting a “massive burden on law enforcement agencies,” he said.

The Europol official has been in discussions with Europe’s police chiefs about the EU agency’s upcoming expansion. He said they “would like to see Europol doing more in the innovation field, in technology, in co-operation with private parties.”

“Artificial intelligence is extremely costly. Legal decryption platforms are costly. The same is to be foreseen already for quantum computing,” Ebner said.
Europol can help bolster Europe’s digital defenses, for instance by seconding analysts with technological expertise to national police investigations, he said. Europol’s central mission has been to help national police investigate cross-border serious crimes through information sharing. But EU countries have previously been reluctant to cede too much actual policing power to the EU-level authority.

Taking control of law enforcement away from EU countries is “out of the scope” of any discussions about strengthening Europol, Ebner said. “We don’t think it’s necessary that Europol should have the power to arrest people and to do house searches. That makes no sense, that [has] no added value,” he said.

Pieter Haeck contributed reporting.
Data
Security
Regulation
Rights
Artificial Intelligence
One-man spam campaign ravages EU ‘chat control’ bill
BRUSSELS — A website set up by an unknown Dane over the course of one weekend in August is giving a massive headache to those trying to pass a European bill aimed at stopping child sexual abuse material from spreading online.

The website, called Fight Chat Control, was set up by Joachim, a 30-year-old software engineer living in Aalborg, Denmark. He made it after learning of a new attempt to approve a European Union proposal to fight child sexual abuse material (CSAM) — a bill seen by privacy activists as breaking encryption and leading to mass surveillance. The site lets visitors compile a mass email warning about the bill and send it to national government officials, members of the European Parliament and others with ease. Since launching, it has broken the inboxes of MEPs and caused a stir in Brussels’ corridors of power.

“We are getting hundreds per day about it,” said Evin Incir, a Swedish Socialists and Democrats MEP, of the email deluge. Three diplomats at national permanent representation offices said they too have received a large number of emails.

Joachim’s website has stoked up an already red-hot debate around the CSAM proposal, which would give police the power to force companies like WhatsApp and Signal to scan their services for the illegal content. Critics fear the bill would enable online state surveillance. Elon Musk’s X said Monday that the bill could enable “government instituted mass surveillance,” and encrypted chat app Signal said last weekend it would pull out of Europe if the bill passed. Meta’s WhatsApp also came out against Denmark’s proposal — backing Europe’s privacy groups, which have railed against the bill ever since its conception.

EU countries are split into two camps. One side broadly backs the bill’s measures as a way to stop predators from sharing illegal content of children; the other says it would create a surveillance state and be ineffective. Denmark proposed a new version on its first day holding the presidency of the Council of the EU in July. Danish diplomats hope to get an agreement at a meeting of ministers in Luxembourg next week, and for that, the proposal needs to get past EU ambassadors on Wednesday.

MILLIONS OF EMAILS

Joachim himself declined to provide his last name or workplace because his employer does not want to be associated with the campaign. POLITICO has verified his identity. Joachim said his employer has no commercial interest in the legislation, and he alone paid the costs associated with running the website.

Joachim’s mass email campaign is unconventional as a lobbying tool, differing from the more wonky approach usually taken in Brussels. But the website’s impact has been undeniable. The Polish government responded directly to the campaign in a statement last month, reassuring Poles it’s against mass scanning of messages. A Danish petition, pushed by the Fight Chat Control campaign, now has more than 50,000 signatures, meaning it can be discussed in parliament. Irish national lawmakers asked questions in parliament in September about “Chat Control,” the name for the legislation adopted by its critics and used by Joachim.

As of early October, nearly 2.5 million people had visited his website, Joachim said, with most coming from within the EU. The emails are sent from visitors’ own email clients, meaning Joachim doesn’t know how many have been sent, but he estimated that it has triggered several million emails. The campaign has irked some recipients.
“In terms of dialog within a democracy, this is not a dialog,” said Lena Düpont, a German member of the European People’s Party group and its home affairs spokesperson, of the mass emails.

Joachim’s campaign is blocking more traditional lobbyists and campaigners, too. Mieke Schuurman, director at child rights group Eurochild, said the group’s messages are no longer reaching policymakers, who “increasingly respond with automated replies.”

Joachim, who said he has not paid to promote the site, said it is “regrettable” that child rights campaigners’ emails have received automated responses. But the flood of emails sent by his website visitors is “a quite clear indication that people really care about this … I would actually argue this is as democratic as it gets,” he said.

CAPITALS ON EDGE

The European Commission presented its original proposal on CSAM in 2022 as an effort to stem the spread of the illegal content. Since then, police authorities have warned the problem has gotten worse, in part because platforms have increasingly enabled privacy technologies and encrypted messaging across some of the most popular services. The rise of artificial intelligence-generated content has added to the problem, authorities have warned.

National governments are attempting — for the fifth time, at least — to hash out a compromise on the EU proposal. Countries first need to adopt their own position before negotiations with the European Parliament can take place.

One EU diplomat said some EU member countries are now more hesitant to support Denmark’s proposal, at least in part because of the campaign: “There is a clear link.” Ella Jakubowska, head of policy at digital rights group EDRi, said: “This campaign seems to have raised the topic high up the agenda in member states where there was previously little to no public debate.”

But Danish Justice Minister Peter Hummelgaard, one of the loudest proponents of tough measures to get child abuse material off online platforms, said in a statement that his proposal is far more balanced than the Commission’s original version and would mean that scanning would only happen as a last resort. “This has nothing to do with ‘chat control,’ as the sponsors of the citizens’ initiative claim,” he said.
Negotiations
Regulation
Technology
Companies
Law enforcement
Deleting texts to save space, Ursula? ‘It’s not the 1990s.’
BRUSSELS — The president of the European Commission auto-deletes messages from her phone in part to save storage space, the EU executive said this week. Tech experts have but one question: Really?

Deleting messages to save space “sounds cute but also hard to believe. Let’s not be silly here, it’s not the 1990s,” said Lukasz Olejnik, senior research fellow at King’s College London and a cybersecurity expert.

“A text message barely takes any room on a modern phone. Like, you would need to get hundreds of thousands of text messages for it to actually make a difference,” Belgian ethical hacker Inti De Ceukelaire said, calling the Commission’s explanation “a non-argument.”

“Why doesn’t she change to a phone with more storage?” asked Francisco Jeronimo, vice president for data and analytics at technology market research firm IDC in Europe.

Ursula von der Leyen is in the hot seat over a text message she received from French President Emmanuel Macron last year urging her to block the EU-Mercosur trade deal, as first reported by POLITICO. The message was subsequently deleted from von der Leyen’s phone, the Commission said in response to an access to documents request filed by Follow the Money reporter Alexander Fanta.

On Wednesday Commission spokesperson Olof Gill told reporters: “The messages are auto-deleted after a while, just for space reasons.” He jokingly added: “Otherwise, the phone would go on fire.” Another spokesperson, Balazs Ujvari, added that it also helped prevent security breaches, but doubled down on the idea that it was a means of saving space: “On the one hand, it reduces the risk of leaks and security breaches, which is of course an important factor … And also, it’s a question of space on the phone, so, effective use of a mobile device.”

To be sure, many Europeans have struggled with overloaded phone storage. But for most it’s a matter of home videos and reams of family pictures that are clogging devices. “Messages take up a lot of space if we are talking about videos, voice recordings,” IDC’s Jeronimo said, whereas text-based messages “take nearly nothing from the storage.”

The Commission told its staff in 2020 to start using Signal, an end-to-end-encrypted messaging app, in a push to increase the security of its communications. The institution recommended using the app’s disappearing messages functionality in a 2022 guidance called “Checklist to Make Your Signal Safer.” For security purposes it makes sense, Jeronimo said. “If someone like [von der Leyen] loses her phone, or if the phone is hacked … there’s a very high risk” that her communications will be compromised.

But the Macron text again trains the spotlight on the EU executive’s policies on keeping a public record of its leader’s communications, following a scandal dubbed “Pfizergate” in which von der Leyen’s text exchanges with Pfizer CEO Albert Bourla over Covid vaccine contracts were never archived. The European Ombudsman continues to investigate Pfizergate, and this week announced it had opened an investigation into last year’s text from Macron. According to Olejnik, “the truth is that [auto-deleting messages] is great for security, not so [much] for public transparency or accountability.”

Gerardo Fortuna contributed reporting.
Mercosur
Technology
Transparency
Communications
Diplomacy
Britain drops demand for access to Apple user data
LONDON — The British government has dropped its demand for Apple to provide “backdoor” access to user data, U.S. Director of National Intelligence Tulsi Gabbard said Tuesday.

“Over the past few months, I’ve been working closely with our partners in the UK, alongside [the president and vice president] to ensure Americans’ private data remains private and our constitutional rights and civil liberties are protected,” Gabbard wrote on X.

Apple took the unprecedented step of removing its highest level of end-to-end encryption software — known as Advanced Data Protection — from the U.K. market in February after the Home Office issued a Technical Capability Notice to access the data under the Investigatory Powers Act, dubbed the “Snooper’s Charter” by critics. The company then filed a complaint with the Investigatory Powers Tribunal challenging the Home Secretary’s powers to issue such a notice.

The dispute has been a sticking point in negotiations for a tech cooperation pact between London and Washington. The Financial Times reported last month that senior Washington officials, including Vice President JD Vance, were pressuring the U.K. to drop its fight with Apple.

The U.S. State Department’s annual assessment of countries’ human rights records, published last month, raised concerns about U.K. “government regulation to reduce or eliminate effective encryption (and therefore user privacy) on platforms,” though it appeared to confuse the Online Safety Act with the Investigatory Powers Act.
Data
Intelligence
UK
Regulation
Human rights
US says UK human rights record worsening thanks to online safety regime
LONDON — The U.S. claims the U.K.’s human rights situation “worsened” in 2024, citing restrictions on freedom of expression largely linked to the U.K.’s Online Safety Act (OSA).

The U.S. State Department’s annual assessment of countries’ human rights practices, published Wednesday, criticizes the U.K. government for attempting to “chill speech” around the perpetrator of last summer’s Southport attack in which three young girls were killed, stating that “censorship of ordinary Britons was increasingly routine.”

“Significant human rights issues included credible reports of serious restrictions on freedom of expression, including enforcement of or threat of criminal or civil laws in order to limit expression; and crimes, violence, or threats of violence motivated by antisemitism,” the report states.

The report singles out the U.K.’s OSA for criticism, claiming the rules “expressly expanded Ofcom’s authority to include American media and technology firms with a substantial number of British users, regardless of whether they had a corporate presence in the UK.” Since a key provision requiring that children be restricted from accessing content deemed harmful came into force in July, prompting platforms to carry out widespread age checks, digital rights groups including Open Rights Group and Big Brother Watch have raised privacy and freedom of speech concerns.

However, parts of the U.S. report mischaracterize the U.K.’s online safety laws. It claims the communications watchdog, Ofcom, is authorized by the act to “monitor all forms of communication” for illegal speech, which is not the case. The OSA also does not enable the regulator or the government to direct the removal of specific items of content. Instead — as the report later outlines correctly — service providers have a more general obligation to prevent users from encountering such content, while protecting their right to freedom of expression.

The report also highlights some experts’ warnings that “one effect of the [Act] could be government regulation to reduce or eliminate effective encryption (and therefore user privacy) on platforms.” The U.K.’s previous Conservative government pulled back from enforcing the Act’s “spy clause” after conceding that the technology to securely scan end-to-end encrypted messages for child sexual abuse material without undermining privacy did not yet exist. Digital rights groups fear encryption-breaking surveillance could still be pushed in the future, as it still forms part of the OSA.

The Home Office is currently locked in a legal row with Apple over end-to-end encryption after the company refused to grant access to information secured by its Advanced Data Protection system earlier this year, but the dispute relates to the Investigatory Powers Act (dubbed the “Snooper’s Charter” by critics) rather than the OSA. The Financial Times reported last month that senior Washington officials, including Vice President JD Vance, were pressuring the U.K. to drop its fight with Apple over encryption.
Regulation
Rights
Human rights
Technology
Services