Meta’s EU political ad ban: what it means for charities

[Image: street-art collage featuring a black-and-white poster of a smiling Mark Zuckerberg over a background of Facebook logos, captioned ‘YOU’VE BEEN ZUCKED’, on a wall of torn posters and graffiti.]

From 10 October 2025, Meta will stop allowing political and social issue advertising across the European Economic Area (EEA). This includes any ads flagged under its “Social Issues, Elections or Politics” (SEP) category – something that often affects charities promoting work on topics like immigration, housing, or international aid.

The change is in response to the EU’s new TTPA regulation, which introduces tighter rules for transparency and targeting in political ads.

While the Meta ad ban won’t immediately affect UK-based advertisers, the ripple effects are likely to be felt much further. This is part of a crackdown on how big tech platforms handle political content – and charities working across Europe or on socially sensitive issues will need to take notice.

Why Meta is banning political ads in Europe

This change comes in response to the EU’s Transparency and Targeting of Political Advertising (TTPA) regulation, which aims to tackle disinformation and increase transparency in online campaigning. Platforms like Meta, Google, and TikTok are required to meet stricter rules on how political ads are labelled, targeted, and reported.

Rather than comply, Meta has chosen to withdraw political and social advertising altogether across the EU, citing legal complexity and operational challenges.

This is just the latest in a deteriorating relationship between Meta and EU regulators – from the Digital Services Act to ongoing lawsuits over data use, disinformation, and consent models.

The big picture: Meta, misinformation and regulatory missteps

Meta has long played a central role in political discourse – helping to spread news, shape opinions and mobilise action. It’s been under increasing legal and regulatory scrutiny, which so far has resulted in Meta designating a set of restricted/sensitive categories that require advertisers to pass additional verification checks. SEP is one of these categories.

The EU’s Digital Services Act (DSA), which came into force in 2022, requires online platforms like Meta to take more decisive action on harmful content and disinformation. When Facebook and Instagram failed to act on deceptive advertising in the run-up to the 2024 European Parliament elections, proceedings were initiated against them.

This isn’t the only time Meta has run into challenges with EU regulation – it currently pays out over $1bn a year in EU penalties alone. Multiple legal battles have been brought against the embattled platform over data, privacy, manipulation through disinformation and election interference – including a US lawsuit over its failure to protect young people online, an FTC fine over its involvement with Cambridge Analytica (and a subsequent settlement with investors over its failure to comply), a €200M fine over Meta’s ‘consent or pay’ model, and an AU$50M settlement over Cambridge Analytica.

Meta (along with other platforms) has refused to sign the EU’s voluntary code of practice on the implementation and use of AI, which has contributed to a worsening relationship.

The TTPA is another step in the EU’s attempts to rein in tech giants. It introduces stricter rules for political and social advertising, including new transparency labels and limits on how ads can be targeted.

What’s currently in place, and what’s about to change

Currently, advertisers who run SEP ads on Meta must register their identity, become authorised and include a disclaimer identifying who paid for the ads. Details of these ads and how much has been spent are publicly accessible via an ads library.

The TTPA imposes new, stricter requirements for political advertising, which include:

  • Identifying the sponsors of the ads
  • Identifying the relevant elections impacted by the advertising
  • Disclosing audience targeting methods

It’s clear that the EU considers current attempts to limit disinformation or prevent election interference to be insufficient. The regulation reflects growing frustration with the way big tech handles political content, user privacy and its influence on public discourse.

Notably, consent or pay models are explicitly prohibited in the EU. The UK’s position may be more permissive, so long as the model is implemented consistently with ICO guidance.

Meta’s response – complete withdrawal of SEP ads

Meta’s response has been heavy-handed: it has elected to stop SEP ads entirely, and those who advertise in areas adjacent to social and political issues will bear the brunt of this. Where advertisers could previously mark content as SEP in Ads Manager and still publish it after verification, that option is being removed. Ads flagged under these categories won’t be approved or published.

Meta argues that the new regulations are too restrictive, and limit EU citizens’ rights to engage in political or social issues:

“The TTPA introduces significant, additional obligations to our processes and systems that create an untenable level of complexity and legal uncertainty for advertisers and platforms operating in the EU. For example, the TTPA places extensive restrictions on ad targeting and delivery which would restrict how political and social issue advertisers can reach their audiences and lead to people seeing less relevant ads on our platforms.”

As Liz Carolan points out, banning political ads entirely can backfire – making it harder to monitor harmful behaviour and pushing legitimate organisations off the platform. Without oversight, disinformation may spread more easily. According to Carolan, the responsibility for preventing this still sits with Meta and other major digital platforms. 

Beyond Meta: what this signals about the future of tech regulation

The TTPA signals that the EU will no longer tolerate non-compliance by tech giants on issues like privacy, transparency, and election interference. It also suggests more regulation is on the way.

Right now, the regulation only applies to advertisers targeting EU countries. But given the EU’s influence on global policies – often referred to as the Brussels effect – it’s very possible that other governments, including the UK, will follow its lead (GDPR is a great example of this).

We see the TTPA as a positive step in holding platforms accountable for the spread of harmful content and misinformation. But Meta’s all-or-nothing response means that advertisers working on adjacent or sensitive social issues, including charities, are likely to bear the brunt of the restrictions.

What is clear is that the regulatory environment is becoming increasingly complicated, and that advertisers (including charities) will need to evolve to keep up.

How the restrictions will impact charities

From what we’ve seen, ads don’t need to be overtly political or mention elections to be flagged as SEP. Content may be flagged if it includes words, images, or references that are politically sensitive in the current news cycle or public debate.

This can shift over time, depending on what’s in the headlines. To reduce risk, it’s important to stay alert to public discourse that overlaps with your cause area. You should also prepare a variety of creative assets and copy variations to increase your chances of approval.

In 2020, words like ‘COVID’ or ‘pandemic’ were flagged because they risked being misused by bad actors. Current trigger topics could include trans rights, race, homelessness, international aid, and refugees/asylum seekers.

We don’t expect ads to be blocked solely because they’re fundraising (e.g. donations, petitions, handraisers). But if these ads use politically sensitive wording or visuals, they could be caught by the filter.

What charities need to do now

There’s not much that can be done to reverse this, but you can prepare. Here’s what Platypus is doing right now:

  • Reviewing all client accounts to understand specific risks,
  • Flagging potentially sensitive content with EU-based clients,
  • Helping UK clients de-risk upcoming Meta ad content.

We always advise charities to maintain a diverse and resilient mix of advertising platforms. But switching platforms won’t avoid the risk entirely, as the regulatory shift will apply across digital media.

Google is also ending political advertising in the EU. TikTok is under investigation for a range of issues, including insufficient protection of user data and a lack of transparency over political advertising. It’s banned outright in a number of countries due to interference risks.

We believe this trend will eventually reach any platform used to reach EU audiences.

If you’re a UK-only advertiser, and you’ve only occasionally had ads flagged as SEP, there’s no immediate action required – but now is a good time to start preparing.

If you advertise in the EEA, we recommend you:

  • Review past campaigns to see how often your ads have been flagged,
  • Identify patterns in flagged ads (keywords, themes, visuals),
  • De-risk future campaigns by avoiding potentially sensitive language, including on your landing pages,
  • Reserve your more politically-adjacent content for stewardship channels like email or SMS, rather than paid social.

Questions? Contact Kris or Jane to arrange a time to speak, or fill in our contact form.

 
