New study shows how virtual communication networks remove legal content
Platforms are deleting content indiscriminately. Anything that doesn’t fit the state narrative has to go.
In recent years, Europe has reached a clear political consensus: virtual communication networks are breeding grounds for hate speech, disinformation, and extremism. Laws like the German Network Enforcement Act (NetzDG) and the European Digital Services Act (DSA) were intended to remedy this. However, a large-scale analysis of content deletion practices on Facebook and YouTube paints a picture that undermines this narrative. The figures show that content is being deleted on a massive scale – but for the most part, not what is illegal.

Almost everything deleted – almost everything legal
The study analysed over 1.27 million comments posted on major political and media Facebook pages and YouTube channels in Germany, France, and Sweden. Of these, 43,497 comments were deleted, which initially sounds moderate: around 3.4 percent of all comments.
But what matters is not how much is deleted, but what.
The result is sobering: depending on the country and platform, between 87.5 and 99.7 percent of the deleted comments were legally permissible. In Germany, the figures were the most extreme:
- 99.7 percent of deleted Facebook comments
- 98.9 percent of deleted YouTube comments
were legal under national law.
This means that almost everything that disappeared could have stayed.
Germany as a special case of over-deletion
Germany stands out not only for the legality of deleted content, but also for the intensity of its moderation. On YouTube, 11.46 percent of all comments in Germany were deleted: more than twice the rate in Sweden (4.07 percent) and significantly more than in France (7.23 percent). On Facebook, Germany's deletion rate is far lower at 0.58 percent, yet the proportion of legal content among the deletions remains extremely high.
This discrepancy suggests that regulatory pressure, rather than societal coarsening, shapes platform behaviour. High penalties lead platforms to delete content when in doubt, without nuanced review.
It is primarily opinions that are deleted
The qualitative analysis of the removed content is particularly revealing. On average, over 56 percent of the deleted comments consisted of pure expressions of opinion: political positions, value judgments, criticism, or agreement – without insults, incitement to violence, or illegal content.
Illegal content made up only a small share of the deletions:
- In Germany: 0.3 percent (Facebook) and 1.1 percent (YouTube)
- In Sweden: 5.4 percent on both platforms
- In France: 7.9 percent (Facebook) and 12.5 percent (YouTube)
In other words, even where illegal content was most prevalent, the vast majority of deletions had no legal justification.
Lack of transparency as a systemic problem
Furthermore, there is a structural democratic deficit: only 25 percent of the surveyed pages and channels disclosed the additional rules they use to moderate comments. For users, it often remains unclear:
- who deleted the comment
- on what basis
- and whether it was a platform decision, page moderation, or automated removal.
This ambiguity creates uncertainty – and uncertainty breeds self-censorship. Those who don’t know where the line lies speak more cautiously, or not at all.
Platforms as judges without the rule of law
The real problem lies in the shift of power. The state sets vague goals ("fighting hate speech") and threatens harsh penalties, leaving implementation to private corporations. These corporations act not according to the rule of law, but according to risk assessment. For platforms, over-deletion is rational: those who delete too much risk little, while those who delete too little risk fines, political attacks, and public scandal.
This effectively creates a form of privatized pre-censorship, triggered by state regulation but operating without judicial oversight, without transparency, and without effective legal remedies.
The DSA increases the risk
The Digital Services Act (DSA) now extends this model to the entire EU. While the law promises greater transparency and user rights, the fundamental dynamics remain: high political pressure, asymmetric risks, and unclear terminology. For platforms, it remains safer to delete too much than too little.
Particularly problematic is that there has already been open discussion about using the DSA to restrict or shut down platforms during times of social unrest. What is justified today as a fight against hate speech could be politically instrumentalized tomorrow.
Conclusion: Order without freedom
The figures paint a clear picture: the digital space is not becoming cleaner, but poorer. It is not extremists who are disappearing, but debates. Hate is not being combated in a targeted way; instead, opinions are being filtered across the board.
Democracy thrives on tolerance. On friction, contradiction, and challenge. When legal expression of opinion is deleted en masse, the boundaries of what can be said shift: subtly, technically, seemingly apolitically.
But therein lies the danger: freedom doesn’t die with a prohibition. It dies when no one dares to use it anymore.
yogaesoteric
January 31, 2026