

Behind the Numbers: Who Moderates the Social Web?



This post is part of an ongoing series exploring the findings and forecasts from the 2025 Social Web Trust & Safety Needs Assessment Report.

Now that we have three years of data, we’ll not only dive into the 2025 results, but also take a broader look at how key patterns have shifted over time. From volunteer burnout to federation policies, this series will highlight what’s changing, what’s staying the same, and what that means for the future of trust and safety on the social web.


Volunteers, burnout, and the people holding the line


Who is doing the work to keep the social web safe? Who responds to reports, blocks malicious actors, answers legal requests, and supports users in distress?

According to the 2025 Social Web Trust & Safety Needs Assessment Report, it is mostly unpaid, overstretched volunteers. This year’s findings confirm what many already know from experience: the people making moderation possible are holding up a system that is growing heavier by the day.

Moderators are doing everything, often alone


Most of the people keeping platforms safe are not working in large teams or focused roles. They are volunteers running small or medium-sized services who also manage hosting, community building, and legal issues.

  • More than half of all respondents said they were the only moderator or part of a very small team
  • Only 13% said their main focus was moderation
  • The rest balance moderation with technical administration, community management, and legal/compliance activities

There is no clear boundary between roles on most services. Instead, safety work is something moderators have to squeeze in along with everything else.

In 2025, 45% of respondents reported handling three or more roles, down from 52% in 2023. This includes those selecting all four roles (moderation, systems admin, community management, and legal/compliance).

This slight but consistent decline may indicate some separation of duties as communities mature. However, it could also reflect role fatigue, reduced participation, or the departure of volunteers who were previously covering multiple responsibilities.

The mod-to-member ratio is getting worse, not better


Based on service account totals, the average ratio of moderators to accounts is now 1:24,288. In 2023 it was 1:6,167. This change is unlikely to reflect improved efficiency; more likely, it points to a growing burden on the same limited pool of volunteers.

While some of the largest instances have dedicated teams, the majority of services are run by one or two people. There is no easy way to scale up this labour, and no capacity to absorb new or worsening threats.

Moderators are burning out


One in five respondents reported that their moderation work had a negative impact on their mental health, including trauma, exhaustion, or withdrawal from community life. This figure has held steady since 2023, with roughly 20% reporting the same each year.

The harms moderators are exposed to include spam floods, disinformation campaigns, hate speech, harassment, and occasionally CSAM or reports of serious real-world harm. Most teams do not have access to legal advice, mental health resources, or trauma-informed processes.

“There is no backup. If I disappear for a week, everything piles up,” said one respondent. Many moderators do not feel safe or supported, and even those who continue to moderate effectively report a high cost to doing so.

We are not onboarding enough new people


Although the report shows a modest increase in average experience overall, it also reveals a decline in the number of new moderators entering the ecosystem. In many communities, experienced moderators have been doing the work for years, often without formal support or clear succession planning.

Moderator experience appears to be splitting into two distinct groups: a growing number of early-career moderators with fewer than three years of experience, and a smaller but rising group with six to ten years.

Those in the middle, with three to six years of experience, are falling away sharply. Without stronger onboarding and retention support, the gap between newer volunteers and long-time moderators is likely to widen.

If we don’t improve the pathways for new moderators to enter, learn, and stay, the system may not hold. The number of people doing the work will continue to shrink, even as threats increase.

This is not sustainable


Decentralised platforms pride themselves on being community-led and member-directed. But community care requires people. And right now, those people are overwhelmed.

If we want the social web to remain open, resilient, and safe for marginalised users, we need to support the humans at its core.

What will help:

  • Shared tools and templates for policy, onboarding, and moderation
  • Access to wellbeing support and peer networks
  • Sustainable funding for training, stipends, and community-led projects
  • Less duplication and more shared infrastructure across services

We’ll be sharing more posts in the coming weeks, each looking at a different part of the report. From big-picture trends to behind-the-scenes insights, our goal is to make the findings useful, readable, and relevant to the people doing the work. If you’re part of that work, or thinking about getting involved, we hope you’ll follow along.

Support the people doing the work


IFTAS supports the moderators, administrators, and community volunteers who make the social web safer and more resilient. If you believe this work matters, please consider making a donation. Even small contributions help fund training, tools, and care for the people doing the work.

Donate to IFTAS today.


The 2025 Social Web Trust & Safety Report Is Here


New insights into the people, pressures, and infrastructure shaping decentralised platforms
[Cover page of the 2025 Needs Assessment Report]
Published by IFTAS, this report draws on detailed surveys and community feedback from volunteer moderators, administrators, and community managers across the decentralised social web. It offers our most comprehensive picture yet of the trust and safety landscape across projects like Mastodon, GoToSocial, WordPress, PeerTube, and more.

What’s in the Report


  • New pressures on moderators: The average mod-to-user ratio has worsened to 1:3,500
  • Spam has overtaken CSAM as the top concern for most teams
  • Burnout remains widespread: 1 in 5 admins and moderators reported trauma or exhaustion
  • Most services lack legal or procedural safeguards needed to manage risk
  • Small communities dominate, but the ecosystem lacks tooling designed for them
  • Consent-based federation is emerging as a desired model for growth and safety


What’s New in 2025


  • There is growing consolidation among large services – and growing strain
  • There’s less onboarding of new moderators, even as threats increase
  • Disinformation campaigns and AI-generated spam are now prominent risks
  • Legal and regulatory complexity is increasing – but support remains scarce


Forecasts for 2026


This year’s report also includes a forward-looking forecast, identifying five trends that will shape the coming year:

  • Shared logic and trust signals will replace fragmented blocklists
  • Synthetic media and impersonation will challenge human moderation
  • Infrastructure capture risks are rising as more tooling centralises
  • Global safety regulation is becoming enforceable, not optional
  • Greylisting and allowlisting may soon replace “default open federation”
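
For readers less familiar with these terms, here is a minimal, hypothetical sketch of how a service might decide whether to federate with a peer under “default open” (denylist-based), greylist, and allowlist policies. It is illustrative only, not code from the report or from any particular platform; the server names, lists, and policy labels are invented.

from enum import Enum

class Policy(Enum):
    OPEN = "open"            # "default open federation": accept everyone except a denylist
    GREYLIST = "greylist"    # accept known-good peers, limit unknown ones pending review
    ALLOWLIST = "allowlist"  # accept only explicitly approved peers

# Hypothetical lists an admin might maintain (illustrative only)
DENIED = {"spam.example"}
APPROVED = {"friendly.example", "partner.example"}

def federation_decision(peer: str, policy: Policy) -> str:
    """Return 'accept', 'limit', or 'reject' for a peer server under the given policy."""
    if policy is Policy.OPEN:
        return "reject" if peer in DENIED else "accept"
    if policy is Policy.GREYLIST:
        if peer in DENIED:
            return "reject"
        return "accept" if peer in APPROVED else "limit"
    # Policy.ALLOWLIST: anything not explicitly approved is rejected
    return "accept" if peer in APPROVED else "reject"

# An unknown server is accepted under open federation, limited under greylisting,
# and rejected under allowlisting:
for policy in Policy:
    print(policy.value, "->", federation_decision("unknown.example", policy))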


Why This Matters


Moderators are the backbone of a safer social web – but most are unpaid, under-supported, and under constant strain. If we want a future for decentralised platforms that respects user agency, civil speech, and community autonomy, we need to support the infrastructure that keeps it safe.

Read the Report


Download the 2025 Needs Assessment Report (PDF)

In the weeks ahead, we’ll be publishing a series of follow-up posts that take a closer look at the trends, challenges, and emerging patterns highlighted in this year’s report. These articles will explore context and practical insights for anyone working to support safer, more resilient decentralised platforms.

Media or Press Enquiries


For questions, interviews, or background information related to the report or IFTAS’ work, contact press@iftas.org

Follow IFTAS to stay informed: Mastodon, Bluesky, WordPress


