Behind the Numbers: Who Moderates the Social Web?
This post is part of an ongoing series exploring the findings and forecasts from the 2025 Social Web Trust & Safety Needs Assessment Report.
Now that we have three years of data, we’ll not only dive into the 2025 results, but also take a broader look at how key patterns have shifted over time. From volunteer burnout to federation policies, this series will highlight what’s changing, what’s staying the same, and what that means for the future of trust and safety on the social web.
Volunteers, burnout, and the people holding the line
Who is doing the work to keep the social web safe? Who responds to reports, blocks malicious actors, answers legal requests, and supports users in distress?
According to the 2025 Social Web Trust & Safety Needs Assessment Report, it is mostly unpaid, overstretched volunteers. This year’s findings confirm what many already know from experience: the people making moderation possible are holding up a system that is growing heavier by the day.
Moderators are doing everything, often alone
Most of the people keeping platforms safe are not working in large teams or focused roles. They are volunteers running small or medium-sized services who also manage hosting, community building, and legal issues.
- More than half of all respondents said they were the only moderator or part of a very small team
- Only 13% said their main focus was moderation
- The rest balance moderation with technical administration, community management, and legal/compliance activities
There is no clear boundary between roles on most services. Instead, safety work is something moderators have to squeeze in along with everything else.
In 2025, 45% of respondents reported handling three or more roles, down from 52% in 2023. This includes those selecting all four roles (moderation, systems admin, community management, and legal/compliance).
This slight but consistent decline may indicate some separation of duties as communities mature. However, it could also reflect role fatigue, reduced participation, or the departure of volunteers who were previously covering multiple responsibilities.
The mod-to-member ratio is getting worse, not better
Based on total service accounts, the ratio of moderators to accounts now averages 1:24,288, compared with 1:6,167 in 2023. This change is unlikely to reflect improved efficiency; more likely, it reflects a growing burden falling on the same limited pool of volunteers.
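As a point of method, a ratio based on account totals is typically the sum of all registered accounts divided by the sum of all moderators, so the largest instances weigh heavily in the figure. The minimal Python sketch below uses purely hypothetical numbers, not the report's data, to show how that total-based calculation differs from simply averaging each service's own ratio.

```python
# Illustrative sketch only: hypothetical per-service figures, not data from the report.
# Shows how a moderator-to-account ratio based on totals differs from an
# average of per-service ratios.

services = [
    {"accounts": 120_000, "moderators": 3},  # hypothetical large instance
    {"accounts": 800, "moderators": 1},      # hypothetical small instance
    {"accounts": 2_500, "moderators": 2},    # hypothetical mid-sized instance
]

total_accounts = sum(s["accounts"] for s in services)
total_moderators = sum(s["moderators"] for s in services)

# Ratio based on totals: all accounts divided by all moderators.
ratio_from_totals = total_accounts / total_moderators
print(f"1:{ratio_from_totals:,.0f} based on totals")

# For comparison: the mean of each service's own ratio, which gives
# small services far more weight.
mean_of_ratios = sum(s["accounts"] / s["moderators"] for s in services) / len(services)
print(f"1:{mean_of_ratios:,.0f} as a mean of per-service ratios")
```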
While some of the largest instances have dedicated teams, the majority of services are run by one or two people. There is no easy way to scale up this labour, and no capacity to absorb new or worsening threats.
Moderators are burning out
One in five respondents reported that their moderation work had a negative impact on their mental health, including trauma, exhaustion, or withdrawal from community life. This figure has held steady since 2023, with roughly 20% reporting the same each year.
The harms moderators are exposed to include spam floods, disinformation campaigns, hate speech, harassment, and occasionally CSAM or reports of serious real-world harm. Most teams do not have access to legal advice, mental health resources, or trauma-informed processes.
“There is no backup. If I disappear for a week, everything piles up,” said one respondent. Many moderators do not feel safe or supported. Even those who continue to moderate effectively report a high cost to doing so.
We are not onboarding enough new people
Although the report shows a modest increase in average experience overall, it also reveals a decline in the number of new moderators entering the ecosystem. In many communities, experienced moderators have been doing the work for years, often without formal support or clear succession planning.
Moderator experience appears to be splitting into two distinct groups: a growing number of early-career moderators with fewer than three years of experience, and a smaller but rising group with six to ten years.
Those in the middle, particularly with three to six years of experience, are falling away sharply. Without stronger onboarding and retention support, the gap between newer volunteers and long-time moderators is likely to widen.
If we don’t improve the pathways for new moderators to enter, learn, and stay, the system may not hold. The number of people doing the work will continue to shrink, even as threats increase.
This is not sustainable
Decentralised platforms pride themselves on being community-led and member-directed. But community care requires people. And right now, those people are overwhelmed.
If we want the social web to remain open, resilient, and safe for marginalised users, we need to support the humans at its core.
What will help:
- shared tools and templates for policy, onboarding, and moderation
- access to wellbeing support and peer networks
- sustainable funding for training, stipends, and community-led projects
- less duplication and more shared infrastructure across services
We’ll be sharing more posts in the coming weeks, each looking at a different part of the report. From big-picture trends to behind-the-scenes insights, our goal is to make the findings useful, readable, and relevant to the people doing the work. If you’re part of that work, or thinking about getting involved, we hope you’ll follow along.
Support the people doing the work
IFTAS supports the moderators, administrators, and community volunteers who make the social web safer and more resilient. If you believe this work matters, please consider making a donation. Even small contributions help fund training, tools, and care for the people doing the work.