
Bluesky saw a 17x increase in moderation reports in 2024 after rapid growth


Bluesky on Friday published its moderation report for the past year, noting the significant growth the social network experienced in 2024 and how that growth increased the workload of its Trust & Safety team. It also noted that the bulk of reports came from users reporting accounts or posts for harassment, trolling, or intolerance, an issue that has dogged Bluesky as it grows and has at times even led to large-scale protests over individual moderation decisions.

The company’s report did not explain why it did or did not take action against individual users, including those on the most-blocked list.

The company added more than 23 million users in 2024 as Bluesky became a new destination for ex-Twitter/X users for various reasons. Over the year, the social network benefited from several changes at X, including its decision to change how blocking works and to train AI on user data. Other users left X after the results of the U.S. presidential election, based on how the politics of X owner Elon Musk came to dominate the platform. The app also saw a surge in users while X was temporarily banned in Brazil back in September.

To meet the demands created by this growth, Bluesky said it has grown its moderation team to roughly 100 moderators and is continuing to hire. The company also began offering team members psychological counseling to help them cope with the difficult work of constant exposure to graphic content. (An area we hope AI will one day be able to take on, as humans are not built to handle this kind of work.)

In total, Bluesky’s moderation service received 6.48 million reports in 2024, 17 times more than in 2023, when there were just 358,000 reports.

Starting this year, Bluesky will begin accepting moderation reports directly within its app. Similar to X, this will let users more easily track actions and updates. Later, it will also support in-app appeals.

When Brazilian users flocked to Bluesky in August, the company was seeing as many as 50,000 reports per day. This led to a backlog in resolving moderation reports and required Bluesky to hire more Portuguese-speaking staff through a contract vendor.

In addition, Bluesky began automating more categories of reports beyond just spam to help it stem the flow, though this sometimes led to false positives. Still, automation helped cut processing time to mere “seconds” for “high-certainty” accounts. Before automation, most reports were handled within 40 minutes. Human moderators remain in the loop to address false positives and appeals, even if they don’t always make the initial decision.

Bluesky says 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most of these, 3.5 million reports, were for individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo. Lists were reported 45,000 times and DMs 17,700 times, while feeds and Starter Packs received 5,300 and 1,900 reports, respectively.

Most of the reports concerned anti-social behavior like trolling and harassment, a signal from Bluesky users that they want to see a less toxic social network than X.

Other reports were for the following categories, Bluesky said:

  • Misleading content (false claims about identity or affiliation, misinformation, or false claims): 1.20 million
  • Spam (excessive posts, replies, or duplicate content): 1.40 million
  • Unwanted sexual content (nudity or adult content not properly labeled): 630,000
  • Illegal or urgent issues (clear violations of the law or of Bluesky’s terms of service): 933,000
  • Other (issues that don’t fit into the above categories): 726,000

The company also offered an update on its labeling service, covering the labels added to posts and accounts. Human labelers added 55,422 “sexually suggestive” labels, followed by 22,412 “rude” labels, 13,201 “spam” labels, 11,341 “intolerant” labels, and 3,046 “threatening” labels.

In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky’s moderation decisions.

There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky additionally fielded 238 requests from law enforcement agencies, governments, and law firms. The company responded to 182 of them and complied with 146. Most of the requests came from law enforcement in Germany, the United States, Brazil, and Japan.

Bluesky’s full report also addresses other issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing and Exploited Children (NCMEC).
