Community Summit Discussion Notes: Handling Trust & Safety (“T&S”) in the WordPress ecosystem: content moderation and sensitive content

From the session schedule:

WordPress has many community-led initiatives and directories with user-submitted content, like the Pattern Directory, the Photo Directory, Openverse, Themes, Plugins, and Support Forums. These areas hold user-submitted content, whether text or other media. However, each team has its own methods to evaluate its content, and its own moderation practices. This discussion aims to better understand current practices across teams towards establishing project-wide best practices and clear guidelines that prioritize safety and equitable moderation.

Facilitator: Kathryn Presner (@zoonini)

Notetaker: david wolfpaw (@wolfpaw)

Raw Notes

  • There are areas where users can submit media, and each team has its own protocols for moderating content. What are other teams doing, what can we learn from each other, and what best practices can we adopt that are safe and equitable?
  • Openverse is a search engine for openly licensed media that is integrated into WordPress as of v6.2 to add content to blogs. It is an aggregator of Creative Commons-licensed content. Unlike the Photo Directory, where people upload media directly, Openverse is an aggregator.
  • Aggregated content means standards are not consistent across the platforms it comes from. Do we apply our own guidelines to moderate content that comes in?
  • Are there trust and safety courses on WordPress Learn? Doubtful, since it is a niche area of need.
  • The Community team has resources for de-escalation, but not many are specific to our community.
  • Are there ways to report content from within WordPress? Yes, there is in Gutenberg, but what is the mechanism that is triggered when a report is made?
  • Openverse comes across sensitive content in searches and reports, and the team is working on mechanisms to handle it.
  • A lot of people who use Openverse are using it in an educational context.
  • It’s a difficult problem because you cannot apply a one-size-fits-all approach to moderation.
  • If you put that on one group of people, no one has full insight into what is offensive in some cultures but not in others.
  • Legal restrictions in some countries conflict with legal requirements in other countries.
  • Openverse is in core, and the software comes from a foundation based in the US, so do we have to legally comply with what other countries require?
  • Private companies like WordPress.com abide by US law when it comes to trust and safety issues
  • Moderation is sometimes outsourced, including to cover different cultures.
  • If a country decides to block Openverse for not meeting its requirements, it is up to the website owner to manage that accordingly when making their website.
  • Some people say, “keep politics out of tech”, but there have to be defaults on some things, and we don’t know how to reach consensus on where to draw a line in the sand as the WordPress project. It’s good to have options, but most people don’t change defaults.
  • Support has a number of practices that could be useful for other teams. One is that every forum topic has a “Report this topic” button, and people have to choose a reason for reporting it.
  • Previously you had to add the “modlook” tag to a topic to get it into the support moderation queue.
  • Content can be marked as unsafe for everyone.
  • We can flag folks for modwatch in the forums, and they cannot post or make topics without being manually approved by moderators. Volunteers go through that queue to approve posts. This can be automated depending on what people type, such as “security vulnerability”, or based on spam checks on their profile and forum replies (see the first sketch after these notes).
  • Topics marked as mature content are also not indexed from the forums, so they won’t come up at the top when you search for support.
  • There’s no prioritization of the queue by topic, just chronological order of when items were added.
  • People can also be moderated manually, with a flag on their account so their posts are pre-moderated.
  • Notes can be left about users for review, and these can be used to unflag people.
  • Teams like Support and Openverse could share their T&S guidelines with other teams, though a lot of them are case-specific.
  • We need shared language for calling something sensitive, and for the reasons it is called sensitive, such as whether it is “mature” or not.
  • Basing things on a blocklist can raise cultural issues, with names and words that are commonplace or harmless in one location but sensitive in another. For instance, the use of “nonce” in American tech culture versus British English.
  • Openverse’s sensitive terms list is public as a file, for indexing and reference.
  • We don’t obfuscate all content, but you can set limits on what is returned when you search for sensitive content; for instance, blurring image results for sensitive content unless you opt in to see them. We don’t want to stop people from searching for things, but to get consent about what will be shown. If someone is looking for sensitive content and it is relevant, we want to acknowledge that while keeping the platform safe to use (see the second sketch after these notes).
  • Openverse aggregates content from various categories of sources, with different levels of trust in the content. For example, in Gutenberg we don’t return search results from Flickr and Wikimedia, which are more akin to social media. They are still available on Openverse, just not in that context (see the third sketch after these notes).
  • In the theme and plugin space, there are some trust and safety issues that arise. For instance, a plugin in support of the Russian war blew up and was escalated overnight. It was a bigger deal than it might have been because there was no formal process for handling it.
  • Is there some form of record keeping for trust-and-safety-related problems?
  • The people who are most equipped to handle subtle moderation problems are not always supported with things like mental health care, and they have to deal with the worst content.
  • Doing trust and safety and moderation at a community level is quite different from doing it at a corporate level.
  • There are opportunities to build a network of mental health support for people at a community level.
  • Is there room for a Trust and Safety Make team that is cross-functional with other Make teams, ensuring they have support for this topic?
  • We would like to have someone pursue a cross-functional Make team, and to get sponsored volunteers for moderation.
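
First sketch: the automated flagging described in the forum notes above could look roughly like the following. This is a minimal sketch only; the trigger phrases, function names, and queue structure are assumptions for illustration, not the actual WordPress.org forum code.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical trigger phrases; the real list is maintained by forum moderators.
TRIGGER_PHRASES = ["security vulnerability"]

@dataclass
class ModerationQueue:
    """A FIFO queue: items are reviewed in the order added, with no topic priority."""
    items: list = field(default_factory=list)

    def add(self, post_id: str, reason: str) -> None:
        self.items.append((post_id, reason))

def premoderation_reason(post_text: str, author_on_modwatch: bool) -> Optional[str]:
    """Return why a post should be held for manual approval, or None to publish."""
    if author_on_modwatch:
        return "author is flagged for modwatch"
    lowered = post_text.lower()
    for phrase in TRIGGER_PHRASES:
        if phrase in lowered:
            return f"matched trigger phrase: {phrase!r}"
    return None

# Usage: hold matching posts for volunteer review instead of publishing them.
queue = ModerationQueue()
reason = premoderation_reason("I found a security vulnerability in a plugin", False)
if reason:
    queue.add("post-123", reason)
```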
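
Second sketch: the opt-in blurring behavior described for Openverse search could be modeled as below. Again a sketch under assumptions; the field names and the naive term matching are illustrative, not Openverse’s actual API or matching logic, and the terms set is a stand-in for the public sensitive terms file.

```python
from dataclasses import dataclass

# Stand-in for Openverse's published sensitive terms file.
SENSITIVE_TERMS = {"example-sensitive-term"}

@dataclass
class SearchResult:
    title: str
    url: str
    sensitive: bool = False
    blurred: bool = False

def mark_sensitivity(result: SearchResult) -> SearchResult:
    """Flag a result whose title contains a sensitive term."""
    words = set(result.title.lower().split())
    result.sensitive = not words.isdisjoint(SENSITIVE_TERMS)
    return result

def apply_consent(results: list[SearchResult], show_sensitive: bool) -> list[SearchResult]:
    """Sensitive results are still returned, but blurred unless the user opted in."""
    for result in results:
        result.blurred = result.sensitive and not show_sensitive
    return results
```

The point of this shape is that the search itself is never suppressed; only the presentation changes until the user consents.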
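
Third sketch: the idea of per-context trust levels for sources could be captured as a simple filter like this. The tier names and source identifiers are assumptions for illustration; the actual source categorization lives in Openverse’s catalog.

```python
from enum import Enum

class Context(Enum):
    GUTENBERG = "gutenberg"   # media inserter inside the editor
    OPENVERSE = "openverse"   # search on the Openverse site itself

# Hypothetical tiers: curated institutional sources vs. social-media-like sources.
CURATED_SOURCES = {"museum_collection", "stock_archive"}
SOCIAL_SOURCES = {"flickr", "wikimedia"}

def allowed_sources(context: Context) -> set[str]:
    """Exclude social-media-like sources from the more conservative Gutenberg
    context while keeping them searchable on Openverse itself."""
    if context is Context.GUTENBERG:
        return set(CURATED_SOURCES)
    return CURATED_SOURCES | SOCIAL_SOURCES
```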

#summit, #summit-2023