Under our rules, Medium may suspend or limit the distribution of controversial or extreme content at its discretion, including potentially harmful misinformation and intentionally deceptive disinformation. For all reported content, we take into account factors like the context and nature of the posted information, the reasonable likelihood, breadth, and intensity of foreseeable social harm, and applicable laws.
In evaluating controversial and extreme content under our rules, we apply a risk analysis which includes, at minimum, the following questions:
- What are the foreseeable negative consequences of the information being propagated by our network, and shared on other social media networks?
- How severe might the potential impact be?
- What is the likelihood of the negative consequence occurring?
- Who will likely be affected as a result?
- Is there information from nationally and internationally recognized institutions (such as the CDC, WHO, and other official bodies) to help us determine whether content presents an elevated risk and potentially violates our rules?
As a result, the following are some examples of content areas with elevated risk, which are therefore more likely to be suspended or subject to reduced distribution under our rules:
- Pseudo-scientific claims asserting the superiority or inferiority of a particular group (on bases including race, ethnicity, or gender).
- Pseudo-scientific health claims or advice which, if acted on, are likely to have detrimental health effects on persons or public safety.
- Conspiracy theories with an associated history of harassing, hateful, or violent incidents among their adherents, or theories that may foreseeably incite or cause harassment, physical harm, or reputational harm.
- Intentional distortions of, and especially systematic false claims about, historical events and facts.