Note: If you find my posts too long or too dense to read on occasion, please just read the bolded portions. They present the key points I’m making and the most important information I’m sharing.
Facebook’s promotion of low-quality, right-wing content and disinformation is well documented. For example, in April 2021, The Daily Wire, a bigoted, sexist, anti-immigrant, far-right website that produces no original reporting and only a small volume of articles, had by far the highest distribution and engagement on Facebook. Second was the British tabloid the Daily Mail, followed by Fox News. Four of the top six sources of content engagement on Facebook were right-wing publishers of disinformation. Credible media outlets got much less engagement because of Facebook’s content promotion algorithm. For April 2021:
- The Daily Wire (1st) 74.9 million Facebook engagements based on 1,385 articles
- CNN (4th) 23.1 million Facebook engagements based on 4,765 articles
- NBC (7th) 18.7 million Facebook engagements based on 2,596 articles
- New York Times (8th) 18.6 million Facebook engagements based on 6,326 articles
- Washington Post (14th) 12.3 million Facebook engagements based on 6,228 articles
Facebook’s reality, driven by its content promotion algorithm, is NOT the reality outside of Facebook. In the world beyond Facebook, The Daily Wire is NOT more popular than CNN, NBC, the New York Times, and the Washington Post, let alone more popular than all four of them combined. Those four outlets publish almost 20,000 articles per month, compared to fewer than 1,400 from The Daily Wire, none of which contain original reporting. Facebook promotes this alternative reality because doing so maximizes its profits. (See this previous post for more detail.)
The election-related disinformation that flourishes on Facebook is a global crisis. There are 36 national elections in countries around the globe in 2022, and many of them will be affected by disinformation on Facebook. Some may be affected to an even greater degree than the U.S., where a strong case can be made that disinformation on social media (with Facebook as a major player, if not the major one) led to the election of Trump in 2016.
Facebook (and its parent Meta) know how to stop the proliferation of disinformation and have done so for short periods of time at least twice. Meta refers to these instances as “break the glass” emergencies, but the emergency is not short-term and tied to a specific incident; it is long-term and endemic.
For five days after the 2020 U.S. national election, Facebook’s News Feed and other features operated very differently. Facebook adjusted its content promotion calculations, i.e., its algorithm, to more strongly promote credible news sources. By implication, it deprioritized or down-ranked sources publishing disinformation and divisive or hateful content. Facebook did this to slow the spread of disinformation claiming election fraud and a stolen presidential election. However, it was too little and too late, lasting only five days after many months of spreading lies about the election. Nonetheless, while the adjusted algorithm was in effect, Facebook engagement for credible sources such as the New York Times, CNN, and NPR spiked, while engagement dropped for extreme right-wing sources as well as for hyper-partisan left-wing sources.
Some Facebook staff pushed to make the algorithm change permanent, but were overruled by Facebook’s senior management, including Joel Kaplan, a Republican operative who had previously intervened on behalf of right-wing sources and the Facebook algorithm that promotes them. Moreover, as Facebook returned to “normal” operation, Facebook also eliminated its civic-integrity unit.
After the January 6, 2021, insurrection at the U.S. Capitol, Meta and Facebook again “broke the glass” and instituted more preferential promotion for credible news sources, but again, only for a few days.
Many concerned people from across the globe and from all walks of life – from policy makers to advocates to marginalized people – are calling on Facebook and other social media platforms, including Instagram (also owned by Facebook’s parent Meta), to take three steps:
- Be transparent: disclose business models, algorithms, and content moderation practices; and release internal data on the effects and harms of the current mode of operation. This would allow independent verification of whether content amplification and moderation are effectively combatting disinformation, protecting elections and democracy, and keeping people, especially young people and children, safe.
- Change content promotion algorithms: stop preferential promotion of the most incendiary, hateful, and harmful content to the most vulnerable audiences.
- Protect all people equally: bolster content moderation to protect all people, especially marginalized and vulnerable groups, in all countries and all languages.
Facebook and the other social media companies won’t do this on their own. Without government regulation, they will continue to put profits before social responsibility. We must take steps to reduce the disinformation and divisiveness spread by Facebook and other social media platforms. Doing so is critical to the well-being of all of us, especially our children, and to the well-being of society and democracy. Government regulation clearly has to be an important part of the answer.
I encourage you to contact President Biden and your members of Congress. Tell them you want strong regulation of Facebook and other social media platforms, including requirements to implement the three steps outlined above. (See this previous post for more on fixes for the harmful behavior of Facebook and other social media platforms.)
You can email President Biden at http://www.whitehouse.gov/contact/submit-questions-and-comments or you can call the White House comment line at 202-456-1111 or the switchboard at 202-456-1414.
You can find contact information for your U.S. Representative at http://www.house.gov/representatives/find/ and for your U.S. Senators at http://www.senate.gov/general/contact_information/senators_cfm.cfm.
Legum, J., 5/6/21, “Facebook’s problem isn’t Trump – it’s the algorithm,” Popular Information (https://popular.info/p/facebooks-problem-isnt-trump-its)
Change the Terms Coalition, retrieved 5/2/22, https://www.changetheterms.org/