Facebook – A Major Source of Political Disinformation

A440
Posts: 4793
Joined: Sat Nov 20, 2010 1:56 am

Facebook – A Major Source of Political Disinformation

Post by A440 » Tue Mar 09, 2021 12:40 pm

I know many people use Facebook. Personally, I use it only to keep in touch with people I know in Europe, and I suppress all news feeds. However, Facebook has turned out to be extremely problematic with regard to spreading and amplifying conspiracy theories and political disinformation.

According to the NYU-based group Cybersecurity for Democracy, far-right accounts known for spreading misinformation are not only thriving on Facebook, they're actually more successful than other kinds of accounts at getting likes, shares and other forms of user engagement:
"It's almost twice as much engagement per follower among the sources that have a reputation for spreading misinformation," said researcher Laura Edelson. "So, clearly, that portion of the news ecosystem is behaving very differently."
The research team used CrowdTangle, a Facebook-owned tool that measures engagement, to analyze more than 8 million posts from almost 3,000 news and information sources over a five-month period. Those sources were placed in one of five categories for partisanship — Far Right, Slightly Right, Center, Slightly Left, Far Left — using evaluations from Media Bias/Fact Check and NewsGuard.
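
For anyone who wants to sanity-check a figure like Edelson's "engagement per follower", here is a minimal sketch of how such a metric could be computed from a CrowdTangle-style export. This is not the study's actual code: the file name posts.csv and the column names (partisanship, likes, comments, shares, followers) are assumptions for illustration only, and pooling follower counts per post is a simplification of the researchers' methodology.

Code:
import csv
from collections import defaultdict

def engagement_per_follower(csv_path):
    """Aggregate interactions and follower counts per partisanship bucket."""
    totals = defaultdict(lambda: {"interactions": 0, "followers": 0})
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            bucket = row["partisanship"]  # e.g. Far Right ... Far Left
            interactions = (int(row["likes"]) + int(row["comments"])
                            + int(row["shares"]))
            totals[bucket]["interactions"] += interactions
            totals[bucket]["followers"] += int(row["followers"])
    # Ratio of total interactions to total follower counts in each bucket.
    return {bucket: t["interactions"] / t["followers"]
            for bucket, t in totals.items() if t["followers"]}

if __name__ == "__main__":
    for bucket, rate in sorted(engagement_per_follower("posts.csv").items()):
        print(f"{bucket}: {rate:.4f} interactions per follower")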

[Attached image: Facebook disinformation chart.png]
See: https://www.npr.org/2021/03/06/974394783/far-right-misinformation-is-thriving-on-facebook-a-new-study-shows-just-how-much

There is also a growing problem with the proliferation of pseudoscience content. According to a recent CTV News article, "Facebook communities peddling COVID-19 misinformation have grown by 48 per cent: report", Facebook communities pushing misinformation about COVID-19 vaccines in Canada have grown by 48 per cent in the past six months.
See: https://www.ctvnews.ca/sci-tech/facebook-communities-peddling-covid-19-misinformation-have-grown-by-48-per-cent-report-1.5337387

Mind you, these two reports do not even cover past problems such as those reported here: https://www.nytimes.com/2018/12/18/technology/facebook-privacy.html
Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages.
This is ironic, since Facebook News, sponsored by Facebook itself, has a pronounced left bias and a much higher rating for factual content, yet that quality is not reflected in the blatant disinformation promoted within the platform itself. See: https://mediabiasfactcheck.com/facebook-news/

As Edelson stated, "I think what's very clear is that Facebook has a misinformation problem. I think any system that attempts to promote the most engaging content, from what we can tell, will wind up promoting misinformation."

Please feel free to rate this site accordingly: https://www.mywot.com/scorecard/facebook.com

A440
Posts: 4793
Joined: Sat Nov 20, 2010 1:56 am

Re: Facebook – An addendum

Post by A440 » Wed Apr 14, 2021 7:25 am

This Guardian article is about former Facebook data scientist Sophie Zhang's battle to combat rampant platform manipulation while executives delayed and deflected. Before her departure, Ms. Zhang wrote a 6,600-word internal memo detailing how the social network knew about specific examples of global political manipulation and failed to act. Facebook also actively attempted to suppress Ms. Zhang's memo, which demonstrates deliberate bad faith on the part of Facebook management.
. . . in the 2.5 years I’ve spent at Facebook, I’ve … found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions. . . I tried to fix this problem within Facebook … I spoke to my manager, my manager’s manager, different teams, and everyone up to a company vice-president in great detail. I repeatedly tried to get people to fix things … I offered to stay on for free after they fired me, and they said no. I hoped that when I made my departure post it might convince people to change things, but it hasn’t.
https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang
https://www.buzzfeednews.com/article/craigsilverman/facebook-ignore-political-manipulation-whistleblower-memo


Again, this is more than enough reason to consider Facebook an unreliable site and service.

A440
Posts: 4793
Joined: Sat Nov 20, 2010 1:56 am

Re: Facebook – addendum 02

Post by A440 » Sat Apr 17, 2021 7:38 am

Facebook could have stopped 10 billion impressions from "repeat misinformers", but didn't: report

https://www.salon.com/2021/04/12/facebook-could-have-stopped-10-billion-impressions-from-repeat-misinformers-but-didnt-report/
Released by the online advocacy group Avaaz, the report argues that if Facebook had not waited until October (roughly one month before Election Day) before altering its algorithm to reduce the visibility of inaccurate and hateful content, it could have stopped roughly 10.1 billion views from accumulating on 100 pages that frequently disseminated misinformation in the eight months prior to the 2020 election.
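As a rough sense of scale (my own back-of-the-envelope arithmetic, not a figure from the Avaaz report), 10.1 billion views across 100 pages over roughly eight months averages out to well over ten million views per page per month:

Code:
# Back-of-the-envelope scale check; not a figure taken from the report.
views, pages, months = 10.1e9, 100, 8
print(f"~{views / pages / months / 1e6:.1f} million views per page per month")
# Prints: ~12.6 million views per page per month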
Facebook, however, claims the report misrepresents its work, responding:
This report distorts the serious work we've been doing to fight violent extremism and misinformation on our platform. Avaaz uses a flawed methodology to make people think that just because a Page shares a piece of fact-checked content, all the content on that Page is problematic.
That response is interesting, considering that others might just as easily consider Facebook's own methodology to be flawed.
