Academics who researched misinformation on Facebook…are banned by Facebook

Facebook has banned the personal accounts of academics who researched ad transparency and the spread of misinformation on the social network. Facebook says the group violated its terms of service by scraping user data without permission. But the academics say they are being silenced for exposing problems on Facebook’s platform.

The researchers were part of NYU Ad Observatory, a project created to examine the origin and spread of political ads on Facebook. As the group explained in a blog post in May, their aim is to uncover who pays for political ads and how they are being targeted. Such work has important implications for understanding the spread of disinformation on Facebook, as the company does not fact-check political ads.

Fact-checking doesn’t have to be an all-inclusive guidebook through the meanders of Facebook. One or a few essential points, clear even to a collegiate AI, would be sufficient to demonstrate an intent to maintain honesty. Advances can and will be made by the growing number of geeks in this land who are concerned with misinformation. Associating profiteering from lies with technology defeats many of the premises of the “open internet” that most of us began with.

6 thoughts on “Academics who researched misinformation on Facebook…are banned by Facebook”

  1. 4theRecord says:

    “Just 12 People Are Behind Most Vaccine Hoaxes On Social Media, Research Shows” (NPR) See also CCDH report
    “The Most Influential Spreader of Coronavirus Misinformation Online: Researchers and regulators say Joseph Mercola, an osteopathic physician, creates and profits from misleading claims about Covid-19 vaccines.” (NYT)
    “10 types of COVID-19 vaccine misinformation swirling online, fact-checked” (PolitiFact 7/26/21)
    “Real-time reporting on COVID-19 misinformation.” (NewsGuard)

  2. p/s says:

    Facebook blocks research into political ads, falsely blames FTC privacy order: FTC says Facebook privacy settlement doesn’t require blocking researchers.
    AdObserver is a project of Cybersecurity for Democracy at New York University’s Tandon School of Engineering. This extension was originally developed by researchers from the Algorithmic Transparency Institute, Quartz, New York University, and the University of Grenoble. Technical advice was also provided by ProPublica, WhoTargetsMe, and The Globe And Mail.

  3. Will C. says:

    Facebook may finally be acknowledging that its handling of elections around the world has been less than stellar. And this time, the company’s response could amount to more than just another apology from Mark Zuckerberg.
    The social media company is considering creating an “election commission” that would guide it on election-related issues around the world, according to a report in The New York Times. The commission would advise Facebook on everything from disinformation to political advertising, and if implemented, the change could be a boon for the company’s public relations. The commission would ideally also take some heat off CEO Mark Zuckerberg, who reportedly doesn’t want to be the “sole decision maker on political content,” the Times reports.
    A Facebook spokesperson declined to comment for this story when contacted by Ars Technica.

  4. Pedant says:

    “Experiment with Facebook-flagged content shows groups of laypeople reliably rate stories as effectively as fact-checkers do.” (Massachusetts Institute of Technology)
    “Scaling up fact-checking using the wisdom of crowds”
    “Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect: like a man, who hath thought of a good repartee when the discourse is changed, or the company parted; or like a physician, who hath found out an infallible medicine, after the patient is dead.” Jonathan Swift, The Examiner No. XIV (Thursday, November 9th, 1710)

  5. Oopsie says:

    “Facebook apologized to misinformation researchers for providing them with flawed, incomplete data for their work examining how users interact with posts and links on its platform, the New York Times reported. Contrary to what the company told the researchers, the data Facebook provided apparently only included information for roughly half of its users in the US, not all of them.
    The Times reported that members of Facebook’s Open Research and Transparency team held a call with researchers on Friday to apologize for the error. Some of the researchers questioned whether the mistake was intentional to sabotage the research, or simply an instance of negligence.”
    (NYT): “More than three years ago, Mark Zuckerberg of Facebook trumpeted a plan to share data with researchers about how people interacted with posts and links on the social network, so that the academics could study misinformation on the site. Researchers have used the data for the past two years for numerous studies examining the spread of false and misleading information.
    But the information shared by Facebook had a major flaw, according to internal emails and interviews with the researchers.”
