According to a new study in the journal Vaccine, the majority of anti-vaccine advertisements on Facebook were paid for by just two organizations. The same study found that pro-vaccine advertisements were funded by 83 unique organizations.
The authors of the study explained:
In 2018, Facebook introduced Ad Archive as a platform to improve transparency in advertisements related to politics and “issues of national importance.” Vaccine-related Facebook advertising is publicly available for the first time. After measles outbreaks in the US brought renewed attention to the possible role of Facebook advertising in the spread of vaccine-related misinformation, Facebook announced steps to limit vaccine-related misinformation.
(You can conduct your own search using Facebook’s Ad Archive tool.)
The study analyzed 309 advertisements and found that 163 (53%) were pro-vaccine and 145 (47%) were anti-vaccine. Over half of the anti-vaccine advertisements (55%, to be specific) emphasized potential harm from vaccines.
Writing for The Guardian, Jessica Glenza reported:
The World Mercury Project chaired by Robert F Kennedy Jr, and Stop Mandatory Vaccinations, a project of campaigner Larry Cook, bought 54% of the anti-vaccine ads shown on the platform during the study period.
“Absolutely we were surprised,” said David Broniatowski, a professor of engineering at George Washington University, one of the authors of the report. “These two individuals were generating the majority of the content.”
Cook uses crowd-funding platforms to raise money for Facebook ads and his personal expenses. The crowd-funding platform GoFundMe banned Cook’s fundraisers in March 2019.
I’m not surprised. How could anyone be surprised? It’s a known fact that shitty people will do shitty things.
The reason this discussion is even happening (I refuse to use the word “debate” here) is known fraud and former doctor Andrew Wakefield, and his thoroughly debunked and retracted study that suggested a link between the MMR vaccine and autism. Study after study has shown that there is absolutely no link between vaccines and autism.
Of course, if you’re reading this, you’re probably well aware that vaccines are safe and effective. And if you’re on the fence or unsure, you can check out this other article I wrote that might address many of the concerns you have.
What makes this so interesting is the disparity of funding sources for these advertisements, as the study authors noted in their conclusion:
A small set of anti-vaccine advertisement buyers have leveraged Facebook advertisements to reach targeted audiences. By deeming all vaccine-related content an issue of “national importance,” Facebook has further politicized vaccines. The implementation of a blanket disclosure policy also limits which ads can successfully run on Facebook. Improving transparency and limiting misinformation should not be separate goals. Public health communication efforts should consider the potential impact on Facebook users’ vaccine attitudes and behaviors.
I’m not sure I agree with the claim of Facebook politicizing vaccines here, as I don’t consider “kids not dying from preventable diseases” a political issue. It’s more of an “I’m not a shitty person” issue to me, I guess.
I also don’t agree with the statement that “improving transparency and limiting misinformation should not be separate goals.” In my mind, the two issues are independent of one another, as they require different approaches and produce different outcomes. Transparency simply means giving people access to information, like creating a tool for laypeople to search advertisements. Limiting misinformation is quite a bit more complicated than creating a search tool for advertisements.
Misinformation (read: fake news) on Facebook is a nearly impossible problem for the company to solve. Facebook has made mistake after mistake in how it approaches this problem, and has generally acted only when under threat of intervention from Congress. One of those measures: Facebook has contracted with third-party sites to “fact check” articles that appear on its platform.
I’ve gotten caught in this “fact checking” web myself, with the satirical ‘Trump taps ‘Ancient Aliens’ guy as Secretary of Space Force’ article I posted last summer getting “debunked” by Snopes, which is one of the fact checkers Facebook has partnered with. My close friend James Schlarmann, who has written for this site, in addition to my Alternative Science site and the plethora of comedy sites he runs, has been “debunked” by Snopes no fewer than seven times, according to the search tool on their site. James offered his own fact check of the fact checkers, though, since their “fact check” of ‘Did Trump Order Facebook and Twitter to Require Users to Like and Follow Him?’ doesn’t show up in that search, which raises the question… Who fact checks the fact checkers?
Joking aside, Facebook didn’t create the fact checking policy to “debunk” satire; it was created in the wake of the 2016 presidential election cycle, in which Russian actors successfully influenced our national dialogue. I could go on about this, including Facebook’s ridiculous partnership with right-wing propaganda site Daily Caller. But for more detail on the issue of Facebook and fact checking, I highly encourage you to read this article by former Snopes editor, and all-around fantastic person, Brooke Binkowski about Facebook’s fact checking efforts.
What makes this particularly insidious, though, is the method by which advertisers can target people on Facebook. Bad actors like RFK Jr and Stop Mandatory Vaccinations (or, you know, the Russian government) can easily micro-target communities to sow distrust and encourage people to recede even further into their echo chambers. This contrasts with standard advertisements you’d see on TV or hear on the radio (for the Gen Zs reading this: listening to the radio is like listening to a podcast that you can’t control).
While large platforms like Facebook do need to do more to address the scourge of fake news, ultimately the responsibility falls on us as end users to combat it. It starts with doing our own mini fact checks before sharing an article or meme, especially if it’s making a bold statement that we strongly agree with. It’s kind of in the same vein as “if it sounds too good to be true, it probably is.” This is a passive method of fighting fake news: take a few seconds before posting to search for whether other reputable outlets are reporting the same or similar news. If the only source you can find is something like RealLiberalNewsForReal.TrumpSucks.com, what you’re about to post probably isn’t all that accurate.
A more active method of combating fake news is to call it out when you see it. And no, sadly, just shouting “FAKE NEWS!” at people isn’t enough, contrary to what you might see on television.
Instead of just telling people they’re wrong, provide sources. Show how the information they presented is inaccurate, and why. While you may not change the mind of the person who posted the misleading information, that shouldn’t necessarily be your goal in the first place. You’ll rarely be successful at that. Instead, your goal should be to influence those who are reading the post but not engaging in the comments. Just think about how many times you’ve scrolled through a comment section without commenting yourself.
Having constructive dialogue and pointing people to correct information is the only way we’re going to be able to replace bad information and bad thought processes with good ones.