
Social Media and Public Safety


Conspiracy theories have long been a part of the internet and social media. Occasionally a particularly outrageous theory about one celebrity or another gains enough traction to be promoted by algorithms, but for the most part these theories remain within the small communities where they originate. In Federalist No. 10, written to encourage ratification of the U.S. Constitution, James Madison warned against the dangers of faction but argued that in a large republic with many competing opinions, small factions would be outvoted and contained. Internet algorithms have broken down this safeguard. Algorithms often prioritize views outside the mainstream because those views provoke strong reactions, both positive and negative. As a result, fringe views circulate far more widely than they would without social media. Minority opinions can take precedence over majority views, or even over widely accepted facts, and erode the safety of society.


One clear example has been health guidance during the coronavirus pandemic. As the news has come to seem less trustworthy in an increasingly partisan country, people turn to friends, family, and fellow online users for information. According to the Kaiser Family Foundation in 2021, Americans are growing less and less confident in the information the news provides them:

“Overall, there is no news source that garners trust from a majority of the public on the topic of COVID-19. At the top of the list, nearly half say they have “a great deal” or “a fair amount” of trust in COVID-19 information that they see or hear on their local TV news station (47%) and on network news like ABC, NBC, and CBS (45%). About a third put a similar level of trust in information they see on CNN (36%), MSNBC (33%), and NPR (32%) while three in ten say the same about Fox News (29%). A smaller share put at least a fair amount of trust in COVID-19 information from One America News and Newsmax (13% each).” (KFF)

This is noteworthy because people who look to social media for seemingly unbiased reporting instead find the most 'engaging' content, whatever the companies involved deem most profitable. When users do not see a range of different opinions and views, they become more inclined to follow what appears to be a majority view. In the case of coronavirus, this can produce widespread mistrust of vaccines even when the actual opposition is comparatively small. Unfortunately, posts that drive controversy, especially on hot topics like vaccines, simply generate more engagement on social media platforms. Companies like Facebook and Twitter therefore have an incentive to promote these posts, often at the expense of the quality of the information their users receive.


These problems are manifesting in very dangerous ways, especially with respect to the safety of society as a whole. According to the Kaiser Family Foundation's 2021 study, people who get information about vaccines from social media are less likely to get vaccinated than those who get vaccine information from news sources:


“Smaller shares of those who say they definitely will not get the vaccine and those who say they want to “wait and see” before getting the vaccine say they have gotten at least a fair amount of information about it from cable news or network news. Indeed, those who say they want to “wait and see” or who say they definitely will not get the vaccine are somewhat more likely to say they have gotten information about the vaccine from social media (37% and 40% respectively) than those who are more enthusiastic about getting the vaccine (25%).” (KFF)


Hamel, Liz, and Lunna Lopes. "KFF COVID-19 Vaccine Monitor: Media and Misinformation." KFF, 16 Nov. 2021.

This suggests that if everyone in the United States were receiving the same edited, reviewed information, a greater percentage of the population would be vaccinated. The reality is otherwise. Well over a third of people who are reluctant to get vaccinated rely on social media to inform their decision. Unlike a newspaper or news program, whose editorial function exists to inform its audience, the editorial role on social media is filled partly by the user's interests and partly by the platform's algorithm and what it predicts will engage that user. These different goals lead to very different results.


This is worrisome. The autonomy of social media companies can lead to large-scale public health consequences, driven not just by personal opinions but by untrue information being pushed at consumers.


A devastating example of this is Spotify's 2022 deal with podcaster Joe Rogan. Spotify has spent $100 million acquiring Rogan's podcast, now one of the most popular podcasts in the world. Rogan has voiced many opinions on the vaccine and on other coronavirus prevention measures, and has been handsomely rewarded financially for his controversial point of view. Among his claims: that recovering from COVID confers permanent immunity, that masking is ineffective, that the public has been "hypnotized" by public health messaging, and even that vaccines have reproductive consequences. Spotify is aware that much of Rogan's content is false and potentially damaging; it has removed many episodes for exactly that reason. Yet the company remains perfectly willing to host and promote the podcast. This demonstrates the power media companies hold and why their cost-benefit analysis is so skewed: the costs fall on consumers in the form of worse public health outcomes driven by misinformation.


It is clear that we need to restrict some of the power wielded by social media companies, whether through regulation of disinformation or through antitrust laws that break up the companies themselves.

Social media companies claim not to be arbiters of truth, yet they knowingly promote untruths, and they need to be stopped.
