Misinformation in Troubling Times

We live in an age where information is easier to access than ever before. We no longer have to travel miles, as Abraham Lincoln famously did, simply to get a book we haven't read. We have the ocean at our door and all the knowledge on the planet at our fingertips. While this access has nearly unlimited benefits, the negatives are nothing to neglect. In America, politics are incredibly polarized, and countless opinions on every topic are strewn across the web. Unfortunately, most of the time they're just that: opinions. In an endless sea of information (some anecdotal, some intensely researched, and so on), it is difficult to pinpoint what you can truly trust.

The term "fake news" was popularized recently, initially during the Trump administration, and it has stuck as the problem it names continues to grow. It no longer refers only to what you see on the nightly news or in the newspaper; it has been universalized to cover any misleading or outright incorrect information put into the world. That could be a video on YouTube, a comment on an Instagram post, a message from a friend in a group chat... the list goes on. Throughout the Covid-19 pandemic, awareness of this concept skyrocketed in importance. To many it may seem like common sense that misinformation is rampant in the media and that more research should always be done before making choices or forming concrete beliefs, yet many people are unfortunately naive or unreasonably trusting of certain sources or outlets. The biggest outlet for information on the pandemic was Facebook, with "more than 2 billion people [having viewed] authoritative information about COVID-19 and vaccines" on the platform. This is extremely troubling, as Facebook is a site built to share any and every opinion and idea under the sun. That sounds pleasantly expressive when you put it that way, but when people trust what they see wholeheartedly, it becomes a huge issue.

When a platform has a constant stream of information flowing through it, I believe it is that company's duty to regulate and fact-check that information to keep users safe and informed. This doesn't mean taking down false content, but rather doing a better job of flagging misleading or provably false "facts" that users may stumble across. This would include not only posts, but also advertisements and comments. In terms of the pandemic, I believe the only reason not to get the vaccine is an underlying condition that makes getting it potentially life-threatening. Other than that, there is no reason to avoid it, yet Facebook has been a hive of anti-vaccine content. If things were better regulated, I have no doubt that substantially more of the country would be vaccinated and there would be far fewer deaths.