#deletefacebook is a valid response to the spread of fake news
Maria Duque/Contributing Writer
Facebook has become powerful enough to sway elections and take the place of 24/7 news networks. With this position comes a responsibility to regulate content and ensure that what spreads on the platform does not lead to mass misinformation.
Facebook has refused to remove a video portraying Nancy Pelosi as intoxicated, even though the footage was proven to be doctored.
When I first heard why people were indignant at the company's decision, I had mixed feelings. As with so many debates about media and technology today, there is a tension between promoting free speech and promoting truth on social platforms.
My first reaction was to lean toward the side of protecting free speech. According to Facebook's community standards, the company does not remove or censor false news; it only reduces its distribution and circulation.
As a private company, I thought, it is fair for Facebook to act according to its own publicized guidelines. It should be as simple as that.
If only it were.
Facebook can no longer be categorized as just any private company. Close to one-third of the world's population uses Facebook, and for many of those users it is their main source of news. The platform has already been shown to have a huge influence on public opinion.
As much as Facebook may concern itself with allowing free speech, it needs to crack down on content that has been proven to be falsified.
Facebook needs to recognize that its role has changed from that of a purely social network to that of a forum where public opinion is formed. The company cannot allow those opinions to be shaped by such blatantly false content.
One of my biggest problems with the current political climate is that everything is twisted into a political issue. This particular incident with Pelosi has Democrats speaking out against Facebook, but the company recently cracked down on right-wing extremist groups, which prompted backlash from conservative politicians such as Ted Cruz.
There will always be politically charged content on Facebook, and the only way to ensure impartiality is to have clear guidelines outlining what content is not allowed.
All this being said, Facebook has its rules and standards, all of which are available for its users to read. The #deletefacebook movement is an apt response because it highlights the central issue in this debate: if you do not agree with Facebook's policies, don't use the platform.
As a kid with careful parents who steered me away from social media, I’ve never used Facebook, although I may have an inactive account floating around somewhere. There are enough ways to connect with family and friends and receive news without Facebook.
Even college students, who are notoriously obsessed with social media, would benefit from deleting the application.
At this point, Facebook does not seem like an important resource for college students. In my experience, few organizations use it as a major tool for communication.
Another aspect of the movement involves deleting Facebook-owned applications, including Instagram. This is where college students may hesitate.
Instagram is far more popular among college students, and it’s more important for keeping up with school activities.
When debating whether to also delete Instagram simply because it is owned by Facebook, I again weigh each app's practical use.
Facebook is no longer simply a social media platform, but Instagram still is. It does not have the level of political influence that Facebook has. It does not paint itself as a news source, so it does not need the strict regulation of content that Facebook does.
Users are making the right decision by withdrawing their support from Facebook until the company takes the initiative to protect truth on its platform.