9 min read

6 Reasons to quit Facebook in 2021


On April 1, 2021, Socialike quit Facebook. We removed the tracking pixel from our website and deactivated our Facebook Page. These were big steps for a marketing agency to take, but we knew it was the right decision. So, is quitting Facebook in 2021 the right decision for you and your business?

Here are our top six reasons why you should think about quitting Facebook and unlike the social media platform for good. 

Facebook collects your data. A LOT.

You’re probably already well aware that Facebook collects data from you. But, are you aware of just how much data they actually track?

To see how much data is actually being collected by Facebook, you just need to look at Apple's Privacy Labels on the App Store. There, you'll clearly see that Facebook collects a very large amount of your data. Facebook doesn't just track your birthday, likes, friends, and activity on the app; it also collects your browsing history, location, and search history. If you think about just how much you use your phone, that's a large amount of data (a lot of which you probably want to keep private).

Yes, it’s possible to limit the data that Facebook collects about you, but are you really comfortable supporting a company that wants to know every single thing about you so it can maximize its ability to advertise to you and keep you addicted to its platform?

Facebook-owned WhatsApp shares your data with Facebook

Besides tracking your browsing history, location, and search history, Facebook makes sure it gets even more data about you by collecting data from the other apps it owns.

If you’re living outside of the EU and UK (where users are protected by privacy regulation, namely the GDPR), the Facebook-owned app WhatsApp shares your data with Facebook to see who you’re connected to.

Facebook claims that this update “does not change WhatsApp’s data-sharing practices with Facebook and does not impact how people communicate privately with friends or family.” However, the move is still controversial: when Facebook announced its purchase of WhatsApp in 2014, it promised this was exactly what it would not do, showing that the company is willing to break its word.

Facebook also denies claims that it uses this data to optimize its advertising to you. However, if you take a look at WhatsApp's FAQ section on the matter, you'll see that it’s worded very delicately, indicating that the only reason it’s not doing so yet is that it still needs to reach agreements with data commissions on how the data can be used.

Facebook tracks you as you browse the internet and use other apps 

But wait, the data tracking doesn’t stop there. Facebook also tracks you as you visit other apps and websites (at this point you’re probably wondering what they’re not tracking). Facebook and Apple have had some tension over this type of tracking, with Apple planning to launch an anti-tracking feature that spells the end of the identifier for advertisers (IDFA). 

Apple’s reasoning behind this is that transparency is key. But the real question is, why does Apple need to step in and make such features available? There’s arguably no other company with access to as vast an amount of data as Facebook, so in reality, the feature is almost Apple’s way of saying that Facebook’s appetite for data has spiraled out of control.

Facebook says the loss of this tracking will impact its SMB advertisers (which is true — read our blog about 5 alternatives here), but either way, this is just another example of how much data Facebook really is able to get its hands on.

Facebook algorithms favor controversy, misinformation, and extremism 

So, if Facebook's practices around data weren’t enough to make you consider whether it’s time to leave the platform, its algorithm might make you think again.

Put simply, Facebook wants to maximize engagement. Pay incentives at Facebook are tied to engagement and growth metrics. When Mark Zuckerberg meets with those in charge at Facebook, engagement is most likely what’s on their minds.

Enter the Facebook algorithm. This is the algorithm that determines what you see in your newsfeed. Unsurprisingly, it’s designed to maximize engagement. People tend to share and engage with content that outrages or excites them (the same reason they can’t stop watching reality TV or sharing memes that seem ridiculous).

Disappointingly, this means that content promoting hate speech or fake news receives the most engagement. When someone posts something that seems downright outrageous, you’re more likely to share it with a friend or comment on it. In fact, the more likely a post is to violate Facebook’s community standards (think fake news and hate speech), the more user engagement it receives, creating a platform that thrives off content that polarizes people and spreads misinformation.

Facebook has been involved in politically and radically motivated movements such as the Capitol Riots of 2021 and the Myanmar Genocide  

If you didn’t think the Facebook algorithm could have real-life consequences, you might be shocked to learn the extent to which it already has.

In late 2018, the company admitted it had helped fuel a genocidal anti-Muslim campaign in Myanmar for several years. Publishing an independent human rights impact assessment on the role Facebook played in Myanmar, the company stated, “The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”

More recently, a study from New York University found that among partisan publishers’ Facebook pages, those that regularly posted political misinformation received the most engagement in the lead-up to the 2020 US presidential election and the 2021 Capitol riots.

Taken together, these are just two scary examples of how hate can spread on the platform. In 2021, any company having this kind of impact on hate speech and violence around the world is one worth walking away from.

Facebook is known to misrepresent its data

Finally, to finish off our list, Facebook likes to misrepresent data. A 2018 lawsuit revealed that Facebook had been aware it was overestimating how many people advertisers could reach, but did nothing about it in order to keep making money.

For years, Facebook overstated its video statistics, discounting views of less than three seconds from its average duration figures, which had the effect of inflating the amount of time it could plausibly claim people spent watching its videos. This meant that advertisers kept pumping their money into Facebook. One product owner at Facebook even told the company it was making revenue it “should never have” off of “wrong data”.

According to the lawsuit, Facebook’s senior executives knew for years that this was a problem and actually tried to conceal it, only to be embarrassed when court documents revealed just how much they had known all along.