
You might have some real friends on Facebook. But Facebook isn’t one of them.
Facebook and Instagram use artificial intelligence and algorithms to learn our views on race, identity, religion, and politics. They don’t ask us directly about our views or interact with us in any meaningful way. Instead, they collect data from what we share, like, comment on, and engage with on their platforms.
They analyze the data and come up with a profile of each of us (conservative, male, Republican, pro-life; or liberal, female, Democrat, pro-choice). Based on that profile, they determine what content to send us. The content they send reinforces our views, solidifies our attitudes, and affirms our opinions.
Facebook knows which content pulls us in and which content we breeze over.
Facebook knows what we like, whom we like, and with whom we like to share.
What’s the danger in that?
What’s the danger of analyzing and understanding our behavior and then delivering content based on that understanding?
Isn’t that a good thing?
No, it is not.
And here’s why.
We share more about ourselves with data scientists at Facebook than with our priests in the confessional.
But the priest (in theory) wants to counsel and help us. Facebook wants to use us.
To Facebook, we are a commodity. And when you’re a commodity on a technology platform with a data-driven business model, you’re prone to exploitation and manipulation by powerful and self-serving individuals and institutions.
Facebook and Instagram are conduits for misinformation and lies. We saw this in real time with the Big Lie about a stolen election.
We felt it with the fire hose of misinformation about COVID-19 and its vaccine.
The people who consumed and bought into those lies are lost—perhaps forever. Tragically, they’re part of a growing community of people who believe in misinformation. As humans, we long for a sense of community—more so, it seems, than truth.
Mark Zuckerberg and the other executives who launched Facebook did not have bad intentions. They had a business model and the technology to make that business model successful.
What they should have accounted for were the consequences of their success.
Having categorized and codified us with cold, calculating algorithms, Facebook feeds our human desire to be with people who share our views while fueling our dislike of those who don’t.
Because of Facebook, our society is more divided, less trusting, and filled with more built-up animus than ever before.
We see the unintended consequences of technology and human nature smashing into one another.
That’s why I broke up with Facebook.
For me, the detriments far outweigh the benefits. Still, walking away is scary, because sometimes I think the best, and only, way to fight misinformation is to counter it with truth.
If lies and misinformation can spread fast on Facebook, why not use the same platform to spread the truth?
Many of us buy into that argument.
So, we get caught up in this endless battle with others. We live for hours at a time in an environment of constant combat and argument—we look for mistruths, engage the enemy, and fight the fight.
Post-to-Post combat.
Blood pressures rise.
Friendships get wrecked.
Family members are disowned.
Nothing gets solved. We just grow more agitated with those who don’t share our views.
We willfully retreat to our camps, losing empathy, trust, and any sense of what holds us together as a country and a society.
We lose our ability to compromise, and to converse coherently and intelligently with those with whom we disagree.
Facebook is toxic, destructive, and a danger to society.
We should turn away from it en masse.