
You might have some real friends on Facebook. But Facebook ain’t one of them.
Facebook and Instagram use artificial intelligence and algorithms to learn our views on race, identity, religion, and politics. They don’t come straight out and ask us about our views or interact with us in any meaningful way. Instead, they collect data from what we share, like, comment on, and engage with on their platforms.
They analyze that data and build a profile of each of us (conservative, male, Republican, pro-life; or liberal, female, Democrat, pro-choice), and based on that profile, they determine what content to send us. And the content they send us reinforces our views, solidifies our attitudes, and affirms our opinions.
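The loop described above, inferring a profile from engagement signals and then ranking content toward that profile, can be sketched in miniature. This is an illustrative toy, not Facebook's actual system: the topics, actions, and weights are all invented for the example.

```python
from collections import Counter

# Toy engagement log of (topic, action) pairs. Entirely invented data.
ENGAGEMENT = [
    ("politics", "like"), ("politics", "share"), ("politics", "comment"),
    ("sports", "like"), ("religion", "share"), ("politics", "like"),
]

# Assumption: heavier actions (sharing, commenting) signal stronger interest.
ACTION_WEIGHT = {"like": 1, "comment": 2, "share": 3}

def build_profile(log):
    """Score each topic by weighted engagement -- the inferred 'profile'."""
    profile = Counter()
    for topic, action in log:
        profile[topic] += ACTION_WEIGHT[action]
    return profile

def rank_feed(candidates, profile):
    """Surface first the content that matches what the user already engages with."""
    return sorted(candidates, key=lambda topic: profile[topic], reverse=True)

profile = build_profile(ENGAGEMENT)
feed = rank_feed(["religion", "sports", "politics", "cooking"], profile)
print(feed)  # politics lands on top, so it draws yet more engagement
```

The design choice is the point of the essay: nothing in this loop checks whether the content is true, only whether it was engaged with, so whatever you already respond to is what you are shown more of.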
Facebook knows which content pulls us in and which content we breeze over.
Facebook knows what we like, who we like, and with whom we like to share.
What’s the danger in that?
What’s the danger of analyzing and understanding our behavior and then delivering us content based on that understanding?
Isn’t that a good thing?
No, it is not.
And here’s why.
We share more about ourselves with data scientists at Facebook than with our priests in the confessional.
But the priest (in theory) wants to counsel and help us. Facebook wants to use us.
To Facebook, we are a commodity. And when you’re a commodity on a technology platform with a data-driven business model, you’re prone to exploitation and manipulation by powerful and self-serving individuals and institutions.
Facebook and Instagram are conduits for misinformation and lies. We saw it in real time with the Big Lie about a stolen election.
We felt it with the constant stream of misinformation about COVID-19 and the COVID-19 vaccine.
The people who consumed and bought into those lies are lost. Perhaps forever. Tragically, they’re part of a growing community of people who believe misinformation. And as humans, we long for a sense of community – more so, it seems, than truth.
I don’t think Mark Zuckerberg or the other executives who launched Facebook did so with bad intentions. They had a business model and the technology to make that business model successful.
What they didn’t account for was the consequence of their success.
Facebook’s cold, calculating algorithms categorize and codify us, feeding our human desire to be with people who share our views while fueling our dislike of those who don’t.
Because of Facebook, our society is more divided, less trusting, and more rancorous than ever before.
We are seeing the unintended consequences of technology and human nature smashing into one another.
That’s why I broke up with Facebook.
For me, the detriments far outweigh the benefits. Still, part of me hesitates to leave, because sometimes I think the best and only way to fight misinformation is to counter it with truth.
If lies and misinformation can spread fast on Facebook, why not use that platform to spread the truth?
I think many of us buy into that argument.
And so we get caught up in this endless battle with others. We live for hours at a time in an environment of constant combat and argument – we look for mistruth, engage the enemy, and fight the fight.
Post-to-Post combat.
Blood pressures rise.
Friendships get wrecked.
Family members are disowned.
Nothing gets solved. We just become agitated at those who don’t share our views.
We willfully retreat to our camps – we lose empathy – we lose trust – we lose any sense of the things that hold us together as a country and a society.
We lose our ability to compromise, and to argue coherently and intelligently with those with whom we disagree.
Facebook is toxic, destructive, and a danger to society.
We should turn away from it en masse.