One of the ways I try to get people to understand just how wrong feeds from places like Facebook are is to think about Wikipedia. When you go to a page, you’re seeing the same thing as other people. So it’s one of the few things online that we at least hold in common.
Now just imagine for a second that Wikipedia said, “We’re gonna give each person a different customized definition, and we’re gonna be paid by people for that.” So, Wikipedia would be spying on you. Wikipedia would calculate, “What’s the thing I can do to get this person to change a little bit on behalf of some commercial interest?” Right? And then it would change the entry.
Can you imagine that? Well, you should be able to, because that’s exactly what’s happening on Facebook. It’s exactly what’s happening in your YouTube feed.
—Jaron Lanier, from the documentary The Social Dilemma
This is not the blogpost I originally started writing.
The first draft of my blogpost is quoted below:
As I lie on the sofa in my darkened apartment, listening to an LGBTQ “Queeraoke” room in Clubhouse (and wondering if I have the audacity to inflict my pitchy tenor voice on the assembly), it occurs to me that my relationship with social media has evolved significantly since I started this blog, a little over four years ago.
I don’t kid myself; my divorce from Facebook (not so much a single event as a series of steps) led not to a reduction in my use of social media, but to an overall increase, something about which I have strong mixed feelings. (It would appear that I am not alone in this: I have noticed a significant uptick in recent views of a blogpost I wrote about Jaron Lanier’s 10 reasons to quit social media, according to my WordPress blog statistics.)
Spending so much of my time in social isolation since the pandemic started 20 months ago, I find myself spending varying amounts of time every day on five wildly disparate social media platforms: Twitter, YouTube, Reddit, Discord, and (the newcomer) Clubhouse. I tell myself that it helps me stay connected to other people, but I also
And then, like so many other blogposts I write, I set it aside, literally mid-sentence, to complete on another day, when the muse struck.
Well, today is another day.
And it is the day I started watching a one-and-a-half-hour documentary on Netflix, which is also available to watch for free on YouTube: The Social Dilemma. And, as it happens, Jaron Lanier also appears in this particular documentary—along with two dozen other experts, many of them executives who formerly held high-ranking positions at social media companies like Facebook, Twitter, and Pinterest.
I full well realize the irony in asking you to watch a YouTube video on social media addiction (given the platform’s at-times-scarily accurate recommendation engine, algorithmically designed to keep you viewing long past your bedtime), but I would urge you to set aside 93 minutes and 42 seconds of your time, and watch this documentary. It is eye-opening, it is disturbing, and it is a wake-up call.
One shocking thing I learned from this documentary is that even the people who designed, created, and tweaked the algorithms that glue us to our cellphones are themselves addicted to social media and subject to its attendant ills (for example, a more divisive society and increasingly polarized politics).
We are participating in an experiment that is slowly but surely rewiring our brains in ways that we are only now starting to comprehend. Particularly disturbing is the impact that social media algorithms are having on children and teenagers, something once again brought to light by Facebook whistleblower Frances Haugen last week in her testimony to the U.S. Senate.
According to the video description on YouTube, The Social Dilemma was only supposed to be on YouTube until September 30th, 2021, but it’s still up as of today. I don’t know how long it will be available on YouTube, so if you don’t subscribe to Netflix, please don’t delay in watching this.
As I said up top, while I might be proud of my emancipation from Facebook, I have landed up spending more time—a lot more time—on other social media, notably Twitter, YouTube, Reddit, Clubhouse, and Discord. The pandemic (and its lockdowns and social distancing requirements) has only exacerbated the problem over the past 20 months. And I suspect that I am not alone in this.
I might be free of Facebook (which I consider the most egregious culprit), but I am still addicted to social media.
What used to be called “Oculus Connect” for many years, and then was renamed to “Facebook Connect” last year, is now suddenly just “Connect”. You have to scroll down, and hunt around a bit, to find any mention of Facebook on the homepage!
At first I just assumed that, like the renaming of Facebook Horizon to Horizon Worlds, it was a PR move to lessen the association of the now-problematic Facebook brand with the event. But it would appear that it’s more than that.
Facebook is planning to change its company name next week to reflect its focus on building the metaverse, according to a source with direct knowledge of the matter.
The coming name change, which CEO Mark Zuckerberg plans to talk about at the company’s annual Connect conference on October 28th, but could unveil sooner, is meant to signal the tech giant’s ambition to be known for more than social media and all the ills that entails. The rebrand would likely position the blue Facebook app as one of many products under a parent company overseeing groups like Instagram, WhatsApp, Oculus, and more. A spokesperson for Facebook declined to comment for this story.
Facebook already has more than 10,000 employees building consumer hardware like AR glasses that Zuckerberg believes will eventually be as ubiquitous as smartphones. In July, he told The Verge that, over the next several years, “we will effectively transition from people seeing us as primarily being a social media company to being a metaverse company.”
A rebrand could also serve to further separate the futuristic work Zuckerberg is focused on from the intense scrutiny Facebook is currently under for the way its social platform operates today. A former employee turned whistleblower, Frances Haugen, recently leaked a trove of damning internal documents to The Wall Street Journal and testified about them before Congress. Antitrust regulators in the US and elsewhere are trying to break the company up, and public trust in how Facebook does business is falling.
But really, all this is is just a name change. The same fundamental problems that Facebook has are still there; slapping a fresh coat of paint on everything is not going to fix the fact that Facebook requires you to set up an account on its social network in order to use Oculus VR headsets going forward. More and more, people are realizing that it’s not a good idea to trust Facebook with your personal data. As I have written before on this blog:
Some will respond that Google, Apple, Amazon, and many other firms commit the same level of personal data vacuuming that Facebook does, which is true. However, I actually have more faith that those companies will at least not weaponize their data against me. Few companies have seen the level of public distrust rise as high as Facebook (and frankly, the company’s recent fight with Apple over the latter wanting to make transparent how much data Facebook collects on you is SO not a good look for Mark Z.).
Time and time again over the years, Facebook has shown that it cannot be trusted (see: the Cambridge Analytica scandal and the incitement of violence in Myanmar, to give just two relatively recent examples of egregious behaviour happening on the platform). Combine that lack of trust with its overweening ambitions, and you have a potentially serious problem.
I responded by voting with my feet and my wallet, deleting my Facebook and Oculus accounts, and vowing to never again purchase or participate in any Facebook/Oculus hardware and software, a decision which I explain here, and one which I continue to stand by in good conscience. I full well realize that I might be missing out, but I consider the price of admission to be too high (and frankly, too opaque). God knows how my personal data is being used, and Facebook’s track record frankly sucks.
I even went so far as to ask Facebook to delete all the data it had on me, but I also know that the Facebook social network probably has some sort of “shadow account” on me, based on things such as images uploaded to the social network and tagged with my name by friends and family who are still on Facebook. I am going to assume that Facebook has indeed done what I asked and removed my data from their social network. Frankly, there is no way for me to actually verify this, as consumers in Canada and the U.S. have zero rights over the data that companies like Facebook collect about them, as was vividly brought to life by Dr. David Carroll, whose dogged search for answers to how his personal data was misused in the Cambridge Analytica scandal played a focal role in the Netflix documentary The Great Hack (which, by the way, I very strongly recommend you watch).
And need I remind you that the January 6th, 2021 insurrectionists in Washington, D.C. also used Facebook to help organize? Not to mention the misinformation, disinformation, and crazy conspiracy theories about COVID-19 and vaccines circulating on the platform (although this is a problem on other social media as well). The Facebook social network, with its algorithms, has become a toxic cesspool, and anything that touches it, or (in the case of Oculus) is forcibly integrated with it, becomes tainted by association.
So no, a name change is not enough—not nearly enough.
UPDATE 1:45 p.m.: Of course, Twitter is all over the Facebook rebranding news with its trademark snark. Here’s just a sample of the responses in my feed today:
And perhaps most toxic of all is the radioactive waste left by the Cambridge Analytica scandal. A new shareholder lawsuit, filed in Delaware, based on freshly disclosed internal documents, claims it can prove that Facebook senior executives and board members lied to investors. If it can do that, it will set in motion a chain of consequences that will make Enron look like a teddy bears’ picnic.
Three years ago, Sandy Parakilas, an earlier Facebook whistleblower, explained to me the power of the SEC, which regulates the financial markets, by telling me that in America, money will enable you to get away with most things. “But the one thing you can’t do,” he said, “is to fuck with our capitalism.”
The UN found Facebook helped facilitate a genocide in Myanmar. We know that it helped foment an insurrection at the US Capitol. And its own research says it is harming teenagers. (A 2019 Facebook presentation slide, just revealed, said: “We make body-image issues worse for one in three teenage girls.”)
That’s all fine, it turns out, but if this suit can prove it’s lied to investors, someone is going to jail. If I were a Facebook employee, I’d be browsing the whistleblower section of the SEC’s website, which grants immunity from prosecution, very, very carefully.
In other words, both the Delaware and D.C. lawsuits mean that Facebook is in serious, serious trouble—no matter what they call themselves.