Everyone here seems to think FB is some horrible, unethical company, but what, exactly, have they done? It seems to me that FB is just a platform that lets anyone connect with and talk to anyone else and post whatever they want, paired with something like a Netflix recommendation algorithm that guesses which other posts people will be most likely to want to see. And, yes, this can have negative effects, but how is it FB's responsibility if people want to see negative things and interact negatively with other people?
Facebook is an ad company that monetizes social interaction and manipulates the social sharing of information for profit. It isn't just that people interact negatively; it's that negative interactions are directly profitable to Facebook, they know this, and they act on it.
Then what makes Facebook different from any other social network? If this is what makes a company evil, then why aren't people up in arms about Reddit?
None of them (or at least none of the big ones) are any good. Truth be told, I think a lot of the people up in arms over Facebook but quiet about Reddit and Twitter use both Reddit and Twitter and don't want to stop using them, whereas they don't use Facebook. It's easier to stand on the soapbox when it has no effect on you personally.
That said, none of this makes Facebook good, or not evil. It's perhaps the worst offender of the lot.
The argument is usually that non-linear effects of the size of their network make the potential or actual harm Facebook does much greater than anyone else's, so it makes sense to focus on them.
Calling them "evil" can be useful to mobilise people, but it clearly makes the conversation a lot less nuanced than it needs to be, and it's unfortunate that people often use that type of language.
Because Reddit and Facebook really aren't all that similar.
Facebook is like the "book club" that is really a front for gossiping and drinking. You probably know the people you're talking to personally and it's more a conversation with them (that other people might be able to see).
Reddit is like a community message board. You don't know these people personally, so you come with a built-in skepticism toward what they say. It's mainly about sharing links to somewhere else, and you can interact with it without reading anyone's comments.
Social media as an umbrella term really doesn't work because Twitter, Facebook, TikTok, Reddit, YouTube, etc all have different ways you interact with them. It would be like saying network TV, the newspaper, the community message board and a guy with a megaphone on the corner are all the same.
I'd disagree with the notion that people treat discussions on Reddit with more skepticism than Facebook. This is coming from someone who was a daily user for over a decade. Due to its low bar to entry and anonymous accounts, it's probably astroturfed more than Facebook, and such accounts are harder to distinguish.
You're probably right about astroturfing. It's part of the reason the Hail Corporate subreddit exists.
I'd be interested to see the ratio of lurkers to posters. Reddit is a link aggregator; its primary utility is unrelated to the comments. This isn't the case with Facebook.
Here are just a few examples of obviously unethical behavior by Facebook:
* Their own internal research shows that Instagram harms 1 in 3 of its teenage-girl users
* Lying to companies that paid to advertise on FB about the reach of their ads
* Letting advertisers discriminate based on FB users' race and disability status
* Letting Russian accounts pay for election ads in 2016 IN RUBLES
That's a lie. FB did not aid in anything. Google is not responsible for people who use Gmail to plan crimes, nor are ISPs responsible for the content of the packets that go over their wires.
Your comparisons are not even close: Email and ISPs are not amplifiers of content. They do not curate messages.
Facebook's algorithms absolutely were responsible: they amplified the propaganda that led to deaths. And their money-first, research-second approach to moderation is part of the reason.
Rather than turning off the service, they let it run wild.
The problem with this analogy is that Gmail and ISPs treat all traffic equally, whereas Facebook and other "engagement"-driven social media use algorithms to prioritize content that generates the most engagement (which often happens to be the most outrageous, offensive, or divisive content).
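To make the distinction concrete, here's a toy sketch of the two delivery models being contrasted. The posts and engagement scores are entirely invented for illustration; this is not Facebook's actual ranking system, just the general shape of "pipe vs. engagement feed":

```python
# Hypothetical posts with made-up engagement scores and arrival order.
posts = [
    {"text": "Local bake sale this weekend", "engagement": 12, "arrived": 3},
    {"text": "Outrageous political hot take", "engagement": 97, "arrived": 2},
    {"text": "Photos from a family hike", "engagement": 8, "arrived": 1},
]

# A "dumb pipe" (email/ISP model): deliver everything in arrival order,
# with no opinion about the content.
chronological = sorted(posts, key=lambda p: p["arrived"])

# An engagement-driven feed: surface whatever provokes the most reactions,
# which tends to reward the most divisive item.
engagement_ranked = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print(chronological[0]["text"])      # oldest post first
print(engagement_ranked[0]["text"])  # the divisive post wins the top slot
```

The editorial choice lives entirely in that one `key=` function: the pipe has none, the feed does.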
So by that logic, if a TV station chooses to broadcast content that propagates racist misinformation and calls for violence, but only does it because that's what they've algorithmically determined gives them the most viewership, is that fine?
And yet they are all responsible for reporting CSAM found on their servers. These platforms are not dumb pipes, the ability is already there to decide what is allowed and what is not.
I do tend to agree that FB gets more flak than seems reasonable; it's hard to expect any company with a profit incentive in this space to do much better. Though IMO Twitter is an example of better handling of misinformation, propaganda, toxic interactions, and more.
> it's hard to expect any company with a profit incentive in this space to do much better.
That’s crap. Facebook could easily choose to prioritise the long term health of society over their advertising profits today. Everyone with a seat at the table (and all the engineers on down) live somewhere between comfortably wealthy to obnoxiously wealthy. Zuckerberg and all the engineers involved are making active choices, every day, to line their pockets instead of cleaning up their act.
This is the thing that makes me sad about capitalism as a whole. The only incentive is to make more and more money, and if you have to do unethical things to do that, you do them, and you're rewarded for it.
It's just not that common for executives at a company to forego revenue in service of the common good. Especially at companies like FB, where exploiting the commons is what makes them money in the first place.
Bryan Cantrill had a great rant about this a few years ago. He made the point that if you go back to the '60s, the corporate values of basically every company had "integrity" right near the top. Plenty of capitalist companies, past and present, choose to act in ethical ways.
This isn't an essential problem with capitalism, any more than food poisoning is an essential problem with restaurants. The problem comes from how some people run their companies. Money gives you the power to do what you want. Whether or not they realise it, people in high-up positions at FB, Uber, and elsewhere are powerful. They have the capacity to choose what their companies do. Claiming their unethical behaviour is "the fault of capitalism" is like a plumber blaming weak building standards for their own shoddy work.
Do better.
And don't place the blame for unethical behaviour amongst our community on capitalism. People chose to build and maintain all of these systems. And people continue to choose to keep those systems running long after learning the harm they cause.
Twitter? That's the most toxic place on the internet. But I don't blame them for what their users want any more than I do FB.
As for the Wikipedia link, 90% of that stuff falls under my initial comment about people using the platform in ways other people wish they wouldn't, so I don't find it to be FB's fault. The privacy stuff I think is real, but it's so standard on the internet these days that it hardly justifies the vitriol.
Disagree with your second point. The fact that there's an entire industry of people profiting from the erosion of privacy doesn't make any one of them less deserving of their share of vitriol; if anything, I'd say the amount of vitriol each of them ought to experience (personally, in their day-to-day lives) should rise exponentially with the number of people doing the same.
Well, Facebook's own researchers wrote reports (cited last week in the WSJ's Facebook Papers articles) that said Instagram was worse for teenage girls' mental health than either Snapchat or TikTok. So there are gradations of how bad a for-profit social media company can be, and Facebook seems to be worse along several axes.
> it's hard to expect any company [...] to do much better.
That is on you. The simple fact of the matter is that they can do better, they should do better and because they are not doing better they must be punished until they do. That is how you deal with a beast that is actively malicious (it does wrong and it knows it is doing wrong, it does not care).
It’s not clear to me if Twitter has handled it better, or if they had a self selected audience that made their techniques more effective. I don’t see why attempting to flag misinformation would work on Twitter and fail on FB for reasons other than the people who use it.
I can think of several reasons why that would be the case.
The easiest one is just that Twitter might have more competent developers and product managers working on that problem than FB does.
The more sinister option is that FB knows that the kinds of things that end up being misinformation often drive engagement, so maybe they don't try as hard.
(Agreed, though, that it's not clear that Twitter handles this better than FB.)
You are of course encouraged to start reading the news if you have a voracious appetite for learning. In short though, they take everything they can from you and sell it to everyone willing to buy. This gives them an incentive to have you "engage" with content, of which some is actively harmful (but ostensibly encouraged because Facebook wants the effects that such content has).
> Everyone here seems to think FB is some horrible, unethical company
I don't think that's true. Most of us here see Facebook as a drunk 16-year-old with a set of car keys and Daddy's credit card. Not evil, just hopelessly reckless and lacking in self-awareness and self-control.
Their platform has toxic and negative effects on everyone from teenagers to adults participating in elections. And they know it's bad, and they cover up and lie about how much they know and when they knew it.
Facebook is designed to encourage and reward the kind of negative and harmful interactions and results people are noticing. There’s no reason a social network has to work the way Facebook does, other than maximizing revenue.
I agree it has become entirely irrational. I think it's the relentless media campaign starting with Cambridge Analytica, the narrative the intelligentsia left has constructed for itself around the Trump election and now Covid, combined with a new culture of conformity in which no one speaks up when people like Miguel de Icaza casually call for the entire executive rank to be jailed.