
I understood the point. My point is that propaganda and its distribution channels have existed for as long as human speech has. A new distribution channel isn't the problem; the problem is a society that cannot think for itself and distinguish between what is real and what is not.

Blaming the platform is scapegoating the real problem.



Moderating the platform seems like a more tractable problem than changing what are effectively hard-wired aspects of human neurology. It’s not about blame; it’s about engineering a cost-efficient solution.


I don’t think people are much worse at thinking for themselves these days, and to the extent that they are, it’s probably fallout from social media. I would like for people to have stronger independent thinking skills (just like I would like us to have infallible immune systems and no proclivity toward cancer and so on), but that’s all wishful thinking. Since we can’t do very much to make our society more immune to social media, we should change social media so it doesn’t wreak havoc on our society.


I suppose my point and probably the point of the other commenter above is that social media companies utilizing content recommendation systems are not merely distribution channels, by virtue of recommending content. Those recommendations go beyond distribution.

And to be clear, I am blaming corporations for creating these platforms, not blaming the platforms themselves.


I hear you. I agree that there's an incentive for platforms to serve content you want to see and will engage with, but is it Facebook's responsibility to stop you from becoming more extreme? How do you even define extreme? There are obvious answers, but I'm sure there are a lot of less obvious ones too that are impossible to police.


Yes, Facebook has a dial, and they know they can turn that dial to make us more extreme (a byproduct of generating engagement). Personal responsibility is a lovely thing, but humanity isn’t just going to become more personally responsible overnight, so if we’re going to save our society we have to look at the options available to us in reality and not those we wish we had (specifically, an extra helping of personal responsibility). We do this all over: we don’t allow the sale of many harmful addictive substances even though one’s health, finances, etc. are one’s own responsibility. We also regulate casinos and tobacco and alcohol. There’s certainly no reason why we can’t regulate social media.


You don't know they have a dial; everyone is taking one sensationalized (and overly dramatic) documentary and treating it as gospel. Portraying it as a "dial" also does a complete disservice to the actual technical problem involved.

The documentary succeeded in underplaying the challenges of moderating/recommending at scale and, in my opinion, in scapegoating social media companies as the source of the problem when they're really a symptom.

The main focal point of The Social Dilemma, Tristan Harris, as creator of the Center for Humane Technology, has an obvious agenda (not saying he's wrong). Take the documentary for what it is, but throwing the baby out with the bathwater is wrong when there are obvious benefits to social media.

Regulating social media is a fool's errand imo; it's a lose-lose. Either you let ideas flow, which allows bad actors/ideas to propagate, or you create gatekeepers and censors with ever-moving goalposts. I'm not against more regulation, but do you really think the United States Congress is capable of passing legislation that effectively toes that line? Doubtful.


It's not a literal dial--it's an analogy, and yes, it's oversimplified (the complexity of the implementation has no bearing on this debate), and the documentary is merely a touchstone. A great deal has been written about the subject, with many first-hand accounts. Moreover, as discussed elsewhere, these networks have so much power that they can unilaterally influence democratic elections--at least that's certainly the necessary implication if you believe that Russia was able to indirectly influence these curation algorithms to hack the 2016 election (if Russia could manipulate these algorithms indirectly, then how much more power must Jack and Mark have, given their direct access?).

> Regulating social media is a fool's errand imo; it's a lose-lose. Either you let ideas flow, which allows bad actors/ideas to propagate, or you create gatekeepers and censors with ever-moving goalposts. I'm not against more regulation, but do you really think the United States Congress is capable of passing legislation that effectively toes that line? Doubtful.

This is just a generic argument against free speech. The obvious problem is that there's no way to ensure that our censors are going to be good actors, and in particular we know with some degree of certainty that Twitter, Facebook, etc are not. Congress (or whomever) doesn't have to toe that line at all--regulating these businesses out of existence is strictly a better option than allowing them to continue poisoning our society. No doubt they deliver some value, but (1) much of that value could be realized through other means (people can still organize on web fora like they did in the brief years prior to social media proper) and (2) they certainly don't deliver enough value to justify the rapid erosion of our social and political fabric. So the worst thing we can do is continue on with the status quo.

That said, I think it's entirely reasonable that we could be more surgical about regulation. There's no reason we can't keep some of the benefits of social media while doing away with the immense costs. For one, we can require social media companies to speak an open protocol such that anyone can compete--not just ad-based businesses with established large networks. We could require their curation algorithms to be made transparent. We could require that they behave as dumb pipes, but they may afford their users mechanisms to curate their own feeds. I'm sure there are many other solutions as well, but again, we oughtn't defer action until we find the best option because we know the status quo is strictly the worst option.


It’s hard to deny that social media is an entirely new way of disseminating content. There is an algorithm that determines what you see. It’s well known how that algorithm encourages echo chambers and how the internet itself changes the way we think. It’s not just “people without critical thinking have always existed”.


No, getting content shoved into your face without human review is definitely a new problem. At least before recommendation algorithms, you were either looking up something you specifically sought out, or someone took on the publishing liability to recommend you something.


You're still abdicating personal responsibility. Why is it Facebook's responsibility to keep YOU from becoming more extreme? I'm just playing devil's advocate here because the easy position right now is to blame social media.


It’s the same as asking why the government should control the abuse of meth and heroin. These drugs, along with social media, exploit certain aspects of how our brains work to make us addicted and alter our state of mind.


Yet we're seeing decriminalization of drugs, needle exchanges, etc. that counter your argument. The world is becoming more liberal toward drugs because criminalization has created larger problems (black markets, impure drugs, etc.).

I get that your point is to have somewhat more regulation; however, the argument is much more nuanced than "Ban All Social Media", which is the point I'm trying to make.


I actually agree with decriminalizing drugs, but I think they should be controlled like any other prescription, or like weed is currently controlled: specifically, portion-controlled so that the drug does not destroy a person's life. I have experimented with meth and many other drugs. Mind that if you haven't done meth and you try to compare it to some other drug like alcohol, then you are just ignorant. I have to say that it's easy to get lost in meth, and a lot of people don't have the mental will (or maybe capacity) to get off the drug, so this type of thing does need to be controlled in some manner.


Who is arguing for "banning all social media" in this comment chain? You're attacking a strawman.

We need heavy regulation of recommended content, probably by moving recommendations out of the scope of Section 230. Pointing to a sector that is decriminalized but still extremely heavily regulated kind of speaks for itself. It's not like I could go out on a San Francisco street corner tomorrow and start selling pot like a paperboy.


The parent comment (the original one I responded to) said that if we can't find an equitable solution, we should "regulate them out of existence". That's what I was arguing against.


Moreover, at a certain point we can either hope and dream that humanity becomes endowed with superhuman personal responsibility, or we can accept that this isn’t going to happen and look at our available options.


> You're still abdicating personal responsibility

No. Blame isn't indivisible. It's not even a zero-sum allocation.



