There's a fair number of people online who hate me and repeatedly report things that I do. I uploaded a YouTube video showing an XSS vulnerability that I had found, reported, and seen fixed. Someone reported the video and it got deleted; I got a red mark on my account and it was put in "bad standing". I appealed, the video got put back up, and the penalties were reversed.
On a different site I got a two-week ban for posting a link to a completely harmless and completely rule-abiding YouTube video. I appealed it twice and it wasn't lifted. On that same site I had my account deleted for posting a rickroll video. That time they listened to the appeal and reinstated my account.
My reddit account was shadowbanned. I appealed it and it got reinstated.
My point is that moderation is always somewhat subjective and usually done by low-paid workers going through high volumes of disgusting stuff, so there will always be mistakes. And there are people who get mad and basically treat the report button as a dislike button. I even went to a talk by a Facebook employee on the spam team, and he said there are some countries they essentially ignore reports from entirely, because reporting innocent content is so widespread there.
You think that's bad? Try posting anything critical of what someone else says on HackerNews. Frankly, having discussions about censorship on this platform - including this very comment - while ignoring the kind of censorship going on right here is to be willfully blind.
Not necessarily ignoring; if censorship means that only certain types of voices will be heard, sometimes that's all people want to hear.
My view is that there should always be places that are censorship-free, and beyond that it's okay that other places might be censored. Most people participating on HN know that engaging in certain ways (e.g. excessive swearing, aggressiveness, etc.) will lead to being censored. As long as there remain other places to engage with people without rules, I'm also grateful for venues that are heavily moderated.
I'm often critical of what someone else says on HN, and receive a mix of up/downvotes for the effort. But I haven't yet noticed a comment deletion.
My profile has long been crapping on about censorship here, and when the topic comes up in an HN article, I usually wade in with a "not happy", but I just don't see what you're seeing.
When groupthink expresses itself as mass downvoting, and the site reduces the size of your font and/or lightens the color until it is invisible, is that not censorship? The mob can be wrong, you know, when the mob relies on a bad source.
Locked threads, deleted threads, comments from pg requesting that particular people stop arguing, a flagging system where some people's flags have greater weight, hellbanned accounts where the person doesn't even know they've been banned (there's some poor schizophrenic guy who made an operating system called LoseThos that posts occasionally), badly editorialized submission titles.
But, that's what moderation looks like, generally. Every social structure is moderated in some form.
Well, ok then. Except... in that particular case schizophrenic people usually have a poor sense of reality, heck even words like "hellban" and "dead" may have religious connotations. Not really sure what a better solution is.
You don't seem to know much about the Terry Davis situation. When a person's posts are 50% racist slurs, there isn't much choice but to block them from posting.
The hellban lets that person still post and converse with people who actively seek him out, without the downsides of having tons of hate speech all over the front page of HN.
It's been said before and it'll be said again, schizophrenia is not a catch-all excuse for someone in a public forum.
I just feel bad for the guy. Originally, I was just pointing out it was a kind of censorship greater than downvoting that has some unsavory aspects to it. Like I wrote above, it's not clear what a better solution would be, except perhaps if I was running things I would try just deleting his accounts. Oh, and you can't reply to his posts, but whatever.
Deleting accounts will cause him to create new accounts. The positive of a hellban is that it lets the person still use their account to broadcast to those who explicitly want to hear it, so there's much less reason for the person to make a new one.
Being temporarily banned from an online forum is probably pretty low on the list of shitty things that happen to people with a diagnosis of a severe and enduring mental illness.
The interaction between community mechanics and the resulting political structures (e.g., "threaded vs. nonthreaded comment sections", "the effects of various moderation systems", various methods of giving out higher levels of privileges, etc.) has been a minor hobby of mine for ~15 years. Here's one result I've noticed: Once you introduce scale into a community, in that way that only the Internet really can, you encounter certain major problems such that you end up with only two choices: Literally permit everything up to and including child porn and every other kind of skeevy thing ever, or engage in curation policies that somebody will, with a straight face, call censorship. And it's not the obvious connection you'd think, either. It's not that "removing child porn" will be called censorship (it will, but not by people that very many other people care about), it's that you end up with a certain choice, and the result will inevitably be "censorship" by somebody's accounting, and it'll probably be a reasonable claim from many points of view.
Imagine you have a landscape of statements that could be made. Color some of it black for being unacceptable by your standards, color some of it white for being perfectly acceptable. No matter how you slice it, there's some grey frontier in between. Throw a dart for every statement made by, say, 10 people in a week. A gentle smattering of darts appears. Maybe only a handful of statements will even be close to that grey line, and none of them land on it. Now throw a dart for every statement made by ten million people in a year. The landscape is no longer visible under the mass of the darts. Now tell somebody they have to go on to the field and clear out precisely and exactly the statements that are "on the dark parts". It doesn't matter how you slice it... someone's gonna scream about how their message was removed.
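The scaling point in the dart metaphor can be made concrete with a toy Monte Carlo sketch (all numbers here are illustrative, not a claim about any real community):

```python
import random

def borderline_hits(n_statements, grey_halfwidth=0.01, seed=0):
    """Throw one dart per statement, uniformly on [0, 1], where 0 is
    clearly acceptable, 1 is clearly unacceptable, and the grey frontier
    is the narrow band around 0.5.  Return how many darts land in it."""
    rng = random.Random(seed)
    lo, hi = 0.5 - grey_halfwidth, 0.5 + grey_halfwidth
    return sum(1 for _ in range(n_statements) if lo <= rng.random() <= hi)

# Ten statements: usually none land anywhere near the grey line.
small = borderline_hits(10)

# A million statements: roughly 2% of them are borderline, every time.
large = borderline_hits(1_000_000)
```

With the grey band covering just 2% of the landscape, a small community can go indefinitely without a single borderline case, while a large one is statistically guaranteed tens of thousands of them, each a judgment call someone will dispute.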
And that metaphor implicitly includes the idea that we all agree on the colors. In reality, of course, we all wildly disagree, even if there are broad patterns within certain cultures that can be seen. [1]
The only two simple solutions to that problem, "allow everything" and "allow nothing" are respectively illegal (and at least arguably unethical), and pointless (what's a community that allows 0 communication internally? I suppose one of those crazy logic puzzles about hat colors and wardens shooting prisoners. But that's about it.).
This theory predicts that all communities at scale will have a set of people complaining about how it censors. That applies to more than just HN or Reddit: the NYTimes, Usenet, a sufficiently large GitHub project, everything that doesn't literally allow everything. I'm not aware of any counter-example....
So, given that it's impossible to have a scaled community that doesn't have people accusing it of censorship, it doesn't seem all that interesting to abstractly accuse one of it. One would have to criticize it from a more specific point of view, like, is the stuff being removed significantly at odds with what the community it represents expects? Or is there perhaps an ethical standard that you'd like to specify that is more specific than merely "shouldn't 'censor'" that you'd like to discuss? That's a serious question, BTW; I am not merely defending HN as a status-quo defender. Of course, the ethical standard will itself be up for debate. So it goes.
[1]: Standard metaphor warning: Don't argue the metaphor. It's just a metaphor. Neither the landscape, the colors, nor the darts exist, so there's no way they could "really" have three colors or "really" be nasty fractals or "really" anything else. Argue about the point it's making.
Although Wikipedia has a cabal, they try to clearly delineate how they censor material. On HackerNews, we are almost entirely in the dark: they do whatever they want with no oversight. Given that, having a conversation about censorship on Facebook, which is much more of a free speech zone than here, is hypocritical. (Also: I mostly agree with your post)
On Facebook you can't quickly create an anonymous account to say whatever you want to your 500 friends because that account wouldn't be friends with them. You end up censoring yourself because it's friends, family, coworkers, and romantic interests on there reading your posts.
Please do not confuse censorship, which is governmental tampering with free speech, with private moderation of a free message board to whose terms of service you signed voluntarily. This trivializes real censorship issues, which are abundant.
My understanding is that Paul Graham personally censored this website for multiple hours per day for years, and that now there is someone else, or a small team of others, doing it.
There is a lot about HN's policies and the site's user-hostile features that I don't like, and occasionally complain about, but neither dang nor pg have kept the highly curated and moderated nature of Hacker News a secret. Like as not, this site just isn't supposed to be about free and open discourse, but civil, intellectually stimulating content and holding back the inevitable Eternal September effect, and many of the users here agree with that and vote in kind (and when they don't, dang stuffs the ballot.)
And to dang's credit, I follow his comments now and then and most of the times he criticizes people for their comments, they do seem to be obviously over the line.
Unmoderated forums have the potential to violate US law. As a result, none last for long. Even 4chan, while anonymous, has never been completely unmoderated. It has to comply with US laws on child pornography and hate speech.
The internet has a public reputation for being open and free. This has never been the case. You've always had to pay for access, and you've always been in danger of having your content taken down at somebody else's discretion. It used to be taken care of by your hosting company or your ISP. Now it's taken care of by Facebook, Twitter and Hacker News.
> I posted the photo to my Instagram account and clicked the button to share to Facebook. But while discussing the incident with a colleague last night, I scrolled back and discovered that both posts had been deleted.
Censorship issues aside (I know, seeing the trees instead of the forest here), I wonder if this is an actual mechanism due to Facebook and Instagram having a high level of integration...i.e. a Facebook mod pushes the "delete" button and it makes a privileged deletion call to Instagram's API (or vice versa)...or did it just happen to be that users from each service separately flagged the posts, and by the time OP noticed, the post had been removed from both services?
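Either mechanism is plausible. A deep Facebook/Instagram integration could propagate deletions with something like the following sketch; the service names, `Post` record, and API shape are entirely hypothetical, not the actual internals of either product:

```python
class Post:
    def __init__(self, post_id, linked_ids=None):
        self.post_id = post_id
        # IDs of mirrored copies on partner services, recorded at share time.
        self.linked_ids = linked_ids or []

class Service:
    def __init__(self, name):
        self.name = name
        self.posts = {}     # post_id -> Post
        self.partners = []  # services that may hold mirrored copies

    def delete(self, post_id, propagate=True):
        post = self.posts.pop(post_id, None)
        if post is None:
            return
        if propagate:
            # The privileged cross-service call: remove mirrored copies,
            # with propagate=False so deletions don't bounce back forever.
            for partner in self.partners:
                for linked in post.linked_ids:
                    partner.delete(linked, propagate=False)

instagram = Service("instagram")
facebook = Service("facebook")
instagram.partners = [facebook]
facebook.partners = [instagram]

# Sharing from Instagram to Facebook records the link in both directions.
instagram.posts["ig1"] = Post("ig1", linked_ids=["fb1"])
facebook.posts["fb1"] = Post("fb1", linked_ids=["ig1"])

# One moderator action then removes both copies.
instagram.delete("ig1")
```

If deletion propagates like this, a single flag handled on either service would explain both posts vanishing at once; separate flags on each service would explain it equally well, just more slowly.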
reddit is starting to do the same thing. Twitter already does it. Even HN mods "scrub" stories off the home page all the time, usually for very good reasons, but also often for reasons many HN'ers would disagree with. You can see hidden (demoted) posts on http://hnrankings.info/
Private companies have every legal right to do this but it's not ultimately the way the world should be. People themselves should be the ones curating what they see, not faceless administrators secretly making judgment calls using unwritten policies. It creates a situation in which you can't be sure what you're not seeing, which is far worse than sometimes seeing things you don't like.
Curated content can be useful and efficient for a readers time if they want to visit a site with a particular content. It allows a site to focus on a core relevancy and limits the hivemind effect. IMO there is value in both curated and community driven sites. The key is either site is upfront about what they expect to offer.
This isn't limited to community driven sites and has existed long before the word "social" had anything to do with computers. It used to be if you posted controversial things to your website, you had to contend with the company you bought your hosting from.
Facebook and Twitter make it easier, but this is nothing new and has nothing to do with curation.
HN holds itself out to be much closer to a curated service than Twitter, FB, or Reddit. There's no such thing as "off topic" on those services (although within a subreddit there certainly is).
I don't think that "personal curation" is a longterm workable solution. It's too much of a burden for most users. We need audit-able personal algorithmic filters.
That's something that has been happening for a while, but, as always, first they came for the undesirables and nobody said a thing.
The Syrian civil war has been going on for as long as the term "Arab Spring" has existed, with protests and the beginning of armed conflict going as far back as 2011.
With access to means of communication not controlled by the state, the start of this conflict was very well documented, in real time even.
From something happening (a mass protest, a shooting of protesters or policemen, a takeover of public buildings) to the posting of the video on Twitter or Facebook, sometimes not even 15 minutes elapsed. It was all there, live, unedited, raw. Anyone with a mobile phone could show the world exactly what was happening, from all sides involved. All it took was to make a page on FB or an account on Twitter or YouTube and to push "upload".
Or so they thought. See [1] and [2] below for context. Lots and lots of footage, pictures and live testimonial were "scrubbed" by overzealous application of the TOS, usually motivated by organized mass report of content.
It is very sad, because not only did they believe that, for the first time, they could broadcast their message without cost or risk of censorship (they couldn't), but a very valuable piece of our collective history was also lost forever.
Thus first they came for questionable content from a sensitive conflict on the other side of the world. Now they can apply the same reasoning at home, and people will have little recourse because the precedent was set. More and more we hear "the First Amendment only applies to the government, not to private companies", and that's true, but it doesn't make the repercussions of this deletionism any less sad to cope with.
Maybe he didn't actually share it to Facebook. He's just going on the fact that he can't find it, and immediately assuming Facebook just deleted it without mention. I'm skeptical any of this happened. That's not how it works.
> I posted the photo to my Instagram account and clicked the button to share to Facebook. But while discussing the incident with a colleague last night, I scrolled back and discovered that both posts had been deleted.
Even if he might not have shared it to Facebook, he does mention that his Instagram post was also deleted.
Is this FB deleting user content on its own, or responding to reports by other users who have been somehow "offended"? We had something similar happen in my country recently, repeatedly:
A Facebook page dedicated to cataloging photos and images from the country's history posted a 100-200 year old black and white image of a bare-breasted woman (something not unseen in Ceylon before European missionaries brought Victorian values to the island). This was deleted. A discussion followed, where the majority consensus was that the photos should stay up. But every subsequent attempt to repost resulted in the image being taken down. The administrators of the page eventually gave up.
> I posted the photo to my Instagram account and clicked the button to share to Facebook. But while discussing the incident with a colleague last night, I scrolled back and discovered that both posts had been deleted.
That was my first thought as well. If there are people who can corroborate that they commented or liked it, great, but if not it's just a single unreliable anecdote.
Facebook has a huge team in the Philippines doing manual gore removal. There was an article on how the people doing that work can't do it very long as it's mentally taxing. Since this was a picture of a dead person, I bet this was just removed by that team.
I have; how else would I write this comment, given the heading "I've been scrubbed"? But the cleanup team didn't read the post. The photo is shown at the bottom, and together with the byline it looks like there might be a dead person behind that tarp.
I'm immensely skeptical that Facebook would delete this from his personal Facebook account.
It's not that I don't think Facebook is capable of censorship. It's that I doubt they'd extend their hand over such a small matter. What possible crazy conspiracy theory connects duck boats and Facebook headquarters (beyond "all corporations are evil and regularly meet to decide how to suppress the evidence")?
Facebook is actually well known for deleting drugs, gore, etc. You should consider every picture you post to Facebook to be queued to be seen by a moderator. A cheap, third-world-country moderator, true, but a moderator nonetheless.
Normally I don't comment on my flags, but just to dampen some potential future conspiracy theories, the low quality comments generated by this thread are the reason I flagged it. So if it disappears from the front page, for me it will be despite my opinion about the importance of the topic or about the quality of the article itself. It will be because it is making HN worse.
That used to be true. It isn't anymore, because people are working on decentralizing the web. This makes the web uncensorable and permanent.
The two most promising projects in this area are IPFS[1] and Ethereum's Swarm[2]. When we rebuild social networking and publishing on top of protocols like these, the problem this post illustrates will go away. Anyone who wants to help build these systems can email me (address in profile) and I'll try to send you to the right people and resources.
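The core idea underneath systems like IPFS is content addressing: a document's address is derived from a hash of its bytes, so any peer can serve it and any tampering is detectable. A minimal sketch of the idea (this is not the actual IPFS CID format, just the underlying principle):

```python
import hashlib

def content_address(data: bytes) -> str:
    # The address is derived from the content itself, so anyone holding
    # the bytes can verify that they match the address they fetched.
    return hashlib.sha256(data).hexdigest()

def verify(address: str, data: bytes) -> bool:
    return content_address(data) == address

addr = content_address(b"a post nobody can silently alter")
assert verify(addr, b"a post nobody can silently alter")
assert not verify(addr, b"a quietly edited post")
```

This is what makes silent scrubbing hard in such systems: a platform can refuse to host the bytes, but it can't alter them without every reader's client noticing, and anyone else holding a copy can keep serving the same address.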
If you look at 8chan, they are unable to accept donations because PayPal etc. will not process them. So it's generally extremely hard to do anything that big companies don't find acceptable.
Nope. You still rent bandwidth from your ISP. If you're doing something they don't like, they can and will shut you down. Whether we like it or not, everything on the internet exists at the whim of one company or another.
Having run my own servers for almost 20 years, I think you're orders of magnitude safer with your own box colocated with some ISP. Commercial ISPs have very few fucks to give for people who are upset about content. Other than spam, network abuse, and crime, which are direct threats to their business, or properly filed DMCA notices, which incur legal liability, I just don't see any cause for concern. In fact, I see a studious and practiced avoidance of interest. Large ISPs just aren't very whimsical.
For the really paranoid, you can get yourself bandwidth from multiple providers; then you're only worried about synchronized whimsy. Which, I suppose is definitionally not whim at all.
While I will buy this argument with regard to the web, the internet also includes BitTorrent, which is quite a bit more difficult to censor (e.g. the celeb nude photos from the iCloud leak).
I don't think the offensive part is necessarily the "scrubbing", but the lack of transparency. The post just disappeared with no notification or explanation. If I made a post and then subsequently received an email from Facebook or whoever saying "We have deemed your post inappropriate because ...; it has been removed," I would have a lot less of a problem with it than if it just disappeared without a trace.
From a cursory look at the picture, this might have got caught by some crazy "no free advertisement" bot, since a phone number is clearly legible. That probably wasn't the intention, but it is easy to see how a DNN would classify this as a commercial advertising picture without additional context.
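A crude version of such a filter is easy to imagine: OCR the image, then flag anything with a legible phone number as likely commercial. Here's a toy sketch of that heuristic with the OCR step stubbed out as plain text input; a real pipeline would use an actual OCR engine and a trained classifier, and everything below is hypothetical:

```python
import re

# Matches common US-style phone numbers, e.g. 215-555-0142 or (215) 555-0142.
PHONE_RE = re.compile(r"(\(\d{3}\)\s?|\d{3}[-.\s])\d{3}[-.\s]\d{4}")

def looks_commercial(ocr_text: str) -> bool:
    """Naive 'free advertisement' check: treat a legible phone number in
    the image text as a signal of commercial content, with no attempt to
    understand the surrounding context."""
    return bool(PHONE_RE.search(ocr_text))

# Text lifted from a news photo that happens to include a company's phone
# number gets flagged exactly like a real ad would.
assert looks_commercial("RIDE THE DUCKS  call (215) 555-0142")
assert not looks_commercial("police tape across Arch Street")
```

A context-blind rule like this would delete a newsworthy photo of a branded vehicle just as readily as actual spam, which is consistent with the post disappearing without any human ever judging its news value.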
It doesn't seem to me like he has an agenda. Keep in mind that the "duck boats" are pretty much universally reviled by natives in Philadelphia; they are loud, feel out of place in the neighborhoods they operate in and are full of gawking tourists.
The incident in 2010 he mentioned was pretty big news in Philly. Just seems like a concerned reporter who was at the right place at the right time and noticed a series of recurring events with these boats.
Seconded. Although it's true this was likely the fault of the woman (she apparently walked in to traffic against a light in a high-traffic neighborhood while staring at her phone) it took about 5 minutes for news of it to get around Philly. I learned about it before the scene had even been cleared.
That guy defending his choice to post pictures of dead people, blaming the "corporate platform" (because it's always Us vs Evil Corporations, isn't it -- don't forget those corporations are just collections of people like us) Facebook for taking it down, and not once taking responsibility for how his choice to post it may have been legitimately upsetting or offensive, or even morally wrong. Why does his amateur opportunistic "scoop" take precedence over the deceased's privacy? Why does he get to claim that posting a picture of someone violently killed by a truck is some kind of moral good, and that taking it down assaults the very foundation of all that preserves the shrinking territory of what is good in our society?
This crusader has his moral compass all wrong.
As if he's angry his amateur "scoop" has been censored, like he's some kind of force of history and truth, when really he's just posting gore, declining to think of the people he might be harming by doing so. Because it's really all about him and his "rights", isn't it?
This guy's confusion of his personal amoral choice with some fantasy of a moral crusade, and his fake embedding of himself in some kind of narrative of evil corporations vs moral individuals, is just self-aggrandising hackery that his gore lust doesn't merit. That his deluded ideas have any currency is one of the things wrong with contemporary thought.
The more likely scenario is a relative of the person killed reported it because they had a similar reaction. People don't understand when they are allowed to have their pictures taken.
Facebook and Twitter are not places that uphold a no-censorship philosophy, so no one should be surprised (except possibly at why it was taken down). There are a number of imageboards and forums that do, but you wouldn't reach the larger audience of these moderated outlets.
While your snarky comment seeks to point out your distaste for libertarianism and possibly this community itself, I think it is intellectually dishonest to characterize HN as a single entity. It is many people, with many different viewpoints, spanning in age from young to old. HN doesn't have an "opinion", people utilizing it do. One would do well to remember that when characterizing communities such as reddit, hn and 4chan. They are the sum total of many people and are often characterized by a vocal or visible minority.
edit: The parent said something similar to "Hacker news should love this, corporations can do what they want, hail Rand".
It is slightly ironic that that comment was deleted on a post on this subject. Did the poster do it themself or were they censored? Part of the conspiracy?
Bit ironic really. This article is about the preservation of content. I personally dislike coming into threads and seeing a deleted conversations, so I try not to delete my comments.
I hate to sound all conspiracy-theorist crazy, but this makes me scared that people may start to just up and disappear, 1984 style. Covering up deaths is not good even on a personal level, as nobody gets closure, but on a wider societal scale it's just frightening. It starts with takedowns on Facebook and moves into the larger media: what's next? CNN?
It's happened in the past; the same drives that motivate "disappearances" haven't gone anywhere, and we're a country with a recent history of disappearing people to black sites, never to be seen again.
Inconvenient content is buried or erased on the internet all the time. There have even been attempts to erase entire events from the internet (think the Great Firewall).
Nothing is more inconvenient than Joe Citizen catching you red handed and making it public.
Monitoring or shutting down social media and chat services is one of the first thing that happens during a political upheaval. Violence and people dying in the streets is inconvenient.
It's too bad that you had to say that in a public venue, but at least it's a niche like HN - where it's all tech geeks, and virtually certain that no one here actually knows or interacts with you on any regular basis. Expect a knock at your door in 3... 2...
I don't know if duck boats are "bad" as they are simply objects. However, there are many more people using automobiles in our society, and the utility of automobiles is much higher than duck boats (at least how we choose to employ them currently).
Also, duck boats can be bad and so can car accidents. This is a non sequitur.
That's one aspect I really like about South East Asian culture - it hasn't been infected yet by the whites' "political correctness" - life happens and they simply get on with it. I follow some people there, and man, some of the graphic photos that get posted (on Facebook) are intense; think /r/gore x10.
Anyways, the minority ruling the majority... one person complains, all obey. Sad.
China doesn't massively censor its society based on what the authorities decide is correct?
You can't get any more of a controlled, politically correct society than China when it comes to censorship. They're the actual definition of political correctness censorship.
How about porn in China? The authorities decided long ago that was politically incorrect, and it's illegal.
How about Japan censoring - for 70 years - all sorts of aspects of what they did during and before WW2?
How about Singapore's censorship dictatorship, where Lee Kuan Yew ruled the news media there with an iron fist, and directly dictated what was culturally and politically acceptable. The very definition of political correctness based censorship.
Is your theory that South Korea has an open society when it comes to political correctness? I completely disagree, they outlaw anything even remotely pornographic, and they block all pornographic web sites. Again, the definition of enforced political correctness.
Then we can get into Thailand, which is censored heavily by their political correctness machine; it regulates everything from speaking out against regimes to pornography.
How about Indonesia's anti-porn laws, the very definition of enforced political correctness?
This only begins to touch the surface of the political correctness censorship rampant in Asia. And that's all before we get into the really bad examples, like Myanmar and North Korea.
Right - having read that, my use of "politically correct" was not what you've laid out and I've used an inaccurate term to describe what I want to say.
When in Cambodia, Vietnam and Thailand, there is a sense that people look out for themselves. The govt is there, however they aren't there to make sure you wear your seat belt, tuck in your shirt, etc. Sure there is an exercised control over media/news and ensuring the message about the government is inline with what the government wants. I'm not referring to this.
I'm referring to this: the photo is there, and I don't have to look at it if I don't like it. I'll call you a foreigner, because you are. I'll call you white, because you are. I'm referring to people relating to each other. The people are mostly interested in themselves and their families rather than in what other people are up to. The opposite of the American way of "how dare you exist and offend me, I'll sue you!!" - whatever that is...
Cheers for taking the time to re-define what Politically Correct actually means :)