Certainly some of the suggestions in the article have merit. Encouraging people to browse and vote on stuff in new would be good.
But I agree with you. You don't fix human problems by adding technical controls. You fix human problems by adding technical measures designed to enable human controls.
My proposal is relatively simple: pick a high karma threshold (high enough that few people reach it) and allow those people to issue incivility warnings. If you receive more than a certain number of incivility warnings in a month, you get banned: first temporarily, and perhaps permanently on a second offense. These reprimands would be public.
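The proposal above leaves the actual numbers open, but the mechanism itself is straightforward. Here is a minimal sketch of one possible implementation; the karma threshold, warning limit, and ban durations are placeholder assumptions, not values the comment specifies:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical thresholds -- the proposal deliberately leaves these open.
KARMA_REQUIRED = 5000   # karma needed before a user may issue warnings
WARNING_LIMIT = 3       # warnings allowed per 30-day window before a ban

class CivilityModerator:
    def __init__(self):
        self.warnings = defaultdict(list)   # target user -> warning timestamps
        self.ban_count = defaultdict(int)   # target user -> number of past bans

    def issue_warning(self, issuer_karma, target, now=None):
        """A high-karma user issues a public incivility warning.

        Returns 'warned', 'temp_ban', or 'perma_ban'.
        """
        if issuer_karma < KARMA_REQUIRED:
            raise PermissionError("insufficient karma to issue warnings")
        now = now or datetime.utcnow()
        self.warnings[target].append(now)
        # Only warnings from the last 30 days count toward a ban.
        window_start = now - timedelta(days=30)
        recent = [t for t in self.warnings[target] if t >= window_start]
        if len(recent) > WARNING_LIMIT:
            self.ban_count[target] += 1
            # First ban is temporary; a second one is permanent.
            return "perma_ban" if self.ban_count[target] >= 2 else "temp_ban"
        return "warned"
```

With these placeholder numbers, the fourth warning inside a month triggers a temporary ban, and a repeat of the pattern triggers a permanent one. The point of the sliding window is that old warnings expire, matching the "per month" framing rather than a lifetime tally.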
People need feedback. People need to know when they are stepping out of line. One of the real concerns I have with hellbanning (though I recognize it may be useful in some corner cases) is that it denies people this feedback and therefore does not encourage them to stay within the lines of acceptable conduct. So reserve hellbanning for long-term repeat offenders and spammers, implement a system that gives feedback to everyone else, and allow the most senior people to enforce civility constraints.
Technology needs to empower people to manage people. Software should not manage people.
> "Technology needs to empower people to manage people. Software should not manage people."
I agree, in a way.
The only communities I have seen that associate numbers with human merit or trustworthiness are the ones populated largely by programmers. I find this telling.
I always have low scores on these sites because I am just not motivated to contribute to them. While Jeff Atwood was on to something when he said that users can do a good job when they feel invested in the quality of content on a site, we just cannot expect numbers representing reputation or consensus to always be set rationally. That, and I just hate thinking I will be assigned a score for trying to contribute to a conversation when I have no idea how the site's culture will interpret what I do. We may as well carry those 1-10 judge cards around our necks and stick gold stars on outsiders' foreheads. It's absurd... but apparently it's necessary. It shouldn't be, but I know I can't handle 1 million+ users as a moderator.
One of my old jobs was to facilitate a chapter of Socrates Cafe, a national discourse organization. We allowed anyone at all to come in and share his or her experiences. My job was to facilitate the discussion and put out flames fanned by conversations between opposing parties. So when atheist/theist or right-wing/left-wing wars erupted, I was supposed to take varying measures to restore order without evoking enough negativity to scare away members. That is, I had to try to settle everyone down while making them feel BETTER about everyone else.
I cannot begin to tell you how ridiculously hard this can be, and technology does not help unless you use it for shameless enforcement.
Intuition and context-awareness help tremendously, and there have been times when I simply could not think of any solution to one particularly heated debate outside of, no joke, an air horn. I hated having to blast all of the words right out of the room, but the group we had represented (to me) a community gone wrong that needed flat-out intervention. We never once used tools to try to change the rules of conduct. Instead, we used technology as a brutal and unapologetic last-ditch enforcement of existing rules.
Now... that was a group where everyone saw each other in person. Clearly not the same thing as an online group. Online is worse: in that setting, technology is the only thing you HAVE to mediate all interactions, even the "human" ones.
I don't have a real solution to the sorry state of large, rude online communities. In fact, if you made me a moderator, I would probably go on a banning spree until everyone was too terrified to stay and left of their own accord.
Even so, I doubt anyone else really has a solution either... including the OP. Improvement can only be measured once online cultures change enough to push against the newly set boundaries.
The problem, I think, is that engineers have been taught over and over to see humans as outside the system: you have the system components, a user interface, and then the user. This makes sense in a way, because you can mathematically understand the components of your software, but you cannot mathematically understand the user of your software.
Recently I have started to realize just how wrong this is, first from reading what others have written in IEEE Spectrum on the so-called "Automation Paradox," and later from internalizing that business systems usually treat people as integral components. Even if humans are relegated to a supervisory role, it is important to give them enough to do to maintain the system; failure to do so in vehicles has contributed to air travel catastrophes and ships running aground. The same holds true in every other field. There is such a thing as too much automation. Especially where we are dealing with social environments, like HN, that human touch is what matters most.
I think the first question here is what humans should be doing, and the second is how the computer system can enable them to do it. Unfortunately, engineers tend to forget the first question and so never reach the second.