So, a further thought about Paul Graham’s Hacker News and its comments policy. (See yesterday’s post for details.) If reddit allows you to approve or disapprove of things you haven’t even read, Hacker News appears not to allow you to disapprove of things at all: you can click the “up” arrow or . . . well, do nothing at all. Or so it appears. It turns out that when your karma points reach a certain threshold (apparently 100) you suddenly acquire the ability to downvote a post or link. Interesting! Only when you have contributed value to the community are you entrusted with the power of negativity.

Something similar is being done at another hacker site, Stack Overflow, where upvotes add 10 karma or “reputation” points to a post’s author, while downvotes remove two reputation points from the post’s author, and one from the reputation of the person doing the downvoting. This too is interesting! Here you have to ask yourself, before voting something down, whether you feel strongly enough about it to take a chunk out of your own reputation to register your disapproval. Kinda like real life.

These are great examples of “choice architecture,” though not quite of the “nudge” variety: they do more than nudge you toward certain kinds of decisions. Hacker News allows you to purchase power with good behavior, while Stack Overflow subtly threatens people who consistently misbehave with expulsion from the community. I like these models very much, but at the moment I can’t see how they could be applied to sites where there are just comments rather than votes on the value of posts. Regular old blogs, as ace commenter Tony Comstock remarks in relation to my previous post, may have to depend on the blogger’s own ability to model civil discourse and to gently manage comment threads. But I have seen many, many peaceable and thoughtful bloggers get overwhelmed by trolls and other hostile figures.
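The two incentive rules described above can be sketched in a few lines of code. This is only an illustration with made-up user names; the numbers (the 100-karma threshold, the 10/2/1 point values) come from the post, not from either site’s actual implementation.

```python
# Hedged sketch of two choice-architecture rules, not real HN/SO code:
#  1. HN-style: you may only downvote once your own karma passes a threshold.
#  2. SO-style: a downvote costs the author 2 points and the voter 1 point.

DOWNVOTE_THRESHOLD = 100   # karma required before downvoting is unlocked
UPVOTE_GAIN = 10           # points the author gains per upvote
DOWNVOTE_LOSS = 2          # points the author loses per downvote
DOWNVOTE_COST = 1          # points the *voter* pays to downvote

class User:
    def __init__(self, name, reputation=0):
        self.name = name
        self.reputation = reputation

def upvote(voter, author):
    # Upvoting is free to the voter and always available.
    author.reputation += UPVOTE_GAIN

def downvote(voter, author):
    # Downvoting is gated by the voter's own standing...
    if voter.reputation < DOWNVOTE_THRESHOLD:
        raise PermissionError("not enough reputation to downvote")
    author.reputation -= DOWNVOTE_LOSS
    # ...and registering disapproval costs the voter something too.
    voter.reputation -= DOWNVOTE_COST

alice = User("alice", reputation=150)  # trusted: past the threshold
bob = User("bob", reputation=20)       # newcomer
downvote(alice, bob)
print(alice.reputation, bob.reputation)  # 149 18
```

The point of the sketch is the asymmetry: approval is cheap and open to everyone, while disapproval must first be earned and then paid for each time it is used.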
So there is, I think, a desperate need to develop a choice architecture that works for the garden variety blog and its comment threads.
These are very interesting ideas for policing blogs. Indeed, while hackers are notorious for flame wars, they have also developed various ways to police online communication. For instance, IRC channels (around for twenty years now) have ops who can summarily ban offending users. And trolls on mailing lists are quickly shamed and shouted down. Granted, mailing lists and forums can also be sites of uncivil discourse, intimidating to the uninitiated, RTFM being the most famous insult.
These sites work because hackers communicate with specific, practical aims in mind (getting software to work, fixing bugs, etc.). Blogs tend to have looser, less goal-oriented communities. Thus, the policing breaks down.
Just a side thought:
Our website gets a steady stream of traffic from (IIRC) The Tilted Forum Project. I don't know much about them, except (again IIRC) that it costs $5-$10 to join and be able to post. Interestingly enough, $5-$10 is enough to get people to respond positively to suggestions that if they don't shape up they'll get banned.
Remembering that has made me remember a psych experiment we were taught in psych 101. (Once more, IIRC) a bunch of crazy people in a mental hospital were given financial incentive not to act like jackasses, which I believe is the clinical term for manifesting their disorders.
I believe the history of political activity at malls over the last 20 years is also instructive.
These incentives proved remarkably effective, but people got all up in their crazy when the incentives were removed. Sorry, I can't footnote it. Maybe I'm making it up.
With about 15 years now in dealing with online communities, from IRC to Usenet to PHPNuke fora to Slashdot etc etc to running my own community-based startup, I can say that the cheapest and most effective form of community policing is an attentive moderator with a hair-trigger — effectively a (hopefully) benevolent dictator. Note that I say this as a political liberal with no affection for dictatorial solutions in meatspace. But I cannot count the number of fora that I have been involved in or witnessed that were destroyed by a combination of mischief and a moderation philosophy centered on the idea that censorship was the worst possible sin and banning was such an enormous consequence that it should be undertaken rarely.
Some people are there just to screw with your community, and the nature of online interaction means there are little or no consequences. Additionally, some quirks of the chronology of commenting mean that the thread-destroying pile-on is almost guaranteed. Worse than one clown saying something stupid to get a rise are the 50 consecutive replies from otherwise good community members who want to get the rush from self-righteously lecturing an easy target.
Ban the clown early and often. If your moderation policy or implementation stinks, people will vote with their keyboards. There is no lack of communities that will prosper in your stead. Don't feel guilty about it.
Because trying to be patient and 'reform' troublemakers, or trying to feel fair-minded by imagining you're giving voice to a legitimate dissenting opinion, is poison to the normal working of your community. Think of your comments as a pub. People are thrown out of pubs all the time for being disorderly and troublesome. Nobody thinks pubs are fascist or un-American (except the non-American ones) for doing it, and pub owners don't feel sorry about it, because they know who their customers are, and it isn't the troublemakers.
"Think of your comments as a pub. People are thrown out of pubs all the time for being disorderly and troublesome."
This is exactly the analogy my best online friend uses, and I think it's apt!
What a bunch of Godforsaken tripe!
All right, Douglas, get outta my pub. NOW.
Dave is right that policing becomes difficult when the "goal" of a site is the non-goal of conversation. In such an environment it gets much harder to define what is or is not appropriate, what is or is not on-topic, etc.
I've seen more than a few blogs, communities, and such drown in their own success: the burden of policing, or even simply "keeping up," grows faster than the rewards the blog or community can return to those responsible for its care and feeding. I'm tempted to veer into a rant about the need for Good Things to be conceived in a way that makes them self-sustaining.
Of course maybe that's just what Alan is getting at. My typical answer is "find a way to make it pay your bills", but intelligent implementations of technology and/or cunning design can carry (at least some of) the burden as well.