
PreemptiveModeration

One way to keep spam off of an online community is to use PreemptiveModeration; i.e. no post is published until it has been "vetted" somehow.

Methods

There are a few different ways you may "vet" posts.

The simplest way is for the site host to read each comment and accept or reject it. Of course, this is impractical for most sites; it is, however, a system often used for small newsgroups or mailing lists. BBCi used to do it across the board, but were eventually forced into retreat by weight of numbers. Individual users may still be placed on PreemptiveModeration, however.
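
As a rough sketch of what this amounts to in code (Python; the names Post and ModerationQueue are illustrative, not taken from any particular engine), host vetting is just a hold-then-release queue:

    # Illustrative moderation queue: nothing is visible until the host
    # has explicitly accepted it; rejected posts are simply dropped.
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        text: str

    @dataclass
    class ModerationQueue:
        pending: list = field(default_factory=list)
        published: list = field(default_factory=list)

        def submit(self, post):
            self.pending.append(post)      # held, not yet readable by anyone

        def review(self, post, accept):
            self.pending.remove(post)
            if accept:
                self.published.append(post)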

One can vet posts electronically. This can't keep out all spam, but it can keep out swear words. Some sites use other criteria; SlashDot, for example, has a "lameness filter" that won't let you post if certain criteria aren't met (I forget exactly what; I think one of them is that the post must be a certain length).
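
A toy version of such an electronic filter might look like the sketch below (the blocklist and the length threshold are made-up examples, not SlashDot's actual rules):

    # Toy electronic vetting: reject very short posts and posts that
    # contain a blocked phrase. The words and numbers are examples only.
    BLOCKLIST = {"buy now", "cheap pills"}
    MIN_LENGTH = 40   # characters; an arbitrary "lameness" threshold

    def passes_filter(text):
        lowered = text.lower()
        if len(text.strip()) < MIN_LENGTH:
            return False                      # too short to be a real post
        if any(phrase in lowered for phrase in BLOCKLIST):
            return False                      # matches an obvious spam phrase
        return True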

One can have another random user vet the post. The trick here is to ensure that the other user really is a separate user (i.e. that the poster can't StuffTheBallotBox?). An extension of this would be to have multiple distinct "referees" vet each post [1].
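
A minimal sketch of the multi-referee variant, assuming approvals are recorded per account name (which only guarantees the approving accounts are distinct, not the people behind them):

    # A post goes live only once enough referees other than the author
    # have approved it. REQUIRED_APPROVALS is an arbitrary example.
    REQUIRED_APPROVALS = 2

    def is_published(author, approvals):
        referees = set(approvals) - {author}   # the poster may not vet themselves
        return len(referees) >= REQUIRED_APPROVALS

    # e.g. is_published("alice", {"alice", "bob", "carol"}) -> True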

In the extreme, one could have a whole subcommunity of users use any sort of RatingSystem on the post, before it is "published" as a "real" post.
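
In code, such a pre-publication RatingSystem could be as simple as a quorum plus a threshold (both numbers below are arbitrary; a real system would also weight raters, handle ties, and so on):

    # The draft stays unpublished until enough ratings arrive; it is then
    # published or rejected depending on the average score.
    QUORUM = 5        # minimum number of ratings before deciding
    THRESHOLD = 3.5   # average (on, say, a 1-5 scale) needed to publish

    def decide(ratings):
        if len(ratings) < QUORUM:
            return "pending"
        average = sum(ratings) / len(ratings)
        return "published" if average >= THRESHOLD else "rejected"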

CensoredBeforeRead

The goal of pre-emptive moderation is for the content to be censored before it's been read. Although a post can be hidden after it's caused offence, this is often of limited effectiveness - it helps promote ForgiveAndForget, and provides a sense of vengeance to the victim, but these don't address the fundamental cause.

CensoredBeforeWritten

Going one step further, we may want to censor inappropriate material before it's even been written. A "captcha" (an automated TuringTest) is an example of this, where postings written by bots are considered inappropriate. A GatedCommunity is another approach. On the SoftSecurity side, you can aim to LimitTemptation by being boring and making it easy to vandalise.
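
A minimal sketch of the captcha idea, using a deliberately trivial challenge (real captchas use distorted images, audio, and the like, precisely because anything this simple is easy for software to solve):

    # Generate a human-answerable challenge at posting time and only
    # accept the post if the submitted answer matches the expected one.
    import random

    def make_challenge():
        a, b = random.randint(1, 9), random.randint(1, 9)
        return "What is %d + %d?" % (a, b), a + b

    def gate_post(submitted_answer, expected_answer):
        return submitted_answer == expected_answer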


CategoryModeration?

Footnotes:

1. Note that this has a precedent in paper media: anonymous peer reviewers in scientific journals.


The same page elsewhere:
MeatBall:PreemptiveModeration