Is there any law enforcement in Theodorus? How can you make sure I don’t spam and write irrelevant stuff?
You’ve probably noticed you have a score of credit points; as it increases, it allows you to do more and more things (like commenting, suggesting ideas, or adding law-alternatives). Your score increases when you use the system properly and when other people approve of your actions (by endorsing your ideas, for example). However, if you do something inappropriate you’ll receive penalty points, which you’ll need to work off (the same way you earned credit points) before your score can advance. Not only that, but some actions are actually disabled while you have penalty points, and lastly, every time you are sanctioned for the same offence, the number of penalty points you receive increases.
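The escalating-penalty rule can be sketched as a simple function. The offence names, base values and linear multiplier below are illustrative assumptions; as noted later, the text deliberately avoids fixing any numbers:

```python
# Hypothetical sketch of the escalating-penalty rule; the base values
# and the linear escalation are assumptions, not fixed by Theodorus.

# Assumed base penalties per offence type, ordered by severity.
BASE_PENALTY = {
    "irrelevant": 1,
    "offensive": 3,
    "spam": 5,
    "encouraging_violence": 10,
}

def penalty_points(offence: str, prior_sanctions: int) -> int:
    """Penalty grows each time the user is sanctioned for the same offence."""
    return BASE_PENALTY[offence] * (prior_sanctions + 1)

print(penalty_points("spam", 0))  # first offence: prints 5
print(penalty_points("spam", 2))  # third time: prints 15
```

Any monotonically increasing function would do here; a community might prefer exponential escalation for repeat offenders.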
That sounds a little draconian, like a police state. What are these “inappropriate actions”, anyhow?
Actually, there aren’t that many bad things you might do. I’ll list them in order of the severity of the offence:
Irrelevant content is basically harmless, but it disrupts the discussion. The penalty for it, accordingly, is relatively light. Since you referred to this design as “draconian”, I should mention that jokes shouldn’t automatically be considered irrelevant; it is really a question of context, timing and appropriateness.
Offensive content might be demeaning or plain hurtful. It might be addressed toward an individual or toward a group. For example, “women are only good for bringing babies” is offensive (in many cultures, at least). “Women should not be allowed to drive”, being an opinion, is more questionable, but the community can exercise its judgement on that. It’s worth mentioning that anyone can report an item as offensive, not only the victim of the offence.
Spam is soliciting, or basically trying to “sell” you something irrelevant to the discussion, and it carries a bigger penalty than simple “innocent” irrelevant content. Some scenarios may present a very thin line between recommending something relevant to the discussion and trying to sell it. Again, it’s up to the community to draw the line.
And finally, content that encourages violence is inappropriate and will be dealt with severely. If you’re afraid of a “police state”, I see no reason to differentiate between police violence and any other kind of violence: physical, emotional or verbal.
Although many of the capabilities are disabled when you have penalty points, some functions are always left available; for example, you’re always allowed access to the system if you’ve gained enough credit points.
You keep saying that the community is the judge. How does this work, exactly?
Anyone can REPORT a message as inappropriate. It’s an anonymous action, so no one will know. Reporting an item doesn’t require you to classify the inappropriateness; you only need to click the “report” button. If the message you reported really is inappropriate, you’ll gain points as you did your civic duty. If, however, you keep reporting falsely, straining the system, you will be penalised for it.
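The reporting flow above can be sketched like this. The class names, fields and point values are illustrative assumptions, not the actual Theodorus data model:

```python
# Hypothetical sketch of the anonymous reporting flow; all names and
# point values here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    credit: int = 0
    penalty: int = 0

@dataclass
class Message:
    text: str
    reporters: list = field(default_factory=list)  # stored, never displayed

def report(user: User, message: Message) -> None:
    """A single anonymous click; no classification is required at this stage."""
    message.reporters.append(user)

def settle(message: Message, inappropriate: bool) -> None:
    """After moderation rules, reward honest reporters and penalise false ones."""
    for reporter in message.reporters:
        if inappropriate:
            reporter.credit += 1   # civic duty rewarded
        else:
            reporter.penalty += 1  # false reports strain the system

alice = User("alice")
msg = Message("buy cheap watches!!!")
report(alice, msg)
settle(msg, inappropriate=True)
print(alice.credit)  # prints 1
```

Note that the reporter list is kept only so the system can credit or penalise after the verdict; nothing in the flow exposes who reported.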
Users who have gained enough experience with the system may choose to become a “Moderator”. A moderator has a special inbox with all the reports the various users have filed. For every report, a moderator must pass judgement: whether the item is OK or inappropriate for one of the reasons mentioned above (unlike the reporting user, the moderator needs to check a radio button).
Moderators get credited every time they settle a dispute.
Aren’t you worried that moderators will exploit their powers?
The moderator’s decision should be objective, so he doesn’t see who wrote the questionable message: he sees only the message itself, and if it’s a comment or a law-proposal, he’ll see the underlying idea as well. Assuming the community is very big and likely to be impersonal, this method strongly discourages nepotism.
Also, a single moderator is not enough to pass sentence; you actually need the approval of several moderators. The most-voted judgement is selected to be acted upon, and the moderators who selected it get additional credit.
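The most-voted-judgement rule could be sketched as follows; the verdict labels and function name are assumptions:

```python
# Hypothetical sketch of verdict aggregation among several moderators;
# verdict labels and the function name are illustrative assumptions.
from collections import Counter

def aggregate_verdicts(votes: dict) -> tuple:
    """votes maps a moderator to a verdict ('ok', 'spam', 'offensive', ...).
    The most-voted judgement is acted upon, and the moderators who chose it
    are returned so they can receive additional credit."""
    verdict, _ = Counter(votes.values()).most_common(1)[0]
    winners = [mod for mod, v in votes.items() if v == verdict]
    return verdict, winners

print(aggregate_verdicts({"m1": "spam", "m2": "spam", "m3": "ok"}))
# prints ('spam', ['m1', 'm2'])
```

A real deployment would also need a tie-breaking rule and a quorum size, which (as discussed below) each community sets for itself.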
It should be mentioned that having any amount of penalty points automatically suspends moderation capabilities; furthermore, some negative badges (“encouraging violence”, for example) will deny you access to moderation completely.
And lastly, moderators cannot perform other duties in the Theodorus environment (such as support or representation).
I noticed you keep avoiding numbers: how many moderators do you need to pass a judgement? How many penalty points can you get?
You’re very astute. Yes, I purposely avoided giving any numbers, because every community that decides to use Theodorus can set its own. They can actually hold a democratic vote on this, where every member of the community is allowed to offer an opinion on how many judges there should be per trial, and the average becomes the de facto decision. To be honest, I don’t know what the right numbers are; it requires a community to actually experiment and see what the proper values are.
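The averaging rule just described can be sketched in a few lines; rounding to the nearest whole number is my own assumption, since you can’t have a fractional number of judges:

```python
# Hypothetical sketch: each community sets its own numbers by averaging
# its members' votes. Rounding to an integer is an assumption here.
def community_parameter(member_votes: list) -> int:
    """The average of all members' opinions becomes the de facto decision."""
    return round(sum(member_votes) / len(member_votes))

# e.g. four members voting on how many moderators should judge each report:
print(community_parameter([3, 5, 7, 5]))  # prints 5
```

A median would be a reasonable alternative, as it is less sensitive to a few members voting extreme values.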
One last question (for now): do you think this concept will work elsewhere?
So, just to summarise the underlying concept: anyone can report a crime; a false report is a crime of its own; a random set of judges pass their verdict objectively. The punishment is set automatically upon the verdict and is a function of the crime and its recurrence.
To be honest, I doubt it will work elsewhere, especially in the real world:
First, today it’s considered legitimate to report inappropriate behaviour online. It’s also relatively easy. Real-world informants are far less appreciated, as it seems they are poking their nose into somebody else’s business.
Second, the scoring system, which has implications for the rest of the design, is also a motivating factor that cannot easily be translated to other environments.
And lastly, moderators actually function much like a real-world jury, only here they are people who volunteered because they believe they are making their community better, while in the real world it is an annoying duty most people would like to avoid.