How to protect the system and its users from harm, in gentle and unobtrusive ways. WhyWikiWorks.

Assumes some degree of rationality and/or human response on the part of the attacker, and thereby fails in the case of persistent (pathological) attack, or equivalently in the face of automated (non-human) attack.

''That's not entirely true. A SurgeProtector is considered SoftSecurity, as it doesn't close off voices, just over-exuberant ones.''

See the page on MeatBall for more on this.

See also: SharkBot

----

One way to do this would be to have a throttle/flood control. For example, a rule that a single editor cannot change more than 3 pages in a minute would make most automated attacks too slow to be practical (a rough sketch of such a throttle appears at the bottom of this page). Two questions arise:

* How do you figure out whether the entity/editor editing a page is the same as another? Cookies would be too obtrusive, and IP addresses may not work for users behind proxy servers. Maybe log traffic from a given IP and, if it is significantly higher than the historic traffic, throttle it?
* How do you save yourself from distributed attacks? Maybe have a global flood control: if too many edits are taking place at the same time, add delays to the response when the save button is pressed.

I think flood control is in keeping with the Wiki idea of openness. Any human can still come and edit, but the threat from malicious programs is limited. -- Kautilya

And in fact flood control of that sort is in place here.

How does the current flood control work? -- Kautilya

----

CategorySecurity CategoryWikiSecurity
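
----

A minimal sketch (in Python) of the per-editor throttle discussed above. It assumes editors are keyed by client IP address; the names EditThrottle, MAX_EDITS, and WINDOW_SECONDS are made up for illustration and are not part of any existing wiki engine. A real engine would still need to deal with proxies and distributed attacks as noted above.

 # Sliding-window flood control: at most MAX_EDITS edits per editor
 # within the last WINDOW_SECONDS seconds.
 import time
 from collections import defaultdict, deque

 MAX_EDITS = 3        # edits allowed per editor...
 WINDOW_SECONDS = 60  # ...within this sliding window

 class EditThrottle:
     def __init__(self):
         # ip -> timestamps of that editor's recent edits
         self.history = defaultdict(deque)

     def allow_edit(self, ip):
         now = time.time()
         recent = self.history[ip]
         # Discard timestamps that have fallen out of the window.
         while recent and now - recent[0] > WINDOW_SECONDS:
             recent.popleft()
         if len(recent) >= MAX_EDITS:
             return False   # over the limit: refuse (or delay) the save
         recent.append(now)
         return True

 # Usage: call allow_edit(client_ip) when the save button is pressed.
 throttle = EditThrottle()
 if not throttle.allow_edit("192.0.2.1"):
     print("Too many edits in the last minute; please wait.")

A global flood control against distributed attacks could reuse the same class with a single shared key instead of one key per IP, trading fine-grained fairness for a simple overall rate cap.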