* What is the role of filters among the existing (algorithmic) quality-control mechanisms (bots, semi-automated tools, ORES, humans)? Which types of tasks do filters take over?
* How have these tasks evolved over time (are there changes in type, number, etc.)?
* What are suitable areas of application for rule-based systems such as filters, in contrast to the ML-based approaches?
---
## What is an edit filter
* MediaWiki extension (AbuseFilter)
* regex-based filtering of edits and other actions (e.g. account creation, page deletion or move, file upload)
* triggers *before* an edit is published
* different actions can be configured for a matching filter (e.g. log, tag, warn, disallow); see the sketch below
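
A minimal conceptual sketch in Python (illustrative only; the class, pattern, and action names are assumptions, not the actual AbuseFilter implementation):

```python
import re
from dataclasses import dataclass

# Toy model of a regex-based edit filter: a pattern checked against the
# submitted text plus a list of actions to take when it matches.
@dataclass
class EditFilter:
    name: str
    pattern: str        # regex matched against the text being added
    actions: list       # e.g. ["log"], ["warn", "tag"], ["disallow"]

    def matches(self, added_text: str) -> bool:
        return re.search(self.pattern, added_text, re.IGNORECASE) is not None

def check_edit(added_text: str, filters: list) -> list:
    """Runs *before* the edit is saved; collects the actions of all matching filters."""
    return [(f.name, f.actions) for f in filters if f.matches(added_text)]

# Hypothetical filter against repeated-character vandalism
filters = [EditFilter("repeated characters", r"(.)\1{20,}", ["warn", "tag"])]
print(check_edit("aaaaaaaaaaaaaaaaaaaaaaaaaaaaa", filters))
# -> [('repeated characters', ['warn', 'tag'])]
```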
---
## Motivations for its introduction
* disallow certain types of obvious, pervasive (perhaps automated) vandalism directly
* such vandalism takes more than a single click to revert manually
* human editors can use their time more productively elsewhere
---
## Edit filters in the quality-control mechanisms frame
* the question of infrastructure
* guidelines say: for in-depth checks and problems with a particular article, bots are better suited (filters run against every edit and should not use up resources)
* filters were introduced before the ML tools came around
* they probably work well enough, so no one sees a reason to shut them down
---
* hypothesis: Wikipedia is a DIY project driven by volunteers; they work on whatever they like to work on
* hypothesis: with a filter it is easier to understand what is going on than with an ML tool; people like to use filters for simplicity and transparency reasons
* hypothesis: it is easier to set up a filter than to program a bot. Setting up a filter requires "only" an understanding of regular expressions. Programming a bot requires knowledge of a programming language and an understanding of the API (see the sketch below).
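
To make the last hypothesis concrete, a rough sketch under assumptions: the filter side boils down to a single regex condition, while even a trivial bot edit goes through the MediaWiki API (shown here with pywikibot; the pattern, page title, and edit summary are made-up examples):

```python
# Filter side: essentially one regex condition over the submitted text
# (hypothetical pattern, written as a plain Python regex for illustration).
FILTER_CONDITION = r"(?i)buy\s+cheap\s+pills"

# Bot side: even a minimal edit needs a programming language plus knowledge
# of the MediaWiki API (pywikibot shown; title and summary are examples).
import pywikibot

site = pywikibot.Site("en", "wikipedia")
page = pywikibot.Page(site, "Wikipedia:Sandbox")    # example target page
page.text += "\n<!-- example bot edit -->"          # modify the wikitext
page.save(summary="Bot: example edit")              # write back via the API
```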