The present work is situated in the context of (algorithmic) quality-control mechanisms on Wikipedia.
A whole ecosystem of actors strives to keep the anyone-can-edit encyclopedia as accurate and vandalism-free as possible.
We want to better understand the role of edit filters in the vandal-fighting network of humans, bots, semi-automated tools, and the machine learning framework ORES.
To this end, in the current chapter we study the scientific literature on vandalism in Wikipedia, the vandalism-fighting guidelines, and the quality-control mechanisms mentioned above.
After all, edit filters were introduced to Wikipedia comparatively late relative to the remaining mechanisms: in 2009. %TODO: when was the other stuff introduced
\section{Vandalism on Wikipedia}
%TODO put here papers on vandalism
\begin{comment}
\subsection{Vandalism: Original Research}
According to Wikipedia's newspaper, the Signpost, edit filters were initially introduced as a vandalism prevention mechanism~\cite{Signpost2009}.
The aim of this section is to provide a better understanding of vandalism on Wikipedia: What is vandalism, and what is not? Who engages in vandalism? Who strives to prevent it, and by what means?
...
And there are also users who dedicate a substantial amount of their Wikipedia activity specifically to combating vandalism.
These dedicated vandal fighters mostly work with the aid of (semi- or fully) automated tools, which not only significantly speed up the process (see below),
but, according to research, fundamentally changes the nature of the encyclopedia and its collaboration ecosystem~\cite{GeiRib2010}.