diff --git a/thesis/5-Overview-EN-Wiki.tex b/thesis/5-Overview-EN-Wiki.tex
index 24597cd41f4ad549da6e34d539612922fa2789f7..9f797f1c4e333687e1a3707f610f236d2a45b5c1 100644
--- a/thesis/5-Overview-EN-Wiki.tex
+++ b/thesis/5-Overview-EN-Wiki.tex
@@ -348,127 +348,55 @@ It is signaled, that the mailing list is meant for sensitive cases only and all
 %TODO decide whether to include this here or move back to actions
   ** there's a tendency of editors to hide filters just for the heck of it (at least there are never clear reasons given), which is then reverted by other editors with the comment that it is not needed: 148, 225 (consesus that general vandalism filters should be public \url{[Special:Permalink/784131724#Privacy of general vandalism filters]}), 260 (similar to 225), 285 (same), 12 (same), 39 (unhidden with the comment "made filter public again - these edits are generally made by really unsophisticated editors who barely know how to edit a page. --zzuuzz")
 
+
 \section{Types of edit filters: Manual Classification}
 \label{sec:manual-classification}
 
-Apart from filter typologies that can be derived directly from the DB schema (available fields/existing features), we propose a manual classification of the types of edits edit filters found on the EN Wikipedia target (there are edit filters with different purposes).
+The aim of this section is to gain a better understanding of what exactly edit filters are filtering.
+Following the grounded theory methodology presented in chapter~\ref{chap:methods}, I applied emergent coding to all filters, scrutinising their patterns, comments and actions.
+%TODO Comment on exact process of coding (check with coding book, I think a lot is explained there already)
+
+Three big clusters of filters were identified, namely ``vandalism'', ``good faith'' and ``maintenance''. %TODO define what each of them are
 
-Based on the GT methodology, I scrutinised all filters, with their patterns, comments and actions. %TODO define more precisely what exactly are we studying
-We found 3 big clusters of filters that we labeled ``vandalism'', ``good faith'' and ``maintenance''.
 It was not always a straightforward decision to determine what type of edits a certain filter is targeting.
-This was of course, particularly challenging for private filters where only the public comment (name) of the filter was there to guide us.
+This was, of course, particularly challenging for private filters, where only the public comment (name) of the filter was available to guide the coding.
 On the other hand, guidelines state up-front that filters should be hidden only in cases of particularly persistent vandalism, so it is probably safe to assume that all hidden filters target some type of vandalism.
 However, the classification was difficult for public filters as well, since oftentimes what makes the difference between a good-faith and a vandalism edit is not the content of the edit but the intention of the editor.
-While there are cases of juvenile vandalism (putting random swear words in articles) or characters repetiton vandalism which are pretty obvious, that is not the case for sections or articles blanking for example. %TODO explain why
+While there are cases of juvenile vandalism (putting random swear words into articles) or character-repetition vandalism which are pretty obvious, the same cannot be said of, for instance, the blanking of sections or articles.
+For these, there is no way of knowing from the edit alone whether the deletion was malicious or whether the editor was simply not familiar with, say, the correct procedure for moving an article.
+
+%TODO compare with code book and kick the paragraph out
 In such ambiguous cases, we can be guided by the action the filter triggers (if it is ``disallow'' the filter is most probably targeting vandalism).
 In the end, we labeled most ambiguous cases with both ``vandalism'' and ``good faith''.
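+
+To make the role of these hints more tangible, the following is a minimal sketch (in Python) of the heuristic just described; it is an illustration only and not the actual coding procedure. It assumes that each filter is available as a dictionary whose keys \texttt{af\_hidden} and \texttt{af\_actions} correspond to columns of the \texttt{abuse\_filter} table; the function name is purely hypothetical.
+
+\begin{verbatim}
+def suggest_labels(filter_row):
+    """Suggest candidate labels for one filter.
+
+    filter_row: dict with the abuse_filter columns af_hidden (bool)
+    and af_actions (comma-separated string of actions).
+    """
+    actions = filter_row.get("af_actions", "").split(",")
+    labels = set()
+    if filter_row.get("af_hidden"):
+        # Guidelines reserve hiding for particularly persistent vandalism.
+        labels.add("vandalism")
+    if "disallow" in actions:
+        # A disallowing filter most probably targets vandalism.
+        labels.add("vandalism")
+    if not labels:
+        # No automatic hint: pattern and comments have to be read manually;
+        # genuinely ambiguous filters received both labels in the end.
+        labels.add("manual inspection needed")
+    return labels
+\end{verbatim}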
 
-In the subsections that follow we discuss the salient properties of each manually labeled category.
 
-\begin{comment}
-    \item how often were (which) filters triggered: see \url{filter-lists/20190106115600_filters-sorted-by-hits.csv} and~\ref{tab:most-active-actions}; see also jupyter notebook for aggregated hitcounts over tagged categories
-    \item percentage filters of different types over the years: according to actions (I need a complete abuse\_filter\_log table for this!); according to self-assigned tags %TODO plot!
-\end{comment}
+%TODO include here a diagram with overview of the categories distribution
 
-%TODO What were the first filters to be implemented immediately after the launch of the extension?
-The extension was launched on March 17th, 2009.
-Filter 1 is implemented in the late hours of that day.
-Filters with IDs 1-80 (IDs are auto-incremented) were implemented the first 5 days after the extension was turned on (17-22.03.2009).
-So, apparently the most urgent problems the initial edit filter managers perceived were:
-page move vandalism (what Filter 1 initially targeted; it was later converted to a general test filter);
-blanking articles (filter 3)
-personal attacks (filter 9,11) and obscenities (12)
-some concrete users/cases (hidden filters, e.g. 4,21) and sockpuppetry (16,17)
+In the subsections that follow, the salient properties of each manually labeled category are discussed.
 
-Following filter categories have been identified (sometimes, a filter was labeled with more than one tag):
-%TODO make a diagramm with these
-- Vandalism
-  - hoaxing
-  - silly vandalism (e.g. repeating characters, inserting swear words)
-  - spam
-  - sockpuppetry
-  - long term abuse // there seems to be separate documentation for this, see notes;
-  - harassment/personal attacks
-    - doxxing
-    - impersonation
-  - trolling
-  - copyright violation
-
-  Labeled along the vandalism typology (check above)
-  - link vandalism
-  - abuse of tags
-  - username vandalism
-  - image vandalism
-  - avoidant vandalism
-  - talk page vandalism
-  - page move vandalism
-  - template vandalism
-  - vandalbots
-
-  Kind of similar:
-  - seo
-  - stockbroker vandalism
-  - biased pov
-  - self promotion
-  - conflict of interest
-
-Inbetween
-- edit warring
-- political controversy
-- politically/religiously motivated hate
-
-- Good faith
-  - bad style ("unencyclopedic edits" e.g. citing a blog or mentioning a hypothetical future album release)
-  - lazyness
-
-
-- Maintenance
-  - bugs
-  - wiki policy (compliance therewith)
-  - test filters
-
-%TODO: develop and include memos
 \subsection{Vandalism}
-\begin{comment}
-# Filters targetting vandalism
 
-The vast majority of edit filters on EN Wikipedia could be said to target (different forms of) vandalism.
-Examples herefor are filters for *juvenile* types of vandalism (inserting swear or obscene words or nonsence sequences of characters into articles), for *hoaxing* or for *link spam*.
-In principle, one can open quite a few subcategories here (also check https://en.wikipedia.org/wiki/Wikipedia:Vandalism for a "in-house" classification of vandalism types on Wikipedia).
-Some vandalism types seem to be more severe than others (*sock puppetry* or persistant *long term* vandals).
-For these, often times, the implemented filters are **private**.
+The vast majority of edit filters on EN Wikipedia could be said to target (different forms of) vandalism, i.e. maliciously intended disruptive editing.
+Examples thereof are filters targeting juvenile types of vandalism (inserting swear or obscene words or nonsense sequences of characters into articles), hoaxing (inserting obvious or less obvious false information into articles), or template vandalism (modifying a template in a disruptive way, which is quite severe since templates are displayed on a variety of pages).
+A more elaborate subclassification was conducted; all codes belonging to the vandalism cluster, together with definitions and examples, can be consulted in the code book attached in appendix~\ref{}.
+
+Some vandalism types seem to be more severe than others (e.g.\ sock puppetry or persistent long-term vandals).
+It is mostly in these cases that the implemented filters are hidden.
+
+%TODO where is the best place for this? I've got the feeling it's explained somewhere already and here it's quite late
+\begin{comment}
 This means, only edit filter editors can view the exact filter pattern or the comments of these.
 Although this clashes with the overall *transparency* of the project (is there a guideline subscribing to this value? couldn't find a specific mention), the reasoning here is that otherwise, persistent vandals will be able to check for the pattern of the filter targetting their edits and just find a new way around it~\cite{Wikipedia:EditFilter}. %TODO compare with https://en.wikipedia.org/w/index.php?title=Wikipedia:About&oldid=891256910 about transparency as a value
-There are also private filters targetting personal attack or abuse cases.
-Here, filters are private in order to protect the affected person(s)~\cite{Wikipedia:EditFilter}.
 
 The current state is also an "improvement" compared to the initially proposed visibility level of edit filters.
 In the initial version of the EditFilters Page (https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter&oldid=221158142) Andrew Garrett (User:Werdna), the author of the AbuseFilter MediaWiki extension, was suggesting that all filters should be private and only a group of previously approved users should be able to view them.
     (This was met by the community with a strong resistence, especially since at the time one of the most discussed features was the ability of filters to (temporarily) block users. Editors involved in the discussion felt strongly that no fully automated agent should be able to block human editors.)
+\end{comment}
 
-According to https://en.wikipedia.org/wiki/Wikipedia:Vandalism following (mostly disruptive) behaviours are **not vandalism**:
-- boldly editing
-- copyright violation
-- disruptive editing or stubbornness --> edit warring
-- edit summary omission
-- editing tests by experimenting users: "Such edits, while prohibited, are treated differently from vandalism"
-- harassment or personal attacks: "Personal attacks and harassment are not allowed. While some harassment is also vandalism, such as user page vandalism, or inserting a personal attack into an article, harassment in itself is not vandalism and should be handled differently."
-- Incorrect wiki markup and style
-- lack of understanding of the purpose of wikipedia: "editing it as if it were a different medium—such as a forum or blog—in a way that it appears as unproductive editing or borderline vandalism to experienced users."
-- misinformation, accidental
-- NPOV contraventions (Neutral point of view)
-- nonsense, accidental: "sometimes honest editors may not have expressed themselves correctly (e.g. there may be an error in the syntax, particularly for Wikipedians who use English as a second language)."
-- Policy and guideline pages, good-faith changes to: "If people misjudge consensus, it would not be considered vandalism;"
-- Reversion or removal of unencyclopedic material, or of edits covered under the biographies of living persons policy: "Even factually correct material may not belong on Wikipedia, and removing such content when it is not in line with Wikipedia's standards is not vandalism."
-- Deletion nominations: "Good-faith nominations of articles (or templates, non-article pages, etc) are not vandalism."
-
-Several of these behaviours could actually be conceived as **good faith** edits.
-And, for several of them (as noted in the **good faith memo**), it is not immediately distinguishable whether it's a **good faith** or a **vandalism** edit.
-Ultimately, the "only" difference between the two arises from the motivation/context of the edit.
-
-## Properties/Characteristics
+There are also private filters targeting personal attack or abuse cases.
+Here, filters are private in order to protect the affected person(s)~\cite{Wikipedia:EditFilter}.
+A dedicated subcluster, ``hardcore vandalism'' (syn!), was defined for these cases.
 
-- maliciously intended disruptive editing
 
 motivations:
 - seeking attention
@@ -495,8 +423,35 @@ One of the strategies to spot vandalism is "Watching for edits tagged by the abu
     Level two: {{subst:uw-vandalism2}} This warning is also fairly mild, though it explicitly uses the word 'vandalism' and links to this Wikipedia policy.
     Level three: {{subst:uw-vandalism3}} This warning is sterner. It is the first to warn that further disruptive editing or vandalism may lead to a block.
     Level four: {{subst:uw-vandalism4}} This is the sharpest vandalism warning template, and indicates that any further disruptive editing may lead to a block without warning."
+
+
+\subsection{Disruptive Editing}
+
+According to \url{https://en.wikipedia.org/wiki/Wikipedia:Vandalism}, various behaviours are (highly) disruptive albeit not vandalism.
+Filters aimed at this kind of conduct were identified and grouped into the ``disruptive editing'' cluster. %TODO elaborate with code book
+
+\begin{comment}
+- boldly editing
+- copyright violation
+- disruptive editing or stubbornness --> edit warring
+- edit summary omission
+- editing tests by experimenting users: "Such edits, while prohibited, are treated differently from vandalism"
+- harassment or personal attacks: "Personal attacks and harassment are not allowed. While some harassment is also vandalism, such as user page vandalism, or inserting a personal attack into an article, harassment in itself is not vandalism and should be handled differently."
+- Incorrect wiki markup and style
+- lack of understanding of the purpose of wikipedia: "editing it as if it were a different medium—such as a forum or blog—in a way that it appears as unproductive editing or borderline vandalism to experienced users."
+- misinformation, accidental
+- NPOV contraventions (Neutral point of view)
+- nonsense, accidental: "sometimes honest editors may not have expressed themselves correctly (e.g. there may be an error in the syntax, particularly for Wikipedians who use English as a second language)."
+- Policy and guideline pages, good-faith changes to: "If people misjudge consensus, it would not be considered vandalism;"
+- Reversion or removal of unencyclopedic material, or of edits covered under the biographies of living persons policy: "Even factually correct material may not belong on Wikipedia, and removing such content when it is not in line with Wikipedia's standards is not vandalism."
+- Deletion nominations: "Good-faith nominations of articles (or templates, non-article pages, etc) are not vandalism."
 \end{comment}
 
+Several of these behaviours could actually be conceived as ``good faith'' edits.
+And, for several of them (as noted in the good faith memo), it is not immediately distinguishable whether it is a ``good faith'' or a ``vandalism'' edit.
+Ultimately, the ``only'' difference between the two arises from the motivation/context of the edit.
+
+
 \subsection{Good Faith}
 \begin{comment}
 # Good faith edits
@@ -585,7 +540,7 @@ A user who just registered an account is most probably inexperienced with Wikipe
 
 It is also quite likely (to be verified against literature!) that majority of vandalism edits come from the same type of newly/recently registered accounts.
 In general, it is highly unlikely that an established Wikipedia editor should at once jeopardise the encyclopedia's purpose and start vandalising.
-\end{coment}
+\end{comment}
 
 \subsection{Maintenance}
 
@@ -622,6 +577,16 @@ Most of them do log only.
         I actually think, a bot fixing this would be more appropriate.
 \end{comment}
 
+%TODO What were the first filters to be implemented immediately after the launch of the extension?
+The extension was launched on March 17th, 2009.
+Filter 1 was implemented in the late hours of that day.
+Filters with IDs 1--80 (IDs are auto-incremented) were implemented within the first five days after the extension was switched on (17--22 March 2009).
+So, apparently, the most urgent problems the initial edit filter managers perceived were:
+\begin{itemize}
+    \item page move vandalism (what Filter 1 initially targeted; it was later converted to a general test filter);
+    \item blanking articles (filter 3);
+    \item personal attacks (filters 9 and 11) and obscenities (filter 12);
+    \item some concrete users/cases (hidden filters, e.g.\ 4 and 21) and sockpuppetry (filters 16 and 17).
+\end{itemize}
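+
+A starting point for retracing this list is the AbuseFilter API: the sketch below (in Python, using the \texttt{requests} library) queries the module \texttt{list=abusefilters} for the filters with IDs 1 to 80, assuming that module is enabled on the English Wikipedia. It is an illustration only; the API returns the filters' \emph{current} descriptions and actions, which may have changed since 2009, and the details of private filters are not disclosed.
+
+\begin{verbatim}
+import requests
+
+# Fetch public information on filters with IDs 1-80 from the English
+# Wikipedia AbuseFilter API (illustrative sketch, current state only).
+response = requests.get(
+    "https://en.wikipedia.org/w/api.php",
+    params={
+        "action": "query",
+        "list": "abusefilters",
+        "abfstartid": 1,
+        "abfendid": 80,
+        "abflimit": 80,
+        "abfprop": "id|description|actions",
+        "format": "json",
+    },
+).json()
+
+for f in response["query"]["abusefilters"]:
+    print(f["id"], f.get("description", ""), f.get("actions", ""))
+\end{verbatim}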
+
 \section{Fazit}
 
 \begin{comment}