diff --git a/thesis/2-Background.tex b/thesis/2-Background.tex
index fdc049e7125498cfdbeb6622cef7cd6afb0825f9..1cc0f93a91b44a0f920b17f6bdc81f739bac753b 100644
--- a/thesis/2-Background.tex
+++ b/thesis/2-Background.tex
@@ -48,7 +48,7 @@ However, the ones we focus on here—the rapid response anti-vandalism agents su
 
 Different aspects of bots and their involvement in quality control have been investigated:
 In the paper referenced above, the researchers employ their method of trace ethnography (more on it in chapter~\ref{chap:methods}) to follow a disrupting editor around Wikipedia and comprehend the measures taken in collaboration by bots (ClueBot~\cite{Wikipedia:ClueBot} and HBC AIV helperbot7~\cite{Wikipedia:HBCAIVHelperbot}) as well as humans using semi-automated tools (Huggle~\cite{Wikipedia:Huggle} and Twinkle~\cite{Wikipedia:Twinkle}) until the malicious editor in question was eventually banned~\cite{GeiRib2010}.
-Halfaker and Riedl offer a historical review of bots and semi-automated tools and their involvement in vandal fighting~\cite{HalRied2012}, assembling a comprehensive list of tools and touching on their working principle (rule vs machine learning based).
+Halfaker and Riedl offer a historical review of bots and semi-automated tools and their involvement in vandal fighting~\cite{HalRied2012}, assembling a comprehensive list of tools and touching on their working principle (rule vs. machine learning based).
 They also develop a bot taxonomy we will come back to in chapter~\ref{chap:overview-en-wiki}. %TODO quote bot taxonomy here?
 In~\cite{GeiHal2013}, Geiger and Halfaker conduct an in-depth analysis of ClueBot NG, ClueBot's machine learning based successor, and its place within Wikipedia's vandal fighting infrastructure, concluding that quality control on Wikipedia is a robust process: most malicious edits eventually get reverted, even with some of the actors temporarily inactive, although at a different speed.
 They discuss the mean times to revert of different mechanisms, their observations coinciding with diagram~\ref{fig:funnel-no-filters},
diff --git a/thesis/4-Edit-Filters.tex b/thesis/4-Edit-Filters.tex
index 614d579e4f50b12fe1dc7fad2ec94b6476a7dadd..21d254b04e72e4bcdde771d21487b3baf1d4ac7a 100644
--- a/thesis/4-Edit-Filters.tex
+++ b/thesis/4-Edit-Filters.tex
@@ -169,11 +169,13 @@ for the period between the announcement that the extension is planned up until t
 For a while at the beginnings of the discussion, there was some confusion among editors regarding the intended functionality of the edit filters.
 Participants invoked various motivations for the introduction of the extension (which sometimes contradicted each other) and argued for or against the filters depending on these.
 The discussion reflects a mix of ideological and practical concerns.
-The biggest controversies lay along the lines of filters being public-vs-private (hidden from public view) and the actions the filters were to invoke upon a match.
+The biggest controversies lay along the lines of filters being public vs. private (hidden from public view)
+\footnote{The terms ``private'' and ``hidden'' are used interchangeably for such filters throughout the thesis.}
+and the actions the filters were to invoke upon a match.
 An automated rights revocation or a block of the offending editor with no manual confirmation by a real person was of particular concern to a lot of editors (they were worried that the filters would not be able to understand context, thus resulting in too many false positives and the blocking of many legitimate edits and editors).
 As far as I understood, these features were technically implemented but never really used on English Wikipedia.
 
-As to the public-vs-private debate, the initial plan was that all filters are hidden from public view and only editors with special permissions (the edit filter managers) were supposed to be able to view and modify the patterns and consult the logs.
+As to the public vs. private debate, the initial plan was that all filters are hidden from public view and only editors with special permissions (the edit filter managers) were supposed to be able to view and modify the patterns and consult the logs.
 The core developer of the extension reasoned that its primary purpose was to fend off really persistent vandals with reasonable technical understanding who were ready to invest time and effort to circumvent anti-vandal measures,
 and that it was therefore unwise to make circumvention easier for them by allowing them to view the pattern according to which their edits were suppressed.
 This was however met with serious resistance from the community, who felt that such a secret extension contradicted Wikipedia's values of openness and transparency.
@@ -390,7 +392,7 @@ and since they are triggered \emph{before} an edit is published–by not allowin
 Proponents of the filters reasoned that, by disallowing such malicious edits from the beginning, the extension would reduce the workload of other mechanisms and free up resources for vandal fighters using semi-automated tools or monitoring pages manually, so that they could work on less obvious cases requiring human judgement.
 
 %Structural/soft factors
-The rest of the arguments for edit filters vs bots touched on in the discussion prior to introducing filter~\cite{Wikipedia:EditFilterTalkArchive1} were more of infrastructural/soft nature. %TODO find a better description for this.
+The rest of the arguments for edit filters vs. bots touched on in the discussion prior to introducing the filters~\cite{Wikipedia:EditFilterTalkArchive1} were more of an infrastructural/soft nature. %TODO find a better description for this.
 The plugin's developers optimistically announced that it was going to be open source, the code well tested, with a framework for testing single filters before enabling them, and with edit filter managers being able to collaboratively develop and improve filters.
 They viewed this as an improvement compared to (admin) bots, which would be able to cover similar cases but whose code was mostly private, not tested at all, and maintained by a single developer/operator who was often not particularly responsive in emergency cases
 \footnote{For the sake of completeness, it should be mentioned here that the most popular semi-automated anti-vandalism tools are also open source.
diff --git a/thesis/5-Overview-EN-Wiki.tex b/thesis/5-Overview-EN-Wiki.tex
index cc8e04dbb94979eaf63d7bf26426e77b6313576c..3e4b95633b8e4768734e19b5a45f7b45b3b797ec 100644
--- a/thesis/5-Overview-EN-Wiki.tex
+++ b/thesis/5-Overview-EN-Wiki.tex
@@ -145,8 +145,8 @@ used to code all filters where the functionality stayed completely opaque for th
 \section{Filter characteristics}
 \label{sec:patterns}
 
-This section explores some general traits/patterns of/trends in the edit filters on Engish Wikipedia, or respectively the data from the \emph{abuse\_filter} table.
-The scripts that generate the statistics (syn?) discussed here, can be found in the jupyter notebook in the project's repository~\cite{gitlab}.
+This section explores some general features of the edit filters on English Wikipedia based on the data from the \emph{abuse\_filter} table.
+The scripts that generate the statistics discussed here can be found in the Jupyter notebook in the project's repository~\cite{gitlab}.
 
 
 \subsection{General traits}
@@ -154,10 +154,10 @@ The scripts that generate the statistics (syn?) discussed here, can be found in
 As of January 6th, 2019, there are $954$ filters in the \emph{abuse\_filter} table.
 It should be noted that if a filter gets deleted, merely a flag is set to indicate so; no entries are removed from the database.
 So, the above-mentioned $954$ filters are all filters ever created up to this date.
-This doesn't mean that it never changed what the single filters are doing, since edit filter managers can freely modify filter patterns, so at some point the filter could be doing one thing and in the next moment it can be filtering a completely different phenomenon.
-There are cases of filters being ``repurposed'' or modified to filter for example a more general occurance/phenomenon.
+This doesn't mean that the single filters' tasks never changed: edit filter managers can freely modify filter patterns, so at some point a filter could be doing one thing and the next moment it could be filtering a completely different phenomenon.
+There are cases of filters being ``repurposed'' or modified to filter, for example, a more general occurrence.
 This doesn't happen very often though.
-Mostly, if a filter is not useful anymore it is just disabled and eventually deleted and new filters are implemented for current problems.
+Mostly, if a filter is not useful anymore, it is just disabled and eventually deleted and new filters are implemented for current problems.
 
 $361$ of all filters are public; the remaining $593$ are hidden.
 $110$ of the public ones are active, $35$ are disabled but not marked as deleted, and $216$ are flagged as deleted.
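This breakdown can be reproduced from a dump of the \emph{abuse\_filter} table. The following is a minimal sketch assuming the standard AbuseFilter column names (\emph{af\_hidden}, \emph{af\_enabled}, \emph{af\_deleted}); the sample rows are invented for illustration and do not reflect the real table contents:

```python
from collections import Counter

# Toy rows standing in for abuse_filter records; the flag semantics
# follow the AbuseFilter schema, the values are made up for illustration.
filters = [
    # (af_id, af_hidden, af_enabled, af_deleted)
    (1, 0, 1, 0),  # public, enabled
    (2, 1, 1, 0),  # hidden, enabled
    (3, 0, 0, 0),  # public, disabled
    (4, 1, 0, 1),  # hidden, deleted
]

def classify(af_hidden, af_enabled, af_deleted):
    visibility = "hidden" if af_hidden else "public"
    # deletion is only a flag, so check it before the enabled bit
    if af_deleted:
        status = "deleted"
    elif af_enabled:
        status = "enabled"
    else:
        status = "disabled"
    return visibility, status

counts = Counter(classify(h, e, d) for _, h, e, d in filters)
print(counts)
```

Run against the real table, the same grouping yields the $361$/$593$ public/hidden split and the per-status counts quoted above.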
@@ -178,24 +178,25 @@ The community discussions rebutted that so a guideline was drafted calling for h
 ``only where necessary, such as in long-term abuse cases where the targeted user(s) could review a public filter and use that knowledge to circumvent it.''~\cite{Wikipedia:EditFilter}.
 This is however not always complied with and edit filter managers do end up hiding filters that target general vandalism despite consensus that these should be public~\cite{Wikipedia:PrivacyGeneralVandalism}.
 Such cases are usually made public eventually (examples hereof are filters 225 ``Vandalism in all caps'', 260 ``Common vandal phrases'', or 12 ``Replacing a page with obscenities'').
-Also, oftentimes when a hidden filter is marked as ``deleted'', it is made public. %TODO examples?
-%TODO this seems out of place
+Also, oftentimes when a hidden filter is marked as ``deleted'', it is made public.
+
 Further, caution in filter naming is suggested for hidden filters and editors are encouraged to give such filters just a simple description of the overall disruptive behaviour rather than naming a specific user that is causing the disruptions.
 (The latter is not always complied with; there are indeed filters named after the accounts causing a disruption.)
 
 Still, it is notable that currently nearly $2/3$ of all edit filters are not viewable by the general public (compare figure~\ref{fig:general-stats}).
-Unfortunately, without the full \emph{abuse\_filter\_history} table we cannot know how this ration has developed historically.
-However, the numbers fit the assertion of the extension's core developer according to whom edit filters target particularly determined vandals.
+Unfortunately, without the full \emph{abuse\_filter\_history} table there is no way to know how this ratio has developed historically.
+However, the numbers fit the assertion of the extension's core developer according to whom edit filters target particularly determined vandals (filters aimed at these are, as a general rule, hidden in order to make circumvention more difficult).
 
-On the other hand, if we look at the enabled filters only, there are actually more or less the same number of public enabled and hidden enabled filters ($110$ vs $91$).
+On the other hand, if we look at the enabled filters only, there are more or less the same number of enabled public and enabled hidden filters ($110$ vs. $91$).
+%TODO this is a kind of an interpretation. Take it out here and put it in the fazit of the chapter?
 This leads to the hypothesis that hidden filters have higher fluctuation rates, i.e. that they target specific phenomena that subside after a certain period of time, after which the filters get disabled and eventually deleted.
-This makes sense when we compare it to the hidden vs public filter policy: hidden filters for particular cases and very determined vandals, public filters for general patterns which reflect more timeless patterns.
+This again makes sense when compared to the hidden vs. public filter policy: hidden filters for particular cases and very determined vandals, public filters for general, more timeless patterns.
 
 
 \subsection{Filter actions}
 
-Another interesting parameter we could observe are the currently configured filter actions for each filter.
-Figure~\ref{fig:all-active-filters-actions} depicts the actions configured for all enabled filters.
+Another interesting characteristic observed here is the set of currently configured filter actions for each filter.
+Figure~\ref{fig:all-active-filters-actions} depicts the actions set up for all enabled filters.
 And figures~\ref{fig:active-public-actions} and~\ref{fig:active-hidden-actions} show the actions of all enabled public and hidden filters respectively.
 It is noticeable that the most common action for the enabled hidden filters is ``disallow'' whereas most enabled public filters are set to ``tag'' or ``tag,warn''.
 This is congruent with the community's claim that hidden filters target particularly persistent vandalism, which is best outright disallowed.
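The action distribution can be tallied directly from the \emph{af\_actions} field, which (in the AbuseFilter schema) stores each filter's actions as a comma-separated list such as ``tag,warn''. A minimal sketch with invented sample values:

```python
from collections import Counter

# Each entry stands in for one filter's af_actions value; the values
# below are invented for illustration. An empty string means the filter
# only logs its hits ("log only").
af_actions = ["disallow", "tag,warn", "tag", "tag,warn", ""]

action_counts = Counter(
    action
    for entry in af_actions
    for action in entry.split(",")
    if action  # skip the empty string of log-only filters
)
print(action_counts)
```

Plotted over the real table, this tally produces the distributions shown in figures~\ref{fig:active-public-actions} and~\ref{fig:active-hidden-actions}.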
@@ -271,8 +272,8 @@ The detailed distribution of manually assigned codes and their parent categories
 
 \subsection{Who trips filters}
 
-As of March 15, 2019 $16,489,266$ of the filter hits were caused by IP users, whereas logged in users had tripped a filter $6,984,897$ times.
-A lot of the logged in users have newly created accounts (many filters look for newly created or not confirmed accounts in their pattern). %TODO what is confirmed exactly.
+As of March 15, 2019, $16,489,266$ of the filter hits were caused by IP users, whereas logged-in users had matched an edit filter's pattern $6,984,897$ times.
+A lot of the logged-in users have newly created accounts (many filters look for newly created, or respectively, not confirmed accounts in their pattern). %TODO what is confirmed exactly: 4 days and 10 edits
 A user who just registered an account (or who doesn't even bother to) can rather be expected to be inexperienced with Wikipedia, not familiar with all policies and guidelines, and perhaps not with MediaWiki syntax either.
 
 It is also quite likely (to be verified against literature!) that the majority of vandalism edits come from the same type of newly/recently registered accounts.
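The IP vs. logged-in split can be derived from the \emph{abuse\_filter\_log} table; in the MediaWiki schema, anonymous (IP) editors are recorded with a user id of $0$ and their IP address as user text. A minimal sketch, with invented log rows standing in for the real data:

```python
# Each tuple stands in for one abuse_filter_log row: (afl_user, afl_user_text).
# Anonymous editors are assumed to be logged with user id 0 (MediaWiki
# convention); the rows below are invented for illustration.
hits = [
    (0, "192.0.2.17"),
    (42, "SomeLoggedInUser"),
    (0, "198.51.100.3"),
    (0, "192.0.2.17"),
    (7, "AnotherUser"),
]

ip_hits = sum(1 for afl_user, _ in hits if afl_user == 0)
logged_in_hits = len(hits) - ip_hits
print(ip_hits, logged_in_hits)
```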
diff --git a/thesis/6-Discussion.tex b/thesis/6-Discussion.tex
index 67964e85c3d1461cdb5e911203f6ef3c29d5e114..0eb064edf6149806a8f03703e003884c006605ad 100644
--- a/thesis/6-Discussion.tex
+++ b/thesis/6-Discussion.tex
@@ -41,12 +41,12 @@ The argument that someone powered off the basement computer on which they were r
 % more on bots vs filters
 % collaboration possible on filters?
 % who edits filters (edit filter managers, above all trusted admins) and who edits bots (in theory anyone approved by the BAG)
-Above all the distinction of bots vs filters: what tasks are handled by which mechanism and why? slides (syn!) into the foreground over and over aagain.
+Above all, the distinction of bots vs. filters: the question of which tasks are handled by which mechanism and why comes into the foreground over and over again.
 After all the investigations, I would venture the claim that from an end-result perspective it probably doesn't make a terrible difference at all.
 As mentioned in the paragraph above, whether malicious content is directly disallowed or reverted 2 seconds later (in which time perhaps 3 users have seen it, or not) is hardly a qualitative difference for Wikipedia's readers. %TODO (although I'm making a slightly different point in the paragraph above, clean up!)
 I would argue though that there are other stakeholders for whom the choice of mechanism makes a bigger difference:
 the operators of the quality control mechanisms and the users whose edits are being targeted.
-The difference (syn!) for edit filter managers vs bot developers is that the architecture of the edit filter plugin supposedly fosters collaboration which results in a better system (compare with the famous ``given enough eyeballs, all bugs are shallow''~\cite{Raymond1999}).
+The difference (syn!) for edit filter managers vs. bot developers is that the architecture of the edit filter plugin supposedly fosters collaboration which results in a better system (compare with the famous ``given enough eyeballs, all bugs are shallow''~\cite{Raymond1999}).
 Any edit filter manager can modify a filter causing problems and the development of a single filter is mostly a collaborative (syn!) process.
 A look at the history of most filters reveals that they have been updated multiple times by various users.
 In contrast, bots' source code is often not publicly available and they are mostly run by one operator only, so no real peer review of the code is practiced and the community has time and again complained of unresponsive bot operators in emergency cases.
@@ -189,7 +189,7 @@ Throughout the thesis, a variety of intriguing questions arose which couldn't be
 Here, a comprehensive list of all these pointers for possible future research is provided.
 
 \begin{enumerate}
-    \item \textbf{How have edit filters's tasks evolved over time?}: Unfortunately, no detailed historical analysis of the filters was possible, since the database table storing changes to individual filters (\emph{abuse\_filter\_history}) is not currently replicated (see section~\ref{sec:overview-data}). A patch aiming to renew the replication of the table is currently under review~\cite{gerrit-tables-replication}. When a dump becomes available, an extensive analysis (sym) of filter creation and activation patterns, together with .. will be possible (syn).
+    \item \textbf{How have edit filters' tasks evolved over time?}: Unfortunately, no detailed historical analysis of the filters was possible, since the database table storing changes to individual filters (\emph{abuse\_filter\_history}) is not currently replicated. As mentioned in section~\ref{sec:overview-data}, a patch aiming to renew the replication of the table is currently under review~\cite{gerrit-tables-replication}. When a dump becomes available, an extensive analysis of filter creation and activation patterns, together with .. will be possible.
         (Actually there is some historical stuff: e.g. temporal overview of hits, broken down by filter action... Beware however, it is the *current* filter action they were plotted with and it is very possible that the corresponding filters had a different action switched on some time ago. %TODO check whether that's actually true
         (or another visibility level, different filter pattern which would've resulted in a different manual tag)
     \item \textbf{What are the differences between how filters are governed on EN Wikipedia compared to other language versions?}: Different Wikipedia language versions each have a local community behind them. %TODO quote?
@@ -212,7 +212,7 @@ There are also various complaints/comments by users bewildered that their edits
     \item \textbf{What are the urgent situations in which edit filter managers are given the freedom to act as they see fit and ignore best practices of filter adoption (i.e. switch on a filter in log only mode first and announce it on the notice board so others can have a look)? Who determines they are urgent?}: I think these cases should be scrutinised extra carefully since ``urgent situations'' have historically always been an excuse for cuts in civil liberties.
 %* is there a qualitative difference between complaints of bots and complaints of filters?
     \item \textbf{Is there a qualitative difference between the tasks/patterns of public and hidden filters?}: We know of one general rule of thumb (cite!) according to which general filters are to be public while filters targeting particular users are hidden. Is there something more to be learnt from an actual examination of hidden filters? One will have to request access to them for research purposes, sign an NDA, etc.
-    \item \textbf{Do edit filter managers specialize on particular types of filters (e.g. vandalism vs good faith?)} \emph{abuse\_filter\_history } table is needed for this
+    \item \textbf{Do edit filter managers specialize on particular types of filters (e.g. vandalism vs. good faith)?} The \emph{abuse\_filter\_history} table is needed for this
     \item \textbf{What proportion of quality control work do filters take over?}: compare filter hits with number of all edits and reverts via other quality control mechanisms
     \item \textbf{Do edit filter managers stick to the edit filter guidelines?}: e.g. filters shouldn't be implemented for trivial problems (such as spelling mistakes); problems with specific pages are generally better taken care of by protecting the page, and problematic titles by the title blacklist; general filters shouldn't be hidden
 \end{enumerate}