diff --git a/thesis/4-Edit-Filters.tex b/thesis/4-Edit-Filters.tex
index c832dc0ff11d5c720166e7bcbdffa46f2f4bc998..510a701fc841429888c883bcf288102f16998d47 100644
--- a/thesis/4-Edit-Filters.tex
+++ b/thesis/4-Edit-Filters.tex
@@ -8,7 +8,7 @@
 
 The extension, or at least its end user facing parts, was later renamed to ``edit filter'' in order to not characterise false positives as ``abuse'' and thus alienate good faith editors striving to improve the encyclopedia~\cite{Wikipedia:EditFilter},~\cite{Wikipedia:EditFilterTalkArchiveNameChange}.
 
-In this chapter, we aim to understand how edit filters work, who implements and runs them and above all, how and why they were introduced in the first place and what the qualitative difference is between them and other algorithmic quality control mechanisms.
+The aim of this chapter is to understand how edit filters work, who implements and runs them and above all, how and why they were introduced in the first place and what the qualitative difference is between them and other algorithmic quality control mechanisms.
 The analysed data is presented in the following section.
 Section~\ref{sec:4-def} defines what an edit filter is.
 The AbuseFilter MediaWiki extension is introduced in section~\ref{sec:mediawiki-ext}.
@@ -48,7 +48,7 @@ And what are the patterns of harmful editing addressed by the filters?
 
 At least the ``mainly'' question is swiftly answered by the paragraph itself, since there is a footnote stating that ``[e]dit filters can and have been used to track or tag certain non-harmful edits, for example addition of WikiLove''~\cite{Wikipedia:EditFilter}.
 The controls that can be set are looked into in the sections that follow.
-We discuss (who is in) the edit filter manager group in section~\ref{section:who-can-edit} and the patterns of harmful editing (as well as some further non-harmful edit patterns) are inspected in detail in the next chapter.
+The edit filter manager group and its members are discussed in section~\ref{section:who-can-edit} and the patterns of harmful editing (as well as some further non-harmful edit patterns) are inspected in detail in the next chapter.
 
 \subsection{Example of a Filter}
 
@@ -285,7 +285,7 @@ A discussion is held there, usually for 7 days, before a decision is reached~\ci
 \footnote{According to the documentation, the Edit Filter Noticeboard is also the place to discuss potential permission withdraws in cases of misuse where raising the issue directly with the editor concerned has not resolved the problem.}.
 
 As of 2017, when the ``edit filter helper'' group was introduced (editors in this group have the \emph{abusefilter-view-private} permission)~\cite{Wikipedia:EditFilterHelper},
-the usual process seems to be that editors request this right first and only later the full \emph{abusefilter-modify} permissions\footnote{That is the tendency we observe at the Edit filter noticeboard~\cite{Wikipedia:EditFilterNoticeboard}.}.
+the usual process seems to be that editors request this right first and only later the full \emph{abusefilter-modify} permissions\footnote{That is the tendency observed at the Edit filter noticeboard~\cite{Wikipedia:EditFilterNoticeboard}.}.
 
 According to the edit filter managers list for the EN Wikipedia~\cite{Wikipedia:EditFilterManagersList}, as of 10 May 2019 there are 154 users in this group
 \footnote{For comparison, as of 9 March 2019 there are 1181 admins~\cite{Wikipedia:Admins}. The role does not exist at all on the German, Spanish and Russian Wikipedias where all administrators have the \emph{abusefilter\_modify} permission~\cite{Wikipedia:EditFilterDE}, \cite{Wikipedia:EditFilterES}, \cite{Wikipedia:EditFilterRU}.}.
@@ -352,7 +352,7 @@ and all edits that trigger an edit filter are listed in the Abuse Log~\cite{Wiki
 \section{Edit Filters' Role in the Quality Control Ecosystem}
 \label{sec:filters-role-qc-system}
 
-The purpose of the present section is to review what we have learnt so far about edit filters and summarise how they fit in Wikipedia's quality control ecosystem.
+The purpose of the present section is to review what has been learnt so far about edit filters and summarise how they fit in Wikipedia's quality control ecosystem.
 
 As timeline~\ref{fig:timeline} shows, the time span in which algorithmic quality control mechanisms (first vandal fighting bots and semi-automated tools, and later filters) were introduced fits logically the period after the exponential growth of Wikipedia took off in 2006 (compare figures~\ref{fig:editors-development},~\ref{fig:edits-development}).
 The surge in numbers of editors and contributions implied a rapidly increasing workload for community members dedicated to quality assurance
@@ -398,7 +398,7 @@ As shown elsewhere~\cite{HalGeiMorRied2013}, this shift had a lot of repercussio
 
 As we can read from timeline~\ref{fig:timeline}, edit filters were introduced at a moment when bots and semi-automated tools were already in place.
 Thus, the question arises: Why were they implemented when already these other mechanisms existed?
-Here, we review the salient features of the different quality control mechanisms and the motivation for the filters' introduction.
+Here, the salient features of the different quality control mechanisms and the motivation for the filters' introduction are reviewed.
 A concise summary of this discussion is offered in table~\ref{table:mechanisms-comparison}.
 
 Since edit filters are a fully automated mechanism, above all a comparison to bots seems obvious.
@@ -530,7 +530,7 @@ Application areas    |
 %\subsection{Collaboration with bots (and semi-automated tools)}
 \label{subsection:collaboration-bots-filters}
 
-So far we have juxtaposed the single quality control mechanisms and compared them separately.
+So far, the individual quality control mechanisms have been juxtaposed and compared separately.
 It is however worth mentioning that they not only operate alongside each other but also cooperate on occasions.
 
 Such collaborations are studied for instance by Geiger and Ribes~\cite{GeiRib2010} who go as far as describing them as ``distributed cognition''.
@@ -553,7 +553,7 @@ The AbuseFilterIRC task ``[r]elays all edit filter hits to IRC channels and allo
 
 On the other hand, there are also examples for filters supporting bot work:
 Filter 323 (``Undoing anti-vandalism bot'') tags edits reverting revisions by XLinkBot and ClueBot NG.
-Although it is hidden, so we cannot view any details, filter 603 is named ``Special case of reverting XLinkBot reverts'' so it is probably safe to assume that is filtering what it claims to be.
+Although it is hidden, so no details can be viewed by an unauthorised user, filter 603 is named ``Special case of reverting XLinkBot reverts'', so it is probably safe to assume that it is filtering what it claims to be.
 And there are several filters (historically) configured to ignore particular bots: filter 76 (``Adding email address'') exempting XLinkBot, filter 28 (``New user redirecting an existing substantial page or changing a redirect'') exempting Anybot, filter 532 (``Interwiki Addition'') exempting Cydebot are some examples thereof.
 There are also filters configured to ignore all bots: filter 368 (``Making large changes when marking the edit as minor''), filter 702 (``Warning against clipboard hijacking''), filter 122(``Changing Username malformed requests'').
 
@@ -564,14 +564,15 @@ Moreover, on occasions, data from the Abuse Log is used for (semi-)protecting fr
 \subsection{Conclusions}
 %Conclusion, resume, bottom line, lesson learnt, wrap up
 
-In short, in this chapter we studied edit filters' documentation and community discussions and worked out the salient characteristics of this mechanism.
-We also compared the filters to other quality control technologies on Wikipedia such as bots, semi-automated anti-vandalism tools and the machine learning framework ORES.
-We considered edit filters in the context and time of their introduction and concluded that the community implemented them as a means to fight obvious, particularly persistent, and cumbersome to remove vandalism by disallowing it on the spot.
+In short, this chapter studied edit filters' documentation and community discussions and worked out the salient characteristics of this mechanism.
+Moreover, the filters were compared to other quality control technologies on Wikipedia such as bots, semi-automated anti-vandalism tools and the machine learning framework ORES.
+Edit filters were considered in the context and time of their introduction and it was concluded that the community implemented them as a means to fight obvious, particularly persistent, and cumbersome-to-remove vandalism by disallowing it on the spot.
 Other ``softer'' arguments such as dissatisfaction with bot development processes (poorly tested, non-responsive operators) seemed to encourage the introduction as well.
 It was found that the individual filters are implemented and maintained by edit filter managers, a special highly-restricted user group.
 
-Revising the quality control ecosystem diagram~\ref{fig:funnel-no-filters} introduced in chapter~\ref{chap:background}, we can now properly place the filters on it (see figure~\ref{fig:funnel-with-filters}),
-and conclude that claims of the literature (see section~\ref{section:bots}) should be revised: in terms of temporality not bots but edit filters are the first mechanism to actively fend off a disruptive edit.
+Returning to the quality control ecosystem diagram~\ref{fig:funnel-no-filters} introduced in chapter~\ref{chap:background}, the filters can now be properly placed on it (see figure~\ref{fig:funnel-with-filters}).
+It seems that claims in the literature (see section~\ref{section:bots}) should be revised:
+In terms of temporality, it is not bots but edit filters that are the first mechanism to actively fend off a disruptive edit.
 
 \begin{landscape}
 \begin{figure}
diff --git a/thesis/5-Overview-EN-Wiki.tex b/thesis/5-Overview-EN-Wiki.tex
index 0ae14e57b60b64f8b59cdb7369024dc555f3cdb9..eb6d07cdac2a70ccbacded5e72c8142e921ff84a 100644
--- a/thesis/5-Overview-EN-Wiki.tex
+++ b/thesis/5-Overview-EN-Wiki.tex
@@ -68,7 +68,7 @@ During the first labeling, these were labeled ``unknown'', ``unclear'' or ``not
 For the second round, I have unified all of them under ``unclear''.
 
 For a number of filters, it was particularly difficult to determine whether they were targeting vandalism or good faith edits.
-The only thing that would have distinguished between the two would have been the contributing editor's motivation, which we had no way of knowing.
+The only thing that would have distinguished between the two would have been the contributing editor's motivation, which no one but the editor in question could have known.
 During the first labeling session, I tended to label such filters with ``vandalism?, good\_faith?''.
 For the second labeling, I stuck to the ``assume good faith'' guideline~\cite{Wikipedia:GoodFaith} myself
 and only labeled as vandalism cases where good faith was definitely out of the question.
@@ -224,9 +224,9 @@ A lot of public filters on the other hand still assume good faith from the edito
 This section examines in detail the results of the manual tagging of the filters according to their perceived functionality described in section~\ref{sec:manual-classification}.
 As figures~\ref{fig:manual-tags-all} and \ref{fig:manual-tags-active} demonstrate, the majority of filters seem to target vandalism (little surprise here).
 The second biggest category comprise the ``good faith'' filters, while ``maintenance'' and ``unknown'' filters make up a relatively small part of the total number of filters.
-The proportion of vandalism related filters is higher when we look at all filters compared to the enabled ones.
+The proportion of vandalism related filters is higher when all filters are considered and not just the enabled ones.
 Again, this is probably due to the presumed higher fluctuation rates of hidden filters which (according to my labeling, see section~\ref{sec:manual-classification} for rationale) are always vandalism related.
-It also comes to attention that the relative share of maintenance related filters is higher when we look at all filters.
+It is also notable that the relative share of maintenance related filters is higher when all filters are taken into account.
 The detailed distribution of manually assigned codes and their parent categories can be view on figure~\ref{fig:manual-tags}.
 
 %TODO make these two subfigures of the same figure
@@ -458,18 +458,18 @@ The table also shows that the mechanism ended up being quite active in preventin
   \centering
     \begin{tabular}{p{1cm} r p{5cm} p{2cm} p{3cm}}
     % \toprule
-        Filter ID & Hitcount & Publicly available description & Actions & Manual tag (parent category) \\
+        Filter ID & Hitcount & \RaggedRight Publicly available description & Actions & Manual tag (parent category) \\
     \hline
-        61 & 1,611,956 & new user removing references & tag & good\_faith\_refs (good\_faith) \\
-        135 & 1,371,361 & repeating characters & tag, warn & silly\_vandalism (vandalism)\\
-        527 & 1,241,576 & T34234: log/throttle possible sleeper account creations (hidden filter) & throttle & sockpuppetry (vandalism) \\
-        384 & 1,159,239 & addition of bad words or other vandalism & disallow & profanity\_vandalism (vandalism) \\
-        172 & 935,925 & section blanking & tag & good\_faith\_deletion (good\_faith) \\
-        30 & 840,871 & large deletion from article by new editors & tag, warn & good\_faith\_deletion (good\_faith) \\
-        633 & 808,716 & possible canned edit summary & tag & general\_vandalism (vandalism) \\
-        636 & 726,764 & unexplained removal of sourced content & warn & good\_faith\_deletion (good\_faith) \\
-        3 & 700,522 & new user blanking articles & tag, warn & good\_faith\_deletion (good\_faith) \\
-        650 & 695,601 & creation of a new article without any categories & (log only) & general\_tracking (maintenance) \\
+        61 & 1,611,956 & \RaggedRight new user removing references & tag & good\_faith\_refs (good\_faith) \\
+        135 & 1,371,361 & \RaggedRight repeating characters & tag, warn & silly\_vandalism (vandalism) \\
+        527 & 1,241,576 & \RaggedRight T34234: log/throttle possible sleeper account creations (hidden filter) & throttle & sockpuppetry (vandalism) \\
+        384 & 1,159,239 & \RaggedRight addition of bad words or other vandalism & disallow & profanity\_vandalism (vandalism) \\
+        172 & 935,925 & \RaggedRight section blanking & tag & good\_faith\_deletion (good\_faith) \\
+        30 & 840,871 & \RaggedRight large deletion from article by new editors & tag, warn & good\_faith\_deletion (good\_faith) \\
+        633 & 808,716 & \RaggedRight possible canned edit summary & tag & general\_vandalism (vandalism) \\
+        636 & 726,764 & \RaggedRight unexplained removal of sourced content & warn & good\_faith\_deletion (good\_faith) \\
+        3 & 700,522 & \RaggedRight new user blanking articles & tag, warn & good\_faith\_deletion (good\_faith) \\
+        650 & 695,601 & \RaggedRight creation of a new article without any categories & (log only) & general\_tracking (maintenance) \\
   \end{tabular}
   \caption{What do most active filters do?}~\label{tab:most-active-actions}
 \end{table*}
@@ -704,7 +704,7 @@ It was determined that hidden filters seem to fluctuate more, which makes sense
 Public filters often target silly vandalism or test type edits, as well as spam.
 Disallowing edits by very determined vandals handled by hidden filters are in accord with the initial aim with which the filters were introduced (compare section~\ref{section:4-history}).
 The high number of such filters (compare section~\ref{sec:what-do-filters-target}) seems to confirm that edit filters are fulfilling their purpose.
-On the other hand, when we look at the ten most active filters of all times (see table~\ref{tab:most-active-actions}), only one of them appears to take care of the malicious determined vandals who motivated the creation of the AbuseFilter extension.
+On the other hand, when the ten most active filters of all time (see table~\ref{tab:most-active-actions}) are considered, only one of them appears to take care of the malicious determined vandals who motivated the creation of the AbuseFilter extension.
 The rest of the most frequently matching filters target a combination of good faith edits (above all such concerning deletions) and silly/profanity vandalism.
 Interestingly, that is not what the developers of the extension believed it was going to be good for:
 ``It is not, as some seem to believe, intended to block profanity in articles (that would be extraordinarily dim), nor even to revert page-blankings, '' claimed its core developer on 9 July 2008~\cite{Wikipedia:EditFilterTalkArchive1Clarification}.
diff --git a/thesis/conclusion.tex b/thesis/conclusion.tex
index 73d05b286f48860360c1261e57c6130b25f8533e..8846842e8db84ecf132f9e50018bc55637dc7ffa 100644
--- a/thesis/conclusion.tex
+++ b/thesis/conclusion.tex
@@ -28,11 +28,11 @@ Compared to machine learning techniques, rule-based systems such as the edit fil
 
 % TODO Refer back to title! Who is allowed to publish? Who decides?
 Taking a step back,
-according to the Wikipedian community people adding made up information like references to Brazilian aardvarks or proclaiming themselves mayors of small Chinese towns~\cite{Wikipedia:ChenFang} shall preferably not publish at all.
-If we are to handle this type of disruption with edit filters, two approaches seem feasible:
+according to the Wikipedian community, people adding made-up information like references to Brazilian aardvarks or proclaiming themselves mayors of small Chinese towns~\cite{Wikipedia:ChenFang} should preferably not publish at all.
+If this type of disruption is to be handled with edit filters, two approaches seem feasible:
 Warn editors adding the information that their contribution does not contain any references, or outright disallow such edits
-(which does not solve the problem of freely invented sources)
-, but that was pretty much it.
+(which does not solve the problem of freely invented sources),
+but that was pretty much it.
 Albeit edit filters may not be the ideal mechanism to deal with hoaxes, what they can do effectively is prevent someone from moving hundreds of pages to titles containing ``ON WHEELS'', thus sparing vandal fighters the need to track down and undo these changes, allowing them to use their time more productively by for example fact checking unverified claims and hence reducing the number of fake aardvarks and increasing the overall credibility of the project.
 
 %Outlook: centralisation, censorship
diff --git a/thesis/introduction.tex b/thesis/introduction.tex
index a6925fe092d373a7fcf892de786005b14517c6e3..a9f5ec7f231211cb4d87c48e27df618dd2350c6d 100644
--- a/thesis/introduction.tex
+++ b/thesis/introduction.tex
@@ -10,14 +10,14 @@
 \label{chap:introduction}
 
 In May 2014 the US American magasine \textit{The New Yorker} published a story called ``How a Raccoon Became an Aardvark'' in its column ``Annals of Technology''~\cite{Randall2014}.
-It tells an anecdote about a New York student who, some 6 years before, edited the Wikipedia article on ``coati'' (a member of the racoon family native to South and Central America) to state that the coati is ``also known as [...] Brazilian aardvark''~\cite{Wikipedia:Coati}.
+It tells an anecdote about a New York student who, some six years before, edited the Wikipedia article on ``coati'' (a member of the raccoon family native to South and Central America) to state that the coati is ``also known as [...] Brazilian aardvark''~\cite{Wikipedia:Coati}.
 
 This simple action is a mundane example of how Wikipedia works:
 Anyone can edit and small contribution by small contribution the world's largest knowledge base is created.
 Except, the student made the whole thing up and published on Wikipedia an inside joke he had with his brother on their holiday trip to Brazil.
 Unsourced pieces of information are not supposed to survive long on Wikipedia and he thought that the edit would be swiftly deleted.
 Fast-forward to 2014, not only had this part of the ``coati'' entry not changed, but it cited a 2010 article by the newspaper the \textit{Telegraph} as evidence~\cite{Wikipedia:CoatiEvidence}.
-In the meantime, apparently several newspapers, a YouTube video and a book published by the University of Chicago~\cite{Henderson2013} claimed that the coati was known as a Brazilan aardvark.
+In the meantime, apparently several newspapers, a YouTube video and a book published by the University of Chicago~\cite{Henderson2013} claimed that the coati was known as ``Brazilian aardvark''.
 It proved not trivial to erase the snippet from Wikipedia since there were all these other sources affirming the statement.
 By then, it was not exactly false either: the coati \emph{was} known as ``Brazilian aardvark'', at least on the Internet.
 
@@ -29,9 +29,9 @@ The Wikipedian community is well-aware of their project's poor reliability reput
 Not only hoaxes, but profanities, malicious vandals, and spammers have been there since the very beginning and their numbers have increased with the rise of the project to prominence.
 %Since its conception in 2001, when nobody believed it was ever going to be a serious encyclopedia, the project has grown steadily.
 At the latest, with the exponential surge in the numbers of users and edits around 2006, the community began realising that they needed a more automated means for quality control.
-The same year, the first anti-vandal bots were implemented, followed by semi-automated tools facilitating revision verification such as Twinkle (in 2007) and Huggle (in the beginnings of 2008).
+The same year, the first anti-vandal bots were implemented, followed by semi-automated tools facilitating revision verification such as Twinkle~\cite{Wikipedia:Twinkle} (in 2007) and Huggle~\cite{Wikipedia:Huggle} (at the beginning of 2008).
 In 2009, yet another mechanism dedicated to quality control was introduced.
-Its core developer, Andrew Garrett, known on Wikipedia as User:Werdna~\cite{Wikipedia:UserWerdna}, has called it ``abuse filter'', and according to EN Wikipedia's newspaper, The Signpost, its purpose was to ``allow[...] all edits to be checked against automatic filters and heuristics, which can be set up to look for patterns of vandalism including page move vandalism and juvenile-type vandalism, as well as common newbie mistakes''~\cite{Signpost2009}.
+Its core developer, Andrew Garrett, known on Wikipedia as User:Werdna~\cite{Wikipedia:UserWerdna}, has called it ``abuse filter'', and according to EN Wikipedia's newspaper, The Signpost, its purpose was to ``allow [...] all edits to be checked against automatic filters and heuristics, which can be set up to look for patterns of vandalism including page move vandalism and juvenile-type vandalism, as well as common newbie mistakes''~\cite{Signpost2009}.
 %TODO decide whether to cite the Signpost here already, since it appears again in chapter4
 
 %TODO right now, an abrupt end
@@ -102,7 +102,7 @@ The present work can be embedded in the context of (algorithmic) quality-control
 There is a whole ecosystem of actors struggling to maintain the anyone-can-edit encyclopedia as accurate and free of malicious content as possible.
 The focus of the this work are edit filters, the mechanism initially introduced by User:Werdna under the name of ``abuse filters'', previously unexplored by the scientific community.
 The goal of the current project is to better understand the role of edit filters in the vandal fighting network of humans, bots, semi-automated tools, and the Wikipedian machine learning framework ORES.
-After all, edit filters were introduced to Wikipedia at a time when the majority of the afore mentioned mechanisms already existed and were involved in quality control
+After all, edit filters were introduced to Wikipedia at a time when the majority of the aforementioned mechanisms already existed and were involved in quality control
 \footnote{Edit filters were introduced in 2009.
 The page of the semi-automated tool Twinkle~\cite{Wikipedia:Twinkle} was created in January 2007, the one of the tool Huggle~\cite{Wikipedia:Huggle}—in the beginning of 2008.
 Bots have been around longer, but first records of vandal fighting bots come from 2006.}.
@@ -135,14 +135,14 @@ The aim of this work is to find out why edit filters were introduced on Wikipedi
 Further, this research seeks to understand what tasks are taken over by filters %in contrast to other quality control meachanisms
 and—as far as practicable—track how these tasks have evolved over time (are there changes in type, numbers, etc.?).
 %and understand how different users of Wikipedia (admins/sysops, regular editors, readers) interact with these and what repercussions the filters have on them.
-Last but not least, it is discussed why a classic rule based system such as the filters is still operational today when more sophisticated machine learning (ML) approaches exist.
+Moreover, it is discussed why a classic rule-based system such as the filters is still operational today when more sophisticated machine learning (ML) approaches exist.
 Since this is just an initial discovery of the features, tasks and repercussions of edit filters, a framework for future research is also offered.
 
 %\section{Methods}
 To this end, a three path approach is pursued.
-Firstly, we review the academic contributions on Wikipedia's quality control mechanisms in order to gather a better understanding of the different quality control mechanisms, their tasks, and the challenges they face.
-Moreover, we study documentation of the MediaWiki AbuseFilter extension, together with guidelines for its use, various noticeboards, and discussion archives prior to its introduction in an attempt to understand how and why filters were introduced and how they function.
-Thirdly, we look into the filters implemented on English Wikipedia
+Firstly, the academic contributions on Wikipedia's quality control mechanisms are reviewed in order to gather a better understanding of the different quality control mechanisms, their tasks, and the challenges they face.
+Secondly, the documentation of the MediaWiki AbuseFilter extension is studied, together with the guidelines for its use, various noticeboards, and discussion archives prior to its introduction in an attempt to understand how and why filters were introduced and how they function.
+Thirdly, I look into the filters implemented on English Wikipedia
 \footnote{Throughout the work, the abbreviated form ``EN Wikipedia'' is used to denote the English language version of Wikipedia.}
 themselves, as well as their log data in order to determine what they actually do.
 
@@ -183,9 +183,9 @@ Revise Questions from Confluence:
 
 This thesis is organised in the following manner:
 Chapter~\ref{chap:background} situates the topic in the academic discourse by examining the role of different quality control mechanisms on Wikipedia hitherto studied by the scientific community.
-In chapter~\ref{chap:methods}, I present the methodological frameworks on which the present research is based.
-Next, I describe the edit filter mechanism in general: How and why it was conceived, how it works and how it can be embedded in Wikipedia's quality control ecosystem (chapter~\ref{chap:filters}).
+In chapter~\ref{chap:methods}, I present the methodological frameworks on which this research is based.
+Next, the edit filter mechanism in general is described: How and why it was conceived, how it works and how it is embedded in Wikipedia's quality control ecosystem (chapter~\ref{chap:filters}).
 A detailed analysis of the current state of all implemented edit filters on English Wikipedia is presented in chapter~\ref{chap:overview-en-wiki}.
-We discuss the findings and limitations of the present work, as well as directions for future investigations in chapter~\ref{chap:discussion}.
+Chapter~\ref{chap:discussion} discusses the findings and limitations of the present work, as well as directions for future investigations.
 Finally, the research is wrapped up in chapter~\ref{chap:conclusion}.
 
diff --git a/thesis/thesis_main.tex b/thesis/thesis_main.tex
index 1497cb160f28948148f09e3a5903ca7874ff8ff2..8436b70da0ef82bc07d85ce18a7bfb673ff84e61 100755
--- a/thesis/thesis_main.tex
+++ b/thesis/thesis_main.tex
@@ -52,6 +52,7 @@
 \usepackage{verbatim}
 \usepackage{longtable} % for multipage tables
 \usepackage[stable]{footmisc} % for putting footnotes in section titles
+\usepackage{ragged2e} % provides \RaggedRight for better line breaking in narrow table columns
 
 %
 %---------------------------------------------------