diff --git a/notes b/notes
index 84868114505c0f12759d239ec204bb07706179df..b4f79417516f7e59e557ae8286fcd6918b7e0bee 100644
--- a/notes
+++ b/notes
@@ -1602,7 +1602,16 @@ It sounds like this permission, and abusefilter-modify, might be in the process
 
 ideological and practical concerns mix
 
-https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter/Archive_1#Security_through_private_obscurity_-_mbots
+"My reading of this page is that the most frequently mentioned advantage of the proposed anti-vandalism Mediawiki extension, is speed of response compared to similar private bots. The major improvement over bots seems to be that of amplifying speed-of-response.
+Why not create the mediawiki extension as an amplifier toolkit for "bots"?"
+
+"However, my proposal is to "increase" the effectiveness of our handling, by targetting the vandals before they can vandalise, and by placing restrictions on their accounts and IP addresses when they try, resulting in zero actual vandalism, and all of the admin and checkuser work automated by software. This is an effective system.
+
+You cannot possibly be suggesting that, because our current handling of repeat vandals does eventually reverse and prevent some vandalism, we should rest on our laurels and do what we currently do because it's "not broken". Here, we have an opportunity to improve the effectiveness of our handling of repeat vandals. Let us not ignore it because our current system is "not broke". —Preceding unsigned comment added by Werdna (talk • contribs) 02:07, 16 July 2008"
+
+A lot of the controversy ran along the lines of:
+* public vs. private filters
+* which actions exactly the filters should be allowed to take; there were strong objections from community members against filters blocking editors or taking away rights etc. from them; and although (both?) of these functionalities ended up being implemented, neither is actively used on the EN WP (where the "strictest" action applied is "disallow", and the last time a filter took an action other than disallow/tag/warn/log was "blockautopromote" and "aftv5flagabuse" (not sure what exactly this is) in 2012, see ipnb)
 
 =======================================================================
 https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter&oldid=221994491
diff --git a/thesis/2-Background.tex b/thesis/2-Background.tex
index 4091f70680cc839ead5006471ce6ca6e5eb98318..d22649b6063596586e0f088f9ded4861b0226449 100644
--- a/thesis/2-Background.tex
+++ b/thesis/2-Background.tex
@@ -28,6 +28,19 @@ This is so unusual, we don’t even have a word for it. It’s tempting to say
 
 % Aim: I want to know why are there filters? How do they fit in the quality control ecosystem? Distinction filters/Bots: what tasks are handled by bots and what by filters (and why)? What difference does it make for admins? For users whose edits are being targeted?
 
+So, after reading quite a lot of the discussion surrounding the introduction of the edit filter MediaWiki extension (\url{https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter/Archive_1}),
+I think the motivation for the filters was the following:
+bots weren't reverting some kinds of vandalism fast enough, or these vandalism edits required human intervention and took more than a single click to revert.
+(It was not completely clear exactly what types of vandalism these were.
+As far as I understood, it was above all about mostly obvious but pervasive vandalism, possibly itself aided by bots/scripts, which was immediately recognisable as vandalism but took some time to clean up.
+The extension developers' reasoning was that if a filter simply disallows such vandalism, vandal fighters can spend their time checking the less obvious cases where more background knowledge/context is needed to decide whether an edit is vandalism or not.)
+The extension's developers felt that admins and vandal fighters could use this valuable time more productively.
+Examples of the types of edits that were supposed to be targeted:
+\url{https://en.wikipedia.org/wiki/Special:Contributions/Omm_nom_nom_nom}
+* often: page redirects to some nonsense name
+\url{https://en.wikipedia.org/wiki/Special:Contributions/AV-THE-3RD}
+\url{https://en.wikipedia.org/wiki/Special:Contributions/Fuzzmetlacker}
+
 \cite{AstHal2018} have a diagram describing the new edit review pipeline. Filters are absent.
 
 Why is it important we study these mechanisms?