# Key Questions

* How do they work?
  * technically
  * on a governance/community level
* Are they effective (from whose perspective)?
* What mechanisms of action are behind them?
* How complex are they?
* How visible are they to ordinary users?
* Who makes them? How many people is that?
* What kind of filters are they?
* How did they come about?
  * Out of what kind of debate did they emerge?
* How have they developed over the years?
* How is it organised?
* What are the effects on users?
* How many false positives do the filters generate? How is that dealt with?


# Further Thoughts

* "*Can* prevent *potentially* harmful behaviour" <--- where does this come from?
* general context: upload filters
* check toollab (stats + graphs)
* what research already exists on this?
* what is actually defined as vandalism? (if vandalism is the main motivation for the filters) (also historically)


========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter
https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter&oldid=877829572

"The edit filter is a tool that allows editors in the edit filter manager group to set controls mainly[1] to address common patterns of harmful editing."
Current filters: https://en.wikipedia.org/wiki/Special:AbuseFilter

DEF:
"A filter automatically compares every edit made to Wikipedia against a defined set of conditions. If an edit matches the conditions of a filter, that filter will respond by logging the edit. It may also tag the edit summary, warn the editor, revoke his/her autoconfirmed status, and/or disallow the edit entirely.[2]"
Footnote 2: "The extension also allows for temporary blocking, but these features are disabled on the English Wikipedia." <-- TODO: Is there a Wikipedia on which these features are not disabled?
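A minimal sketch of the mechanism from the definition above, purely illustrative: every edit is compared against every filter, each match is logged, and any additional configured actions (tag, warn, revoke autoconfirmed, disallow) are collected. The real extension is written in PHP and evaluates its own rules language; all names and the toy filter below are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Filter:
    filter_id: int
    description: str
    condition: object            # predicate over the edit; stands in for the rules-language pattern
    actions: set = field(default_factory=set)   # e.g. subset of {"tag", "warn", "blockautopromote", "disallow"}

def check_edit(edit, filters, log):
    """Compare one edit against every filter; every match is logged,
    and the matched filters' additional actions are collected."""
    triggered = set()
    for f in filters:
        if f.condition(edit):
            log.append((f.filter_id, edit["user"], edit["title"]))  # a match is always logged
            triggered |= f.actions
    return triggered

# Toy example: a filter that flags page blanking by very new users.
filters = [Filter(1, "page blanking by new users",
                  lambda e: e["new_size"] == 0 and e["user_editcount"] < 10,
                  {"warn", "tag"})]
log = []
edit = {"user": "ExampleUser", "title": "Example", "new_size": 0, "user_editcount": 3}
print(check_edit(edit, filters, log))   # -> {'warn', 'tag'}
```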

Software: https://www.mediawiki.org/wiki/Extension:AbuseFilter
---> enabled on English Wikipedia 2009

On the name:
"The term "edit filter" rather than "abuse filter" is currently used for user-facing elements of the filter as some of the edits it flags are not harmful;[1] the terms are otherwise synonymous."

"Because even the smallest mistake in editing a filter can disrupt the encyclopedia, only editors who have the required good judgment and technical proficiency are permitted to configure filters."
--> Who are these editors? Who decides they are qualified enough?

"Filters are created and configured by edit filter managers, but they can be requested by any editor."
"all administrators can view private filters"
"This group is assignable by administrators, who may also assign the right to themselves"
"The assignment of the edit filter manager user right to non-admins is highly restricted. It should only be requested by and given to highly trusted users, when there is a clear and demonstrated need for it."
"demonstrated knowledge of the extension's syntax and in understanding and crafting regular expressions is absolutely essential"
"Editors who are not edit filter managers should consider helping out at requested edit filters and troubleshooting at false positives to help gain experience and demonstrate these skills"
"Requests for assignment of the group to non-admins can be made at the edit filter noticeboard, where a discussion will be held before a decision is made;discussions are normally held open for 7 days."
"If an edit filter manager is misusing the user right, the concern should first be raised with them directly. If discussion does not resolve the issue, a request for discussion or removal of the user right may be made at the edit filter noticeboard. "
"If you have the edit filter manager user right, please ensure you follow the Password strength requirements and appropriate personal security practices. Two-factor authentication enrollment is available for edit filter managers. Because edit filters affect every edit made, a compromised account will be blocked and its privileges removed on grounds of site security. In the unlikely event that your account is compromised, notify an administrator or bureaucrat (for administrators) immediately so they can block your account and remove any sensitive privileges to prevent damage. "
// interestingly, two-factor authentication is also only allowed for these special users; otherwise you cannot even view the page

List of current edit filter managers
EN: https://en.wikipedia.org/wiki/Special:ListUsers/abusefilter (currently: 155)
CAT: https://ca.wikipedia.org/wiki/Especial:Usuaris/abusefilter (currently: 4 users)

-- the role does not exist on the Spanish/German/Russian Wikipedias; it would be interesting to know whether it has been subsumed somewhere else
-- it doesn't exist on the Bulgarian Wikipedia either, but there the entire EditFilter page doesn't exist in the first place

What do filters do? / What actions do they trigger (cf. DEF), in order of severity:
- disallow -- the editor is informed if their edit is disallowed and offered the option to report a false positive;
  "It is also possible to have a user's autoconfirmed status revoked if a user trips the filter."
  caution: use it sparingly and only after a thorough discussion of what constitutes an undesirable edit
- warn -- the editor is informed that their edit may be problematic and given the option to save or abort the edit (and to report a false positive triggered by the filter)
- add a tag -- "edit is tagged for review by patrollers." -- TODO who are patrollers? are there some in language versions other than EN?
  "Patrols are a specialized type of WikiProject used in the English Wikipedia to watch over a class of pages and take any appropriate actions. Most patrol actions are performed by individual Wikipedians, but some are performed by bots—computer programs or preprogrammed scripts that make automated edits without a need for real time human decision-making. " https://en.wikipedia.org/wiki/Wikipedia:Patrols
- log the edit - "In this case, the edit is merely added to the AbuseLog. When testing new filters, this is the suggested setting to use."

"Except in urgent situations, new edit filters should generally be tested without any actions specified (simply enabled) until a good number of edits have been logged and checked before being implemented in "warn" or "disallow" modes. If the filter is receiving more than a very small percentage of false positives it should usually not be placed in 'disallow' mode."

Alternatives:
"Edit filter managers should be familiar with alternatives that might be more appropriate in a given situation. For example, problems on a single page might be better served with page protection, and problems with page titles or link spam may find the title blacklist and spam blacklist more effective respectively. Because edit filters check every edit in some way, filters that are tripped only rarely are discouraged. "

Exemptions for "urgent situation" -- what/how are these defined?
Discussions may happen post factum here, and a filter may be applied before having been thoroughly tested; in that case the editor in question is responsible for checking the logs regularly and making sure the filter acts as desired

Hidden filters!
"Non-admins in good standing who wish to review a proposed but hidden filter may message the mailing list for details."
// what is "good standing"?
// what are the arguments for hiding a filter?
// are users still informed if their edit triggers a hidden filter?

"For all filters, including those hidden from public view, a brief description of what the rule targets is displayed in the log, the list of active filters, and in any error messages generated by the filter. "

"Filters should only be hidden where necessary, such as in long-term abuse cases where the targeted user(s) could review a public filter and use that knowledge to circumvent it. Filters should not generally be named after abusive editors, but rather with a simple description of the type of abuse, provided not too much information is given away."

"Be careful not to test sensitive parts of private filters in a public test filter (such as Filter 1): use a private test filter (for example Filter 2) if testing is required."

harassment! mailinglist
"If it would not be desirable to discuss the need for a given edit filter on-wiki, such as where the purpose of the filter is to combat harassment by an abusive banned user who is likely to come across the details of the request, edit filter managers can be emailed directly or on the wikipedia-en-editfilters mailing list at wikipedia-en-editfilters@lists.wikimedia.org."

https://lists.wikimedia.org/mailman/listinfo/wikipedia-en-editfilters
"private mailing list used by English Wikipedia edit filter managers, "
"primarily for discussing hidden filters."
"The mailing list should not be used as a venue for discussions that could reasonably be held on-wiki."


batch testing interface

=================================================================
https://de.wikipedia.org/wiki/Wikipedia:Bearbeitungsfilter

"Der Bearbeitungsfilter (englisch: edit filter; früher: „Missbrauchsfilter“ oder abuse filter) ist ein vielseitig einsetzbares Werkzeug zur Beobachtung und Verhinderung problematischer Bearbeitungen. Dazu gehört die Bekämpfung von Verstößen gegen die Wikipedia-Richtlinien, insbesondere Wikipedia:Vandalismus."

"Derzeit können alle Administratoren Filter bearbeiten."
--> Unterschied zu EN, ne? Da gibts ne Spezielle Gruppe für? (Aber Admins können da Menschen reinstecken, auch sich selber)

Possible actions
"
* preventing edits that match certain properties
* notifying the acting user about such edits
* merely detecting/logging such edits"

"Every active filter rule must be justified, and proportionality must be preserved with regard to the potentially harmful edits that can concretely be expected."

"Reasons for using the edit filter can include:
* vandalism/block evasion by a Wikipedian using changing user accounts/IP addresses or across several pages, where a conventional article or user block would have too many side effects
* otherwise hard-to-find (unintentional) violations of Wikipedia policies
* creation of maintenance lists (although bots should preferably be used for this)"

=================================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter_noticeboard

According to the Edit filter Notice board:
"There are currently 196 enabled filters and 13 stale filters with no hits in the past 30 days (Purge). See also the edit filter graphs." (Stand: 24.11.2018)
"There are currently 198 enabled filters and 11 stale filters with no hits in the past 30 days (Purge). See also the edit filter graphs." (Stand: 25.11.2018, seems to change frequently!)

- discuss current filter behaviour
- suggest a filter for deletion if it is not particularly helpful: "unnecessary, is preventing good edits, or is otherwise problematic"
  (you can also raise the issue directly with the filter manager who created or enabled the filter)

apart from that: currently ongoing discussions on individual filters / problems that may require a filter
===============================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter/Requested
https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter/Requested&oldid=871023624
-- apart from English, this exists only in German

- suggest new filters

"This page is for people without the abusefilter-modify permission or people without sufficient knowledge about the coding involved to make requests to enact edit filters."

There's a "Bear the following in mind:" checklist
"Filters are applied to all edits. Therefore, problematic changes that apply to a single page are likely not suitable for an edit filter."
- filters, as they add up, make editing slower
- in-depth checks should be done by separate software that users run on their own machines
- filters should not catch trivial errors (à la style guidelines)
- there are the Title Blacklist and the Link/Spam Blacklist

===============================================================
https://de.wikipedia.org/wiki/Wikipedia:Bearbeitungsfilter/Antr%C3%A4ge
the DE equivalent of https://en.wikipedia.org/wiki/Wikipedia:Edit_filter/Requested

a lot more material on the main page than on the English one; discussions of each individual(?) filter are linked directly

"Auf dieser Seite können Vorschläge für neue Regeln des Bearbeitungsfilters eingereicht werden. Diskussionen zu Filterregeln (z.B. wegen irrtümlichen Blockaden), deren Nummer nicht bekannt ist, werden ebenfalls hier geführt. Zusätzlich gibt es für jede bereits bestehende Regel eine eigene Diskussionsseite. Eine Übersicht zu diesen einzelnen Seiten gibt der Abschnitt #Liste der Diskussionen zu einzelnen Regeln. Im Archiv werden Diskussionen gesammelt, für die keine Regeln erstellt wurden."

An example:
https://de.wikipedia.org/wiki/Wikipedia:Bearbeitungsfilter/271
https://de.wikipedia.org/wiki/Wikipedia:Vandalismusmeldung/Archiv/2018/11/25#Benutzer:Zollwurf_(erl.)
-- by now I have the impression that something like this might not be public on the English Wikipedia?
-- but then, can such a problem be solved with a filter at all? How else could it be solved?

===============================================================
https://en.wikipedia.org/wiki/Wikipedia:Long-term_abuse

117 active cases [as of 24.11.2018]
there's a list available at least for the active cases with detailed abuse reports
There's also an archive page of abuse cases: https://en.wikipedia.org/wiki/Wikipedia:Long-term_abuse/Archive (25 entries [as of 24.11.2018])
And full list of cases: https://en.wikipedia.org/wiki/Wikipedia:Long-term_abuse/Full

"This page summarises a limited number of long term abusers, to assist members of the community who believe they may have cause to report another incident. Note that this page is not a noticeboard. Names should only be added for the most egregious and well-attested cases. Most users here will have been banned, some on multiple occasions. "

"Don't provide too much info
    The text should tread a careful balance between providing useful information and providing enough to obstruct detection. In general such information should only be shared with users of a high level of reputation."

===============================================================
https://de.wikipedia.org/wiki/Wikipedia:WikiProjekt_Vandalismusbek%C3%A4mpfung/Troll-Dokumentationsseiten
the DE equivalent of https://en.wikipedia.org/wiki/Wikipedia:Long-term_abuse

"Hier werden Seiten im Benutzernamensraum gesammelt, auf denen die Aktivitäten unterschiedlicher Wikipediastörer charakterisiert oder dokumentiert werden sollen. "
!!!"Eine Definition von störendem Verhalten, z. B. in Bezug auf Art, Umfang oder Dauer der Projektstörung, die eine Bezeichnung als „Wikipediatroll“ rechtfertigt, ist in der Autorengemeinschaft umstritten."

sonst eben wieder listen mit bekannten Fällen; kein Count der aktuell aktiven Fällen wie bei der Englischen Seite

===============================================================
https://en.wikipedia.org/wiki/Special:AbuseFilter/384

bad words in articles and user names filter

===============================================================
https://en.wikipedia.org/wiki/Special:AbuseLog
https://de.wikipedia.org/wiki/Spezial:Missbrauchsfilter-Logbuch
https://es.wikipedia.org/wiki/Especial:RegistroAbusos
https://ca.wikipedia.org/wiki/Especial:Registre_dels_abusos
https://bg.wikipedia.org/wiki/%D0%A1%D0%BF%D0%B5%D1%86%D0%B8%D0%B0%D0%BB%D0%BD%D0%B8:%D0%94%D0%BD%D0%B5%D0%B2%D0%BD%D0%B8%D0%BA_%D0%BD%D0%B0_%D1%84%D0%B8%D0%BB%D1%82%D1%8A%D1%80%D0%B0

can search for all filter triggers in a period of time/by a specific user

on the German page, the filter action and description are listed, but the filter number is not linked, so you cannot view the filter's source code directly;
moreover, you have to go to the page itself to look at the attempted changes in the revision history

the Spanish page is structured similarly to the German one.
!!!However, since there apparently are no "saved revisions" (or whatever they were called) on the Spanish Wikipedia, you can no longer actually review the abusive changes, nor can you view the filter that was actually triggered

Catalan is the same as Spanish, with the difference that for some reason (no regularity detected) the summaries are sometimes in English; in those cases a diff with the offending changes is also shown.

the Bulgarian page uses the same form as the English one, so source code of the filter as well as diff of the changes can be viewed

===============================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter/False_positives
https://es.wikipedia.org/wiki/Wikipedia:Filtro_de_ediciones/Portal/Archivo/Reporte_de_falsos_positivos/Actual
https://ca.wikipedia.org/wiki/Viquip%C3%A8dia:Filtre_d%27edicions/Falsos_positius

a detailed page with ongoing/recently reported cases

there seems to be no such page for BG and DE --> no possibility to report false positives?

================================================================
Current status (29.11.2018)

EN: There are currently 201 enabled filters, and 12 stale filters with no hits in the past 30 days (Purge).
from https://en.wikipedia.org/wiki/Special:AbuseFilter
DE: 170 in total (enabled and disabled, private and public) https://de.wikipedia.org/wiki/Spezial:Missbrauchsfilter/?deletedfilters=hide&limit=250&title=Spezial%3AMissbrauchsfilter%2F
ES: 92 https://es.wikipedia.org/wiki/Especial:FiltroAntiAbusos/?deletedfilters=hide&limit=100&title=Especial%3AFiltroAntiAbusos%2F
CA: 24  https://ca.wikipedia.org/wiki/Especial:Filtre_d%27abuses (interesting: you have to log in to be able to view this!)
BG: 24 https://bg.wikipedia.org/wiki/%D0%A1%D0%BF%D0%B5%D1%86%D0%B8%D0%B0%D0%BB%D0%BD%D0%B8:%D0%A4%D0%B8%D0%BB%D1%82%D1%8A%D1%80_%D1%81%D1%80%D0%B5%D1%89%D1%83_%D0%B7%D0%BB%D0%BE%D1%83%D0%BF%D0%BE%D1%82%D1%80%D0%B5%D0%B1%D0%B8

=================================================================
https://en.wikipedia.org/wiki/Special:AbuseFilter

"There are currently 203 enabled filters, and 11 stale filters with no hits in the past 30 days (Purge)."

that's the management interface!
"Welcome to the Edit Filter management interface. Using the Edit Filter, authorized users can configure a wide range of tests, which may help identify and prevent potentially harmful edits and other activities before they are added to the wiki, and the automatic actions to be taken."

"PLEASE be careful. This is potent stuff. Unless it's urgent, always test your filters with no actions enabled first."

weird? the test interface https://en.wikipedia.org/wiki/Special:AbuseFilter/test
says: "For security reasons, only users with the right to view private abuse filters or modify filters may use this interface."
shouldn't all filter editors be able to test??

Collaboration with bots:
"There is a bot reporting users tripping certain filters at WP:AIV and WP:UAA; you can specify the filters here."
https://en.wikipedia.org/wiki/User:DatBot/filters

Sortable table of all filters with following columns:
Filter ID 	Public description 	Actions 	Status 	Last modified 	Visibility 	Hit count
links to single filters, e.g. --> https://en.wikipedia.org/wiki/Special:AbuseFilter/1 (see below for detailed filter page)
"Actions" is one of: warn | tag | disallow | throttle | ?? (possibly more, not directly visible)
"Status" is: enabled | disabled
"Last modified" provides a link to diff between versions and the user who did the modification
"Visibility" is: private | public
"Hit count": which period is counted? total number of hits since the filter was enabled? (for all enabled periods, in case it was enabled/disabled multiple times?)

Filter with most hits:
Filter ID 	Public description 	Actions 	Status 	Last modified 	Visibility 	Hit count
61 	New user removing references 	Tag 	Enabled 	12:43, 14 May 2017 by Zzuuzz (talk | contribs) 	Public 	1,593,851 hits

=====================================================================
https://en.wikipedia.org/wiki/Special:AbuseFilter/1

where following information can be viewed:
Filter id; public description; filter hits; statistics; code (conditions); notes (left by filter editors to log changes); flags ("Hide details of this filter from public view", "enable this filter", "mark as deleted");
links to: last modified (with diff and user who modified it), edit filter's history; "export this filter to another wiki" tool;

Actions to take when matched:
Trigger actions only if the user trips a rate limit
Trigger these actions after giving the user a warning
Prevent the user from performing the action in question
Revoke the user's autoconfirmed status
Tag the edit in contributions lists and page histories
and the filter can be modified if the viewing editor has the right permissions

statistics are info such as "Of the last 1,728 actions, this filter has matched 10 (0.58%). On average, its run time is 0.34 ms, and it consumes 3 conditions of the condition limit." (that's filter id 61) // not sure what the condition limit is
"Of the last 5,616 actions, this filter has matched 0 (0.00%). On average, its run time is 0 ms, and it consumes 0 conditions of the condition limit." (that's filter id 1)

=========================================================================
https://en.wikipedia.org/wiki/Special:AbuseFilter/history/1

Time 	User 	Public filter description 	Flags 	Actions 	Changes

the link with the timestamp links back to the filter editor (see previous page)
user links to the user
changes links to a diff of the current revision with the previous one

=========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2009-03-23/Abuse_Filter

"The AbuseFilter extension, developed by User:Werdna, is now enabled on English Wikipedia. The extension allows all edits to be checked against automatic filters and heuristics, which can be set up to look for patterns of vandalism including page move vandalism and juvenile-type vandalism, as well as common newbie mistakes. When a match is found, the extension can take specified actions, ranging from logging the edit to be checked, giving a warning (e.g. "did you really intend to blank a page?"), to more serious actions such as blocking users."
from https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2009-03-23/Abuse_Filter
("The Signpost is a monthly community-written and -edited online newspaper covering the English Wikipedia, its sister projects, the Wikimedia Foundation, and the Wikimedia movement at large." https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/About)

Note: User:Werdna
https://en.wikipedia.org/wiki/User:Werdna
"I'm Andrew Garrett. I started volunteering in 2005, and I worked at the Wikimedia Foundation from 2009 until I left in 2015 because 7 years is a long time. This is my personal account, so while I'll make mistakes, I intend for actions taken with this account to be in my personal capacity only. My work account was Andrew Garrett."
http://www.andrewjgarrett.com/

========================================================================
https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter&oldid=221158142
Edit_filter page first version, created 23.06.2008, where User:Werdna announced the upcoming MediaWiki extension he was working on.

"I've been developing an extension which allows privileged users to add very specific restrictions on actions going through on the wiki.

This gives us the opportunity to prevent damage from vandals with very specific modi operandi."

"I submit to the community that this gives us an extraordinary opportunity to disallow some of the worst and most annoying types of vandalism which occur on Wikipedia, and to refocus our efforts into doing other, more productive things than cleaning up after page-move vandalism."

There's a list of things that the filters can filter on

"It is noteworthy here that no rights, even to view information, are granted to all users. This is deliberate. Information about the filters active on Wikipedia would be sensitive information, and, if it were to be publically available, release to those who wish to circumvent them would be inevitable. I currently propose that the right to view filters be almost as well-protected as the right to modify them."

"Of course, this issue of the closed nature of the extension is the extension's main problem, and the one which I foresee the most objections on. "

=========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter/Instructions

"This section explains how to create a filter with some preliminary testing, so you don't flood the history page."
- read the docs https://www.mediawiki.org/wiki/Extension:AbuseFilter/Rules_format
- test with debugging tools  https://en.wikipedia.org/wiki/Special:AbuseFilter/tools (visible only for users who are already in the edit filter managers user group)
- test with batch testing interface (dito)
- create logging only filter: https://en.wikipedia.org/wiki/Special:AbuseFilter/new (needs permissions)
- Post a message at WP:EFN (edit filter notice board), so other edit filter managers have a chance to improve it
- Finally, fully enable your filter, e.g. add warning, prevention, tagging, etc.

tips on controlling efficiency/order of operations
lazy evaluation: as soon as one condition in a conjunction evaluates to false, the filter stops evaluating the remaining conditions

"You should always order your filters so that the condition that will knock out the largest number of edits is first. Usually this is a user groups or a user editcount check; in general, the last condition should be the regex that is actually looking for the sort of vandalism you're targeting. "

===========================================================================
https://www.mediawiki.org/wiki/Extension:AbuseFilter

Author(s)
    Andrew Garrett, <-- lead dev
    River Tarnell
    Victor Vasiliev
    Marius Hoch

a MediaWiki extension written in PHP;
licensed under GPL 2.0
no further dependencies needed

code repo: https://gerrit.wikimedia.org/g/mediawiki/extensions/AbuseFilter
issue tracker: https://phabricator.wikimedia.org/tag/abusefilter/

"Once the extension has been installed, filters can be created/tested/changed/deleted and the logs can be accessed from the Abuse filter management page Special:AbuseFilter. "

you can import filters from wikipedia

Creates the following tables
mysql> describe abuse_filter; (https://www.mediawiki.org/wiki/Extension:AbuseFilter/abuse_filter_table)
+--------------------+---------------------+------+-----+---------+----------------+
| Field              | Type                | Null | Key | Default | Extra          |
+--------------------+---------------------+------+-----+---------+----------------+
| af_id              | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| af_pattern         | blob                | NO   |     | NULL    |                |
| af_user            | bigint(20) unsigned | NO   | MUL | NULL    |                | // user ID of last modifier
| af_user_text       | varbinary(255)      | NO   |     | NULL    |                | // user name of last modifier
| af_timestamp       | binary(14)          | NO   |     | NULL    |                | // last modified
| af_enabled         | tinyint(1)          | NO   |     | 1       |                |
| af_comments        | blob                | YES  |     | NULL    |                |
| af_public_comments | tinyblob            | YES  |     | NULL    |                |
| af_hidden          | tinyint(1)          | NO   |     | 0       |                |
| af_hit_count       | bigint(20)          | NO   |     | 0       |                |
| af_throttled       | tinyint(1)          | NO   |     | 0       |                |
| af_deleted         | tinyint(1)          | NO   |     | 0       |                |
| af_actions         | varbinary(255)      | NO   |     |         |                |
| af_global          | tinyint(1)          | NO   |     | 0       |                |
| af_group           | varbinary(64)       | NO   | MUL | default |                |
+--------------------+---------------------+------+-----+---------+----------------+

mysql> describe abuse_filter_log; https://www.mediawiki.org/wiki/Extension:AbuseFilter/abuse_filter_log_table
+------------------+---------------------+------+-----+---------+----------------+
| Field            | Type                | Null | Key | Default | Extra          |
+------------------+---------------------+------+-----+---------+----------------+
| afl_id           | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| afl_filter       | varbinary(64)       | NO   | MUL | NULL    |                |
| afl_user         | bigint(20) unsigned | NO   | MUL | NULL    |                | \\User ID of the author of the action.
| afl_user_text    | varbinary(255)      | NO   |     | NULL    |                | \\User name of the author of the action.
| afl_ip           | varbinary(255)      | NO   | MUL | NULL    |                |
| afl_action       | varbinary(255)      | NO   |     | NULL    |                | \\The action which triggered the filter. Values can include the following values: edit, delete, createaccount, move, upload, autocreateaccount, stashupload
| afl_actions      | varbinary(255)      | NO   |     | NULL    |                | \\What the filter did in response to the action
| afl_var_dump     | blob                | NO   |     | NULL    |                | \\Value of the variables of the filter that matched the edit, stored as a serialized PHP array.
| afl_timestamp    | binary(14)          | NO   | MUL | NULL    |                |
| afl_namespace    | tinyint(4)          | NO   | MUL | NULL    |                | \\Target Namespace of the filtered action.
| afl_title        | varbinary(255)      | NO   |     | NULL    |                | \\Target title of the filter action.
| afl_wiki         | varbinary(64)       | YES  | MUL | NULL    |                |
| afl_deleted      | tinyint(1)          | NO   |     | 0       |                | \\"Whether the AbuseLog entry was suppressed. 1 if suppressed, 0 otherwise."
  // hmm, does that mean that if it is 1, the rest of the row would be empty?
| afl_patrolled_by | int(10) unsigned    | YES  |     | NULL    |                | \\unused
| afl_rev_id       | int(10) unsigned    | YES  | MUL | NULL    |                | \\Foreign key to revision.rev_id, only populated for saved edits in order to show a diff link. I've got the feeling, it is also unused, for filter id 23 it is empty for all log entries
| afl_log_id       | int(10) unsigned    | YES  | MUL | NULL    |                | \\unused
+------------------+---------------------+------+-----+---------+----------------+
16 rows in set (0.00 sec)

mysql> describe abuse_filter_history; (from https://www.mediawiki.org/wiki/Extension:AbuseFilter/abuse_filter_history_table)
+---------------------+---------------------+------+-----+---------+----------------+
| Field               | Type                | Null | Key | Default | Extra          |
+---------------------+---------------------+------+-----+---------+----------------+
| afh_id              | bigint(20) unsigned | NO   | PRI | NULL    | auto_increment |
| afh_filter          | bigint(20) unsigned | NO   | MUL | NULL    |                |
| afh_user            | bigint(20) unsigned | NO   | MUL | NULL    |                |
| afh_user_text       | varbinary(255)      | NO   | MUL | NULL    |                |
| afh_timestamp       | binary(14)          | NO   | MUL | NULL    |                |
| afh_pattern         | blob                | NO   |     | NULL    |                |
| afh_comments        | blob                | NO   |     | NULL    |                |
| afh_flags           | tinyblob            | NO   |     | NULL    |                |
| afh_public_comments | tinyblob            | YES  |     | NULL    |                |
| afh_actions         | blob                | YES  |     | NULL    |                |
| afh_deleted         | tinyint(1)          | NO   |     | 0       |                |
| afh_changed_fields  | varbinary(255)      | NO   |     |         |                |
| afh_group           | varbinary(64)       | YES  |     | NULL    |                |
+---------------------+---------------------+------+-----+---------+----------------+
13 rows in set (0.00 sec)

Note! no public view of table abuse_filter_history at the moment

mysql> describe abuse_filter_action; (from https://www.mediawiki.org/wiki/Extension:AbuseFilter/abuse_filter_action_table)
+-----------------+---------------------+------+-----+---------+-------+
| Field           | Type                | Null | Key | Default | Extra |
+-----------------+---------------------+------+-----+---------+-------+
| afa_filter      | bigint(20) unsigned | NO   | PRI | NULL    |       |
| afa_consequence | varbinary(255)      | NO   | PRI | NULL    |       |
| afa_parameters  | tinyblob            | NO   |     | NULL    |       |
+-----------------+---------------------+------+-----+---------+-------+
3 rows in set (0.00 sec)

Seems to contain data for currently enabled filters only;
Question: how do we find data for disabled filters?

# API calls

## List information about filters:
https://en.wikipedia.org/w/api.php?action=query&list=abusefilters&abfshow=!private&abfprop=id%7Chits
or in the sandbox:
https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&list=abusefilters&abfshow=!private&abfprop=id%7Chits

Parameters

    abfstartid: The filter id to start enumerating from
    abfendid: The filter id to stop enumerating at
    abfdir: The direction in which to enumerate (older, newer)
    abfshow: Show only filters which meet these criteria (enabled|!enabled|deleted|!deleted|private|!private)
    abflimit: The maximum number of filters to list
    abfprop: Which properties to get (id|description|pattern|actions|hits|comments|lasteditor|lastedittime|status|private)

When filters are private, some of the properties specified with abfprop will be missing unless you have the appropriate user rights.
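A minimal Python sketch of the query above (assuming the `requests` library; the parameters mirror the quoted URL, plus `format=json`):

```python
import requests

# Same query as the URL above: list non-private filters with their id and hit count.
resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "abusefilters",
        "abfshow": "!private",
        "abfprop": "id|hits",
        "abflimit": 50,
        "format": "json",
    },
    headers={"User-Agent": "edit-filter-notes example script"},  # courtesy header, not required by the parameters quoted above
)
for f in resp.json()["query"]["abusefilters"]:
    print(f["id"], f["hits"])
```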

## List instances where actions triggered an abuse filter.
https://en.wikipedia.org/w/api.php?action=query&list=abuselog&afluser=SineBot&aflprop=ids
or in the sandbox:
https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&list=abuselog&afluser=SineBot&aflprop=ids

Parameters

    aflstart: The timestamp to start enumerating from
    aflend: The timestamp to stop enumerating at
    afldir: The direction in which to enumerate (older, newer)
    afluser: Show only entries where the action was attempted by a given user or IP address.
    afltitle: Show only entries where the action involved a given page.
    aflfilter: Show only entries that triggered a given filter ID
    afllimit: The maximum number of entries to list
    aflprop: Which properties to get (ids|user|title|action|result|timestamp|details)
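The corresponding sketch for the abuse log query above, again assuming `requests`; which keys each entry contains depends on the requested `aflprop` values, so the loop uses `.get()`:

```python
import requests

# Same query as the URL above: abuse log entries for one user.
resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "abuselog",
        "afluser": "SineBot",
        "aflprop": "ids|title|timestamp",
        "afllimit": 20,
        "format": "json",
    },
    headers={"User-Agent": "edit-filter-notes example script"},
)
for entry in resp.json()["query"]["abuselog"]:
    # exact keys depend on aflprop; .get() keeps the sketch robust
    print(entry.get("filter_id"), entry.get("title"), entry.get("timestamp"))
```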

===========================================================================
https://www.mediawiki.org/wiki/Extension:AbuseFilter/Rules_format

Manual on writing filter rules;

===========================================================================
https://phabricator.wikimedia.org/tag/abusefilter/

keep in mind in case of problems

===========================================================================
Google search for something on the quarry.wmflabs.org page

site:quarry.wmflabs.org all tables

===========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Namespace

Namespaces
Subject namespaces 	Talk namespaces
0 	(Main/Article) 	Talk 	1
2 	User 	        User talk 	3
4 	Wikipedia 	    Wikipedia talk 	5
6 	File 	        File talk 	7
8 	MediaWiki 	    MediaWiki talk 	9
10 	Template 	    Template talk 	11
12 	Help 	        Help talk 	13
14 	Category 	    Category talk 	15
100 	Portal 	    Portal talk 	101
108 	Book 	    Book talk 	109
118 	Draft 	    Draft talk 	119
710 	TimedText 	TimedText talk 	711
828 	Module 	    Module talk 	829
2300 	Gadget 	    Gadget talk 	2301
2302 	Gadget definition 	Gadget definition talk 	2303
Virtual namespaces
-1 	Special
-2 	Media

============================================================================
https://en.wikipedia.org/wiki/Wikipedia:Vandalism

"This is not a noticeboard for vandalism. Report vandalism from specific users at Wikipedia:Administrator intervention against vandalism, or Wikipedia:Requests for page protection for specific pages.
Not to be confused with Wikipedia:Disruptive editing."

"This page documents an English Wikipedia policy."

"This page in a nutshell: Intentionally making abusive edits to Wikipedia will result in a block."

DEF Vandalism:
"On Wikipedia, vandalism has a very specific meaning: editing (or other behavior) deliberately intended to obstruct or defeat the project's purpose, which is to create a free encyclopedia, in a variety of languages, presenting the sum of all human knowledge."
"The malicious removal of encyclopedic content, or the changing of such content beyond all recognition, without any regard to our core content policies of neutral point of view (which does not mean no point of view), verifiability and no original research, is a deliberate attempt to damage Wikipedia. There, of course, exist more juvenile forms of vandalism, such as adding irrelevant obscenities or crude humor to a page, illegitimately blanking pages, and inserting obvious nonsense into a page. Abusive creation or usage of user accounts and IP addresses may also constitute vandalism."

Consequences of vandalism, vandalism management
"Vandalism is prohibited. While editors are encouraged to warn and educate vandals, warnings are by no means a prerequisite for blocking a vandal (although administrators usually only block when multiple warnings have been issued). "

"Even if misguided, willfully against consensus, or disruptive, any good-faith effort to improve the encyclopedia is not vandalism."
"For example, edit warring over how exactly to present encyclopedic content is not vandalism." !!!
"Careful consideration may be required to differentiate between edits that are beneficial, edits that are detrimental but well-intentioned, and edits that are vandalism."
"If it is clear that the editor in question is intending to improve Wikipedia, those edits are not vandalism, even if they violate some other core policy of Wikipedia."
"When editors are editing in good faith, mislabeling their edits as vandalism makes them less likely to respond to corrective advice or to engage collaboratively during a disagreement,"

Handling
"Upon discovering vandalism, revert such edits, using the undo function or an anti-vandalism tool. Once the vandalism is undone, warn the vandalizing editor. Notify administrators at the vandalism noticeboard of editors who continue to vandalize after multiple warnings, and administrators should intervene to preserve content and prevent further disruption by blocking such editors. Users whose main or sole purpose is clearly vandalism may be blocked indefinitely without warning."

"examples of suspicious edits are those performed by IP addresses, red linked, or obviously improvised usernames"

One of the strategies to spot vandalism is "Watching for edits tagged by the abuse filter. However, many tagged edits are legitimate, so they should not be blindly reverted. That is, do not revert without at least reading the edit."

"Warn the vandal. Access the vandal's talk page and warn them. A simple note explaining the problem with their editing is sufficient. If desired, a series of warning templates exist to simplify the process of warning users, but these templates are not required. These templates include

    Level one: {{subst:uw-vandalism1}} This is a gentle caution regarding unconstructive edits; it encourages new editors to use a sandbox for test edits. This is the mildest warning.
    Level two: {{subst:uw-vandalism2}} This warning is also fairly mild, though it explicitly uses the word 'vandalism' and links to this Wikipedia policy.
    Level three: {{subst:uw-vandalism3}} This warning is sterner. It is the first to warn that further disruptive editing or vandalism may lead to a block.
    Level four: {{subst:uw-vandalism4}} This is the sharpest vandalism warning template, and indicates that any further disruptive editing may lead to a block without warning."

Types of vandalism:
"
* Abuse of tags: Bad-faith placing of non-content tags such as {{afd}}, {{db}}, {{sprotected}}, or other tags on pages that do not meet such criteria. This includes baseless removal of {{policy}} and related tags.

* Account creation, malicious: Creating accounts with usernames that contain deliberately offensive or disruptive terms is considered vandalism, whether the account is used or not. For Wikipedia's policy on what is considered inappropriate for a username, see Wikipedia:Username policy. See also Wikipedia:Sock puppetry.

* Avoidant vandalism: Removing {{afd}}, {{copyvio}} and other related tags in order to conceal deletion candidates or avert deletion of such content. However, this is often mistakenly done by new users who are unfamiliar with AfD procedures and such users should be given the benefit of the doubt and pointed to the proper page to discuss the issue.

* Blanking, illegitimate
For legitimate cases of blanking articles, see Wikipedia:Redirect § Redirects that replace previous articles.
Removing encyclopedic content without any reason, or replacing such content with nonsense. Content removal is not considered to be vandalism when the reason for the removal of the content is readily apparent by examination of the content itself, or where a non-frivolous explanation for the removal of apparently legitimate content is provided, linked to, or referenced in an edit summary.

Blanking that could be legitimate includes blanking all or part of a biography of a living person. Wikipedia is especially concerned about providing accurate and unbiased information on the living; blanking may be an effort to remove inaccurate or biased material. Due to the possibility of unexplained good-faith content removal, {{uw-test1}} or {{uw-delete1}}, as appropriate, should be used as initial warnings for content removals without more descriptive edit summaries.

* Copyrighted material, repeated uploading of: Uploading or using material on Wikipedia in ways which violate Wikipedia's copyright policies after having been warned is vandalism. Because users may be unaware that the information is copyrighted, or of Wikipedia policies on how such material may and may not be used, such action only becomes vandalism if it continues after the copyrighted nature of the material and relevant policy restricting its use have been communicated to the user.

* Edit summary vandalism: Making offensive edit summaries in an attempt to leave a mark that cannot be easily expunged from the record (edit summaries cannot simply be "reverted" and require administrative action if they have to be removed from a page's history). Often combined with malicious account creation.

* Format vandalism: Changing the formatting of a page unreasonably and maliciously. But many times, editors might just make an unintended mistake or are testing how the wikicode works. Sometimes it might be a bug in the Wikipedia software. Some changes to the format are not vandalism, but rather either good faith edits of editors who don't know the guidelines or simply a different opinion on how the format should look, in which case it is just a disputed edit.

* Gaming the system: Deliberate attempts to circumvent enforcement of Wikipedia policies, guidelines, and procedures by causing bad faith edits to go unnoticed. Includes marking bad faith edits as minor to get less scrutiny, making a minor edit following a bad faith edit so it won't appear on all watchlists, recreating previously deleted bad faith creations under a new title, use of the {{construction}} tag to prevent deletion of a page that would otherwise be a clear candidate for deletion, or use of sock puppets.

* Hidden vandalism: Any form of vandalism that makes use of embedded text, which is not visible to the final rendering of the article but visible during editing. This includes link vandalism, or placing malicious, offensive, or otherwise disruptive or irrelevant messages or spam in hidden comments for editors to see.

* Hoaxing vandalism: Deliberately adding falsities to articles, particularly to biographies of living people, with hoax information is considered vandalism.

* Image vandalism: Uploading shock images, inappropriately placing explicit images on pages, or simply using any image in a way that is disruptive. Please note though that Wikipedia is not censored for the protection of minors and that explicit images may be uploaded and/or placed on pages for legitimate reasons (that is, if they have encyclopedic value).

* Link vandalism: Adding or changing internal or external links on a page to disruptive, irrelevant, or inappropriate targets while disguising them with mislabeling.

* Page creation, illegitimate: Creating new pages with the sole intent of malicious behavior. It also includes personal attack pages (articles written to disparage the subject), hoaxes and other intentionally inaccurate pages. There are many other types of pages that merit deletion, even speedy deletion, but which are not vandalism. New users sometimes create test pages containing nonsense or even autobiographies, and doing so is not vandalism; such pages can also be moved to become their sandbox or userpage. Pages on non-notable topics are not vandalism. Blatant advertising pages, and blatant POV pushes, are not vandalism, but frequently happen and often lead to editors being blocked. It's important that people creating inappropriate pages be given appropriate communication; even if they aren't willing to edit within our rules, they are more likely to go away quietly if they understand why their page has been deleted.

* Page lengthening: Adding very large (measured by the number of bytes) amounts of bad-faith content to a page so as to make the page's load time abnormally long or even make the page impossible to load on some computers without the browser or machine crashing. Adding large amounts of good-faith content is not vandalism, though prior to doing so, one should consider if splitting a long page may be appropriate (see Wikipedia:Article size).

* Page-move vandalism: Changing the names of pages to disruptive, irrelevant, or otherwise inappropriate names. Only autoconfirmed or confirmed users can move pages.

* Silly vandalism: Adding profanity, graffiti, or patent nonsense to pages; creating nonsensical and obviously unencyclopedic pages, etc. It is one of the most common forms of vandalism. However, the addition of random characters to pages is often characteristic of an editing test and, though impermissible, may not be malicious.

* Sneaky vandalism: Vandalism that is harder to spot, or that otherwise circumvents detection, including adding plausible misinformation to articles (such as minor alteration of facts or additions of plausible-sounding hoaxes), hiding vandalism (such as by making two bad edits and only reverting one), simultaneously using multiple accounts or IP addresses to vandalize, abuse of maintenance and deletion templates, or reverting legitimate edits with the intent of hindering the improvement of pages. Impersonating other users by signing an edit with a different username or IP address also constitutes sneaky vandalism, but take care not to confuse this with appropriately correcting an unsigned edit made by another user. Some vandals even follow their vandalism with an edit that states "Rv vandalism" in the edit summary in order to give the appearance the vandalism was reverted.

* Spam external linking: Adding or continuing to add spam external links is vandalism if the activity continues after a warning. A spam external link is one added to a page mainly for the purpose of promoting a website, product or a user's interests rather than to improve the page editorially.

* Stockbroking vandalism: Adding information to pages about quoted companies concerning forthcoming mergers, announcements, and the like. The vandal's intent is to provide credibility to their attempt to promote shares.

* Talk page vandalism: Illegitimately deleting or editing other users' comments. However, it is acceptable to blank comments constituting vandalism, internal spam, or harassment or a personal attack. It is also acceptable to identify an unsigned comment. Users are also permitted to remove comments from their own user talk pages. A policy of prohibiting users from removing warnings from their own talk pages was considered and rejected on the grounds that it would create more issues than it would solve.

* Template vandalism: Modifying the wiki language or text of a template in a harmful or disruptive manner. This is especially serious, because it will negatively impact the appearance of multiple pages. Some templates appear on hundreds or thousands of pages, so they are permanently protected from editing to prevent vandalism.

* User and user talk page vandalism: Unwelcome, illegitimate edits to another person's user page may be considered vandalism. User pages are regarded as within the control of their respective users and generally should not be edited without permission of the user to whom they belong. See WP:UP#OWN. Related is Wikipedia:No personal attacks.

* Vandalbots: A script or "robot" that attempts to vandalize or add spam to a mass of pages."

This is not vandalism:
- boldly editing
- copyright violation
- disruptive editing or stubbornness --> edit warring
- edit summary omission
- editing tests by experimenting users: "Such edits, while prohibited, are treated differently from vandalism"
- harassment or personal attacks: "Personal attacks and harassment are not allowed. While some harassment is also vandalism, such as user page vandalism, or inserting a personal attack into an article, harassment in itself is not vandalism and should be handled differently."
- Incorrect wiki markup and style
- lack of understanding of the purpose of wikipedia: "editing it as if it were a different medium—such as a forum or blog—in a way that it appears as unproductive editing or borderline vandalism to experienced users."
- misinformation, accidental
- NPOV contraventions (Neutral point of view)
- nonsense, accidental: "sometimes honest editors may not have expressed themselves correctly (e.g. there may be an error in the syntax, particularly for Wikipedians who use English as a second language)."
- Policy and guideline pages, good-faith changes to: "If people misjudge consensus, it would not be considered vandalism;"
- Reversion or removal of unencyclopedic material, or of edits covered under the biographies of living persons policy: "Even factually correct material may not belong on Wikipedia, and removing such content when it is not in line with Wikipedia's standards is not vandalism."
- Deletion nominations: "Good-faith nominations of articles (or templates, non-article pages, etc) are not vandalism."

=====================================================================
https://en.wikipedia.org/wiki/Wikipedia:Administrator_intervention_against_vandalism

Notice board;
"This page is intended only for reports about active, obvious, and persistent vandals and spammers."
"Don't forget that blocking is a last resort;"

=====================================================================
https://en.wikipedia.org/wiki/Wikipedia:Disruptive_editing

"Disruptive editing is not vandalism, though vandalism is disruptive."
"Disruptive editing is not always intentional. Editors may be accidentally disruptive because they don't understand how to correctly edit, or because they lack the social skills or competence necessary to work collaboratively "
Okay what are disruptive edits that are not vandalism? (apart from edit wars)

"sometimes attracts people who seek to exploit the site as a platform for pushing a single point of view, original research, advocacy, or self-promotion."
"not verifiable through reliable sources or insisting on giving undue weight to a minority view."

"Collectively, disruptive editors harm Wikipedia by degrading its reliability as a reference source and by exhausting the patience of productive editors who may quit the project in frustration when a disruptive editor continues with impunity."

examples of disruptive editing:
"Engages in "disruptive cite-tagging"; adds unjustified {{citation needed}} tags to an article when the content tagged is already sourced, uses such tags to suggest that properly sourced article content is questionable."
"Rejects or ignores community input: resists moderation and/or requests for comment, continuing to edit in pursuit of a certain point despite an opposing consensus from impartial editors."

=====================================================================
https://en.wikipedia.org/wiki/Wikipedia:WikiBullying

"This is an explanatory supplement to the Wikipedia:Civility and Wikipedia:Ownership of articles policies.
This page is intended to provide additional information about concepts in the page(s) it supplements. This page is not one of Wikipedia's policies or guidelines, as it has not been thoroughly vetted by the community."

"WikiBullying is using Wikipedia to threaten and/or intimidate other people, whether they are Wikipedia editors or not."
"If you feel that you are being bullied or another user has threatened you with bodily harm, it is important that you report them immediately to the Incidents page on the Administrator's Noticeboard so the matter can be properly dealt with."
"All complaints about bullying, even those which turn out to be unjustified should be treated with seriousness and respect, and any WP:BOOMERANG on individuals who have complained they are being bullied is contrary to the principles of respect for thoughtful intellectual discourse that Wikipedia represents. No one should ever fear coming forward to make the community aware of a bullying concern."

"There are essentially two forms of bullying on Wikipedia: attacks against the individual editor by targeting a single user, or giving the perception of power aimed at the entire Wikipedia community at large."

"Forms of WikiBullying:

    1.1 Asserting ownership: "No article on Wikipedia is owned by any editor. Any text that is added to Wikipedia is freely licensed under WP:CC-BY-SA and other users are free to add, remove or modify it at will, provided that such editing is done responsibly."
    1.2 POV Railroading: "Point of View (POV) railroading refers to the use of bullying tactics to discredit an editor with an opposing viewpoint or eliminate them from a discussion."
    1.3 False accusations: "False accusations are a common form of bullying on Wikipedia, although people do sometimes make honest mistakes. Accusations of misconduct made without evidence are considered a serious personal attack."
    1.4 Misrepresentation: "Quoting others out of context and other forms of straw man argument are against the civility policy. Again, try to find out if there has been a misunderstanding."
    1.5 Making "no-edit" orders contrary to policy: "Another form of wikibullying is to issue no-edit orders which are not backed by current policies (or guidelines). A "no-edit" order is a message sent to a single editor (who is not banned) or to the Wikipedia community not to edit at all or in a particular manner, or not to edit a particular page or part of a page at all or in a particular manner. These messages can be sent to a user's talk page, placed on an article's talk page, or in hidden text that would not be missed if an editor attempts to edit the article or section. No editor may unilaterally take charge over an article or part of an article by sending no-edit orders.

There are some no-edit orders that are acceptable. For example, if a consensus has already been formed regarding a topic, and a single editor has constantly stubbornly defied the ruling, politely discussing this one-on-one on the user's talk page is acceptable."
    1.6 Wikihounding: "Wikihounding is the singling out of one or more editors, and joining discussions on multiple pages or topics they may edit or multiple debates where they contribute, to repeatedly confront or inhibit their work. This is with an apparent aim of creating irritation, annoyance or distress to the other editor. Wikihounding usually involves following the target from place to place on Wikipedia."
    1.7 Use of hidden text: "Some unacceptable uses are:

    Telling all other editors not to edit the page
    Telling others not to remove a section of the article, as if the section were written in stone
    Telling others that a page should not be proposed for deletion, when this may be doubted by others
    Writing new guidelines that apply specifically to the page and branding them as "policy." In the past, policies that have been proposed for a single article have failed to attain a consensus."
    1.8 Real life threats: "The Wikimedia Foundation, if need be, will investigate or arrange for law enforcement to investigate threats of violence."
"

============================================================
https://en.wikipedia.org/wiki/Wikipedia:Requests_for_page_protection

"Full protection is used to stop edit warring between multiple users or to prevent vandalism to high-risk templates; semi-protection and pending changes are usually used only to prevent IP and new user vandalism (see the rough guide to semi-protection); and move protection is used to stop pagemove revert wars. Extended confirmed protection is used where semi-protection has proved insufficient (see the rough guide to extended confirmed protection)."

=============================================================
https://en.wikipedia.org/wiki/Wikipedia:Offensive_material

"In original Wikipedia content, a vulgarity or obscenity should either appear in its full form or not at all;"

"A cornerstone of Wikipedia policy is that the project is not censored. Wikipedia editors should not remove material solely because it may be offensive, unpleasant, or unsuitable for some readers. However, this does not mean that Wikipedia should include material simply because it is offensive, nor does it mean that offensive content is exempted from regular inclusion guidelines. "

=============================================================
https://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view

"This page in a nutshell: Articles must not take sides, but should explain the sides, fairly and without editorial bias. This applies to both what you say and how you say it."
"This policy is non-negotiable, and the principles upon which it is based cannot be superseded by other policies or guidelines, nor by editor consensus. "

"Achieving what the Wikipedia community understands as neutrality means carefully and critically analyzing a variety of reliable sources and then attempting to convey to the reader the information contained in them fairly, proportionately, and as far as possible without editorial bias"
"Wikipedia aims to describe disputes, but not engage in them."
" Editors, while naturally having their own points of view, should strive in good faith to provide complete information, and not to promote one particular point of view over another. As such, the neutral point of view does not mean exclusion of certain points of view, but including all verifiable points of view which have sufficient due weight."

"As a general rule, do not remove sourced information from the encyclopedia solely on the grounds that it seems biased. Instead, try to rewrite the passage or section to achieve a more neutral tone."

"Remove material only where you have a good reason to believe it misinforms or misleads readers in ways that cannot be addressed by rewriting the passage."

"The best name to use for a topic may depend on the context in which it is mentioned; it may be appropriate to mention alternative names and the controversies over their use, particularly when the topic in question is the main topic being discussed."
"Try not to quote directly from participants engaged in a heated dispute; instead, summarize and present the arguments in an impartial tone."

==============================================================
Manual evaluation of filter tags

The following filter categories have been identified (sometimes a filter was labeled with more than one tag):

- Vandalism
  - hoaxing
  - silly vandalism (e.g. repeating characters, inserting swear words)
  - spam
  - sockpuppetry
  - long term abuse
  - harassment/personal attacks
    - doxxing
    - impersonation
  - trolling
  - copyright violation

  Labeled according to the vandalism typology (see above):
  - link vandalism
  - abuse of tags
  - username vandalism
  - image vandalism
  - avoidant vandalism
  - talk page vandalism
  - page move vandalism
  - template vandalism
  - vandalbots

  Kind of similar:
  - seo
  - stockbroker vandalism
  - biased pov
  - self promotion
  - conflict of interest

In between
- edit warring
- political controversy
- politically/religiously motivated hate

- Good faith
  - bad style ("unencyclopedic edits" e.g. citing a blog or mentioning a hypothetical future album release)
  - laziness


- Maintenance
  - bugs
  - wiki policy (compliance therewith)
  - test filters


A lot of filters are disabled/deleted because:
* they hit too many false positives
* they were implemented to target specific incidents and these vandalism attempts stopped
* they were tested and merged into other filters
* there were too few hits and the conditions were too expensive

Multiple filters carry the comment "let's see whether this hits something", which suggests that edit filter managers have the discretion to implement filters they consider necessary, and do so.
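
To make the tagging above more concrete, here is a minimal Python sketch of the kind of condition a "silly vandalism" filter encodes (repeated characters, inserted swear words). This is an illustration only: real filters are written in the AbuseFilter rule language, and the word list, thresholds and exemption logic below are assumptions.

```python
import re

# Toy heuristic mimicking a "silly vandalism" edit filter condition.
# Real filters use the AbuseFilter rule language (variables such as added_lines,
# user_groups); the regexes and word list here are assumptions for illustration.
REPEATED_CHARS = re.compile(r"(.)\1{9,}")                 # e.g. "aaaaaaaaaa"
SWEAR_WORDS = re.compile(r"\b(poop|penis|idiot)\b", re.IGNORECASE)

def looks_like_silly_vandalism(added_text: str, user_is_autoconfirmed: bool) -> bool:
    """Flag an edit if a non-autoconfirmed user adds repeated chars or swear words."""
    if user_is_autoconfirmed:
        return False  # filters frequently exempt established users
    return bool(REPEATED_CHARS.search(added_text) or SWEAR_WORDS.search(added_text))

if __name__ == "__main__":
    print(looks_like_silly_vandalism("WIKIPEDIA IS POOOOOOOOOOOOP!!!", False))  # True
    print(looks_like_silly_vandalism("Added a sourced paragraph.", False))      # False
```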


================================================================
https://en.wikipedia.org/w/api.php?action=help&modules=main

action

    Which action to perform.

    abusefiltercheckmatch
        Check to see if an AbuseFilter matches a set of variables, an edit, or a logged AbuseFilter event.
    abusefilterchecksyntax
        Check syntax of an AbuseFilter filter.
    abusefilterevalexpression
        Evaluates an AbuseFilter expression.
    abusefilterunblockautopromote
        Unblocks a user from receiving autopromotions due to an abusefilter consequence.
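
A minimal sketch of calling one of these modules from Python, assuming the rule text is passed in a `filter` parameter; depending on the wiki's configuration, abusefilterchecksyntax may additionally require a logged-in account with the appropriate abusefilter rights and a CSRF token.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# Sketch: ask the API to syntax-check a toy AbuseFilter rule.
# Assumption: the rule text goes in the "filter" parameter; depending on the
# wiki's configuration this module may require login and a CSRF token.
toy_rule = 'page_namespace == 0 & added_lines irlike "penis"'
resp = requests.post(API, data={
    "action": "abusefilterchecksyntax",
    "filter": toy_rule,
    "format": "json",
})
print(resp.json())
```

If I recall correctly, the extension also exposes read-only query submodules (list=abusefilters, list=abuselog), which are probably the more convenient entry point for gathering filter statistics.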

================================================================
https://en.wikipedia.org/wiki/Wikipedia:Database_download


================================================================
https://stats.wikimedia.org/v2

To generate stats for different wiki projects
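
The Wikistats 2 frontend is backed by the Wikimedia Analytics REST API, which can also be queried directly; the endpoint path below (pageviews aggregate) is written from memory and should be checked against the API documentation before use.

```python
import requests

# Sketch: fetch daily pageview totals for English Wikipedia for January 2019.
# The URL schema (project/access/agent/granularity/start/end) is an assumption
# based on the documented Analytics REST API.
URL = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/"
       "en.wikipedia/all-access/user/daily/2019010100/2019013100")
resp = requests.get(URL, headers={"User-Agent": "edit-filter-notes/0.1 (research)"})
for item in resp.json().get("items", []):
    print(item["timestamp"], item["views"])
```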

=====================================================================
Claudia: * A focus on the good-faith policies/guidelines is a historical development. After the huge surge in edits Wikipedia experienced starting in 2005, the community needed a means to handle these (and the proportional amount of vandalism). They opted for automation. The automated systems branded a lot of good-faith edits as vandalism, which drove newcomers away. A policy focus on good faith is part of the effort to fix this.

=====================================================================

Toolforge links:
https://wikitech.wikimedia.org/wiki/Help:Toolforge#Troubleshooting_2
https://wikitech.wikimedia.org/wiki/Help:Toolforge#Contact
https://wikitech.wikimedia.org/wiki/Analytics#Contact
https://www.mediawiki.org/wiki/Toolserver:Main_Page

Abuse Filter Project on Phabricator
https://phabricator.wikimedia.org/project/view/217/

Abuse Filter git repo
https://gerrit.wikimedia.org/r/#/c/mediawiki/extensions/AbuseFilter/+/489705/3/includes/AbuseFilter.php

Evidence abuse filter history still exists
https://en.wikipedia.org/wiki/Special:AbuseFilter/history

====================================================================

Get hold of abuse_filter_history

https://gerrit.wikimedia.org/r/plugins/gitiles/operations/puppet/+/refs/heads/production/modules/profile/templates/labs/db/views/maintain-views.yaml
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/AbuseFilter/+/refs/heads/master/abusefilter.tables.sql#63
https://phabricator.wikimedia.org/T123978

#wikimedia-cloud
"
there'll be such a table, I don't think there's a view for it to expose it to the public
it's not in https://gerrit.wikimedia.org/r/plugins/gitiles/operations/puppet/+/refs/heads/production/modules/profile/templates/labs/db/views/maintain-views.yaml
I don't know enough about AbuseFilter to know how easy it would be to write a view for this table
https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/AbuseFilter/+/refs/heads/master/abusefilter.tables.sql#63
you'd probably need to respect afh_deleted, and join against abuse_filter to check filter visibility?
in order to expose a view?
aha: https://phabricator.wikimedia.org/T123978
yes
"

===================================================
https://en.wikipedia.org/wiki/Wikipedia:No_original_research

"Wikipedia articles must not contain original research. The phrase "original research" (OR) is used on Wikipedia to refer to material—such as facts, allegations, and ideas—for which no reliable, published sources exist.[a]"
"(This policy of no original research does not apply to talk pages and other pages which evaluate article content and sources, such as deletion discussions or policy noticeboards.) "
"The prohibition against OR means that all material added to articles must be attributable to a reliable, published source, even if not actually attributed.[a]"

===================================================
https://en.wikipedia.org/wiki/Wikipedia:Harassment
https://en.wikipedia.org/w/index.php?title=Wikipedia:Harassment&oldid=886343748

"This page in a nutshell: Do not stop other editors from enjoying Wikipedia by making threats, repeated annoying and unwanted contacts, repeated personal attacks, intimidation, or posting personal information."
"Usually (but not always), the purpose is to make the target feel threatened or intimidated"

"Edits constituting harassment will be reverted, deleted, or suppressed, as appropriate, and editors who engage in harassment are subject to blocking. "

"The prohibition against harassment applies equally to all Wikipedians. It is as unacceptable to harass a user with a history of inept or disruptive behavior as it is to harass any other user."

Hounding
"Hounding on Wikipedia (or "wikihounding") is the singling out of one or more editors, joining discussions on multiple pages or topics they may edit or multiple debates where they contribute, to repeatedly confront or inhibit their work. "
" Hounding usually involves following the target from place to place on Wikipedia. "

Doxxing
"Posting another editor's personal information is harassment, unless that person has voluntarily posted his or her own information, or links to such information, on Wikipedia. Personal information includes legal name, date of birth, identification numbers, home or workplace address, job title and work organisation, telephone number, email address, other contact information, or photograph, whether such information is accurate or not. Posting such information about another editor is an unjustifiable and uninvited invasion of privacy"

"attempted outing is sufficient grounds for an immediate block. This applies to the personal information of both editors and non-editors. "

" When reporting an attempted outing take care not to comment on the accuracy of the information. "

"Dredging up their off-site opinions to repeatedly challenge their edits can be a form of harassment"

"Nothing in this policy prohibits the emailing of personal information about editors to individual administrators, functionaries, or arbitrators, or to the Wikimedia Foundation, when doing so is necessary to report violations of confidentiality-sensitive policies"
" Only the minimum information necessary should be conveyed and the minimum number of people contacted. Editors are warned, however, that the community has rejected the idea that editors should "investigate" each other."

"Posting links to other accounts on other websites is allowable in specific situations (but see also Wikipedia:Linking to external harassment)."

"Also, if individuals have identified themselves without redacting or having it oversighted, such information can be used for discussions of conflict of interest (COI) in appropriate forums"

Private correspondence
"There is no community consensus regarding the posting of private off-wiki correspondence. "
"The Wikipedia Arbitration Committee once stated as an editing principle that "In the absence of permission from the author (including of any included prior correspondence) or their lapse into public domain, the contents of private correspondence, including e-mails, should not be posted on-wiki""

User space harassment
"A common problem is harassment in userspace. Examples include placing numerous false or questionable "warnings" on a user's talk page, restoring such comments after a user has removed them, placing "suspected sockpuppet" and similar tags on the user page of active contributors, and otherwise trying to display material the user may find annoying or embarrassing in their user space."

Off Wiki harassment
"Inappropriate or unwanted public or private communication, following, or any form of stalking, when directed at another editor, violates the harassment policy."

"In alignment with the protection of editors from harassment described throughout the rest of this policy, edits that harass living or recently deceased people who are not members of the Wikipedia community are also prohibited. Per the oversight policy, harassing content will be deleted or suppressed."

"In case of problems administrators have exactly the same right as any other user to decline or withdraw from a situation that is escalating or uncomfortable, without giving a reason, or to contact the Arbitration Committee if needed."

"Users who insist on a confrontational style marked by harassment and/or personal attacks are likely to become involved in the dispute resolution process, and may face serious consequences such as blocks, arbitration, or being subjected to a community ban. Harassment negatively affects editor retention. "

=====================================================
https://en.wikipedia.org/wiki/Wikipedia:Responding_to_threats_of_harm

emergency response page

========================================================
https://en.wikipedia.org/wiki/Wikipedia:Assume_good_faith
https://en.wikipedia.org/w/index.php?title=Wikipedia:Assume_good_faith&oldid=889253693

"It is a generally accepted standard that editors should attempt to follow, though it is best treated with common sense, and occasional exceptions may apply. Any substantive edit to this page should reflect consensus. When in doubt, discuss first on the talk page."

"This page in a nutshell:

* Unless there is clear evidence to the contrary, assume that people who work on the project are trying to help it, not hurt it.
* If criticism is needed, discuss editors' actions, but avoid accusing others of harmful motives."
//especially the 2nd one is interesting, because harmful motives are what distinguish good-faith (but nevertheless disruptive) edits from vandalism

"Most people try to help the project, not hurt it. If this were untrue, a project like Wikipedia would be doomed from the beginning. "
"This guideline does not require that editors continue to assume good faith in the presence of obvious evidence to the contrary (e.g. vandalism)"
"Assuming good faith does not prohibit discussion and criticism. Rather, editors should not attribute the actions being criticized to malice unless there is specific evidence of such. "

"When disagreement occurs, try to the best of your ability to explain and resolve the problem, not cause more conflict, "

"When doubt is cast on good faith, continue to assume good faith yourself when possible. Be civil and follow dispute resolution procedures, "

" If you wish to express doubts about the conduct of fellow Wikipedians, please substantiate those doubts with specific diffs and other relevant evidence, so that people can understand the basis for your concerns. "

"Although bad conduct may seem to be due to bad faith, it is usually best to address the conduct without mentioning motives"

"Be careful about citing this principle too aggressively. Just as one can incorrectly judge that another is acting in bad faith, so too can one mistakenly conclude that bad faith is being assumed; exhortations to "Assume Good Faith" can themselves reflect negative assumptions about others." lol

Good faith and newcomers
"It is important to be patient with newcomers, who will be unfamiliar with Wikipedia's culture and rules, but may nonetheless turn out to be valuable contributors. "
"Many new users who lack an intuitive grasp of Wikipedia customs are gradually brought around, once the logic behind these customs becomes clearer to them. "

======================================================================
https://en.wikipedia.org/wiki/Wikipedia:Counter-Vandalism_Unit/Vandalism_studies

The Vandalism Studies project is a portion of the Counter-Vandalism Unit designated to conduct research related to unconstructive edits on Wikipedia.

There are 3 studies (plus an Obama article study):
* Study 1 (talk page): Done
* Study 2 (talk page): ☒ Not done and not likely to be done
* Obama article study (talk page): ☒ Stale
* Study 3 (suggest ideas): Discussion ongoing, but planned for November

====================================================================
https://en.wikipedia.org/wiki/User:Angela/Vandalism_study

User:Angela conducted a vandalism study on her own user page.

"Vandalism studies/Study1 found that almost all vandalism (97%) is made by unregistered users. Looking at vandalism on my own user page, I find a very different result. Almost half of the vandalism is made by registered users. "

=======================================================================
https://en.wikipedia.org/wiki/User:Colonel_Chaos/study

Another user-conducted vandalism study.
In this one, the author vandalised featured articles (with different types of vandalism) and measured the time to revert.

========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2009-06-22/Vandalism

And yet another user-conducted study.
Its author: "Loren Cobb (User:Aetheling) holds a Ph.D. in mathematical sociology and is a research professor in the Department of Mathematical and Statistical Sciences at the University of Colorado Denver."

studies the survival function of vandalism

The two primary results from this study are: (a) the median time to correction is down to four minutes, and (b) some subtle forms of vandalism still persist for months and even years.

suggests using the median rather than the mean time to correction

100 randomly selected articles
"All data collection occurred on 2009-06-11"

"
Results
    Of the 100 articles, fully 75 had never been vandalized.
    Of the 25 articles that were vandalized at least once, the most recent such instance of vandalism was eventually corrected in 23 articles.
    In five (20%) of the vandalized articles, the most recent instance of vandalism was corrected in less than one minute. A further four instances were corrected in less than two minutes.
    The median time to correction was four minutes.
    Two articles were found to have suffered vandalism that was never corrected. One of these was a subtle act of vandalism that was committed on 2007-02-23, and still not detected by the date of the study, 2009-06-11.
"

=======================================================================================
https://en.wikipedia.org/wiki/Wikipedia_talk:Counter-Vandalism_Unit/Vandalism_studies/Archive_1

"
Sources of vandalism

Vandalism comes from:

    Anonymous IP addresses
    Newly registered users (typically vandal-only accounts)
    Disruptive editors (limited but some constructive work)
    Trolls, sock puppets, etc. - disgruntled "power users"
"

=======================================================================================
https://en.wikipedia.org/wiki/Wikipedia:STiki

"STiki is a tool available to trusted users that is used to detect and revert vandalism, spam, and other types of unconstructive edits made at Wikipedia. "

"STiki chooses edits to show to end users; if a displayed edit is judged to be vandalism, spam, etc., STiki streamlines the reversion and warning process. STiki facilitates collaboration in reverting vandalism; a centrally stored lists of edits to be inspected are served to STiki users to reduce redundant effort."

"STiki may only be used by editors with a Wikipedia account. Additionally, the account must meet some qualifications to reduce the probability of users misidentifying vandalism."

"The account must have any one of: (1) the rollback permission/right, (2) at least 1000 article edits (in the article namespace, not to talk/user pages), or (3) special permission via the talk page. We emphasize that users must take responsibility for their actions with STiki. "

"After login, users primarily interact with the GUI tool by classifying edits into one of four categories:
vandalism
good faith revert
pass
innocent "
//interestingly, at the initial tool presentation~\cite{WestKanLee2010}, there was no "good faith" option. It seems to have been added quite promptly afterwards though, since the screenshot of the tool on the page already has the button and claims to have been made on 28 February 2010

"Uncertainty over malice: It can be tricky to differentiate between vandalism and good-faith edits that are nonetheless unconstructive. "

============================================================================
https://en.wikipedia.org/wiki/Wikipedia:STiki/leaderboard

"Above all else, it should be emphasized that STiki is not a competition."
//compare also~\cite{HalRied2012} who warn against Wikipedia becoming gamified with vandals being "monsters"

" STiki users who operate the tool recklessly in the hope of inflating their statistics are not helping themselves or the project "

=========================================================================
https://netzpolitik.org/2019/npp-170-momentum-memes-und-grossproteste/
30.03.2019 um 09:00 Uhr

"europaweiten Aktionstag am 23. März, bei dem zusammengerechnet mehr als 170.000 Menschen auf die Straße gingen"

"Sie ließen sich nicht abbringen durch Politiker, die ihren Protest als „von Google gekauft“ diskreditierten"

=========================================================================
https://www.heise.de/newsticker/meldung/Wir-sind-die-Bots-ueber-1000-demonstrieren-gegen-Artikel-13-4311105.html
16.02.2019 18:04 Uhr

"Am Freitag bezeichnete der Europaabgeordnete Sven Schulze die Protest-Mails, die bei ihm eingingen, auf Twitter sogar als eine von Google gesteuerte Fake-Kampagne, weil viele der Absender Gmail-Adressen verwendeten."
https://twitter.com/statuses/1096445520770404352

"So soll am Anfang kommender Woche die Online-Petition gegen Upload-Filter, die bereits über 4,7 Millionen Unterschriften gesammelt hat, an Vertreter der Bundesregierung übergeben werden."

=========================================================================
https://netzpolitik.org/2019/reaktionen-auf-urheberrechtsreform-schwarzer-tag-fuer-die-netzfreiheit/
27.03.2019 um 11:04 Uhr

"Nach einer Protestkampagne, die in den letzten Wochen immer mehr an Fahrt aufnahm und am vergangenen Wochenende mit Protesten von fast 200.000 Menschen ihren Höhepunkt fand, kam gestern mit der Abstimmung im Europaparlament die Ernüchterung: Das Parlament nahm die EU-Urheberrechtsreform an."

"Die linke EU-Abgeordnete Martina Michels kommentierte:

    Naive Technikgläubigkeit und härtester Lobbyismus von Springer & Co. haben dazu geführt, dass wir jetzt mit einer Richtlinie konfrontiert sind, die die Meinungsfreiheit bedroht, die Medienpluralität einschränkt und den meisten Kreativen keinen Cent mehr bringen wird."

"In eine ähnliche Richtung kommentierte Diego Naranjo vom Dachverband digitaler Bürgerrechtsorganisatioen EDRi:

    Enttäuschenderweise hilft die neu angenommene Richtlinie nicht kleinen unabhängigen Urhebern, sondern stärkt Technologiegiganten. Noch alarmierender ist, dass Artikel 13 der Richtlinie einen gefährlichen Präzedenzfall für Internetfilter und automatisierte Zensurmechanismen schafft – in der EU und weltweit."

"Tabea Rößner, Netzpolitische Sprecherin der Grünen im Bundestag, sagt:

    Die EU-Richtlinie schafft allerdings hauptsächlich Rechtsunsicherheit zulasten kleiner Plattformen, über denen künftig das Haftungsrisiko schwebt. Plattformen werden eher rigoros löschen, bevor sie das Risiko eines Gerichtsprozesses auf sich nehmen. Für die großen Plattformen, auf die die Regelung abzielt, ist das aufgrund ihrer Marktmacht kein Problem. Im Gegenteil, für sie wird sogar noch ein neues Geschäftsmodell mit ihren bereits entwickelten Filtertechnologien geschaffen. Diese sind allerdings sehr fehleranfällig und können beispielsweise Satire oder Zitate nicht von einer Urheberrechtsverletzung unterscheiden."

=========================================================================
https://twitter.com/Senficon/status/1110527108349087744
"Die allerletzte Chance, #Uploadfilter & #Artikel13 zu stoppen: Wenn die Bundesregierung ihre Zustimmung im Rat zurückzieht, kann die Reform nicht in Kraft treten (Abstimmung im @EUCouncil vorauss. 9. April)." -- Julia Reda

=========================================================================
https://netzpolitik.org/2019/die-protestbewegung-hat-gewonnen/
26.03.2019 um 15:07 Uhr
"In Deutschland haben in den vergangenen Wochen 300.000 Menschen gegen die Klimapolitik und fast 200.000 gegen die Netzpolitik der Bundesregierung demonstriert. Die Jugend entdeckt die Straße, die selbst gemalten Schilder und den Protest. Und das ist ein gutes, ein hoffnungsvolles Zeichen für die Zukunft. "

"Am Ende war es das zu langsame Überschwappen der Proteste in andere Länder, ein paar Tausend Leute in Paris oder Rom die fehlten, um die Reform zu kippen."

"Die Konservativen im Parlament wollten sogar die Abstimmung vorziehen, weil sie bemerkten, dass die Proteste mit jedem Tag stärker wurden. Die Kommission beschimpfte den Protest als Mob, Politiker verunglimpften die Demonstranten als Bots, das Parlament spielte Werbevideos für die Reform auf Twitter ab und der CDU-Mann Caspary griff gar in die Diktatoren-Argumentekiste – und bezeichnete die Demonstrierenden als gekauft."

=========================================================================
https://netzpolitik.org/2019/chance-verpasst-dieses-urheberrecht-bleibt-in-der-vergangenheit-stecken/
26.03.2019 um 12:54 Uhr

"Dabei sollten wir uns eigentlich freuen: Wir sind Urheber, wir sind Verleger und laut den Verbänden, die uns vertreten, sollten jetzt goldene Zeiten für uns anbrechen. Das Geld von Google und Co. wird nur so zu uns fließen! Allerdings bezweifle ich, dass uns diese Reform mehr Geld einbringen wird. Und selbst wenn, die Kollateralschäden für eine demokratische Netzöffentlichkeit sind viel größer als den Befürwortern dieser Reform mit ihrem Tunnelblick auf wirtschaftliche Eigeninteressen bewusst sein dürfte."

"Alte und reformunfähige Verwertungsgesellschaften verfügen weiterhin über die Deutungshoheit, obwohl weite Teile der neuen Urheber in diesen aus vielen Gründen keine Heimat finden werden. Kein Wunder, dass hier massive Konflikte entstehen, zwischen neuen und alten Verwertungsformen, zwischen denen, die das Internet in ihre Verwertungsstrategie eingebunden haben und denjenigen, die hoffen, dass die alte Welt weiter läuft wie bisher."

" Beim Leistungsschutzrecht für Presseverleger hieß es lange Zeit, die Einnahmen würden zwischen Verlegern und Urhebern geteilt. Beim letzten Kompromiss sind die Ansprüche der Urheber dann „zufällig“ zugunsten der Verleger und Medienkonzerne rausgeflogen."

"Hier hat sich die alte Medienwelt nach einem 15 Jahre währenden Kreuzzug gegen Youtube nochmal durchgesetzt. Der Kollateralschaden ist, dass sie damit ihre alte Welt mit ihren alten Werkzeugen für weitere 20 Jahre in Stein gemeißelt haben – ohne Antworten für eine sich entwickelnde neue Welt mit neuen Produzenten zu geben."

"Mit der Schrotflinte auf Youtube geschossen, halbes Netz mitgetroffen"

"Das Problem von Anfang an bei dieser Reform: Man schießt mit der Schrotflinte auf die großen Plattformen, die Uploadfilter-Systeme schon mehr und weniger schlecht einsetzen und trifft so vor allem zahlreiche kleine Plattformen, die bislang ohne Filter-Systeme ausgekommen sind.  "
"Sie müssen künftig ebenfalls filtern, auf Uploads ihrer Nutzer verzichten oder den Dienst ganz einstellen."

"Wir sind gespannt, wann Verlagsjustiziare feststellen, dass ihre Angebote mit Nutzerinhalten auch unter diese Definition fallen, sie dafür haften könnten und auch Uploadfilter installieren müssen."

"Wir sind mittlerweile alle Urheber. Diese Reform geht aber immer noch davon aus, dass es nur wenige professionelle Urheber gibt, für die es einen Rechtsrahmen braucht. Das ist nur aus einer Perspektive zu schaffen, die das Internet noch eher aus der Zeitung kennt."

"Diese Reform hat keine Antwort darauf gefunden, dass private Nutzer bei nicht-kommerziellen Alltagshandlungen Abmahnungen riskieren, wenn sie etwa Memes auf ihrer Webseite veröffentlichen "

"Diese Reform hat keine Antwort darauf, dass legitime Nutzungsformen ständig mit einem Bein in einer Urheberrechtsverletzung stehen. Da reicht es auch nicht, wenn im Text steht, dass die Upload-Filter bitte lieb zu Memes sein sollten. Das ist kein Rechtsanspruch wie ein Recht auf Remix im Rahmen einer Schrankenregelung, die man hätte einbauen können."

"Profitieren werden die Großen, seien es die Plattformen oder Medienkonzerne. Darunter leiden werden die Kleinen"
"Urheber müssen mehr mit Verwertern teilen, ihre Stellung wird aber nicht wirklich verbessert. "
"Neue Ausnahmen für Fair Use, für Remix oder für Bagatellnutzungen wird es weiterhin nicht geben."

============================================================================
https://netzpolitik.org/2019/internetpioniere-warnen-vor-uploadfiltern-im-kampf-gegen-terror/
03.04.2019

Upload filters should be useful not only to enforce copyright, but apparently also in the war on terror -.-
Internet pioneers sign an open letter against the idea.

"Uploadfilter und kurze Löschfristen

Der im Herbst vorgestellte Entwurf der EU-Kommission zielt darauf ab, mutmaßlich terroristische Inhalte aus dem Netz zu entfernen. Dies soll die Radikalisierung von Nutzern eindämmen und in einem weiteren Schritt dazu führen, dass weniger Terroranschläge verübt werden, führte jüngst die Justizkommissarin Věra Jourová gegenüber netzpolitik.org aus.

Umsetzen will die Kommission das mit einer engmaschigen Zensurinfrastruktur, die für alle in Europa tätigen Online-Dienste gelten soll. Innerhalb von nur einer Stunde müssten sie auf einen Hinweis von Behörden reagieren und mutmaßlich terroristische Inhalte auf ihren Plattformen sperren oder löschen.

Zudem sieht der Entwurf „proaktive Maßnahmen“ vor, also Uploadfilter, mit denen die Anbieter solche Inhalte direkt nach dem Hochladen erkennen, einschätzen und gegebenenfalls entfernen sollen. Im Falle von Verstößen kann der Einsatz von Uploadfiltern angeordnet werden, hohe Geldstrafen sollen für Abschreckung sorgen."

"Ein ungenaue Definition von „terroristischen Inhalten“ würde durch ihre „extreme Bandbreite“ dazu führen, dass im Zweifel zu viel als zu wenig gelöscht würde, auch journalistische oder wissenschaftliche Inhalte. "

"Die kaum umsetzbare Löschfrist würde zudem insbesondere kleinere Anbieter hart treffen und neben befürchteten „Overblocking“-Effekten auch „große multinationale Plattformen“ stärken"

" Darüber hinaus riskiere der Einsatz der bis auf Weiteres mangelhaften Technologie das Unterdrücken wichtiger Nachrichten, etwa Zeitungsartikel oder Berichterstattung aus Kriegsgebieten."

==================================================================================
https://en.wikipedia.org/wiki/Wikipedia:Administrator_intervention_against_vandalism

"This page is intended only for reports about active, obvious, and persistent vandals and spammers. "
"Don't forget that blocking is a last resort;"
"if there is a less restrictive way to solve the problem, please pursue it."

"Checklist:
1.The edits of the reported user must be obvious vandalism or obvious spam.
2.Except for egregious cases, the user must have been given enough warning(s) to stop their disruptive behavior.
3.The warning(s) must have been given recently and there must be reasonable grounds to believe the user(s) will further disrupt the site in the immediate future.
4.The appropriate vandal template should be used if you decide that a report should be filed.
  * For logged-in users only: It should look like: *{{vandal|Example user}} concise reason e.g. vandalised past 4th warning. ~~~~
  * For IP users only: it should look like: *{{IPvandal|192.0.2.16}} concise reason e.g. vandalised past 4th warning. ~~~~
5.Requests for further sanctions against a blocked user (e.g., talk page, e-mail blocks) should be made at Wikipedia:Administrators' noticeboard/Incidents, as a bot automatically removes reports made here about accounts that are blocked.
6.Reports of sockpuppetry should be made at Wikipedia:Sockpuppet investigations unless the connection between the accounts is obvious and disruption is recent and ongoing."

There are user-reported and bot-reported sections.

================================================================================
https://en.wikipedia.org/wiki/Wikipedia:Guide_to_administrator_intervention_against_vandalism

"This help page is a how-to guide.
It details processes or procedures of some aspect(s) of Wikipedia's norms and practices. It is not one of Wikipedia's policies or guidelines, as it has not been thoroughly vetted by the community."

"This guide is intended to prevent invalid reports that needlessly take up administrators' time,"

"Administrator intervention against vandalism is for reporting users currently engaging in persistent, clear vandalism or spamming."

"Report only clear violations that do not require discussion or detailed explanations. If there is a reasonable chance that something may not be vandalism, it probably should be reported elsewhere, or not at all. "

"Vandals should always receive enough warnings before being reported unless they are vandalism-only accounts. What constitutes "enough" is left to your best judgment. "

"Consider the user's past edits, warnings and blocks, the severity of their offense, the likelihood that their edit(s) could have been made in error or otherwise in good faith, and the type of user in question (IP addresses may be shared or dynamic, and old warnings could be irrelevant to the current situation). "

!!"Blocking is meant to be preventive, not punitive."
"Therefore, the user must show a strong likelihood of making further disruptive edits despite warnings and being informed of the blocking policy. "
"Always give a final warning, and report only if the vandal has vandalized at least once after that."

"When reporting at AIV is not appropriate

    Any vandal who hasn't been warned properly should not be reported.
    Violations of the three-revert rule should be reported to Wikipedia:Administrators' noticeboard/3RR
    If you suspect someone of sockpuppetry, file a report at Wikipedia:Sockpuppet investigations. Obvious and malicious sockpuppets engaging in vandalism may be reported to AIV. A link to the sockpuppetry report should be included in the reason for reporting.
    Usernames that clearly violate username policy should be reported at Wikipedia:Usernames for administrator attention.
    Do not report edit wars or other disruptive behavior that doesn't fit the description at Wikipedia:Vandalism. These can instead be reported to the administrators' noticeboard or its incidents subpage. AIV deals mainly with obviously malicious edits that require no discussion; complex cases should usually be referred to other boards."

=================================================================================
https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter/Archive_1

"What is the intent of the filter? We can't possible expect it to block all malicious edits. That's absurd. Do we expect it to block the sort of edits now automatically detected by bots? in other words, do we expect to lower the workload for vandal patrollers and return WP to a state where most vandal patrolling is done via Recent changes? Or do we expect the filter to block more sophisticated attempts at vandalism?"
// unfortunately, there's no direct answer to this

"Now, suppose that the configuration is public. I'm still not sure I see the point. Anti-vandal bots are well suited for this work. They're more flexible than anything you can write in an extension. Their code can be changed more easily, because it can be tweaked live by someone a) dedicated to the specific project and b) without root database access. Their configuration can also probably be changed more easily. There's a good chance that any change to this configuration will require endless community discussion and bickering and polls and consensus and who knows what. Look at how enwiki deals with questions like changing autoconfirmed criteria, which is more or less a much narrower version of the same thing. A bot operator, on the other hand, can make a change immediately if it appears to be a good idea. If the community then objects, of course, they can be forced to change it back or have their bot blocked.

So overall, I oppose this proposal unconditionally if any steps are taken to hide the configuration, and seriously question whether it's a good or useful idea in any event. —Simetrical (talk • contribs) 18:50, 24 June 2008 (UTC)

    So you're saying if it performs the same job as an anti-vandal bot with a low or no false positive rate and ALL of the heuristics are public you wouldn't support the project? Let's presume it just blocks edits that include "PENNIISSSSSSSS!!!!!!!" by IP's on articles. The only impact there is removing the workload from editors who are currently patrolling with bots. Protonk (talk) 19:33, 24 June 2008 (UTC)

        I might be fine with that, yes. Assuming all configuration and logs were public, I would not necessarily oppose. I might or might not, depending on details, but probably not very strongly in any event. —Simetrical (talk • contribs) 22:00, 25 June 2008 (UTC)
"

// so far, community members participating in the discussion are not sure either what the difference from bots is and why the filters are needed.
// okay, so according to Protonk (24.06.2008, 19:33), filters remove workload from editors patrolling with bots; so bot patrolling is still viewed as a somewhat "manual" process. But in what sense are edit filters different then? There's still the need for human editors to update them, the same way as with bots, if vandalism patterns change.

"Simetrical, I don't think you quite understand what this extension is all about. It's not about targetting bog-standard vandalism. It's not about targetting any significant proportion of vandalism, but about targetting very specific behaviours, like certain page-move vandals, and so on. You can read more about this by reading my comment prior to yours, and on the accompanying page to this talk page. It's about finding a real solution to vandalism, doing more intelligent things than increase the requirements for moving pages, and reactive checkusers. And so, by targetting specific behaviours, like moving userspace pages as your tenth edit, to some long title like User:Werdna is a big meanie, and an asshole, and he made my mummy cry, we can eliminate the 'shotgun' method of simply requiring twenty edits and seven days rather than four edits and ten days to move pages. It gives us the opportunity to lighten the hard restrictions we put on all users, instead placing tougher restrictions on those who are actually causing the problem. In comparison to this goal, certainly a worthy one, I cannot see the merit in your argument, which, to me, seems largely ideological, with very little basis in the practicalities and mechanics of dealing with these sorts of problems. — Werdna talk 10:40, 25 June 2008 (UTC)"
// "targetting very specific behaviours..."

answer to previous comment:
"You say that it's to be used for targeting very specific behaviors, but you aren't going to be able to enforce that once it's enabled. It's going to be used for whatever people want to use it for, and depending on who controls the buttons, it may or may not be restricted to your original idea. You should know perfectly well how commonly software features are used for totally different things from their intended purpose. And if the configuration isn't public, we won't even know what it's being used for.

As for how well it addresses your goal ― even leaving aside, for a moment, how it will inevitably be used to further other goals ― you've failed to make it clear to me how it's any more useful than an admin-bot. Your rationale for the proposal does not even include the word "bot" except when listing user groups. Anyone, including you, could run a bot to do exactly the same, with the only difference being that all actions would be logged in the normal fashion, instead of in some secret log viewable only to sysops or whatever. You'd also have to get it admin status, but that should be no harder than getting this proposal approved, if there's any reason in people's decision-making here. (And if there's not, software workarounds for that aren't the way to deal with it.)
[...]
Your implication that this proposal is an alternative to stricter autoconfirmation is a false dichotomy. FlaggedRevs, for one, will greatly reduce the need for things like outright semi-protection, in a much more contributor-friendly fashion. Moreover, as I emphasize above, a bot could serve the same purpose, and if you want this enabled you should make it clear what its advantages are over an admin bot.
"

"    So as I understand it, this process is going to completely automate the role of the administrator. I say go for it! But the process should be open and transparent:

        If some sanction is imposed on you, then you should be able to see that it was imposed by an automated process, and likewise, if your edits are reverted, you should be able to see that it was an automated process that reverted them, and in any case, appeal to any administrator; and
        all administrators should have read-access to the rules. Why do you suppose that it is so hard to become an administrator? It's because it is a position of trust. Anybody who has earned that trust should be able to see what the rules say. I take a moderate position on security through obscurity, but I am not a big fan of it.

    The first point above I am sure is already part of the plan; the second is my point of view only. Please feel free to comment. Bwrs (talk) 20:11, 26 June 2008 (UTC)"

"Given the concerns of editors who, rightly, have concerns about the need for secrecy with this extension, I've decided to relax my proposed secrecy provisions, after considering the cost/benefit.

Therefore, with recent modifications to the software, I propose to require all filters to have an accurate short description of them, which will be publically visible, and included in block summaries, in the abuse log (which will be open for viewing to all users), and the relevant logs.

In addition, I've added a feature which allows specific filters to be hidden from public view. I intend to allow administrators to view all filters, except those which have been hidden (which would be visible only to those with permission to edit them). I am open to implementing a feature which allows unhidden filters to be disabled by administrators (but not edited in any other way), if this would make the feature more palatable to the community.

I hope that this will allay some of the concerns which have been put forward. — Werdna talk 13:17, 29 June 2008 (UTC) "
//more or less the current status

"This is something that needs to be discussed in more detail, but I can't see any reason why the right to edit the filters should not be a sysop permission. Admins already have access to a number of blacklists; there is nothing that the extension does (to my knowledge) that a bot-literate admin cannot already do, with the sole exception of invoking rights changes. I think that this extension has enormous potential, but it needs to be able to be adapted quickly to evolution in attack algorithms: tightly restricting access to it is not the best way to get full use of it, IMO. More (and separate) discussion needed. Happy‑melon 16:19, 29 June 2008 (UTC)
[...]
 If this ends up like oversight or checkuser, it's going to be almost useless (and the strongest wiki-cabal we've ever seen). Happy‑melon 10:54, 30 June 2008 (UTC)
"

"My concern is that it is not beyond some of our more dedicated trolls to "work accounts up" to admins, and then use them for their own purposes. Compare with, for instance, Runcorn, who developed an admin account, and used it to change blocks on tor. Compare with, for instance, the constant leaks to Wikitruth of deleted articles, and so on. — Werdna talk 00:50, 30 June 2008 (UTC) "
// my 2 cents: by that logic pretty much any technical protection is useless; trolls can "work up" their account to get the "abusefilter-modify" permission as well

"is it possible to block an edit similar to the spam-block-fliter? "abusefilter-blockreason" appears to be something like that.
e.g. in the german wikipedia we have a user, which performs a lot (!) of unuseful edits like changing "das gleiche" ("the equal one") to "dasselbe" ("the same one"). however, would it be possible to prevent this user, who often uses open proxies, from making such nonsens?
áis it possible, to restrict the blocking to specific ip-ranges? -- seth (talk) 13:43, 6 July 2008 (UTC)

This is not the intended purpose of the extension, and the extension's code includes mechanisms which actively prevent the use of the extension to apply restrictions to particular users or pages. In particular, it is not possible to target specific IP ranges (as this presents a hazard of releasing private data). — Werdna • talk 10:15, 7 July 2008 (UTC) "
//this has changed as well though, hasn't it?

"For the record, by the way, I'm happy to proceed with a lower level of privacy than I originally intended, and, if it becomes a problem, we'll see about having another poll on that. I have to write a bit more of a paper trail now (gasp, revisions of it). — Werdna • talk 06:43, 8 July 2008 (UTC) "
"I know that my opinion isn't really the say so upon which this project rests ( :) ), but I do think this is a very good idea, divorced from the obscurity idea. Even if implemented as an elaborate "editprotected" type system (where dedicated vandals can avoid it with effort but 99% of IP vandals are stymied), we can see a huge ROI. I think that we get into serious diminishing returns as we attempt to eliminate the more persistent vandalism. Let's consider the huggle/twinkle userhours we save that can be diverted to other uses in just stemming IP vandalism. Protonk (talk) 04:17, 9 July 2008 (UTC) "
//so, someone hopes/understands that the extension will help to "reduce huggle/twinkle working hours"

"I encourage the development and implementation of automated defensive software to counter automated attacks on the integrity of the information in Wikipedia. One fast-typing goon or one automated attack can create so much mischief that it takes a dozen admins a long time to correct it. If there is a willingness on the part of Werdna to tweak the filters to reduce any false positives, I do not see a downside. Why would any new user find it necessary to do a great many page moves in a short time? Edison (talk) 04:56, 9 July 2008 (UTC) "
// "tweak the filters to reduce false positives", huh? sounds like matter of 2 hours.

So, slowly, I get the overall impression that the perceived problem is that blatant vandalism doesn't get reverted fast enough and ties up a lot of people.
But it has so far not been clearly stated what exactly the problem is that the extension is trying to solve and why it isn't addressed by the existing mechanisms.

"Small points of clarification: it's a not a bot, it's a proposed extension to the software."

"Well, yes, I'm not suggesting that rogue admins are a huge problem that needs this extension to be fixed. But, if it comes for free when we solve a problem that DOES exist, and IS causing serious damage (Grawp), then we're all the better for it. — Werdna talk 07:42, 24 June 2008 (UTC) "
// Werdna mentions Grawp several times when arguing in favour..
// According to the next note (see below), Grawp is "This would include characteristic pagemove vandalism (such as the user who we refer to as 'Grawp')"; however, here https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter/Archive_1#Biggest_problems_on_wikis_are_not_obvious_vandalisms, User:William_Ortiz gives a bunch of silly vandalism examples (such as misspellings, or putting random, easily spottable, sometimes profane stuff in articles) as examples of "grawp"

So, to summarise once again: the problem is blatant vandalism, which apparently doesn't get reverted fast enough.
Human editors are not very fast in general, and how fast a bot solves this depends on how often the bot runs and on its underlying technical infrastructure (e.g. a bot run on someone's machine in the basement is probably less robust than a software extension running on the official Wikipedia servers).

Discussion continues... and not everyone is convinced; (compare also "botophobia")
"No no no, we have more than enough problems with automatic responses for the existing bots, almost all of which I'd like to see scaled back considerably to require manual confirmation. We have over a hundred thousand reasonably active editors, and all they need is help in watching things. A report is quite another matter--certainly we can have reports of actions that merit investigation. I would never call them anythign so derogatory as "abuse reports" -- we AGF in our users. The mostthey will be is worth investigating, at various levels of priority. DGG (talk) 18:44, 23 June 2008 (UTC) "

a problem with bots; however, the editor also uses this as an argument against (black-box) edit filters
"My initial reaction is much the same as DGG. Currently I don't really have a great deal of trust in the bot community. It's been unresponsive to many reasonable requests and done a bad job cleaning its own house. The community has, unfortunately, shown itself unfit for this level of responsibility: it has operated numerous unapproved adminbots--sometimes even to carry out completely pointless tasks--sometimes even over community objection--and sometimes failed to really communicate about it. "

"Firstly, I must note that the code of the extension itself will be public in the MediaWiki subversion repository, that the filters will be editable by anyone with the appropriate privileges, and that it would be very simple to disable any user's use of the filtering system, any particular filter, or, indeed, the entire extension. This is quite different from, say, an anti-vandalism adminbot. The code is private, and, in any case, too ugly for anybody to know how to use it properly. The code can only be stopped in real terms if somebody blocks and desysops the bot, and the bot is controlled by a private individual, with no testing.

In this case, there are multiple hard-coded safeguards on the false positive rate of individual filters, and the extension itself will be well-tested. In addition, I suggest that a strong policy would be developed on what the filters can be used to do, and on what conditions they can match on: I've developed a little system which tests a filter on the last several thousand edits before allowing it to be applied globally.

So I stress that, unlike unauthorised adminbots, there are numerous safeguards, checks and balances, which allow it to appropriately target behaviours such as blocks and desysoppings — if you don't intend to delete the main page, or mess around with moving several other users' userpages in quick succession as a new account, you probably don't have anything to worry about. — Werdna talk 07:41, 24 June 2008 (UTC) "
// so there's general discontent with bots (bot governance) that has motivated the creation of this extension?
// the argument "bots are poorly tested and this is not" is absurd before anything has happened.
// when were the BAG and its formal process created?

"I agree with Mr. Z-man: maybe preventing specific types of vandalism such as page-moves would be useful, but preventing too much vandalism will shift their attention to other types of more difficult to recognise vandalism. --Steven Fruitsmaak (Reply) 07:31, 9 July 2008 (UTC) "
// so "grawp" == page move vandalism?

"The filters would then be a work in progress, just like the rest of Wikipedia. I don't consider that this will diminish the tool's effectiveness, as surely most of the vandals targeted are idiots/bored kids, who are not going to take the time and effort to understand the detail of comprehensive filters in order to circumvent them? Rjwilmsi 07:00, 9 July 2008 (UTC)

    We're not targetting the 'idiots and bored kids' demographic, we're targetting the 'persistent vandal with a known modus operandi and a history of circumventing prevention methods' demographic. — Werdna • talk 07:28, 9 July 2008 (UTC)"
// who do the filters target!

"Many coming here seem to be under the impression that the purpose of this extension is to prevent common, garden-variety vandalism. This is not the case.

The abuse filter is designed with specific vandalism in mind. For instance, adding something about elephants because of what you saw on Stephen Colbert, or moving pages to 'ON WHEELS!', or whatever.

It is designed to target repeated behaviour, which is unequivocally vandalism. For instance, making huge numbers of page moves right after your tenth edit. For instance, moving pages to titles with 'HAGGER?' in them. All of these things are currently blocked by sekrit adminbots. This extension promises to block these things in the software, allowing us zero latency in responding, and allowing us to apply special restrictions, such as revoking a users' autoconfirmed status for a period of time.

It is not, as some seem to believe, intended to block profanity in articles (that would be extraordinarily dim), nor even to revert page-blankings. That's what we have ClueBot and TawkerBot for, and they do a damn good job of it. This is a different tool, for different situations, which require different responses. I conceive that filters in this extension would be triggered fewer times than once every few hours. — Werdna • talk 13:23, 9 July 2008 (UTC) "
// longer clarification of what is to be targeted. Interestingly enough, I think the bulk of the things that trigger filters today are precisely the ones Werdna points out as "we are not targeting them".

// interestingly, here someone is also under the impression that these are the types of vandalism that will be targeted
"The filters Werdna proposes sound like obvious stuff everyone looks for anyway, which in its simplest form is if the account appears to be single-purpose or not. But this filter does not sound like it will find any subtle inaccuracies or subtle vandalism, just obvious vandalism."

//and one more user under the same impression
"The fact that Grawp-style vandalism is easily noticeable and revertible is precisely why we need this extension: because currently we have a lot of people spending a lot of time finding and fixing this stuff when we all have better things to be doing. If we have the AbuseFilter dealing with this simple, silly, yet irritating, vandalism; that gives us all more time to be looking for and fixing the subtle vandalism you mention. This extension is not designed to catch the subtle vandalism, because it's too hard to identify directly. It's designed to catch the obvious vandalism to leave the humans more time to look for the subtle stuff. Happy‑melon 16:35, 9 July 2008 (UTC) "
// and this is the most sensible explanation so far

"Indeed. Happy is correct. While subtle vandalism is more difficult to detect, it also has two other properties that make it a whole different matter: 1)Anyone can revert it easily. and 2)It is impossible to auto-block. In contrast, vandalism such as the above is difficult to revert and often an admin must be found to clean up. Also, this type of vandalism can be auto-blocked and I think it should because doing this will free the rest of us to clean up the subtle vandalism without worrying about hundreds of pages being moved in a few seconds. Thingg⊕⊗ 23:06, 9 July 2008 (UTC)"

"It's not really for that. The idea is to automatically deal with the very blatant, serial pagemovers for instance, freeing up human resources to deal with the less blatant stuff. SQLQuery me! 20:13, 9 July 2008 (UTC)"
// and one more user whose claims about the purpose of the filters partially clash with those of the core developer

"Do we really need this extension, after all we have our bots, our admins, and dedicated antivandal fighters... all this effort doesn't seem worth it, since the troublemakers are only a small percentage of the users. NanohaA'sYuriTalk, My master 00:54, 10 July 2008 (UTC)

    I'm inclined to disagree. I think the idea is that the filters act as a passive vandal-fighter, allowing less manpower to be eaten up on vandals, so (hopefully) more could be spent on other activities. I believe this could be a useful feature. -FrankTobia (talk) 01:17, 10 July 2008 (UTC)"
// still, someone has to maintain the filters; it may not be comparable to bot maintenance, but it's still work that requires humans

"Automatic censorship won't work on a wiki. " // so, people already perceive this as censorship; user goes on to basically provide all the reasons why upload filters are bad idea (Interlanguage problems, no recognition of irony, impossibility to discuss controversial issues); they also have a problem with being blocked by a technology vs a real person

" This extension has zero latency: when an edit pattern like this is detected, the account will be blocked instantly, with no time to cause disruption." //which doesn't happen (any more); no block action is active on EN WP

"This extension has zero latency: when an edit pattern like this is detected, the account will be blocked instantly, with no time to cause disruption. Similarly, any questionable edit to the main page should incite a block-first-ask-questions-later approach. It's things like this that the extension is designed for, not to replace Clue Bot, VoABot, etc. I can make a personal promise that I will immediately remove any filter that triggers on the use of the word "nigger" - that would be foolish beyond belief. I could not agree more that secret settings are totaly incompatible with the wiki philosophy; but this extension is most definitely not. Happy‑melon 15:58, 2 July 2008 (UTC) "
// for the record, such a filter exists and is active today, id=384

"All of these measures are already in force: the restriction for autoconfirmed status, which allows moving pages, is 4 days and ten edits. The reason we need this extension is because simple, sweeping heuristics like these just don't work. It's far too easy to get around them, and we can't make them more restrictive without an unacceptable level of false-positives. "
// but the filters are also simple heuristics, regexes at their core; getting around them is only somewhat harder in the case of hidden filters, where you cannot check the filter pattern directly

"But with this extension comes the ability to make more intelligent decisions. For instance, the most common way of bypassing the editcount restriction is by making trivial edits to a userpage: with this extension, we could set a pagemove filter that was independent of autoconfirmed status, requiring the ten edits to be to separate pages, to have provided substantial content, or not to have been reverted."

People have really strong opinions on (against) using filters to block editors.

". While blocks may generally not be punitive, blocks shouldn't made by machines, either. This is a very special case. "

Freedom of speech concerns
" Do we think that automatons have the judgement to apply prior restraint to speech? Do we think they should be allowed to do so even if they can be imbued with excellent judgement? We don't allow the government to apply prior restrain to speech, why would we build robots to do it? Laziness?

TheNameWithNoMan (talk) 17:39, 9 July 2008 (UTC)"

"This extension is designed to be used for vandals like the ones I link to below: intelligent, aggressive, destructive editors who aim to do as much damage as possible to the wiki and the people who edit it. It's not on the main field that we need this extension: the anti-vandal bots do a stunning job, and they do it at just the right level of efficiency. It's on the constant skirmishes against the handful of intelligent and malicious persistent vandals that we need every tool available just to stay ahead of their game. These are the editors who would, in the real world, be tried for crimes against humanity - the users who have demonstrated time and time again that all they want to do is do as much damage as possible. Do we allow ourselves to use prior restraint against them? No, because we don't need to - they've already done enough harm to condemn themselves many times over. Happy‑melon 21:07, 9 July 2008 (UTC)"

" And yet it's not good enough - within those few seconds, damage is caused that takes ten minutes or more to clear up. With this extension, we have zero latency: we can do the same job that's being done already, without having to have a user running a script on a paid-for server that has to fetch the block token every twenty seconds to make sure it can respond as fast as inhumanly possible; and we can do it instantly, cleanly, and without any fuss"

Happy-melon explains again what kind of vandalism the filters are supposed to target (in their view):
"I think a lot of people misunderstand the speed of response that's required to effectively stop the sort of vandalism that this extension is designed to combat. The users I linked to above were blocked by an adminbot script within five seconds of beginning to edit disruptively, and look at the mess they were allowed to make. No matter how efficiently ANI posts are processed, no matter how little time a human needs to review the situation, it's too long. These vandals are either using carefully-prepared tabbed browsers, or fully-automated vandalbots, which have been specifically designed to cause as much damage as possible in as short a space of time as possible. "

Discussion on permissions
"abusefilter-view

It sounds like this permission, and abusefilter-modify, might be in the process of being converted into an array similar to the edit-protected system: with different levels of access available as different permissions. However, it seems that a consensus has developed above that at least the majority of filters should be available in their entirety for all users to view, which corresponds to abusefilter-view → '*'. Comments? Happy‑melon 16:53, 29 June 2008 (UTC)

    I'd disagree with this. Would prefer abusefilter-view → 'sysop', per above. — Werdna talk 00:52, 30 June 2008 (UTC) "

"I think here we need to remember what this extension is supposed to be used for: its primary advantage is that, being part of the site software, it has zero-latency: Misza13's anti-Grawp script can slam in a block token just 5 seconds after detecting a heuristic-matching edit pattern, but this extension can do it before the first vandal action has even been completed. It has no real advantages over anti-vandal bots other than speed and tidiness: the majority of its functions can be performed just as well by a well-written script running on an admin account. However, there are some functions, most notably rights changes, which are way beyond what an admin can imitate. I have a suspicion that a filter could easily be implemented to desysop any specific user on their next edit; or (worse still) desysop all admins as-and-when they edit. Even granting this permission only to bureucrats would be giving them a right that they don't currently have - full access to this extension gives users half the power of a steward. Consequently, the ability to set filters which invoke rights changes should, in my opinion, be assigned separately to the other permissions, and only to completely trusted users. I would say give it only to the stewards, but they do not have a local right on en.wiki that the extension can check; my second choice would be those already trusted to the level of 'oversight', which is essentially the ArbCom (and stewards if necessary). Everything else the extension offers can already be done by admins, and I can see no reason not to give them all the tools available. My personal preference, therefore, would be abusefilter-modify → 'sysop' and abusefilter-modify-rights → oversight. I'm especially keen to hear other people's views on this area. Happy‑melon 16:53, 29 June 2008 (UTC) "

"Well, we can, of course, disable the 'desysop' action on Wikimedia quite simply. I think that may be the way to go for the moment — I included it only for completeness, and took care that it could be easily disabled. That said, I would still like to restrict modification of filters to a smaller group (and viewing of hidden filters is the same right), although I suppose restricting it to 'admins' would be better than nothing. The reason I suggest this is my above comments — we have lots of admins, and lots of precedents for disgruntled admins doing some leaking. — Werdna talk 00:56, 30 June 2008 (UTC"
//corresponds to current situation

"I don't agree that hiding heuristics from the public is a problematic form of 'security through obscurity'. The point of AbuseFilter is to target vandalism with specific modi operandi — for instance, Willy on Wheels, Stephen Colbert, and meme vandalism. By their nature, many of these vandals will be quite determined, and, therefore, if we expose the heuristics we use to detect them, they will simply move to other forms of vandalism which aren't targetted by the filters. If, however, we pose a barrier, even as low as needing a sysop to leak the filter's information, or getting a proxy IP blocked, or something, then the user's ability to determine what's in the filters is limited, and so they can't simply circumvent the filter by changing individual aspects of their behaviour. SQL has told me that he has had instances of vandals following his subversion commits to determine ways to circumvent restrictions on use of the account-creation tool. In short, I don't think open viewing is going to cut it. — Werdna talk 11:54, 30 June 2008 (UTC) "
// so, according to Werdna, the main targeted group is especially determined vandals, in which case it makes sense to hide the filters' heuristics from them. This would also explain why 2/3 of the filters are hidden

ideological and practical concerns mix

"My reading of this page is that the most frequently mentioned advantage of the proposed anti-vandalism Mediawiki extension, is speed of response compared to similar private bots. The major improvement over bots seems to be that of amplifying speed-of-response.
Why not create the mediawiki extension as an amplifier toolkit for "bots"? "

"owever, my proposal is to "increase" the effectiveness of our handling, by targetting the vandals before they can vandalise, and by placing restrictions on their accounts and IP addresses when they try, resulting in zero actual vandalism, and all of the admin and checkuser work automated by software. This is an effective system.

You cannot possibly be suggesting that, because our current handling of repeat vandals does eventually reverse and prevent some vandalism, we should rest on our laurels and do what we currently do because it's "not broken". Here, we have an opportunity to improve the effectiveness of our handling of repeat vandals. Let us not ignore it because our current system is "not broke". - —Preceding unsigned comment added by Werdna (talk • contribs) 02:07, 16 July 2008 "

A lot of controversy along the lines of
* public/private filters
* what actions exactly are OK to be taken by the filters; strong objections from community members about filters blocking editors, taking away their rights, etc.; and although both(?) of these functionalities ended up being implemented, neither is actively used on the EN WP (where the "strictest" action applied is "disallow", and the last time a filter took an action other than disallow/tag/warn/log was "blockautopromote" and "aftv5flagabuse" (not sure what exactly this is) in 2012, see ipnb)

=======================================================================
https://en.wikipedia.org/w/index.php?title=Wikipedia:Edit_filter&oldid=221994491

Justification for filters:

"This gives us the opportunity to prevent damage from vandals with very specific modi operandi. For instance, we can quite easily zero in on Willy on Wheels copycats by looking at edit-counts, the rate at which actions are going through, the age of the accounts, the types of edits made previously, keywords in move-to titles, other accounts created on the same day from the same range, and so on.

I submit to the community that this gives us an extraordinary opportunity to disallow some of the worst and most annoying types of vandalism which occur on Wikipedia, and to refocus our efforts into doing other, more productive things than cleaning up after page-move vandalism. "

"The second concern I foresee with this extension, is that it promotes a kind of cabal, which may abuse the AbuseFilter extension (if you will pardon the pun), to further their own ends on Wikipedia. I emphatically reject any suggestion of this. I doubt that users who abuse this extension will be able to do so a second time, and I stress that it is to be used only for targetting blatant, widespread vandalism, with a recognisable modus operandi. This would include characteristic pagemove vandalism (such as the user who we refer to as 'Grawp'), certain meme vandalism which is easily identified, and some vandalism which would indicate a compromised account (for instance, attempting to delete the main page, which is currently prevented by the software, but which might act as a useful honeypot for desysopping and blocking compromised administrator accounts early). I whole-heartedly accept any solution which promotes the group's accountability in its use of this tool, without compromising specific filters to circumvention. I would welcome, for instance, the oversight of members of the arbitration committee, who could ensure that the policies and practices for this tool's use are being complied with fully.

In closing, I will summarise my main points: I have developed a MediaWiki extension which, if enabled on English Wikipedia, would afford us extraordinary power in preventing editing abuse. Unfortunately, to be fully effective, we will need to be a little secretive about the way it's been set up, so as not to make circumvention too easy. The tool's use would need to be very carefully monitored to prevent use for any other reason than preventing blatant vandalism with few false positives. However, if we can get this set up correctly, we would have an unprecedented ability to prevent abuse of Wikipedia for vandalism and so on. "

========================================================================

Timeline

   Oct 2001 : automatically import entries from Easton’s Bible Dictionary by a script
29 Mar 2002 : First version of https://en.wikipedia.org/wiki/Wikipedia:Vandalism (WP Vandalism is published)
   Oct 2002 : RamBot
       2006 : BAG was first formed
13 Mar 2006 : 1st version of Bots/Requests for approval is published: some basic requirements (also valid today) are recorded
28 Jul 2006 : VoABot II ("In the case were banned users continue to use sockpuppet accounts/IPs to add edits clearly rejected by consensus to the point were long term protection is required, VoABot may be programmed to watch those pages and revert those edits instead. Such edits are considered blacklisted. IP ranges can also be blacklisted. This is reserved only for special cases.")
21 Jan 2007 : Twinkle Page is first published (empty), filled with a basic description by the beginning of Feb 2007
24 Jul 2007 : Request for Approval of original ClueBot
16 Jan 2008 : Huggle Page is first published (empty)
18 Jan 2008 : Huggle Page is first filled with content
23 Jun 2008 : 1st version of Edit Filter page is published: User:Werdna announces they're currently developing the extension
 2 Oct 2008 : https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter was first archived; its last topic was the vote for/against the extension, which seems to have ended at the end of Sep 2008
   Jun 2010 : STiki initial release
20 Oct 2010 : ClueBot NG page is created
11 Jan 2015 : 1st commit to github ORES repository
30 Nov 2015 : ORES paper is published

========================================================================
https://en.wikipedia.org/wiki/Vandalism_on_Wikipedia

"Vandalism includes the addition, removal, or modification of the text or other material that is either humorous, nonsensical, a hoax, or that is an offensive, humiliating, or otherwise degrading nature."

"Vandalism is easy to commit on Wikipedia because anyone can edit the site,[2][3] with the exception of articles that are currently semi-protected, which means that new and unregistered users cannot edit them. "

"
# Fighting vandalism

There are various measures taken by Wikipedia to prevent or reduce the amount of vandalism. These include:

    Using Wikipedia's history functionality, which retains all prior versions of an article, restoring the article to the last version before the vandalism occurred; this is called reverting vandalism.[4] The majority of vandalism on Wikipedia is reverted quickly.[9] There are various ways in which the vandalism gets detected so it can be reverted:
        Bots: In many cases, the vandalism is automatically detected and reverted by a bot. The vandal is always warned with no human intervention.
        Recent changes patrol: Wikipedia has a special page that lists all the most recent changes. Some editors will monitor these changes for possible vandalism.[10]
        Watchlists: Any registered user can watch a page that they have created or edited or that they otherwise have an interest in. This functionality also enables users to monitor a page for vandalism.[10]
        Incidental discovery: Any reader who comes across vandalism by chance can revert it. In 2008 it was reported that the rarity of such incidental discovery indicated the efficacy of the other methods of vandalism removal.[10]
    Locking articles so only established users, or in some cases, only administrators can edit them.[4] Semi-protected articles are those that can only be edited by those with an account that is considered to be autoconfirmed – an account that is at least 4 days old with at least 10 edits, for most accounts. Fully protected articles are those that can only be edited by administrators. Protection is generally instituted after one or more editors makes a request on a special page for that purpose, and an administrator familiar with the protection guidelines chooses whether or not to fulfill this request based on the guidelines.
    Blocking and banning those who have repeatedly committed acts of vandalism from editing for a period of time or in some cases, indefinitely.[4] Vandals are not blocked as an act of punishment – the purpose of the block is simply to prevent further damage.[11]
    The "abuse filter" extension, which uses regular expressions to detect common vandalism terms.[12]

Editors are generally warned prior to being blocked. Wikipedia employs a 4-stage warning process up to a block. This includes:[13]

    The first warning assumes good faith and takes a relaxed approach to the user. (in some cases, this level can be skipped if the editor assumes the user is acting in bad faith[14]).
    The second warning does not assume any faith and is an actual warning (in some cases, this level may also be skipped).
    The third warning assumes bad faith and is the first to warn the user that continued vandalism may result in a block.
    The fourth warning is a final warning, stating that any future acts of vandalism will result in a block.
    After this, other users may place additional warnings, though only administrators can actually carry out the block.
In 2005, the English Wikipedia started to require those who create new articles to have a registered account in an effort to fight vandalism. This occurred after inaccurate information was added to Wikipedia in which a journalist was accused of taking part in Kennedy's assassination.[2]

Wikipedia has experimented with systems in which edits to some articles, especially those of living people, are delayed until it can be reviewed and determined that they are not vandalism, and in some cases, that a source to verify accuracy is provided. This is in an effort to prevent inaccurate and potentially damaging information about living people from appearing on the site.[15][16] "

Title blacklist
Spam blacklist
Bad image list
"The entirety of the MediaWiki namespace, the main page, and high risk templates are protected "

===================================================
https://en.wikipedia.org/wiki/User:TheBuddy92/Willy_on_Wheels:_A_Case_Study

"Why did Willy stop? It seems that he was flustered at the futility of his vandalism and he was contemplating more effective methods, for five days later emerged User:The Willy on Wheels. This new incarnation of Willy proceeded to move random articles to random names, usually by adding "on wheels" or incorporating "Willy", drawing attention to himself, as can be seen on one of his sock's contributions. These changes were just as obvious as his previous changes, but at the time they were impossible for anyone but administrators to reverse, because moving a page leaves a redirect at the previous name, and the page could not be moved back without first deleting that redirect. This technique, called high-speed page move vandalism, strained the relatively small number of administrators who cleaned up vandalism and increased revert times. The account was banned almost instantly, but the damage was done. "
//I'd say, edit filters were implemented to counteract these types of vandalism

"One of our best sources of information on Willy is an old revision of User:WoW. This shows that Willy's ultimate goal is to vandalize as many pages as he possibly can before being banned. He is constantly attempting to innovate new techniques toward this end, and sees it as a competition. He claims affiliation with troll groups like the GNAA and the "anti-cabal" movement against the leadership of Wikipedia. There may even be multiple Willies competing for this "honor", as suggested by his proposal for a competition. "

"He despises the power structure of Wikipedia, as evidenced by his proclaimed affiliation with anti-cabal groups - this strongly suggests that he is not in fact an admin at all, but may have been a registered user."

========================================================
https://en.wikipedia.org/wiki/Wikipedia:Edit_filter/Documentation

"The extension defines a domain-specific language solely to write filter rules. Since the language is not Turing complete, it cannot replace bots for more complex tasks. "

"Actions which can be assigned in response to filtered edits

If a user triggers a filter, the edit filter can apply any of the following sanctions based on the severity of the offense:

    All actions triggering a filter are logged at a special page.
    The user's action can be tagged for further review.
    The user can be warned that their actions may be unconstructive.
    The user's action may be disallowed.
    The user's autoconfirmed status may be revoked (or delayed if the user doesn't hold it).

The following actions are currently not available on this wiki:

    The user's account may be blocked from editing, along with all IP addresses used in the last 7 days.
    The user's account may be removed from all privileged groups (such as sysop, bot, rollbacker).

Note: Individual sanctions can be disabled selectively. Any edit filter manager can restore autoconfirmed status in case of an error. "
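
A minimal sketch (my own reading, not documented behaviour) of how a filter's configured sanctions could be thought of as an ordered escalation, with logging always happening on a match:

```python
# Sketch of a filter's configured actions as an ordered escalation.
# The action names follow the list above; the ordering and helper are my own assumption.

SEVERITY_ORDER = ["log", "tag", "warn", "disallow", "blockautopromote"]

def actions_on_match(configured: set[str]) -> list[str]:
    """Logging always happens on a match; other configured sanctions follow, mild to severe."""
    return ["log"] + [a for a in SEVERITY_ORDER[1:] if a in configured]

print(actions_on_match({"disallow", "tag"}))  # ['log', 'tag', 'disallow']
```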

"Condition limit

The condition limit is a limit imposed by the software on the total number of conditions that can be evaluated by the filters. It is arbitrarily fixed at 1,000. While the aim of this limit is to avoid performance issues, it should be noted that this isn't a good metric of how heavy a filter is: for instance, a filter using dozens of simple comparisons (and thus dozens of conditions) is much lighter than one using a single check on the all_links variable. See mw:Extension:AbuseFilter/Conditions and mw:Extension:AbuseFilter/Rules format#Performance for more details. "

"Safeguards

To protect the wiki against poorly configured filters, a technical limit is imposed on the maximum percentage of actions that will trigger a given filter. Other technical limits are in the process of being written. "

==========================================================================
https://www.mediawiki.org/wiki/Extension:AbuseFilter/Conditions

"Essay: The condition limiter is a somewhat ad hoc tool for preventing performance problems. To the extent that you want to worry about performance, execution times are generally better measure to be thinking about. The per filter time and conditions numbers are somewhat broken (race conditions can cause them to be off), but most of the time they should be good enough to rely on.

The condition limit is (more or less) tracking the number of comparison operators + number of function calls entered. "
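
A back-of-the-envelope way to estimate a rule's condition consumption along the lines described above (counting comparison operators and function calls); this is a rough note-taking aid, not the extension's real counter:

```python
# Rough estimate of how many "conditions" a rule string consumes, following the
# description above (comparison operators + function calls). Approximation only.

import re

COMPARISONS = r"==|!=|<=|>=|<|>| in | contains | rlike | irlike "
FUNCTION_CALL = r"\b[a-z_]+\s*\("

def estimate_conditions(rule: str) -> int:
    return len(re.findall(COMPARISONS, rule)) + len(re.findall(FUNCTION_CALL, rule))

rule = '!("confirmed" in user_groups) & article_namespace == 0 & lcase(added_lines) rlike "on wheels"'
print(estimate_conditions(rule))  # 3 comparisons + 1 function call = 4
```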

========================================================================
https://en.wikipedia.org/wiki/Wikipedia_talk:Edit_filter/Archive_3#Request_for_name_change
https://en.wikipedia.org/w/index.php?title=Wikipedia_talk:Edit_filter/Archive_3&oldid=883700704#Request_for_name_change

"Could the name of this log be changed, please? I just noticed the other day that I have entries in an "abuse" log for linking to YouTube and for creating articles about Michael Jackson, which triggered a suspicion of vandalism. A few other people are voicing the same concern at AN/I, and someone suggested posting the request here. SlimVirgin talk|contribs 18:11, 2 July 2009 (UTC) "

"    I would support a name change on all public-facing parts of this extension to "Edit filter". Even after we tell people that "Entries in this list do not necessarily mean the edits were abusive.", they still worry about poisoning of their well. –xenotalk 18:14, 2 July 2009 (UTC)"

as well as several more comments in favour

==========================================================================
https://ifex.org/international/2019/02/21/technology-block-internet/

"For the same reason, filtering software also frequently over-censors, for example blocking scholarly discussion of terrorism while attempting to limit access to hate speech, or restricting access to sexual health education material while attempting to prevent access to pornography."

==========================================================================
https://wikimediafoundation.org/about/

"
The Wikimedia Foundation mission

To empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally.
"

==========================================================================
https://en.wikipedia.org/w/index.php?title=Wikipedia:Five_pillars&oldid=901615643

Wikipedia is an encyclopedia

Our encyclopedia combines many features of general and specialized encyclopedias, almanacs, and gazetteers. Wikipedia is not a soapbox, an advertising platform, a vanity press, an experiment in anarchy or democracy, an indiscriminate collection of information, or a web directory. It is not a dictionary, a newspaper, or a collection of source documents, although some of its fellow Wikimedia projects are.

Wikipedia is written from a neutral point of view

We strive for articles in an impartial tone that document and explain major points of view, giving due weight with respect to their prominence. We avoid advocacy, and we characterize information and issues rather than debate them. In some areas there may be just one well-recognized point of view; in others, we describe multiple points of view, presenting each accurately and in context rather than as "the truth" or "the best view". All articles must strive for verifiable accuracy, citing reliable, authoritative sources, especially when the topic is controversial or is on living persons. Editors' personal experiences, interpretations, or opinions do not belong.

Wikipedia is free content that anyone can use, edit, and distribute

Since all editors freely license their work to the public, no editor owns an article and any contributions can and will be mercilessly edited and redistributed. Respect copyright laws, and never plagiarize from any sources. Borrowing non-free media is sometimes allowed as fair use, but strive to find free alternatives first.

Wikipedia's editors should treat each other with respect and civility

Respect your fellow Wikipedians, even when you disagree. Apply Wikipedia etiquette, and don't engage in personal attacks. Seek consensus, avoid edit wars, and never disrupt Wikipedia to illustrate a point. Act in good faith, and assume good faith on the part of others. Be open and welcoming to newcomers. Should conflicts arise, discuss them calmly on the appropriate talk pages, follow dispute resolution procedures, and consider that there are 5,878,577 other articles on the English Wikipedia to improve and discuss.

Wikipedia has no firm rules

Wikipedia has policies and guidelines, but they are not carved in stone; their content and interpretation can evolve over time. The principles and spirit matter more than literal wording, and sometimes improving Wikipedia requires making exceptions. Be bold but not reckless in updating articles. And do not agonize over making mistakes: every past version of a page is saved, so mistakes can be easily corrected.

==========================================================================
https://en.wikipedia.org/w/index.php?title=Wikipedia:About&oldid=891256910

"Anyone with Internet access can write and make changes to Wikipedia articles, except in limited cases where editing is restricted to prevent disruption or vandalism."

"English Wikipedia right now:

Wikipedia is running MediaWiki
version 1.34.0-wmf.10 (5b7d3f4).
It has 5,877,868 content articles,
and 48,053,456 pages in total.
There are 882,172 uploaded files.
There have been 898,827,431 edits.
There are 36,606,590 registered users,
including 1,155 administrators."

". While most articles may be altered by anyone, in practice editing will be performed by a certain demographic (younger rather than older, male rather than female, rich enough to afford a computer rather than poor, et cetera) and may, therefore, show some bias. Some topics may not be covered well, while others may be covered in great depth." //at least they admit to it

"Wikipedia's radical openness also means that any given article may be, at any given moment, in a bad state, such as in the middle of a large edit, or a controversial rewrite."

"Censorship or imposing "official" points of view is extremely difficult to achieve and usually fails after a time. Eventually for most articles, all notable views become fairly described and a neutral point of view reached."

"Wikipedia operates a full editorial dispute resolution process, one that allows time for discussion and resolution in depth, but one that also permits disagreements to last for months before poor-quality or biased edits are removed."

"hierarchy of permissions and positions, some of which are listed hereafter:

    Anyone can edit most of the articles here. Some articles are protected because of vandalism or edit-warring, and can only be edited by certain editors.
    Anyone with an account that has been registered for four days or longer and has made at least ten edits becomes autoconfirmed, and gains the technical ability to do five things that non-autoconfirmed editors cannot:
        Create pages.
        Move pages.
        Edit semi-protected pages.
        Upload files.
        Vote in certain elections (minimum edit count to receive suffrage varies depending on the election).
    Many editors with accounts obtain access to certain tools that make editing easier and faster. Few editors learn about most of those tools, but one common privilege granted to editors in good standing is "rollback", which is the ability to undo edits more easily.
    Administrators ("admins" or "sysops") have been approved by the community, and have access to some significant administrative tools. They can delete articles, block accounts or IP addresses, and edit fully protected articles.
    Bureaucrats are chosen in a process similar to that for selecting administrators. There are not very many bureaucrats. They have the technical ability to add or remove admin rights and approve or revoke "bot" privileges.
    The Arbitration Committee is analogous to Wikipedia's supreme court. They deal with disputes that remain unresolved after other attempts at dispute resolution have failed. Members of this Committee are elected by the community and tend to be selected from among the pool of experienced admins.
    Stewards hold the top echelon of community permissions. Stewards can do a few technical things, and one almost never hears much about them since they normally only act when a local admin or bureaucrat is not available, and hence almost never on the English Wikipedia. There are very few stewards.
    Jimmy Wales, the founder of Wikipedia, has several special roles and privileges. In most instances, however, he does not expect to be treated differently than any other editor or administrator."

==========================================================================
https://transparency.wikimedia.org/

"Our mission is to provide free access to the sum of all human knowledge. We believe that protecting user privacy and defending against censorship are essential to the success of that mission. "

==========================================================================
https://wikimediafoundation.org/about/values/

"We strive for excellence."
"We welcome and cherish our differences."
"We are in this together." //collaboration
"We engage in civil discourse."
"We are inspired."

The only mention of transparency is this here:
" For it to work well, each of us needs to be honest, accountable, and transparent to one another."

==========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Purpose
https://en.wikipedia.org/w/index.php?title=Wikipedia:Purpose&oldid=890601199

"Wikipedia's purpose is to benefit readers by acting as an encyclopedia, a comprehensive written compendium that contains information on all branches of knowledge. The goal of a Wikipedia article is to present a neutrally written summary of existing mainstream knowledge in a fair and accurate manner with a straightforward, "just-the-facts style". Articles should have an encyclopedic style with a formal tone instead of essay-like, argumentative, promotional or opinionated writing. "

==========================================================================
https://en.wikipedia.org/wiki/Sockpuppet_(Internet)
https://en.wikipedia.org/w/index.php?title=Sockpuppet_(Internet)&oldid=902985493

"A sockpuppet is an online identity used for purposes of deception"
"The term now includes other misleading uses of online identities, such as those created to praise, defend or support a person or organization,[2] to manipulate public opinion,[3] or to circumvent a suspension or ban from a website. A significant difference between the use of a pseudonym[4] and the creation of a sockpuppet is that the sockpuppet poses as an independent third-party unaffiliated with the main account operator."

Types:
Block evasion
Ballot stuffing
Strawman sockpuppet: "is a false flag pseudonym created to make a particular point of view look foolish or unwholesome in order to generate negative sentiment against it. Strawman sockpuppets typically behave in an unintelligent, uninformed, or bigoted manner and advance "straw man" arguments that their puppeteers can easily refute. "
Meatpuppet: "The term "meatpuppet" (or "meat puppet") is an online version of a shill," ("A shill, also called a plant or a stooge, is a person who publicly helps or gives credibility to a person or organization without disclosing that they have a close relationship with the person or organization. ")

==========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Sock_puppetry
https://en.wikipedia.org/w/index.php?title=Wikipedia:Sock_puppetry&oldid=903464918

"This page documents an English Wikipedia policy."
"This page in a nutshell: The general rule is one editor, one account. Do not use multiple accounts to mislead, deceive, vandalize or disrupt; to create the illusion of greater support for a position; to stir up controversy; or to circumvent a block, ban, or sanction. Do not ask your friends to create accounts to support you. Do not revive old unused accounts and use them as different users, or use another person's account. Do not log out just to vandalize as an IP address editor."

"Sock puppetry takes various forms:

    Logging out to make problematic edits as an IP address
    Creating new accounts to avoid detection
    Using another person's account (piggybacking)
    Reviving old unused accounts (sometimes referred to as sleepers) and presenting them as different users
    Persuading friends or colleagues to create accounts for the purpose of supporting one side of a dispute (usually called meatpuppetry)

Misuse of multiple accounts is a serious breach of community trust. It may lead to:

    a block of all affected accounts
    a ban of the user (the sockmaster or sockpuppeteer) behind the accounts (each of which is a sockpuppet or sock)
    on-project exposure of all accounts and IP addresses used across Wikipedia and its sister projects
    the (potential) public exposure of any "real-world" activities or personal information deemed relevant to preventing future sock puppetry or certain other abuses.[1]"

==========================================================================
https://en.wikipedia.org/wiki/Wikipedia:Long_term_abuse

Cases of long term abuse can be reported here.

"This page summarises a limited number of long term abusers, to assist members of the community who believe they may have cause to report another incident. Note that this page is not a noticeboard. Names should only be added for the most egregious and well-attested cases. Most users here will have been banned, some on multiple occasions.
In the vast majority of cases, Deny recognition and Revert, block, ignore are more suitable approaches."

"Verify that the following criteria have been met:

        The user has been abusing Wikipedia over a long duration of time.
        The user account has a history of repeated egregious disruption, and despite indefinite block or ban, continues vandalism and/or abuse beyond the point of any usual blocked user.
        There is a significant chance that the information will be of value against future repetition."


============================================================================
Investigating the peak in filter hits at the beginning of 2016

Looking at January 2016:

So far it has come to my attention that a lot of accounts named something resembling <FirstnameLastname4RandomLetters> were trying to create an account (while logged in?) (or maybe it was just that the creation of these particular accounts itself was denied); this triggers filter 527 ("T34234: log/throttle possible sleeper account creations").
There are by now over 5 pages of such entries; it is definitely happening automatically.

TODO: download data; write script to identify actions that triggered the filters (accountcreations? edits?) and what pages were edited
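
A possible starting point for that script, assuming the standard MediaWiki API with the AbuseFilter abuselog list module; parameter and field names are taken from the API documentation as I remember them and should be double-checked before relying on the output:

```python
# Sketch for the TODO above: pull January 2016 abuse-log entries from the API and
# count which actions (edit, createaccount, move, ...) and filters triggered.
# Assumes the 'abuselog' list module provided by the AbuseFilter extension;
# verify parameter and field names against the live API before use.

import requests
from collections import Counter

API = "https://en.wikipedia.org/w/api.php"

def fetch_abuse_log(start: str, end: str):
    """Yield abuse log entries between two ISO timestamps (newer to older)."""
    params = {
        "action": "query",
        "list": "abuselog",
        "aflstart": start,          # newer timestamp (listing goes backwards by default)
        "aflend": end,              # older timestamp
        "aflprop": "ids|filter|action|title|result|timestamp",
        "afllimit": "500",
        "format": "json",
    }
    while True:
        data = requests.get(API, params=params).json()
        yield from data["query"]["abuselog"]
        if "continue" not in data:
            break
        params.update(data["continue"])

actions = Counter()
filters = Counter()
for entry in fetch_abuse_log("2016-01-31T23:59:59Z", "2016-01-01T00:00:00Z"):
    actions[entry["action"]] += 1
    # field name may be 'filter_id' or 'filter' depending on aflprop -- check the response
    filters[entry.get("filter_id") or entry.get("filter")] += 1

print(actions.most_common())
print(filters.most_common(10))
```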