Wikipedia:Community health initiative

Community health initiative
Helping Wikimedia volunteer communities reduce the level of harassment and disruptive behavior occurring on Wikimedia projects.

Starting in March 2017, the Wikimedia Foundation's Community Tech and Community Engagement teams will begin a multi-year project of research and product development to help the Wikimedia volunteer community reduce the level of harassment and disruptive behavior on our projects.

This initiative addresses the major forms of harassment reported in the Wikimedia Foundation's 2015 Harassment Survey, which covers a wide range of behaviors: content vandalism, stalking, name-calling, trolling, doxxing, discrimination, and anything else that targets individuals for unfair and harmful attention.

This will result in improvements both to the tools in the MediaWiki software and to the policies of the communities suffering the most from disruptive behavior. Based on research and community feedback, the tools built in this initiative will focus on four areas: Detection, Reporting, Evaluating, and Blocking. Of course, these improvements need to be made with the participation and support of the volunteers who will be using the tools. As a product development team, we don't want to create new reporting, evaluation, and blocking workflows that create more work for an already overburdened team of wiki administrators. We want to help make these tasks less grueling for admins and enable them to produce effective outcomes more consistently.

Background

Harassment on Wikimedia projects

On Wikipedia and other Wikimedia projects, harassment typically occurs on talk pages (article, project, and user), noticeboards, user pages, and edit summaries. Edit warring and wiki-hounding can also be forms of harassment. Conduct disputes typically originate from content disputes, such as disagreements about the reliability of a source, neutrality of a point-of-view, or article formatting and content hierarchy.

The English-language Wikipedia community (like most other projects) has drafted conduct policies for its editors to follow, including policies on Wikipedia:Civility, Wikipedia:Harassment, Wikipedia:No personal attacks, and Wikipedia:Dispute resolution. The spirit of these policies is well-intentioned, but enforcement is difficult given deficiencies in the MediaWiki software and the ratio of contributors to active administrators.[1] The dispute resolution processes encourage users to attempt to resolve issues between themselves before bringing the situation to the attention of administrators on the Administrators' Noticeboard, and eventually to ArbCom for extreme situations.[2]

Harassment Survey 2015 - Results Report

Online harassment is a problem on virtually every web property where users interact. In 2014, the Pew Research Center concluded that 40% of all internet users have been the victim of online harassment.[3] In 2015, the Wikimedia Foundation conducted a Harassment Survey with 3,845 Wikimedia user participants to gain a deeper understanding of harassment occurring on Wikimedia projects. 38% of respondents confidently recognized that they had been harassed, while 51% of respondents had witnessed others being harassed. The most commonly reported types of harassment included:

  • Content vandalism (26% of responses)
  • Trolling/flaming (24%)
  • Name-calling (17%)
  • Discrimination (14.5%)
  • Stalking (13%)

In 2016–17, Jigsaw and Wikimedia researchers used machine learning techniques to evaluate harassment on Wikipedia in the Detox research project. Significant findings include:

  • Only 18% of all identified attacks on English Wikipedia resulted in a block or warning.
  • 67% of attacks come from registered users.
  • Nearly 50% of all attacks come from contributors with over 100 annual edits.[4]

This research is illuminating and is one of the impetuses for this Community Health Initiative, but it is only the beginning of the research we must conduct for this endeavor to be successful.
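
The Detox project treated this as a supervised text-classification problem: human raters labeled English Wikipedia talk page comments, and models were trained to score new comments for personal attacks. The snippet below is a minimal sketch of that general approach in Python with scikit-learn; the tiny inline dataset, the feature choice, and the model are illustrative assumptions, not the actual Detox data or code.

```python
# Minimal sketch of Detox-style comment classification (illustrative only).
# The real research used large crowd-labeled datasets of talk page comments;
# the tiny inline "dataset" below is invented for this example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "Thanks for fixing the citation, much appreciated.",
    "You are an idiot and should stop editing.",
    "I disagree with this revert; let's discuss it on the talk page.",
    "Get lost, nobody wants your garbage edits here.",
]
labels = [0, 1, 0, 1]  # 1 = personal attack, 0 = acceptable

# Bag-of-words TF-IDF features plus logistic regression, roughly the family
# of simple models explored in the Detox work.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(comments, labels)

# Score a new comment; a real tool would surface high-scoring edits to admins.
new_comment = "Stop being so useless and leave this article alone."
attack_probability = model.predict_proba([new_comment])[0][1]
print(f"Estimated attack probability: {attack_probability:.2f}")
```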

The community wants new tools for handling harassment

The Wikimedia community has long struggled with how to protect its members from bad-faith or harmful users. The administrative toolset that project administrators can use to block disruptive users from their projects has not changed since the early days of the MediaWiki software. Volunteers have asked the Wikimedia Foundation to improve the blocking tools on a number of occasions, including:

In preparing for this initiative, we've been discussing issues with the current tools and processes with active administrators and functionaries. These discussions have resulted in requested improvements in several key areas where admins and functionaries see immediate needs — better reporting systems for volunteers, smarter ways to detect and address problems early, and improved tools and workflows related to the blocking process. These conversations will be ongoing throughout the entire process. Community input and participation will be vital to our success.

External funding

2017 grant proposal for Anti-Harassment Tools For Wikimedia Projects

In January 2017, the Wikimedia Foundation received initial funding of US$500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support this initiative.[5] The two seed grants, each US$250,000, will support the development of tools for volunteer editors and staff to reduce harassment on Wikipedia and block harassers. The grant proposal is available for review at Wikimedia Commons.

Focus areas

In short, we want to build software that empowers contributors and administrators to make timely, informed decisions when harassment occurs. Four focus areas have been identified where new tools could be beneficial in addressing and responding to harassment:

Detection

We want to make it easier and more efficient for editors to identify and flag harassing behavior. We are currently exploring how harassment can be prevented before it begins, and how minor incidents can be resolved before they snowball into larger incivility problems. A simplified sketch of one such detection heuristic appears after the feature list below.

Potential features:

  • Performance, usability, and stability improvements to AbuseFilter
  • Reliability and accuracy improvements to ProcseeBot
  • Anti-spoof improvements to pertinent tools
  • Features that surface content vandalism, edit warring, stalking, and harassing language to wiki administrators and staff
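
The last item in the list above, surfacing problems such as edit warring to administrators, could take many forms. The snippet below is a simplified sketch of one possible heuristic, written in Python against the public MediaWiki Action API: pull a page's recent revision history and flag pairs of users who trade many edits within a short window. The example wiki, the thresholds, and the helper names are assumptions for illustration, not a design the initiative has committed to.

```python
# Illustrative sketch: flag a possible edit war on a single page by looking for
# many alternating edits between the same two users in a short time window.
# Uses the public MediaWiki Action API; the wiki, page title, and thresholds
# below are arbitrary assumptions for the example.
from collections import Counter
from datetime import datetime, timedelta

import requests

API = "https://en.wikipedia.org/w/api.php"  # example wiki
HEADERS = {"User-Agent": "community-health-example/0.1"}

def recent_revisions(title, limit=50):
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": limit,
        "rvprop": "user|timestamp",
        "format": "json",
        "formatversion": 2,
    }
    data = requests.get(API, params=params, headers=HEADERS).json()
    return data["query"]["pages"][0].get("revisions", [])

def possible_edit_war(title, window_hours=24, min_round_trips=3):
    """Return pairs of users who traded several edits on the page recently."""
    cutoff = datetime.utcnow() - timedelta(hours=window_hours)
    recent = [
        r for r in recent_revisions(title)
        if datetime.strptime(r["timestamp"], "%Y-%m-%dT%H:%M:%SZ") >= cutoff
    ]
    # Count how often consecutive edits alternate between the same two users.
    switches = Counter()
    for newer, older in zip(recent, recent[1:]):  # API returns newest first
        if newer["user"] != older["user"]:
            switches[frozenset((newer["user"], older["user"]))] += 1
    return [sorted(pair) for pair, n in switches.items() if n >= min_round_trips]

if __name__ == "__main__":
    print(possible_edit_war("Example"))
```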

Reporting

According to the Detox research, harassment is underreported on English Wikipedia.[4] No victim of harassment should abandon editing because they feel powerless to report abuse. We want to provide victims with improved ways to report incidents that are more respectful of their privacy, less chaotic, and less stressful than the current workflow. Currently the burden of proof is on the victim to prove their own innocence and the harasser's fault; we believe the MediaWiki software should perform the heavy lifting instead. A purely hypothetical sketch of what such a structured report might capture appears after the list below.

Potential features:

  • A new harassment reporting system that doesn't place the burden of proof on or further alienate victims of harassment.
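
No reporting workflow has been designed yet, so the sketch below is purely hypothetical. It illustrates the idea of letting the software do the heavy lifting: a structured report could capture the evidence (diffs, involved users, privacy settings) once, at submission time, instead of the victim re-assembling it at each venue. The HarassmentReport type and every field name are invented for this illustration.

```python
# Purely hypothetical sketch of a structured harassment report, so that evidence
# is captured once at submission time instead of being re-assembled by the
# victim at each venue. The HarassmentReport type and its fields are invented.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class HarassmentReport:
    reporter: str                 # username of the person filing the report
    reported_user: str            # username of the alleged harasser
    diff_urls: List[str]          # permanent links to the relevant diffs
    summary: str                  # free-text description by the reporter
    private: bool = True          # visible only to handling admins by default
    created_at: datetime = field(default_factory=datetime.utcnow)

report = HarassmentReport(
    reporter="ExampleVictim",
    reported_user="ExampleHarasser",
    diff_urls=["https://en.wikipedia.org/w/index.php?diff=123456789"],
    summary="Repeated insults on my user talk page after a content dispute.",
)
print(report.created_at, report.private)
```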

Evaluating

Proficiency with MediaWiki diffs, histories, and special pages is essential for admins to analyze and evaluate the true sequence of events in a conduct dispute. Volunteer-written tools such as the Editor Interaction Analyzer and WikiBlame help, but the current processes are time-consuming. We want to build tools that help volunteers understand and evaluate harassment cases and that inform the best way to respond. A simplified sketch of this kind of interaction query appears after the list below.

Potential features:

  • A robust interaction history tool, which will allow wiki administrators to understand the interaction between two users over time, and make informed decisions in harassment cases.
  • A private system for wiki administrators to collect information on users’ history with harassment and abuse cases, including user restrictions and arbitration decisions.
  • A dashboard system for wiki administrators to help them manage current investigations and disciplinary actions.
  • Cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages.
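
As context for the interaction history item above, the sketch below shows the underlying query in a greatly simplified form: fetch both users' recent contributions through the public MediaWiki Action API and list the pages they have both edited. A real tool would also need timing, diff links, and cross-wiki coverage; the usernames and the example wiki are placeholders.

```python
# Illustrative sketch of an "interaction history" query: list the pages two
# users have both edited, via the public MediaWiki Action API. This mirrors
# what volunteer tools like the Editor Interaction Analyzer do, in a greatly
# simplified form; usernames and the example wiki are placeholders.
import requests

API = "https://en.wikipedia.org/w/api.php"  # example wiki
HEADERS = {"User-Agent": "community-health-example/0.1"}

def contributions(user, limit=500):
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": user,
        "uclimit": limit,
        "ucprop": "title|timestamp",
        "format": "json",
        "formatversion": 2,
    }
    data = requests.get(API, params=params, headers=HEADERS).json()
    return data["query"]["usercontribs"]

def shared_pages(user_a, user_b):
    """Pages recently edited by both users; a starting point, not proof of anything."""
    pages_a = {c["title"] for c in contributions(user_a)}
    pages_b = {c["title"] for c in contributions(user_b)}
    return sorted(pages_a & pages_b)

if __name__ == "__main__":
    print(shared_pages("ExampleUserA", "ExampleUserB"))
```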

Blocking

We want to improve existing tools, and create new tools where appropriate, to remove troublesome actors from communities or from specific areas within them, and to make it more difficult for someone who is blocked from the site to return. A hypothetical sketch of a per-page block check appears after the feature list below.

Some of these improvements are already being productized as part of the 2016 Community Wishlist. See meta:Community Tech/Blocking tools for more information.

Potential features:

  • Per-page blocking tool, which will help wiki administrators to redirect users who are being disruptive without completely blocking them from contributing to the project; this will make wiki admins more comfortable with taking decisive action in the early stages of a problem.
  • Make global CheckUser tools work across projects, improving tools that match usernames with IP addresses and user agents so that they can check contributions on all Wikimedia projects in one query.
  • Sockpuppet blocking tools.
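
Because blocks are currently site-wide, the per-page blocking idea above can only be sketched hypothetically. The snippet below models it as a list of restrictions checked at edit time; the PageRestriction type and can_edit helper do not correspond to any existing MediaWiki feature or API.

```python
# Hypothetical sketch of a per-page ("partial") block check. MediaWiki blocks
# are site-wide as described above, so the PageRestriction model and can_edit
# helper here are invented purely to illustrate the concept.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class PageRestriction:
    user: str                   # restricted user
    page: str                   # page they may not edit
    expiry: Optional[datetime]  # None means indefinite
    reason: str

def can_edit(user: str, page: str, restrictions: List[PageRestriction],
             now: Optional[datetime] = None) -> bool:
    """Return False if an unexpired restriction bars this user from this page."""
    now = now or datetime.utcnow()
    for r in restrictions:
        if r.user == user and r.page == page:
            if r.expiry is None or r.expiry > now:
                return False
    return True

restrictions = [
    PageRestriction(
        user="ExampleUser",
        page="Talk:Example article",
        expiry=None,
        reason="Repeated personal attacks during a content dispute",
    ),
]
print(can_edit("ExampleUser", "Talk:Example article", restrictions))  # False
print(can_edit("ExampleUser", "Example article", restrictions))       # True
```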

Community involvement

Gathering, incorporating, and discussing community input is vital to the success of this initiative. We are building features for our communities to use — if we design in a vacuum our decisions will assuredly fail.

The plans presented in the grant, on this page, and elsewhere will certainly change over time as we gather input from our community (including victims of harassment, contributors, and administrators), learn from our research, and learn from the software we build. Community input includes, but is not limited to:

  • Socializing our goals
  • Generating, refining, validating, and finalizing ideas with community stakeholders
  • Conversations about freedom of expression vs. political correctness. It’s very important that this project is seen as addressing the kinds of abuse that everyone agrees about (obvious sockpuppet vandalism, death threats) and the kinds of abuse that people will differ over (gender, culture, etc.). The project will not succeed if it’s seen as only a “social justice” power play.[6]

Over the course of this initiative we plan to communicate with the community via regular wiki communication (talk pages, email, IRC) in addition to live-stream workshops, in-person workshops at hack-a-thons and Wikimanias, and online community consultations. At the moment, the best place to discuss the Community health initiative is on meta:Talk:Community health initiative.

Roadmap

Current status

As of this writing (mid-February 2017), the Community Tech and Community Engagement teams are in the process of hiring new team members, recruited specifically for this project. The goal is to have the new team in place by March 2017.

March 2017 – June 2017

Depending on team member start-dates and on-boarding, the first few months will include:

  • Development work on some of the tools that have already been identified by the community as needing improvement, most likely AbuseFilter and ProcseeBot.
  • The start of new research projects to help the team gain a deeper understanding of the current needs and problems.
  • Discussions with the volunteer community as a whole, as well as targeted discussions with wiki functionaries who handle disruptive behavior, people who have experienced and witnessed harassment on our wikis, and other important stakeholders.

July 2017 – June 2018

The first full fiscal year of this initiative is planned to include:

  • Research:
    • Focused research on the Harassment Reporting System.
    • Work with the community on potentially creating a new user group for volunteer administrators who want to work specifically on harassment cases. The community advocate will work with the Support and Safety team to provide training and support for these volunteers.
  • Designing, building, releasing, and iterating on:
    • usability & performance improvements to AbuseFilter
    • accuracy improvements to anti-spoof tools across multiple pertinent tools
    • a robust interaction history tool, similar to the edit interaction analyzer
    • a private system for admins to discuss and evaluate incidents of harassment
    • per-page blocking tools
    • cross-project improvements to CheckUser tools
    • robust sockpuppet identification and blocking tools

July 2018 – June 2019

Our plans will change based on our continual learnings, but at this moment in time the second fiscal year of this initiative is planned to include:

  • Designing, building, releasing, and iterating on:
    • the Harassment Reporting System
    • tools that surface brewing situations of harassment and vandalism to community leaders before they become large-scale incidents
    • a dashboard system for wiki administrators to help them manage current investigations and disciplinary actions
    • cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages
  • Internationalizing all tools we've built to date

See also

References

  1. ^ Wikipedia:List of administrators/Active. 2017-02-13 (in English).
  2. ^ Wikipedia:Harassment § Dealing with harassment. 2017-02-12 (in English).
  3. ^ Duggan, Maeve. Online Harassment. Pew Research Center: Internet, Science & Tech. 2014-10-22. Retrieved 2017-02-13.
  4. ^ Algorithms and insults: Scaling up our understanding of harassment on Wikipedia – Wikimedia Blog. Retrieved 2017-02-13.
  5. ^ Wikimedia Foundation receives $500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support a healthy and inclusive Wikimedia community – Wikimedia Blog. Retrieved 2017-02-13.
  6. ^ File:Wikimedia Foundation grant proposal - Anti-Harassment Tools For Wikimedia Projects - 2017.pdf (PDF). meta.wikimedia.org. Retrieved 2017-02-14 (in English).