Wikipedia talk:Guestbook for non-Chinese speakers

Shortcut:
WT:GB

Welcome to Chinese Wikipedia!

Please feel free to ask or request anything related to Chinese Wikipedia. Keep in mind our policies and guidelines.


Information: This wiki adheres to the standard bot policy, and allows global bots. Your bot needs approval before performing any new tasks.

Tools: bot approval, username change and usurpation
Chinese-speaking users, please go to the 互助客栈 (Village Pump) instead.

Leave a message in the guestbook · IRC

Archives

Wiki labels & Revision Scoring as a Service for Chinese Wikipedia

Hello Chinese Wikipedia,

I apologize for my complete lack of Chinese skills. I would very much welcome a translation of this post into Chinese.

So, computers are very good at crunching numbers. Your average calculator can outsmart you in arithmetic. However, computers are terrible at pretty much everything else. Programming a computer to undertake any task beyond computation, no matter how simple, tends to be very difficult. This is where Artificial Intelligence comes in: with Artificial Intelligence we teach computers how to solve problems without explicitly programming the solution. This is what we are doing.

We are working on a project called m:Research:Revision scoring as a service, which aims to provide quality-control Artificial Intelligence infrastructure for MediaWiki and Wikimedia projects. We already have our system implemented and running on the Azerbaijani, English, French, Indonesian, Persian, Portuguese, Spanish, Turkish and Vietnamese editions of Wikipedia. We hope to adapt our tool to serve Chinese as well as a number of other languages.

We are currently focusing mainly on vandalism detection, for which we provide an API (m:ORES) that returns scores. We have made an effort to keep our system robust.
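
For readers who want to try it, the scores are served over plain HTTP. Below is a minimal Python sketch; the host, API version and model name ("damaging") follow the English Wikipedia setup documented at m:ORES and are assumptions as far as any future Chinese deployment is concerned.

    # Minimal sketch: fetch a vandalism ("damaging") score for one revision.
    # Host, API version and model name are illustrative; see m:ORES for the
    # endpoints and models actually deployed for a given wiki.
    import requests

    url = "https://ores.wikimedia.org/v3/scores/enwiki/"
    rev_id = 123456  # hypothetical revision id
    response = requests.get(url, params={"models": "damaging", "revids": rev_id})
    response.raise_for_status()

    score = response.json()["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
    print(score["prediction"])            # True/False: is the edit likely damaging?
    print(score["probability"]["true"])   # confidence, e.g. 0.90 for 90%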

The examples I'll provide are based on a machine learning algorithm that was trained on 20,000 reverted edits. This kind of modelling is problematic for two reasons. First, there are non-vandalism reasons for edits to be reverted, such as mistakes by new users, which introduces an unproductive bias. Second, it cannot distinguish good-faith users from malicious ones. To demonstrate our system I will give three examples from English Wikipedia, picked semi-randomly (a schematic sketch of this kind of training follows the examples).

  • Score of 90% diff en:Moncef Mezghanni
    • As visible in the diff, this is clearly something that should not be welcome on English Wikipedia. The algorithm's confidence matches my human assessment.
  • Score of 75% diff en:Monin
    • When I look at the diff it is not immediately clear to me whether it should be reverted. A closer look reveals that the prior version had more neutral information, but the new version is not clear-cut vandalism at a glance, albeit spammy. The algorithm's confidence drops, just as my human assessment does.
  • Score of 19% diff en:Curiosity killed the cat, but satisfaction brought it back
    • As visible in the diff, this edit clearly improves the article. The algorithm's confidence plummets as well; the algorithm is more confident that this edit should NOT be reverted.
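
For those curious what "training on reverted edits" looks like mechanically, here is a schematic sketch using scikit-learn. It is not the project's actual revscoring pipeline: the features and values are invented for illustration, and a real model derives its features (badword counts, size change, editor experience and so on) from each revision.

    # Schematic sketch only: a binary "will this edit be reverted?" classifier.
    # The feature rows are toy values, not real revision data.
    from sklearn.ensemble import GradientBoostingClassifier

    # Each row: [characters added, badwords added, edit made while logged out]
    X = [
        [1200, 0, 0],   # large sourced addition by a registered user
        [-900, 0, 1],   # large anonymous blanking
        [15, 3, 1],     # short anonymous edit full of badwords
        [40, 0, 0],     # small copy-edit
    ]
    y = [0, 1, 1, 0]    # 1 = the edit was reverted, 0 = it was kept

    model = GradientBoostingClassifier().fit(X, y)

    # The classifier returns a confidence comparable to the percentages above.
    new_edit = [[20, 2, 1]]
    print(model.predict_proba(new_edit)[0][1])  # estimated probability of revert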

We are also working towards a system for article quality, where we use existing assessments by the en:Wikipedia:Version 1.0 Editorial Team to train our system. At the moment we only have this system on English Wikipedia, but we would be more than happy to expand it to other language editions. I am uncertain whether Chinese Wikipedia has a similar quality assessment scale. I have picked 5 random articles to demonstrate this.

A typical problem is that humans do not re-assess articles over time, or articles are never assessed in the first place. Our system circumvents this problem by automating the assessment.
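
To make automated (re-)assessment concrete, the sketch below fetches an article's latest revision id from the MediaWiki API and asks an ORES-style article quality model to grade it. The host, the model name "wp10" and its Stub/Start/C/B/GA/FA classes mirror the English Wikipedia setup and are assumptions here; a Chinese deployment would need its own assessment scale.

    # Sketch only: grade the latest revision of each article with an article
    # quality model. Hosts and model names for other wikis will differ.
    import requests

    API = "https://en.wikipedia.org/w/api.php"
    ORES = "https://ores.wikimedia.org/v3/scores/enwiki/"

    def latest_rev_id(title):
        params = {"action": "query", "prop": "revisions", "titles": title,
                  "rvprop": "ids", "format": "json", "formatversion": 2}
        page = requests.get(API, params=params).json()["query"]["pages"][0]
        return page["revisions"][0]["revid"]

    def assess(title):
        rev_id = latest_rev_id(title)
        data = requests.get(ORES, params={"models": "wp10",
                                          "revids": rev_id}).json()
        return data["enwiki"]["scores"][str(rev_id)]["wp10"]["score"]["prediction"]

    for title in ["Moncef Mezghanni", "Monin"]:
        print(title, assess(title))   # e.g. "Stub", "Start", "C", "B", "GA", "FA"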

Unfortunately, we lack language features for Chinese, such as lists of bad words, informal words and stop words; help compiling these would be very valuable. We also need a localization of en:Wikipedia:Labels to serve as our local landing page.
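
To make concrete what "language features" means here: the bad-word, informal-word and stop-word lists are turned into simple counting features over the text an edit adds. The tiny lists below are English placeholders only; real Chinese lists (and Chinese word segmentation) are exactly what we would need help with from the local community.

    # Sketch of how word lists become features. The lists are placeholder
    # English examples; a Chinese version also needs proper word segmentation.
    import re

    BAD_WORDS = {"idiot", "stupid"}       # placeholder list
    INFORMAL_WORDS = {"lol", "wanna"}     # placeholder list
    STOP_WORDS = {"the", "a", "of"}       # placeholder list

    def word_list_features(added_text):
        tokens = re.findall(r"\w+", added_text.lower())
        content = [t for t in tokens if t not in STOP_WORDS]
        return {
            "badwords_added": sum(t in BAD_WORDS for t in content),
            "informals_added": sum(t in INFORMAL_WORDS for t in content),
            "tokens_added": len(content),
        }

    print(word_list_features("This guy is a stupid idiot lol"))
    # {'badwords_added': 2, 'informals_added': 1, 'tokens_added': 6}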

Once these are complete, we would like to start an edit quality campaign in which we ask the local community to hand-label ~2,000 revisions as productive/damaging and good-faith/bad-faith. This would be similar to the campaign on English Wikipedia, en:Wikipedia:Labels/Edit quality.
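
For a sense of what the hand-labeling produces: each of the ~2,000 revisions ends up as a small record pairing a revision id with the two human judgments. The field names below are an assumption for illustration, not the actual Wiki Labels storage format.

    # Illustrative only: roughly what one hand-labeled revision looks like.
    # Field names are assumptions, not the real Wiki Labels schema.
    labeled_revision = {
        "rev_id": 123456,     # hypothetical revision id
        "damaging": False,    # productive (False) vs. damaging (True)
        "goodfaith": True,    # good faith (True) vs. bad faith (False)
    }

    # ~2,000 such records become the training and evaluation data for the models.
    training_set = [labeled_revision]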

After this we will be able to generate scores for revisions that are usable by gadgets such as ScoredRevisions, as well as (potentially) tools like Huggle. If the community desires it, the scores could even be used to create a local vandalism-reversion bot.
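
As a sketch of how a patrol tool could consume the scores, the snippet below lists recent changes from the MediaWiki API, scores them in one batch, and flags edits above a threshold for human review; automatic reverting would be a separate community decision. It assumes a "damaging" model deployed for zhwiki, which is precisely what this proposal would enable, and the threshold is illustrative.

    # Sketch: surface recent edits whose damaging score exceeds a threshold,
    # for human review only (no automatic reverting).
    import requests

    API = "https://zh.wikipedia.org/w/api.php"
    ORES = "https://ores.wikimedia.org/v3/scores/zhwiki/"   # assumed deployment
    THRESHOLD = 0.8                                         # illustrative cut-off

    changes = requests.get(API, params={
        "action": "query", "list": "recentchanges", "rctype": "edit",
        "rcprop": "ids|title", "rclimit": 25,
        "format": "json", "formatversion": 2,
    }).json()["query"]["recentchanges"]

    rev_ids = "|".join(str(c["revid"]) for c in changes)
    scores = requests.get(ORES, params={
        "models": "damaging", "revids": rev_ids,
    }).json()["zhwiki"]["scores"]

    for c in changes:
        score = scores[str(c["revid"])]["damaging"].get("score")
        if score and score["probability"]["true"] > THRESHOLD:
            print("{:.0%}  rev {}  {}".format(
                score["probability"]["true"], c["revid"], c["title"]))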

So, in a nutshell, our algorithm relies on community input in order to support the community. Feel free to ask any questions, either here, on Meta, or on IRC (the #wikimedia-ai channel on the freenode server, where we hang out). You can also reach us at https://github.com/wiki-ai

-- とある白い猫 chi? 7 August 2015 (Fri) 20:03 (UTC)

User:Jianhui67, do you think you can help with this? -- とある白い猫 chi? 7 August 2015 (Fri) 20:03 (UTC)
Awesome! In just the last few days, many WikiProjects were set up on Chinese Wikipedia, and a large number of talk pages were tagged with an unassessed banner by bots, like this one (1,067 unassessed articles out of 1,369). On top of that, an even larger number of pages are not tagged with a banner at all. As someone who also focuses on assessment, I believe it's a very useful tool for auto-assessing Stub-, Start- and C-Class articles. --CAS222222221 8 August 2015 (Sat) 06:46 (UTC)
User:CAS222222221: a good start would be the translation of Wiki Labels. -- とある白い猫 chi? 28 August 2015 (Fri) 21:17 (UTC)