[Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC
Hello everyone,

The next Research Showcase will be live-streamed next Wednesday, September
18, at 9:30 AM PT / 16:30 UTC. This will be the new regular time for
Research Showcases going forward, to make them more accessible across time zones.

YouTube stream: https://www.youtube.com/watch?v=fDhAnHrkBks

As usual, you can join the conversation on IRC at #wikimedia-research. You
can also watch our past research showcases here:
https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase

This month's presentations:

Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's
Verifiability

By Miriam Redi, Research, Wikimedia Foundation

Among Wikipedia's core guiding principles, verifiability policies have a
particularly important role. Verifiability requires that information
included in a Wikipedia article be corroborated against reliable secondary
sources. Because of the manual labor needed to curate and fact-check
Wikipedia at scale, however, its content does not always comply evenly with
these policies. Citations (i.e., references to external sources) may not
conform to verifiability requirements or may be missing altogether,
potentially weakening the reliability of specific topic areas of the free
encyclopedia. In this project
<https://meta.wikimedia.org/wiki/Research:Identification_of_Unsourced_Statements>,
we aimed to provide an empirical characterization of the reasons why and
how Wikipedia cites external sources to comply with its own verifiability
guidelines. First, we constructed a taxonomy of reasons why inline
citations are required by collecting labeled data from editors of multiple
Wikipedia language editions. We then collected a large-scale crowdsourced
dataset of Wikipedia sentences annotated with categories derived from this
taxonomy. Finally, we designed and evaluated algorithmic models to
determine if a statement requires a citation, and to predict the citation
reason based on our taxonomy. We evaluated the robustness of such models
across different classes of Wikipedia articles of varying quality, as well
as on an additional dataset of claims annotated for fact-checking purposes.

Redi, M., Fetahu, B., Morgan, J., & Taraborelli, D. (2019, May). Citation
Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability.
In The World Wide Web Conference (pp. 1567-1578). ACM.
https://arxiv.org/abs/1902.11116
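To make the classification task in the abstract concrete, here is a minimal, purely illustrative sketch of a "does this sentence need a citation?" classifier. The paper itself trains far more capable models on a large crowdsourced corpus; the toy bag-of-words pipeline and the tiny hand-labeled sentences below are assumptions for demonstration only, not the authors' method or data.

```python
# Illustrative sketch only: a toy bag-of-words classifier showing the shape
# of the citation-need prediction task (1 = citation needed, 0 = not needed).
# The sentences and labels here are made up for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_sentences = [
    "The city has a population of 2.1 million according to the 2010 census.",
    "Critics described the album as a commercial failure.",
    "The river flows through three countries.",
    "A kilometre is a unit of length.",
    "Water is a liquid at room temperature.",
    "The sky appears blue on clear days.",
]
train_labels = [1, 1, 1, 0, 0, 0]

# Unigram + bigram features feeding a logistic-regression classifier.
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_sentences, train_labels)

# Predict citation need for unseen sentences.
preds = model.predict([
    "The mayor was re-elected in 2018.",
    "Ice is frozen water.",
])
print(list(preds))
```

A real system along the lines of the paper would replace the toy features with a learned sentence representation and train on the crowdsourced annotations described above.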


Patrolling on Wikipedia

By Jonathan T. Morgan, Research, Wikimedia Foundation

I will present initial findings from an ongoing research study
<https://meta.wikimedia.org/wiki/Research:Patrolling_on_Wikipedia> of
patrolling workflows on Wikimedia projects. Editors patrol recent pages and
edits to ensure that Wikimedia projects maintain high quality as new
content comes in. Patrollers revert vandalism and review newly created
articles and article drafts. Patrolling of new pages and edits is vital
work. In addition to making sure that new content conforms to Wikipedia
project policies, patrollers are the first line of defense against
disinformation, copyright infringement, libel and slander, personal
threats, and other forms of vandalism on Wikimedia projects. This research
project is focused on understanding the needs, priorities, and workflows of
editors who patrol new content on Wikimedia projects. The findings of this
research can inform the development of better patrolling tools as well as
non-technological interventions intended to support patrollers and the
activity of patrolling.

--
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
_______________________________________________
Wikimedia-l mailing list, guidelines at: https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, <mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
Re: [Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC
Hello everyone,

This is just a reminder that the Research Showcase will be this Wednesday,
with Miriam Redi and Jonathan Morgan from the Foundation presenting.



--
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>
Re: [Wikimedia-l] [Wikimedia Research Showcase] Earlier time! September 18, 2019 at 9:30 AM PT, 16:30 UTC
The Research Showcase will be starting in about 30 minutes.



--
Janna Layton (she, her)
Administrative Assistant - Product & Technology
Wikimedia Foundation <https://wikimediafoundation.org/>