The Conversation
Academic rigour, journalistic flair

Why Wikipedia often overlooks stories of women in history
March 16, 2018 12.25pm SAST
Less than a third of biographical entries on Wikipedia are about women. aradaphotography/shutterstock.com
Authors

    Tamar Carroll

    Associate Professor of History, Rochester Institute of Technology
    Lara Nicosia

    Liberal Arts Librarian, Rochester Institute of Technology

Disclosure statement

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
Partners

Rochester Institute of Technology

Rochester Institute of Technology provides funding as a member of The Conversation US.

Movements like #MeToo are drawing increased attention to the systemic discrimination facing women in a range of professional fields, from Hollywood and journalism to banking and government.

Discrimination is also a problem on user-driven sites like Wikipedia, the fifth most popular website worldwide. In January, the English-language version of the online encyclopedia drew over 7.3 billion page views, more than 2,000 percent higher than the traffic of other online reference sites such as IMDb or Dictionary.com.

The volume of traffic on Wikipedia’s site – coupled with its integration into search results and digital assistants like Alexa and Siri – makes Wikipedia the predominant source of information on the web. YouTube even recently announced that it would start including Wikipedia links below videos on highly contested topics. But studies show that Wikipedia underrepresents content on women.

At the Rochester Institute of Technology, we’re taking steps to empower our students and our global community to address issues of gender bias on Wikipedia.
Signs of bias

Driven by a cohort of over 33 million volunteer editors, Wikipedia’s content can change in almost real time. That makes it a prime resource for current events, popular culture, sports and other evolving topics.

But relying on volunteers leads to systemic biases – in both content creation and content improvement. A 2013 study estimated that women accounted for only 16.1 percent of Wikipedia’s total editor base. Wikipedia co-founder Jimmy Wales believes that number has not changed much since then, despite several organized efforts.

If women don’t actively edit Wikipedia at the same rate as men, topics of interest to women are at risk of receiving disproportionately low coverage. One study found that Wikipedia’s coverage of women was more comprehensive than that of the online Encyclopedia Britannica, but entries on women still constituted less than 30 percent of biographical coverage. Entries on women also link to entries on men more often than vice versa, and are more likely to include information on romantic relationships and family roles.
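
These figures can be checked in a rough, do-it-yourself way, because the gender of a biography’s subject is recorded as structured data on Wikidata, the database linked to every Wikipedia article. The short Python sketch below is only an illustration of that mechanism, not the method used in the studies cited above: it looks up the Wikidata item behind a single English Wikipedia biography and reads its “sex or gender” claim (property P21). The example title, the helper name subject_gender and the use of the requests library are our own illustrative assumptions.

    # Illustrative sketch: read the "sex or gender" (P21) claim for one English
    # Wikipedia biography through the public Wikidata API. The example title is
    # an assumption chosen for illustration, not data from the studies above.
    import requests

    GENDER_LABELS = {"Q6581072": "female", "Q6581097": "male"}  # common P21 values on Wikidata

    def subject_gender(article_title):
        """Return the recorded gender of the subject of an English Wikipedia article."""
        response = requests.get(
            "https://www.wikidata.org/w/api.php",
            params={
                "action": "wbgetentities",
                "sites": "enwiki",        # resolve the title against English Wikipedia
                "titles": article_title,
                "props": "claims",
                "format": "json",
            },
            timeout=30,
        )
        response.raise_for_status()
        for entity in response.json().get("entities", {}).values():
            for claim in entity.get("claims", {}).get("P21", []):  # P21 = "sex or gender"
                item_id = claim["mainsnak"].get("datavalue", {}).get("value", {}).get("id")
                if item_id:
                    return GENDER_LABELS.get(item_id, item_id)
        return None  # no matching item found, or no gender recorded

    if __name__ == "__main__":
        print(subject_gender("Blanche Calloway"))

Repeating a lookup like this across a large sample of biographies is one way the share of entries about women could be estimated, although the published studies rely on their own, more careful methodologies.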

What’s more, Wikipedia’s policies state that all content must be “attributable to a reliable, published source.” Since women throughout history have been less represented in published literature than men, it can be challenging to find reliable published sources on women.

An obituary in a paper of record is often a criterion for inclusion as a biographical entry in Wikipedia. So it should be no surprise that women are underrepresented as subjects in this vast online encyclopedia. As The New York Times itself noted, its obituaries since 1851 “have been dominated by white men” – an oversight the paper now hopes to address through its “Overlooked” series.

Categorization can also be an issue. In 2013, a New York Times op-ed revealed that some editors had moved women’s entries from gender-neutral categories (e.g., “American novelists”) to gender-focused subcategories (e.g., “American women novelists”).
Next great American woman novelist? Roman Kosolapov/shutterstock.com
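
Because categories are part of each entry’s public metadata, anyone can check how a biography is currently categorized through the standard MediaWiki API. The minimal Python sketch below assumes the requests library; the example title and the helper name page_categories are our own illustrative choices. It simply lists the categories attached to one entry, which is enough to see whether a subject sits only in gendered subcategories or also in a gender-neutral parent category.

    # Illustrative sketch: list the categories attached to one English Wikipedia
    # entry via the MediaWiki API. The example title is an illustrative choice.
    import requests

    def page_categories(article_title):
        """Return the category names attached to an English Wikipedia article."""
        response = requests.get(
            "https://en.wikipedia.org/w/api.php",
            params={
                "action": "query",
                "prop": "categories",
                "titles": article_title,
                "cllimit": "max",   # up to 500 categories, plenty for a single article
                "format": "json",
            },
            timeout=30,
        )
        response.raise_for_status()
        categories = []
        for page in response.json()["query"]["pages"].values():
            for cat in page.get("categories", []):
                categories.append(cat["title"].removeprefix("Category:"))
        return categories

    if __name__ == "__main__":
        names = page_categories("Blanche Calloway")
        gendered = [name for name in names if "women" in name.lower()]
        print(gendered if gendered else names)

The same query, run before and after an edit, would also show when an entry has been moved out of a gender-neutral category, as described in the 2013 op-ed.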

Wikipedia is not the only online resource that suffers from such biases. The user-contributed online mapping service OpenStreetMap is also more heavily edited by men. On GitHub, an online development platform, women’s contributions have a higher acceptance rate than men’s, but a study showed that the rate drops noticeably when the contributor could be identified as a woman through their username or profile image.

Gender bias is also an ongoing issue in content development and search algorithms. Google Translate has been shown to overuse masculine pronouns and, for a time, LinkedIn recommended men’s names in search results when users searched for a woman.
What can be done?

The solution to systemic biases that plague the web remains unclear. But libraries, museums, individual editors and the Wikimedia Foundation itself continue to make efforts to improve gender representation on sites such as Wikipedia.

Organized edit-a-thons can create a community around editing and developing underrepresented content. Edit-a-thons aim to increase the number of active female editors on Wikipedia, while empowering participants to edit entries on women during the event and into the future.

Later this month, our university library will host its second annual Women on Wikipedia Edit-a-thon in celebration of Women’s History Month. The goal is to improve the content on at least 100 women in one afternoon.

For the past four years, students in our school’s American Women’s and Gender History course have worked to create new Wikipedia entries about women or substantially edit existing ones. One student created an entry on deaf-blind pioneer Geraldine Lawhorn, while another added roughly 1,500 words to jazz artist Blanche Calloway’s entry.

This class was supported by the Wikimedia Education Program, which encourages educators and students to contribute to Wikipedia in academic settings.

Through this assignment, students can immediately see how their efforts contribute to the larger conversation around women’s history topics. One student said that it was “the most meaningful assignment she had” as an undergraduate.

Other efforts to address gender bias on Wikipedia include Wikipedia’s Inspire Campaign; organized editing communities such as Women in Red and Wikipedia’s Teahouse; and the National Science Foundation’s Collaborative Research grant.

Wikipedia’s dependence on volunteer editors has resulted in several systemic issues, but it also offers an opportunity for self-correction. Organized efforts help to give voice to women previously ignored by other resources.

    Gender
    Women
    Biography
    Libraries
    Wikipedia
    Gender bias
    Bias
    Archives
    Women's history
    Women's history month

Comments

    Angelina Melansky

    There are a couple of fallacies in this argument: 1. Social determinants of participation were ignored. When we know that people’s socioeconomic status can play a very important role in their social behavior, ignoring this “confounding factor” when interpreting a crude result shows either a lack of scientific rigour or dishonesty.

        2. Wikipedia is an international effort and a human cultural achievement. Restricting the analysis to the English language alone reflects a deep bias in the mind of the writer, as if the world were centered on the English-speaking West. Maybe it is not.

        3. Did you talk about the disparity in contributions from different social classes? Is there enough representation of worker-related matters on Wikipedia? Can workers find useful and accessible resources there to represent themselves legally in a conflict with their employer? What about minorities? How did you decide that gender is the most important factor contributing to discrimination? Just because of #MeToo? (That movement is entirely valid as resistance to violent behavior toward women, in the hope of ending exploitation.)

    This is a conversation, and I wanted to bring these points to the attention of readers as well.
        Tamar Carroll

        Associate Professor of History, Rochester Institute of Technology
        In reply to Angelina Melansky

        Hi Angelina, thanks for your comments. Certainly poor and working-class people, people of color, and people with disabilities are also underrepresented on Wikipedia, for many of the same reasons that women are: they are less likely to participate in editing Wikipedia, and they are less likely to have been included in published sources in the past. We are not arguing that gender bias is the only bias present, nor that it is more important than others; however, in many cases our students are writing entries about women who are also people of color or disabled, for example, and are thus contributing to diversifying Wikipedia in multiple ways.
