CuriosityKat33, Reducing AI Bias by Reducing Wiki Bias, Wikimedia
I was lucky enough to spend ten weeks learning about and contributing to English Wikipedia under the Tāmaki Paenga Hira Auckland War Memorial Museum’s 2025/26 Sheldon Werner Summer Studentship. Alongside five other students, I learned all about the Wikimedia world through this project, which encouraged us to contribute articles about local history here in Tāmaki Makaurau Auckland.
Wikipedia’s focus on open access and reliable information was what stood out to me most during this studentship. I had been instructed through college and university to steer clear of Wikipedia due to its apparently “unreliable nature”, but the first week of the studentship flipped my perspective completely. Reliability is one of the ‘non-rule’ rules of Wikipedia – articles should be based on trusted, published sources with a known reputation for accuracy. This ensures that content on Wikipedia is unbiased, reliable, and easily verified.
In the age of Generative AI, the reliability of information is increasingly under threat, challenging Wikimedia’s core principles. The Tow Center for Digital Journalism found that generative search tools, free and premium alike, confidently supply incorrect or speculative answers and fabricate links and citations. AI’s coded desire to please human demands at what seems to be all costs is increasingly dangerous for the reliability, transparency and diversity of knowledge. Not only can it spread misinformation, but because its machine learning systems are trained on inherently biased human data, the risk of continuously perpetuating historic social inequities is very real. AI Overviews, now a permanent fixture of Google, can be inaccurate and misinterpret web content. And yet AI is increasingly popular – ChatGPT reported between 400 million and 700 million monthly users in 2025.
AI chatbots and generated summaries are often powered by Wikipedia due to its open-access model. Wikipedia strives to present neutral, balanced viewpoints on topics; however, bias seeps in through significant representation gaps across the platform – a reflection of wider societal knowledge bias. In terms of gender gaps, Wikimedia contributors are 87% male, and only 18.13% of all Wikimedia content is about women and women’s histories. Europe and Eurocentric views are also overrepresented – Africa has double the population of Europe but only 15% as many articles. If representation gaps exist in Wikimedia, then they will certainly exist in AI-generated responses, and this risks the continued suppression of knowledge about and from marginalised people, communities and subjects.
Enterprising Wikimedians have developed many projects which aim to increase the diversity of information on Wikipedia, something I became aware of early on in my studentship. My contributions creating and enriching articles about local history in Tāmaki Makaurau add to a niche collection of articles that represent Aotearoa New Zealand to a global audience. Inspired by the Women in Red WikiProject, I decided to hone my research and writing on women in art and architecture within a Tāmaki context. This studentship privileged me with easy access to the Tāmaki Paenga Hira Auckland Museum’s Research Library as well as paywalled digital libraries, so I could collate lesser-known and harder-to-find information to enhance articles about women and women’s subjects on Wikipedia.
I was proud of the article I wrote on the Auckland Women’s Suffrage Memorial. Thousands of people walk past this mural every day in Auckland CBD. However, information on this memorial was collecting dust on library shelves, and on Wikipedia the memorial was sequestered to the bottom of another article. It felt fitting to give a proper platform to the memorial commemorating women’s fight in Aotearoa for an equal place in society and history. As part of the studentship, I co-hosted an edit-a-thon – Celebrating the Contributions of Working Wāhine in Aotearoa. We had new and experienced editors alike contributing to stub- or start-class articles about women in the arts, law, politics and STEM in Aotearoa New Zealand. Hosted at the Research Library, this in-person event allowed an online community to create human connections with other editors and use library resources to enhance women’s history in New Zealand.
Wikimedians across the globe work hard to fill representation gaps, and the rise of Generative AI makes their work all the more important. Wikipedia’s lauded open-access model means it is inevitable that AI draws upon Wikipedia, so WikiProjects such as Women in Red, New Zealand Women in Architecture and Migrant of Colour Stories Aotearoa are as important as ever. Telling local histories and writing about women, indigenous peoples and marginalised communities prevents a patriarchal, Eurocentric worldview from prevailing in data and in society.
