Wikipedia has become the dominant source of reference information for more than half a billion people. Through its improbable rise to popularity, this “free encyclopedia that anyone can edit” has also become a synecdoche for open production communities online. To operate at massive scale (~160k edits per day), Wikipedians have embraced algorithmic technologies that bring efficiency and consistency to the wiki’s complex, distributed processes. These algorithms mediate social processes, governance decisions, and editors’ perceptions of each other. In particular, so-called “black box” artificial intelligences have proven invaluable for supporting curation activities at scale, but they can also silence voices and introduce biases in subtle ways. Despite Wikipedians’ open governance model, that is exactly what has been happening. In this talk, I’ll introduce “ORES,” an open AI platform designed to enable Wikipedia’s technologists to enact alternative ideological visions and to enable researchers to more easily audit models for bias. I’ll share some lessons we’ve learned maintaining a large-scale, generalized AI service and discuss our next directions in building the foundation for an open, collaborative ecosystem of AI governance.
Dr. Halfaker is a principal research scientist at the Wikimedia Foundation and a senior scientist at the University of Minnesota. He studies the intersection of advanced algorithmic technologies and social issues in open production communities (like Wikipedia) using a mixture of experimental engineering, data science, and ethnographic methods. His studies of Wikipedia’s editor decline and his development of “ORES,” an open AI platform for Wikipedians, have received attention from the technology press.