
He started the Wikipedia page on the Buffalo shooting and many other tragic events

After Jason Moore, of Portland, Oregon, saw headlines from national news sources on Google News about Saturday afternoon's mass shooting at a Buffalo supermarket, he did a quick search for the incident on Wikipedia. When no results emerged, he wrote a single sentence: “On May 14, 2022, 10 people were killed in a mass shooting in Buffalo, New York.” He clicked save and published the entry on Wikipedia in less than a minute.

The article, which had more than 900,000 views as of Friday, has since undergone 1,071 edits by 223 editors who voluntarily update pages on the internet's largest free, participatory encyclopedia. Moore, who works as a strategist for a digital creative agency, has made nearly 500,000 edits to Wikipedia articles over the past 15 years, ranking him among the 50 most active English Wikipedia users of all time by edit count. (Wikipedia editors are not paid.)

“It’s a hobby,” Moore told CNN Business. “Sometimes I spend a lot of time diving in and fleshing out an article, but other times I write a sentence or two to get the ball rolling and watch other editors improve my work: plant the seed and see it evolve over time.”

He is credited with creating 50,000 entries, including major news pages such as the one for the 2021 United States Capitol attack. He was also one of the main editors of the pages on the George Floyd protests and Black Lives Matter. “During lockdown I had a lot more time that I could devote to quality work on Wikipedia,” he said.

In the midst of breaking news, when people are looking for information, some platforms may present more questions than answers. Although Wikipedia is not staffed by professional journalists, it is considered an authoritative source by much of the public, for better or for worse. Its entries are also used for fact-checking purposes by some of the biggest social platforms, adding to the stakes and scope of the work of Moore and others.

The day after the Buffalo shooting, Moore created the page for the shooting at Geneva Presbyterian Church in Laguna Woods, California, where one person was killed and five others were injured, four seriously. He has also created pages about earthquakes, wildfires, terrorist attacks and other news stories.

“Editing Wikipedia can absolutely cost me emotionally, especially when working on difficult topics such as the COVID-19 pandemic, mass shootings, terrorist attacks and other disasters,” he said. “I’ve learned to minimize this by stepping away when necessary and revisiting tasks later.”

Moore is part of a subculture of Wikipedia users who spend hours each day contributing to the platform, helping fulfill the organization’s mission to “create and distribute a free encyclopedia of the highest quality possible to every person on the planet in their own language”. He describes his work as a volunteer editor as “rewarding”.

“I love the instant gratification of making the internet better,” he said. “I want to point people towards something that’s going to give them much more reliable information at a time when it’s very difficult for people to understand which sources they can trust.”

While anyone can contribute to a Wikipedia article, editors who are fast, reliable, and resourceful have developed strong reputations in the tight-knit community of Wikipedia editors. Steven Pruitt, for example, is perhaps Wikipedia's best-known editor: he has made over 4.7 million edits, more than any other user on the site. In 2017, Time magazine named him one of the 25 most influential people on the internet.

Some of these expert users attend conferences and meetings of Wikipedia editors around the world. “We’re kind of like ants,” Moore said. “You kind of find out how you fit in and how you can help.”

Cutting through the noise

Although Wikipedia's topics vary widely, it has evolved over the years into a destination for up-to-date information on breaking news. Wikipedia articles on current events often generate hundreds of thousands of views, and other big tech companies, such as Facebook and YouTube, often use Wikipedia to verify content on their own platforms. (Wikipedia entries summarize, present, and cite reliable sources, as well as links to useful resources that might otherwise be considered secondary in a traditional news article.)

Lane Rasberry, who works at the University of Virginia's School of Data Science and has been a volunteer Wikipedia editor for 10 years, said there is also an appeal, and a culture, around people who edit highly publicized current events on Wikipedia.

“It’s considered cool if you’re the first person to create an article, especially if you do it well with high-quality contributions,” Rasberry said. “Just like when a celebrity dies, we rush to go to Wikipedia and change their [date of] death. People like to be the first…and also to have an impact” by quickly disseminating reliable and accurate information.

To help patrol incoming edits and flag faults or errors, Wikipedia, like Twitter, uses artificial intelligence bots that can forward suspicious content to human reviewers who monitor the content. However, volunteer editors from the Wikipedia community decide what to remove or edit. The platform also relies on administrators, trusted users who can apply or be appointed for the role, to help monitor content.

Rasberry, who also wrote Wikipedia's page on the platform's fact-checking processes, said Wikipedia doesn't employ paid staff to monitor content unless it involves “strange and unusual” serious crimes like terrorism or real-world violence, such as someone using Wikipedia to make threats or plan a suicide, or when Wikipedia itself is part of a crime.

Rasberry said Wikipedia's flaws include geographic bias, which stems from communication difficulties between languages, uneven internet access in low- and middle-income countries, and obstacles to press freedom around the world.

Additionally, the organization behind Wikipedia has previously stated that it believes only a small percentage of Wikipedia editors are women. Other issues involve “deletionism,” when an article is deleted because there isn't enough journalism to support the topic, and ideological bias, where Wikipedia can mirror the ideological bias of the information ecosystem.

Another problem is vandalism, or people making deliberately erroneous edits to Wikipedia pages. But Moore said he’s not worried about his own pages being vandalized because he thinks Wikipedia’s guidelines and policies work in his favor.

“I have many other editors I work with who will back me up, so when we come across vandalism or trolls or misinformation or disinformation, the editors are very quick to undo inappropriate edits or remove inappropriate or poorly sourced content,” Moore said.

While “edit wars” can break out on pages, Rasberry said they tend to happen more often over social issues than breaking news. “People have always assumed that edit wars [play out on] Wikipedia, and it doesn't happen as much as outsiders expect,” he said. “Wikipedia has technological and social structures in place, which most people find enjoyable and appropriate, and which allow many people to edit at the same time.”

Wikipedia also publicly displays who made each edit to an article through its history page, and each article has a “talk” page that allows editors to openly discuss edits.

“Admins are very quick to block those who don’t follow the rules, so if you come to Wikipedia with bad intentions, you’re wasting your time because we’ll stop you from contributing to the site,” Moore said.

Full access to news sources can also be a challenge. Rasberry said that because of news and magazine subscription fees, some Wikipedia editors may not be able to access these sources and cite them in their updates. “Access to media and interpretive media is a major bottleneck,” Rasberry said, arguing that “news agencies [should] see Wikipedia more as a collaborator than a rival source of information.”

Wikipedia volunteers have compiled extensive guidance on reliable sources of information. A Wikipedia page dedicated to the subject notes that articles should be “based on reliable and published sources, ensuring that all significant majority and minority opinions that have appeared in these sources are covered”.

“If no reliable source can be found on a topic, Wikipedia should not have an article on it,” the page says.

Although Moore is known among his friends, colleagues and fellow members of the Wikipedia editor community as a Wikipedia influencer, the weight of that title is far less than the fame one can acquire on YouTube, Instagram and TikTok.

“I don’t spend all my time contributing to Facebook and Twitter and those other platforms because I’m deeply committed to Wikipedia’s mission,” he said. “If it was a paid advertising site or had another mission, I wouldn’t waste my time.”

The-CNN-Wire

™ & © 2022 Cable News Network, Inc., a WarnerMedia company. All rights reserved.
