An arcane moderation process leaves public-ish individuals in the lurch when articles get too personal
Sometimes, though, things go wrong. Rules are bent or used to justify bad behavior. The Wikipedia platform is leveraged as a tool for abuse.
Take the case of Jenny Nicholson. She makes videos for YouTube about theme parks, bad books, and porgs — the rotund, squawking bird creatures from Star Wars. She’s not an elected official, and she doesn’t run a Fortune 500 company. Yet, for a time, her Wikipedia page included deeply personal information, like names of old pets and when she got a certain job. It was stuff that went well beyond simple biography.
Deciding what goes on Wikipedia — in terms of who or what gets an article and what information should be included — is a complicated affair. The online encyclopedia has massive documents detailing criteria like notability of subjects, what sources of information are reliable, and how to write biographies of living people. For subjects that obviously merit inclusion, like the president or the CEO of a major tech company, the system is supposed to keep information accurate, neutral, and respectful. In edge cases like Nicholson’s, however, there’s a lot more room for failure.
As Nicholson explained earlier this month in a thread on Twitter, an old version of her Wikipedia page contained detailed personal information that was sourced from a backlog of thousands of tweets, many of which were made long before she began her YouTube career in earnest. While the tweets were technically public, they were also part of that vast digital archive that most of us have but rarely think about — and certainly never expect someone to comb through and mine for minor details. The result was an article that felt less like an encyclopedia entry and more like a report from a stalker.
Other YouTubers chimed in to note similar problems with their own Wikipedia entries. Natalie Wynn, who runs the left-wing video essay channel ContraPoints, pointed out that the citation for her hometown was actually sourced from a tweet she had made about astrology. Harry Brewis, who runs the video essay channel hbomberguy, shared that Wikipedia got his age wrong, and when he attempted to correct it, editors reverted his changes.
Of course, YouTubers aren’t the only ones subjected to questionable entries on Wikipedia. At the time of writing, Wikipedia’s directory of articles nominated for deletion included an “actor” whose only listed role on IMDb was a small part he played when he was nine. His biography listed his ancestry, where he went to school, and his favorite cricket team. Another entry, for a fitness expert, listed where he lived when he was 10 and how many siblings he had. Yet another article, for a Puerto Rican vocalist, was so lacking in notable information that one editor believed she might not even exist. (Her music was available on Spotify, but no significant news coverage of her could be found.) The article still listed where she went to school, whom she performed with, and the members of a band she supposedly started.
Wikipedia’s policies on reliable sources forbid the use of self-published sources like tweets. The site allows some limited exceptions when the person is publishing information about themselves, but even in those cases, the article has to follow Wikipedia’s policies regarding biographies of living people. If the subject is a living person — as Nicholson is — Wikipedia specifically cautions against collecting personal information about them.
According to Trudi Jacobson, head of the Information Literacy Department at the University at Albany, SUNY, the policy against self-published sources makes plenty of sense. “Social media posts really are not appropriate or possibly even permissible for tertiary sources such as dictionaries or encyclopedias, but that doesn’t mean they won’t be used,” Jacobson says. Self-reporting sources like social media are unlikely to be neutral and, even when covering basic details, might not be reliable.
Yet, as Nicholson’s case highlights, things that are against the rules can still happen. “Ideals and reality are in conflict here,” Jacobson says. “While some reputable biographical settings, such as Wikipedia, have a number of policies that should preclude social media posts… from being included, the reality is that anything one has put out publicly might be used, shared, or covered in some way.”
Wikipedia’s immune system
Wikipedia has a number of methods to find articles that don’t conform to its standards, but they’re not infallible. According to the Wikimedia Foundation, which owns the Wikipedia domain and framework, a group of volunteers called the New Page Patrol examines new articles as they’re added. In addition, automated bots crawl Wikipedia constantly to look for “common errors or forms of vandalism,” according to a spokesperson for the foundation. But none of these tools caught Nicholson’s article. It wasn’t new when the problematic personal information was added, and it didn’t trigger the automated vandalism warnings.
The next safeguard is human moderation. Users can keep an eye on existing articles by adding them to their Watchlist. “By default, the person who initially creates an article is watching the article,” says a spokesperson for the Wikimedia Foundation. “Other editors will usually add the article to their watchlist as it is categorized and tagged or as they make changes themselves.”
With 200,000 volunteers editing articles every month, the goal is for someone to always be watching, but that may not always be true. In Nicholson’s case, the attention the article received from her tweets became a necessary part of Wikipedia’s immune system. As the Wikimedia Foundation explains, “Ultimately, the success of an article depends on its readership and how many people are interested in that topic; the more participation and interest there is in any given topic, the higher the quality of the articles around that topic tend to be.”
Frustratingly, the inverse also holds: the fewer people interested in an article, the lower its quality is likely to be. Case in point: Nicholson was aware of an older revision of her article that listed irrelevant and personal information, and it was live for at least a couple of months.
“It didn’t sound right, like a legitimate Wikipedia article. Like, the language was clumsy. It was a little parasocial and listed irrelevant things,” she explains. However, Nicholson didn’t act on it because, she says, “I just assumed it was going to get edited or deleted over time.” Indeed, Wikipedia’s system is designed to do that, but it failed in this case, in part because too few people were paying attention.
Reaching consensus
Even once a problem is discovered, resolving it can take time. The Wikimedia Foundation notes that volunteers craft and enforce Wikipedia’s policies. Since many of the site’s policies can be contradictory or subjective, volunteer editors resolve disputes among themselves via a process called consensus. As Stephen Harrison, who writes the Source Notes column for Slate about Wikipedia and the internet’s knowledge ecosystem, explains, “Members of the Wikipedia community will debate whether proposed edits are consistent with the site’s policies and guidelines… If the discussion leads to a consensus — for example, that certain personal information is not appropriate for the encyclopedia — then that content will be removed.”
The consensus process can get messy. It’s not dictated by a vote and rarely leaves everyone involved happy. To achieve consensus, editors discuss the problems and make changes until the result is a version of the article most people are satisfied with.
But consensus takes time. When even more unsettling personal details were added to Nicholson’s article, she decided to speak up publicly on Twitter. The same day, the article was heavily pared down and locked, and editors opened a discussion page to debate whether to delete the article entirely. During that time, most older versions of her article were still available through the revision history feature. Wikipedia’s policies allow at least one week of debate before an article is deleted.
Consensus also depends on who shows up to the discussion. In the now-deleted talk page for Nicholson’s article, one user asked whether a suitable photo could be found that Wikipedia had the rights to use. Another user chimed in, saying, “Any picture posted to social media is in the public domain,” which is categorically false: uploading a photo to social media does not remove its copyright.
In another part of the talk page, a user questioned why Nicholson would want her article deleted, since she makes money by creating YouTube videos, and further opined, “I don’t see anything in this article that could in any way be a threat to her safety.” Wikipedia’s notability policies do not take into account how a subject makes their living or how much they may want coverage; they focus instead on whether the subject has received “significant coverage in reliable sources that are independent of the subject.” Nor do Wikipedia’s policies require private information to pose a threat to a person’s safety before deletion is justified. In other words, both of these opinions are irrelevant.
Of course, the talk page’s purpose is for debate. People being wrong and eventually (hopefully) corrected is the point. The ideal here, to borrow Jacobson’s term, is that Wikipedia editors will spot a problem with an article in a timely manner, debate the issues, eventually come to a consensus, and make the appropriate edits or deletions. The reality, on the other hand, is that there is no team of editors on standby discussing every article for every YouTuber, Instagrammer, or TikTok star with a following. Sometimes things slip through the cracks.
From Wikipedia’s perspective, this is the process working as intended. For subjects on the edge of (or outside) notability, however, it can feel like a bigger deal. Until Wikipedia editors discover a problem and take the proper action, detailed information about anyone with even mild notability can end up centralized and indexed for weeks or months. Wikipedia’s process requires a lot of time, energy, and manpower. But how many of those resources are going to be devoted to every musician, child actor, or YouTuber?
As Nicholson puts it, “I get the idea of a public right to information. Like, what if it’s a politician? Or a really influential CEO? But when it’s a YouTuber, I just don’t know.” Wikipedia doesn’t know, either, until a problem is discovered, consensus is reached, and action is taken. Until that process is finished, it can cause a lot of stress for the subject — assuming the person exists, of course.