Wikipedia editors have developed a new guide to aid readers in identifying entries written by artificial intelligence. The guide, titled "Signs of AI Writing," aims to provide a "feel" for spotting undisclosed AI-generated content. According to Ilyas Lebleu, a volunteer Wikipedia editor in France, AI writing isn't necessarily bad, but it requires more scrutiny.
The guide highlights several signs that may indicate AI-generated content, such as unnatural writing styles, fabricated citations, and overuse of certain words or phrases. For instance, AI-generated text may read plausibly while being entirely fabricated, or it may cite sources that are unrelated to the article's subject or do not exist at all. AI-generated content may also exhibit inconsistent formatting or language patterns characteristic of chatbot responses.
The guide is part of Wikipedia's efforts to address the issue of low-quality AI-generated content. In August 2025, Wikipedia adopted a policy allowing editors to nominate suspected AI-generated articles for speedy deletion. The community has been discussing the role of AI in creating content, with some editors finding AI useful for starting drafts or creating new articles, while others raise concerns about the potential for misinformation and bias.
By publishing this guide, Wikipedia aims to help readers critically evaluate the content they encounter on the platform. As AI-generated content becomes more prevalent, readers benefit from knowing the signs of AI involvement and approaching such content with healthy skepticism.