Wikipedia has put a temporary stop to its AI-generated summary experiment following criticism from its editor community. The test, first announced earlier this month, aimed to provide users with brief, AI-written overviews of articles. These summaries appeared at the top of Wikipedia pages with a yellow “unverified” label and were only visible to users who opted in via a special browser extension.
The goal of the pilot was to explore how artificial intelligence could help users quickly understand article content. Users had to click to expand the summaries, which were meant to complement—not replace—the main body of content written and maintained by human editors.
However, the reaction from the editor community was swift and largely negative. Many editors questioned the accuracy of the AI-generated summaries, arguing that even with a warning label, erroneous content could mislead readers. Critics felt the feature risked undermining Wikipedia's hard-won reputation for reliability, which rests on content written and reviewed by human editors.
A major worry was the risk of AI “hallucinations,” a known issue where AI tools generate content that sounds factual but is incorrect or misleading. This problem has been observed in other media outlets experimenting with AI, such as Bloomberg and CNET. Both have had to issue corrections or scale back AI-driven content due to errors.
Wikipedia has not ruled out AI entirely. The Wikimedia Foundation said the test was part of a broader effort to explore how AI could improve accessibility and the overall user experience, for example by making articles easier to digest for readers with visual impairments or learning difficulties.
The Foundation also stressed that any AI tools would be introduced only in collaboration with the Wikipedia community, with strict guidelines governing their use to preserve transparency and trust.
The pause reflects a wider debate about how AI should be used on platforms like Wikipedia and in news media. While AI can assist with tasks such as summarizing or translating content, it also raises concerns about trust, editorial control, and accuracy: values the Wikipedia community has defended for more than two decades.