Wikipedia, Qatar, and the Future of Knowledge
Qatar’s Prime Minister and Minister for Foreign Affairs Sheikh Mohammed bin Abdulrahman bin Jassim Al-Thani speaks on the first day of the 23rd edition of the annual Doha Forum, in Doha, Qatar, Dec. 6, 2025. Photo: REUTERS/Ibraheem Abu Mustafa
Imagine a world in which facts can be erased from one of society’s key sources of information.
A world where foreign governments and terror supporters have a say in whether you should know something or not.
A world where truth is malleable and facts are twisted to fit pre-determined narratives.
No, this isn’t an Orwellian dystopia. It’s Wikipedia as it currently operates: one of the world’s most influential websites and a primary source of information for millions.
Because of how it crowd-sources information, Wikipedia is one of the most extensive sources of knowledge on the Internet (and possibly in the entire world). However, this same strength is also Wikipedia’s biggest weakness, leaving it vulnerable to manipulation by autocracies, terror supporters, and other bad actors.
From recently uncovered Qatari influence to a secret network of anti-Israel activists, we’ll take a look at how the truth is being manipulated on Wikipedia, and what this means for our understanding of the world.
Wikipedia was meant to democratize knowledge, but today it’s a hub for deliberate fake info and erasing documented history by rogue editors – what I call Knowledge Poisoning.
The list of victims is endless:
Iranian protestors in Iran, Iryna Zarutska, Women, Jews and more. The…
— Ella Kenan (@EllaTravelsLove) January 17, 2026
In Qatar’s case, an investigation by the Bureau of Investigative Journalism found that the PR firm Portland Communications was hired after Qatar was selected to host the 2022 World Cup. Its job was to edit Wikipedia articles related to human rights, and to suppress other unflattering facts that threatened the state’s international image.
According to the report, between 2013 and 2024 Portland Communications directed a network of subcontractors to edit Wikipedia articles on human rights in Qatar, as well as entries on Qatari politicians and businessmen accused of corrupt or unethical conduct.
The edits were deliberately small and incremental, designed to evade detection and slip past the scrutiny of other Wikipedia editors.
In short, anyone researching Qatar on Wikipedia has not been presented with a full or nuanced picture of the Gulf state.
Instead, they encountered paid-for reputation management designed to polish its image and suppress unflattering facts. In the process, Wikipedia shifted from an information resource to a vehicle for indoctrination.
The @TBIJ just revealed a UK PR firm allegedly paid intermediaries to rewrite Wikipedia pages — burying criticism of Qatar and reshaping public perception.
Hidden edits like these launder reputations, making biased content appear neutral to millions. The Portland case is the…
— Ashley Rindsberg (@AshleyRindsberg) January 17, 2026
Nor is Qatari influence confined to Wikipedia. Analyst Eitan Fischberger has noted that the Qatar Investment Authority has invested billions of dollars in Elon Musk’s xAI — a development with potential implications for how Qatar is portrayed on Grokipedia, xAI’s alternative to Wikipedia.
If this pattern continues, the result is straightforward: future audiences may encounter a curated version of Qatar that downplays human rights abuses and other reputational liabilities. By strategically funding the platforms people rely on for information, a state need not censor facts outright, as it can simply ensure they are never meaningfully encountered.
I was about to tweet about how Grokipedia is already far better than Wikipedia.
And I was going to illustrate this by comparing their pages on Qatari foreign influence in American universities.
But it looks like Qatar noticed too. Hence the billions now being thrown at xAI, as…
— Eitan Fischberger (@EFischberger) January 8, 2026
Wikipedia’s Untrustworthiness on Israel
For those who have followed developments around Wikipedia, the revelation that Qatar actively sought to edit articles in its favor came as little surprise. Abuse of the crowdsourced encyclopedia by bad-faith actors has been documented for years.
In 2024, investigative journalist Ashley Rindsberg published an in-depth exposé about a group of 40 activists who had engaged in a coordinated campaign of anti-Israel disinformation since 2020.
According to Rindsberg, this group accounted for 90 percent of the content on dozens of Israel-related articles and made a combined total of more than two million edits on over 10,000 articles.
This coordinated effort has transformed Wikipedia’s Middle East narrative: Zionism is increasingly framed as inherently evil, Hamas’ violent Islamist ideology is softened or obscured, Iranian human rights abuses are minimized, and the Jewish historical connection to the Land of Israel is routinely challenged or erased.
Rindsberg has also identified another coordinated effort: a group known as Tech for Palestine (TFP), which formed during the recent Israel–Hamas war and edited thousands of Wikipedia articles related to Israel.
In its own welcome message on the platform Discord, the group explained its focus on Wikipedia by noting that the encyclopedia’s “content influences public perception.”
Most recently, independent investigative journalist David Collier conducted a deep dive into a Wikipedia claim that the Israeli town of Ofakim was built on a depopulated Bedouin village. He found that the cited books and maps did not support the claim at all, and that the evidence had been effectively fabricated through misrepresentation.
Yet the claim remains on Wikipedia, upheld by a decision from an anti-Israel activist editor, and it continues to feed into AI systems that treat Wikipedia as authoritative, compounding the misinformation.
Wikipedia’s Israel problem is no longer in dispute. As long as activist editors retain outsized control over key articles, the Internet’s largest encyclopedia remains an unreliable source for understanding Israel, the Palestinians, and the Middle East.
Exclusive: A detailed investigation exposes how false claims on @Wikipedia fabricate history – then get laundered into activist and media campaigns used to smear elected officials and demand resignations
Thread
— David Collier (@mishtal) January 5, 2026
How Wikipedia Influences Your Life — Even Without Your Knowledge
According to Wikipedia’s own data, the site is viewed nearly 10,000 times per second, totaling close to 300 billion page views annually. In practice, this means a significant portion of the world’s population relies on Wikipedia for basic knowledge, often without realizing how susceptible it is to manipulation by bad-faith actors.
And opting out is not an escape. Even users who never consult Wikipedia themselves are still influenced by it, as many AI systems draw on Wikipedia as an authoritative source, recycling its distortions at scale. And to mark its 25th anniversary, Wikipedia has signed content partnerships with major AI companies, including Meta, Microsoft, Amazon, Perplexity, and Mistral AI.
This influence is already embedded in everyday technology. Google’s search results routinely draw on Wikipedia as a trusted reference, while voice assistants such as Alexa and Siri rely on it to answer basic factual queries.
In practice, Wikipedia now functions as a foundational layer of the modern information ecosystem.
At the center of everything is Wikipedia.
Wikipedia articles appear in 67%-84% of all search engine results & most info boxes
Wikipedia generates 43M clicks to external websites a month
Wikipedia is a major component of AI training data, including The Pile training set
— Ethan Mollick (@emollick) May 30, 2023
Whether you consult Wikipedia directly, ask an AI system for information, or turn to Siri with a question, you are being shaped by the thousands of editors whose collective work forms Wikipedia.
Most of those editors are diligent volunteers committed to accuracy and the pursuit of knowledge. Some, however, are not. They omit facts, introduce disinformation, and quietly reshape narratives to fit an ideological agenda.
The real danger is not Wikipedia’s scale, but the trust it enjoys. Too often, it is treated as neutral while users have no reliable way to distinguish between an article written to inform and one designed to manipulate.
The author is a contributor to HonestReporting, a Jerusalem-based media watchdog with a focus on antisemitism and anti-Israel bias — where a version of this article first appeared.