
The Reshaping of the Information Ecosystem


The way people find information is undergoing a structural break. For two decades, the web grew on the back of search engines sending users outward: a query produced links, traffic flowed to publishers, and the economics of journalism, reviews, and analysis were sustained by advertising impressions and subscription funnels. That model is now collapsing.

Search no longer wants to be a referral system. Google and its peers are redesigning themselves as destinations. The introduction of AI Overviews marks a turning point: questions are answered directly on the results page, stitched together from the open web but without passing attention back to its sources. Advertisements are blended into those synthetic answers. The funnel compresses to query → overview → ad click, and the publisher is cut out.

This change does not merely skim a percentage off the top; it shifts incentives across the system. Publishers who once depended on search visibility now find their reach shrinking, with some reporting referral declines of up to 25% year-on-year [1]. Enders Analysis shows publisher visibility today is less than half of 2019 levels, with AI Overviews triggered on a third of queries for major tabloids [2]. Communities that thrived on discoverability face crawling restrictions, bots, and economic barriers. Non-profits like Wikipedia or the Internet Archive survive only on donations. Advertisers, sensing the flow of attention, pour money into AI-native placements inside Google and Microsoft interfaces instead of the publisher pages where those ads once ran. The Financial Times calls this the “Google Zero” moment: AI captures the query and cuts publishers out of the funnel [3][4]. Pew finds that only 8% of users click a traditional result when an AI summary appears, versus 15% when none does [5]. Ahrefs reports a 34.5% decline in click-through rate when AI Overviews are present, echoed by independent studies [6], and Authoritas found CTR losses as high as 79% for some publishers, a drop The Guardian called “devastating” [7]. The open web, long messy but generative, is being sidelined by design.
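To make “crawling restrictions” concrete: a growing number of publishers now ship robots.txt rules that stay visible to classic search while refusing AI-training crawlers. Below is a minimal sketch using the publicly documented user agents GPTBot (OpenAI), CCBot (Common Crawl), and Google-Extended (Google’s AI-training opt-out token). Compliance is voluntary, and blocking Google-Extended does not remove a site from AI Overviews, which are built on the ordinary Googlebot crawl.

```
# Hypothetical robots.txt: remain indexable for classic search,
# opt out of AI-training crawls. Honoring these rules is voluntary.
User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: CCBot             # Common Crawl, a frequent training source
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /

User-agent: *                 # everyone else, including Googlebot
Allow: /
```

The asymmetry is the point: the only lever publishers control is refusal, and every refusal shrinks the open corpus further.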

The results of this are already visible. As more content moves behind paywalls or API restrictions, AI summarizers lean on a thinner base of open material. That accelerates a dangerous loop: models recycling the outputs of other models, stripping nuance, repeating biases, and hallucinating detail to fill gaps. The appearance of authority grows even as the quality of substance declines. Review ecosystems, already corrupted by fake or incentivized entries, are further undermined by AI-generated noise. Summaries of “thousands of reviews” collapse into statistical sludge. Trust shifts away from the anonymous crowd toward branded critics, private communities, or sources that can guarantee provenance.
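The recycling loop can be shown in miniature. What follows is a toy sketch, not the methodology of any study cited here: treat a “model” as something that learns a distribution and then, like a low-temperature sampler, over-produces typical outputs and under-produces the tails; each generation then trains on the previous generation’s output.

```python
# Toy model-collapse loop: fit a distribution, sample from it while
# favoring "typical" outputs, retrain on those samples, repeat.
# A sketch under the stated assumptions, not any cited study's method.
import random
import statistics

random.seed(0)

# Generation 0: "human" data with a broad spread.
data = [random.gauss(0.0, 1.0) for _ in range(10_000)]

for gen in range(8):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    print(f"gen {gen}: stdev = {sigma:.3f}")

    # The "model" reproduces what it learned, but rejects draws far
    # from the mean (the under-sampled tails). The next generation
    # is trained entirely on this filtered output.
    samples = []
    while len(samples) < 10_000:
        x = random.gauss(mu, sigma)
        if abs(x - mu) < 1.8 * sigma:
            samples.append(x)
    data = samples
```

Run it and the measured spread decays by a roughly constant factor each generation, ending near a quarter of its starting value, while the mean barely moves. That is the statistical shape of “stripping nuance”: rare perspectives disappear first, and the average answer looks unchanged.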

This is where the illusion of super-intelligence does its work. Large models are fluent; they sound confident, polished, and neutral. Users without subject-matter expertise rarely notice the missing edge cases or subtle biases. They see coherence and mistake it for truth. Complex debates are flattened into neat paragraphs that erase disagreement. The danger is not only error but the consolidation of interpretation. What appears to be knowledge is often just a convenient compression of skewed inputs.

Julian Assange once warned that ignorance can be manufactured: not by withholding facts, but by shaping how they are framed, how they circulate, and what is easy to find. That warning resonates more sharply now. Ignorance in the AI-mediated ecosystem is not the absence of information but the byproduct of convenience. People have always chosen ease over structure, the smooth feed over the messy archive. If an AI box delivers an answer instantly, few will click through to check. Correctness, diversity, and decentralization are sacrificed to frictionless design.

The deeper shift is not just in search, but in control of the entry point. Whoever owns the starting application shapes the entire horizon of user knowledge. Google controls Chrome and Search; Microsoft pushes Bing and Copilot into Windows; Apple may direct Safari traffic in the future, and could embed its own assistant more deeply into iOS. Even if regulators force divestitures, defaults are powerful. Without control over browsers or operating systems, newcomers will remain niche. Meta pursues a different path, building AI companions that act as “friends”, designed to mediate social, emotional, and lifestyle interactions. Whether through search UIs or chat companions, the battle is over where intent originates. The company that controls the entry point controls the framing of the world.

Left unchecked, this trajectory leads toward concentration. A handful of firms will define the bulk of what most users “know”. They will license some content, gate other material, and blend everything into AI-generated summaries with ads seamlessly embedded. The surface will feel abundant, with answers everywhere, always ready, but the underlying base of primary reporting and diverse perspectives will wither. Independent journalism, already fragile, risks further collapse as its visibility and ad revenue shrink. Crowdsourced communities face bot floods, while open projects struggle for sustainability. The informational commons becomes thin, while the monopolies become thick.

And yet, the antidote cannot rely on regulation. Laws tend to entrench incumbents. GDPR strengthened the giants who could afford compliance. Copyright enforcement favors established publishers. The idea that state regulation will restore openness is a mirage. If there is to be resilience, it must be built elsewhere.

The path forward lies in parallel structures that can survive even if they never become mainstream. They need to be self-sustaining, anchored in communities, and convenient enough to use without friction. Wikipedia, sustained through volunteer contributions and donations, shows how knowledge systems outside commercial funnels can persist.

These are not solutions for the majority. Most people will accept the convenience trap, just as they accepted algorithmic feeds on social platforms or Amazon’s centralization of commerce. But the existence of resilient enclaves matters. They provide alternatives when centralized platforms falter, they preserve the continuity of knowledge, and they sustain communities that still value accuracy and independence. The goal is not to replace Google, but to ensure that truth, provenance, and unmanipulated discourse still exist outside its walls.

The information ecosystem is being reshaped in real time. What we build now will decide whether the next decade belongs entirely to AI monopolies, or whether there remain high-value spaces where knowledge is preserved, bias is visible, and discourse is not bent to engagement loops or corporate interests. Convenience will always have gravity. The antidote is not to fight that directly, but to construct systems that are resilient, transparent, and unowned, systems that endure even as the mainstream drifts toward centralization.

That is the work ahead.


References

1. Digiday — Google AI Overviews linked to 25% drop in publisher referral traffic.
2. Enders Analysis — Publishers’ invisibility problem: Organic traffic under pressure.
3. Financial Times — Publishers race to counter “Google Zero” threat as AI changes search engines.
4. Wired — With AI Mode, Google Search Is About to Get Even Chattier.
5. Pew Research Center — Google users less likely to click links when AI summaries appear.
6. Search Engine Land — Google AI Overviews hurt click-through rates, Ahrefs/Amsive data.
7. The Guardian — AI summaries causing devastating drop in news audiences.