Cloud company says AI crawlers now account for 80% of AI bot traffic, threatening server stability

Fastly’s new report reveals that AI crawlers are overwhelming websites, accounting for a staggering 80 percent of all AI bot traffic, with real-time fetchers making up the rest; one bot was recorded hitting sites with over 39,000 requests per minute. The surge is primarily driven by Meta (52% of crawler traffic) and OpenAI (98% of fetcher traffic), creating unsustainable server loads that threaten website performance and the business models of content creators.

What you should know: AI bots are fundamentally reshaping internet traffic patterns: crawlers scrape content for training data while fetchers retrieve pages to answer user queries in real time, and both create new operational challenges.

  • Fastly, a cloud services company, analyzed data from over 130,000 applications and APIs, inspecting more than 6.5 trillion requests monthly, providing comprehensive visibility into AI bot behavior.
  • Meta dominates AI crawler traffic at 52 percent, followed by Google (23%) and OpenAI (20%), with these three companies controlling 95 percent of all AI crawler activity.
  • OpenAI overwhelmingly leads AI fetcher traffic at nearly 98 percent, a figure that reflects either market dominance from ChatGPT’s early consumer adoption or a need for infrastructure optimization.

The traffic breakdown: Different AI companies show vastly different usage patterns between crawling for training data versus real-time information fetching.

  • Anthropic, an AI company, accounts for just 3.76 percent of crawler traffic, while the Common Crawl Project represents only 0.21 percent despite its mission of crawling the web once and sharing the data openly so others don’t have to re-crawl it.
  • Perplexity AI, recently accused of ignoring robots.txt directives, accounts for 1.12 percent of crawler traffic and 1.53 percent of fetcher traffic, though this is growing.
  • AI fetchers, while representing only 20 percent of total AI bot requests, can generate massive traffic spikes with one bot recorded making over 39,000 requests per minute.

Why this matters: The unsustainable growth threatens website infrastructure and content creator economics while undermining the very sources AI companies depend on for data.

  • “Some AI bots, if not carefully engineered, can inadvertently impose an unsustainable load on webservers, leading to performance degradation, service disruption, and increased operational costs,” Fastly’s report warned.
  • Small site operators serving dynamic content are most severely affected, facing operational challenges that could force them offline or behind paywalls.
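For operators facing that kind of load, the standard mitigation is per-client rate limiting. Below is a minimal sketch of a token-bucket limiter that gives suspected AI bots a much smaller request budget than ordinary clients; the user-agent markers and the specific rate/capacity numbers are illustrative assumptions, not recommendations from Fastly’s report.

```python
import time

class TokenBucket:
    """Allow `rate` requests per second, with bursts up to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last request.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical user-agent substrings used to identify AI bots.
AI_BOT_MARKERS = ("GPTBot", "ClaudeBot", "meta-externalagent", "PerplexityBot")

buckets: dict[str, TokenBucket] = {}

def should_serve(client_ip: str, user_agent: str) -> bool:
    is_bot = any(marker in user_agent for marker in AI_BOT_MARKERS)
    key = ("bot:" if is_bot else "") + client_ip
    if key not in buckets:
        # Suspected AI bots get a far smaller budget than regular clients.
        buckets[key] = (TokenBucket(rate=1, capacity=5) if is_bot
                        else TokenBucket(rate=50, capacity=100))
    return buckets[key].allow()
```

In practice this logic usually lives in a reverse proxy or CDN edge rule rather than application code, but the principle is the same: identify the client, meter its requests, and reject the overflow.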

Industry pushback emerges: Website operators are increasingly deploying active countermeasures as polite opt-out mechanisms like robots.txt are frequently ignored.

  • Defensive tools like proof-of-work system Anubis and tarpit system Nepenthes are gaining adoption to make scraping computationally expensive for AI companies.
  • Cloudflare, a web infrastructure company, is testing a pay-per-crawl approach to create financial barriers for bot operators, while webmasters implement increasingly sophisticated blocking techniques.
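For operators who still want to start with the polite route, opting out of the major AI crawlers via robots.txt looks like the following. The user-agent tokens shown are the publicly documented crawler names for OpenAI, Anthropic, Meta, Perplexity, and Google’s AI-training crawler, but as the article notes, compliance is voluntary:

```text
# robots.txt — opt out of AI crawlers (honored only by
# well-behaved bots; enforcement requires server-side blocking)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```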

What they’re saying: Experts emphasize the need for industry standards and responsible crawling practices while warning against premature regulation.

  • “At a minimum, any reputable AI company today should be honoring robots.txt. Further and even more critically, they should publish their IP address ranges and their bots should use unique names,” Fastly’s Arun Kumar told The Register.
  • Anubis developer Xe Iaso, CEO of Techaro, offered a stark perspective: “I can only see one thing causing this to stop: the AI bubble popping. There is simply too much hype to give people worse versions of documents, emails, and websites otherwise.”

The regulatory question: While technical solutions proliferate, some experts argue only government intervention can address the fundamental problem.

  • “This is a regulatory issue. The thing that needs to happen is that governments need to step in and give these AI companies that are destroying the digital common good existentially threatening fines,” Iaso said.
  • Kumar advocates for industry-led solutions first: “Mandating technical standards in regulatory frameworks often does not produce a good outcome and shouldn’t be our first resort.”

Looking ahead: Fastly expects fetcher traffic to accelerate as AI tools become more widely adopted and agentic systems that mediate between users and websites proliferate, potentially exacerbating current infrastructure strain.

