The internet is being flooded with billions of visits from AI bots, according to a report.
AI continues to transform the web, and not only in its visible uses. A quieter transformation is taking place behind the scenes: the growth of traffic generated by autonomous agents.
Behind the promises of automation and performance, these systems are already redefining how websites are crawled, indexed, and used.
A recent report published by DataDome highlights a still largely underestimated reality: AI agents are no longer simply assisting users; they are becoming full-fledged actors on the web.

The figures set the tone. In the first two months of 2026, nearly 7.9 billion requests originating from AI agents were recorded, a 5% increase over the end of 2025 that confirms a continued acceleration. In some cases, this traffic already represents nearly 10% of a site's total volume, a proportion that is far from negligible, especially for high-traffic platforms. Among the most active agents are Meta ExternalAgent, ChatGPT-User, and Meta WebIndexer, each with its own crawling behavior.

This growth raises a fundamental question: do all these interactions actually have value? The report notes that some agents contribute to SEO or visibility, while others simply collect data with no direct benefit to the websites involved.
A Visibility and Identity Crisis
Beyond the sheer volume, it is above all the lack of transparency that is worrying: a large portion of this traffic remains difficult to identify precisely. For businesses, distinguishing a legitimate agent from a malicious one is becoming a real operational challenge.
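For crawlers whose operators publish a verification method, one widely documented check is forward-confirmed reverse DNS: resolve the requesting IP to a hostname, confirm the hostname belongs to the operator's domain, then resolve that hostname back and confirm it returns the same IP. Here is a minimal Python sketch; the Googlebot suffixes follow Google's published guidance, while any other entry would need to come from that operator's own documentation:

```python
import socket

# Illustrative mapping only: each operator documents its own
# verification method and domains; consult their docs before relying on it.
TRUSTED_RDNS_SUFFIXES = {
    "Googlebot": (".googlebot.com", ".google.com"),
}

def verify_crawler(claimed_name: str, ip: str) -> bool:
    """Forward-confirmed reverse DNS: IP -> hostname -> IP again."""
    suffixes = TRUSTED_RDNS_SUFFIXES.get(claimed_name)
    if not suffixes:
        return False  # no known rDNS policy published for this agent
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    if not hostname.endswith(suffixes):
        return False  # hostname is outside the operator's domain
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except socket.gaierror:
        return False
    return ip in forward_ips
```

Operators that do not support reverse DNS verification often publish lists of IP address ranges instead, which can be checked in much the same way; a self-declared user-agent string alone proves nothing.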
Impersonation illustrates the problem vividly. Agents presenting themselves as Meta ExternalAgent or ChatGPT-User are regularly imitated to bypass security systems, and in some cases, such as PerplexityBot, fraudulent request rates reach significant levels. This ambiguity weakens traditional filtering strategies, because allowing a bot based solely on its "user-agent" (its technical identifier) can now open the door to disguised attacks. The web is thus entering a phase where the identity of machines is becoming as critical as that of users.

Particularly Exposed Sectors

According to the DataDome report, not all sectors are affected in the same way. E-commerce alone accounts for approximately 20% of this traffic, followed by real estate at 17% and travel at 15%. In these environments, agents can analyze prices, availability, or user behavior at scale.
In addition, the emergence of tools like OpenClaw and agent-based browsers such as ChatGPT Atlas accentuates the phenomenon. Capable of acting autonomously on the web, these tools remain largely unregulated and are significantly underestimated by organizations.
Faced with this new reality, one certainty is emerging: without tools capable of precisely classifying these agents, companies are moving forward blindly in an increasingly automated ecosystem.
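As a first step toward that kind of classification, even a coarse tally of self-declared AI agents in server access logs shows what share of a site's traffic they claim to represent. The sketch below assumes a combined-style log format with the user-agent as the last quoted field, and an illustrative token list; as noted above, self-declared identifiers can be spoofed, so this is a measurement aid, not a security control:

```python
import re
from collections import Counter

# Illustrative tokens for self-declared AI agents; a real deployment
# should maintain this list from each operator's documentation.
AI_AGENT_TOKENS = ("ChatGPT-User", "GPTBot", "PerplexityBot", "meta-externalagent")

# Assumes a combined-style access log where the user-agent
# is the last double-quoted field on each line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def tally_ai_agents(log_lines):
    """Count requests per self-declared AI agent token, plus total lines."""
    counts, total = Counter(), 0
    for line in log_lines:
        total += 1
        match = UA_PATTERN.search(line)
        if not match:
            continue
        user_agent = match.group(1).lower()
        for token in AI_AGENT_TOKENS:
            if token.lower() in user_agent:
                counts[token] += 1
                break
    return counts, total
```

Dividing each count by the total gives the per-agent share of traffic, the kind of figure the report puts at nearly 10% of total volume for some sites.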