Perplexity's Stealth Bots Evade Website Restrictions, Sparking Controversy
Viqus Verdict: 8
What is the Viqus Verdict?
We evaluate each news story based on its real impact versus its media hype to offer a clear and objective perspective.
AI Analysis:
The issue's potential for widespread disruption to web infrastructure, combined with intense media attention, makes this a high-impact, high-hype event – a clear sign of a major conflict brewing in the tech landscape.
Article Summary
Perplexity, the popular AI search engine, is facing serious allegations of circumventing website restrictions using stealth bots. Cloudflare researchers found that after Perplexity's declared crawlers were blocked by standard robots.txt rules and firewalls, an undeclared bot would take over, rotating through multiple IP addresses and ASNs to mask its activity and bypass these protections. This behavior, observed across more than 10,000 domains and millions of requests, violates the norms established in 1994 by Martijn Koster's Robots Exclusion Protocol, which lets website owners declare which bots may access their content. The allegations extend beyond technical violations: publications including Forbes and Wired have accused Perplexity of plagiarism, citing suspicious traffic patterns and manipulated bot ID strings. These issues highlight growing tensions between AI development and the established infrastructure of the web, raising questions about data rights, website accessibility, and the ethical use of AI. Perplexity's refusal to respond to these concerns further fuels the controversy.

Key Points
- Perplexity is allegedly using stealth bots to bypass website restrictions outlined in the Robots Exclusion Protocol.
- Researchers found Perplexity’s bots rotate IP addresses and use different ASNs to evade website blocks.
- The company faces accusations of plagiarism from multiple publications, compounding the ethical concerns.
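To make the Robots Exclusion Protocol mentioned above concrete, here is a minimal sketch of what a compliant crawler does before fetching a page, using Python's standard library. The robots.txt rules and URLs below are hypothetical examples for illustration, not any real site's configuration; "PerplexityBot" is used here simply as a stand-in user-agent name.

```python
# Sketch of Robots Exclusion Protocol compliance checking.
# The rules and URLs are hypothetical, for illustration only.
from urllib.robotparser import RobotFileParser

# Example robots.txt: block one named bot, allow everyone else.
rules = """
User-agent: PerplexityBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler calls can_fetch() before every request.
# The stealth behavior alleged in the article amounts to ignoring
# this answer (or changing user agents) when it comes back False.
print(parser.can_fetch("PerplexityBot", "https://example.com/article"))  # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/article"))   # True
```

The protocol is purely advisory: nothing in robots.txt technically prevents a fetch, which is why sites layer firewalls and bot-detection on top of it, and why a crawler that rotates IPs and ASNs to dodge those layers is seen as breaking the web's social contract rather than any hard access control.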

