
When Bots Outnumber Humans: The New Reality of Monitoring Web Traffic

If you purchase via links on our reader-supported site, we may receive affiliate commissions.

In this post, we speak with Todd Persen about what it means when bots outnumber humans, and the new reality of monitoring web traffic.

Automated traffic now represents a substantial share of activity moving across the internet. According to the Imperva 2025 Bad Bot Report, automated programs accounted for 51 percent of all web traffic, meaning non-human activity now exceeds human traffic online.

Some of that automation serves legitimate purposes. Search engines continuously index pages, uptime monitoring services check system availability, and partner integrations rely on automated requests to move data between platforms. Alongside those systems operate scrapers, credential testing tools, inventory trackers, and other automated programs interacting with digital infrastructure every second.

For engineering teams responsible for performance, reliability, and accurate analytics, the challenge goes beyond blocking malicious bots. The real task is understanding the scale and behavior of automated traffic moving through their systems.

Todd Persen, Chief Technology Officer at Hydrolix, works at the intersection of large scale data infrastructure and real time data analytics. As organizations collect increasingly detailed telemetry from their applications, Persen says monitoring automated activity has become essential for engineering, security, and analytics teams.

In this conversation, Persen explains how bot traffic complicates infrastructure planning and interferes with analytics, and why modern observability practices must account for automation as a normal part of internet traffic.

Interview With Todd Persen

Q: Bots now account for a large share of internet traffic. From an infrastructure perspective, why has bot monitoring become essential for modern platforms?

Todd Persen:
Automated traffic represents a significant portion of requests hitting most internet facing systems. Some of that activity is legitimate. Search engines indexing content, uptime monitoring services, and partner integrations all rely on automation. At the same time, there are bots scraping data, testing login credentials, monitoring prices, or scanning applications.

For engineers, the challenge is visibility. When automated traffic blends with human activity, it becomes difficult to interpret infrastructure signals. Request volume, latency, and scaling thresholds can all appear misleading. Monitoring bots allows engineering teams to understand which systems are interacting with their platforms and how that activity affects performance.

Q: Many organizations still treat bots primarily as a cybersecurity issue. Why is that view incomplete?

Todd Persen:
Security teams care about bots because automated tools are often used for reconnaissance and attack attempts. But the impact extends beyond security.

Bots influence analytics, infrastructure utilization, and product insights. For example, automated systems may repeatedly check product pricing or inventory. Without monitoring that behavior, the activity can appear as real customer interest. Marketing teams might see engagement spikes that are actually generated by crawlers rather than people.

Separating automated traffic from genuine user activity is essential if organizations want reliable operational data.

Q: How can bot traffic distort analytics and business intelligence when it is not identified?

Todd Persen:
Bots introduce noise into datasets that measure user behavior. Page views, click paths, and product searches can all be inflated by automated systems. When that traffic is mixed with legitimate sessions, the signals organizations rely on become unreliable.

Teams may believe a campaign is performing well when automated crawlers are responsible for much of the activity. Product teams might draw conclusions about user interest that do not reflect real behavior. Monitoring automated traffic helps restore accuracy to those datasets.
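A common first pass at the separation Persen describes is filtering out sessions whose user-agent string matches known crawler signatures before computing engagement metrics. The sketch below illustrates the idea; the session records, field names, and signature list are hypothetical, and real bot filtering would combine this with behavioral signals, since sophisticated bots spoof browser user-agents.

```python
import re

# Hypothetical session records; field names and values are illustrative only.
sessions = [
    {"ua": "Mozilla/5.0 (Windows NT 10.0)", "page_views": 3},
    {"ua": "Googlebot/2.1 (+http://www.google.com/bot.html)", "page_views": 40},
    {"ua": "python-requests/2.31.0", "page_views": 120},
    {"ua": "Mozilla/5.0 (Macintosh; Intel Mac OS X)", "page_views": 5},
]

# Substrings commonly seen in crawler and scripting-tool user-agent strings.
BOT_PATTERN = re.compile(r"bot|crawl|spider|requests|curl", re.IGNORECASE)

# Keep only sessions that do not match a known automation signature.
human = [s for s in sessions if not BOT_PATTERN.search(s["ua"])]

raw_views = sum(s["page_views"] for s in sessions)
human_views = sum(s["page_views"] for s in human)
print(f"raw page views: {raw_views}, human page views: {human_views}")
```

In this toy dataset, two automated clients inflate the page-view count by an order of magnitude, which is exactly the kind of distortion that makes a campaign look more successful than it is.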

Q: What technical barriers make bot monitoring difficult for many organizations?

Todd Persen:
The biggest obstacle is data scale. Detecting automated patterns requires analyzing request level telemetry across large systems. Engineers need visibility into individual requests, IP activity, request timing, and endpoint targeting.

Many traditional logging platforms force teams to sample their data because storing and querying large volumes of logs is expensive. When data is sampled, the patterns that reveal automated behavior disappear. Systems built for high volume log analytics, including Hydrolix, allow teams to retain far more telemetry and explore it quickly. That visibility helps engineers identify patterns that indicate automated activity.

Q: What signals tend to reveal sophisticated bots attempting to mimic human users?

Todd Persen:
Behavior usually provides the clearest indicators. Bots often generate extremely consistent request timing or navigate applications in patterns that humans rarely follow. Repeated requests to specific endpoints at high frequency can also signal automation.

Another indicator is coordinated activity. Bot networks may rotate across thousands of IP addresses while performing the same sequence of actions. When engineers analyze request behavior across large datasets, those coordinated patterns become visible even when bots attempt to hide their identity.
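One way to operationalize the timing signal Persen mentions is to measure how regular a client's inter-request gaps are: scripted clients often fire on a near-fixed schedule, while humans pause unpredictably. This is a minimal sketch of that idea, not a production detector; the coefficient-of-variation threshold is an assumed, illustrative value that would need tuning against real traffic.

```python
import statistics

def looks_automated(timestamps, cv_threshold=0.1):
    """Flag a client whose inter-request intervals are suspiciously regular.

    Computes the coefficient of variation (stdev / mean) of the gaps
    between consecutive request timestamps. A very low value means the
    cadence is nearly constant, a behavioral hint of automation.
    The 0.1 threshold is illustrative only.
    """
    if len(timestamps) < 3:
        return False  # not enough requests to judge
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = statistics.mean(gaps)
    if mean == 0:
        return True  # zero-spacing bursts are almost certainly scripted
    return statistics.stdev(gaps) / mean < cv_threshold

# A scripted client hitting an endpoint almost exactly every 2 seconds:
bot_times = [0.0, 2.0, 4.01, 6.0, 8.02, 10.0]
# A person browsing with irregular pauses:
human_times = [0.0, 3.7, 4.2, 11.9, 13.1, 25.0]

print(looks_automated(bot_times))    # regular cadence
print(looks_automated(human_times))  # irregular cadence
```

Detecting the coordinated activity Persen describes works the same way at a larger scale: compute behavioral features like this per client, then look for many clients across different IP ranges sharing nearly identical profiles.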

Q: How does bot monitoring intersect with security operations?

Todd Persen:
Automated activity frequently appears during the early stages of an attack. Before attempting exploitation, attackers often use bots to scan endpoints, test authentication workflows, or attempt credential reuse.

Monitoring request behavior allows teams to detect that activity early. Unusual authentication attempts or repeated endpoint probing can signal that a system is being tested. Observability data helps engineers understand what the traffic is doing and respond before the situation escalates.
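A simple example of the early-warning signal described above is counting failed authentication attempts per source IP and flagging anything above a threshold, which is one way credential-testing bots surface in request telemetry. The log records, IP addresses, and threshold below are all hypothetical; real detections would also consider time windows and distributed (many-IP, low-rate) attempts.

```python
from collections import Counter

# Hypothetical auth log entries as (source_ip, outcome) pairs.
auth_events = [
    ("10.0.0.5", "fail"), ("10.0.0.5", "fail"), ("10.0.0.5", "fail"),
    ("10.0.0.5", "fail"), ("10.0.0.5", "fail"), ("10.0.0.5", "fail"),
    ("192.168.1.20", "fail"), ("192.168.1.20", "ok"),
    ("172.16.0.9", "ok"),
]

FAIL_THRESHOLD = 5  # illustrative cut-off; tune per environment

# Count failures per source IP and flag the outliers.
failures = Counter(ip for ip, outcome in auth_events if outcome == "fail")
suspects = [ip for ip, n in failures.items() if n >= FAIL_THRESHOLD]
print(suspects)
```

A single failed login followed by a success looks like a person mistyping a password; dozens of failures from one source in a short span looks like automated credential reuse, and that distinction is visible only when request-level telemetry is retained.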

Q: Looking ahead, how will AI driven automation change automated traffic on the internet?

Todd Persen:
AI will make automated traffic far more sophisticated. Bots already simulate browsing behavior, rotate infrastructure, and adapt quickly to detection methods. As AI tools advance, automation will become better at blending into normal traffic patterns.

That shift makes behavioral analysis increasingly important. Organizations will need to examine how traffic moves through their systems rather than relying on simple identifiers. Engineers who can analyze request behavior at scale will have a much clearer understanding of which systems are interacting with their platforms and why.



About the Author:


Meet Angela Daniel, an esteemed cybersecurity expert and the Associate Editor at SecureBlitz. With a profound understanding of the digital security landscape, Angela is dedicated to sharing her wealth of knowledge with readers. Her insightful articles delve into the intricacies of cybersecurity, offering a beacon of understanding in the ever-evolving realm of online safety.

Angela's expertise is grounded in a passion for staying at the forefront of emerging threats and protective measures. Her commitment to empowering individuals and organizations with the tools and insights to safeguard their digital presence is unwavering.
