Bot traffic is any non-human traffic that visits a website. The term is usually viewed negatively, but whether a bot is good or bad ultimately depends on what it is trying to achieve. With so much misunderstanding around bot traffic, we’re taking a look at the different bots involved and what they mean for your website.
In a world where billions of users interact with each other online every single day, the internet can seem like a hectic place. With users liking pictures, retweeting messages, and upvoting comments, the amount of daily web traffic on the internet is at an all-time high.
But just how many of these visitors are actually real?
In recent years, there has been a massive surge in bot traffic thanks to improvements in artificial intelligence and automated services.
With more and more bots being launched onto the internet every single day, is this a good thing for website owners and users, or just another annoyance?
To fully understand what bot traffic is, we must first explore the different types of automated bots out there and what they do.
What Is Bot Traffic?
Bot traffic can be defined as any internet traffic that is not generated by a human. This usually means the traffic comes from some kind of automated script or computer program made to save a user the time of doing tasks manually. Although these bots try to mimic human behavior, they are most certainly not human.
These automated bots can do simple things like clicking links and downloading images, or complicated jobs such as scraping websites or filling out forms. Whatever they are made to do, they usually do it at a large scale and run almost non-stop. If you’ve ever posted an image on social media like Instagram or Facebook and received hundreds of likes in seconds, then those likes most likely came from bots.
With bots estimated to make up over 50% of all internet traffic, it’s clear that they can be found almost everywhere and on virtually every website.
To give you an idea of the different types of bots out there, here’s a quick breakdown of the good bots and what they do, as well as the bad bots.
The Good Bot Traffic
Although automated bot traffic gets quite a negative reputation from webmasters, there is in fact a range of legitimate bots out there that are only trying to help.
Search Engine Bots
The first and most obvious kind of good bot traffic has to be search engine bots. These internet bots crawl as much of the web as they can and help website owners get their sites listed on search engines such as Google, Yahoo, and Bing. Their requests might be automated and logged as bot traffic, but these bots are certainly good bots.
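Good crawlers like Googlebot announce themselves in the user agent string of each request. As a minimal sketch (the crawler names below are real, but the function and threshold logic are illustrative, and real verification should also use reverse DNS since user agents are easily spoofed), you could flag search engine bots in your server logs like this:

```python
# Minimal sketch: flag requests whose user agent matches known search
# engine crawlers. User agents are trivially spoofed, so production
# checks should also verify the crawler via reverse DNS lookup.
KNOWN_CRAWLERS = ("Googlebot", "Bingbot", "Slurp")  # Slurp is Yahoo's crawler

def is_search_engine_bot(user_agent: str) -> bool:
    return any(name.lower() in user_agent.lower() for name in KNOWN_CRAWLERS)

print(is_search_engine_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
print(is_search_engine_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```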
Website Monitoring Bots
If you own a website, making sure your site is healthy and always online is often a top priority. To help with this, there is a range of website monitoring bots out there that will automatically ping your site to ensure it’s still online. If anything ever breaks, or your website goes offline, you’ll be immediately notified and able to do something about it.
SEO Crawlers
Trying to get your site to number one on search engines can be extremely difficult, especially when you don’t have much data to work with. Luckily, there is a range of software out there that can help improve your SEO efforts by crawling your site and your competitors’ to see what you rank for and how well. Webmasters can then use this data to improve their search visibility and grow their organic web traffic.
Copyright Bots
Ensuring nobody has stolen your images and passed them off as their own can be a challenging task. With so many websites to check continually, the only sensible solution is to have an automated bot do it. These web robots crawl the web scanning for specific images to ensure nobody is using copyrighted content without permission.
The Bad Bot Traffic
Unlike the good bots we just mentioned above, bad bots do genuinely harmful things to your website and can cause a lot of damage if left to roam free. This can be any type of bot attack, from sending fake and spam traffic to something much more disruptive, like ad fraud.
Web Scrapers
Web scrapers are annoying internet bots that scrape websites looking for valuable information such as email addresses and contact details. In other cases, they will steal content and images from websites and use them on their own sites or social media accounts without permission. They don’t benefit anyone apart from the person using them to scrape data.
Spam Bots
If you’ve ever received a bizarre email or blog comment from someone, then the chances are a spam bot left it. These bots love to leave generated messages (that often make no sense) on a website’s blog. They also fill out contact forms on websites and spam owners with promotional messages.
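One common countermeasure against form-filling spam bots is a honeypot field: a form input hidden from human visitors that bots, which auto-fill every field, reveal themselves by completing. A minimal sketch (the field name "website" is a hypothetical choice for illustration):

```python
# Honeypot sketch: the form includes a hidden "website" field (hypothetical
# name) that no human browser displays. A non-empty value means the
# submitter filled in a field a human could never see.
def looks_like_spam_bot(form_data: dict) -> bool:
    return bool(form_data.get("website", "").strip())

print(looks_like_spam_bot({"name": "Ann", "comment": "Nice post!", "website": ""}))  # False
print(looks_like_spam_bot({"name": "x", "comment": "buy now",
                           "website": "http://spam.example"}))  # True
```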
DDoS Bots
One of the oldest and deadliest bad bots out there has to be the DDoS bot. Short for distributed denial of service, these bots are often installed on unsuspecting victims’ PCs and used to target a particular website or server with the aim of bringing it offline.
There have been plenty of reports of DDoS attacks doing severe financial damage to sites that ended up being offline for several days.
Vulnerability Scanners
These bots might look like good bots in a website’s server logs, but that is unfortunately not the case. There is a range of malicious bots out there that will scan millions of sites for vulnerabilities and report them back to their creator. Unlike legitimate scanners that would inform the website owner, these malicious bots are made to report back to one person, who will then most likely sell the information or use it themselves to hack websites.
Click Fraud Bots
Unknown to many, there are plenty of sophisticated bots that produce huge amounts of malicious bot traffic specifically targeting paid ads. Unlike bots that merely generate unwanted website traffic, these bots engage in something known as ad fraud.
Responsible for fraudulently clicking paid ads, this non-human traffic costs advertisers billions of dollars every year and is often disguised as legitimate traffic. Without good bot detection software, this bot activity can cost advertisers a large proportion of their ad budget.
How Can Traffic Bots Be Bad for Websites?
Now that you know about the different types of good and bad bots out there, how can bot traffic harm your site?
The important thing to understand about bots is that most of the scripts and programs are designed to do one job many times over. The creator of the bot obviously wants the job done as fast as possible, but this can bring up many problems for your site.
The biggest problem is that a bot continuously requesting information from your site can slow the whole site down. This means the site will be slow for everyone accessing it, which can cause massive problems if, for example, you run an online store.
Consistent scraping requests can also skew important KPIs and Google Analytics data, such as your bounce rate.
In extreme cases, too much bot traffic can take your entire website offline, which is obviously not good. Thankfully, this only happens in extreme circumstances; most of the time, the effects of bot traffic on your website are much subtler.
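One common server-side defense against bots hammering a site is per-IP rate limiting. As a rough sketch (the class, the limit of 3 requests, and the 1-second window are all illustrative assumptions, not a recommendation):

```python
from collections import defaultdict, deque

# Sketch of a per-IP sliding-window rate limiter. A request is allowed
# only if the IP has made fewer than max_requests requests within the
# last window_seconds. Limits here are illustrative.
class RateLimiter:
    def __init__(self, max_requests=10, window_seconds=1.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip: str, now: float) -> bool:
        q = self.hits[ip]
        while q and now - q[0] >= self.window:  # drop requests outside the window
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: throttle this client
        q.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_seconds=1.0)
results = [limiter.allow("1.2.3.4", t) for t in (0.0, 0.1, 0.2, 0.3)]
print(results)  # [True, True, True, False]
```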
Having lots of bot traffic on your website will usually lead to things such as:
- More page views
- Higher bandwidth usage
- Incorrect Google Analytics data
- Skewed marketing data quality
- Decrease in conversions
- Junk emails
- Longer load times
- Higher server costs
- Increased bounce rate
- Increased strain on data centers
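To see how bot traffic inflates a metric like bounce rate, consider some rough arithmetic (the numbers are purely illustrative):

```python
# Illustrative arithmetic: a site gets 1,000 human sessions with a 40%
# bounce rate, plus 500 bot sessions that all count as bounces (bots
# typically request one page and leave).
human_sessions, human_bounce_rate = 1000, 0.40
bot_sessions = 500  # every bot session registers as a bounce

total_bounces = human_sessions * human_bounce_rate + bot_sessions
reported_rate = total_bounces / (human_sessions + bot_sessions)
print(f"{reported_rate:.0%}")  # 60%
```

The site's true bounce rate is 40%, but the analytics report shows 60%, a distortion large enough to mislead marketing decisions.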
How to Detect Bot Traffic
If you want to check to see if your website is being affected by bot traffic, then the best place to start is Google Analytics.
In Google Analytics, you’ll be able to see all the essential site metrics, such as average time on page, bounce rate, the number of page views and other analytics data. Using this information you can quickly determine if your site’s analytics data has been skewed by bot traffic and to what extent.
Since you can’t see users’ IP addresses in Google Analytics, you’ll have to review these metrics to see whether they make sense. A very low time-on-site metric is a clear indicator that many of your visitors could be bots: it takes an internet bot only a few seconds to crawl a webpage before it leaves and moves on to its next target.
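If you also have access to raw server logs, you can apply the same idea there. A minimal sketch (the session records, field names, and the 2-second threshold are all illustrative assumptions):

```python
# Sketch: flag sessions with implausibly short time on site as likely
# bots. The 2-second threshold and the session records are illustrative.
sessions = [
    {"ip": "203.0.113.5", "seconds_on_site": 1},
    {"ip": "198.51.100.7", "seconds_on_site": 95},
    {"ip": "203.0.113.9", "seconds_on_site": 0},
]

suspected_bots = [s for s in sessions if s["seconds_on_site"] < 2]
print(len(suspected_bots))  # 2
```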
Another place to check in Google Analytics is the referrals section, to make sure you aren’t receiving any referral spam. Many shady companies target other sites with custom bots that spam their own website URL as the referrer.
When webmasters check their referral traffic in Google Analytics, they’ll see the name of the website and be inclined to visit it. As crude as this sounds, it can generate quite a lot of visitors for the spammer’s site (mainly out of curiosity!). It might not sound like these bots are doing harm to your website, but they are skewing all of your metrics, wasting your bandwidth, and clogging up your server in general.
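One simple way to weed out referral spam in your own log analysis is to check each referrer against a blocklist of known spam domains. A hedged sketch (the domains and function are hypothetical examples, not a real blocklist):

```python
from urllib.parse import urlparse

# Sketch: filter sessions whose referrer host is on a blocklist of known
# referral spam domains. The domains below are hypothetical examples.
SPAM_REFERRERS = {"free-traffic.example", "seo-offers.example"}

def is_referral_spam(referrer_url: str) -> bool:
    host = urlparse(referrer_url).hostname or ""
    return host in SPAM_REFERRERS

print(is_referral_spam("http://free-traffic.example/visit"))  # True
print(is_referral_spam("https://www.google.com/search"))      # False
```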
How to Stop Bot Traffic
Filtering bad bot traffic and stopping automated robots from harming your website is entirely possible, but the right solution depends on the type of traffic affecting your site. Remember, not all bot traffic is bad, and blocking bots such as search engine crawlers is really not a good idea!
If your website is prone to being hit by scrapers, vulnerability scanners, and automated traffic bots, then the chances are you’ll want some bot filtering in the form of a firewall or CAPTCHA. One of the easiest ways to get this is to put your website behind CloudFlare, which offers a free plan.
Aside from being a Content Delivery Network (CDN), CloudFlare acts as an application firewall between your website and its users, only letting legitimate visitors through. Suspicious users won’t make it past, which means they can’t waste your bandwidth, ruin your analytics, or steal your content.
Another useful way to block bots is to use your website’s robots.txt file, listing the user agents or names of the known bots you want to keep out. You can learn more about blocking robots in the robots.txt file in this handy guide. Of course, this only works if the bot respects the robots.txt file, which most genuine bots do. If you’re trying to get rid of a pesky bad bot, then the CloudFlare option mentioned above is your best bet.
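As a minimal sketch of what such a robots.txt might look like (the bot name "BadBot" is a hypothetical placeholder, and remember that only well-behaved crawlers honor these rules):

```
# Block a hypothetical misbehaving crawler by its user agent name.
User-agent: BadBot
Disallow: /

# All other crawlers may access the whole site.
User-agent: *
Disallow:
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and an empty `Disallow:` line means nothing is off-limits for that user agent.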
However, if you’re looking to protect your website from other forms of bots such as fraudulent and repetitive clicks on your ads, then you’ll need something else.
Protect Your Ads From Bad Bot Traffic
Anyone who runs pay-per-click ads on Google is exposed to bot traffic. With so many crawlers out there constantly scraping Google and its results, it’s only a matter of time before these bots click on your ads and ruin your analytics data and budget.
PPC Protect is an automated ad fraud detection tool that will identify any click fraud on your pay-per-click ads in real time.
By collecting lots of data from every click, the software will be able to detect when an IP address is suspicious and block that particular user from seeing your ads in the future.
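One simple signal such a detector might use is click frequency per IP. As a hedged sketch (this is not how any particular product works; the threshold of 5 clicks and the IPs are illustrative assumptions):

```python
from collections import Counter

# Sketch of one click fraud signal: an unusually high number of ad
# clicks from a single IP in a short window. The threshold of 5 clicks
# and the example IPs are illustrative assumptions.
clicks = ["203.0.113.5"] * 8 + ["198.51.100.7", "192.0.2.1"]

counts = Counter(clicks)
suspicious_ips = [ip for ip, n in counts.items() if n > 5]
print(suspicious_ips)  # ['203.0.113.5']
```

A real system would combine many such signals (timing patterns, device fingerprints, geography) rather than relying on a raw click count alone.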
This helps combat bot traffic from SEO tools that crawl Google and other search engines looking for PPC ads. With plenty of these tools out there, you’d be surprised at how many times they crawl search results looking for ads and other information.
To protect your ads from the likes of unwanted bot traffic and scrapers, click below to sign up for a free 14-day trial of our service.