If you study your Google Analytics reports, you may find that a large share of your referral traffic — in bad cases nine visits out of ten — comes from dubious sites. This means spambots are crawling the pages of your online resource. Some may think there is nothing bad about it, but this is an issue worth addressing, and there are several ways to protect your site against these bots.
For effective protection against spambots, you can use the following methods individually or in combination. For example, you can install an anti-spam plugin such as the one by CleanTalk (https://cleantalk.org/email-checker) and set up a CAPTCHA. Let's look into some of the available options.
These include anti-spam plugins, blocking spam commenters by IP, protecting the website admin panel, and reCAPTCHA. When choosing such tools, start from the specifics of the engine your protected resource runs on. Also keep in mind that pre-moderation of comments and anti-spam plugins reduce convenience for legitimate users, so these tools should be used only when the number of spam comments on the site is actually large.
The first option is filtering spambot traffic out of your analytics. This method achieves only one goal: it prevents the distortion of your statistical information. Meanwhile, spammers continue to visit the website, leave comments, increase the load on the server, and try to gain control over the online resource. To exclude spambot activity from your performance reports, create a new filter in Google Analytics. In addition, you can exclude visits from specific cities, regions, and countries.
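Google Analytics filters are configured in the admin interface rather than in code, but the underlying idea is a simple referrer blocklist. As an illustration, here is a sketch that applies the same idea to an exported referral report in CSV form; the column name `referrer` and the blocklisted domains (well-known referral-spam sources) are assumptions for the example.

```python
import csv
import io

# Domains widely reported as sources of referral spam (illustrative blocklist).
SPAM_REFERRERS = {"semalt.com", "buttons-for-website.com", "darodar.com"}

def clean_referral_rows(csv_text: str) -> list[dict]:
    """Drop report rows whose 'referrer' column matches a known spam domain."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if row["referrer"].lower() not in SPAM_REFERRERS]
```

The same blocklist approach works on raw access logs if you match the `Referer` header field instead of a CSV column.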
Blocking by IP is the most radical method of dealing with spambots and other unwanted visitors. Editing the .htaccess file lets you deny access to users who come from certain IP addresses. The main advantage of this method is that it protects the server from excessive load, the site from hacking attempts, visitors from spam, and your statistics from distortion.
But blocking via the .htaccess file has drawbacks. The first thing to consider is that spambots can easily change IP addresses. The second is that you may accidentally block the site for legitimate visitors, for example when a blocked address is shared by many users behind the same proxy.
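As a sketch, blocking a couple of addresses in .htaccess looks like this on Apache 2.4 (on Apache 2.2 the equivalent uses `Order Allow,Deny` with `Deny from` lines). The IP addresses below are documentation placeholders, not real spam sources:

```apache
# Allow everyone except the listed spambot IPs (placeholder addresses).
<RequireAll>
    Require all granted
    Require not ip 203.0.113.15
    Require not ip 198.51.100.0/24
</RequireAll>
```

Note that a single rule can cover a whole subnet, which helps when a spammer rotates through adjacent addresses, but also widens the risk of blocking legitimate visitors.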
Many webmasters faced with spammers take no special measures and limit themselves to pre-moderation of comments. This approach falls short for several reasons:
- First of all, active spambot visits distort your analytical data. Tens or hundreds of spam visits per month spoil behavioral metrics and make referral statistics unusable;
- In addition, massive spambot traffic can overload the server. With one or two visits per day the problem usually goes unnoticed, but once it grows to several dozen per day, real visitors start to see slower page loads, which is a serious problem;
- Beyond leaving comments, some bots probe the website engine for vulnerabilities or attempt to gain access to the server.
All of the above can cause serious problems and even undermine the security of your website. Therefore, it is highly recommended to use dedicated protective software.