alphatogo.com and iadn.com have been out there for a while, so my server gets thousands of hits daily from bad bots, hackers, and spiders from search engines I don't want. I have a properly configured robots.txt that keeps out the bots that honor it, but many of them do not. Alpha Anywhere has a feature to block by IP address, but that is of minimal use against this overwhelming tide.
So I finally installed an "IP Blocker Firewall" on my server. The firewall lets me block all known bad IPs from industry-maintained blacklists, and/or by whole country. The "known bad IP" lists are automatically refreshed on a schedule (because the lists change daily). Right now I am blocking by blacklist, by several whole countries, and by a few additional IP ranges based on my own observation.
Blocked hits never get to the Alpha web server and so consume NO Alpha web server resources.
This still leaves my server open to all the search engines I want, and to most visitors I expect to be valid customers. Because the blocking is fairly aggressive right now, I am sure it will occasionally block a legitimate hit -- so the Blocker is keeping a list of all IPs that hit the server (both allowed and blocked). In a month or so I can change tactics and block just the rogue IP ranges that are actually hitting my server, rather than whole countries.
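When the time comes to narrow the blocking from whole countries down to specific rogue ranges, the logged IPs need to be grouped somehow. Here is a minimal sketch of that idea in Python -- it assumes you can export the hit log as a simple list of dotted-quad IP strings (the function name, log format, and threshold are my own illustration, not part of the Blocker product):

```python
from collections import Counter

def top_rogue_ranges(ip_hits, min_hits=100):
    """Group logged hits by /24 prefix and return the ranges over a threshold.

    ip_hits: iterable of dotted-quad IP strings, one per logged hit.
    Returns a list of ("a.b.c.0/24", count) tuples, busiest first.
    """
    # Collapse each IP to its first three octets, i.e. its /24 network.
    prefixes = Counter(".".join(ip.split(".")[:3]) + ".0/24" for ip in ip_hits)
    return [(p, n) for p, n in prefixes.most_common() if n >= min_hits]

# Example: three hits from one range, one stray hit from another.
hits = ["203.0.113.5", "203.0.113.9", "203.0.113.77", "198.51.100.1"]
print(top_rogue_ranges(hits, min_hits=3))  # → [('203.0.113.0/24', 3)]
```

The ranges this surfaces could then be entered into the firewall's block list in place of the country-wide rules.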
Doing this has reduced the hits on my server by those thousands per day.
Note: here are my robots.txt and sitemap.xml files. You should always have these files in your website root for basic SEO. My sitemap.xml is auto-generated every month by a scheduled Xbasic script.
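For readers who haven't set these up before, here is the general shape of each file. These are illustrative only -- the paths, bot name, and dates are placeholders, not my actual files:

```
# robots.txt -- illustrative placeholder paths
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

# Bots that ignore these rules are what the IP Blocker handles.
User-agent: SomeBadBot
Disallow: /

Sitemap: https://alphatogo.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://alphatogo.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Both follow the standard formats (the Robots Exclusion Protocol and the sitemaps.org schema), so any generator script just has to emit one `<url>` entry per page.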