I had a few similar scripts floating around this site, but this one is more all-inclusive and better organized. The script analyzes your firewall (or other) access log and blocks particularly active visitors.

I added some extensive (for me, at least) comments to the script, but here’s some explanation anyway. What the script does:

  1. Checks the specified log files for lines matching a regex. The default pattern is for a typical firewall log showing src and dst:port and whether the connection was allowed or dropped.
  2. From there the script grabs the lines matching the destination IPs and ports you specified. You can either specify the destination IP or have it set automatically to your external Internet IP, which may be useful if you don’t have a static IP. You can specify multiple ports, or the script will build the list by running nmap against your server’s external IP (or whatever IP you specified).
  3. From the results, the script excludes private networks. You may need to adjust this if your destination IP is on a private network.
  4. Excludes any IPs listed in your /etc/hosts.allow. You can easily point this at any other IP whitelist file, or simply enter the IPs into the script manually.
  5. Extracts unique source IPs, analyzes their frequency, and selects those that exceed the threshold. The threshold is an entirely arbitrary value that you must deduce from how busy your server is. (A minimal sketch of steps 1 through 5 follows this list.)
  6. Tries to determine the location and owner of each source IP and filters out any that match entries in your second whitelist. This second whitelist may contain information like country, state, city, name of the organization – basically anything that can be generated by the geoiplookup command. This allows you to exclude certain organizations and geographical locations without having to specify a huge number of network ranges.
  7. Blacklists the offending IPs. There is an option to blacklist them only temporarily, for however many days you choose. (A sketch of steps 6 and 7 also follows this list.)
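
As a rough illustration of steps 1 through 5, here is a minimal shell sketch, not the full script. It assumes an iptables-style log with SRC=, DST= and DPT= fields; the log path, port list, threshold, and the use of curl to discover the external IP are placeholders to adapt to your setup.

```bash
#!/bin/bash
# Minimal sketch of steps 1-5, not the actual script. Assumes an iptables-style
# log with SRC=, DST= and DPT= fields; adjust the regex and paths to your format.

LOG="/var/log/firewall.log"        # placeholder log path
DST_IP="$(curl -s ifconfig.me)"    # or hard-code the destination IP if it is static
PORTS="80 443"                     # or build the list with: nmap -p- "$DST_IP"
THRESHOLD=100                      # hits per source IP before it becomes a candidate

# Turn the port list into a regex alternation such as "DPT=(80|443)( |$)"
PORT_RE="DPT=($(echo $PORTS | tr ' ' '|'))( |$)"

grep "DST=${DST_IP} " "$LOG" |                # step 2: only our destination IP
  grep -E "$PORT_RE" |                        # step 2: only the interesting ports
  grep -oE 'SRC=[0-9.]+' | cut -d= -f2 |      # pull out the source IPs
  grep -vE '^(10\.|192\.168\.|172\.(1[6-9]|2[0-9]|3[01])\.)' |              # step 3: drop RFC 1918 sources
  grep -vwFf <(grep -oE '[0-9]{1,3}(\.[0-9]{1,3}){3}' /etc/hosts.allow) |   # step 4: drop whitelisted IPs
  sort | uniq -c | sort -rn |                 # step 5: count hits per source IP
  awk -v t="$THRESHOLD" '$1 > t+0 {print $2}' # keep only IPs above the threshold
```

The last command prints one source IP per line, which is the candidate list the remaining steps work from.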

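Steps 6 and 7 might look roughly like the following. The file names candidates.txt and org_whitelist.txt are hypothetical stand-ins for the script’s own variables, and scheduling an at job is just one way to implement the time-limited blacklist:

```bash
#!/bin/bash
# Sketch of steps 6-7, not the actual script. "candidates.txt" (one suspect IP
# per line) and "org_whitelist.txt" (countries, cities, organization names, one
# per line) are hypothetical file names used for illustration.
BLOCK_DAYS=7

while read -r ip; do
  # geoiplookup prints the country and, with the org/ASN database, the owner
  geo="$(geoiplookup "$ip" 2>/dev/null)"

  # Step 6: skip the IP if its location or owner matches the second whitelist
  # (note: a blank line in the whitelist file would match everything)
  if echo "$geo" | grep -qiFf org_whitelist.txt; then
    echo "whitelisted by geo/org: $ip"
    continue
  fi

  # Step 7: blacklist the IP now...
  iptables -I INPUT -s "$ip" -j DROP
  echo "blacklisted for $BLOCK_DAYS days: $ip"

  # ...and schedule its removal (requires the at daemon to be running)
  echo "iptables -D INPUT -s $ip -j DROP" | at now + "$BLOCK_DAYS" days
done < candidates.txt
```
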
If you’re not interested in having your Web site, FTP server, etc. downloaded over and over again by various “search engines” and “security researchers”, this script can automatically hamper their similarly automated efforts.

The script can be fairly easily modified to use the org_whitelist variable for blacklisting instead of whitelisting. This way you can drop the threshold to the minimum and blacklist based not on the number of hits, but on a specific geographic location or organization name, along the lines of the sketch below.
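
For example, the step-6 test could be inverted roughly like this (again a sketch, not the actual modification):

```bash
# Hypothetical inversion of the step-6 test: a geoiplookup match now means
# "block" rather than "skip", so the whitelist patterns act as a blocklist.
if geoiplookup "$ip" | grep -qiFf org_whitelist.txt; then
  iptables -I INPUT -s "$ip" -j DROP
fi
```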

The script is below and you can also download it from my GitHub repo here.

 
