I’ve been paying more attention to my log files lately, and what I’ve found is not surprising, just disheartening: a steady stream of requests from bots and script kiddies aimed at specific pages that have been regularly exploited in the past.
Aside from keeping my platform up to date (Apache, Linux, WordPress, and so on), I’ve been restricting access to these potentially unsafe resources, and I’ve finally put together a script to automate denying compromised hosts access to the site.
The first important bit (again, besides keeping my platform up to date) is restricting access to xmlrpc.php. I did that with a simple .htaccess block:
# BEGIN protect xmlrpc.php
<Files xmlrpc.php>
    order allow,deny
    deny from all
</Files>
# END protect xmlrpc.php
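That’s the old Apache 2.2 syntax, which still works on 2.4 through mod_access_compat; if you’re on a pure 2.4 setup without the compat module loaded, the equivalent block (a minimal sketch) would be:

# BEGIN protect xmlrpc.php
<Files xmlrpc.php>
    Require all denied
</Files>
# END protect xmlrpc.php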
The next thing I added (found with a little help from Google and Stack Overflow) is an IP list restriction. I just put this in every virtual host config:
<VirtualHost>
    ..
    <Directory /www/>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride All
        Order allow,deny
        allow from all
        <RequireAll>
            Require all granted
            Include /path/to/IPList.conf
        </RequireAll>
    </Directory>
</VirtualHost>
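The trick is the <RequireAll> block: every directive inside it has to pass, so Require all granted admits everyone and each Require not ip line in the included file punches a specific host back out. IPList.conf is just one of those lines per banned address, which is exactly the format the script below emits; a sample (with placeholder addresses from the documentation ranges) looks like:

Require not ip 192.0.2.15
Require not ip 198.51.100.7
Require not ip 203.0.113.42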
(I’ve also just realized how limited pasting code into WordPress can be… but luckily I found this awesome nugget: http://hilite.me/)
And the last bit is a script that will search my log files and tell me who is being naughty. It doesn’t update the exclusion list in place and it doesn’t restart Apache; we’ll leave that as an exercise for the reader 😉 (though there’s a rough sketch of that step after the script).
#!/usr/bin/perl
use strict;
use warnings;

my $threshold = 250;   # requests per URL fragment before an IP gets listed
my $days      = 7;     # how far back to search the logs

# Load the current list so we don't add duplicates; each line looks like
# "Require not ip 1.2.3.4" and the IP is the last field.
my %iplist;
open( my $inf, '<', '/path/to/IPList.conf' ) or die "can't read IPList.conf: $!";
while ( my $line = <$inf> ) {
    chomp $line;
    my @fields = split ' ', $line;
    next unless @fields;
    $iplist{ $fields[-1] } = 1;
}
close($inf);

my @URLS = ( 'wp-login', 'xmlrpc' );

# Every access log touched within the last $days days.
my @files = `/usr/bin/find /path/to/apache2/ -name '*access*log*' -mtime -$days`;
my $filelist = join ' ', @files;
$filelist =~ s/\n//g;

print "searching $filelist\n";

foreach my $urlfragment (@URLS) {
    # Top 30 client IPs hitting this URL fragment, with request counts.
    my @list = `/bin/zgrep -h $urlfragment $filelist | /usr/bin/awk '{print \$1}' | /usr/bin/sort | /usr/bin/uniq -c | /usr/bin/sort -rg | /usr/bin/head -n30`;
    foreach my $ipdata (@list) {
        my ( $count, $ip ) = split ' ', $ipdata;   # uniq -c prints "count ip"
        next unless $count > $threshold;
        if ( !$iplist{$ip} ) {
            # new IP to add to the list
            $iplist{$ip} = 2;
            print "adding $ip for requesting $urlfragment $count times over $days days.\n";
        }
        else {
            print "skipping $ip, already in our list.\n";
        }
    }
}

# Write the merged list out for review; installing it is the reader's exercise.
open( my $ouf, '>', '/tmp/IPList.conf' ) or die "can't write /tmp/IPList.conf: $!";
foreach my $ip ( sort keys %iplist ) {
    print $ouf "Require not ip $ip\n" if length($ip) > 3;
}
close($ouf);
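In case it helps, here’s a minimal sketch of that exercise as a shell wrapper, assuming apachectl is on the PATH and using the same placeholder paths as above:

#!/bin/sh
# Back up the live list, install the freshly generated one, and reload
# Apache only if the config still parses.  Paths are placeholders.
cp /path/to/IPList.conf /path/to/IPList.conf.bak
cp /tmp/IPList.conf /path/to/IPList.conf
if apachectl configtest; then
    apachectl graceful
else
    # Bad config: put the old list back and leave Apache alone.
    cp /path/to/IPList.conf.bak /path/to/IPList.conf
fi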