How I battled a DDoS attack like a complete idiot

Recently, one of my bigger projects once again found itself the target of a DDoS attack, repeatedly, with only a few hours of breathing room in between, over a stretch of more than two weeks. It’s a MediaWiki that I host on Cloudways, without elaborate load balancing, WAF or CDN shenanigans, because it’s not a “high risk” project (no competition; it’s not even a business, not political, etc.).

A quick manual analysis of IP traffic was sobering: countless different IP addresses from all over the globe, and constantly changing. There was no way I could block my way out of this attack through the firewall. Not only that, rate limiting or auto-blacklisting wouldn’t help me either, because analysis of hits over certain timespans showed that while it was a super high volume of hits, there weren’t enough hits from the same IP within any given timeframe to trigger a blocking mechanism. I struggled to find duplicate entries at all; this attack went through thousands of IP addresses and hardly used any of them more than once, maybe twice at the most. This, in addition to calling up seemingly harmless application-layer URLs (which is what constitutes a layer 7 attack), made this a conundrum.
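
For reference, the kind of sanity check I ran over the access log looked roughly like this. This is a minimal sketch, not the exact script I used; it assumes Apache’s combined log format, and the log path is a placeholder:

# count_ips.py - tally hits per client IP per 10-minute window
import re
from collections import Counter
from datetime import datetime

LOG = "/var/log/apache2/access.log"   # placeholder path, adjust to your setup
BUCKET_MIN = 10                       # window size in minutes

line_start = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')   # client IP and timestamp
counts = Counter()

with open(LOG, encoding="utf-8", errors="replace") as f:
    for line in f:
        m = line_start.match(line)
        if not m:
            continue
        ip, ts = m.groups()
        # "26/Apr/2025:14:06:09 +0000" -> drop the timezone, round down to the window
        dt = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
        bucket = dt.replace(minute=dt.minute - dt.minute % BUCKET_MIN, second=0)
        counts[(bucket, ip)] += 1

# heaviest (time window, IP) pairs; in my logs hardly any IP showed up more than once or twice
for (bucket, ip), n in counts.most_common(15):
    print(f"{bucket:%d/%b %H:%M}  {ip:15}  {n}")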

I went ahead and blocked a bit anyway, but to no avail. I also went ahead and blocked all of Russia and North Korea, but that was more to feel better about actually doing something. I was looking for something to do that would make a difference, as the price point of going all in on Cloudflare put me off (not to mention the dependency aspect).

What I believe helped in tackling the problem was understanding the types of URLs requested. I was sure there had to be something I could do without jumping the gun on a Cloudflare subscription.
After all, this was an attack with HTTP GET floods, and the interesting thing was: most of these hits, the absolute majority, were to MediaWiki URLs that no human user would ever use, let alone know of, and certainly not in that volume or frequency.
Here’s an example:

GET /index.php?from=20250426140609&hideminor=1&limit=50&target=some-page%3AY&title=Special%3ARecentChangesLinked HTTP/1.0" 404 0 "-" "Opera/9.36.(Windows NT 6.0; kw-GB) Presto/2.9.176 Version/11.00 

This is a query that can’t be served from the cache, one that triggers a resource-heavy database query and then a complicated rendering process. And it’s different pages and time frames all the time. In short, whatever this botnet was using, there was knowledge of MediaWiki functionality and how to call it via URL parameters, aware of the potential CPU load this would cause. And load it caused. The example above triggers a 404, but plenty of variations of this query got a 200 and resulted in legitimate page generation from these database calls:

GET /index.php?from=20250423182214&fromFormatted=18%3A22%2C+23+April+2025&limit=100&target=some-page&title=Special%3ARecentChangesLinked HTTP/1.1" 200 0 "-" "Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_7_9 rv:6.0; nr-ZA) AppleWebKit/534.40.4 (KHTML, like Gecko) Version/5.0.4 Safari/534.40.4 
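
Tallying which MediaWiki page the title parameter was actually asking for made the pattern hard to miss. Again a minimal sketch, assuming the combined log format and a placeholder log path:

# count_titles.py - tally the MediaWiki "title" query parameter across GET requests
from collections import Counter
from urllib.parse import urlsplit, parse_qs

LOG = "/var/log/apache2/access.log"   # placeholder path, adjust to your setup
titles = Counter()

with open(LOG, encoding="utf-8", errors="replace") as f:
    for line in f:
        try:
            request = line.split('"')[1]          # e.g. 'GET /index.php?... HTTP/1.1'
            method, url, _ = request.split(" ", 2)
        except (IndexError, ValueError):
            continue
        if method != "GET":
            continue
        query = parse_qs(urlsplit(url).query)
        title = query.get("title", ["(no title)"])[0]
        titles[title] += 1

# in my logs, Special:RecentChangesLinked topped this list by a wide margin
for title, n in titles.most_common(10):
    print(f"{n:8}  {title}")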

So what then? I couldn’t block IP addresses. I had already tried limiting Special:RecentChangesLinked (a special page in the wiki that the bots were calling not like a normal user would, but via a direct URL parameter; Extension:Lockdown and Extension:DisableSpecialPages can accomplish this) to logged-in users only, but since that takes effect a couple of steps too late in the process, it wouldn’t take much strain off the server.
I had to find a solution to fend this off at the server (i.e. Apache) level, not serving these requests at all, before any database calls or PHP could spring into action. So I did the following:
I used an .htaccess rewrite condition to forbid any request that has RecentChangesLinked in it (please don’t do that if use of your website requires it, of course, but in my case I was sure this is never used).

# Refuse any request whose query string mentions RecentChangesLinked (case-insensitive)
RewriteCond %{QUERY_STRING} RecentChangesLinked [NC]
RewriteRule "^.*$" - [F]

It’s a bit of a blunt approach, but hey… so far, so good. I hope this helps others as well.
Now, the hits are still coming in, but the load has lightened and the requests are getting 403 (Forbidden) responses from the server.

GET /index.php?returnto=Special%3ARecentChangesLinked&returntoquery=days%3D30%26from%3D20250428103517%26limit%3D250%26target=some-page&title=Special%3AUserLogin HTTP/1.1" 403 451 "-" "Opera/8.76.(X11; Linux x86_64; sa-IN) Presto/2.9.177 Version/10.00
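
If you want to double-check from the outside that the rule actually fires, a quick request does the trick. A minimal sketch; the wiki hostname is a placeholder:

# check_block.py - confirm that a RecentChangesLinked request now gets rejected
import urllib.error
import urllib.request

# placeholder hostname; point this at your own wiki
URL = ("https://wiki.example.com/index.php"
       "?title=Special%3ARecentChangesLinked&target=Main_Page&limit=50")

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print("unexpected success:", resp.status)   # the rule is not matching
except urllib.error.HTTPError as e:
    print("server answered with", e.code)           # 403 means the rewrite rule fired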

I am not an IT security professional. This was a trial-and-error process by someone with little more than Google/Stack Overflow knowledge, so there is no guarantee it’ll work for you. The strain on my server subsided once this change took effect. I can’t prove it was exactly this and not one of the other tricks I applied, but it was when I tried this that the situation improved (from 0% idle CPU, an unresponsive server and all RAM used, to about 70%-80% idle CPU, which is normal for my application on an average weekday), and it has held up for a few days since, fingers crossed!

The rub: this does not stop the attacks from reaching the server (Cloudflare would do that, as it works as a CDN and on the DNS level). But while Apache still receives each request, it can deny it right away based on a matching string in the URL, so no resource on the LAMP stack needs to work on it; the request is not processed (no SQL query, no PHP, etc.) but results directly in a Forbidden error page.

Thanks to Cloudways (if you want to try it, use this link; I will earn a commission) for providing an intuitive web interface for me to sift through access logs, tune the firewall and change config settings. Their baseline security is already solid, but if you’re under a DDoS attack, you’re left to your own wits :)

Sebastian

Thanks for reading my personal blog. Click "About Me" in the navigation bar for more information. I welcome your comments!
