top | item 10158529

Biggest image in the smallest space

359 points | fekberg | 10 years ago | bamsoftware.com

103 comments

[+] michaelmior|10 years ago|reply
Actually, it decompresses to a 5.8MB PNG. However, many graphics programs may choose to use three bytes per pixel when rendering the image and because it has incredibly large dimensions, this representation would take up 141GB of RAM.
[+] userbinator|10 years ago|reply
Better graphics programs will not attempt to put the whole image into RAM, but only decompress the pieces needed for processing it.

I remember working with multi-megapixel images on systems with far less than 1MB of RAM, many years ago. Perhaps this is a good example of how more hardware resources can lead to waste: RAM has grown so much that most images fit completely in it, so programmers assume they can load any image whole without a second thought, when often all that's needed is a tiny subset of the data.

Even if the image data is compressed, there's absolutely no need to keep all of it in memory - just decompress incrementally into a small, fixed-size buffer until you get to the "plaintext" position desired, ignoring everything before that. The fact that it's compressed also means that, with suitable algorithms, you can skip over huge spans at once - this is particularly easy to do with RLE and LZ - and the compression ratio actually boosts the speed of seeking to a specific position.

Currently (hopefully...) no application is attempting to read entire video files into memory before processing them, but I wonder if that might change in the future as RAM becomes even bigger, and we'll start to get "video decompression bombs" instead?
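
As a concrete sketch of the seek-and-skip approach, using Python's zlib streaming interface (the function name and structure here are illustrative, not from any particular library): decompress in bounded chunks, discard everything before the target offset, and stop once you have the bytes you want, so memory stays constant no matter how large the expanded data is.

```python
import zlib

def read_at_offset(compressed, offset, length, chunk=64 * 1024):
    """Return `length` decompressed bytes starting at `offset`,
    never holding more than about `chunk` bytes of output at a time."""
    d = zlib.decompressobj()
    pos = 0                      # decompressed bytes seen so far
    out = b""
    buf = compressed
    while buf:
        piece = d.decompress(buf, chunk)   # emit at most `chunk` bytes
        buf = d.unconsumed_tail            # input not yet processed
        if not piece:
            break
        end = pos + len(piece)
        if end > offset:                   # piece overlaps the wanted window
            lo = max(0, offset - pos)
            hi = min(len(piece), offset + length - pos)
            out += piece[lo:hi]
        pos = end
        if pos >= offset + length:         # got everything we need; stop early
            break
    return out
```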

[+] jerf|10 years ago|reply
One of the rules of secure programming is that any program that is used in an even remotely security-sensitive context, and anything displaying a Portable Network Graphic is likely to be used in such a context, must be able to specify resource usage limits. In this case that could be dimensions or a limit on the total RAM allowed to be used. Limits need not be hard, either, but could produce a query, for instance, the way very long-running scripts in the browser ask you if they should continue.

Now, go find an API/library for dealing with PNGs that allows you to pass in such a limit, let alone pass in a callback for dealing with violations. Go ahead. I'll wait.

(The Internet being what it is, if there is one, someone will pop up in a reply in five minutes citing it. If so, my compliments to the authors! But I think we can all agree that in general image APIs do not offer this control. In fact, in general, if you submit a patch to allow it, it would probably be rejected from most projects as unnecessarily complicating the API.)
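
As a sketch of what such a limit-plus-callback parameter could look like, with invented names (`decompress_limited` and `ResourceLimitExceeded` are mine, not from any real library), built on zlib's streaming interface:

```python
import zlib

class ResourceLimitExceeded(Exception):
    pass

def decompress_limited(data, max_output, on_violation=None, chunk=64 * 1024):
    """Hypothetical limit-aware API: stop (or ask the caller) once
    decompressed output exceeds max_output bytes."""
    d = zlib.decompressobj()
    out = []
    total = 0
    buf = data
    while buf:
        piece = d.decompress(buf, chunk)
        buf = d.unconsumed_tail
        if not piece:
            break
        total += len(piece)
        if total > max_output:
            # The callback may grant more budget, like a browser's
            # "script is taking too long -- continue?" prompt.
            if on_violation is None or not on_violation(total):
                raise ResourceLimitExceeded("output exceeded %d bytes" % max_output)
            max_output = float("inf")      # caller said keep going
        out.append(piece)
    return b"".join(out)
```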

This is the sort of thing that I mean when I say that we are so utterly buried by insecure coding practices that we can hardly even perceive them around us. I should add this as another example in http://www.jerf.org/iri/post/2942 .

[+] fla|10 years ago|reply
And almost every program that tries to display it.
[+] wiredfool|10 years ago|reply
Some image programs will allocate space based on the metadata in the file; the actual image data isn't required for that. So if the image data is corrupted, even by just a byte or two (or missing entirely), there's nothing stopping the reported size from being in the gigapixel range.
[+] DanBC|10 years ago|reply
That's impressive. Here are some other compression curiosities.

http://www.maximumcompression.com/compression_fun.php

A 24 byte file that uncompresses to 5 MB; another file with good compression under RAR but almost no compression under ZIP; and a compressed file that decompresses to itself.

[+] rwmj|10 years ago|reply
This isn't a decompression bomb, but here are some fun virtual disk images I found using AFL fuzzer. One of the files is 329 bytes, but causes qemu to consume 4 GB of heap trying to process it. This has interesting consequences for the public cloud, where people can upload any old stuff and it is usually processed immediately by 'qemu-img'.

https://bugs.launchpad.net/qemu/+bug/1462949

(I have a big collection of these, but most of the bugs have now been fixed in qemu)

[+] sulami|10 years ago|reply
The file that decompresses to itself is a work of art. How does one even go about creating something like this?
[+] Filligree|10 years ago|reply
I'll add mine: https://brage.info/hello

It's a 1MB file that decompresses to 261 tredecillion bytes of "Hello, World".

No terribly clever stream manipulation; it's a perfectly normal gzip file, other than the size. The generation script is here: http://sprunge.us/VhFc, but see if you can figure it out without peeking.
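
For the simpler, non-puzzle case, the basic ingredient of such a file, a tiny archive that expands enormously, can be sketched by streaming highly repetitive data through gzip without ever holding the expanded form in RAM (a sketch only, not Filligree's actual method, which reaches far beyond this):

```python
import gzip
import io

def make_bomb(n_bytes, block=1 << 20):
    """Write n_bytes of zeros through gzip one block at a time,
    so the uncompressed data never exists in memory at once."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as f:
        zeros = b"\0" * block
        left = n_bytes
        while left:
            step = min(left, block)
            f.write(zeros[:step])
            left -= step
    return buf.getvalue()
```

DEFLATE tops out near a 1032:1 ratio per stream, so tens of megabytes of zeros collapse into a few tens of kilobytes.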

[+] aidenn0|10 years ago|reply
Malware scanners that search inside archives really love zips that decompress recursively forever.
[+] Hugie|10 years ago|reply
Maybe someone will take the recursive file to a new level and create one that decompresses into n new identical copies of itself, so it decompresses to a tree.
[+] __mp|10 years ago|reply
Photoshop was able to show it: http://i.imgur.com/7EdBySv.png (Macbook Pro, 16GB RAM)
[+] userbinator|10 years ago|reply
Photoshop is an example of a graphics program that doesn't attempt to read the entire image into memory.

How much RAM did it actually use?

[+] anilgulecha|10 years ago|reply
Tells you why it's a solid image application. Kudos to them.
[+] semi-extrinsic|10 years ago|reply
If you follow the "related reading" link at the bottom of TFA, you come to a page by Glenn Randers-Pehrson discussing how libpng deals with decompression bombs. At the bottom of that page you find the following curious note; does anyone know what to make of it?

""" [Note for any DHS people who have stumbled upon this site, be aware that this is a cybersecurity issue, not a physical security issue. Feel free to contact me at <glennrp at users.sourceforge.net> to discuss it.] """

[+] saalweachter|10 years ago|reply
He's presumably had problems with people confusing "decompression bombs" with the blowy-up kind and sending him panicky e-mails.
[+] fennecfoxen|10 years ago|reply
What to make of it? Seems clear enough; he's (half-jokingly?) afraid that someone in the federal government will see the page and think "oh no! bombs! explosions! TERRORISM!", so he wants to make clear that this is only a computer analogy.
[+] nerdy|10 years ago|reply
I think it is related to the word "bomb" existing on the page
[+] octatoan|10 years ago|reply
DHS = Department of Homeland Security?
[+] wiredfool|10 years ago|reply
PNGs also have optional compressed text metadata chunks, and it's possible to sneak a decompression bomb into one of those as well. You can get about a factor of 1000 in the compression -- 1MB of 'a' winds up being about 1040 bytes. You can have multiple iTXt chunks, and it appears that the chunk size is only limited to 2^31-1.

See https://github.com/python-pillow/Pillow/blob/master/Tests/ch... for a quick way to generate some of these.
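
The roughly 1000x figure is easy to sanity-check with zlib, which is the compressor PNG's zTXt/iTXt chunks use (a quick illustrative snippet, not the actual chunk-building code):

```python
import zlib

payload = b"a" * 1_000_000            # the 1MB of 'a' from above
packed = zlib.compress(payload, 9)    # same DEFLATE compression PNG text chunks use
ratio = len(payload) / len(packed)    # lands near DEFLATE's ~1032:1 ceiling
```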

[+] andersthue|10 years ago|reply
Reminds me of how you could crash a Fido node by sending it some big empty files, so when they got automatically unzipped they filled up the hard drive :)
[+] fizgig|10 years ago|reply
I think this kind of thing was common even a few years ago for DoS'ing mail gateways that uncompressed and scanned various archive formats. Things like files that are really huge when uncompressed, or ridiculously deeply nested directory structures.

I think most software these days is immune to such tricks, or at least has tunables to reduce the chance of such tricks causing harm.

[+] inglor|10 years ago|reply
This does wonders when used in favicons :D
[+] raffomania|10 years ago|reply
I just tried it on a locally served page, and my browser handles it quite well (although it won't really display it).
[+] feld|10 years ago|reply
You just made my stomach turn at the thought
[+] raffomania|10 years ago|reply
Fun fact: When trying to upload this as a profile picture (on a site I host myself), chromium crashes.
[+] dahart|10 years ago|reply
Having dealt with and printed a lot of very large images, e.g. 60k x 60k pixels, I have been on the lookout for image processing software that never decompresses the entire image into RAM, but instead works on blocks, scan lines, or blocks of scan lines, staying in constant memory and streaming to and from disk. For example, the ImageMagick fork GraphicsMagick does a much better job of this than ImageMagick. What other software is out there that can handle these kinds of images?
[+] phkahler|10 years ago|reply
The key is not to store it in raster form in RAM. Either tiles (like GIMP) or I prefer Z-ordering. Then a user can zoom in and pan around easily - you let the system swap and it won't be bad at all. If they zoom out though, you probably want to store MIP maps of it.

Swap works well for this as long as your data has good locality; huge raster images don't.

But no, I'm not aware of any software that handles stuff like that well - except GIMP's tiling, but that's not going to help when zoomed out.
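
A minimal sketch of the Z-ordering idea (a Morton index; the function is illustrative): interleave the bits of x and y so that 2-D neighbours usually land near each other in 1-D memory, which is what gives the swap-friendly locality.

```python
def morton(x, y, bits=16):
    """Interleave the bits of x and y into a single Z-order index:
    x's bits go to even positions, y's bits to odd positions."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)
        z |= ((y >> i) & 1) << (2 * i + 1)
    return z
```

Walking a 2x2 pixel block in raster order then visits consecutive indices 0, 1, 2, 3 instead of jumping a full image row apart.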

[+] AndrewStephens|10 years ago|reply
I used to work on a scanning SMTP/HTTP proxy and even back then it wasn't unknown for people to send crafted decompression bombs to attempt to crash the services. We handled it by estimating the total uncompressed size upfront (including sub archives) and throwing out anything with a suspiciously large compression ratio.

I imagine that .pdf files are another avenue for mischief. They contain lots of chunks which may be compressed in varying ways.

[+] JosephRedfern|10 years ago|reply
That's cool. Presumably the same "attack" could be applied to any file format that uses DEFLATE.

From a legal standpoint, I'd be wary about following through with the author's suggestion to "Upload as your profile picture to some online service, try to crash their image processing scripts" without permission. Sounds like a good way of getting into trouble.

[+] logicallee|10 years ago|reply
>The image is almost entirely zeroes, with a secret message in the center.

too pressed for time, did anyone look? What is it?

[+] sgdread|10 years ago|reply
It is "SORRY, OUR PRINCESS IS IN ANOTHER PIXMAP"
[+] tiler|10 years ago|reply
I realize that this is beside the point, but going on the title alone we could write a script that generates an 'infinite' (max out available memory) sized image.
[+] javajosh|10 years ago|reply
Everyone's focusing on this being a PNG problem, but actually if my server unzips a 420-byte file into a 5MB file of any kind, I'd say that's the first red flag. Assuming some sort of streaming decompression, you could write an output filter that shuts off the decompressor when it's seen a factor of X bytes. A reasonable factor would be 10 - which in this case would have halted bzip decompression at 4kB.

This would probably be a trivial patch to bzip2. But I like the general idea of passing a "max input/output ratio" to any process or function that might yield far more output than input.

[+] ctdonath|10 years ago|reply
The real problem is image handling libraries that blindly render images into too-large objects where unnecessary. While full-res uncompressed images are very convenient under the hood, the image library should inherently handle anything "too big" gracefully. Instead we're often prone to apps crashing when someone feeds in a ridiculously large image.

A 420B > 5MB expansion should not be a "red flag", because there is nothing about it (including the subsequent attempt to process a 141GB uncompressed image) that cannot be handled appropriately in software. Such ratio limits are arbitrary, and setting an arbitrary limit is usually a sign that the software is incorrect, not the data.

[+] ctdonath|10 years ago|reply
Looks handy for large image processing tests, thanks.
[+] atom_enger|10 years ago|reply
Trying to run the program and create my own image; a few questions, though: what did you use for secret.png? Any old PNG?

Are you using PIL or pillow?

[+] pvdebbe|10 years ago|reply
Cool, but most web sites wouldn't let you upload a 5MB picture as a profile picture. Or do they, these days?