
How Naughty Dog Fit Crash Bandicoot into 2MB of RAM on the PS1

934 points | ekianjo | 10 years ago | quora.com

247 comments

[+] JonnieCache|10 years ago|reply
If you haven't read the "Making Crash Bandicoot" blogposts, you're in for a treat.

http://all-things-andy-gavin.com/2011/02/02/making-crash-ban...

[+] danso|10 years ago|reply
I don't know if Dave Baggett's heroic essay (well, as heroic as a game-debugging essay can be), "My Hardest Bug Ever", is part of those blogposts, but it's one of the things I now mentally associate with the legacy of Crash Bandicoot (which apparently involved all manner of engineering feats):

http://www.gamasutra.com/blogs/DaveBaggett/20131031/203788/M...

HN discussion here: https://news.ycombinator.com/item?id=6654905 (with participation from the author, dmbaggett https://news.ycombinator.com/user?id=dmbaggett)

[+] joesmo|10 years ago|reply
Nice read. I really like this part: "But we worried about the camera, dizziness, and the player’s ability to judge depth – more on that later."

It's interesting that they were concerned with dizziness and the camera, concerns which unfortunately seem to have evaporated in most 3D games made since, to their detriment.

[+] packersville|10 years ago|reply
If Crash Bandicoot was so hard to squeeze into 2MB, I imagine others, like the game with Solid Snake (can't remember its name), would have been incredibly hard.
[+] AtmaScout|10 years ago|reply
You're right about being in for a treat. Thanks for the link.
[+] M4v3R|10 years ago|reply
> Ultimately Crash fit into the PS1's memory with 4 bytes to spare. Yes, 4 bytes out of 2097152. Good times

Wow. Just wow. One can only imagine the amount of hard work and sweat that went into making this possible, and the pride of the developers when it actually worked and the game became a success. Great story.

[+] kayoone|10 years ago|reply
Pretty amazing what kind of skills game development required back then. One side of me is happy that we have all these great tools today; the other is sad because you hardly use this low-level stuff in today's software development world.

We are solving different problems today, but the level of skill that was required back then for a game that today could be made by a single person in Unity in a few weeks is quite impressive.

[+] NickPollard|10 years ago|reply
A good portion of this stuff still goes on in games development.

I don't work in games any more, but on the last title I worked on (Forza Horizon, Xbox 360), one of my colleagues engaged in a very similar exercise to get open-world data streaming in fast enough for a car (travelling at potentially 150+ mph) to drive through the world without having to wait for loads, whilst streaming directly from the DVD (we weren't allowed to install to the HDD).

Given that the world was open and you could drive in pretty much any direction, pre-loading the right tiles of the world was difficult, and seek times made it tough to bring data in if it wasn't all packed together. However, we were almost at the limit of DVD capacity, so we couldn't do much duplication of assets to reduce the amount of seeking required.

My colleague wrote a system that took about 8 hours overnight (running on a grid) to compute a best attempt at some kind of optimized packing. It did work though!
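The overnight packing job described above lends itself to a toy illustration. Here is a very loose Python sketch; the tile names, co-access counts, and the greedy heuristic are all invented for the example, not the actual Forza Horizon system:

```python
# Toy sketch of seek-minimizing asset layout: tiles that are frequently
# streamed together should end up next to each other on disc so the
# drive head doesn't have to seek between them. All data is made up.

def layout_tiles(tiles, co_access):
    """Greedy ordering: start at the first tile, then repeatedly append
    the unplaced tile most often co-accessed with the one just placed."""
    remaining = set(tiles)
    order = [tiles[0]]
    remaining.remove(tiles[0])
    while remaining:
        last = order[-1]
        # Pick the tile streamed together with `last` most often.
        nxt = max(remaining, key=lambda t: co_access.get((last, t), 0)
                                         + co_access.get((t, last), 0))
        order.append(nxt)
        remaining.remove(nxt)
    return order

tiles = ["beach", "forest", "city", "desert"]
co_access = {("beach", "forest"): 9, ("forest", "city"): 7,
             ("city", "desert"): 2, ("beach", "desert"): 1}
print(layout_tiles(tiles, co_access))  # -> ['beach', 'forest', 'city', 'desert']
```

A real solution would also weigh physical seek distances, duplication budgets, and drive geometry; greedy nearest-neighbour ordering just conveys the core idea of placing frequently co-streamed data adjacently.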

[+] fra|10 years ago|reply
Some day I will write a blog post about the tricks we have to pull to fit all the code & assets, or get animations running at 30 FPS, on Pebble.

Trust me when I say this: low level development is alive and well at hardware companies.

[+] jeremiep|10 years ago|reply
We also have pretty amazing hardware today that is surprisingly underused most of the time.

I've seen many Unity projects with the worst code you could imagine still run close to 60 FPS when shipped.

Having an easy entry point means you're also going to get a lot of mediocre programmers using it. They seem productive in the first few weeks of the project, but then quickly grind to a halt once they start changing the design, and end up with massive overtime hours while trying to debug and optimize the resulting mess.

It's made even worse by managers practicing pseudo-agile learned in a two-day masterclass, adding even more procedures and overhead.

So yeah, you can make games today with less development talent than yesterday, but it's still going to cost you more than hiring skilled software engineers, and the resulting product will be a fraction of its potential.

[+] javajosh|10 years ago|reply
There are some abstractions, like TCP sockets, that I don't think anyone would leave behind. But language abstractions like "classes" or even "functions" are certainly worth digging into. (The new lambda feature in Java 8 is an interesting case - you don't have to get into C to see the complexity there, but rather the bytecode, which is a kind of abstract sort of assembly.)

I'm not terribly familiar with the scene, but there are a variety of competitions for fun and art that operate within highly constrained environments. The demoscene[0] and code golf[1] come to mind (although ironically code golf in some sense has to be very high-level). There's also the security scene, which is quite bit-fiddly.

It's also the case that Go and particularly Rust are quite low-level system languages at their heart, so are presumably amenable to running in constrained environments.

[0] https://en.wikipedia.org/wiki/Demoscene

[1] E.g. Rhoscript http://rhoscript.com/

[+] fsloth|10 years ago|reply
"the other is sad because you hardly use this low-level stuff in today's software development world."

I don't think there is a dichotomy. It's always about smartly leveraging available resources. The problem with modern development, perhaps, is that there are tempting high-level orthodoxies that often obscure the core matter at hand, i.e. focusing on some pointless candied abstract object interface rather than focusing design effort on data flow and the data structures and algorithms used.

The need for low-level optimization has most definitely not vanished. When program data fits into local RAM, the bottleneck moves to cache optimization.

[+] pjmlp|10 years ago|reply
Just try to code for mobile or embedded platforms.

Yeah, they might get bigger storage every year, but the reason Google, Apple and Microsoft give talks about package size at their developer conferences is that size is the number one reason people avoid installing apps, or choose which one to remove.

Also, given how the app life cycle works on mobile platforms, big apps are killed all the time when they go into the background.

[+] pbo|10 years ago|reply
There is a tremendous number of hardware products and industrial systems where the processing is performed on small and cheap components (microcontrollers, digital signal processors).

Of course there are very capable components in the microcontroller category, some of them even offering enough resources to run Linux, but if you stick to the $1-$5 range the specs are very limited.

Here are two examples, the first one costs around $3 and the second one is less than $1.

http://www.ti.com/product/tms320f28027 http://www.atmel.com/devices/attiny85.aspx

I develop on such platforms, and even though there is an interesting challenge in programming these tiny processors, optimizing CPU cycles and memory usage all the time, in the long run it becomes quite strenuous: there is only low-level stuff, and I miss the expressiveness and flexibility of more abstract languages.

[+] shultays|10 years ago|reply
I feel a little sad every time someone says "we have much more CPU power/memory now, those things are unnecessary". Maybe they're right, but I feel this is not the path we should take.
[+] banachtarski|10 years ago|reply
Making a (state-of-the-art) engine is no easier. Making a game is only easier because engines are more available now than they were in the past.
[+] kamaal|10 years ago|reply
>>the other is sad because you hardly use this low-level stuff in today's software development world.

During my engineering days (circa 2005) I programmed the 8085, and our professor would give us all kinds of crazy assignments and small projects. That was the first taste of a genuine programming challenge I'd faced in my life. Immediately after that, programming in C felt a little boring.

Recently I worked on an embedded systems project that let me relive those days. I had to invent all kinds of crazy tricks to work around resource constraints.

Your true creativity is kindled when your resources are constrained in all manners, time or otherwise. Unfortunately you can't academically recreate the same experience.

[+] Qantourisc|10 years ago|reply
The amount of unoptimized crap they generate in a few weeks (in some cases) is quite depressing too, I'm afraid. Game developers are wasting more and more CPU cycles; this of course reduces development cost, but it would be nice if they put some effort into making things run fast. And this doesn't only apply to games!
[+] i_are_smart|10 years ago|reply
> and this had to be paged in and out dynamically, without any "hitches"—loading lags where the frame rate would drop below 30 Hz.

This is what gets me. Modern game development seems to say "eh, a little hitching won't hurt anyone", and then we wind up with games that run like shit. Even on consoles.

[+] derefr|10 years ago|reply
Compare this to the Super Nintendo's 128KB of working memory.

It's hard to tell which games used more or less of that memory; the big thing about game complexity in that era was always ROM size limiting asset complexity, rather than RAM size limiting computational complexity, so the games released toward the end of the console's lifecycle were just ones with the biggest ROMs and therefore most assets, rather than games that used the available CPU+RAM more efficiently.[1]

Now I'm considering writing a memory profiler patch for a SNES emulator, to see how much of the 128KB is "hot" for any given game. I would bet the hardest-to-fit game would be something like SimCity or SimAnt or Populous.

On the other hand, the SNES also had "OAM" memory—effectively the 2D-sprite equivalent to GPU mesh handles. And that was a very scarce resource—I think there was space for 128 active sprites in total? Developers definitely had problems fitting enough live sprites into a level. Super Mario World's naive answer was basically aggressive OAM garbage collection: any sprite scrolled far enough offscreen ceases to exist, and must be spawned again by code. Later games got more clever about it, but it was always a worry in some form or another.

---

[1] There were also those that used expansion chips to effectively displace the SNES with something like an ARM SoC in the cartridge, once that became affordable. It's somewhat nonsensical to talk about how much SNES system memory some games used, because they came with their own.
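The Super Mario World despawn behaviour described above can be sketched roughly in Python. All numbers and data shapes here are illustrative guesses, not SMW's actual logic:

```python
# Rough sketch of "aggressive OAM garbage collection": any sprite
# scrolled far enough past the screen edge is freed, releasing its OAM
# slot, and must be re-spawned by level code if the player comes back.

DESPAWN_MARGIN = 64  # pixels past the screen edge (assumed value)
SCREEN_W = 256       # SNES screen width in pixels

def cull_sprites(active, camera_x):
    """Keep only sprites within the screen plus a margin; the rest
    cease to exist and free their OAM slots."""
    lo = camera_x - DESPAWN_MARGIN
    hi = camera_x + SCREEN_W + DESPAWN_MARGIN
    return [s for s in active if lo <= s["x"] <= hi]

sprites = [{"id": 1, "x": 100}, {"id": 2, "x": 500}, {"id": 3, "x": -80}]
print(cull_sprites(sprites, camera_x=0))  # -> [{'id': 1, 'x': 100}]
```

The trade-off is exactly the one the comment notes: culling keeps OAM pressure low, but it means an enemy you scrolled away from pops back into its spawn position instead of remembering its state.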

[+] xedarius|10 years ago|reply
Resources overflowing memory was a common problem on the PS1. A common trick was to bake the resources into the exe. I worked on a game where the final disc was a directory of 25 exes; each exe was a level, and even the front end was a separate exe. You could see that the size of each file was under 2MB, so you knew it would fit. You would never/hardly ever dynamically allocate memory on the PS1.

There was a lot of duplication, but the CD was a huge resource and memory was thin on the ground. It also meant that we could use the CD for audio during the game.
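A per-level-exe build like that invites a trivial size sanity check at build time. A hypothetical sketch (the directory layout and file names are invented, not from any real PS1 toolchain):

```python
# Hypothetical build check: every per-level exe must fit in the PS1's
# 2MB of RAM, since each one is loaded whole from CD.
import os

RAM_LIMIT = 2 * 1024 * 1024  # 2,097,152 bytes

def oversized_exes(build_dir):
    """Return (name, size) for every .exe in build_dir over the limit."""
    bad = []
    for name in sorted(os.listdir(build_dir)):
        if name.lower().endswith(".exe"):
            size = os.path.getsize(os.path.join(build_dir, name))
            if size > RAM_LIMIT:
                bad.append((name, size))
    return bad
```

An empty result means every level fits, which is exactly the "you could see that the size of each file was under 2MB" check done by eye in the comment.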

[+] trumpete|10 years ago|reply
As mentioned in the answer, there were levels that were a lot bigger than 2 megabytes. Their goal was to ensure that the needed parts of these bigger levels were seamlessly in memory at all times.
[+] Mahn|10 years ago|reply
That was the "easy" way of handling it and most devs resorted to it, but then you couldn't load things dynamically and seamlessly in the middle of a "level".
[+] Htsthbjig|10 years ago|reply
I am waiting for Mr. Baggett's book on the Lisp internals of Crash Bandicoot.

I don't care about the game, I never liked it, but this man's Lisp code should be on par with PG's.

[+] iNerdier|10 years ago|reply
Dave Baggett could write a book on the making of Crash and I'd buy it in a second. There was something very special that went into making that game, and I was just the right age to appreciate it.
[+] dmbaggett|10 years ago|reply
Thanks!

It's on my bucket list to write that book -- also including many humorous tales from my 10+ years at ITA Software, and anecdotes from my current startup (http://inky.com).

If I live long enough, that is. :)

[+] kozak|10 years ago|reply
Old console games are great examples of how creativity benefits from constraints.
[+] Someone1234|10 years ago|reply
Or that survivorship bias is a real thing. People only remember the good games that have survived obscurity for tens of years, while ignoring the majority of terrible games that didn't.

A lot of people like to claim "old games used to be better!", but pick any month of the 1990s and look at the new releases for that month; I bet for the average month there might be one title you've even heard of.

[+] listic|10 years ago|reply
I wonder if anyone controls the physical layout of bytes on the disk, at least for things like installing large software packages on an HDD, or a major release of an operating system on a DVD. It doesn't look likely: every time I install Ubuntu, I feel the process could be made much faster.
[+] gambiting|10 years ago|reply
We do for PS4 disc images. Basically you want to lay out the data on the disc in such a way that the game can still be played while the system copies everything to the HDD. There are pages upon pages of manuals in the PS4 SDK on how to do it efficiently.
[+] cnvogel|10 years ago|reply
At least for Windows there's now a way to install the OS so that it occupies only one large contiguous filesystem image containing the compressed install files, and only stores changes to this frozen filesystem image in the "traditional" way. It's called "wimboot".

http://labalec.fr/erwan/?p=1078

That way, installing the OS mainly consists of copying this huge image file, hence the "physical layout of bytes on the disk" will mostly be fixed.

I'd guess that this could also be replicated in Linux, but personally I don't know if this is being done. I usually "debootstrap" or "pacstrap" my installs from a bootable USB stick :-).

[+] jeremiep|10 years ago|reply
Ubuntu targets a LOT of different hardware; it would be much harder to optimize the layout of their ISO image than it is to optimize for a fixed hardware platform like a console.
[+] Cthulhu_|10 years ago|reply
Do you still use a CD/DVD to install your OS? That's probably the problem right there; CDs/DVDs are relatively slow. Go for an install off a flash disk (where physical layout shouldn't matter), and/or a web install where only the relevant software compiled for your hardware is downloaded.
[+] goalieca|10 years ago|reply
One really nice benefit of SSDs is that there is no needle or laser to seek. :)
[+] kayoone|10 years ago|reply
Looks like the system RAM on PlayStations increased by a factor of 2^4 (16x) with every generation: 2MB (PS1) to 32MB (PS2) to 512MB (PS3) to 8GB (PS4).
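A quick check of that arithmetic, using just the numbers from the comment above:

```python
# RAM per PlayStation generation: PS1, PS2, PS3, PS4 (in bytes).
MB = 1024 * 1024
ram_bytes = [2 * MB, 32 * MB, 512 * MB, 8 * 1024 * MB]
ratios = [later // earlier for earlier, later in zip(ram_bytes, ram_bytes[1:])]
print(ratios)  # -> [16, 16, 16]: each step is a 2**4 = 16x jump
```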
[+] userbinator|10 years ago|reply
Following the same trend, the PS5 having 128GB of RAM seems a little ridiculous now, but consider that the PS4 was released in 2013 and the PS3 in 2006, so perhaps by 2020 it really may have that much.
[+] drzaiusapelord|10 years ago|reply
> The PS1 had 2MB of RAM, and we had to do crazy things to get the game to fit.

That's what, about $80-100 worth of RAM back then? On a product that sold at $299 (July 1995 price drop), that's incredible. Nearly a third of your cost was RAM alone.

[+] gavanwoolery|10 years ago|reply
Crash Bandicoot was quite a technical feat, but also (perhaps less talked about) a marketing feat. The series went on to sell ~50 million copies IIRC, and Crash was also the best-selling PlayStation game of all time.
[+] tluyben2|10 years ago|reply
I still play Crash (2/3, not 1) quite a lot on the OpenPandora and on my PS2 when I'm in my wood cabin. I still like it, and that's partly because of how constrained the PS1 was when getting something 'big' like this out there.

I keep hoping for a book with annotated (including lisp) source... Please please!!

[+] ParrotyError|10 years ago|reply
Back in the days of the 8-bit micros there were some games that were absolutely incredible for what they could pack into a tiny machine. Elite on the BBC micro (32k) was one, but being poor I had a Spectrum 128. There was a helicopter simulator by Digital Integration for the 48k Spectrum called Tomahawk which was particularly good, and another 3D game called Starglider, which was originally for the Atari ST but there were ports to the Spectrum 128 and 48k Spectrum! I seem to remember an article in Sinclair User where they interviewed the developers who explained how they got it all to fit into such a small machine. Self-modifying code and using parts of the frame buffer to store code and data IIRC...
[+] sown|10 years ago|reply
Time + Internet = A smaller world that doesn't forget
[+] landmark2|10 years ago|reply
confirmed: I'm a shitty developer