Sort of. While it was helpful to have the delta-compressed polygon list for each part of the level in its own 64KB chunk, the minor miracle of fitting >10MB levels into 2MB of RAM (half of which was VRAM, as I recall) was down to two things: 1) Andy wrote this insane dynamic layout/loader thing that optimized the CD's bandwidth (which was of course pathetic by today's standards, as you point out); 2) I wrote a tool that packed the chunks into pages so that we never needed too many active at any given point in the level.

This is an NP-complete problem, and we didn't have off-the-shelf solvers back then, so the tool just tried a bunch of heuristics, including a stochastic one (think early simulated annealing). The problem with the latter was that if you "got lucky" you might never achieve the required packing again after the artist changed a turtle or something…
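(To make the packing step concrete: it's essentially bin packing, fitting variable-sized chunks into fixed-size pages. Below is a minimal sketch of the kind of heuristic mix described, a greedy first-fit-decreasing pass plus randomized restarts standing in for the stochastic search. The page budget and all names here are illustrative, not the actual tool, and it ignores the harder "only a few pages active at any point in the level" constraint.)

    import random

    PAGE_SIZE = 64 * 1024  # illustrative page budget, in bytes

    def first_fit_decreasing(chunk_sizes):
        # Greedy baseline: biggest chunks first, each placed into the
        # first page that still has room. Assumes every chunk fits in
        # a single page.
        pages = []
        for size in sorted(chunk_sizes, reverse=True):
            for page in pages:
                if sum(page) + size <= PAGE_SIZE:
                    page.append(size)
                    break
            else:
                pages.append([size])
        return pages

    def stochastic_pack(chunk_sizes, restarts=1000, seed=0):
        # Random-restart first-fit: shuffle the insertion order and
        # keep the best packing found -- a crude stand-in for the
        # "early simulated annealing" style search described above.
        rng = random.Random(seed)  # fixed seed => reproducible "luck"
        best = first_fit_decreasing(chunk_sizes)
        for _ in range(restarts):
            order = list(chunk_sizes)
            rng.shuffle(order)
            pages = []
            for size in order:
                for page in pages:
                    if sum(page) + size <= PAGE_SIZE:
                        page.append(size)
                        break
                else:
                    pages.append([size])
            if len(pages) < len(best):
                best = pages
        return best

    if __name__ == "__main__":
        rng = random.Random(1)
        sizes = [rng.randrange(4 * 1024, 48 * 1024) for _ in range(100)]
        print(len(stochastic_pack(sizes)), "pages")

(Pinning the RNG seed, or saving the winning insertion order alongside the level data, is one way the "got lucky once, never again" reproducibility problem could be sidestepped.)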
monocasa|3 months ago
Separate 2MB of main RAM and 1MB of VRAM, for 3MB total.
dmbaggett|3 months ago
The packer was the final step, after a level had been pre-sorted and otherwise processed. It was quite fast, so it added only a little extra time on top of the primary work of pre-rendering every frame of the level to recover the sort order (which typically took around an hour).
I did experiment with solver algorithms, but they were so obviously going to be too slow that I abandoned the idea.
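(A back-of-envelope way to see why exact solving was hopeless: even ignoring page capacities, the number of ways to partition n chunks into pages is the Bell number B(n), which explodes far too fast for exhaustive search. A small illustrative helper, not from the actual tool:)

    def bell_numbers(n):
        # Bell triangle: returns [B(1), ..., B(n)], where B(k) counts
        # the ways to partition k items into groups -- an upper bound
        # on candidate page assignments for k chunks, capacity aside.
        row = [1]
        bells = [1]
        for _ in range(n - 1):
            nxt = [row[-1]]
            for v in row:
                nxt.append(nxt[-1] + v)
            row = nxt
            bells.append(row[-1])
        return bells

    print(bell_numbers(30)[-1])  # B(30) ~ 8.5e23 partitions for just 30 chunks

(Capacity limits prune that space enormously, but what survives is still exponential, which is why cheap heuristics plus randomized search were the practical choice.)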