
Nintendo 64 Architecture – A Practical Analysis

330 points | bottle2 | 5 years ago | copetti.org

72 comments


monocasa|5 years ago

I gave a talk not too long ago about running Rust on a Nintendo 64, with the slide deck written in Rust and running on an N64.

https://twitter.com/DebugSteven/status/1054903603985559553

So I guess what I'm saying is that I have pretty hands on knowledge with the system and would be happy to answer any questions I can.

One thing I'll throw out there is that one of the biggest limitations of the N64 (its 4KB texture memory) gets called a texture cache a lot, but that's a misnomer. It's a manually managed piece of memory, and (IMO) the system would have been much better off if it were actually a cache rather than requiring an entire texture to be loaded in regardless of what was being sampled. Nowhere in Nintendo's literature that I've seen do they call it a cache either. The crazy hacks that Rare did to subdivide their geometry on texture boundaries wouldn't be necessary, for instance. I'd maybe even take a 2KB cache over a 4KB chunk of manually managed memory.

One other aside is that I think the system still has tons of unlocked potential. So much of unlocking its power seems to be centered around memory bank utilization. Switching which page of DRAM is open within a bank is expensive in terms of latency, but it seems like if you allocate your memory in 1MB bank chunks you can get around a lot of the slow-memory limitations that developers complained about at the time. I don't blame the developers of the era; they were coming from the SNES, with its single-cycle access to RAM, to the N64, which had a very deep, very modern memory hierarchy, with everything that implies for your code. The industry as a whole didn't really catch on until about halfway through the PS2's life cycle. But applying some of those PS2 techniques back, the system really purrs when you dedicate a 1MB bank to each streaming source or destination. I can't wait to see what crazy stuff happens when the demoscene folk really start to get their hands dirty with it.

zeta0134|5 years ago

> The crazy hacks that Rare did to subdivide their geometry on texture boundaries wouldn't be necessary for instance.

I would love to know more about this. Is their texture format tomfoolery written up somewhere?

domlebo70|5 years ago

Amazing. Is there somewhere I can watch this talk?

r0bbbo|5 years ago

For someone just starting to learn about computer architecture at a low level (rather embarrassing for someone who's been in the industry for over a decade), this is a really interesting read, and it's helping concepts like pipelining and caching to gel.

als0|5 years ago

Hardware used to be so exotic!

rawoke083600|5 years ago

Brilliant article! I love these in-depth "old-hardware" articles with my morning coffee. (It's only 8am here in South Africa)

Lol this stood out for me (although probably just semantics...):

"Reality Co-Processor running at 62.5 MHz." What big dreams we had back then to try and"simulate reality" with only 62.5Mhz :)

Well done author... well done Nintendo !

pjmlp|5 years ago

Amiga's Agnus and its AGA successor were running at around 7 to 35 MHz tops. :)

bambataa|5 years ago

On the cost-saving point, I always understood that the limited 4KB texture memory led to lots of games having really muddy, blurry textures. How much more would 8KB or 16KB have cost? It seems a small cost saving that had a pretty large negative impact.

nwallin|5 years ago

Regarding simply having 8/16/32kB: the texture memory was integrated into the chip itself; it wasn't RAM that lived on the motherboard. So adding more would have required a larger chip.

It was a multifaceted problem, and was ultimately a design flaw/oversight rather than someone saying "I think 4kB is enough memory to store all the textures". The problem is less that the texture memory was small and more that Nintendo's bet on how well RDRAM and a unified memory architecture would work out didn't pan out.

Problem #1: There was no dedicated video memory. All RAM on the N64 was shared RAM. So framerates tanked if you didn't have most of your stuff in cache. Keep in mind the framebuffer also lived in this unified memory area, so the video chip was already very noisy on the memory bus.

Problem #2: The unified shared system RAM was RDRAM, not SDRAM. And the latency on RDRAM is absolutely terrible. So the already expensive cost of using RAM was compounded.

If the N64 had done what the PlayStation and Saturn did and just had dedicated video/system RAM, and made that RAM relatively low-latency SDRAM instead of relatively high-latency RDRAM, this 4KB limitation wouldn't have mattered.

Athas|5 years ago

Interestingly, I feel the opposite. The inferior texture memory means that many games just used Gouraud shading (or sprites) instead, which was quite clean. At the time, I felt that N64 games looked cleaner than PS games, mostly because of the textures, and I also think they have aged (slightly) better. At least those games that didn't aim for realism - Super Mario 64 and Ocarina of Time are still quite playable, while the looks of GoldenEye 007 are probably more of a hurdle.

Jasper_|5 years ago

Texture memory (TMEM) is very special and fast, so it tends to be expensive stuff, at least back then, and it's balanced together with the rest of the architecture, like the bandwidth the RDP has, DMA copies from main mem -> TMEM, and so on. It would have changed quite a lot of the underlying architecture to increase it to 8KB, and might not have been worth it. You couldn't have increased texture resolution by simply increasing memory without changing a significant part of the architecture elsewhere (e.g. RDP fill rates). Most textures used during draws on the N64 don't even fill up the 4KB texture memory.

jchw|5 years ago

Well, I would guess twice as much and four times as much respectively. I mean RAM is cheap today but if you wanted 128GB of it today it’d still run you like, $700. I can’t pretend to know what this particular type of RAM cost back in the day, but given how relatively cutting edge the machine was I can only guess it was not particularly inexpensive...

Causality1|5 years ago

I don't think I would have gotten as interested in gaming if it weren't for Nintendo's decisions with the N64. The native bilinear texture filtering, Z-buffer, and subpixel model rendering make such an enormous difference to me I'd have found the Playstation unplayable.

webwielder2|5 years ago

To me, N64 continued the video game "tradition" I knew from the 16-bit era. Colorful, fast-paced, responsive. PSX games by contrast were dour, slow-paced, and controlled poorly because of Sony's initial failure to consider how digital control wouldn't work in a 3D environment.

293984j29384|5 years ago

The design of this web page leaves a bit to be desired. The tabbed boxes with the light grey background on the white background of the website itself are pretty easy to miss as you scroll forever.

plerpin|5 years ago

Did Rambus have a dossier of compromising photos on industry executives in the mid 90's? Why did Intel and Nintendo go all in on such an expensive and technically inferior memory technology? The latency is such a killer, especially if you're on an architecture with really deep pipelines (ahem, P4).

rasz|5 years ago

Rambus meant Nintendo shipped a fifth-generation console on a two-layer PCB with only 4 big ICs. Two layers! That was a huge cost saving. Compare with the Sega Saturn, which had a total of something like 144 bits of various memory buses divided across multiple memory banks on multiple memory chips.

littleweep|5 years ago

I get the nostalgia angle -- but why are we discussing a console that's >20 years old?

zapzupnz|5 years ago

Because it's interesting and technical. Despite the name 'Hacker News', I don't know if you've noticed but not a lot of what's on here is necessarily news — just plenty of food for thought for engineers, people in comp sci, etc.

edoceo|5 years ago

The linked article is a very in-depth, clearly presented breakdown and analysis of the hardware. It's well written and educational. Works like this get hella karma on this site.

cesarb|5 years ago

Sometimes, it's an important step in the path of hardware evolution; sometimes, it's a path not followed that we can learn from; either way, 20-year-old hardware is still relevant.