
ljhsiung | 1 year ago

I've definitely had this thought about this sort of openness that RISC-V inherently promotes.

Sure, anybody can make a RISC-V CPU, but who really has the capabilities to verify them?

There's a reason the ARM model has succeeded: it provides totally off-the-shelf IP with pre-verified cores (thanks to their own large verification team). The logical end of RISC-V is that we have custom cores literally everywhere, but verifying them all is quite costly.

The equally hard part of CPU design is, funnily enough, not creating the design but verifying it. (That's kinda one small reason why I think Copilot-esque tools haven't permeated the hardware design space very much.)
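To make the point concrete, even the simplest verification setup needs three pieces that are all separate from the design itself: a golden reference model, stimulus generation, and a checker. Here's a minimal sketch in Python with a toy ALU standing in for the design under test (the `alu_dut` / `alu_reference` names and the three-op ALU are illustrative, not any real flow):

```python
import random

XLEN = 32
MASK = (1 << XLEN) - 1

def alu_reference(op, a, b):
    # Golden model: straightforward, obviously-correct semantics.
    if op == "add":
        return (a + b) & MASK
    if op == "sub":
        return (a - b) & MASK
    if op == "sll":
        return (a << (b & 31)) & MASK  # shift amount uses low 5 bits
    raise ValueError(op)

def alu_dut(op, a, b):
    # Stand-in for the optimized implementation being verified.
    # In a real flow this would drive an RTL simulation instead.
    return alu_reference(op, a, b)

def fuzz(n=10_000, seed=0):
    # Stimulus + checker: random operands, compare DUT against the model.
    rng = random.Random(seed)
    for _ in range(n):
        op = rng.choice(["add", "sub", "sll"])
        a, b = rng.getrandbits(XLEN), rng.getrandbits(XLEN)
        assert alu_dut(op, a, b) == alu_reference(op, a, b), (op, a, b)

fuzz()
print("ok")
```

Even here the harness is already as much code as the "design", and a real verification environment (coverage, corner cases, formal properties) dwarfs it.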


dpeckett | 1 year ago

Isn't this the Cathedral vs. Bazaar debate all over again? All designs have flaws, but at least with something like the C910 you can open a PR to fix it going forwards.

Sure the first revisions of a new design will be buggy, but over time with iteration and continuous improvement they'll only get better.

I don't think too many folks will be designing new RISC-V cores from scratch, in the same way that very few people build their own OSes. It'll be contributing features and bugfixes to existing designs (and designing custom extensions).

camel-cdr | 1 year ago

Most RISC-V processors are proprietary; the C910 is partially open source, excluding the draft vector extension implementation where the bug was located.

kstenerud | 1 year ago

The same used to be said of open source competition to Oracle's database. An open source DATABASE? Unthinkable! First off it would be prima facie insecure, and secondly, it would be a toy that could never compete on important things like performance!

Arnavion | 1 year ago

Is it the logical end that we have custom cores everywhere? Yes RISC-V is open and relatively straightforward to implement (coincidentally I did just that over the weekend in a circuit simulator game), but I can also see economies of scale making it so that a few vendors end up making cores that are good enough for all use cases between them and end up dominating the market. A few low-powered 32E cores, a few desktop-grade 64GCBV... alphabet soup (or more likely, hanzi lamian) cores, and a few in between would seem to be enough.

dwoxctbvgq | 1 year ago

May I ask what circuit simulator game? I've attempted something similar with my very limited and hobby level hardware knowledge in Virtual Circuit Board and I learned that it would take me much longer than a weekend, heh.

msla | 1 year ago

Yes, it would be disastrous if a widely-used chip had a bug in its division opcode.

https://en.wikipedia.org/wiki/Pentium_FDIV_bug
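For flavor, the sanity check commonly cited for the FDIV bug fits in a few lines: divide, multiply back, and look at the residual, comparing the FPU result against exact rational arithmetic. (This is just an illustrative sketch; the operand pair below is the one widely quoted in coverage of the bug.)

```python
from fractions import Fraction

# Operand pair widely quoted in reports of the Pentium FDIV bug.
x, y = 4195835, 3145727

hw = x / y                 # division as executed by the FPU
exact = Fraction(x, y)     # exact rational reference
residual = x - hw * y      # ~0 on a correct divider; a buggy Pentium
                           # reportedly gave a residual around 256

assert abs(Fraction(hw) - exact) < Fraction(1, 10**9)
print(f"residual = {residual}")
```

The irony is that the failing check is trivial; finding the handful of operand patterns that trigger it among 2^128 input pairs is the hard part, which is the whole verification problem in miniature.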

It would be disastrous if a widely-used chip had a bug in a lock prefix that allowed crafted untrusted binaries to soft-brick the whole machine.

https://en.wikipedia.org/wiki/Pentium_F00F_bug

I'm not even mentioning the weirdnesses from the eight-bit era. We all know those chips were heroic efforts to get anything done given transistor budgets, and error checking was simply not possible. The point is, even mature proprietary companies have had severe hardware bugs for longer than many here have been around, even if you discount stuff like Spectre and Meltdown.

cpswan | 1 year ago

My takeaway from this is that the existence of RISCVuzz will make future verification much easier. Just run your prototype (or even an emulation of it) against a known-good array of implementations.
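The differential idea above can be sketched in a few lines: feed the same random instruction sequence to every implementation and diff the final architectural state. This toy harness uses a made-up two-register machine (not real RISC-V encoding) with an injected corner-case bug, just to show the mechanics of cross-implementation fuzzing:

```python
import random

def interp(program, bug=False):
    # Toy 2-register machine standing in for a full RISC-V implementation.
    # `bug=True` injects a wrong-result corner case, like a flawed divider.
    r = [0, 0]
    for op, d, a, imm in program:
        if op == "addi":
            r[d] = (r[a] + imm) & 0xFFFFFFFF
        elif op == "divi":
            q = r[a] // imm if imm else 0xFFFFFFFF  # div-by-zero -> all ones
            if bug and imm == 7:                    # injected corner-case bug
                q ^= 1
            r[d] = q
    return r

def diff_fuzz(n=2000, seed=1):
    # Run random programs on both implementations; report first divergence.
    rng = random.Random(seed)
    for _ in range(n):
        prog = [("addi", rng.randrange(2), rng.randrange(2), rng.randrange(100))
                for _ in range(4)]
        prog.append(("divi", 0, 1, rng.randrange(1, 10)))
        ref, dut = interp(prog), interp(prog, bug=True)
        if ref != dut:
            return prog, ref, dut
    return None

found = diff_fuzz()
print("divergence found:", found is not None)
```

The appeal is exactly what the comment says: you don't need a hand-written golden model if a quorum of independent implementations can serve as the reference, though it only catches bugs where implementations disagree.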

IshKebab | 1 year ago

Companies taping out chips do have the capability to verify them. The open-source verification suites and tests amount to maybe 5% of a proper verification effort, but there are commercial test suites and models that are much better.

I think the real question is not how you verify a CPU - we know how to do that. It's how you know how well a CPU has been verified. This is all based on reputation currently.