Cool project! The dichotomy in tech communities is interesting: on one hand the 'minimalists', who love to get under the hood, work their way down to the bare metal, and understand how everything works; on the other, the opposite trend of building seemingly simple web apps that sit on top of 1,000 libraries and frameworks and pack their huge dependency chains into Docker containers distributed onto clusters (probably because the app runs so slowly on a single VM), etc. I wonder if it's two fundamentally different kinds of personalities at work.
The first person you describe has to have almost endless amounts of time and/or technical interest available to them. The second person you describe has a job to get done as quickly as possible so they can move on to the next job.
The whole reason libraries and frameworks were created is so everyone doesn't have to dig down to the bare metal to get a task completed. If my manager asked me to provision a server to run an application on and I sat down and built my own hardware from scratch and then wrote my own OS rather than clicking a single button in VMware, I'd be fired pretty quickly.
You are 100% right. This project fits 100% with my personality. I like to understand in detail what I'm working on, even if I need to "lose" time exploring things not directly linked to what I'm supposed to do. And I also like systems that are designed at the necessary level of complexity, and no more.
> I wonder if it's two fundamentally different kinds of personalities at work.
Not two different kinds of personalities, just two different kinds of work. Many people find ways to do both.
The minimalist role is better for learning and building for the art of it — when the purpose of the making is the making itself. The article is a great example of this.
The other role is when that learning needs to be applied to a further end. E.g. when shipping product, code is a "means," and pre-built libraries and layers of abstraction are leverage.
Give it a few years, and we will see the first 'compile npm to FPGA' project that still uses a thousand libraries, but does away with the CPU, the OS, the Docker container, the web server, etc.
(if it doesn't exist already; the best I can google is https://github.com/hamsternz/FPGA_Webserver, which is incomplete, misses the 'npm to' part, and seems abandoned)
I don't think it's necessarily two fundamentally different personalities (unless I have multiple...which I might...)
I grew up taking things apart, and I loved the courses where we built logic gates or modified compiler or interpreter code.
I now build things on the shoulders of giants.
But when I need to, I know I can dive to the deepest levels to debug something, or I can write or customise any part of the stack.
It sure is a mindset shift and a context switch. I consider choosing the right moment to switch approach to be one of the most important and hardest tasks of developing software systems.
I'm a bit of an odd duck because I do ASIC design for work (custom Digital Audio Codecs) and do web development for fun/side work. So I do the bare metal and the high level.
It reminds me a little of people who restore old cars: yes, there's something great about it, but no, I'm not interested in driving a Model T around myself...
There's the great nand2tetris course [1]. It teaches, step by step, how to build a computer from the simplest logic gates (described in an HDL) up to your own ALU and computer, and later on an operating system, etc.
[1] https://www.nand2tetris.org/
To quote the author (F4HDK), who designed everything from hardware to software: compiler, loader, assembler, etc.:
It is a design that came from my imagination, not from existing CPUs. It is a RISC design, without microcode, and the instructions are very low level. Therefore, the executable code is much bigger than for CISC CPUs. But the CPU core itself is very simple: in particular, instruction decoding is very, very simple. It is also slower than CISC because the CPU spends a lot of time just reading instructions from the bus (and of course there is no execution pipeline)... But it works!
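The tradeoff F4HDK describes can be sketched in a few lines. The toy machine below uses an invented, hypothetical ISA (it is not A2Z's real encoding): a fixed-field format decodes with one shift and one mask, with no microcode, but every instruction, and every memory operand, costs a separate bus read.

```python
# Toy, non-pipelined accumulator machine with a RISC-flavoured, fixed-field
# encoding (opcode in the top 4 bits, operand in the low 12). The ISA is
# invented for illustration and is NOT A2Z's real instruction set.
def run(program, steps):
    """Run `steps` instructions; return (accumulator, bus accesses used)."""
    acc, pc, cycles = 0, 0, 0
    mem = list(program) + [0] * 16              # unified program/data memory
    for _ in range(steps):
        word = mem[pc]; cycles += 1             # fetch: one bus read per instruction
        op, operand = word >> 12, word & 0xFFF  # decode: a shift and a mask, no microcode
        pc += 1
        if op == 0:                             # LDI imm    -> acc
            acc = operand
        elif op == 1:                           # ADD [addr]: a second bus read
            acc += mem[operand]; cycles += 1
        elif op == 2:                           # STA [addr]: a second bus access (write)
            mem[operand] = acc; cycles += 1
    return acc, cycles

prog = [0x0005,   # LDI 5
        0x2008,   # STA [8]
        0x1008]   # ADD [8]   -> acc == 10
acc, cycles = run(prog, 3)   # 3 instructions cost 5 bus accesses
```

Without a pipeline, the fetch of one instruction cannot overlap the memory access of the previous one, so the bus-access count is a rough cycle count; and low-level instructions mean more words per task, which is exactly the code-size cost mentioned in the quote.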
Nice Job! Don't be discouraged when people say "What a waste of time, why didn't you just use an Arduino?"
When I went to college I had a choice of majoring in Computer Science or Electrical Engineering (there weren't computer systems degrees at the time). Since I really really wanted to know how to build a computer from the ground up (I had assembled one from a kit already in high school and was hungry to know more!) I chose getting my degree in EE with a minor in CS. I don't know where you are in your studies but if you have the opportunity you might find, as I did, that this path scratches that particular itch.
There are a number of books you might track down which you would find interesting given what you now have learned about computer internals. One is "A DEC view on hardware design" which talks about the minicomputers and their architecture that DEC designed. "Introduction to Computer Systems using the PDP-11 and PASCAL" (McGraw-Hill computer science series). And "Digital Computer Design" by Kline. All are out of print but a good computer science section in a library should have them.
One of the reasons I enjoy the older books on computer design is that they assume you don't know why different paths were chosen and so they explain in more detail why one path is better than another. Modern texts often assume you have learned these best practices elsewhere and so treat these design decisions as given knowledge.
If you ever do decide to pick it up again, the two places that you might find rewarding would be automated co-processing (like DMA controllers) and complex mathematical instructions (like floating point).
Thank you very much for this encouraging comment! I finished my studies 15 years ago; I did this project as an autodidact.
I don't know if I will work again on such projects, because I have tons of other electronic topics I want to work on (mainly radio). But if I pick it up again, it will be with a brand new CPU project.
Cool project nonetheless. I built a custom CPU on an FPGA as a school project once. It was far less complicated than A2Z; IIRC I copied the instruction set from a different CPU so I could use that vendor's assembler (and subsequently its C compiler). I can recommend doing such a project (VHDL is not that hard to learn); it's an awesome learning experience!
Same here: we implemented the architecture of a CPU on an FPGA, 8-bit or 16-bit, I'm not sure. It was great for understanding the ALU, opcodes, etc. Probably the most satisfying lab in my computer engineering degree.
Cool project. I did something similar for an FPGA class in college. The prof gave us three C programs and we had to implement everything needed to make them work. It was a difficult project, but one of the most rewarding.
>"I have built this development board by myself, using wrapping technique, because I couldn’t find any board with 2MB of SRAM arranged in 16bits. I wanted SRAM, instead of DRAM, for the simplicity."
I have heard the term "wrapping" or "board wrapping" in historical references by Steve Wozniak and the original Homebrew Computer Club as well. Could someone describe what this "wrapping" process entails? Is this essentially translating the Verilog into physical wires and pins?
It's a way of cold welding wires to the pins of electronic components [1]. It has pretty much fallen out of use as computers have gotten too small for the technique, but it's nice for prototyping because it's easier to undo than solder yet more permanent than a breadboard.
[1]: https://en.wikipedia.org/wiki/Wire_wrap
I still use wire-wrapping with Veroboards quite often for my home hobby. I like this method because, above a certain frequency, it is fairly difficult (nearly impossible) to use breadboards due to electromagnetic issues. The goal is to test different configurations, rapidly try a new component, and explore new ideas quickly. On Veroboard, the wire-wrapping method enables very dense connections.
Same. We had/got to implement a keyboard controller and VGA output as well and the grading was based on our system running prof's C programs, taking input, and producing correct output. Lots of late nights, but great fun when it worked.
I used Xilinx VTPro 20 10+ years ago; I'd like to know the state of FPGA software tools today.
It really depends on your current skills. If you already know about electronics, roughly what an FPGA is, and C programming, then you can jump rapidly into FPGAs and Verilog.
One good and very condensed training course is below:
http://www.ee.ic.ac.uk/pcheung/teaching/ee2_digital/Altera%2...
Wish I'd taken the follow-on course about writing peripherals.
I was reading through the gcc source code yesterday and found the moxie architecture, which seems like quite a similar, very small project. It is from the author of libffi and includes gcc, binutils, and qemu ports.
It's probably a nice example of how to take this further and implement GNU toolchain support for something like this.
I'm afraid Linux and a C compiler are totally not feasible.
A2Z lacks many things needed for C retargeting and a Linux port:
- A2Z only manages direct addressing in hardware; there is no complex address computation. If you want to implement a data stack or recursive functions, direct addressing is not enough. You cannot retarget a C compiler with only direct addressing (and if you emulate complex addressing modes in software, it will be very, very slow).
- A2Z has no interrupt management. You cannot implement/port a pre-emptive multitasking OS (Linux) without interrupt management.
- A2Z has no memory management unit.
- A2Z's ALU is compatible with nothing.
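To make the direct-addressing point concrete, here is a minimal sketch (the instruction semantics are hypothetical, not A2Z's): with direct addressing, the target cell is a constant baked into the instruction word, but a C-style stack push must write through a stack pointer whose value only exists at run time.

```python
# Why a C data stack needs run-time address computation.
# Instruction semantics here are hypothetical, not A2Z's encoding.
MEM = [0] * 32

def store_direct(addr, value):
    """Direct addressing: `addr` is a constant baked into the instruction."""
    MEM[addr] = value

def push_indirect(sp, value):
    """Register-indirect addressing: the target cell is taken from SP."""
    MEM[sp] = value
    return sp - 1                       # stack grows downward

store_direct(0, 7)                      # fine for a global at a fixed address

sp = 31
for ret_addr in (100, 101, 102):        # e.g. three nested calls
    sp = push_indirect(sp, ret_addr)    # each push hits a DIFFERENT address
# MEM[29:32] is now [102, 101, 100]; no single fixed-address instruction
# could have produced that, which is why recursion (and most compiled C)
# needs indirect addressing in hardware, or a slow software emulation of it.
```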