A reason to use old computers that I don't see mentioned here has to do with accessibility. People in the US usually have current hardware such as the latest Mac laptop, but that is not the case in all other countries. Current hardware is a bit of a luxury that we don't fully appreciate.
I have an open source project with global users, and one person in Mexico contacted me looking for help. He was trying to create 3D visualizations of MRI brain scans and was running it on an old computer that hardly anybody in the US would consider using. Happily I had done testing on an old laptop and much performance tuning during my development. I was able to help him get his project working. It was still slow, but at least it was usable. It wouldn't have been if my code only worked on current hardware.
A couple of the web sites I maintain have a primary audience of poor, largely immigrant, people with a fifth-grade education and only rudimentary English.
The server logs show most of the connections come from people using what people on HN would consider toy or throwaway convenience store phones. The high-end is people on Windows XP.
(The sites are in the healthcare space, and if one of our clients is really so desperately poor that they can't even afford a smartphone, we'll give them either a laptop and a hotspot, or a smartphone, so they can access the web sites. We pay for their connection.)
Related to this: one of the very few good reasons to offer unencrypted HTTP is that in some parts of the world, old devices are in widespread use, and support for modern HTTPS cannot be taken for granted.
My personal rule of thumb is that my software must be usable at -O0 with address sanitizers on my desktop - so far that has meant that at -O3 it stays usable on Raspberry Pi 3-level hardware.
A few months ago I tried to make a build which targeted Ivy Bridge-level CPUs; it took no more than one day for a few users to report that it didn't work on their machines. Turns out a lot of people still rock old AMD Phenom or Q6600-era machines.
Technically, couldn't he install a very lightweight Linux distribution?
I have a few Raspberry Pi zeros and I actually enjoy coding within the limitations of said hardware, when you know you only have 500 megs of RAM on the device you have to solve problems differently
A lot of young people use hand-me-down computers, even in developed countries.
I was just playing Unreal Tournament with the homestay family's children, on WinXP. One of their friends asked "Is this like Fortnite?" and I felt like I'm getting old. I was there when UT was new! Fortnite runs on the Unreal Engine!
On that note though, it would be really great to have a new game for Windows XP.
>I think the only solution is to stop expecting every computer to be general-purpose
Why? Computers are general purpose. The software we put on computers may have specific purposes, but computers are general purpose.
As for 'computer powered appliances' plenty of those exist and the general trend does seem to be to abstract the computer away inside some kind of locked down appliance.
I hope general purpose computers never go away. They're one of the most powerful and amazing tools ever created by humans. It's really too bad more people don't seem to understand or appreciate that.
I think a lot of people get turned off from general purpose computers because they are using proprietary operating systems and software that mitigate the "general purpose" aspect.
The most "general purpose" software most people interact with is a browser.
NB: He wrote “I think the only solution is to stop expecting *every* computer to be general-purpose” (my emphasis). He didn’t write “I think the only solution is to stop expecting computers to be general-purpose”.
On the other hand, I don't need my fridge to be a general purpose computer when people start making them smarter. Embedded software gains a lot of robustness from being separate from general purpose computing software pipelines.
If my toaster starts running node.js and needs internet connectivity I may go find my own shark to jump.
At my institution, our students take a series of courses on programming a simple microcontroller (and were doing so long before IoT/Arduino made that fashionable again). We worked with the HC11 until a few years ago when we moved to the 9s12. They even worked for a while in Assembly Language until quite recently (we now use C exclusively). In this case it wasn’t nostalgia or joy or anything subtle: modern computers are too complex to permit a useful mental model of how they operate. These ‘older’ systems (and their modern simple cousins) are a fantastic way to learn how a computer actually works with sufficient insight that it gives you a much deeper feel for how more complex descendants work. As one example, pointers and indirection are always a topic that students learning programming struggle with. Explaining that topic is much, much easier to a roomful of people who’ve worked directly with address registers and offsets.
My father believed strongly in this. I first learned to program in my early teens (at the time there were precisely two computers in a 300 km range of where we lived, my father was an operator on one of them). The ‘computer’ I learned on was made of cardboard, and I was the CPU: https://en.wikipedia.org/wiki/CARDboard_Illustrative_Aid_to_...
> imagine if spreadsheet programs like Microsoft Excel stopped being developed and eventually just disappeared – that’s the level of significance that HyperCard had.
I often hear similar claims about the significance of HyperCard.
But if HyperCard was so significant to so many people, wouldn’t it have been ported and/or rewritten over the years to still be available today? Even if not by Apple, then by someone else?
That’s happened to Excel and other programs. So why not HyperCard? (Serious question)
HyperCard is like Concorde: it was replaced by less powerful alternatives (eg PowerPoint) that took away any hope of mass success and left the remaining audience too small to be viable.
That's why later successors like LiveCode have to aim themselves at niches of the original HyperCard audience, like those who want an easy dev tool. Which is nice, but it misses the tool-for-everyone dream of the original.
In some cases it has to do with the operating system cooperating or not. Modern operating systems don't allow certain things to work due to restrictions in what apps are capable of. Consider a Smalltalk OS, where you can dynamically link objects across "apps" in interesting ways (see Alan Kay's "Tribute to Ted Nelson" where he shows a demo of this). You simply cannot do such a thing with mac/windows/linux.
In the case of Hypercard, I cannot say whether or not this is the case. It could be that Hypercard is absolutely possible today. But I wouldn't doubt it if it was somewhat unusable on modern systems due to this "cooperation" issue I mentioned. It may need buy-in from other apps and/or the host OS for it to work fully as intended.
For another example, consider emacs. Emacs effectively gives you the lisp-machine experience, but the problem is it isn't integrated with the rest of the system. You sort of have to live in an emacs bubble. With hypercard, you could surely get it running, but would you be in a bubble? Ideally you could use hypercard to script the rest of your system as well.
What we should want is something like a "card" that could "link" to a specific cell in a spreadsheet, as an example. Or a card that could open a PDF to a specific page. The more the rest of the system "plays along", the more powerful something like hypercard could be.
One of the downsides of HyperCard was that there were many XCMDs that were buggy, poorly written, and eventually unmaintained. You could create a stack and send it to people and it would hang their machine. If not immediately, then when an OS update came along: bang! Often solving that issue was too much for the average user. It languished as Apple languished, and people moved on.
HyperCard was amazing in its prime, so its place in history is assured. Many people who were not "programmers" were tricked into being programmers on a 30MHz machine! They created amazing products. It also influenced the development of Netscape and the way JavaScript treated events/actions (attaching actions to buttons, for example, as opposed to buttons sending events into an event loop).
It's also one of the finest examples of a domain-specific language empowering a normal user to do more. It's powerful but uses human concepts. Something we seem to have forgotten completely with our obsession with integrating JavaScript, Python or Lua into everything and then blaming users for not being empowered.
There is still a lot to (re)learn from these technologies.
This history is analysed in depth at http://www.loper-os.org/?p=568. In brief: in the early 1980s, home computers were designed with the primary purpose of owners programming the machines themselves. They came with BASIC interpreters and how-to-program manuals. (Examples: ZX Spectrum, Oric-1, BBC Micro.)
But in fact, what happened was that most owners just played 3rd party videogames on them, which they bought ready-made on pre-recorded media.
So late-1980s home computers mostly had much better graphics and sound for better games, and some didn't have a BASIC at all, or only limited ones (examples: Amiga, ST) and better BASICs were left to the 3rd-party market (e.g. STOS, AMOS, GFA BASIC, BlitzBASIC.)
The Mac was on the crux between these generations, with one foot on both sides. Fairly poor graphics and sound, but it did have a (limited) BASIC. It focused on delivering a radically better UI, and this briefly included a radically better end-user programming environment, HyperCard.
But that isn't where the market went, and it isn't where Steve Jobs was so focused: his focus was the UI and improving it, not user programmability.
Cynical interpretation: making it easier for owners to write their own polished, professional-looking graphical applications would potentially reduce the lucrative aftermarket for additional applications software, so Apple killed off this line of evolution.
For others who love old software and hardware I'll share two of my favorite sites, an excellent retro PC emulator, 86Box [0] and a clean and well-maintained software archive, WinWorld [1].
These two sites together have provided me hours of exploration into old hardware, BIOS screens I'd never otherwise see, and plenty of interesting software scenarios.
[0] https://github.com/86Box/86Box
[1] https://winworldpc.com
>On this blog, I write about the various computers I use and about the operating systems I use on them. Apart from Windows 7, which is relatively modern, these include Mac OS 10.6 Snow Leopard, which at this point is quite old
Completely nitpicking here, but both operating systems are the exact same age. I agree that Snow Leopard feels significantly less up-to-date than Windows 7 though, which speaks to how quickly Apple’s operating systems are obsoleted (and this isn't necessarily a bad thing).
All software is bound to keep changing forever unless people stop using it, even after it's past its "perfect" place in terms of usability and the benefits it brings to its intended audience (I don't mean perfect as in having no bugs, though that may also be the case), because we can only know that in hindsight and we have no way of measuring it objectively.
Some old Unix tools are perhaps the closest we have to that (ls, cd, tail...), but in terms of UI I can't think of anything. As the needs of users change, so does what the "perfect software" for those users looks like. However, I would think there's usually a decades-long period in which some software could stay just as it is, without there being possible improvements one could make to it.
I think it would be really interesting if we could find a good way to tell when that "perfection" is reached and then intentionally stop changing what is literally already perfect (though that will never happen in a commercial product, for obvious reasons).
At the time of writing the answer appears to be "Error establishing a database connection" which tickled me as, well, accessing my childhood 8 bit computers never involved database errors!
Sometimes I used old computers because they seemed functional enough. Not so long ago I used a computer from the late 2000s and it was quite a normal user experience on Linux (on a lightweight window manager, of course), with one small exception: the web browser. The amount of scripts and data on modern sites caused problems and often made the whole OS hang. With JS turned off, no problem.
Coincidentally, I'm writing this from my late-2009 iMac. It is already more than a decade old but I think it is a perfectly fine computer. With the latest version of Firefox every site works.
The main issue it has is that it is a bit sluggish, but I think an additional 8GB of RAM (it has 4GB) and perhaps an SSD would make it feel perfectly fine.
Sadly, Apple doesn't seem to agree, and the last version of macOS to support it is 10.13 - which itself isn't supported anymore as of December 2020 (just ~3 years after it was released, which is kinda mad IMO). Most things seem to work fine so far (most open source applications seem to support even older versions anyway), though Homebrew (which I used to install a couple of command line tools) does warn that they have no official support for it and some stuff may break (fortunately that hasn't happened).
At Brown U.'s semiconductor fab cleanroom we had a Windows 3.1 PC controlling our plasma etching machine. One day we opened up its case--not a speck of visible dust despite operating continuously since the early 90's!
32-bit isn't a problem; RAM, however, could be. I've run Debian on 32-bit Atom netbooks with 1GB of RAM without problems. Using light desktop environments such as Xfce, or smaller ones, would also allow 512MB of RAM or even less.
Years ago I successfully ran Debian + LXDE on one of those toy Win-CE Chinese laptops with just 128MB RAM. The CPU was a WM8505 clocked at a whopping 300MHz. And then there's ELKS Linux, which works on 8086 CPUs too and which I successfully ran on an industrial PC many moons ago. https://github.com/jbruchon/elks
Extremely small systems aside, it can run fine on decently equipped laptops or netbooks. Surfing the web with a full-featured browser such as Firefox, or using heavy apps such as LibreOffice, without having the system swap too much would likely require no less than 2 gigs, but if you do network maintenance using command line tools, even the smallest netbook with half a gig of RAM becomes a useful tool to keep in the bag along with bigger laptops.
Which CPU model do you have, exactly? If it's a Core 2 model, they are actually 64-bit capable (the 32-bit x86 line extended to 64 bits) and can run an x86_64 Linux without issues.
Rather than that, I'd recommend Debian or Mint with MATE if you want an easy and stable distro. Otherwise, if you're willing, go for archlinux32 to still have the benefits of the AUR.
> I recently inherited a 32-bit laptop that runs Vista, any recommendations of what version of Linux to try?
I'll have to check to be sure that it is 32-bit (the laptop is downstairs and I'm lazy), but I do my personal projects on a 2008 Asus that came with Vista and 2GB of RAM. I literally use it daily, using:
1. Emacs
2. Vim + every plugin you can think of for development
3. GCC + all the devtools for C development
4. Standard gui tools (browser, some solitaire games, dia for diagrams, etc).
One thing about old computers is how hackable the hardware is. On old PCs, things like parallel ports, serial ports, joysticks and so on are trivially easy to interface to and have great performance. Even further back, generating and measuring periods in the sub-microsecond range was just a quick bit of assembler on 4MHz Z80s back in the early '80s. Although that sort of hacking is still very possible now, it's the province of things like PICs, Arduinos and Pis, which are all relatively specialised and require far more effort to get started with than sticking a couple of wires in a parallel port and doing an outportb().
When I was an independent consultant in the mountains in northern California, there were companies running devices from their Windows 7 and Windows NT computers where the device manufacturer had gone belly up, quit, or just decided not to support their previous products. I had several successes using WINE on Linux. The companies were so grateful that in some cases I'd get free pastry, coffee, and/or sandwiches.
Here in LA some schools are in a bad way, so I put Linux on old hardware and teach the kids what's different between Linux, LibreOffice, GIMP, and other FOSS software. I freely transfer my HATE of M$Soft and MAC to these young minds.
I worked as a recording engineer for several years. At the first studio I worked at, the B room had an SSL 6000 G Series console and a Pro Tools 888 system hooked up to an Apple machine running OS 9.
This was 2008, so already old back then, but with the way it was configured plus the 888 system, it was still valuable.
I have friends who work in water treatment infrastructure, and out of necessity carry laptops with VMs for DOS, windows 3.1, etc.
Even my AD/DA converter at home is no longer supported. I use a 2010 mac mini running OSX 10.11 with it.
As long as people are using older hardware that interfaces with a computer, older OSes and machines will be useful.
I've enjoyed installing Windows 98 on old hardware; unlike Windows XP, the OS installs without network activation and runs all kinds of legacy software. But the couple of old towers I've gotten from friends' closets have lacked driver CDs. Once you reinstall Windows (perhaps because of a bad hard drive sector) it becomes impossible to get the sound card working again, or to get better than 16 colors out of the graphics card. Vendors, even Intel, no longer provide 80710E chipset drivers, and the ones for download on shareware sites don't work.
In addition to Browservice, WRP (https://github.com/tenox7/wrp) is also pretty good for very old browsers that can't do anything but load images. As long as the browser can load a webpage with image map support, you can browse any modern website on it. (I think Browservice requires some JS support on the client browser.)
I have an old Win2K box with a P3-450 with a SB-Live! soundcard which has some nice environmental effects such as reverb that I use for recording. I haven't used it much in the past few years because I need to clone the two drives, which are close to 20 years old. Rebuilding the system from scratch would be a nightmare.
As for what replaced HyperCard: the World Wide Web, because it solved distribution and vendor lock-in. PowerPoint, because slide shows were an important use case and Windows desktops were much more common in the 1990s and '00s than Macs. PowerPoint also provided much better integration with word processors and spreadsheets.
(LiveCode is the modern evolution of HyperCard, to my understanding.)
An archived copy of the article (the site was giving database errors): https://web.archive.org/web/20210319083317/http://john.ankar...
It feels like a modern Windows XP. I must admit I have not used it for much work, but the feeling of playing around with it was great.
I am pretty certain I am using this: https://www.linuxmint.com/edition.php?id=255
Once again, I might be wrong (although "pretty certain" covers that), but you can give it a try.
If you're on a machine with only an RJ11 / 56k dial-up port, you can also set up a Raspberry Pi to handle this: https://www.youtube.com/watch?v=NFUTInM7gq8
Hope this helps some retrocomputing enthusiasts!