fd111|3 years ago

It was great. Full stop.

A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)

Starting in 1986 I worked on bespoke firmware (burned into EPROMs) that ran on bespoke embedded hardware.

Some systems were written entirely in assembly language (8085, 6805) and other systems were written mostly in C (68HC11, 68000). Self taught and written entirely by one person (me).
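
For flavor, here's a minimal sketch of what one of those bare-metal C main loops might look like. (The register addresses, names, and watchdog value here are hypothetical; the real ones came off each board's schematic, not a vendor header.)

    #include <stdint.h>

    /* Hypothetical memory-mapped I/O registers */
    #define STATUS  (*(volatile uint8_t *)0x4000)
    #define OUTPUT  (*(volatile uint8_t *)0x4001)
    #define WDOG    (*(volatile uint8_t *)0x4002)

    #define RX_READY 0x01

    void main(void)
    {
        for (;;) {                    /* no OS: the loop IS the system */
            WDOG = 0x55;              /* kick the watchdog each pass */
            if (STATUS & RX_READY)    /* poll; no interrupts needed */
                OUTPUT = STATUS >> 1; /* act on the input bits */
        }
    }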

In retrospect, perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

Bugs in production were exceedingly rare. The relative simplicity of the systems was a huge factor, to be sure, but knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

Schedules were no less stringent than today; there was constant pressure to finish a product that would make or break the company's revenue for the next quarter, or so the company president/CEO repeatedly told me. :-) Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).

bombcar|3 years ago

> In retrospect, perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

This was it, even into the 90s you could reasonably "fully understand" what the machine was doing, even with something like Windows 95 and the early internet. That started to fall apart around that time and now there are so many abstraction layers that you have to choose what you specialize in.

And the fact that you couldn't just shit another software update into the update server to be slurped up by all your customers meant you had to actually test things - and you could easily explain to the bosses why testing had to be done, and done right, because the failure would cost millions in new disks being shipped around, etc. Now it's entirely expected to ship software that has significant known or unknown bugs because auto-update will fix it later.

origin_path|3 years ago

It isn't right to consider that time as a golden age of software reliability. Software wasn't less buggy back then. My clear recollection is that it was all unbelievably buggy by today's standards. However things we take for granted now like crash reporting, emailed bug reports, etc just didn't exist, so a lot of devs just never found out they'd written buggy code and couldn't do anything even if they did. Maybe it felt like the results were reliable but really you were often just in the dark about whether people were experiencing bugs at all. This is the origin of war stories like how Windows 95 would detect and effectively hot-patch SimCity to work around memory corruption bugs in it that didn't show up in Windows 3.1.

Manual testing was no replacement for automated testing even if you had huge QA teams. They could do a good job of finding new bugs and usability issues compared to the devs-only unit testing mentality we tend to have today, but they were often quite poor at preventing regressions because repeating the same things over and over was very boring, and by the time they found the issue you may have been running out of time anyway.

I did some Windows 95 programming and Win3.1 too. Maybe you could fully understand what it was doing if you worked at Microsoft. For the rest of us, these were massive black boxes with essentially zero debugging support. If anything went wrong you got either a crash, or an HRESULT error code which might be in the headers if you're lucky, but luxuries like log files, exceptions, sanity checkers, static analysis tools, useful diagnostic messages etc were just totally absent. Windows programming was (and largely still is) essentially an exercise in constantly guessing why the code you just wrote wasn't working or was just drawing the wrong thing with no visibility into the source code. HTML can be frustratingly similar in some ways - if you do something wrong you just silently get the wrong results a lot of the time. But compared to something more modern like JavaFX/Jetpack Compose it was the dark ages.
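
For a taste of what passed for error handling, here's a generic sketch of the HRESULT ritual (real Win32/COM calls, but not code from any particular project; links against ole32):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Every COM call hands back an HRESULT; all you could do was
           compare it against the codes in the headers. */
        HRESULT hr = CoInitialize(NULL);
        if (FAILED(hr)) {
            /* No log, no exception, no diagnostic message -- just a
               number like 0x80070005 to go grep the headers for. */
            fprintf(stderr, "CoInitialize failed: 0x%08lX\n", (unsigned long)hr);
            return 1;
        }
        CoUninitialize();
        return 0;
    }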

dleslie|3 years ago

This is why I drifted toward game development for most of my career. Consoles, until the penultimate (antepenultimate?) generation, ran software bare or nearly bare on the host machine.

I also spent time in integrated display controller development and such; it was all very similar.

Nowadays it feels like everything rides on top of some ugly, opaque stack.

treis|3 years ago

>This was it, even into the 90s you could reasonably "fully understand" what the machine was doing, even with something like Windows 95 and the early internet. That started to fall apart around that time and now there are so many abstraction layers that you have to choose what you specialize in.

This doesn't really track. 30 years ago computers were, more or less, the same as they are now. The only major addition has been graphics cards. Other than that we've swapped some peripherals. Don't really see how someone could "fully understand" the modem, video drivers, USB controllers, motherboard firmware, processor instruction sets, and the half dozen or so more things that went into a desktop.

samstave|3 years ago

The crazy thing to me is just how many different workflows/UI/UX you need to learn across so many platforms today. AWS, GCP, Azure - you need to learn each one so deeply in order to be "marketable", and the only way you'll learn all of them is if you happen to work at a company that happens to rely on said platform.

Then there is the low-level iLO bullshit that I've done weeks of training on for HPE, and I have been building and dealing with HPE servers since before they bought COMPAQ....

And don't even get me started on SUN and SGI... how much brain power was put into understanding those two extinct critters... fuck, even CRAY.

There is so much knowledge that has to evaporate in the name of progress....

thfuran|3 years ago

Yeah, it's definitely great but also terrible that bugs can be patched so easily now.

jerf|3 years ago

I've been trying to teach my young teenage kids about how things work, like, washing machines, cars, etc. One of the things I've learned is that it's a looooot easier to explain 20th century technology than 21st century technology.

Let me give you an example. My father was recently repairing his furnace in his camper, which is still a 20th century technology. He traced the problem to the switch that detects whether or not air is flowing, because furnaces have a safety feature such that if the air isn't flowing, it shuts the furnace off so it doesn't catch on fire. How does this switch work? Does it electronically count revolutions on a fan? Does it have two temperature sensors and then compute whether or not air is flowing by whether their delta is coming down or staying roughly the same temperature? Is it some other magical black box with integrated circuits and sensors and complexity greater than the computer I grew up with?

No. It's really simple. It's a big metal plate that sticks out into the airflow and if the air is moving, closes a switch. Have a look: https://www.walmart.com/ip/Dometic-31094-RV-Furnace-Heater-S... You can look at that thing, and as long as you have a basic understanding of electronics, and the basic understanding of physics one gets from simply living in the real world for a few years, you can see how that works.
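
If you were to transcribe that interlock into firmware (the real thing needs none; the plate and the contact are the whole implementation), it's about one line of logic. Everything here is hypothetical, just to make the point:

    #include <stdint.h>

    /* Hypothetical furnace-controller I/O. The real 20th-century part
       is a metal plate the moving air pushes against a contact. */
    extern volatile uint8_t PORT_IN, PORT_OUT;

    #define SAIL_SWITCH_CLOSED (PORT_IN & 0x01) /* plate pushed over by airflow */
    #define BURNER_BIT         0x02

    void interlock(void)
    {
        if (SAIL_SWITCH_CLOSED)
            PORT_OUT |= BURNER_BIT;   /* airflow present: burner may run */
        else
            PORT_OUT &= ~BURNER_BIT;  /* no airflow, no flame */
    }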

I'm not saying this is better than what we have now. 21st century technology exists for a reason. Sometimes it is done well, sometimes it is done poorly, sometimes it is misused and abused, it's complicated. That fan switch has some fundamental issues in its design. It's nice that they are also easy to fix, since it's so simple, but I wouldn't guarantee it's the "best" solution. All I'm saying here is that this 20th century technology is easier to understand.

My car is festooned with complicated sensors and not just one black box, but a large number of black boxes with wires hooked in doing I have no idea what. For the most part, those sensors and black boxes have made cars that drive better, last longer, are net cheaper, and generally better, despite some specific complaints we may have about them, e.g., lacking physical controls. But they are certainly harder to understand than a 20th century car.

Computers are the same way. There is a profound sense in which computers today really aren't that different than a Commodore 64, they just run much faster. There are also profound senses in which that is not true; don't overinterpret that. But ultimately these things accept inputs, turn them into numbers, add and subtract them really quickly in complicated ways, then use those numbers to make pictures so we can interpret them. But I can almost explain to my teens how that worked in the 20th century down to the electronics level. My 21st century explanation involves a lot of handwaving, and I'm pretty sure I could spend literally a full work day giving a spontaneous, off-the-cuff presentation of that classic interview question "what happens when you load a page in the web browser" as it is!

kabdib|3 years ago

It was a mix of great and awful.

I wrote tons of assembly and C, burned EPROMs, wrote documentation (nroff, natch), visited technical bookstores every week or two to see what was new (I still miss the Computer Literacy bookstore). You got printouts from a 133 column lineprinter, just like college. Some divisions had email, corporation-wide email was not yet a thing.

No source code control (the one we had at Atari was called "Mike", or you handed your floppy disk of source code to "Rob" if "Mike" was on vacation). Networking was your serial connection to the Vax down in the machine room (it had an autodial modem, usually pegged for usenet traffic and mail).

No multi-monitor systems; frankly, anything bigger than 80x25 and you were dreaming. You used Emacs if you were lucky, EDT if you weren't. The I/O system on your computer was a 5MHz or 10MHz bus, if you were one of those fortunate enough to have a personal hard drive. People still smoked inside buildings (ugh).

It got better. AppleTalk wasn't too bad (unless you broke the ring, in which case you were buying your group lunch that day). Laser printers became common. Source control systems started to become usable. ANSI C and Cfront happened, and we had compilers with more than 30 characters of significance in identifiers.

I've built a few nostalgia machines, old PDP-11s and such, and can't spend more than an hour or so in those old environments. I can't imagine writing code under those conditions again, we have it good today.

jjav|3 years ago

> No source code control

30 years ago was 1992, and we certainly had source control long before that!

In fact, Sun Teamware was introduced in 1992, so we even had distributed source control more than a decade before "git invented it".

CVS is from 1986, RCS from 1982 and SCCS is from 1972. I used all four of those at various points in history.

> No multi-monitor systems, frankly anything bigger than 80x25 and you were dreaming.

In 1993 (or might've been early 1994) I had two large monitors on my SPARCstation, probably at 1280×1024.

cylinder714|3 years ago

>(I still miss the Computer Literacy bookstore)

I used to drive over Highway 17 from Santa Cruz just to visit the Computer Literacy store on N. First Street, near the San Jose airport. (The one on the Apple campus in Cupertino was good, too.)

Now, all of them—CL, Stacy's Books, Digital Guru—gone. Thanks, everyone who browsed in stores, then bought on Amazon to save a few bucks.

strangattractor|3 years ago

Agree with the poster. Much better IMHO and more enjoyable back then.

Because of the software distribution model back then, there was a real effort to produce a quality product. These days, not so much. Users are more like beta testers now. Apps get deployed with a keystroke. The constant UI changes for apps (Zoom comes to mind) are difficult for users to keep up with.

The complexity is way way higher today. It wasn't difficult to have a complete handle on the entire system back then.

Software developers were valued more highly. The machines lacked speed and resources - it took more skill/effort to get performance from them. Not so much of an issue today.

Still a good job, but I would likely seek something different if I were starting out today.

hirvi74|3 years ago

> Still a good job, but I would likely seek something different if I were starting out today

I'm only 6 years in, and I am starting to feel this.

I went into computer science because, at some level, I knew it was something I always wanted to do. I've been fascinated with technology ever since I was a child -- how things work, why things work, etc.

While studying computer science at my average state school, I met a few others who were a lot like me. We'd always talk about some cool new technology, work on things together, etc. There was a real passion for the craft, in a sense. I felt something similar during my time studying music with my peers.

Perhaps, in some naive way, I thought the work world would be a lot like that too. And of course, this is only my experience so far, but I have found my peers to be significantly different.

People I work with do not seem to care about technology, programming, etc. They care about dollar signs, promotions, and getting things done as quickly as possible (faster != better quality). Sure, those three things are important to varying degrees, but they're not why I chose computer science, and I struggle to connect with those people. I've basically lost my passion for programming because of it (though that is not the entire reason -- burnout and whatnot have contributed significantly).

I'm by no means a savant nor would I even consider myself that talented, but I used to have a passion for programming and that made all the "trips" and "falls" while learning worth it in the end.

I tell people I feel like I deeply studied the ins and outs of photography only to take school pictures all day.

mattgreenrocks|3 years ago

> A sense of mastery and adventure permeated everything I did.

How much of that is a function of age? It is hard to separate that from the current environment.

Personally, I don't feel as inspired by the raw elements of computing like I once did, but it is probably more about me wanting a new domain to explore than something systemic. Or at least, it is healthier to believe that.

> knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

The notion of Internet Time, where you're continuously shipping, has certainly changed how we view the development process. I'd argue it is mostly harmful, even.

> perhaps the best part about it was that even the biggest systems were sufficiently unsophisticated that a single person could wrap their head around all of the hardware and all of the software.

I think this is the crux of it: more responsibility, more ownership, fewer software commoditization forces (frameworks), less emphasis on putting as many devs on one project as possible because all the incentives tilt toward more headcount.

psychphysic|3 years ago

Yes, indeed, it could be the Dunning-Kruger effect.

gautamdivgi|3 years ago

There wasn't HN, so there was no distraction to digress to every now and then.

I second this - systems were small and most people could wrap their brains around them. Constant pressure existed, and there wasn't "Google" & "SO" & other blogs to search for solutions. You had to discover things by yourself. Language and API manuals weighed quite a bit. Just moving them around the office was somewhat decent exercise.

There wasn't as much build-vs-buy discussion. If it was simple enough you just built it. I spent my days & evenings coding and my nights partying. WFH didn't exist, so if you were on call you were at work. When you were done you went home.

My experience from 25 years ago.

convolvatron|3 years ago

I actually used to do 'on call' by having a vt100 at the head of my bed and I would roll over every couple hours and check on things over a 9600 baud encrypted modem that cost several thousand dollars.

the only time I ever had to get up in the middle of the night and walk to the lab was the Morris worm. I remember being so grateful that someone brought me coffee at 7

doug_durham|3 years ago

I have one word for you: "Usenet".

chinchilla2020|3 years ago

A lot of our modern software practices have introduced layers of complexity onto systems that are very simple at a fundamental level. When you peel back the buzzword technologies you will find text streams, databases, and REST at the bottom layer.

It's a self-fulfilling cycle. Increased complexity reduces reliability and requires more headcount. Increasing headcount advances careers. More headcount and lower reliability justify the investment in more layers of complicated technologies to 'solve' the 'legacy tech' problems.

PontifexMinimus|3 years ago

> A sense of mastery and adventure permeated everything I did.

My experience too. I did embedded systems that I wrote the whole software stack for: OS, networking, device drivers, application software, etc.

> Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything.

These days programming is more about trying to understand the badly-written documentation of the libraries you're using.

Jeema101|3 years ago

I'm younger than you, but one of my hobbies is messing around with old video game systems and arcade hardware.

You're absolutely right - there's something almost magical in the elegant simplicity of those old computing systems.

UncleOxidant|3 years ago

> A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)

Are you me? ;) I feel like this all the time now. I also started in embedded dev around '86.

> Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).

I wouldn't want to give up git and various testing frameworks. Also modern IDEs like VSCode are pretty nice and I'd be hesitant to give those up (VSCode being able to ssh into a remote embedded system and edit & debug code there is really helpful, for example).

HankB99|3 years ago

And it had its downside too:

- Developing on DOS with non-networked machines. (OK, one job was on a PDP-11/23.)

- Subversion (IIRC) for version control via floppy - barely manageable for a two-person team.

- No Internet. Want to research something? Buy a book.

- Did we have free S/W? Not like today. Want to learn C/C++? Buy a compiler. I wanted to learn C++ and wound up buying OS/2 because it was bundled with IBM's C++ compiler. Cost a bit less than $300 at the time. The alternative was to spend over $500 for the C++ compiler that SCO sold for their UNIX variant.

- Want to buy a computer? My first was $1300. That got me a Heathkit H-8 (8080 with 64 KB RAM), an H19 (serial terminal that could do up to 19.2 Kbaud) and a floppy disk drive that could hold (IIRC) 92KB of data. It was reduced/on sale and included a Fortran compiler and macro-assembler. Woo!

The systems we produced were simpler, to be sure, but so were the tools. (Embedded systems here too.)

airbreather|3 years ago

Yeah, my experience was almost identical: lots of 6805, floating-point routines and bit-banging RS-232, all in much less than 2K of code memory, making functional products.
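
For anyone who never had to do it, bit-banging a serial line is less mysterious than it sounds. A sketch of the transmit side (the port, pin and delay routine are hypothetical; on a real 6805 the timing was done by counting cycles):

    #include <stdint.h>

    extern volatile uint8_t PORT;       /* hypothetical output port */
    #define TX_BIT 0x01
    #define TX_HIGH() (PORT |=  TX_BIT)
    #define TX_LOW()  (PORT &= ~TX_BIT)
    extern void bit_delay(void);        /* one bit time: ~104 us at 9600 baud */

    /* 8N1, LSB first, line idles high. */
    void tx_byte(uint8_t c)
    {
        uint8_t i;
        TX_LOW();                       /* start bit */
        bit_delay();
        for (i = 0; i < 8; i++) {       /* eight data bits, LSB first */
            if (c & 1) TX_HIGH(); else TX_LOW();
            bit_delay();
            c >>= 1;
        }
        TX_HIGH();                      /* stop bit */
        bit_delay();
    }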

Things like basketball scoreboards, or tractor spray controllers to make the application of herbicide uniform regardless of speed. Made in a small suburban factory in batches of a hundred or so, by half a dozen to a dozen "unskilled" young ladies, who were actually quite skilled.

No internet, just the odd book and magazine; the rest of it, work it out yourself.

In those days it was still acceptable, if not mandatory, to use whatever trick you could come up with to save some memory.

It didn't matter about direct readability, though we always took great pains in the comments for the non-obvious, including undocumented addressing modes and the like.

This was around the time the very first blue LEDs came out.

When the web came along, with all the frameworks etc., it just never felt right to be relying on arbitrary code that someone else wrote and whose pedigree you did not know.

Or had at least paid for, so that you had someone to hassle if it was not doing what you expected, and some sort of warranty.

But there were also a lot of closed-source libraries you paid for if you wanted to rely on someone else's code and needed to save time or do something special - an awful lot compared to today.

Microsoft C was something like $3000 (maybe $5k, can't remember exactly) at a time when that would buy a decent second-hand car and a young engineer might be getting $20-25k a year tops (AUD).

Turbo C was a total breakthrough, and the 286 was the PC of choice, with a 20MB hard drive, with the Compaq 386-20 just around the corner.

Still, I wouldn't go back when I look at my current 11th-gen Intel CPU with 32GB of RAM, 2 x 1TB SSDs and a 1080Ti graphics card with multiple 55-inch 4K monitors, not even dreamable at the time.

convolvatron|3 years ago

don't forget the community. it was very much the case that you could look at an IETF draft or random academic paper and mail the authors and they would almost certainly be tickled that someone cared, consider your input, and write you back.

just imagine an internet pre-immigration-lawyer where the only mail you ever got was from authentic individuals, and there were no advertisements anywhere.

the only thing that was strictly worse was that machines were really expensive. it wasn't at all common to be self-funded

deathanatos|3 years ago

> knowing that a bug meant burning new EPROMs made you think twice or thrice before you declared something "done".

> Schedules were no less stringent than today;

So … how did that work, then? I know things aren't done, and almost certainly have bugs, but it's that stringent schedule and the ever-present PM attitude of "is it hobbling along? Good enough, push it, next task", never connecting the dots to "why is prod always on fire?", that causes the never-ending stream of bugs.

ipaddr|3 years ago

With no PMs you dealt directly with the boss and managed your own tasks, so you had a hard deadline and showed demos, and once it was done, support/training. It was waterfall, so not finishing on time meant removing features, and finishing early meant adding additional features if you had time. Everything was prod. You needed to fix showstopper bugs/crashes, but bugs could be harmless (spelling, for example) or situational and complex, or showstoppers. You lived with them because bugs were part of the OS or programming language or memory driver experience at the time.

HeyLaughingBoy|3 years ago

As my old boss once said (about 30 years ago actually!) when complaining about some product or the other "this happens because somewhere, an engineer said, 'fuck it, it's good enough to ship'."

jokethrowaway|3 years ago

I wonder how much of this is due to getting old vs actual complexity.

When I started I was literally memorising the language of the day and I definitely mastered it. Code was flowing on the screen without interruption.

Nowadays I just get stuff done; I know the concepts are similar, I just need to find the specifics and I'm off to implement. It's more akin to a broken faucet and it definitely affects my perception of modern development.

Rediscover|3 years ago

Thanks. I'd forgotten how much the 68705 twisted my mind.

And how much I love the 68HC11 - especially the 68HC811E2FN, gotta get those extra pins and storage! I never have seen the G or K (?) variant IRL (16K/24K EPROM respectively and 1MB address space on the latter). Between the 68HC11 and the 65C816, gads I love all the addressing modes.

Being able to bum the code using zero-page or indirectly indexed or indexed indirectly... Slightly more fun than nethack.
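
For anyone who never wrote 65xx assembly, here's a rough C analogy for those two modes (just an illustration of the semantics, not what the silicon does):

    #include <stdint.h>
    #include <stdio.h>

    /* "Zero page" modeled as a little table of pointers. */
    static uint8_t buf_a[4] = { 10, 11, 12, 13 };
    static uint8_t buf_b[4] = { 20, 21, 22, 23 };
    static uint8_t *zp[2]   = { buf_a, buf_b };

    int main(void)
    {
        uint8_t x = 1, y = 2;

        /* LDA (zp),Y -- indirectly indexed: fetch the pointer stored
           at zp, THEN add Y. The workhorse for walking a buffer. */
        uint8_t a = zp[0][y];           /* buf_a[2] == 12 */

        /* LDA (zp,X) -- indexed indirect: add X to the zero-page
           address FIRST, then load through the selected pointer. */
        uint8_t b = *zp[x];             /* buf_b[0] == 20 */

        printf("(zp),Y -> %u, (zp,X) -> %u\n", a, b);
        return 0;
    }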

fgatti|3 years ago

https://en.wikipedia.org/wiki/Rosy_retrospection

I'm sure everything was great back then, but I've been coding for 20 years, and a lot of problems of different types (including recurring bugs) have been solved with better tooling, frameworks and tech overall. I don't miss it too much.

chkaloon|3 years ago

Exactly my experience coming out of school in 1986. Only for me it was microcontrollers (Intel 8096 family).

Thanks for bringing back some great memories!

mech422|3 years ago

I miss everything being a 'new challenge'... Outside of accounting systems, pretty much everything was new ground, greenfield, and usually fairly interesting :-)