top | item 30413629

Ask HN: What was better in the past in tech?

74 points | rixed | 4 years ago | reply

Not that progress should be denied entirely, but I often think the past gets discarded too quickly, and many good ideas get thrown out with the bathwater. I'm certainly not the only one to feel that way.

So I'm wondering: what good ideas that have since disappeared can other HNers remember?

I'll start:

In the golden age of Sun workstations, the BIOS was written in Forth, and the ROM contained a Forth interpreter. Not only was every expansion card's ROM interpreted, and therefore architecture-independent, but you were also given a Forth REPL to tinker with the boot process, or in fact at any later point once the system had started, via a special key combination.

That was, in my opinion, way ahead of modern BIOSes, even taking OpenBIOS into account.
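(For anyone who never met one of these machines, a toy illustration of the idea, in Python rather than Forth since this is just a sketch: a minimal Forth-style stack interpreter, the kind of thing the ROM embedded so that expansion-card drivers could run on any CPU architecture.)

```python
# Toy Forth-style evaluator: tokens are either integers (pushed on the
# stack) or words (functions that manipulate the stack). Real Open
# Firmware is far richer; this only illustrates the REPL/stack model.
def forth_eval(source, stack=None):
    stack = [] if stack is None else stack
    words = {
        "+": lambda s: s.append(s.pop() + s.pop()),
        "*": lambda s: s.append(s.pop() * s.pop()),
        "dup": lambda s: s.append(s[-1]),   # duplicate top of stack
        ".": lambda s: print(s.pop()),      # pop and print
    }
    for token in source.split():
        if token in words:
            words[token](stack)
        else:
            stack.append(int(token))
    return stack

forth_eval("3 4 + dup * .")   # prints 49, i.e. (3+4) squared
```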

Your turn?

138 comments

[+] PostOnce|4 years ago|reply
Programs were sold, not subscribed to. They continued to work long after the company folded.

Being a mischievous teenage hacker didn't land you in prison.

Software makers gave a shit about how much RAM and how many CPU cycles they were using. Disk space was sacred.

The technocrats weren't always a given. At least hacker-hippies that resented corporate control gave us an alternative; we could be living in a completely proprietary world. Compilers and even languages used to cost money, I shudder now when I see a proprietary language.

Once upon a time, computers did what you told them to without reporting you to the Stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Software used to not hide all the options to "protect me from myself".

Computers used to be bigger. I love my office computer, but you gotta admit, fridge size and even room size 1401 style computers are pretty damn cool. I'm planning to buy a fiberglass cooling tower for a big-ish computer for a project this summer...

There used to be killer apps and amazing innovations, but now it's just ads and single-function SaaS leases. At least open source projects are still incredible. There are a few amazing commercial software products, though. It's the future, after all.

That's enough grumpy ranting for the minute; I'm sure I'll have to append to this.

(P.S. remember when computers didn't have an out-of-band management system doing God knows what in the background?)

[+] easywood|4 years ago|reply
> Software used to not hide all the options to "protect me from myself".

I agree this is not for us power users. However, ten years ago I was the free PC support for extended family and distant friends, constantly removing five toolbars from IE and all sorts of adware, spyware, pre-installed crap... Now it's been years since I've had to fix anything. So at least this "protecting people from themselves" is really working out great.
[+] xtracto|4 years ago|reply
>Being a mischievous teenage hacker didn't land you in prison.

On this note: hacking, cracking and phreaking were done purely for curiosity's sake. Even virus developers did it just because they could.

There was a sense in the community of pushing the technology to its limits.

Nowadays hacking, cracking and all the 'black arts' are for pure profit, be it private or government...

[+] shantara|4 years ago|reply
>Once upon a time, computers did what you told them to without reporting you to the Stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Remember when you could note the number of sent/received packets in your network connection properties before going to bed, and wake up in the morning and see exactly the same number?
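(On Linux you can still do that check, just from the command line instead of a dialog box. A sketch that parses the per-interface counters out of /proc/net/dev; the column positions follow the procfs layout:)

```python
# Read per-interface packet counters from a /proc/net/dev-style file.
# In /proc/net/dev, each interface line is "iface: <rx fields> <tx fields>";
# after splitting, field 1 is rx packets and field 9 is tx packets.
def read_counters(path="/proc/net/dev"):
    counters = {}
    with open(path) as f:
        for line in f.readlines()[2:]:       # skip the two header lines
            iface, data = line.split(":", 1)
            fields = data.split()
            counters[iface.strip()] = {
                "rx_packets": int(fields[1]),
                "tx_packets": int(fields[9]),
            }
    return counters

# e.g. on Linux: print(read_counters()["lo"])
```

Run it before bed and again in the morning, and diff the numbers.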

[+] atmosx|4 years ago|reply
> Once upon a time, computers did what you told them to without reporting you to the Stasi, Google, or a number of marketing firms. Now that kind of freedom is obscure and hard to access for most people.

Varoufakis answers the question here[^1]. Use subtitles; the translation is acceptable.

Yanis pointed out that any new "successful commons" (e.g. building societies) gets privatized. The most widely known example he points to is the internet.

He also points out that this is nothing new; it happened in the 19th century as well.

[^1]: https://youtu.be/JfGgRf0JPr8?t=1761

[+] Fnoord|4 years ago|reply
The first computer that was mine (completely mine) was a Game Boy. Like, the original, classic Game Boy, whatever its name is. We also had an 80286 PC, and compared to that it was 'small'. This was back in 1991 or so. I also had a 'smartwatch' that had a Nintendo game (one. Yes, one game): Tetris. You also had Palm computers back then. Everyone has a 'small' portable/mobile computer with networking these days, but back then portable/mobile computers existed as well. They just didn't have networking, were very limited, etc. Piracy existed then as well. Back in the day you could study a chip by putting it under a microscope; the Soviet Union made VAX clones that way. Today, with EUV, I don't think that's possible anymore.
[+] pjc50|4 years ago|reply
> Being a mischievous teenage hacker didn't land you in prison.

This is describing a time before many readers of HN were alive, let alone programming. The Hacker Crackdown (https://en.wikipedia.org/wiki/The_Hacker_Crackdown) documents when the US started cracking down in the 80s.

The UK had a similar incident in which a Prestel account belonging to Prince Philip was hacked in the late 80s, resulting in the Computer Misuse Act 1990.

[+] kstenerud|4 years ago|reply
And the downsides:

* Getting information and help was difficult. At best there was c.l.c and such if you had a modem, but there was so much gatekeeping going on that it was hard to get a straight answer on anything.

* Source code was hard to come by. Everyone was so damn bent on keeping their precious code secret that you could only learn best practices if you happened to be in a job with good leadership.

* Hobbyist embedded systems were all but impossible unless you rolled your own. Working on an embedded platform meant using crappy tools, expensive and clunky in-circuit-emulators, and proprietary toolchains. Otherwise it was time for a homebrew etching tank.

* Storage, backups, and code versioning were a problem. Sure, we had CVS and eventually SVN and SourceSafe, but man did they suck!

* Hardware was super expensive. Software was super expensive. Getting anything done on a tight budget required a lot of creative thinking.

* Spending all day squinting at a small monitor sucked.

* Software (especially development environments) was chock full of sharp edges and required arcane knowledge and incantations. They were called UNIX wizards for a reason.

* Data communications and interchange formats were TERRIBLE (and always proprietary).

* Multilingual support was an exercise in madness.

* Bash was one of the nastiest undead languages ever invented. Oh wait, it still is...

* "If it was hard to write, it should be hard to read and understand" was the mantra of the day.

[+] dazzawazza|4 years ago|reply
Regarding "getting information was difficult":

I've worked in game development for 25 years, so YMMV: a few years ago I had an internet outage for a week or so and had to work on the engine using just my library of books. It was definitely slower and more cumbersome, BUT it was a much more rewarding experience. I produced substantially better ideas with better documentation and with MUCH better execution. I don't think it was just because of the quality of my library; I think it was because I was afforded the space to think, something that had not happened for quite some time.

I now make a point of consulting my library before blogs, forums and the pitiful Stack Overflow. Slower learning has led to deeper thinking.

[+] urthor|4 years ago|reply
The sheer insanity that is shell scripts is something I don't see talked about often enough.

Bash is absolutely wonderful as a CLI driver. It's truly awful as a scripting language.

And yet I see many, many projects with huge, enormous shell scripts. It beggars belief.

[+] jes|4 years ago|reply
> and clunky in-circuit-emulators…

I wrote software for in-circuit emulators from Applied Microsystems back in the late 80s and early 90s. I was fortunate to be able to work with some really talented people.

Our high-end ICEs were expensive, maybe $40K at the time. I can think of at least one company that bought 120 of them from us, although most of our customers only bought a few.

A lot of NRE went into building such systems. And yes, I think they probably were clunky yet capable of some amazing things when it came to debugging embedded systems.

[+] caeril|4 years ago|reply
> Getting information and help was difficult

As time goes on, I don't see this as the benefit I used to. When Google supplanted AltaVista as my go-to engine for technical docs, and then when Stack Overflow came on the scene, I initially thought these were great developments. But I was wrong.

Yes, it took much longer to both obtain and read the Intel/Microsoft/VESA/Cisco/IETF/etc documentation, but at the end of that process I always had a very solid understanding of what I was doing, why I was doing it, and where to look if X or Y goes wrong. Engineering coworkers and managers understood the process, understood the time delays involved, and accepted what were typical turnaround times for both programming and debugging iterations.

Nowadays? I solve the problem much more quickly, but within a range of zero to very little understanding of why it works, what is above and below me in the abstraction hierarchy, what the hardware is actually doing, what the performance tradeoffs are, etc. And then to speak to engineering culture, if you ever step out to do it the old-fashioned way, impatience often sets in and you're told to just Google it, bro.

If something goes horribly wrong, would you prefer your engineering team to have developed a deep understanding of what's actually going on under the hood, or would you rather have a pile of stackoverflow answers patched together?

I think the old way was better. My brain is dumber and lazier than before, and I'm uncomfortable with this feeling. It's just sad.

[+] Fnoord|4 years ago|reply
Yeah, documentation required effort. There was good documentation, but you had to know where to find it. You had to network for it, too.

Software-wise, everything was enabled by default. You had to disable services. Which meant your machine was remotely exploitable right after install.

[+] Dracophoenix|4 years ago|reply
>* Getting information and help was difficult. At best there was c.l.c and such if you had a modem, but there was so much gatekeeping going on that it was hard to get a straight answer on anything.

Just to clarify, by c.l.c, you mean comp.lang.c on Usenet, right?

[+] ASalazarMX|4 years ago|reply
> * Spending all day squinting at a small monitor sucked.

To be fair, the pixels were bigger back then.

[+] vatotemking|4 years ago|reply
Rapid Application Development, along the lines of classic Visual Basic and Borland Delphi. I remember being a college student, when we could create GUI apps with relative ease.

Today, even Electron is reserved for the experienced web dev. See roadmap.sh.

I believe we lost a lot when we transitioned from desktop apps to web and mobile apps and JavaScript won.

Even ActionScript was way ahead of JS in its time!

[+] tjansen|4 years ago|reply
Low-Code tools like Retool allow you to do this. They have the same limitations though (like inflexible layouts that don't adapt to different screen sizes), which hurts especially on mobile.
[+] ihateolives|4 years ago|reply
If it weren't for VB and Delphi, I wouldn't have kept my interest in programming. Mucking around with C on the command line took my teenage self only so far before I lost interest.

If I had free time and enough dedication, I'd re-learn Pascal just to write desktop apps with Lazarus. Alas, that will probably never happen.

[+] chakkepolja|4 years ago|reply
> Today, even electron is reserved for the experienced web dev. See roadmap.sh

I am a recent graduate, but I think UI development has always been hard, whether mobile, web or desktop. The web seems to have had the lowest barrier to _entry_ (because of HTML and JS, I think).

Classmates of mine who tried Flutter tell me it's so much easier than doing the same thing on the web. Back then it was UI builders; UI-as-code is the new trend. There are some exciting directions now too (Flutter, Jetpack Compose, Svelte with its first-class reactivity).

[+] i_dont_know_|4 years ago|reply
Now to put on my grumpy pants...

Programming nowadays feels more like an exercise in importing other people's code correctly. I feel like it's mostly writing something that takes data format X, converts it to format Y for library Z and then 'just works'[sic]. The 'heavy lifting' is usually not happening in your code, but in one of the libraries/full-on programs being pulled in, and your code just happens to be calling it with the right parameters.

This isn't all bad, in that it allows ideas to be tested and products to be created in a testable (and sometimes even shippable) form in a ridiculously fast amount of time, but then you're also far more prone to discover some library in your stack had a breaking change a few versions ago and you suddenly don't know when your upgrade will be delivered because you don't know if it's just a 'legacy=true' parameter that needs to be passed in or they redid their core somehow.

Or you try to profile your code only to find that 97% of the execution is in a single 'load_thing_from_internet()' call, and you have no idea whether you want to fork and maintain a branch of that thing, switch it out for something else, or try to write your own. And you probably have dozens of these you don't even know about, because the libraries you import are doing just the same thing.

I think this whole process makes for sloppy, difficult-to-understand, and slightly scary applications -- and this is basically all applications running today.
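(The profiling scenario above is easy to reproduce with Python's cProfile. Here `load_thing_from_internet` is a hypothetical stand-in for a slow third-party call, simulated with a sleep:)

```python
# Profile a function whose runtime is dominated by one opaque call,
# then print the top entries sorted by cumulative time.
import cProfile
import io
import pstats
import time

def load_thing_from_internet():
    time.sleep(0.2)          # stands in for network-bound library work

def my_code():
    total = sum(i * i for i in range(1000))   # the "real" work is tiny
    load_thing_from_internet()
    return total

profiler = cProfile.Profile()
profiler.enable()
my_code()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())     # the sleep-backed call dominates the table
```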

[+] onion2k|4 years ago|reply
The web, circa 2001.

It was easy enough to make a website that anyone could. There were services to build a page (Geocities, Angelfire), and if you wanted a bit more control you could host something on a shared server as simply as FTP'ing some HTML files to a remote directory. Expectations were low. People rarely criticised. That meant some truly wacky and creative things got built. Taking payments online was relatively hard work, which meant no one really expected to make much money online. Even ads were only just getting started, and most people didn't bother. People made fan sites for things they were passionate about, just to say they had a website. It was never a "side hustle"; it was just a hobby. That was nice.

It was also the era of Flash, which led to some brilliant and creative sites.

Languages like Perl and PHP were taking hold of server-side generation, so real SaaS businesses were starting to take shape as well.

I miss it a little. I have no doubts that the web of today is better, especially given the fact it's the basis of my 25 year (so far) career, but there are definitely aspects of it that I'd bring back if I could. The web should be more fun.

[+] AussieWog93|4 years ago|reply
>It was easy enough to make a website that anyone could. There were services to build a page (Geocities, Angelfire), and if you wanted a bit more control you could host something on a shared server as simply as FTP'ing some HTML files to a remote directory.

Just for what it's worth, this web still exists! Free hosting sites are still around, as is (S)FTP and plain ol' HTML.

Just a couple of weeks ago, I wrote a python script that generates + uploads webpages to help my wife with her work using nothing more than print() statements and an SFTP command.
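(For the curious, a minimal sketch of such a script. The page content, host, user and paths here are hypothetical placeholders, and the upload drives the stock sftp client in batch mode, assuming key-based auth:)

```python
# Generate a static HTML page print()-style, then (optionally) push it
# with the standard sftp command-line client.
import subprocess

def render_page(title, rows):
    """Return a complete HTML page as one string."""
    lines = ["<html><head><title>%s</title></head><body>" % title,
             "<h1>%s</h1><ul>" % title]
    for row in rows:
        lines.append("<li>%s</li>" % row)
    lines.append("</ul></body></html>")
    return "\n".join(lines)

def upload(local_path, remote_dir, host="example.com", user="me"):
    # Feed a one-line batch script to sftp on stdin ("-b -").
    batch = "put %s %s\n" % (local_path, remote_dir)
    subprocess.run(["sftp", "-b", "-", "%s@%s" % (user, host)],
                   input=batch, text=True, check=True)

page = render_page("Schedule", ["Mon: clinic", "Tue: paperwork"])
with open("index.html", "w") as f:
    f.write(page)
# upload("index.html", "/var/www/html")   # run when the server is reachable
```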

[+] heurisko|4 years ago|reply
I much prefer the web now to the web then.

Mainly because now you can run your own VPS cheaply and install whatever exotic tech stack you want on it.

[+] mamcx|4 years ago|reply
Similar to https://news.ycombinator.com/item?id=30414068:

FoxPro

Imagine everything those "low-code" apps claim to do, but done better, faster, and far more powerfully. And in FoxPro 2.6 for DOS.

You could do everything: forms, reports, apps, utilities, and database code (queries, stored procedures, etc.), all in the SAME language and with zero impedance mismatch.

And the Form/Report Builder rivaled what Delphi did later.

---

My dream is to resurrect the spirit of it (https://tablam.org). One thing that sets these tools apart from the "low-code" of today: they were made to run on YOUR devices, not in a "cloud" where you are at the mercy not only of overlords but of latency. This kind of tool feels way faster than modern ones because of this little thing: running locally is way faster!

So, if I could find a way to dedicate myself to it, this could be the major advantage: local first, cloud capable.

[+] ASalazarMX|4 years ago|reply
FoxPro was a fantastic improvement over dBase. It was rumored that Microsoft bought it to put it out to pasture because it was competing with MS Access. Seeing how quickly Visual FoxPro stalled, I tend to agree.
[+] einr|4 years ago|reply
Consistent graphical user interfaces drawn by the operating system with standard widgets that automatically respected user selected color and font themes.
[+] noir_lord|4 years ago|reply
100% this.

I use Linux with GTK, and if there is a choice of applications I'll try the GTK one first because of that consistency.

Which is fine as far as it goes but a lot of the proprietary apps I have to use for work I can't do much about.

IMO the classic WIMP desktop peaked with Win2000 from a UI/UX point of view (and I say that as someone who has run Linux since that era...)

[+] wolframhempel|4 years ago|reply
I'm gonna go with an emphatic NO on this one:

- Buying software, especially computer games, for Windows 95/98 was a complete gamble. Maybe one third just worked, one third required DirectX/sound card driver/graphics/BIOS fiddling to get something running (though crashes were frequent), and one third just outright never worked at all.

- Web development was an endless nightmare of IE6 compatibility, tables with eight separate PNGs around an element to create a drop shadow, clearfixes and float fixes, polyfills and fallbacks.

- SVN/Tortoise version control was constantly corrupted or in some dodgy state. Different line endings or upper/lowercase filenames could get files stuck and unrecoverable.

- So much (and I mean SOOO MUCH) money was spent on buying and maintaining the server room (no cloud, remember), so you had to buy and amateurishly maintain all this super expensive hardware, usually on a standard, off-the-shelf T1 connection to the internet.

- CD-ROMs got scratched, files got corrupted if your computer crashed during saving

- You spent eight hours a day in front of a giant, non-flatscreen cathode-ray monitor that blasted your eyes with light and radiation, making you look properly stoned when you came home.

... I could go on and on... but honestly, stuff got soo much better over time, especially in tech. Sure, there are downsides (lootboxes/microtransactions in games, expensive software subscriptions for what are essentially static products (I'm looking at you, my $59/month Adobe CC subscription), etc.), but overall, tech is much, much better today than it was 20 or 30 years ago.

Now - when it comes to social interactions and interpersonal communication, I am less sure...

[+] Fnoord|4 years ago|reply
> I'm gonna go with an emphatic NO on this one:

The point of the post is that it's two steps forward, one step back. Not that everything used to be better back in the day.

There are countless examples of why the tech world is better today. HTTPS alone makes a MITM attack more difficult, for example, but it's not a panacea.

> Buying software, especially computer games for Windows 95/98 was a complete gamble. Maybe one third just worked, one third required Direct X/Soundcard Driver/Graphic/Bios fiddling to get something running (though crashes were frequent), one third just outright never worked at all.

That's why I worked with references: I'd read a review in a magazine, or learn from a friend which hardware was stable. For example, I used a Plextor CD-RW drive that used SCSI, which led to massively fewer buffer underruns.

> You spent eight hours a day in front of a giant, non-flatscreen cathode monitor that blasted your eyes with light and radiation, making you look properly stoned when you came home.

Radiation from CRT monitors was harmful?

[+] ASalazarMX|4 years ago|reply
> - So much (and I mean SOOO MUCH) money was spent on buying and maintaining the serverroom (no cloud, remember) - so you had to buy and amateurishly maintain all this super expensive hardware - usually on a standard, off-the shelf T1 line connection to the internet.

This is not a thing of the past; migrating everything to the cloud is not advantageous for everyone. And even when it is, you are still left with (edge) servers to maintain.

[+] rixed|4 years ago|reply
I kind of agree with most of this. The question was not "don't you think the past was better", but rather "If you could revive something from the past, what would you pick?"
[+] johndoe0815|4 years ago|reply
Documentation on system-level details was available, with DEC’s servers and workstations (PDP, VAX, MIPS, Alpha) and the Motorola computer products (e.g., the MVME board series for 68/88k and PPC) as great examples.

The availability of documentation enabled the porting of Linux and BSD systems in the 1990s without wasting a lot of time on reverse engineering the hardware details from the original OS.

[+] _moof|4 years ago|reply
Human interface guidelines. We had them and we respected them and it meant every application had a consistent set of basic behavior. By comparison web app usability is anarchy, and not the good kind.

A lot of other people have mentioned that software wasn't scraping the bottom of the revenue barrel by spying on your every move but I'll say it again because I think it's so important. This has been a huge shift in how software is designed and in the incentive structures behind app development. It has pushed the user far, far down the priority ladder, and I look forward to when this period in software history is over.

Oh, and ironically, software was faster.

[+] ASalazarMX|4 years ago|reply
The worst UI pattern the modern web has normalized is UI reflow. Things can move out from under your mouse/finger, and sometimes you touch the wrong element, or a notification.
[+] 2000UltraDeluxe|4 years ago|reply
30 years ago, information technology was made with the expectation that users would tinker with it. Accordingly, most products were designed with that in mind, and the documentation was written with tinkering in mind.

Today's consumer products are made with the expectation that they should always "just work". The result is that users' reactions to problems tend to range from frustration to panic and rage, rather than an inquisitive curiosity aimed at solving the problem.

I guess you could still tinker with most products, even if some brands make it increasingly hard. The real difference is in the expectations of the users.

[+] AussieWog93|4 years ago|reply
>The result is that users' reactions to problems tend to range from frustration to panic and rage, rather than an inquisitive curiosity aimed at solving the problem.

To be fair, a lot more people now (many of whom are not tech nerds) are forced to use computers in order to do things that didn't require a computer 30 years ago.

I'm sure the average HN user's reaction would be closer to frustration than inquisitive curiosity if they were forced to navigate a complex and banal social system in order to do something like sign up for a phone plan or watch a movie.

[+] osullivj|4 years ago|reply
VAX VMS Clustering: a single CPU architecture on VAX HW meant that micros and minis could cluster, and your process could be hosted anywhere in the cluster. It's taking a long, long time to reinvent that particular wheel with a combination of microservices and containers.

Xerox PARC Alto workstation: GUI, TCP networking and Smalltalk 72 OO programming system, all in 1973! This wheel has been reinvented in part many times since by Apple, MS and others. How much real progress has there been in the last ~50 years?

[+] rixed|4 years ago|reply
Re. the VMS clustering: pardon my complete ignorance, but how did orchestration work? How was the host for a new process chosen? Could processes be migrated from host to host? And how did you connect to a given service if you didn't know where it was running?

Any link to an overview of how this worked would be greatly appreciated.

[+] smackeyacky|4 years ago|reply
I miss machines that didn't require UEFI just to boot. Phooey to you Microsoft.

I miss repairable machines. Replacing a keyboard in a modern laptop is all hidden screws, sticky tape and one time use plastic studs you have to reglue because they have to be snapped off to remove the broken keyboard. Phooey to you Acer.

Like others, I miss the whole rapid application development movement. VB6 and Delphi just rocked at CRUD apps. Now we've got nothing but dependency hell: trying to get Electron started after you upgrade some library to pick up a bug fix and it cascades into a complete rebuild, or trying to remember which dark-wizard CLI you use to add a page to Angular or React.

I miss being able to catch an exception in VisualWorks, fix the code, and hit continue.

[+] biztos|4 years ago|reply
The magic of having the TV blink on every press of the membrane keyboard while copying programs out of the manual, which had a funny smell[0]. Once my brother and I got Commodore 64s, we could see the golden age receding -- the Super Expander[1] clearly did not want us to suffer enough for our own good, and GEOS[2] embraced the blasphemy of XEROX. By the time I bought an Amiga instead of a car, my retired neighbor was already looking up from his Tandy[3] to will me off his damned lawn.

Seriously though, there have been a lot of Golden Ages already and I hope there will be a lot more.

My nostalgia for the time I started out is strong, but when I look at it objectively I see two things I think genuinely were better in the 80's, from a cultural if not an economic perspective:

1) There were many competing hardware and software ecosystems, even paradigms, and it was absolutely not obvious what we'd all be using five or ten years down the road. We live with an impoverished imagination of personal computing now.

2) People working in tech out of pure nerdy love outnumbered the people in it for the money by about 20:1. Now that's reversed, or worse.

[0]: https://www.timexsinclair.com/computers/sinclair-zx80/

[1]: https://www.c64-wiki.com/wiki/Super_Expander_64

[2]: https://en.wikipedia.org/wiki/GEOS_(8-bit_operating_system)

[3]: https://en.wikipedia.org/wiki/TRS-80

[+] dusted|4 years ago|reply
Applications worked offline, and they made good use of available system resources in exactly the way that modern "web apps" do not.
[+] DocTomoe|4 years ago|reply
The whole language standard of a Borland C++ compiler was written down in a telephone-book-sized manual that was delivered with the compiler itself. No Stackoverflow, no loading hundreds of libraries to do the most basic things. You just learned the language.
[+] yoyopa|4 years ago|reply
Back in the day, everything wasn't a means to sell ads or collect rent or promote yourself. Tech wasn't about disrupting the marketplace or the industry, but about solving problems.
[+] tagersenim|4 years ago|reply
I agree with your statement. But I do have a (genuine, absolutely not sarcastic) question: as products and services were not meant for the purposes you mentioned, how was money made? Did everything have an upfront price, and was nothing 'free'?
[+] nephrenka|4 years ago|reply
I started to code 35 years ago on a Commodore 64. The beauty of those machines was that you literally had to write code to even load a game (LOAD, RUN, etc.).

As such, the barriers to start exploring programming on your own were low: the development environment was already there, you were familiar with the interface, and it booted in an instant. I haven't seen any modern day technology replicate that ease of access for beginning coders.

[+] marcus_holmes|4 years ago|reply
I always consider myself extremely lucky to have started learning tech when I did, because the devices were so simple then. I could code in Assembler on my BBC micro with no problem, because there were only 3 registers and everything was simple.

But the beauty of that was that there was nothing more complex. There are simple devices and environments around today, but they pale in comparison to the more complex environments. You can totally code a game up in a BBC emulator, for example, but it looks shit in comparison to the Unity tutorial game.

I was forced to learn Assembler because it was the only way of writing a video game on the BBC micro. And there was nothing better out there. If I was 14 again and trying to create video games to sate my urge now, I'd be learning Unity. Even though there are simpler environments available.

[+] wink|4 years ago|reply
While I wasn't a fan of Windows 95/98 per se, there were some fun hacks where you would strip it down to only a few megabytes and run it from a RAM disk. And that was a serious feat with the RAM sizes back then.

I guess my point is that it was more easily understandable and hackable, and it wasn't a 10 GB install when all you need is SOME version of Windows to run your games.

Also zero phone home or mandatory/dark patterned Microsoft accounts.

[+] lproven|4 years ago|reply
I could give a dozen. Let me try to restrict myself...

Single function operating systems.

Cisco's PIX OS ran a firewall, and it did nothing else.

NetWare 2 & 3 shared files and printers, and nothing else.

As a result, NetWare (for instance) was relatively small and simple enough to understand completely, top to bottom, and the performance was blistering.

When you use the same handful of general purpose OSes for everything, on all hardware, the inevitable result is that they become huge to try to cover every conceivable base. Result: vast OSes which are vastly complex, so much so that no individual can totally understand the whole thing.

A functional OS for one specific task should fit into no more than a double-digit number of megabytes of code, and one human should be able to read that entire codebase top to bottom in a comparable number of weeks and understand it completely.

If it can't fit into a normal human's head, then it can't be completely debugged and optimised except by stochastic methods. That's bad.

It's normal now and everyone expects it, but it's still bad.