Realistically, what happened is that Apple increased the maximum allowed size for apps delivered from the App Store. When you don't put limits in, nobody is going to choose to prioritize smaller app size. It's a bit of a tragedy of the commons.
Short answer: combinatorial explosion of dependencies.
If you write everything yourself, you might have a few modules in your codebase that all add up to something like 15 MB. Of that, maybe 1 MB is specific to your app; the other 14 MB will be generic components for things like handling logging, HTTP, running background jobs, dealing with permissions, etc...
If you pull in a library to enable authentication with Facebook, then you drag in not just the specifics of that concept -- the 1 MB -- but also its framework components -- the other 14 MB.
There is virtually no sharing!
Add in, say, Google authentication, as well as Twitter, Microsoft, and Facebook, and now you've got 4 copies of every basic "shared" component... because they're not shared. (Not effectively enough, at any rate.)
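To make the duplication concrete, here's a small sketch (the package and directory names are entirely hypothetical) that builds a node_modules-style layout where four auth SDKs each vendor their own copy of the same generic helpers, then counts the copies:

```python
import os
import tempfile
from collections import Counter

def count_vendored_copies(root):
    """Count how many times each directory name appears anywhere
    under root; each nested copy is a separately vendored copy."""
    copies = Counter()
    for _dirpath, dirnames, _filenames in os.walk(root):
        for d in dirnames:
            copies[d] += 1
    return copies

# Hypothetical layout: four auth SDKs, each dragging in its own
# private copy of the same "shared" infrastructure.
layout = [
    "facebook-auth/logging", "facebook-auth/http",
    "google-auth/logging",   "google-auth/http",
    "twitter-auth/logging",  "twitter-auth/http",
    "ms-auth/logging",       "ms-auth/http",
]

with tempfile.TemporaryDirectory() as root:
    for path in layout:
        os.makedirs(os.path.join(root, path))
    copies = count_vendored_copies(root)
    print(copies["logging"], copies["http"])  # prints: 4 4
```

Four copies of "logging", four of "http": the sharing exists in name only.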
This is a disease of modern "package based" development practices, such as NPM, Cargo, NuGet, etc...
Every package has a dependency tree, which may or may not be shared.
Worse still, if any packages need any level of interaction, you often get shims or adapters, bloating things out further.
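The shim problem looks like this in miniature. A hedged sketch with made-up types: two SDKs that never agreed on a shared user model, so interop requires an adapter per pair of packages:

```python
from dataclasses import dataclass

# Two hypothetical SDKs that each invented their own user model:
@dataclass
class FbUser:        # what a "facebook-auth" style SDK might return
    fb_id: str
    display_name: str

@dataclass
class GoogUser:      # what a "google-auth" style SDK might expect
    sub: str
    name: str

# The glue nobody wanted to write: one adapter per pair of
# packages that need to talk, which is how shims pile up.
def fb_to_goog(u: FbUser) -> GoogUser:
    return GoogUser(sub=u.fb_id, name=u.display_name)

shimmed = fb_to_goog(FbUser(fb_id="123", display_name="Ada"))
print(shimmed)
```

With N packages that all need to interoperate, you're looking at up to N*(N-1)/2 of these adapters, each one more code shipped to the user.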
It's not unusual for a trivial app to have an entire web browser layout engine embedded in it by accident. For web apps, I've seen megabytes of JavaScript getting pulled down to the client because of the various "sharing" buttons that are static content taking up just a few hundred pixels on the screen. Each one is from a separate vendor with separate dependency trees.
One more random example: the new Windows Calculator pulls in dependencies for things such as a Windows Hello for Business login recovery helper!
Personally, I blame JavaScript and Python.
Both languages had tiny and/or bad standard libraries and hence developers were forced to develop huge frameworks of packages to compensate. Facebook did their own thing, Google did their own thing, and on and on.
Somehow developers internalised their limitations and started telling everyone that "batteries not included" standard libraries are somehow magically better, despite the glaring downsides.
Now new languages like Rust are being deliberately designed with small standard libraries, as compared to C# and Java. Their proponents will say this is a good thing right up until Rust develops Python- and JS-style cliques, where there's a "Facebook Rust" and a "Google Rust", and if you accidentally pull in the tiniest thing from both you get a duplicate of every basic concept.
My preferred environment these days is LÖVE (https://love2d.org). 5MB binary, 27MLoC of C for the Linux kernel (or more for equivalent OSs, I imagine) + a few million more for low-level libraries, graphics, et cetera, 12kLoC for Lua, 10kLoC for any app I am likely to create. It's still a lot, but it's pretty batteries-included, much less than various alternatives and feels like a nice sweet spot.
Many developers have no concept of efficient use of hardware resources (or of what hardware resources even are), figuring they can just ask for more of whatever they need and the system will take care of the rest. When your program, and everything else including the operating system, had to play nicely in a few KB or MB of RAM, you were much more careful. Why not be just as careful when you have GB of RAM?
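As a small illustration of how much the "just ask for more" habit costs: the same 100,000 integers stored carelessly versus carefully, measured in CPython (exact byte counts are implementation-specific):

```python
import sys
from array import array

n = 100_000
as_list = list(range(n))           # list of boxed Python int objects
as_array = array("i", range(n))    # packed 32-bit machine ints

# A Python list holds a pointer per element, and each element is a
# separately allocated int object; array stores raw ints contiguously.
list_bytes = sys.getsizeof(as_list) + sum(sys.getsizeof(x) for x in as_list)
array_bytes = sys.getsizeof(as_array)

print(f"list:  {list_bytes:,} bytes")
print(f"array: {array_bytes:,} bytes")
```

On a typical CPython build the boxed version costs several times what the packed one does, for identical data. Nobody notices until a thousand such choices add up.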
Extensive use of bloated multi-dependency-level libraries and frameworks when you could just write the code to do what you want to do in 1/100th of the time and space. Copy-paste coders who copy far more code than they actually need.
A lot of developers who grew up coding last century understand, as do some who came later, but an awful lot of development today is done by people who have no idea what is actually happening when they "read a file". And who think that toast requires avocados. (Sorry, that was mean.) I know some young people who get it, and many more who probably should be pursuing a psychology degree. (Oops.)
Sure, you could argue that the real coders have already done the hard work, and that's why you can just click a few buttons and end up with a working "program". But it really helps to actually understand what's happening under the hood.
Disclaimer: I was apparently so technology-poor that I wrote several quite useful programs in Apple II machine language while in college. Though I had no mouse, it was a far more pleasant experience than working without video, or persistent storage other than punched paper tape, on my high school's PDP-11.
PDP-11? Closest mine had was a minicomputer with "Word Perfect 1.0 for UNIX" and 40 orange plasma dumb-terminals. And the best commandline spellchecker I've seen.
WP's was…terse, like most (all?) word processors: three to five often-wrong suggestions. The unrelated commandline tool _appeared_ to dump the entire dictionary one instant screenful at a time, sorted by nearest match.
We didn't have the magic/more-magic switch from {ahem} a certain machine, but if you hit the lowest-leftmost key on the terminal keyboard, you got dumped to shell (and lost whatever you hadn't saved).
This was all after I was cured of "hating computers" by my dad bringing home an SE/30 (I know: sacrilege to TUI lovers). Upgraded to a IIsi with 5 MB RAM and we LIKED^H^H^H^H^Htolerated it. At least they both let me SM 0 A9F4<enter>G 0<enter> when I wanted to C-z a program (only _sometimes_ because I'd just written the offending bug).