A few years ago a link to an article was posted on HN. The idea in the article was that software can only ever increase the complexity of the things it manages, never reduce it. The idea was illustrated with the example of (possibly) the Australian tax system, which got far more complicated after the move from paper books to a digital system. Does anyone have the link saved somewhere? I desperately need this article. Thanks!
spit2wind|2 years ago
People who enjoy that may also enjoy "All Late Projects are the Same" https://web.archive.org/web/20130818024030/http://www.comput...
dang|2 years ago
Why it is important that software projects fail (2008) - https://news.ycombinator.com/item?id=24390855 - Sept 2020 (72 comments)
Why It Is Important That Software Projects Fail (2008) - https://news.ycombinator.com/item?id=20109316 - June 2019 (18 comments)
Why it is Important that Software Projects Fail - https://news.ycombinator.com/item?id=932956 - Nov 2009 (26 comments)
diurnalist|2 years ago
I find it interesting how this ties in to the "productivity paradox."[0] The idea the author seems to be getting at--that software accelerates the creation of ever more elaborate solutions (often to problems created by prior iterations of software), and in the process leaves a wake of complexity that frustrates and baffles the society it was supposed to serve--is something I'd like to read more about.
[0]: https://en.wikipedia.org/wiki/Productivity_paradox
actuallyalys|2 years ago
One idea the author suggests is that tax agencies will grow to take up roughly 1 percent of GDP, because 1 percent of the budget is negligible enough to go unchallenged. The U.S. provides an interesting contrast. In 1955, the IRS budget was 0.08 percent of GDP [0] [2]. In 2008, it was 0.093 percent of GDP [0] [1]. So it's much lower, but has also remained fairly steady between 1955 and 2008. Looking ahead, the new IRS budget is around $20 billion, which is 0.072 percent of GDP [0]. To pick another example, the Canada Revenue Agency was about 0.2 percent of GDP in 2018 [3] [4].
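The arithmetic behind shares like these is just budget over GDP; here's a throwaway sketch for playing with the numbers (the ~$27,700B GDP figure is my assumption for illustration, not taken from the sources above):

```python
def share_of_gdp(budget_billions, gdp_billions):
    """Return a budget as a percentage of GDP."""
    return 100 * budget_billions / gdp_billions

# ~$20B IRS budget against an assumed ~$27,700B GDP:
print(round(share_of_gdp(20, 27_700), 3))  # ~0.072
```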
One other assumption is that the Australian Tax Office does the same, at least when normalized to GDP. I'm not sure that's true, but investigating that would take more time.
[0] https://taxfoundation.org/blog/irs-budget-increase-technolog... [1] https://fred.stlouisfed.org/series/FYGDP [2] https://fraser.stlouisfed.org/title/budget-united-states-gov... See page 1000
tacostakohashi|2 years ago
The way I see it, there are a few different kinds of software.
An old kind is something like using a spreadsheet instead of an abacus or punch cards, using a word processor instead of a typewriter, email instead of snail mail / fax machine, etc. In this case, you're using computers/software to increase the efficiency of some external, real-world, non-computer thing, and it works pretty well, especially if you have a complicated logistical problem like running a big warehouse, airline, bank, etc.
Another kind is software that talks to other software, like a trading algorithm, exchange, a search engine or spam filter where the inputs to your software are the outputs from some other person's software. In this kind of software, there is never any permanent outcome/result, it's just a never-ending arms race where you write some software that temporarily produces better results, but then the other side figures out a more elaborate way to exploit it and get through the filter or better SEO results or whatever, and then you obfuscate some more / change again, etc.
Unfortunately, more and more software is now in the latter camp :(
woolion|2 years ago
I'm also interested in the research you mention, but I think it is a special case of this general human behaviour.
frfl|2 years ago
The task did get easier in the short term and in the long term. The only thing that changed was the task itself.
Local job markets were the only practical thing for most people, but if you found a job somewhere else, you wrote a letter. Farming took dozens (or more?) of people per farm, now it's done by a small team if not a couple -- the task isn't to farm to provide a living or feed your family, it's to supply tons and tons of foodstuffs to a global economy.
Similarly, you can now call or video chat with your immediate family when they're at the store, while you're working from home -- or even if you're in an office 20 mins away. And if you wanted to run a small farm, you can now do that, providing for your family and maybe even generating a small income (see YouTube for homestead channels).
Timpy|2 years ago
There are trade-offs for the complexity, though, and well-managed complexity can disappear behind the interface of a computer. When this is done well it feels seamless; when it's done poorly it's painful. So maybe I'm arguing that the meaning of "complexity" isn't 1:1 with the meaning of "complicated." At the end of the day, "did moving this to a computer make it better?" is the question to answer, and a lot of the time the answer is no. QR menus at restaurants are my favorite punching bag for this, but any home appliance with Bluetooth or WiFi is an easy target.
stillwithit|2 years ago
This reads like empty circumlocution in defense of programmer jobs.
My boring EE and math degrees are from another era, before all this cool software jargon captivated the world. I am not really sure what all the verbosity of the DSLs, config formats, and many programming languages really solves from an engineering perspective. Much of it feels like baggage from the era before graphical computing; iPhones don't boot to a CLI, right?
Humanity burns a lot of real resources preserving computing history, when the first computers were human mathematicians. What do a Commodore and Borland have to do with mathematics? Feels like nostalgia more than engineering.
I will keep iterating on personal computing experiments. Ye olde cranks of software lore and genius CEOs are just normal people hallucinating about their essentialness to society. Yawn. (I’m doing it too!)
The real energy vampires are not the sarcastic, but the toxic positivity crowd peddling Ponzi schemes passed on from dad and his 80s bitching Camaro crowd. Yes, yes, you did something within the constraints of physical reality. Ooh wow an expensive agency manipulating boondoggle; can I subscribe to your newsletter?
The next generation grew up on the internet. They're aware of the hustle, whereas the aging-out elders were maybe a bit less discerning, given their lack of education and experience; how were they to know it's just arithmetic and Boolean logic on memory addresses, plus semantic babble? Their special boys seemed convinced, and the elders might have been a bit more accepting of hallucinations given their religious upbringing.
I ended up in software expecting to have a career in industrial controls, but entered that field at the tail end of offshoring, never got my foot in the door as networking became harder, internet was not so socially organized back then. Couldn’t figure out where to be at the right time.
I'm fucking sick of "software" as we know it. It's elementary DSL parsers and git pull github.com/everything.git, which, given how things work with software, is great; but that this is how things work in software is ridiculous.
abnercoimbre|2 years ago
Disclosure: it was given at my tech conference.
[0] https://vimeo.com/780013486
[1] https://inkandswitch.com
sorokod|2 years ago
"Increasing Complexity" — as an E-type system evolves, its complexity increases unless work is done to maintain or reduce it
[1] https://en.m.wikipedia.org/wiki/Lehman%27s_laws_of_software_...
spit2wind|2 years ago
The article you're seeking demonstrates this one way. You can approach it another way: Amdahl's Law.
Any process that takes time T has two parts: a part which can improve and a part which cannot. Let p be the fraction of the process which can improve. Symbolically,
T = part that can improve + part that cannot improve
or
T = pT + (1-p)T
Suppose we can introduce an improvement of factor k. Then the improved process time T' is
T' = pT/k + (1-p)T
or
T' = T[p/k + (1-p)]
The overall speedup S, then, is the ratio of the original time to the improved time.
S = T/T'
or
S = 1/[p/k + (1-p)]
It's so simple to derive, I love it. Say you have a bureaucratic process and you're asked to "automate it". You can plug in the numbers and play with them to get a feel for how much overall improvement you can expect. For example, how would the overall process improve in the (unlikely) case that you provided infinite improvement :)
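A minimal sketch in Python makes the plugging-in concrete (the numbers are illustrative, not from the article):

```python
def speedup(p, k):
    """Amdahl's Law: overall speedup S = 1 / (p/k + (1 - p)), where a
    fraction p of a process is improved by a factor of k."""
    return 1 / (p / k + (1 - p))

# If only 10% of a bureaucratic process can be automated, even an
# effectively infinite improvement barely moves the needle:
print(speedup(0.10, 10))    # ~1.10x with a 10x improvement
print(speedup(0.10, 1e12))  # ~1.11x even as k -> infinity
```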
Bureaucracy is not necessarily synonymous with "composed of many, many parts," though it often is. Many parts implies that the "part which can improve" is small relative to the part which cannot. Amdahl's Law kicks in, and improving those tiny parts has minuscule effects overall. No amount of automation will have any significant effect on the size or efficiency of a bureaucracy.
However, this raises an important philosophical question: if you improve a part, do you replace it? How many parts can you replace in a process before it is no longer the same process?
lfciv|2 years ago
https://www.stilldrinking.org/programming-sucks