I would also recommend the first chapter of "Principles of Computer System Design: An Introduction" by Saltzer & Kaashoek for a more general discussion of complexity in digital systems.
Good to know that there are books addressing this as well. I have the impression that the topic of complexity is not discussed enough, either in academia or in the private sector.
We have to delineate between the entropy of the environment and the entropy within the system. As for adding entropy, every action has effects that can simultaneously decrease entropy in one variable while increasing it in another.
Any system's future state is not entirely predictable; when making decisions, we choose actions and weigh their impact on the current entropy.
Better to swap "lives" with "money" here.
1) the author thinks there are obvious ways to decrease the entropy of the coin flipping example and expects these to be so obvious to the reader that they don't need enumerating to exemplify the approach he has in mind
or
2) the author is pointing out that in the coin flipping example the entropy is already so close to zero as to render efforts to reduce it absurd, and believes that this is so self-evident as to need no explanation
In other words, the entropy of this article is ~ln(2). I suspect some energy could be usefully devoted to reducing it.
In my opinion, the author aims to introduce entropy as a metric for evaluating the stability of processes, creating room to argue against process changes when new additions are likely to increase entropy by introducing new outcomes.
Example: engineering teams can be obsessed with introducing new variables into a sufficiently stable system with the intention of improving stability, but end up reducing stability because of inaccurate impressions of how stable the new variables are.
This approach creates a bias against change, but in many situations that bias is helpful: it gives engineering teams time to observe the process's outcome distribution over a longer period, improving the data behind any process-change decision.
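As a rough sketch of that point (the probabilities below are made up for illustration, not taken from the article), in this toy comparison the Shannon entropy in nats goes up as soon as a change adds one more possible outcome, even a rare one:

```python
from math import log

before = [0.99, 0.01]        # success, one known failure mode
after  = [0.97, 0.01, 0.02]  # the same process after a change adds a new failure mode

s_before = -sum(p * log(p) for p in before)  # ~0.056 nats
s_after  = -sum(p * log(p) for p in after)   # ~0.154 nats

print(s_before, s_after)  # entropy increased with the extra outcome
```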
The coin flipping example ideally would have S = ln(2), with the two outcomes being:
* heads, no issues
* tails, no issues
The article says this:
"The way to lower that maximum is to bring N to a number as close as possible to one. In addition to reducing the number of possible outcomes, we can further reduce entropy in any given process by reducing the probability of every undesired outcome,[...]"
which applies to a reproducible process. In the case of coin flipping, the desired reproducibility refers to having no issues, so N should be as close as possible to 2.
(the macrostate vs microstate distinction could be introduced but it would complicate the argument)
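To put rough numbers on both points (my own sketch; the non-coin probabilities are assumed, not from the article): the fair coin with two acceptable outcomes sits at S = ln(2), while a process with an undesired outcome sees its entropy fall as that outcome is made rarer, which is the reduction the quoted passage describes:

```python
from math import log

def S(probs):
    # Shannon entropy in nats: S = -sum(p * ln p) over outcomes with p > 0
    return -sum(p * log(p) for p in probs if p > 0)

print(S([0.5, 0.5]))    # ln(2) ~ 0.693: fair coin, both outcomes are "no issues"
print(S([0.9, 0.1]))    # ~0.325: a 10% undesired outcome
print(S([0.99, 0.01]))  # ~0.056: making the undesired outcome rarer lowers entropy
```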
It's interesting joining a new startup and seeing how clean and organized it is, then joining a mature startup and seeing how disorganized it is. There are layers upon layers of interpretation of what the systems should look like, and they are all different. We often refer to this as tech debt, but there is always that one box no one wants to touch because of its age and potential importance, kind of like a time capsule from when the company was once clean and orderly. I've seen so much entropy.
But to the point of the article, entropy only arises when external factors are introduced, like new talent or tech paradigms changing the landscape of a startup.