cjcole | 4 years ago
Here's the tricky part. It seems to me that the metrics you don't choose to bundle into the aggregate measurement will get annihilated (optimized away) in favor of those you do bundle. And the decisions about which things to bundle, and with what weight, represent ethical, moral, spiritual, aesthetic, and otherwise intangible judgments which are (unsurprisingly) difficult to quantify or even reach basic agreement on. He weasels a bit by using relatively uncontroversial examples ("more trees", "leisure") and by handwaving ("authorize": how? "sort of agency": what sort?), but it's trivial to imagine metrics that produce wild disagreement about the magnitude, or even the sign, of the weight to be applied.
(Shadows of paperclip maximization loom.)
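The bundling problem can be made concrete with a toy objective. This is just an illustrative sketch (the metric names and weights are made up, not from Hanson): anything left out of the bundle effectively gets weight zero, so an optimizer will happily trade it away to gain on the metrics that are counted.

```python
# Toy "aggregate welfare" objective: a weighted sum over whichever
# metrics the designers chose to bundle. All names/weights hypothetical.
def aggregate(metrics, weights):
    """Score a policy outcome; unbundled metrics get weight 0.0."""
    return sum(weights.get(name, 0.0) * value
               for name, value in metrics.items())

# "leisure" was left out of the bundle.
weights = {"gdp_growth": 1.0, "tree_cover": 0.5}

# Policy B sacrifices all leisure for a bit more GDP growth...
policy_a = {"gdp_growth": 2.0, "tree_cover": 1.0, "leisure": 5.0}
policy_b = {"gdp_growth": 3.0, "tree_cover": 1.0, "leisure": 0.0}

# ...and scores strictly higher, so a market optimizing this
# aggregate selects B: the unbundled metric is annihilated.
assert aggregate(policy_b, weights) > aggregate(policy_a, weights)
```

The sign-disagreement point falls out of the same sketch: two factions proposing `+0.5` and `-0.5` for the same metric are optimizing toward opposite worlds.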
At that point I'm not sure we've significantly improved things, since we're still left with the problem of choosing the people who decide which metrics go into the aggregate measure, and at what weight. Can we make a market on that, too? Then we're in some crazy recursion and I get lost. I'm deeply skeptical.
SlapperKoala | 4 years ago
The big issue I feel he underrates is the massive incentive to take control of the agency that's supposed to be doing the objective assessments. Its members effectively become the most powerful people in the country. Even if they start off as saints, the political incentive will be to find ways to influence them, and once that agency is captured you have a dictatorship in all but name.
I feel this reflects a common problem in political theorising: coming up with an ideal institutional structure without thinking about the incentives around it and how it needs to be sustained.
Historical analogues would be how originally non-partisan district-drawing processes become politicised, the politicisation of science and medicine, or Soviet and Chinese GDP figures. The simplest solution is always to just rig the game.
blfr | 4 years ago
cjcole | 4 years ago
It seems to me that many of these "rationalize all the things" schemes bottom out in, and build on top of, an "incorruptible kernel of truth" which, if corrupted, causes the whole thing to collapse (or worse: to keep a veneer of impartial truth while being secretly corrupt). It's a sort of microkernel approach to government.
With Hanson's futarchy, it's the hypothetical mechanism for "voting on values" in order to choose what does and doesn't go in the almighty bundle of metrics (and who gets to measure the result: they have the real power). If that can be corrupted then the whole thing falls apart. Yarvin's "neocameralism" has a "cryptographic decision and command chain" which everything else rides on (https://www.unqualified-reservations.org/2008/05/ol6-lost-th...).
This "all the eggs in one basket" approach seems fragile and ripe for subversion, especially when fallible humans administer it. It seems more resilient to avoid concentrations of this kind of power.
pphysch | 4 years ago