top | item 44317936

samuel | 8 months ago

I'm currently reading Yudkowsky's "Rationality: From AI to Zombies". It's not my first attempt: since the book is just a collection of blog posts, I found it a bit hard to get through due to its repetitiveness, and I gave up after the first 50 "chapters" the first time I tried. Now I'm enjoying it way more, probably because I'm more interested in the topic.

For those who haven't delved (ha!) into his work, or have been put off by its cultish appearance, I have to say that he's genuinely onto something. There are a lot of practical ideas that are pretty useful for everyday thinking ("Belief in Belief", "Emergence", "Generalizing from Fiction", etc.).

For example, I recall being in a lot of arguments that were purely "semantic" in nature. You seem to disagree about something, but really both sides aren't referring to the same phenomenon: the source of the disagreement is just using the same word for different, but related, "objects". This seems obvious, but it's the kind of thing you only realize in retrospect, and I think I'm now much better equipped to notice it in real time.

I recommend giving it a try.

Bjartr|8 months ago

Yeah, the whole community side to rationality is, at best, questionable.

But the tools of thought that the literature describes are invaluable with one very important caveat.

The moment you think something like "I am more correct than this other person because I am a rationalist" is the moment you fail as a rationalist.

It is an incredibly easy mistake to make. To use the tools effectively, you need to become more humble than you were before using them, or you just turn into an asshole who can't be reasoned with.

If you're saying "well actually, I'm right" more often than "oh wow, maybe I'm wrong", you've failed as a rationalist.

zahlman|8 months ago

> The moment you think something like "I am more correct than this other person because I am a rationalist" is the moment you fail as a rationalist.

Well said. Rationalism is about doing rationalism, not about being a rationalist.

Paul Graham was on the right track about that, though seemingly for different reasons (referring to "Keep Your Identity Small").

> If you're saying "well actually, I'm right" more often than "oh wow, maybe I'm wrong", you've failed as a rationalist.

On the other hand, success is supposed to look exactly like actually being right more often.

wannabebarista|8 months ago

This reminds me of undergrad philosophy courses. After the intro logic/critical thinking course, some students can't resist seeing affirming the consequent and post hoc fallacies everywhere (even if more are imagined than real).

the_af|8 months ago

> The moment you think something like "I am more correct than this other person because I am a rationalist" is the moment you fail as a rationalist.

It's very telling that some of them went full "false modesty" by naming sites like "LessWrong", when you just know they actually mean "MoreRight".

And in reality, it's just a bunch of "grown teenagers" posting their pet theories online and thinking themselves "big thinkers".

greener_grass|8 months ago

I think there is an arbitrage going on where STEM types who lack a background in philosophy, literature, and history are super impressed by basic ideas from those subjects being presented to them by stealth.

Not saying this is you, but these topics have been discussed for thousands of years, so it should at least be surprising if Yudkowsky were breaking new ground.

elt895|8 months ago

Are there other philosophy- or history-grounded sources that are comparable? If so, I'd love some recommendations. Yudkowsky and others have their problems, but their texts make interesting points, are relatively easy to read and understand, and you can clearly see which real issues they're addressing. In my experience, the alternatives tend to fall into two categories: 1. genuine classical philosophy, which is usually incredibly hard to read, and after 50 pages I have no idea what the author is even talking about anymore; 2. basically self-help books that take one or a few ideas and repeat them ad nauseam for 200 pages.

FeepingCreature|8 months ago

In AI finetuning, there's a theory that the model already contains the right ideas and skills, and the finetuning just raises them to prominence. Similarly, in philosophical pedagogy there's huge value in taking ideas that are correct but unintuitive, with maybe 30% buy-in, and saying "actually, this is obviously correct; also, here's an analysis of why you wouldn't believe it anyway and how you have to think to become able to believe it". That's most of what the Sequences are: they take from every field of philosophy the ideas that are actually correct, and say "okay, actually, we don't need to debate this anymore, this just seems to be the truth because so-and-so." (Though the comments section vociferously disagrees.)

And it turns out if you do this, you can discard 90% of philosophy as historical detritus. You're still taking ideas from philosophy, but which ideas matters, and how you present them matters. The massive advantage of the Sequences is they have justified and well-defended confidence where appropriate. And if you manage to pick the right answers again and again, you get a system that actually hangs together, and IMO it's to philosophy's detriment that it doesn't do this itself much more aggressively.

For instance, 60% of philosophers are compatibilists. Compatibilism is really obviously correct. "What are you complaining about, that's a majority, isn't that good?" What is wrong with those 40% though? If you're in those 40%, what arguments may convince you? Repeat to taste.

sixo|8 months ago

To the STEM-enlightened mind, the classical understanding and pedagogy of such ideas is underwhelming, vague, and riddled with language-game problems, compared to the precision of a mathematically rooted idea.

They're rederiving all this stuff not out of obstinacy, but because they prefer it. I don't really identify with rationalism per se, but I'm with them on this: the humanities are overcooked, and a humanities education tends to be a tedious slog through outmoded ideas divorced from reality.

HDThoreaun|8 months ago

Rationalism largely rejects continental philosophy in favor of a more analytic approach. Yes these ideas are not new, but they’re not really the mainstream stuff you’d see in philosophy, literature, or history studies. You’d have to seek out these classes specifically to find them.

samuel|8 months ago

I don't claim that his work is original (the AI-related work probably is, but it's only tangentially related to rationalism), but it's clearly presented and practical.

And, BTW, I could just be ignorant of a lot of these topics; I take no offense at that. Still, I think most people can learn something from an unprejudiced reading.

bnjms|8 months ago

I think you’re mostly right.

But also that it isn’t what the Yudkowsky is (was?) trying to do with it. I think he’s trying to distill useful tools which increase baseline rationality. Religions have this. It’s what the original philosophers are missing. (At least as taught, happy to hear counter examples)

turtletontine|8 months ago

> For example, I recall being in lot of arguments that are purely "semantical" in nature.

I believe this is what Wittgenstein called "language games".

throwaway314155|8 months ago

In the spirit of playing said game, I believe you can just use the word "pedantic" these days.

quickthrowman|8 months ago

Your time would probably be better spent reading his magnum opus, Harry Potter and the Methods of Rationality.

https://hpmor.com/

ramon156|8 months ago

Sounds close to Yuval Noah Harari's book Nexus, which talks about the history of information gathering.

hiAndrewQuinn|8 months ago

If you're in it just to figure out the core argument for why artificial intelligence is dangerous, please consider reading the first few chapters of Nick Bostrom's Superintelligence instead. You'll get a lot more bang for your buck that way.