shadowsun7|4 years ago
The basic idea is this:
1. Deliberate practice only works for skills with a history of good pedagogical development. If no such pedagogical development exists, you can’t do DP. Source: read Peak, or any of Ericsson’s original papers. Don’t read third-party or popsci accounts of DP.
2. Once you realise this, the next question to ask is: how can you learn effectively in a skill domain where no good pedagogical development exists? Well, it turns out a) the US military wanted answers to exactly this question, and b) a good subset of the expertise research community wondered exactly the same thing.
3. The trick is this: use cognitive task analysis to extract tacit knowledge from the heads of existing experts. These experts built their expertise through trial and error and luck, not DP. But you can extract their knowledge as a shortcut. After this, you use the extracted tacit knowledge to create a case library of simulations. Sort the simulations according to difficulty to use as training programs. Don’t bother with DP — the pedagogical development necessary for DP to be successful simply takes too long.
Broadly speaking, DP and tacit knowledge extraction represent two different takes on expertise acquisition. For an overview of this, read the Oxford Handbook of Expertise and compare against the Cambridge Handbook of Expertise. The former represents the tacit knowledge extraction approach; the latter represents the DP approach. Both are legitimate approaches, but one is more tractable when you find yourself in a domain with underdeveloped training methods (like most of the skill domains necessary for success in one’s career).
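The workflow in point 3 (extract cases from experts, then order them by difficulty into a training program) can be sketched in code. Everything below is a hypothetical illustration — the `Case` fields, the example scenarios, and the 1–5 difficulty scale are my own stand-ins, not something from the expertise literature:

```python
from dataclasses import dataclass

@dataclass
class Case:
    """One simulation extracted from an expert via cognitive task analysis.

    The fields here are illustrative guesses at what a case record
    might capture, not a schema from Hoffman et al.
    """
    scenario: str
    cues: list[str]        # what the expert attends to
    expert_action: str     # what the expert actually does
    difficulty: int        # 1 (routine) .. 5 (rare, high-stakes)

def build_training_program(library: list[Case]) -> list[Case]:
    # Order the case library by difficulty so trainees progress
    # from routine scenarios toward rare, hard ones.
    return sorted(library, key=lambda c: c.difficulty)

library = [
    Case("engine fire on taxi", ["smoke", "EGT spike"], "abort, evacuate", 4),
    Case("routine crosswind landing", ["drift", "windsock"], "crab then de-crab", 2),
    Case("dual hydraulic failure", ["pressure lights"], "divert, manual reversion", 5),
]

program = build_training_program(library)
print([c.difficulty for c in program])  # → [2, 4, 5]
```

The real work, of course, is in populating the library — the CTA interviews — not in the sort; the code only shows how thin the "curriculum design" step is once the cases exist.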
Hermitian909|4 years ago
https://3starlearningexperiences.wordpress.com/tag/accelerat...
shadowsun7|4 years ago
The article you link to is essentially a response to two pages in the book, where Hoffman et al. mention, almost in passing, that CLT is a silly theory when you want to train for real-world scenarios (the intuition being that if you’re training marine fire squad commanders to plan on the battlefield, perhaps it helps to simulate shooting at them during training?). Hoffman et al. use this as an example of a learning theory that doesn’t seem to map to real-world requirements.
This reads like a disagreement over one particular dismissal in the book, perhaps because CLT is a pet theory of the article’s authors. The problem: this argument is not core to the book!
The article does not, for instance,
a) Deal with the many examples of successful real world accelerated training programs with no curriculum design (as is commonly understood; ordering of simulations isn’t really designing a syllabus) in Chapter 9 (some of which were designed by some of the authors)
b) Offer a rejoinder to the two learning theories presented in Chapter 11 that the authors claim underpin their training approach (if there were something to attack, this would be it!)
c) Offer a rejoinder to a more central claim in the book (and, to my mind, a more controversial one): that atomisation of concepts impedes rapidised training.
And, perhaps most surprising to me, your claim that
> Reading it I got the distinct impression that the authors did not understand a great deal of the research they cited, either when supporting or dismissing it.
is remarkable, given that one of the authors of Accelerated Expertise is Paul J Feltovich, one of the founders of the field of expertise research, and a contemporary of Ericsson’s.
spekcular|4 years ago
I've read Cognitive Load Theory by Sweller et al., and while it's a good book, it's not very practically oriented.
hinkley|4 years ago
The future is here, it’s just not evenly distributed.
About half of the things I used to obsess about fifteen years ago are now common if not dogma. I don’t obsessively track the things that don’t pan out but odds are I was dead wrong about a few of them. Still waiting on the rest.
kqr|4 years ago
I think that by applying some of the core principles (a variety of scenarios, high difficulty, available guidance from an expert, a high density of lessons, etc.) I can learn things more quickly, as well as help others learn more quickly. Even without CTA proper, which is its own skill I haven't taken the time to learn yet.
bsanr2|4 years ago
This has been my personal experience as well, and it makes me highly suspicious of anyone whose advice for acquiring technical skills is simply to practice constantly - "draw every day," "you have to code," "always be networking," etc. They either aren't aware of how useless this advice is, or simply don't care about your growth or performance. Which, I admit ambivalently, is reasonable in this society; if you want someone to care, pay them to. This of course opens us back up to the issue of underrepresented groups often being unable to afford formal "someone caring about your growth."
throwaway4356|4 years ago
I will add, though, that some fields are far more open than others. Most notable would probably be software, or computing in general. An observant person can glean insights from places like HN, Reddit, lobste.rs, GitHub/GitLab threads, university course pages from all over the world, personal opinionated blogs, etc. (add reading good open source code to this list too). I say observant because you need to sift through marketing (mostly HN and Reddit), influencer crap [0], (often attractive) polemics, etc. With computing, these places can come from genuine passion and respect for the craft, rather than grift like you'd find on LinkedIn [1]. From there a person can weigh up different ideas and experiences, look for patterns, try to determine what is being implied and what context has gone unsaid, what is taken for granted, etc. This would replace otherwise unavailable mentors.
What you mention has been my personal experience as well. My experience with people from my demographic is that some, without formal guidance, struggle, and really need someone to hold their hand to get started. In some cases this is a problem of confidence and self-esteem more than ability. Others can soldier through, picking up bits of wisdom from wherever, eventually becoming able to make judgements of their own.
>and it makes me highly suspicious of anyone whose advice for acquiring technical skills is simply to practice constantly - "draw every day," "you have to code," "always be networking," etc.
It's a matter of perspective, I think. This reminds me of that Ira Glass quote on taste and creativity. Perhaps taste is a sort of tacit knowledge, picked up from mentors or developed independently like I mentioned above. Then a person who does not know the experience of lacking taste -- either because they had access to mentorship (and took it for granted) or because they had the confidence and ability to independently "seek" tacit knowledge (and took that confidence and ability for granted) -- lacks perspective and gives this advice. It "worked" for them, but that's only how it appears: they cannot express how they developed the taste or tacit knowledge (or that it is even necessary!) for this advice to be useful.
[0] I specifically left YouTube off that list. Aside from conference videos posted to YouTube (hardly any views), university lecture recordings (hardly any views past first year with a few exceptions), and maybe a feeew other channels, YouTube content is in my experience garbage and very typical-influencer, mostly hyping up megacorps, the latest buzzword-tech, and grinding leetcode.
[1] Yes computing grift is very real, but I mean that it appears that sharing information online for "nobler" reasons or genuine professional reasons is more common in computing than in some other fields.
civilized|4 years ago
1. Be self-aware
2. Extract case library of simulations from yourself
3. Write a book covering the case library