itsmenotyou | 7 years ago

Interesting perspective. I think that seeking out and naming patterns ("microservices", "agile", etc.) is useful. It provides something like a domain-specific language that allows a higher-level conversation to take place.

The problem, as you identify, is that once a pattern has been identified, people too easily line up behind it and denigrate the "contrasting" pattern. The abstraction becomes opaque. We're used to simplistic narratives of good vs. evil, my team vs. your team, etc., and our tendency to embrace these narratives leads to dumb, pointless conversations driven more by ideology than any desire to find truth.

dalbasal | 7 years ago

I agree that it's useful; I even think more people should do it more often. Creating your own language (and learning other people's) is a way of having deep thoughts, not just expressing them. Words for patterns (or abstractions generally) are the quanta of language.

I just think there can be downsides to them. These are theories as well as terms and they become parts of our worldview, even identity. This can engage our selective reasoning, cognitive biases and our "defend the worldview!" mechanisms in general. At some point, it's time for new words.

Glad people seem OK with this. I've expressed similar views before (perhaps overstating things) with fairly negative responses. I think part of it might be language nuance. The term "ideology" carries less baggage in Europe, where "idealist" is what politicians hope to be perceived as, while "ideologue" is a common political insult stateside, meaning blinded and fanatical.

parasubvert | 7 years ago

The issue is that it is rare and difficult to be able to synthesize all the changes happening in computing and to go deep. So a certain "pop culture" of computing develops that is superficial and clichéd. We see this in many serious subjects: pop psychology, pop history, pop science, pop economics, pop nutrition. Some of these are better quality than others if they have a strong academic backing, but even in areas such as economics we can't reach basic consensus on fundamentals due to politicization, the difficulty of reproducible experiments, and widespread "popular" concepts out there that may be wrong.

Concepts like microservices synthesize a bunch of tradeoffs and patterns that have been worked on for decades. They’re boiled down to an architecture fad, but have applicability in many contexts if you understand them.

Similarly with Agile: it synthesizes a lot of what we know about planning under uncertainty, continuous learning, feedback, flow, etc. But it's often repackaged into clichéd, tepid forms by charlatans to sell consulting deals or Scrum black belts.

Alan Kay called this out in an old interview: https://queue.acm.org/detail.cfm?id=1039523

“computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.

So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.”