
The End of the Beginning

202 points | nikbackm | 6 years ago | stratechery.com

115 comments

[+] christiansakai|6 years ago|reply
I've been thinking along the same lines, albeit from a more personal angle as a software engineer.

Basically, starting around 15 years ago, there was a proliferation of bootcamps teaching fullstack development, because software startups were the new hot thing and desperately needed generalist engineers capable of spinning up web apps quickly. Rails was the hot thing in those days for the same reason. Hence we saw many new grads, and even career changers, move into fullstack development, with bootcamps churning out these workers at an incredible pace (regardless of quality); the job market took it because it was desperate for fullstack engineers.

During that time, the best career move you could make was to join the startup movement as a fullstack engineer and get some equity as compensation. That equity, if you were lucky, could really be life changing.

Fast forward to now: the search space of low-hanging CRUD apps (i.e., Facebook, Twitter, Instagram, etc.) has been exhausted, and even newer unicorns (e.g., Uber) don't make that much money, if they make money at all. Now that those companies have become big, they are the winners in the winner-take-all field that is cloud software. And these days they have no use for fullstack engineers anymore; they want specialists who do a few things, but at a deeper level.

Today, even the startup equity math has changed a lot. Even with a good equity package, a lot of the search space has been exhausted, so joining a startup as a fullstack engineer doesn't pay as much anymore. Instead, a better move is to try to get into one of these big companies, because their pay just dwarfs that of any startup or even a medium/big-sized company.

Just my 2c as someone who is very green (5 yrs) doing software engineering. Happy to hear criticism.

[+] paulsutter|6 years ago|reply
"Fullstack" development is just a tiny slice of software. There's an entire economy to be automated, there are endless inefficient processes and massive opportunities there.
[+] onlyrealcuzzo|6 years ago|reply
My understanding is that startups never paid well. Sure, if you got lucky and were employee #7 at Facebook, it paid off great! But even during that time frame, working at startups instead of MS or Google was not a good proposition. And even during the dot-com era and the rise of Microsoft, it paid a lot better to work at IBM than at that class of startup.
[+] codingslave|6 years ago|reply
Right, but this is because startups are largely a financial vehicle to transfer value created by employees to investors and founders. If startups actually gave employees a real stake in their business, the equity math would make much more sense. It's a cultural issue, but also, the set of people working at startups for worthless equity is mostly disjoint from the set that can get hired at Google/FB. It's almost a different career.

The value of full stack engineering is also plummeting because the tooling (e.g., React) has gotten so good that the barrier to entry is very low.

[+] zozbot234|6 years ago|reply
Software is still eating the world, and there will be plenty to eat for a long time. Cars (the foremost example in the OP) had basically eaten the world by the 1950s (sometimes even in a fairly literal sense).
[+] timClicks|6 years ago|reply
Isn't the perception of a saturated market persistent though? I mean, there were many social media apps when Facebook started. Twitter was created when microblogging had already become a trend.
[+] tim333|6 years ago|reply
The low-hanging CRUD apps may have been done, but there are still less-low-hanging CRUD apps like Flexport (Rails), and the advance of AI is opening up loads of new opportunities.
[+] alexashka|6 years ago|reply
The tragedy of people who want to make money and believe it will be 'life changing' is that no matter how many times they are told it won't be, they think 'ha, you only say that because you got yours'.

What useless shit are you going to buy with 'life changing' money exactly that a software developer's salary won't allow?

[+] oflannabhra|6 years ago|reply
I'm not exactly sure where I fall on this. Ben is a really smart guy (way smarter than me), but I feel like this could be a classic case of hindsight.

Now, looking back, it makes sense that the next logical step after PCs was the Internet. But from each era looking forward, it's not as easy to see the next "horizon".

So, if each next "horizon" is hard to see, and the paradigm it subsequently unlocks is also difficult to discern, why should we assume that there is no other horizon for us?

I also don't know if I agree that we are at a "logical endpoint of all of these changes". Is computing truly continuous?

However, I think Ben's main point here is about incumbents, and I agree that it seems it is getting harder and harder to disrupt the Big Four. But I don't know if disruption for those 4 is as important as he thinks: Netflix carved out a $150B business that none of the four cared about by leveraging continuous computing to disrupt cable & content companies. I sure wasn't able to call that back in 2002 when I was getting discs in the mail. I think there are still plenty of industries ripe for that disruption.

[+] jdmichal|6 years ago|reply
Until I have something resembling Iron Man's Jarvis with at least a subvocal interface, I think there's still a long way to go for "continuous" computing. I currently still have to pull out a discrete device and remove myself from other interactions to deal with it. If I'm not on that device all the time, then I don't have continuous computing. Maybe continuously available computing is more accurate?
[+] jeffshek|6 years ago|reply
This captures my sentiment.

It’s hard to see how the incumbents could be beaten, precisely because of how effective they are with data and at buying potential competitors (Instagram, YouTube)... but it is also hard to see precisely because we don’t know what the next market shift will be, or if there will be one.

What happens if AI takes off? What happens if 3D printing magically becomes 100x more efficient and you can print anything you want from home?

We don’t know. It doesn’t seem like the big incumbents could be defeated, but history repeats itself.

[+] marcosdumay|6 years ago|reply
> it seems it is getting harder and harder to disrupt the Big Four

Microsoft, IBM, Oracle... What is the other one?

Oh, right, wrong decade.

(My point is, it is completely not obvious whether it is getting harder to disrupt the incumbents.)

[+] myblake|6 years ago|reply
Isn’t that kind of his conclusion too, though? It matters inasmuch as we’re less likely to see new general-purpose public clouds come into play, but he didn’t seem to predict there was no more room for change in the industry, just that we’re unlikely to see those incumbents toppled from certain foundational positions in the ecosystem.
[+] pthomas551|6 years ago|reply
Was it really that hard to predict the Internet? SF authors picked up on it almost immediately.
[+] elfexec|6 years ago|reply
> but I feel like this could be a classic case of hindsight.

Well, it's 2020 after all.

> Now, looking back, it makes sense that the next logical step after PCs was the Internet.

But the internet existed before PCs.

> and I agree that it seems it is getting harder and harder to disrupt the Big Four.

I agree, but then again, people thought AOL was hard to disrupt so you never know. A company can look invincible one day and irrelevant a few years later.

> I think there are still plenty of industries ripe for that disruption.

Yes, but the low-hanging fruit has already been picked. I suspect the next round of disruptions will be more difficult and less profitable.

[+] hogFeast|6 years ago|reply
This has happened in every industry and in every time period. The mistake in the article is that the author doesn't actually appear to realise how important this effect is. (It always amazes me that you have all these people writing about the same topics, always from the same angle... no one thinks to just open a book and check what happened last time. Actually, I am aware of one book that has done this, just one.) Again: every industry, every time period. It is permanent.

Definitely, you see new industries replacing old ones. Acknowledging the above isn't denying progress. But every industry consolidates down to a few large companies.

[+] solidasparagus|6 years ago|reply
Bah. He took three datapoints, built a continuum out of them, and said that since the third datapoint is at the end of his continuum, we must be at the end.

But this doesn't fit any of the upcoming trends. The biggest current trend is edge computing, where cloud-based services introduce issues around latency, reliability, and privacy. These are big-money problems: see smart speakers and self-driving cars. The cloud players are aware of this trend; see AWS Outposts, which brings the cloud to the needed location, and AWS Wavelength, where they partnered with Verizon to bring compute closer to people.

But privacy in a world full of data-driven technology is still very much an unsolved problem. And most of the major technology players have public trust issues of one sort or another that present openings for competitors in a world where trust is increasingly important.

[+] nostrademons|6 years ago|reply
I've seen similar analogies to the airline industry, but IMHO this misses the forest for the trees. Tech isn't an industry, like automobiles or airlines. Tech is industry, like machine tools and assembly lines. When industry was first developed in the 1810s it meant specifically the textile industry, which was the easiest existing manufacturing task that could benefit from power tools and specialized workers on an assembly line. It was only a century later that we could begin to dream of things like automobiles and airplanes.

Similarly, I bet that our great-grandchildren will look upon the Internet, e-commerce, and mobile phones the same way we look upon railroads, paddle steamers, and power looms. Great inventions for their time, and drivers of huge fortunes, but also quaint anachronisms that have long since been replaced by better alternatives.

Notice that the article focuses almost entirely on I/O and the physical location of computation. This is a pretty good sign that we're still in the infrastructure phase of the information revolution. When we get to the deployment phase, the focus will be on applications, and our definition of an industry will focus on what you can do with the technology (like fly or drive) rather than how the technology works. In between there's usually an epochal war that remakes the structure of society itself using the new technologies.

FWIW, there was a similar "quiet period" between the First and Second Industrial Revolutions, from the 1840s to the 1870s. It was very similar: the primary markets of the original industrial revolution (textiles, railroads, steamboats) matured, and new markets like telegraphs were not big enough to sustain significant economic growth. But economic growth picked up dramatically once a.) the tools of the industrial revolution could be applied to speed up science itself and b.) the social consequences of the industrial revolution remade individual states into much larger nation-states, which created larger markets. That's when we got steel, petroleum, electrification, automobiles, airplanes, radio, and so on.

[+] tudorw|6 years ago|reply
Don't agree. Comparing histories is not a reliable way to predict the future. I think we'll see the growth of governance-level disruption: a pushback that will encourage home-grown solutions in countries that are not necessarily aligned with US interests. That field is wide open and growing!
[+] camillomiller|6 years ago|reply
Policy driven disruption is the only option I see to break the cycle. Let's see.
[+] legitster|6 years ago|reply
I've been reading Zero to One, and one of the ideas the book pitches is that monopoly and innovation are two sides of the same coin. Only monopoly-like companies have time and money to dump into innovative products (Bell, GE, IBM, Google). And people only invest in an idea if they think they can profit from it (look at how crucial a patent system was for the industrial revolution).

Competition is important, but for driving efficiency: weeding out bad ideas and bringing down the costs of innovations already created. But the thing that usually drives monoliths out of business is... new monoliths.

The somewhat contrarian takeaway is that some (keyword) amount of consolidation is good.

[+] hogFeast|6 years ago|reply
That isn't right.

The truth is somewhere in the middle: definitely, you see some large companies invest heavily but (more commonly) you see small firms nibble at the edges of an existing product until it is too late for the larger companies.

Saying that monopoly produces innovation is like saying government produces innovation. It happens but given a long enough period all things happen. The question is about incentives: the incentives to innovate within large companies are terrible, that is why it doesn't happen most of the time.

Also, consolidation has happened in all industries at all times. It is a function of things that repeat: knowledge curves, Lindy effects, etc.

Just generally: be wary of Thiel and his ilk. They have a predilection for ahistorical nonsense. The history in this area, broadly business history, is particularly difficult and not well known (the only tech person who I have seen get close is Patrick Collison... and then... not really).

[+] mirimir|6 years ago|reply
> What is notable is that the current environment appears to be the logical endpoint of all of these changes: from batch-processing to continuous computing, from a terminal in a different room to a phone in your pocket, from a tape drive to data centers all over the globe. In this view the personal computer/on-premises server era was simply a stepping stone between two ends of a clearly defined range.

Sure, that's what happened.

But what jumps out for me is that, at both ends of that range, users are relying on remote stuff for processing and data storage. Whether it's mainframe terminals or smartphones, you're still using basically a dumb terminal.

In the middle, there were personal computers. As in, under our control. That's often not the case now. People's accounts get nuked, and they lose years of work. And there's typically no recourse.

As I see it, the next step is P2P.

[+] magwa101|6 years ago|reply
The current computing paradigm is all about "data entry": you are your own "sysadmin", slowly enriching others by working for them. Yes, the time saved is valuable to you, but you also create value for "them". We have moved from mainframe to phone with very little design change. The current wave was about convenience. There is a coming wave of redesign that is people-centric. Interfaces will be vastly different. This article just shows a lack of imagination.
[+] tlarkworthy|6 years ago|reply
That's a very bold claim, one that goes against Ray Kurzweil's hypothesis that tech is accelerating. Maybe (though it's unlikely) cloud/mobile is the endgame for silicon. But what about quantum? What about biological? What about nano? What about AI? There are literally a ton of potential generational changes in the making that could turn everything on its head again.
[+] the_af|6 years ago|reply
Why is Ray Kurzweil's hypothesis particularly important to contrast other hypotheses against? What sets it apart in relevance and/or authority?
[+] vonnik|6 years ago|reply
I think this is a crucial point. Based on Ben's premises, his conclusions make sense. But what if you alter the premises, for example, by assuming that compute will happen on another substrate? If you choose a biological substrate, then you can move compute from inside one's pocket, to inside the body. And for many functions, you wouldn't need the cloud. I doubt that the dominant companies in silicon-based tech today have the expertise to make that shift.

A lot of work is being done to make bio-silicon fusion real, with use cases like creating olfactory sensors.

And our increasing control over both brain and genes may be the pathway to more general biological computation.

https://www.ucsf.edu/magazine/control-brains-genes

[+] gfodor|6 years ago|reply
I think it will, at best, be a semantic argument in retrospect. The companies highlighted are all clearly defined as being bolstered by computing technology. But what about next generation, huge companies that are bolstered by computing and other technologies fused together? For example, if a company manages to create a brain-computer interface that gains global adoption and equivalent valuations to the existing tech giants, but the software layer is a mashup of, by that time, commoditized services from the existing tech giants who fail to enter this industry, does it count?
[+] dsalzman|6 years ago|reply
I don't think the claim is about technology in general, but about non-quantum computing.
[+] jiveturkey|6 years ago|reply
As @oflannabhra said, I think this is a case of hindsight thinking, with little predictive power. Privacy issues could very quickly change everything. Security issues could as well (there was a story on NPR today about medical devices being pretty much all vulnerable, ripe for the random killing of people). Climate change is going to be a large driver of technology in the near future. The tech situation is very, very dynamic right now, and it is way too early to say we are going to settle down with the current tech giants.

Also, giants are giants. In manufacturing, there are absolutely vast advantages to economy of scale. In tech, except for network effects, it's very easy for a very broad array of upstart companies to dominate their respective arenas at the 100bn level.

> today’s cloud and mobile companies — Amazon, Microsoft, Apple, and Google — may very well be the GM, Ford, and Chrysler of the 21st century.

Well, except Google is not a cloud or mobile company. They are an advertising company.

[+] nl|6 years ago|reply
> while new I/O devices like augmented reality, wearables, or voice are natural extensions of the phone.

I don't agree with this at all. This is like saying "the internet is a natural extension of the operating system, therefore Microsoft Windows will remain all-powerful and the sole route to consumers".

Bill Gates in his famous memo realised that this wasn't the case, and Google realized that mobile did to the internet what the internet did to Windows (hence Android).

Wearables are radically different to phones. People want to use them differently, and to interact with them in different ways than they do with phones.

To be clear: We are in the very early days of wearables, and Apple is far and away the dominant player (and maybe Garmin). But there is huge disruptive potential here.

[+] F_J_H|6 years ago|reply
Interesting point about Garmin. I wonder if it would ever make sense for Apple to just buy them...
[+] dgudkov|6 years ago|reply
I disagree. A long period of evolution starts when the revolution before it has managed to find a more or less working solution. At this point, however, there are at least two big problems that haven't been solved properly yet, that get worse every day, and where revolution is more probable than evolution: social networks and payments.

I believe at least one more revolution is still possible before we get a long period of evolution. It will be a shift from centralization to decentralization (one more time), or more precisely to federation. Decentralized, federated systems might be able to get social networking and payments to the level where they finally work well and only need to be gradually improved.

[+] gz5|6 years ago|reply
>And, to the extent there are evolutions, it really does seem like the incumbents have insurmountable advantages...

By definition, doesn't it always seem like this?

Jim Barksdale (Netscape) said there are two ways to make money: bundling and unbundling. What can be unbundled from the incumbents' bundles, in order to be offered in a more fit-for-purpose way, or with a better experience?

How might that answer change if the world's political structure changes? How might that answer change if processing, storage and networking continue their march towards ubiquitous availability?

[+] Animats|6 years ago|reply
His graph conveniently stops in the 1980s. Since then, there have been many new US car companies, mostly in the electric or self-driving spaces. Lots of little city cars, and new imports from China, too.
[+] the_watcher|6 years ago|reply
He specifically mentions excluding imports. Outside of Tesla, what are the new American car companies that made any kind of mark?
[+] kevin_thibedeau|6 years ago|reply
Most of those are NEVs with a 25 MPH max speed to bypass safety regulations. $20K golf carts.
[+] JohnFen|6 years ago|reply
> there may not be a significant paradigm shift on the horizon, nor the associated generational change that goes with it.

That's possible, but I see things that lead me to think that we're not there.

Primarily, there are a number of rather serious problems with the cloud, some of which are inherent to the paradigm and likely can't be resolved -- we'll just have to live with them.

When a paradigm has such problems, the possibility always exists that a new way of doing things can come about that sidesteps those problems.

[+] ropiwqefjnpoa|6 years ago|reply
The dealership model really helps manufacturers keep a tight rein on the market; look at all the trouble Tesla had.

In a similar vein, Apple, Google, and Microsoft control the medium and have grown so powerful that I can't imagine there ever being a new "Google" that comes about via the old grass-roots method.

Someday Apple will be bought though, probably by Facebook.

[+] whatitdobooboo|6 years ago|reply
I think if you abstract away the specific companies mentioned and stick to the technology, the point about people building on top of already "accepted" paradigms is a good one, in my opinion.

The rest doesn't really seem to have enough evidence for such a bold claim.

[+] LMo|6 years ago|reply
Frankly, I'm not sure this piece really said anything other than that the big 4 or 5 are so unbelievably strong that we're all left playing in the usually small spaces left over.
[+] graycat|6 years ago|reply
For the OP, let me think ....

There is

> IBM’s mainframe monopoly was suddenly challenged by minicomputers from companies like DEC, Data General, Wang Laboratories, Apollo Computer, and Prime Computers.

So, to shed some more light on this statement, especially about "mainframe monopoly", let me recount some of my history with IBM mainframes:

(1) Uh, to help work myself and my wife through grad school, I had a part-time job in applied math and computing: our IBM mainframe TSO (time-sharing option) bill was about $80,000 a year, so we got a Prime, and soon, with my other work, I was the system administrator. Then I graduated and was a new B-school prof where the school wanted more in computing. So, I led an effort to get a Prime -- we did. IBM and their super-salesman Buck Rodgers tried hard but lost.

The Prime was easy to run, very useful, and popular but would not have replaced IBM mainframe work running CICS, IMS, DB2, etc. Of course, in a B-school, we wanted to run word processing, D. Knuth's TeX math word whacking, SPSS statistics, some advanced spreadsheet software (with linear programming optimization), etc. and not CICS, IMS, DB2.

(2) Later I was at IBM's Watson lab in an AI group. For our general purpose use, our lab had six IBM mainframes, IIRC U, V, W, X, Y, Z. As I recall they had one processor core each with a processor clock likely no faster than 153 MHz.

Okay, in comparison, the processor in my first server in my startup is an AMD FX-8350 with 8 cores and a standard clock speed of 4.0 GHz.

So, let's take a ratio:

(8 * 4.0 * 10^9) / (6 * 153 * 10^6) ≈ 34.9

so that, first cut, just on processor clock ticks, the one AMD processor is 35 times faster than all the general purpose mainframes at IBM's Watson lab when I was there.
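
To check that arithmetic, here is a quick sketch in Python, using the clock figures as I recalled them above (rough recollections, not measured benchmarks):

    # One 8-core AMD FX-8350 at 4.0 GHz vs. six single-core IBM
    # mainframes at ~153 MHz each -- comparing raw clock ticks only.
    amd_ticks = 8 * 4.0e9          # cores * clock (Hz)
    mainframe_ticks = 6 * 153e6    # machines * clock (Hz)
    print(amd_ticks / mainframe_ticks)   # -> 34.858..., call it ~35x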

But, still, on IBM's "mainframe monopoly", if what you want is really an IBM mainframe, e.g., to run old software, then about the only place to get one is from IBM. So, IBM still has their "mainframe monopoly".

Or to be extreme, an Apple iPhone, no matter how fast it is, does not really threaten the IBM "mainframe monopoly".

Continuing:

> ... like DEC, Data General, Wang Laboratories, Apollo Computer, and Prime Computers. And then, scarcely a decade later, minicomputers were disrupted by personal computers from companies like MITS, Apple, Commodore, and Tandy.

Not really: The DEC, DG, ..., Prime computers were super-mini computers and were not "disrupted" by the PCs of "MITS, Apple, Commodore, and Tandy."

The super-mini computers did lose out but later and to Intel 386, etc. chips with Windows NT or Linux.

> ... Microsoft the most powerful company in the industry for two decades.

Hmm. So now Microsoft is not so "powerful"? Let's see: Google makes it easy to get data on market capitalization:

Apple: $1,308.15 B

Microsoft: $1,202.15 B

Alphabet: $960.96 B

Amazon: $945.42 B

Facebook: $607.59 B

Exxon-Mobil: $297.40 B

Intel: $256.35 B

Cisco: $201.47 B

Oracle: $173.73 B

IBM: $118.84 B

GM: $50.22 B

Microsoft is still a very powerful company.

Uh, I'm no expert on Apple, but it appears that Apple's products need a lot of access to servers, and so far those servers tend to run on processors from Intel and AMD with operating system software from Microsoft or Linux -- that is, Apple is just on the client side, not the server side.

It appears, then, that in computing Microsoft is the second most powerful company and is the most powerful on the server side.

Sure, maybe some low power ARM chips with 3 nm line widths and Linux software will dominate the server side, but that is in the future?

And personally, I can't do my work with a handheld device; I need a desktop, and am using AMD and Microsoft and nothing from Apple. A MacBook might suffice for my work, but to get the power I plugged together in a mid-tower case for less than $2,000 would seem to cost maybe $10,000.

Broadly it appears that the OP is too eager to conclude that the older companies are being disrupted, are shrinking and are fading, are being replaced, etc.

Maybe the main point is just that in the US hamburgers were really popular and then along came pizza. So, pizza is popular, but so are hamburgers!

I also go along with the point of zozbot234 at

https://news.ycombinator.com/item?id=21986141

> Software is still eating the world, and there will be plenty to eat for a long time.