
You are not late (2014)

313 points | yarapavan | 2 years ago | kk.org

164 comments

[+] tomcam|2 years ago|reply
There are always opportunities, at least in the USA.

I started programming in 1985 and thought I was very likely too late. I was the same age as Chris Espinosa, who was Apple's head of documentation at age 14. At that time it was still not perfectly clear that PCs would be regarded as essential to business. I had no one to talk to about computers and programming. I was in a very rough spot in my life and had to teach myself everything, while not being sure I'd have a job in 5 years.

A decade later, in 1996, I started at Microsoft, which virtually all of my peers thought had run its course. They were selling stock like crazy. By the time I left 4 years later the stock had gone up 1300%. Last I checked it's gone up yet another 2500% or so--I kept my 1,000 ESPP shares mostly out of nostalgia and gratitude to a company that was so good to me.

I bought a website after the first dot com bust of 2001, monetized it by hiring well, and it provided a very good living for well over two decades after that.

This is an incredible time to start a web-based business because there's a surfeit of free and almost free tools for development all the way to deployment.

[+] mchinen|2 years ago|reply
It's interesting to hear your experience at this point in time. Mine was later but noticeably different.

I started coding in the 90s as a teen releasing shareware, with my first gig at Adobe in 2001, where they paid me $14/hr as an intern. Even though these were rough times for tech, my CS peers at UW and I felt a lot of optimism and enthusiasm about what could be built, because the framework still supported idealists and we cared less about the economy (and we were definitely naive). Both the researchers and the entrepreneurial types were very curious about inventing new paradigms.

When I talk to younger people in CS now, the enthusiasm seems to be split between 'pure researcher' types and 'entrepreneurial/builder' types, with the latter's interest concentrated on what is booming (like AI), and focused less on what can be built than on what will be able to raise large sums or attract more users. I'll caveat this with the fact that I don't know to what extent the people I talk to are a biased sample, but I do wonder if there are fewer people willing to explore new frontiers now.

One major difference between now and then is the fraction of US market cap that is tech and, to a lesser extent, tech's importance in the economy. I wonder if this established-leader position could somehow make people less optimistic and less willing to explore?

[+] michaelt|2 years ago|reply
> A decade later in 1996 I started at Microsoft, which virtually all of my peers thought had run its course.

People thought Microsoft had run its course, one year after the release of Windows 95?

Just as MS Word was dominating WordPerfect? And the likes of AutoCAD were dropping support for AIX, Solaris, etc.? When Photoshop had just been released for Windows, and Apple's market share had dropped to about 3%?

Wow! I know they were kinda late to online stuff, but still - strange how these things turn out, in hindsight!

[+] kubb|2 years ago|reply
Damn, I better buy some tech stock if it’s gonna go up 13x in 4 years.

Actually if everyone just does that, nobody will need to work!

[+] begueradj|2 years ago|reply
>I started programming in 1985 and thought I was very likely too late.

Lots of skilled programmers started coding when they were kids. But that does not mean starting to learn to code when you are 30 years old won't make you an excellent programmer: actually, you can learn even faster at that age, because concepts like logic, breaking a problem into several smaller ones, and iterating step by step toward the final solution are more intuitive to you than to a kid.

[+] up2isomorphism|2 years ago|reply
Giving an example from 1985 to show that there are opportunities today is not convincing.
[+] caseyross|2 years ago|reply
While I agree with the spirit of the post, I think that there are better and worse times to start something new, and in retrospect 2014 seems like it was one of those worse times. The period from 2014--2024 was an era where the sheer gravity of the big tech platforms crushed out innovative startups left and right. People with an extreme focus on product polish could succeed, like Slack (est. 2013) and Discord (est. 2015), but it feels like most of the tech-sphere was either just working on half-hearted products towards an inevitable acqui-hire, or fervently trying to mainstream blockchain in a wishful attempt to create an entirely separate, more open ecosystem.
[+] wouldbecouldbe|2 years ago|reply
Yeah, there are, but it's hard to know where in the cycle you are. So best just to have fun & create great stuff. If you focus on making people's lives better, you never really go wrong in the long term.
[+] szundi|2 years ago|reply
Also, bad economic times are the best: validation is more meaningful, and when things start to pick up again, you have a good cost structure.
[+] manmal|2 years ago|reply
You’re using past tense - has this period ended for you?
[+] callamdelaney|2 years ago|reply
The last thing Slack had was an extreme focus on polish. As a chat system, it's hardly functional, and far less so than IRC, which came before it. Slack managed to sell chat to big corporations; that was its innovation.
[+] beacon294|2 years ago|reply
2013 was a great time to join the fray. I think the clarifying point is that you are working on the machine; you are not embodying the success or failure of the machines of that time.
[+] WD-42|2 years ago|reply
Not to seem pessimistic, but in the 10 years since this article, what have we really gained on the internet that we didn't have then? Seems like we got a lot more social media and some failed promises from crypto. This is setting aside the current AI stuff, since it's still really shaking out.
[+] muzani|2 years ago|reply
There was the mobile and cloud boom, which resulted in more digital payments (which make crime and corruption more difficult), online-to-offline stuff like ride sharing and e-commerce, plus a ton of advances in logistics, especially in developing countries.

I think most of these changes didn't affect developed nations so much; it's probably still good old Walmart and Amazon there. But they were life-changing for developing nations. We had some advancements in the rights of factory workers, as they had to match gig workers, and crime dropped drastically in some places because it just wasn't worth it anymore when you could climb out of poverty by delivering food.

[+] boppo1|2 years ago|reply
I'm a painter pursuing traditional-style work. The education system absolutely failed me, and I have seen it fail countless others with the same desire.

The last 10 years have seen a renaissance of academic painting information and education, and social media, particularly Instagram, has been the fuel.

There is so much now that simply wasn't there then. My life would be so much different and happier if I were coming of age now with those desires instead of a decade ago. Nonetheless, it is still dramatically improved by the advancements I described.

I no longer feel so alone. I suspect many people with different niches are enjoying richer lives like I am, due to the last 10 years of internet.

[+] faronel|2 years ago|reply
I had a similar thought but challenged myself to think about the other side. Using this list of companies and thinking about those founded after 2012 (because it takes a couple of years to enter the mainstream), we can see that there's quite a bit of opportunity: https://en.wikipedia.org/wiki/List_of_largest_Internet_compa....

Social media for sure, but also the entirety of AR/VR. AI too, and not just GPTs: recommendation, detection, sentiment analysis, and data mining are all things that are 'new'. We can also think about things like online banking or healthcare apps that didn't exist. I was still sending checks in 2014, and I certainly wasn't doing video visits with my doctor. As someone who is middle-aged, when I ask younger people, they point out how much opportunity there is, as a counterpoint to the cynicism I'm seeing here on HN.

[+] Swizec|2 years ago|reply
> but in the 10 years since this article what have we really gained on the internet that we didn’t have then

Figma comes to mind as an obvious standout example. We didn’t even have the technology to support that level of multiplayer computationally heavy UI in the browser back in 2014. No native apps had collaboration that smooth either.

Collaborative [text] document editing in general is a good example. So mundane these days in all the big web-based work apps that we don’t even notice it anymore.

[+] dartharva|2 years ago|reply
India - and probably many other countries outside of the Americentric West zone - achieved complete adoption of digital (mobile) payments, shot up internet connectivity to the moon (to the point that it now has the largest population of netizens in the world), made huge strides in FinTech and Digital Learning, achieved complete adoption of digital commerce including app-based ride-hailing and food delivery, and saw the blossoming of a gigantic app/digital services-based economy.

Life has changed radically here as compared to 2014.

[+] Cerium|2 years ago|reply
Zoom? Try having a video conference call 10 years ago. I remember going to an HP corporate office (in Taiwan, if I remember correctly) in 2012 where they proudly demonstrated a video conference room that worked well. Setting up calls using WebEx back then was slow, had poor performance, and usually somebody failed to join and had to call in.
[+] whycome|2 years ago|reply
Tinder was technically launched in 2012, but the swiping thing came out in 2013. These tools are an essential part of connection for a large group of people. And you can't really handwave social media aside; it's been absurdly relevant since 2014. The Apple Watch launched in 2015 and has led a revolution in wearable internet-connected tech. 4K TVs, and subsequently high-quality streaming, weren't a thing until after 2012.
[+] hot_cereal|2 years ago|reply
Amongst the other examples, IoT. 10 years ago it was still in its infancy; as an example, Amazon didn't acquire Ring until 2018. Beyond just smart home stuff, payments like Apple Pay and Android Pay impact daily life. The EV boom is also a massive part of IoT. Every TV is now a smart TV (which is miserable, but that's a whole other discussion).

And an IoT world has nearly as many drawbacks as it does benefits, but I think it's hard to argue it hasn't changed the way we interact with the internet in our day to day lives.

[+] bbor|2 years ago|reply
Discord and Slack figured out messaging, and the web development tooling now is infinitely better than in 2014. On the front end, HTML5 and responsive web design were still new in 2013, and React came out in 2013, stabilized in 2015, and released Hooks in 2019.

On the backend, Next.js brought us SSR in 2016, MongoDB brought us document databases by winning “fastest growing database engine” in 2013 and 2014, and Docker brought us disposable containers in 2013.

The list is stacked towards older tech, but that's maybe because recent tech hasn't proven itself yet: Svelte (2020) is still maturing AFAICT, and I've never heard of Vite (2021) or SolidJS (2022). I personally think many exciting non-AI trends are also ramping up now, such as Local First development (formalized in May 2023).

I think that the economy and innovation in general were curtailed due to the climaxing rampant corruption in the US, but the internet is something of an exception IMO.

This is all talking about developer abilities, of course: the constraints of corrupt capitalism mean that many of the actual sites released in the past decade have been terribly bloated and over-featured. But I think that's partially a consequence of businesses moving everything into mobile-first websites and apps -- you're gonna see a lot more low-margin websites.

[+] tppiotrowski|2 years ago|reply
WebGL2, and now WebGPU, are allowing networked access to the GPU. That's new since 2014.

Edit: plus flexbox

[+] DoreenMichele|2 years ago|reply
The best time to plant a tree is twenty years ago. The second best time is today.

I don't know why anyone would sit around moping about "If only this were thirty years ago!" If it seems like your idea would be "easy" or something -- if only it were thirty years ago -- then most likely it's because we are where we are and you know what you know now that you wouldn't have known then.

It's like all the people who say things like "Youth is wasted on the young" and "I wish I had known what I know now back when I was seventeen." Yeah, you only know it at all because you aren't seventeen anymore.

[+] santoshalper|2 years ago|reply
I do, because figuring out what is going to be big 30 years from now is hard (arguably impossible to do deterministically), but knowing what is big now is easy, and wishing you could go back and capitalize on that is natural.
[+] AnimalMuppet|2 years ago|reply
If this were thirty years ago instead of today, the opportunity to build X[1] would be wide open, but you'd have to build it with the tools that were available 30 years ago. That might not be nearly as easy as building it with today's tools.

[1] "X" meaning "variable to be substituted for" rather than "company that made a bizarre naming choice for no apparent reason".

[+] dougmwne|2 years ago|reply
I feel like the last 10 years have been a maturing phase. There weren't a lot of opportunities for young upstarts without funding. It seemed like it was the era of big-money innovation, burning mountains of cash trying to stake the last few open claims on the app and web ecosystem. And there was the crypto stuff (yuck); many went to prison.

But the real revolution is AI. Thinking back to 2014 and peering forward, it's unfathomable. If there had been a sci-fi movie about it, I would have thought it unrealistic. I still think I have no idea where this will take us or how much our industry will change. What a great time to be an entrepreneur.

[+] ildjarn|2 years ago|reply
Her came out in 2013, so there kind of was a movie.
[+] tavavex|2 years ago|reply
I'm young, and talking about years like 1985 feels like talking about some other reality, but to me it still feels like the two years aren't comparable.

Whenever I read about the history of computers and software in the 80s, it feels like there are always mentions of relatively new companies forging their path, new hardware manufacturers and software developers shaping the brand-new home computer industry. Sure, there were some old giants on the playing field, but it almost sounds like anyone with a decent idea could carve out a space for themselves.

2014 though? There were a bunch of opportunities back then, obviously, but it was already deep into the megacorp consolidation era, when large software or web companies owned dozens of unrelated services and had secured an almost endless influx of cash. There were startups, and a few of them became extremely successful, but it feels like most of them were either doomed to fail or to be bought out by a larger company. I feel like in this last decade, internet innovation has mostly slowed - the internet of 2004 was extremely different from the internet of 2014, yet 10 more years have passed since then and it doesn't feel like as much has changed.

Maybe it's just my imaginary rose-tinted view of the past or something, but it feels like it's harder than ever for a startup to compete with the big players now. The only big exception I can think of is video games - self-publishing anything was probably almost impossible in the 80s, but nowadays we have an endless sea of development and publishing tools for independent developers with fresh new ideas.

Perhaps there's a completely new field on the horizon that will level the playing field once more, putting everyone back at square one. I think that some industries could get squeezed dry until the next big innovation comes along, or people move on to some other new space.

[+] jauntywundrkind|2 years ago|reply
I have hope for many more turns / revolutions!

Boy, have we been in a holding/consolidation pattern. The age of massification has been upon us; getting everyone else online has been the effort, the way to rise. Free online services we can connect to from anywhere have been an amazing change, a totally new expectation that totally changes how computing situates itself in our lives.

At much cost to figuring out further places to pioneer, I feel. We need new cycles with new energy, where lots of people are trying stuff out again. Computing & tech should not stay well settled; we should be trying crazy stuff. It feels like a lifetime ago that Tim O'Reilly was preaching "follow the alpha geeks": look for those very capable folks doing excellently for themselves. That ability to trendspot & try lots of things has been somewhat eroded by these huge scales & systems, but I believe a return to personal-ity has to crop up again sometime. We'll find some fertile terrains where new things are happening again.

2014 was when things were actually really setting in place, when the pioneering stage was really giving way to some settlers (and town planners, see: https://blog.gardeviance.org/2015/03/on-pioneers-settlers-to...). There's a lot of roughness, but tech getting its mojo back may not be far off. It's a challenge, though; the reality of creating ever-connected, ever-online systems is hard. We have amazing amounts of server power we can fit in a pizza box, at incredibly low $/performance, but we're unpracticed at doing it without pain at smaller scales, by ourselves, anew. Trying to create new generative platforms that serve as a basis to let more value spring up: that needs a strong foundation, so that it can keep "creating more value than it captures," another O'Reilly-ism.

The future is (with hope) exciting!

[+] boffinAudio|2 years ago|reply
As someone who has used the Internet since 1985, I constantly find myself reminded of the fact that the Internet isn't just port 80. The Internet is so much more than just the web, and when someone comes up with a cross-platform, powerful application which uses some other port and some other protocol, it will be just as functional on the Internet as any web browser.

We could just as easily produce clients which exchange Lua bytecode. In fact, we do (games, etc.)... but we could also build an execution environment (which is what browsers are) that allows a much broader range of application capabilities than the browser.
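
(A minimal sketch of that idea, with everything hypothetical: a client speaking a made-up, line-based protocol over a raw TCP socket, on a host, port, and command chosen purely for illustration. Nothing in it involves HTTP or a browser, yet the Internet carries it just the same.)

    # Hypothetical line-based protocol on port 7777 -- not HTTP, not a browser.
    # Host, port, and the "HELLO" command are made up for illustration only.
    import socket

    def query(host: str, port: int, message: str) -> str:
        """Send one line to the server and return its one-line reply."""
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall((message + "\n").encode("utf-8"))
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:                 # server closed the connection
                    break
                chunks.append(data)
                if data.endswith(b"\n"):     # end of the reply line
                    break
        return b"".join(chunks).decode("utf-8").strip()

    if __name__ == "__main__":
        # Talk to a hypothetical server that speaks this toy protocol.
        print(query("example.org", 7777, "HELLO"))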

This, then, is what I have in mind when I think "I've been on the Internet too long, I've become jaded": actually, the browser is just the lowest common denominator. As soon as some other client application is built which pushes user experience beyond what the browser is capable of, the Internet will still be there to host it.

And I find that inspiring, even after 40 years of crud.

[+] TimTheTinker|2 years ago|reply
> As soon as some other client application is built which pushes user experience beyond what the browser is capable

The problem is that the browser is so good at being a ubiquitous, open-source, open-standards-based, cross-platform, near-instant-loading secure remote code execution environment. As Brendan Eich says, always bet on JS and WASM. I would extend that to -- always bet on browsers.

With the amount of change that has occurred in browser technology over the last 30 years, I strongly think the future looks like more improvement in browser technology (and perhaps deprecation/removal of old standards) than an entirely new paradigm. I know various early web pioneers want to build a post-Web application delivery system (notably Tim Berners-Lee and Douglas Crockford), but given the accumulated legacy of modern browsers and web apps and the ongoing investments into browser technology, I don't see how any post-Web system could gain any traction within the next 25 years.

But yes - your point still stands. If anything better than browsers ever appears, the internet will indeed still be there to host it.

[+] martynr|2 years ago|reply
Unlikely to be a popular question in this forum, but in my view the most important topic we need to reflect on is "are we really better off with these services?" Obviously, at face value there is utility or there wouldn't be the market penetration, but the externalities and issues of market dominance are becoming more apparent and pressing.

And if we agree that better solutions are needed then the question becomes “how do we create the market conditions that support those better outcomes?”

I’d really like to see kk reflect on this article with 10 years of hindsight and see if he remains optimistic.

[+] MichaelRo|2 years ago|reply
"Opportunities still exist" is not at odds with "the low hanging fruit has been picked".

At the moment both statements above are true, meaning we (the little guy) are, on average, fucked. Sugar-coating it with platitudes and cheap aphorisms doesn't change the reality that opportunities have dried up for all but the wealthy and connected, and even for them it's not easy.

[+] AnimalMuppet|2 years ago|reply
The low-hanging fruit has been picked. But part of that fruit was producing better tools. So we now have taller ladders. As a result, there is new fruit that now is low-hanging.
[+] JohnMakin|2 years ago|reply
This fun thought only holds true if the internet is not in decline. In 2014, which very arguably was peak internet, I could understand not thinking that, but today I struggle to imagine how this isn’t the case.
[+] datadrivenangel|2 years ago|reply
Peak internet was 2015, before the 2016 internet rupture, when the digital space became pop culture and cyberspace inverted into the real.
[+] aizyuval|2 years ago|reply
History shows that there’s no such thing as late. As humans, there are always vacuums to fill if people take ownership of them. Rebels, inventors, thinkers, hackers: they all emerged in different times by taking ownership.

It is only the perception of this innovation that could change as time goes on.

[+] i_am_a_peasant|2 years ago|reply
I'm pretty effective at my job (embedded software) and I have a decade of pretty solid experience writing device drivers and debugging some really tough issues. But I always feel like I'm not at all ready for the caliber expected on the US job market. I joined a startup so that I could be pushed harder and to prove to myself I can thrive in an environment where it is expected that you will put in 300% effort.

Who knows maybe I'll give it a go in a couple of years.

[+] Draiken|2 years ago|reply
You will likely learn this yourself, but having made this transition in the past, I can say the image I had of this world was simply not real.

Startups fake shit all the time, cut corners everywhere (even when they shouldn't) and the hustle culture exists primarily to benefit from employee overwork (free labor).

Coming from a developing country I always thought I would never be as good as "first-world developers". The reality is that the bell curve applies the same way everywhere. Most people aren't exceptional and if you're exceptional in your country, you're most likely exceptional everywhere.

If I were to give my past self one piece of advice, it would be: you're more than good enough.

Of course, YMMV. Good luck either way mate!

[+] silent_cal|2 years ago|reply
An old coworker of mine had a friend who created an app for tugboat captains and is now pulling over 200k a year from it (probably more). There are opportunities; they're just not very "cool". You have to look in unusual places.
[+] ffitch|2 years ago|reply
As a side note, I find it curious how back then technology innovations were largely ignored for years after they matured (it took over a decade for companies to start claiming domain names), and today technology is hyped even before it becomes practical (blockchain, LLMs).
[+] d4n0ct|2 years ago|reply
A few general observations: while there are still opportunities, perhaps fewer in developed and more in developing countries, the QUALITY of opportunities was different back in the day. The SOFTWARE field was still relatively wide open. The barrier to entry was being nerdy smart, but otherwise the costs (basic computers, telephone line access, some education) were low, at least in parts of the USA. Simultaneously, the potential return was high, because the information age was just beginning and the world was hungry for software. People who were smart with software at that right time could literally shape the coming world. Think Microsoft and Google.

In comparison, while the hardware side (chipsets, telecom equipment, etc.) has grown, it hasn't "exploded", except maybe for CPUs and GPUs, which are hugely capital-intensive. You cannot yet train AI in your garage in a way that differentiates your "product" from that of the big players. The demand for pure software is no longer that high. On the hardware side, barriers to "physical" innovations, such as semiconductors, medicine, etc., are as high now as ever. The "low"-hanging "soft" fruits have been picked. This now feels more like the big mainframe computer era.

So while there are still opportunities, they aren't comparable to the past. When science and technology fundamentally changed in the past, from manual labor to machinery to electronics and so on, they opened up new fields for human innovation and endeavor. Groundbreaking disruptions are much rarer now, and the only one, AI, opens up a new field more for computers than for people.
[+] hasoleju|2 years ago|reply
Most of the things that made up the internet in 2014 were created in the second half of the internet's lifespan up to that point in time. Assuming that this trend continues, most of the things people will use in 2044 still have to be invented. You don't know what the new things will be. But looking back, we notice that a lot of things came into our lives recently.

We are so used to these big innovations that it feels like they have always been there.

[+] coffeebeqn|2 years ago|reply
They already figured out bronze, which you can build anything you can imagine out of. Why would I put any more effort into metallurgy?
[+] warthog|2 years ago|reply
I think another consideration here is not the internet itself but its distribution. The internet economy is not an anarcho-capitalist environment anymore, where any opportunity is wide open. It is more restricted to big circles or ecosystems controlled by Big Tech.

Take mobile and app stores. There are teams and people that build many innovative and better solutions for dealing with apps and app stores, yet they are simply not allowed.

AI does not change this norm. VR and AR perhaps, but they are also dominated by the same companies. Web3 was a good bet to change it, but it seems like it is not happening, at least not in the way we thought it would.

I am still optimistic about the future but possibilities are realistically rarer.

Ref: Yanis Varoufakis also talks about this in the context of a digital feudalistic society.