So I work in the video games industry (at one of the largest AAA publishers), so I can shed a little bit of light on it from our side. We used to have a guy here who worked with us for 30 years; he was here right from the start. He said that yes, he got the job by making some demos on his Amiga and literally going to the owner's house to ask if he needed any help. He was hired on the spot and worked out of his bedroom for a while. He was 17 at the time.
Nowadays, even for junior positions, we wouldn't consider anyone without at least a bachelor's degree in CS, so that's at least 3 years of university - putting the minimum starting age with us at around 21-22 years old. I myself started just after my master's degree, at 23 years old.
I think it does mean we are missing out on a lot of very self-driven, ambitious candidates - but even requiring a minimum of a CS education, we get in excess of 1,000 applications for every junior position. I understand why recruitment would be reluctant to start looking at candidates without any formal qualifications.
Self-taught myself; only now, after decades in the software industry (from dev to management), am I studying math and CS in a bachelor's degree, out of curiosity and to gain more knowledge.
Early games needed visuals, and that was an area where results mattered - which the demo guys delivered on machines with still very limited resources. Formal education for games was lagging behind during the '90s. After all, games as a broad industry were a new concept. (EA once produced games for the C64.)
This has changed. With more and more abstraction and more processing power, it is more and more important to work on algorithms, not hacks.
Most self-taught developers lack broader concepts, abstractions, and especially algorithms. I had to learn everything besides hacking my way into frontend development.
Would I look out for folks with no degree? Yes and no. It depends on the use case.
On the other side: the world is full of mediocre code from people with degrees. There is no guarantee. Only constant learning - and this is a superpower at least some self-taught developers have.
If I got an applicant with an impressive itch.io page full of games they've made and released themselves, I'd interview them even if they're 18 (I'm not sure about child labor laws for kids younger than that).
Not in the industry anymore, but I'd argue that, as far as 3D engines go, truly understanding an engine implies a breadth of knowledge much wider than what was necessary to make "simple" (though at the time it was tough) 3D textured triangles. Things got much more sophisticated... Now you need at least some algebra, you need to understand data structures, light behaviour, memory management, etc. Sure, you can learn on the spot, but it's going to take time.
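To make the "some algebra" point concrete, here is a toy sketch of the dot-product math behind basic diffuse (Lambertian) shading. This is purely illustrative (mine, not from any engine under discussion), with Python standing in for whatever language an engine would actually use:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir, light_intensity=1.0):
    """Diffuse shading: brightness is the cosine of the angle between
    the surface normal and the direction to the light."""
    nx, ny, nz = normalize(normal)
    lx, ly, lz = normalize(light_dir)
    cos_theta = nx * lx + ny * ly + nz * lz  # dot product
    return light_intensity * max(cos_theta, 0.0)  # no light from behind

# Surface facing straight up, light directly overhead: full brightness.
print(lambert((0, 1, 0), (0, 1, 0)))  # 1.0
# Light at 60 degrees off the normal: half brightness.
print(lambert((0, 1, 0), (0, 1, math.sqrt(3))))  # ~0.5
```

That single dot product is the kind of linear algebra that shows up everywhere in rendering, long before you get to the genuinely hard parts.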
... perhaps the title (currently "One thing that has changed in the professional game industry is that ") can be altered; it's not useful in its current form, and a bit click-bait (but no shade on the author - I think it's because a regular sentence got cut off!)
Crytek started by recruiting people from the demoscene, somewhere in the early to mid 00s. Cevat Yerli did this himself, simply by shooting off some emails to whoever did some cool demos, I guess. I got contacted as well, but this was before they were doing Far Cry. I believe it was some kind of space game, but I'm sure there are some people here who have a better recollection of that.
I suspect this is more about the growing divide between high end games and indie games.
Back in the day, the divide was much smaller. So it was a lot easier for a motivated indie dev to make something like Crash Bandicoot and have it be seen in the same light as AAA studios.
Today, the divide is so high that you can’t possibly make a AAA level game as an indie. But a lot of indie games get overlooked by the mass market.
So I don’t agree with the author. Plenty of people start young, and possibly even more do so now than ever before. It’s never been easier to make a game.
But it’s also never been harder to make games that meet people’s expectations of high end gaming and that’s likely what the author is seeing.
> Today, the divide is so high that you can’t possibly make a AAA level game as an indie.
To latch onto this remark: indie games of today are bigger and better than the AAA games of 20-30 years ago, I think. Groundbreaking games like Commander Keen (side-scrolling on PC) were written by a John Carmack in his late teens / early twenties, who went on to basically invent 3D video games.
That said, one reason indie games are better than old "pro" games is that they stand on the shoulders of giants; Carmack, for example, had to invent a lot of theoretical knowledge and turn it into performant code, and afterwards people could reuse his code and ideas. Programming languages and tooling have improved, so indie developers can focus more on building a game instead of solving lower-level problems.
But they still build their own engines, albeit usually not from scratch. I'm thinking games like Factorio, Cogmind, The Witness, Minecraft, Amnesia, Project Zomboid, etc (yeah I had to do a search to find and confirm these games).
That said, "high-end gaming" is a difficult term to use; for some of the most popular games, graphical fidelity and the like are secondary to gameplay. Minecraft, the best-selling game of all time, was made by one guy / a small team, Stardew Valley by just one guy, and Pokemon and Terraria by just small teams.
The indie games market is saturated. That doesn't mean indie games can't get the same attention as AAA - for example Minecraft and, more recently, BattleBit.
They later clarify that they're talking specifically about engine development, which makes much more sense to me since I see tons of relatively young hires at our studio (albeit no outright teenagers) but none of them for the engine.
I agree with their reasoning that engines are largely commoditized. Ours isn't, but (a) we don't iterate heavily on it any more (so just not hiring for it much in general) and (b) even if we were, most amateurs are using Unity/Unreal because generally there's very little reason for an amateur to write their own engine except for its own sake, so they wouldn't have the low-level experience of someone from e.g. the demoscene or with more general dev experience/credentials.
Just a question: where do you see modern engine technology evolving? My biggest gripe is that the extra processing power in modern machines is used to make static content look prettier and to make its authoring easier for non-technical artists, while the extra possibilities of real-time computation (like fully dynamic GI, collision, and navigation) are not taken advantage of - which leads devs to remake the same setpiece-heavy, railroaded games over and over.
I think it’s more that the studios used to all be small, and the foundational games of the previous generations were made by newly-formed companies composed mostly of young people. Today we have established AAA studios with huge bankrolls that put their money behind developers with a track record.
One of the issues with this is that most of those folks are probably using an engine like Unity or Unreal. That is fine by itself if you want to make a game, but it forces you into certain constraints while letting you avoid understanding the underlying tech, as well as the hardware.
To really get into the fundamentals (computer graphics, pathfinding, collision detection) and make something really impressive and never seen before, like Teardown (https://store.steampowered.com/app/1167630/Teardown/), you really need to start from scratch.
And speaking as someone who has written a basic-ish but fully functional (deferred rendering, dynamic lights, shadows, collision detection etc.) engine, it's really not that hard (well compared to making a shipping game), and having a fundamental understanding of the underlying tech will probably help you to make a game that stands out from the crowd.
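For a taste of how small the core of one of those fundamentals can be, here is a minimal breadth-first pathfinding sketch on a 2D grid. This is a toy illustration (not production engine code, and real games usually use A* with a heuristic instead):

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path on a 2D grid (0 = walkable, 1 = wall).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(bfs_path(grid, (0, 0), (2, 0)))  # routes around the wall row
```

Engine-quality pathfinding (navmeshes, dynamic obstacles) builds on exactly this kind of graph search, which is why "tree? graph? never heard of them" is a real handicap.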
I think that is true for most programming fields, not just game development. Back in the day, a 16-year-old who knew HTML and a tiny bit of JavaScript could easily get hired to do 'web development'. I got hired as a sysadmin fresh out of high school simply by "knowing computers". My interview was basically:
"Hey, are you good with computers?"
"Yes!"
"Great you're hired. Go see if you can fix this thing"
Back then there were simply much fewer people who had any relevant experience so the barrier to entry (experience wise) was much lower.
Roblox, Fortnite Creative, Overwatch Workshop, and Minecraft modding say otherwise. Heck, even Paradox games have a vibrant modding community. Children still yearn for the mines, but they don't need to build their own game engine any more, and the adults have found more efficient ways to exploit them.
There are tons of kids still making games. The AAA scene is fully professionalized, but if you look at what's going on at itch.io, there is no lack of young talent. The older generation forgets that when they were starting, there was no multi-billion-dollar industry pumping out games.
This seems like just an anecdote and not a fact. I'm sure if you looked at the age at which Roblox, Minecraft, etc. developers started, you would see that there are still a lot of people who start young.
This is going to be a thing across tech in the near future. You can see across games, web apps, whatever, there's a trend of more libraries, more frameworks, more guard rails, less fundamentals, less innovation.
The fad these days is to skip learning how to program and instead just learn how to subsist in an ecosystem 'real' programmers have built for you.
You don't need to know how to write a game engine, you just have to download the right assets from the Unity Store.
What does 'state' mean? That's that thing in React right?
Tree? Graph? Never heard of them, do you mean a component?
Difference between a class and a struct? Well I've never heard of a struct but classes are the best thing since sliced bread because you can use them to do a ObservableFactoryControllerFactory and I read in a Medium article that those are good.
The net result of all this is that the code that actually drives everything is being written by older people that got the chance to actually learn, while all the learning opportunities are being stolen from the next generation in the name of getting people "productive" on day 1 (at the expense of the rest of the days of their career).
Those anecdotes you find 10 of in every thread on HN where someone copied a BASIC script out of a book or whatever you boomers did, the modern equivalent of that is npm installing some shit, building 10x the amount of stuff and learning 0.1x as much.
It reminds me of a conversation I had with a friend who recently started programming, through JavaScript and various online tutorials. He introduced me to "PM" for Node, to spawn several processes of the same Node program. I asked him, "And is there shared memory between them?" He said, "What? What do you mean?" I insisted, asking with different wording, and he ended up saying, "Oh, you mean 'state'? Nah, there is nothing shared." So he doesn't know the fundamentals of what memory, processes, etc. are, and uses this nebulous "state" word that could frankly mean anything. But hey, he actually has a better salary than me now. So that's fair - why bother learning fundamentals if it doesn't pay more? Why go through this struggle? You could argue "innovation", but it is way harder to innovate now than before.
I can't help but think this is how assembly programmers viewed people using compiled languages, or how punched card programmers viewed people using interactive terminals.
> You can see across games, web apps, whatever, there's a trend of more libraries, more frameworks, more guard rails, less fundamentals, less innovation.
Eh, way back in 1991 we had things like Visual Basic with drag-and-drop GUI creation. You could easily do things like display bitmaps on screen without ever having created an array or used variable types other than 'variant'.
Maybe they'll end up joining/founding indie companies instead of AAA companies
Perhaps this is more where the author is, and what they're seeing from that perspective?
In adjacent fields of culture, are literary authors, musicians, or film makers getting their break later in life these days?