coffeesque | 2 years ago
I find this attitude kind of depressing. When I was in CS undergrad, I thought about going into security because it seemed like a field where you needed a lot of systems background. To understand some exploits, e.g. Spectre [0], you'd need hardware-level knowledge of branch prediction and the memory hierarchy; for something like stack smashing, you'd need to know the process memory model and maybe some assembly.
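As a minimal sketch of the memory-layout knowledge stack smashing relies on (hypothetical names; a real attack overwrites the saved return address on the stack, not a neighboring field):

```c
#include <string.h>

/* Illustrative only: writing past a fixed-size buffer clobbers
 * whatever the compiler placed next to it. A struct makes the layout
 * explicit and keeps the overwrite well-defined for demonstration;
 * in a real exploit the victim is the saved return address. */
struct frame {
    char buf[8];          /* attacker-controlled input lands here   */
    int  authenticated;   /* stands in for the saved return address */
};

int smash_demo(void) {
    struct frame f = { "", 0 };
    /* 12 bytes into an 8-byte buffer: the last 4 bytes spill over
     * and rewrite `authenticated` (0x01010101 on any endianness,
     * since all four overflow bytes are equal). */
    memcpy(&f, "AAAAAAAA\x01\x01\x01\x01", 12);
    return f.authenticated;
}
```

The struct keeps the example portable; on a real stack frame, the relative placement of the buffer and the saved return address is exactly the kind of thing you need the process memory model (and a disassembler) to reason about.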
It seems to me like security is split into two camps: the people out there in the wild finding exploits, and the "bootcamp" crowd.
[0] https://en.wikipedia.org/wiki/Spectre_(security_vulnerabilit...
don-code | 2 years ago
Much as you hire an electrician, rather than an electrical engineer, to wire your house, or a carpenter, rather than a mechanical or structural engineer, to build it, I do think we should be finding ways to separate "programmer" from "computer scientist". Both are completely legitimate roles and deserve to be paid according to the value they produce, which in both cases is quite high. But the fact that I need a deep understanding of data structures, operating systems, computer architecture, and other college-level concepts just to write software feels like a shortcoming. If CS researchers could set standards for software, rather than _write the software itself_, we could more easily create on-ramps and shallow ends for people to use.
Low-code tools and LLMs are great steps in this direction, but I feel they're still stigmatized from the enablement perspective.
keyle | 2 years ago