
Attack campaign involving stolen OAuth tokens issued to third-party integrators

281 points | heyoni | 4 years ago | github.blog

67 comments

[+] Sytten|4 years ago|reply
If one thing can come out of this incident, it's forcing GitHub to create finer-grained, per-repository OAuth scopes.

I am always uneasy when an application, even a trustworthy one like Sentry, asks for access to ALL my private repositories even when it doesn't need it.

I feel that in 2022 GitHub is way too critical to many organizations, and they should really work on stricter rules. Even Google requires special and costly audits of OAuth applications that access users' critical data like Drive and email.

[+] greggman3|4 years ago|reply
I wrote about this a couple of years ago

https://games.greggman.com/game/github-permission-problem/

GitHub has (had) some of the worst wording in its permission prompts, as well as too many 3rd-party apps asking for blanket permission to do anything and everything in all your repos. It might as well be "hey, give us root on your servers and our new service will do cool things for you!"

And even worse are the 1000s and 1000s of repos that have signed up for those services, including all kinds of popular libraries. They've handed the keys to hack those libraries at will to 10s or 100s of random companies that integrate with GitHub. It's insane.

[+] media-trivial|4 years ago|reply
> If one thing can come out of this incident, it's forcing GitHub to create finer-grained, per-repository OAuth scopes.

This. Limiting OAuth app access to specific repositories: I really wish this were a thing. For this exact reason I create a new GitHub account for every new integration and share the repo with that account. It's technically against the ToS (they only allow a single bot account per user), but there is no other way if you want security.

Maybe GitHub will implement this feature now. My hopes are high.

[+] chucke|4 years ago|reply
> If one thing can come out of this incident, it's forcing GitHub to create finer-grained, per-repository OAuth scopes.

Just noting that GitLab has supported per-repo tokens for ages. Yes, it's really useful, as this incident attests.

[+] judge2020|4 years ago|reply
All the OAuth prompts I’ve gotten in the past 3 years or so seem to have the granular per-repo control you’re looking for.
[+] yuvadam|4 years ago|reply
GitHub already has a per-repo scope, and has supported it for at least several years, AFAIR.
[+] clearf|4 years ago|reply
How _lucky_ that GitHub itself, via npm, was among the targets of the attack.

Unless I'm missing something, this attack could have gone unnoticed for a long time (it would be hard for someone to connect a random breach in their infrastructure to an oauth intrusion affecting two of their service providers).

[+] formerheroku|4 years ago|reply
Neither Heroku nor GitHub are addressing the key part of this - if Heroku's Dashboard apps were compromised, and thus access was given to connected GitHub repos for download - were the secrets in the Heroku config vars for the app also visible? That is the nightmare scenario for the affected app's owners.
[+] prosim|4 years ago|reply
Heroku/Salesforce updated their status at https://status.heroku.com/incidents/2413:

"Salesforce continues to investigate this incident in coordination with GitHub and our retained third-party breach vendor. Once we identify how the threat actor gained access to customers' OAuth tokens, we will immediately take appropriate actions."

Sounds like they simply don't know yet how the actor got access and what else was exposed.

[+] fdr|4 years ago|reply
Looks like it's addressed in an update to https://status.heroku.com/incidents/2413

The compromised tokens could provide the threat actor access to customer GitHub repos, but not customer Heroku accounts

While the situation can get worse, this token alone may not be enough...

[+] mjg59|4 years ago|reply
Just going to say that this is the kind of scenario I was discussing in https://mjg59.dreamwidth.org/59353.html . RFC 8705 actually provides a mechanism for dealing with this, so I guess I'm going to be trying to implement that for our infrastructure in the near future.
[+] ec109685|4 years ago|reply
“A lot of services will verify the user, and then issue an oauth token that'll expire some time around the heat death of the universe”

mTLS is definitely an excellent step towards solving this.

[+] jchw|4 years ago|reply
> Based on subsequent analysis, we believe this API key was obtained by the attacker when they downloaded a set of private npm repositories using a stolen OAuth token from one of the two affected third-party OAuth applications described above.

Does this imply there were unencrypted AWS credentials stored in an npm repository and/or possibly on GitHub? Seems like a bad idea, though I’m sure it’s hard to move off that practice if you’ve been on it for a long time.

[+] nightpool|4 years ago|reply
Specifically, the attack GitHub describes seems to go:

            ???
             ↓
    Heroku/Travis OAuth token
             ↓
      Private GitHub repo
             ↓
    S3 token for NPM prod infra
[+] Aeolun|4 years ago|reply
> Does this imply there was unencrypted AWS credentials stored in an NPM repository and/or possibly GitHub?

Large organisation, hundreds of developers. Kind of likely that something is visible somewhere. Still a bit surprising it’s the prod access keys for npm…

[+] jacobr|4 years ago|reply
GitHub’s own Application Security product has secret scanning, giving you an overview of secrets exposed across your entire organization.
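A rough homegrown approximation of that kind of scanning can be sketched in a few lines of Python. The patterns below are simplified illustrations of common credential formats, not GitHub's actual detectors (which use many more patterns and can validate matches with the issuing provider):

```python
import re

# Illustrative patterns only; real scanners cover far more credential types.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_pat": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_text(text):
    """Return a list of (pattern_name, matched_string) pairs found in text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits
```

Running something like this over `git log -p` output catches secrets that were committed and later "deleted" but still live in history.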
[+] anon3949494|4 years ago|reply
We're a small org with a GitHub organization connected to Heroku. All of our repos were cloned between April 8 and April 15, the majority of them having had no activity for several years. The audit logs don't show this; you can only see this information in the traffic graphs (/graphs/traffic). If you're seeing clones of repos that you haven't touched in a while, you've likely been compromised.
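For anyone wanting to check this programmatically, GitHub's REST API exposes the same clone data as the traffic graphs (last 14 days, push access to the repo required). A sketch; the `suspicious_days` heuristic and its threshold are my own illustration, not an official recommendation:

```python
import json
import urllib.request

def fetch_clone_traffic(owner, repo, token):
    """Fetch the last 14 days of clone traffic via GitHub's REST API.

    Endpoint: GET /repos/{owner}/{repo}/traffic/clones (push access required).
    """
    req = urllib.request.Request(
        f"https://api.github.com/repos/{owner}/{repo}/traffic/clones",
        headers={
            "Authorization": f"token {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def suspicious_days(traffic, threshold=1):
    """Return timestamps of days whose clone count exceeds the threshold.

    For a dormant repo, any nonzero clone count is worth a closer look.
    """
    return [day["timestamp"] for day in traffic.get("clones", [])
            if day["count"] > threshold]
```

Looping this over every repo in the org gives a crude version of the check the commenter did by hand in the traffic graphs.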
[+] Arctic_|4 years ago|reply
Just wanted to check this for my own repos but it’s only available for GitHub Pro users.
[+] vemv|4 years ago|reply
For all the complexity of OAuth, an OAuth token is essentially a free pass for $stuff.

Perhaps we could start validating not only that a token is valid, but also that it is freshly, digitally signed with a known certificate?

I'd say there's a huge difference between:

* I stole a token

* I stole the token AND am able to keep proving possession of my victim's private key

A token can be stolen from many places and is comparatively easy to obtain, while full intrusion into an org's infrastructure is less likely, and even if it happens, eventually will be remediated (and the related certificate will be revoked).

Edit: looks like https://tools.ietf.org/id/draft-ietf-oauth-mtls-09.html would fit the bill.
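
Under RFC 8705 (the successor to that draft), the authorization server binds the token to the client's TLS certificate by embedding the certificate's SHA-256 thumbprint in a `cnf` (confirmation) claim, and the client proves possession of the matching private key during the mutual-TLS handshake. A minimal sketch of the resource-server-side check (function names are my own):

```python
import base64
import hashlib

def cert_thumbprint(cert_der: bytes) -> str:
    """base64url-encoded SHA-256 thumbprint of a DER-encoded certificate,
    the value carried in the cnf["x5t#S256"] claim (RFC 8705)."""
    digest = hashlib.sha256(cert_der).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def token_bound_to_cert(token_claims: dict, presented_cert_der: bytes) -> bool:
    """True if the token's confirmation claim matches the client certificate
    presented on the mTLS connection. A stolen token fails this check unless
    the thief also holds the legitimate client's private key."""
    expected = token_claims.get("cnf", {}).get("x5t#S256")
    return expected is not None and expected == cert_thumbprint(presented_cert_der)
```

The TLS layer has already verified possession of the private key by the time this comparison runs, which is what makes the bare token useless on its own.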

[+] taf2|4 years ago|reply
I like this idea, it makes tons of sense… not only is the token valid, but it’s from a trusted source… it won’t prevent future attacks, but it adds one more lock for an attacker to break before gaining access, and that is a good thing…
[+] remram|4 years ago|reply
mTLS is problematic because it is not proxy-friendly (it needs special support in the frontmost proxy).

AWS and Mastodon use HTTP request signing, which seems interesting.
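The core of HMAC request signing can be sketched as follows. This is a deliberately simplified illustration, not AWS SigV4 itself, which additionally scopes the signing key by date, region, and service and covers selected headers:

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, path: str, body: bytes) -> str:
    """Sign the request parts that must not be tampered with.
    The shared secret never travels over the wire, only the signature does."""
    canonical = b"\n".join([
        method.encode(),
        path.encode(),
        hashlib.sha256(body).hexdigest().encode(),  # body digest, not raw body
    ])
    return hmac.new(secret, canonical, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, path: str,
                   body: bytes, signature: str) -> bool:
    """Recompute the signature server-side and compare in constant time."""
    return hmac.compare_digest(sign_request(secret, method, path, body), signature)
```

Unlike a bearer token, a captured signature is useless for forging other requests, since it only covers that exact method, path, and body (real schemes also sign a timestamp to block replays).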

[+] paxys|4 years ago|reply
Private repos of Heroku customers, to be more specific. Including "Salesforce" in the submission title only causes confusion.
[+] heyoni|4 years ago|reply
I was confused when I wrote the title because they referred to Heroku as Salesforce and I’m not used to that.
[+] media-trivial|4 years ago|reply
I was confused because I got an email from Salesforce at my personal email address. (We use Salesforce at work.) The email looked phishy (e.g. unsecured, non-HTTPS links). Apparently they acquired Heroku in 2010.
[+] hardwaresofton|4 years ago|reply
> Known-affected OAuth applications as of April 15, 2022:

> Heroku Dashboard (ID: 145909)

> Heroku Dashboard (ID: 628778)

> Heroku Dashboard – Preview (ID: 313468)

> Heroku Dashboard – Classic (ID: 363831)

> Travis CI (ID: 9216)

Developers at Heroku and Travis are going to have a rough Friday

[+] londons_explore|4 years ago|reply
This is why it's important to have 'bait' to know if secrets are stolen or attackers have access to something they shouldn't.

For example, my site, serverthiefbait.com might help...

[+] jarym|4 years ago|reply
If GitHub themselves hadn’t been targeted, would their security team even have detected this? That is scary to me!
[+] heyoni|4 years ago|reply
This wasn’t clear to me at first, but it looks like these might be Heroku-specific repos and not Salesforce ones.
[+] Arubis|4 years ago|reply
Not to disagree, but to provide context, Heroku is wholly owned by Salesforce.
[+] Aeolun|4 years ago|reply
So this means they also got access to all the private npm package code? Compromising all the organisations using Travis or Heroku is one thing, but all npm packages is quite another.
[+] nomilk|4 years ago|reply
Can anyone confirm or rule out whether config vars for heroku apps have been exposed?

I can't see anything mentioned on the incident page or the config vars page:

https://status.heroku.com/incidents/2413

https://devcenter.heroku.com/articles/config-vars

[+] samcheng|4 years ago|reply
I don't think they know yet.

It's pretty plain that source-code exfiltration occurred, though. It's not clear to me how exactly to confirm whether the exfiltration happened to your account or not.

From the other thread:

> GitHub indicates they are performing an audit and if they find such evidence they will notify each account/org within the next 72 hours.

[+] jrochkind1|4 years ago|reply
I still don't totally understand, can anyone read between the lines to explain what's going on here? The attackers stole what sort of OAuth tokens with what sort of access?
[+] tomatowurst|4 years ago|reply
I could be wrong, but you know how, when you sign up for services, you have the option of signing in with GitHub or other OAuth providers? I think they used this to gain access to various private repos, npm packages, and cloud credentials.

So it seems like if your company had some super-secret project running on GitHub, Heroku, or AWS, they had full access to it.

This is my rough understanding from just skimming, so I could be wrong.

[+] nomilk|4 years ago|reply
Am I right to say GitHub knew about this on April 12 [1], but Heroku customers are only finding out now, 4 days (!!!!) later?

Heroku boasts:

> we will notify affected customers by email without undue delay.

4 days. Really. "without undue delay".

[1] https://github.blog/2022-04-15-security-alert-stolen-oauth-u...

[+] sofixa|4 years ago|reply
> Following immediate investigation, we disclosed our findings to Heroku and Travis-CI on April 13 and 14

13-14, so "only" 1-3 days later, depending on what GitHub mean and timezones involved.

[+] no-dr-onboard|4 years ago|reply
Anyone else noticing an uptick in OAuth/authentication bugs in integrations for CI/CD and dev-ops platforms?
[+] rsanheim|4 years ago|reply
Yikes. This is bad. On a Friday night, no less.

Some questions that come to mind:

* Would encrypted secrets for GitHub Actions be exposed to these tokens?

* How to best audit for unauthorized access to exposed repos?

* Were Heroku encrypted secrets exposed?

[+] judge2020|4 years ago|reply
Nothing can pull the values of secrets from GitHub Actions without either running a compromised/modified runner, or the workflow itself encrypting/printing/shipping those secrets off somewhere. If you’re really worried, use OIDC with GitHub Actions where possible: https://docs.github.com/en/actions/deployment/security-harde...
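On the cloud side, the trust policy inspects claims in the short-lived OIDC token GitHub mints per workflow run (e.g. `repository`, `aud`). As a toy illustration of what such a token carries, here's an unverified payload decode; a real verifier must first check the signature against GitHub's OIDC provider keys before trusting any claim:

```python
import base64
import json

def decode_jwt_claims(jwt: str) -> dict:
    """Decode (WITHOUT verifying!) the payload segment of a JWT so its
    claims can be inspected. Never make trust decisions on unverified
    claims; this is for illustration/debugging only."""
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

Because the token identifies the exact repository (and ref) that requested it, the cloud provider can hand out credentials scoped to that one workflow instead of a long-lived secret sitting in Actions.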
[+] rndgermandude|4 years ago|reply
>On a friday night no less.

Not just any Friday, but Good Friday, which happens to be a legal holiday in many countries, including here in Germany. And since Easter Monday is one too, a lot of people will go on vacation over this "long" weekend (maybe even taking a couple of days off before or after to make it even longer)... flights and hotel rooms booked, and all that.