snagg | 1 year ago | on: Comparing Auth from Supabase, Firebase, Auth.js, Ory, Clerk and Others
snagg's comments
snagg | 2 years ago | on: When MFA isn't MFA, or how we got phished
snagg | 2 years ago | on: Securing Your PostgreSQL DB with Roles and Privileges
snagg | 2 years ago | on: Adding Identity to Docusaurus
sorry for the late reply, just saw this. Do you mean to have certain pages public while others are private?
If so, yes. You can make login optional (using the slashID.forceLogin parameter. See here: https://github.com/slashid/docusaurus-slashid-login/blob/mai...) and restrict certain pages to only logged-in users. You can also use user groups/roles to further segment access to pages (https://www.slashid.dev/blog/groups-react/).
To get the slashID.orgID parameter for the theme you can sign-up here: https://console.slashid.dev/signup
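Purely as a sketch of how those two parameters fit together (the theme package name here is hypothetical; check the docusaurus-slashid-login repo's README for the exact names and defaults), the docusaurus.config.js wiring might look roughly like:

```javascript
// docusaurus.config.js (sketch only; verify parameter names against the
// docusaurus-slashid-login README before using)
const config = {
  title: 'My Docs',
  themes: ['@slashid/docusaurus-theme-slashid'], // hypothetical package name
  themeConfig: {
    slashID: {
      orgID: 'YOUR_ORG_ID', // obtained from https://console.slashid.dev/signup
      forceLogin: false,    // false = login optional, so pages can stay public
    },
  },
};

module.exports = config;
```

With forceLogin off, page-level restrictions and group/role checks then decide which pages actually require a logged-in user.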
Send me an email at [email protected] and we can add you to our Slack in case you have any issues with it.
snagg | 2 years ago | on: Tailscale doesn't want your password
Practically speaking, if the passkeys are stored in your iCloud Keychain they are automatically synced across your Apple devices, and the recovery mechanism is the recovery mechanism for your iCloud account.
Similar considerations apply to Google/Chrome and other password managers.
We wrote a relatively long blogpost about this + implementation and threat modeling considerations in case it's interesting: https://www.slashid.dev/blog/passkeys-security-implementatio...
snagg | 2 years ago | on: Passkeys now support external providers
Passkeys are definitely a leap forward in that we are shifting the bulk of the account takeover risk away from end users using weak passwords or clicking on phishing links, and towards:
1) The server-side implementation, including any mechanism for account recovery and support for multiple passkeys/auth factors
2) The browser enforcement checks (eg: this is what Chrome does: https://www.slashid.dev/blog/webauthn-antiphishing/)
3) The wallet/keychain/password manager holding the keys (there's a lot of variance here in terms of security guarantees, see recent password managers breaches. We wrote a bit about how Apple does it: https://www.slashid.dev/blog/passkeys-deepdive/#the-technica...)
4) The authenticator itself (again, lots of variance here)
All of which are harder to compromise than the average end user.
There are still scenarios where the end user could be targeted/tricked, but they are fewer and harder to pull off (to name a couple: malware stealing the private keys, or an account takeover of the password manager itself).
snagg | 2 years ago | on: Passkeys now support external providers
The WebAuthn spec recommends registering multiple passkeys/credentials per device and assuming that once a credential is lost it might not be recoverable.
Apple and other vendors using keychains/wallets are effectively offering the option to delegate the recovery of the passkey to the recovery of the account with them (eg: the iCloud account).
In case it is of interest, we wrote a long blogpost on the topic: https://www.slashid.dev/blog/passkeys-security-implementatio...
snagg | 2 years ago | on: Understanding Passkeys
I'm the author of the SlashID blogpost. You are right, the WebAuthn standard doesn't provide any guarantees about authenticator storage security, hence passkeys (and WebAuthn creds) can be stored in anything that speaks CTAP2.
We wrote a follow-up blogpost talking about the threat model in which we touch on the above: https://www.slashid.dev/blog/passkeys-security-implementatio...
snagg | 2 years ago | on: Understanding Passkeys
I'm the author of the blogpost. You are spot on, Passkeys are exportable, so the private key ends up both in iCloud and in the Enclave/authenticator.
My understanding is that there's chatter about cross-vendor synchronization of passkeys but nothing concrete yet.
Meanwhile Apple allows people to share Passkeys via AirDrop (Settings > Passwords - select the passkey you want and click the "Share" icon to send it over Airdrop) so it should be possible with some effort to obtain the private key with something like this: https://github.com/seemoo-lab/opendrop. Haven't done extensive testing yet though, so I can't confirm.
Would love to hear if anybody knows more about how the sharing via AirDrop is implemented/protected.
snagg | 3 years ago | on: Bitwarden Acquires Passwordless.dev
Technically a Passkey is just a multi-device FIDO credential that is compatible with WebAuthn (which is an official W3C and FIDO spec).
However, vendors' implementations of Passkeys/FIDO credentials differ quite widely. The Apple implementation of Passkeys, as an example, doesn't provide attestation information, which reduces the ability to do device verification. Similarly, even though it's not technically part of Passkeys, Apple removed the possibility to create device-bound WebAuthn keys, which significantly weakens the security guarantees you'd normally get with WebAuthn.
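To illustrate what "no attestation" means in practice, here is a tiny sketch, assuming the attestationObject has already been CBOR-decoded into a plain object (the function name is mine, not from any library): with fmt "none" and an empty attStmt there is simply no signature chain to verify the authenticator model against.

```javascript
// Sketch: can the relying party do device verification at all?
// Apple passkeys effectively register with fmt 'none' and an empty
// attStmt, so this returns false and the server cannot verify which
// authenticator model produced the credential.
function canVerifyDevice(decodedAttestation) {
  return decodedAttestation.fmt !== 'none' &&
         Object.keys(decodedAttestation.attStmt || {}).length > 0;
}
```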
snagg | 3 years ago | on: Show HN: HSM-backed PII storage directly from the front end
Our team would love to get your feedback and answer any questions!
snagg | 15 years ago | on: Zed's new project: Vulnerability Arbitration
Then I fail to understand what you plan to do with the encrypted material you get from the researcher, since the only one able to decrypt it would be the vendor. At that point the scenario is: the researcher says a product is vulnerable, the vendor denies it, and somebody needs to decrypt the content of what the researcher sent to the vendor. In contrast, without the encrypted content the scenario is: the researcher says a product is vulnerable, the vendor denies it, and the researcher can publish the unencrypted original advisory on your website if he feels like it.
Regardless, the SSL trick is a nice one; I just don't see the point of having a third party involved, and I somewhat doubt that a website like this can be useful to the end user or put pressure on the vendor.
snagg | 15 years ago | on: Zed's new project: Vulnerability Arbitration
Usually researchers' submissions are way more detailed than the advisories you see from vendors, meaning you cannot just decrypt the content of what the researcher submitted. It's true that oftentimes you can find the bug by reading the advisory and using tools to diff the patch, but it's a long shot compared to just publishing the original researcher's submission imo.
>Uh, not sure what the "middle-man" is, but if you mean vulnarb.com then no, the point is that it's industry standard asymmetric crypto so I wouldn't know anything. In fact, I'd have incentive to not know anything so that I'm not getting sued.
Yep, I meant vulnarb.com. I'm not saying that you have anything decrypted and ready to use; I'm saying that it's pointless to have another recipient for sensitive data. In the extreme scenario where the private key is stolen, it's just more attack surface to get to the submissions. And if we assume that no leak of this sort happens, what's the reason for a third party to hold this sort of data compared to, say, a SHA-1 hash of the PoC?
Actually, my point about the karma system is that you can achieve the same goal you have in mind without needing any data from the researcher other than "product X is vulnerable"; the karma points will then determine whether the researcher is reliable or not.
snagg | 15 years ago | on: Zed's new project: Vulnerability Arbitration
Co-founder of SlashID here (one of the companies mentioned above). I think we have exactly what you are looking for:
https://www.slashid.dev/blog/anonymous-users/
https://developer.slashid.dev/docs/access/concepts/anonymous...
Hit me up if you want to chat more: [email protected]