In Linux, I can easily monitor how much and what kind of data is transferred to a remote IP, and disable it on a per-app basis anytime I want.
In iOS, I can't do any of that. At most, I can disable an app from using cellular data. If anyone knows how to monitor or block network connections on a per-app basis in iOS, I would love to hear about it.
For me, it is the difference between "trust" and "trust and verify".
I use iOS and have some level of trust in Apple/iOS. But I don't trust the majority of iOS apps with security/privacy.
To use a public key with something like Userify (https://userify.com, plug: ssh key management) or a service like GitHub, use --export-key to export the public key in OpenSSH format:
sekey --export-key <key-id>
(See also in the readme that you can use sekey --list-keys to get key IDs.)
This is seriously such an awesome project that I might have to get a new MBP just for this.
This is a really cool project. I just tried to compile it myself and code-signed it with my Mac dev key. I changed the assets/sekey.entitlements file by replacing 5E8NNEEMLP with the hex ID of my own developer key.
However, when I run ./bundle/Sekey.App/.../sekey, I keep getting a "Killed 9" message.
When I run the unsigned version, the binary at least runs (shows the -h message). Any hints on how to fix this?
That's because the code signing is invalid. Are you signing the whole app, or only the binary? It could be many things; one is the lack of a provisioning profile on your computer (for your key).
This looks useful. I'd love to use this with gopass (https://github.com/justwatchcom/gopass). It would have to support preexisting keys for that to work for me.
I wonder whether the limitation to elliptic curve keys originates with the Secure Enclave, or whether that's just the one type of key this tool supports?
It's an SE limitation :(. It also makes me sad; it would be awesome if you could import a key you generated yourself, so that after a reinstall you could import it into the enclave again.
I'm amazed there's a market for this kind of thing. Since the 1990s it's been known how to implement an undetectable backdoor in hardware-based crypto which leaks your keys to an attacker.
The tl;dr is that any device in which you can't check the implementation or get the private key out (e.g. Secure Enclave, TPM, etc.) can leak your key to a passive attacker in a way that you provably can't detect.
On the flip side, you can also demonstrably show that private keys stored on disk can be exposed with a minimal intrusion into your user account, also with no indication that they were leaked.
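To make the leak concrete, here is a toy sketch (not the construction from the papers, and nothing specific to the Secure Enclave): a Schnorr-style signer over a tiny group whose nonce is derived from a seed planted by the manufacturer. Every signature verifies normally, yet a passive observer who knows the seed recovers the private key from a single signature. The group parameters, the PRF, and the shared-seed shortcut are all illustrative assumptions; real SETUP attacks embed the leak under the attacker's *public* key, so even tearing the device apart afterwards reveals nothing.

```python
# Toy kleptographic ("leaky") signer. All parameters are tiny demo values,
# NOT a real curve and NOT a real attack on any shipping hardware.
import hashlib

# Small Schnorr group: p prime, q prime dividing p-1, g of order q.
p, q = 607, 101
g = pow(2, (p - 1) // q, p)  # g = 2^((p-1)/q) mod p has order q

def H(*parts):
    # Hash arbitrary values to an integer (toy random oracle).
    h = hashlib.sha256("|".join(str(x) for x in parts).encode()).digest()
    return int.from_bytes(h, "big")

def hash_challenge(r, m):
    # Map into [1, q-1] so the challenge is always invertible mod q.
    return H(r, m) % (q - 1) + 1

ATTACKER_SEED = b"planted at the factory"  # known only to the manufacturer

def leaky_sign(x, m, counter):
    # An honest signer picks k uniformly at random. The backdoored signer
    # derives k from a secret the attacker also knows -- the signatures
    # produced are statistically indistinguishable to everyone else.
    k = H(ATTACKER_SEED, counter) % (q - 1) + 1
    r = pow(g, k, p)
    e = hash_challenge(r, m)
    s = (k + e * x) % q
    return r, s

def verify(y, m, sig):
    r, s = sig
    e = hash_challenge(r, m)
    return pow(g, s, p) == (r * pow(y, e, p)) % p

# Device owner's key pair (x never leaves the "device").
x = 57
y = pow(g, x, p)

# The signature looks perfectly valid to any verifier.
sig = leaky_sign(x, "ssh-login-1", counter=1)
assert verify(y, "ssh-login-1", sig)

# A passive attacker who sees one signature (and knows the planted seed)
# recovers the private key: x = (s - k) / e mod q.
r, s = sig
k = H(ATTACKER_SEED, 1) % (q - 1) + 1
e = hash_challenge(r, "ssh-login-1")
recovered = (s - k) * pow(e, -1, q) % q
assert recovered == x
```

The point of the construction is that `verify` only ever sees public values, so no amount of black-box testing distinguishes `leaky_sign` from an honest signer.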
Because this research exists, there can't possibly be a market for a security product whose threat model might not include a malicious hardware manufacturer?
My threat model excludes hardware compromise because it's dead easy for an attacker in the position of sticking malicious hardware into my laptop to e.g. run a malicious BIOS that notices when I'm doing crypto operations and wakes up and steals secrets off the CPU. If I worried about that I wouldn't use my computer at all.
I think the market for people whose threat model includes hardware compromise is extremely tiny. It should include purchasers of external Bitcoin hardware wallets, as you suggest, but it probably doesn't include the average SSH user deciding whether to trust hardware built into their laptop.
Also, in the specific case of macOS, your hardware manufacturer can more easily just ship you a malicious ssh binary or ssh-agent....
nyolfen | 8 years ago
https://krypt.co/
I'm not a security expert, and I have a lot more trust in my iPhone for managing secrets than in my personally configured Linux PCs.
floatboth | 8 years ago
But with a TPM there's no external unlock mechanism like Touch ID; the TPM unlock happens from the operating system.
falcolas | 8 years ago
Can it also support a pin to go with the biometric auth?
ntrippar | 8 years ago
https://developer.apple.com/documentation/security/ksecattra...
viraptor | 8 years ago
That makes me sad :( Does anyone know if that's a SE limitation, or the app's?
timdorr | 8 years ago
That's also why it only generates one kind of key. It's a black box that spits out public keys.
galadran | 8 years ago
E.g. http://paper.ijcsns.org/07_book/201006/20100623.pdf details how to do it for Elliptic Curves. But it's been studied since https://link.springer.com/content/pdf/10.1007%2F3-540-69053-...
falcolas | 8 years ago
Another layer of security is never bad.