In Australia it's mandated that you're sent a message before your number is rerouted or migrated to another provider. I'm surprised this isn't enforced in other countries; it costs next to nothing to implement and is just an additional step in the account migration process.
I'd love to see companies allow opt-in additional security measures, like banks or telcos calling me and having a verbal password to confirm things. That level of security seems to only be available to VIPs.
I tried to get T-Mobile to stop giving my location to anyone that hits their APIs with a 'Yes I have permission' flag set.
There's no opt-out for it, and no enforcement of the permission requirement. Their support had me snail mail a letter to some PO box. I never got a response.
And now they're going to start outright selling their customer activity after forcibly un-opt-outing* everyone who had opted out in their privacy settings previously.
*un-opt-outing -- ??? I don't know what to call this. It's not 'opting in', since nobody has a choice. 'Resetting user selection without notification or consent' seems too mild and wordy.
Back in the early days of mobile number portability the majority of telcos put in systems to make porting out harder, e.g. getting an unlock code. This gave them a chance to keep the customer when they called up.
Regulators (particularly in Europe) soon put a stop to that to promote competition. While this was good, the majority of regulators failed to put in a consumer protection mechanism to stop identity theft through account stealing.
The article describes a more insidious attack, as the mobile account is still active (hiding the existence of the attack from the user), but the message destination has been rerouted, making all the linked accounts that use SMS as their 2FA also vulnerable.
I think this particular issue is specific to North America, due to peculiarities of the NANP phone number scheme (inter-provider texts are routed quite differently from voice calls, if I understand it correctly).
In other countries, the two channels are more closely coupled (but SIM swap and/or number porting attacks are still possible, depending on the provider's security protocols).
Nice one. My neighbour chaired the Australian inter-carrier roundtable implementing mobile phone number portability. I will send him a note! Other cool hacks of his: automatic video advertising scheduling system once saved Channel 9(?) from airing a gas oven ad during a Holocaust documentary. Scored a bonus for that one.
In Canada with Telus you can put a "lock" on your account, so there are additional steps (like going into a store with ID) to remove it before you can port a number or SIM swap (I think). Still don't use my phone number on my Google accounts though.
SMS is irredeemably broken, like all telco-designed garbage protocols. The only way you can incentivize companies to stop using it as security theater is to shift liability, so any losses incurred by SMS jacking are automatically the liability of the company using SMS, just as nowadays any credit card fraud is borne by the party that is not using the EMV chip to secure a transaction.
Reminder: SMS 2FA adds only a negligible amount of security, if your company does 2FA via SMS you're doing nothing more than lulling your users into a false sense of security. Don't do it. Support proper 2FA. (And while you're at it, allow your users to decide how much they care about their account. Don't make the decision for them.)
> Reminder: SMS 2FA adds only a negligible amount of security
I would disagree. Obviously, there are better approaches, but consider basic password auth on desktop, which is easily exploitable en masse by botnets. If you add 2FA via SMS, an attacker needs to compromise both devices (or attack SS7, transfer the number, or use some other trick) and match the info from both. That can be done in a targeted attack, but it's harder in mass botnet attacks.
SMS 2FA is, at best, just adding a little hassle for the hacker. If it's not a targeted attack, there's a chance that the extra effort means they'll move on, but that won't stop any remotely determined hacker.
I tried to set up a Twilio number specifically to handle these services that demand SMS for login.
Weirdly, it only works for a minority of services. I expect many use Twilio to send their auth texts, and Twilio blocks sending these to their own numbers?
Too many services use phone numbers as the keys to the kingdom. It's a convenient and stable identifier, but holy shit it's not designed for security at all.
It's neither convenient nor stable for anyone moving between countries either. When given the choice between a service that uses my phone number as my permanent user identifier and one that uses my email, I'll always go for the latter.
Unfortunately, big parts of the industry seem to be headed the other direction.
This is about complete takeover of SMS for a phone number.
The threat model is beyond 2FA, imagine being able to impersonate anyone over text.
Social engineering gone to the next level. This isn't about just taking over accounts, it is about taking over a huge chunk of someone's social existence.
It is not stable in the least for millions of Americans, especially those who live in poverty (I'm not sure about the rest of the world). Phones are lost or stolen, phone numbers changed because of being harassed by debt collectors, ex-partners, current partners, etc. And if it isn't stable, it isn't convenient.
Yep, I hate this with a passion. I deprecated text messaging a decade ago, too. Anyone who thinks they can reach me by some ancient 160-character mobile-operator-controlled service, it's their fault for not getting with the beat in 2021.
It’s worth pointing out that often LOA forms ask for a PIN, usually the same PIN as would be required to check voicemail. A better telecom company might make the PIN something harder to remember but enforcing such things would also make it harder to switch carriers, particularly if it replaced today’s standard forms of ID checks.
It’s better to assume that until phone numbers can be locked and unlocked the way domains can, with a random authorization code only accessible by real offline 2FA (though not all domain providers require it), and with the option of completely encrypted end-to-end texting (RCS?), well, then SMS won’t really be all that secure.
My reading of this article suggests that the PIN requirement for number porting is bypassed in this forwarding scenario, since this method is claimed to be distinct from simjacking. That is, the number hasn't been ported by the FCC's guidelines, although I didn't glean exactly how that's happening by these retail providers.
Yes. There's nothing special about a mobile phone number when it comes to SMS delivery. The underlying infrastructure company given in the article, Bandwidth, provides phone number provisioning and bulk service for Google's Voice product. On-net (one number hosted by Bandwidth to another number hosted by Bandwidth) might be slightly more of a hurdle to intercept or redirect but off-net is fairly trivial.
Heck, even with "port lock" enabled on a Google Voice number, that is the barest of security against an attacker who has any kind of access better than "retail store employee." Working for a telco with access to our back-end port system, access several other people had, I could forcibly acquire a number by simply checking a box that said I had verified a written LOA even if the losing carrier responded with code 6P ("port-out protection enabled").
So, yes, you're likely sitting in a security-by-obscurity, or at least security-by-slightly-more-difficult-than-someone-else, situation.
It would be useful to understand the flow of an SMS from a source to a Google voice number. While you can't port a Google voice number, it seems like if you can intercept an SMS from a source before it gets to Google then this technique will work.
A useful strategy to help against this in any case is to use a different email address for every online service. Hackers generally can't initiate an account password reset if they don't know the account.
Also if you use a different phone number for account security than your public one then it's a lot harder for them to know what SMS to intercept. Security by obscurity sucks but in this world it may be your only practical choice.
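One hypothetical way to implement the "different identifier per service" idea without keeping a list is to derive each email alias deterministically from a master secret; the function name and the plus-addressing scheme here are illustrative assumptions (plus-addressing support varies by mail provider), not a specific product's feature:

```python
import hashlib
import hmac


def service_alias(master_secret: bytes, service: str, base_user: str, domain: str) -> str:
    """Derive a deterministic, hard-to-guess email alias for one service.

    The tag is an HMAC of the service name keyed by a master secret, so
    an attacker who learns the alias used at one service learns nothing
    about the alias used at any other.
    """
    tag = hmac.new(master_secret, service.encode(), hashlib.sha256).hexdigest()[:10]
    return f"{base_user}+{tag}@{domain}"
```

Because the derivation is deterministic, you can regenerate any alias on demand, and a leaked address from one breach can't be used to guess the login identifier at another service.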
So, when my nontechnical friends ask me what they should be using for 2FA, I'm kind of at a loss what to tell them. It's either a false sense of security (e.g., SMS), or too complicated for them (Yubikey).
WebAuthn. A Yubikey would work for that, but so would cheaper products (the keywords for a product search are "FIDO security key"), which are similarly capable.
If they have a nice phone (modern iPhone or Android phone that is able to recognise who you are by fingerprint or facial recognition ought to be enough) that can do WebAuthn too, the actual recognition remains local to your device (so you're not giving some mysterious entity your face or fingerprint).
I'm assuming, since they're "nontechnical", that you mean as users. The user experience for WebAuthn is trivial: one touch. You do this to enroll the Yubikey, and then you do it whenever you need to prove who you are to the same site. It's entirely phishing proof, the credentials can't be stolen, you can keep one on your keyring or just leave it plugged into a personal PC all the time, and it has excellent privacy properties. The biggest problem is that too few sites do WebAuthn, but Google and Facebook do, so that's a good start for non-technical people.
Which brings me to the other side: if your non-technical friends are wondering what their organisation should mandate, then again, WebAuthn, but this time I admit it's somewhat complicated. Somebody is going to need to at least research what product suits the userbase, and check boxes in the software they use, and at worst they need to do a bunch of software development. It's not crazy hard, but it's a bit trickier than yet another stupid password rule requirement. However, unlike requiring passwords to contain at least two state birds and the name of an African country, requiring WebAuthn will actually make you safer.
What are the problem parts with Yubikey? I've got a Feitian ePass and it feels like the most natural way of doing auth - here's a key, it goes on my keyring next to my locker key or my car key if I had one, and I use it to log in like putting a key in a lock.
I believe the practical solution for many people is to switch the 2FA to an authenticator on-your-phone code generator, which someone cannot hack easily.
Most important account / banks / etc services now offer this option.
The only thing is, though, make sure to keep backups of the codes you use to initialize the authenticator app, because for some services there is no recovery if you lose your phone or don't have backups.
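The on-phone code generator described above is typically TOTP (RFC 6238): at enrollment the service and the app share a secret (the QR code), and both sides then derive a short-lived code from it locally, so there is no SMS to intercept. A minimal sketch using only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    t = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(t) // step)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Against the RFC 6238 test secret ("12345678901234567890", base32-encoded), the 6-digit code at Unix time 59 is 287082. This also makes the backup advice concrete: whoever holds that base32 secret can generate valid codes forever, which is exactly why the enrollment codes must be backed up and guarded.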
Don't use phone-number-based 2FA, or if you must use a number, keep it to an app (e.g., Google Voice) and don't forward your Google Voice texts to your phone's number.
Basically, avoid using your carrier provided phone number for anything related to an account.
I note, however, that this attack seems to only be possible on VOIP routable numbers, and it’s my experience that banks, etc, will not allow you to use VOIP routable numbers for 2FA.
That's definitely not the case for a naive implementation of SMS 2FA, as would likely be done by any dev using Twilio, etc.
Yes, this only applies to VOIP. The author of the article is dishonest. He mentioned that his T-Mobile phone number got hacked, but I am willing to bet that this is marketing…
Lots of comments here along the lines of "SMS 2FA is bad", but hell, if the phone companies had an appropriate level of liability here (which should be a shit ton), this should be impossible.
And it's not just about 2FA, most of humanity expects that if someone else texts them, those texts will go to their phone and only their phone unless they've given explicit verifiable consent.
I mean, in this case all the hacker did was fill out a form and say pretty please. I hope phone companies that allow this get sued.
This would also be impossible if services stopped demanding your phone number to make an account.
This is a growing trend in consumer services, and it's a privacy nightmare.
Imagine if they demanded your SSN to sign up? A phone number is no less sensitive a unique identifier, perhaps even more so these days.
There are widespread reports of delivery businesses selling their phone number databases (with associated credit card suffixes, delivery addresses, order history, et c) to large advertising companies for data mining.
Providing your direct cell number to an app is basically like providing your home address and a bunch of other sensitive data. Don't do it, or make a burner gmail account to get a disposable Google Voice number for each account that you must have that demands a phone number. Then, that number isn't reused and an attacker that obtains your mobile number can't attack your login method for other apps.
Reusing phone numbers is about as bad as reusing passwords.
But what is an appropriate level of liability here? Phone companies never signed up to be the guardians of our digital lives, and the tech industry at large has just built a castle on shaky foundations.
And there are obvious trade-offs here, if we make number portability harder, it means you're somewhat hostage to your phone provider.
Isn't this easily solvable with the additional SMS token approval mentioned in the article?
> "Horsman added that, effective immediately, Sakari has added a security feature where a number will receive an automated call that requires the user to send a security code back to the company, to confirm they do have consent to transfer that number. As part of another test, Lucky225 did try to reroute texts for the same number with consent using a different service called *Beetexting*; the site already required a similar automated phone call to confirm the user's consent. This was in part "to avoid fraud," the automated verification call said when Motherboard received the call. Beetexting did not respond to a request for comment."
But it seems that the entire system is globally infested with security holes. Is this applicable worldwide, or just limited to one country?
Sakari was just dumb, and deserves the bad press. I've built similar products, and we launched with the "phone call to verify" feature specifically to prevent this type of abuse.
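The "phone call to verify" feature described above can be sketched as a simple challenge flow: generate a one-time code, deliver it by automated voice call to the number being claimed, and only enable rerouting if that exact code comes back. This is an illustrative sketch in Python (the class and method names are made up, and call delivery is stubbed out), not Sakari's actual implementation; a real system would also add expiry and attempt limits:

```python
import hmac
import secrets


class ConsentVerifier:
    """Sketch of a call-back consent check before enabling SMS rerouting."""

    def __init__(self):
        self._pending = {}  # number -> expected one-time code

    def start(self, number, place_call):
        """Generate a code and deliver it to the claimed number by voice call.

        `place_call` stands in for the telephony integration.
        """
        code = "".join(secrets.choice("0123456789") for _ in range(6))
        self._pending[number] = code
        place_call(number, f"Your rerouting consent code is {code}")
        return code  # returned here only so the sketch is easy to exercise

    def confirm(self, number, submitted):
        """Enable rerouting only if the submitted code matches, exactly once."""
        expected = self._pending.get(number)
        ok = expected is not None and hmac.compare_digest(expected, submitted)
        if ok:
            del self._pending[number]  # codes are single-use
        return ok
```

The point of the design is that only someone who actually controls the phone (or its voicemail, which is its own weakness) can hear the code, so a forged Letter of Authorization alone is no longer enough.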
Based on the high-level description given in the article, it seems to be related to an ENUM lookup or net number. ENUM is basically a kind of DNS lookup for phone numbers, used for SMS routing. It's also used to route SMS belonging to a user to an application (in case you want your texts delivered to an app): the company changes the ENUM record for the number to a code that belongs to the company and reroutes the messages to its services.
So the hack is not really a hack, in the sense that the system works as intended; the safety net is missing, though. The company operating the ENUM is supposed to check the legitimacy of the change.
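For the curious, the ENUM name mapping itself (RFC 6116) is simple: the E.164 digits are reversed, dot-separated, and placed under e164.arpa, and the routing data then lives in NAPTR records at that DNS name. A sketch of just the name mapping (actual NAPTR resolution would need a DNS library, and the example number is made up):

```python
def enum_domain(e164_number, suffix="e164.arpa"):
    """Map an E.164 number to its ENUM DNS name (RFC 6116).

    Digits are reversed and dot-separated, e.g. +15551234567 becomes
    7.6.5.4.3.2.1.5.5.5.1.e164.arpa. A resolver then queries NAPTR
    records at that name to find where SMS/voice should be routed.
    """
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + "." + suffix
```

Whoever controls the records at that name effectively controls where messages for the number go, which is why the legitimacy check on changes matters so much.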
That’s crazy that there is no verification system in place allowing the user to approve the forwarding.
Years ago I asked my carrier to not port or forward without me being physically present at a store. Maybe I should test them out to see if that’s still the case.
Regardless, I don’t use SMS MFA for anything important and even when I do, I have a 32 character password to go along with it.
> While adding a number, Sakari provides the Letter of Authorization for the user to sign. Sakari's LOA says that the user should not conduct any unlawful, harassing, or inappropriate behaviour with the text messaging service and phone number.
But as Lucky225 showed, a user can just sign up with someone else's number and receive their text messages instead.
So this means that the only protection from attacks like this is the law, and not a technical or operational hurdle like going through an AT&T hotline to get sim swapping going.
This is bad news because following the law isn't a top priority when trying to hack someone.
What I would find really interesting is if someone used this exploit to hack into the accounts of Sakari staff and sabotaged their service, deleting all their infrastructure from their cloud hosting provider etc. I'm sure Sakari would take this security hole more seriously if their own C-suite fell victim to it.
Weird. The whole idea behind the whole company is to send SMSes on behalf of its customers, if I understood the article correctly. So why would they need to muck about with reassigning the phone numbers of SMS recipients in the first place?
My strategy is to have a second phone that has Authenticator and is also the phone for any SMS based 2FA.
The phone is locked in a file cabinet when not in use and never leaves my desk.
An extra phone only costs me $10/month. Well worth the peace of mind.
These hackers have so much time on their hands that they can understand this technology better than its creators and abuse it. Amazing how hacker culture works.
Damn lies. Damn lies. The attack vector only works for VOIP or Toll Free Numbers. The upstream agreements already block Mobile numbers. This is paid marketing for his company.
Not sure why this isn't higher up. This is crucial information showing this is FUD.
There are still grave vulnerabilities in mobile provider SMS (2FA or otherwise) due to how easy it is for a dedicated attacker to SIM swap, but this particular claim is completely misleading.
Okay, so how did he manage to pull this off, and is this still possible? How would you protect yourself against this attack? (I don't understand how it works.)
The details are in the article, but essentially the attacker used a third-party bulk SMS service that allows its users to bring their own number and routes SMS messages to said service provider.
The attacker instead used the cell number of the author of the article, and supplied a fraudulent letter authorizing the re-routing of text messages through the bulk SMS service.
The attacker works for a service, which purports to verify the routing and carrier settings for a given mobile phone number; I expect that their solution periodically checks the results and issues an alert if the results differ from a known valid value.
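A monitoring service like the one described could plausibly boil down to polling a routing/carrier lookup and diffing the result against a known-good baseline; the lookup callable and the field names below are assumptions for illustration, not the company's actual API:

```python
def check_routing(number, lookup, baseline):
    """Compare a number's current routing against a known-good baseline.

    `lookup` is a stand-in for a carrier/routing query (e.g. an HLR or
    ENUM lookup) returning a dict such as {"carrier": ..., "sms_route": ...}.
    Returns a list of (field, expected, actual) mismatches to alert on;
    an empty list means nothing has changed.
    """
    current = lookup(number)
    return [
        (field, expected, current.get(field))
        for field, expected in baseline.items()
        if current.get(field) != expected
    ]
```

In the rerouting attack from the article, the SMS route would change while the carrier stayed the same, so a periodic check like this could flag the hijack even though the victim's phone keeps working normally.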
"Sorry, you're locked out forever, good luck lol" is not a response you can give to them.
The numerous emails I get when I log in from a new device serve me pretty well, all things considered.
Feels like the industry needs to push for a dedicated, universal, probably physical, tool for 2FA.
It's a really backwards and confusing system, I agree.
I had a miserable time trying to get into Backblaze recently; even the ability it offered to switch SMS providers failed.
The list of backup keys they give you on setup bailed me out eventually, but it took me a while to remember them.
There's got to be a better system.
Lucky's company has this product that can monitor for the attack, but it won't prevent it: https://okeymonitor.com/
Also, don’t forget that NIST deprecated SMS 2FA over 5 years ago. Here’s their reasoning: https://www.nist.gov/blogs/cybersecurity-insights/questionsa...