Just got a new phone and migrating Signal was a total disaster. The first thing I did was log in on the new phone, which is apparently the wrong thing to do, because you lose all your old messages and there's no option to import after you log in. You have to log out, but there is no logout; you have to uninstall and reinstall, or clear app data. Next I tried the account migration feature, which requires using both phones simultaneously (hope you didn't erase your old phone yet). But it doesn't work anyway, it's just broken. After failing at that several times, I had to manually make a backup on the old phone, write down a 20-digit numeric code that's only displayed once on screen, transfer the backup file manually, and type the code on the new phone, and then it logged me in but silently failed to import the backup. Had to uninstall again, repeat the exact same steps, and then it finally worked. What a travesty. Security shouldn't come with this kind of UX disaster.
I do think it's too easy to do it the Wrong Way, but I've easily migrated from phone to phone several times now with both techniques and think they're alright.
Signal presents both a "create account" and a "migrate account" option on a fresh install, which seems like the best compromise for UX flow.
Also, considering their guarantee is to keep your messages secure, of course you need your old phone. If you wipe your phone that securely contains all your old messages they are gone. That is security.
That said, if you have the backup of your messages and the code, then the developers might be able to debug what caused the migrations to fail if you submit a bug.
Recently got a new phone and the transition was super smooth. Tried to open Signal on the new phone, it somehow knew that I needed to transfer my backup, asked me to scan a QR code and keep the phones close together, a minute or so later, and done. Signal logged me out on the older device when it was finished.
I've learned this lesson before though. Always keep your old phone around while transitioning to a new device in case you need to pull anything off of it.
I recently moved phones and had the opposite experience. The Signal migration went very smoothly. Moving WhatsApp did not. I ended up managing to do it but not before about five attempts, each of which made me reauth my phone number and I hit rate limiting, etc.
Yeah, this is the #1 issue that holds me back from recommending Signal to everyone in my network. It seems like there are just too many ways to end up in an unrecoverable state. Phone died, lost, or stolen? Can't migrate. Phone number changed? Can't restore backup. And so on. Sure would be nice if there was at least a standalone viewer app which would let you view backups even if you can't restore them.
Sounds like you were migrating from one Android phone to another. Try the same with iOS and you wouldn't even have any backup option to try out. The loss of older messages is the worst thing that a chat app could have, but it's still marketed as a general purpose chat app.
Sounds like you footgunned yourself. When starting Signal for the first time it gives the option to register OR migrate old account. There's no UI in the world that can totally protect users from themselves.
Edit: I’m out of date for most of this. Way to go Signal team! See comments below my complainy post.
Forcing me to recite my PIN every few weeks is one of the most irritating UX patterns I've seen.
I’m sure there’s some smart engineer explanation in terms of cryptography and being unable to recover it if lost. But just let me disable it if I’m okay losing my entire account if I forget it.
Also last time I checked the binding of a phone number to Signal was really bad. I had a friend abandon Signal and I could never sms them ever again.
Your last paragraph provides an insight I hadn't had; that is a serious problem. I personally appreciate the PIN repetition, as it encouraged me to use a code I had not used before, since I knew I'd be reminded to practice it now and then.
Signal's message requests is a major selling point for me.
I recently created a Signal account after having a bad experience in which someone used my phone number to harass me with SMS/calls (every time, from a different VOIP number). It's frustrating that, even in this era, there is no good way to filter out malicious actors from SMS/call.
It's complicated and can be expensive. Also, people that have your real phone number cannot contact you on Signal, so you have to have two accounts: one with your real number to chat with your contacts, and another with the fake one to share with strangers you don't trust. And as far as I know the Signal app doesn't support two accounts (yes, on Android there are ways to have two instances of the app; on iOS you can't).
To me Telegram is superior to Signal because it gives you both options: use your phone number with people that already have you in their contacts, and don't share your number with strangers in a group.
Is it strictly-true that anti-spam algorithms must be hidden to be effective?
It would seem that initially, spammers would have the upper hand, but in the long run, coordinated and open effort against spammers has the potential to pay off.
Furthermore, to have a closed algorithm that cuts off the ability for people to communicate opens the door to institutionalized censorship -- designating a political opponent's communication as spam could limit their ability to be heard.
I've fought spam in other contexts and found that to be true. A change would stop spam until spammers learned what the change was, at which point they'd adapt. The adaptation might not be perfect, but it would happen, so prolonging the blocked period was essential. If you can do something that's effective and that the spammers don't understand for months, that's great. If they can spam for days and then be blocked for longer, and that repeats, they'll go away and attack an easier target instead (well, some of them will look back later).
Let me try an argument other than "it's been my experience", though.
Suppose that it's not true. In that case, the anti-spam filter provides a publicly visible definition of spam in its code, and it's one that isn't grounded in the recipients' opinions. If the filter is to be effective, spammers must not be able to adapt their behaviour in a way that users consider spam but the definition doesn't catch. Many people have tried to provide such a definition since 1995 and AIUI all have failed. I think the persistent failure indicates that it's extremely unlikely that one can be found, and therefore it's extremely unlikely that a publicly visible spam filter can be as effective as one that can be updated in secret.
> Is it strictly-true that anti-spam algorithms must be hidden to be effective?
Can't answer that but there were, for a long time, services that would run your "targeted emails" through spamassassin et al for you and give you advice on how to tweak them to lower the resultant score.
> Is it strictly-true that anti-spam algorithms must be hidden to be effective?
No. But that "no" is missing some qualifiers. It's not strictly true, since you can implement some out-in-the-open options, such as paying money, requiring a phone number, validating your physical identity, or delays on new account creation, that will make the barrier to entry for spamming much higher. There will still be spam, but it'll become more manageable to manually monitor and take action against. The downsides, I feel, are obvious given the service we're talking about.
But I would say it is strictly true that you cannot have a free, anonymous, low-barrier-to-entry service without hiding your anti-spam algorithms. In such an environment there is fundamentally nothing to differentiate a bot from a human besides the message content and patterns in who they message. If bots know what content they can't send, or what messaging patterns to avoid, they can still spam.
It's the same as anti-cheat in a game. Given how easy it is to create a cheat for open source code, it's just way more cost effective to keep it closed.
The rule is not to RELY on security by obscurity. It shouldn't be the last line of defense. But by all means, use it where the ROI makes sense.
It depends how high your bar is for "effective". For example, there's nothing secret about rate limiting and IP bans, and these are fundamental techniques.
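There's nothing about a basic limiter that needs secrecy; a toy token bucket, with all numbers hypothetical:

```python
import time

class TokenBucket:
    """Toy token-bucket rate limiter: allow bursts up to `capacity`,
    then refill at `rate` tokens per second. Nothing here is secret."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of 10 instant requests: the first 5 fit in the bucket's
# capacity, the rest get throttled until tokens refill.
bucket = TokenBucket(rate=1.0, capacity=5)
results = [bucket.allow() for _ in range(10)]
```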
However, there are plenty of opportunities to catch spammers when they make stupid mistakes. I've worked with web bot spam. I could catch a very popular bot brand by observing that it sent an Accept header with content that was implausible for the User-Agent it sent. If the code was public, spammers could see `if (browser == safari && accept != safari-like) { spam }` and fix their header in minutes.
There's also a middle ground where the algorithm is public but it's powered by secret data: blacklists, databases for classifiers, ML models. From an accountability point of view, data being secret is as problematic as secret code.
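A toy version of the header-consistency check described above; the Accept prefixes here are invented for illustration, not real browser values:

```python
# Hypothetical plausibility table: Accept-header prefixes each browser
# family is believed to send. A real table would be far larger and is
# exactly the kind of data one might keep secret.
PLAUSIBLE_ACCEPT = {
    "chrome": ("text/html,application/xhtml+xml,application/xml",),
    # Checked after "chrome" because Chrome's User-Agent also contains "Safari".
    "safari": ("text/html,application/xhtml+xml",),
}

def looks_like_bot(user_agent: str, accept: str) -> bool:
    """Flag a request whose Accept header is implausible for its claimed browser."""
    for family, prefixes in PLAUSIBLE_ACCEPT.items():
        if family in user_agent.lower():
            return not accept.startswith(prefixes)
    return False  # unknown browser family: no opinion either way

# A bot claiming to be Safari but sending a generic Accept header gets flagged.
flagged = looks_like_bot("Mozilla/5.0 (Macintosh) Safari/605.1.15", "*/*")
```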
> To keep Signal a free global communication service without spam, we must depart from our totally-open posture and develop one piece of the server in private: a system for detecting and disrupting spam campaigns.
Interesting. Matrix solves this problem by not associating a username with a phone number or email address (unless you opt in). Does anyone have views on whether this will work into the future?
I've moved ~90 of my friends onto Matrix (along with plenty of group chats), and most of them quite like the Element [1] app (though there are rough edges on iOS, I'm hearing).
Signal was superb for a long time, and then received a hefty chunk of funding, and, although I may be wrong, has declined since then, and in fact jumped the shark about a year ago.
They attempted ever more forcefully to make users set a PIN to protect server-side state; it started with a dialog at the bottom of the screen, obscuring about 20% of the user list, which could not be dismissed, and after a few weeks progressed to a full-page dialog, which could not be dismissed, rendering the app unusable.
All you saw upon starting was the full page dialog demanding you set a PIN to continue using Signal.
I did not want any server-side state, and so did not set a PIN, and stopped using Signal. After a few weeks, the full-page dialog went away, and I found I could use Signal again.
Signal actually blocked usage of the app to force users to adopt unwanted new functionality. It's hard to imagine any app doing well with such mismanagement.
I opened a thread discussing the problem on their support/public discussion forum, which was deleted. I also at first opened a bug report on GitHub, before I understood it was all intentional; this was also deleted.
Since this experience, I've regarded Signal as on the way out, but it's still the best there is right now.
Correct me if I'm wrong, but I believe your comment is misguided.
The PIN is a security option that prevents a SIM-swapping attacker from registering a new device under your phone number unless they know the PIN. You can opt out of it (and it might be opt-in to begin with). You can also easily opt out of PIN reminders. Both of these options are in Settings -> Account.
As for server state - my understanding is that Signal attempts to be zero-knowledge overall, but they definitely store some state on the server. I believe it's encrypted using your private key that's not backed up to the server. Setting the PIN does not change that.
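If it helps, the general shape of a PIN-protected key scheme, "stretch a low-entropy PIN into a key the server never sees," looks roughly like this. Purely illustrative: the parameters are invented, and Signal's real Secure Value Recovery design is considerably more involved (Argon2, secure enclaves):

```python
import hashlib
import os

def derive_master_key(pin: str, salt: bytes) -> bytes:
    # Stretch the low-entropy PIN with a slow KDF so offline guessing is
    # expensive. PBKDF2 stands in here for the memory-hard KDF a real
    # system would use; 200k iterations is an illustrative number.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

salt = os.urandom(16)  # not secret; stored alongside the ciphertext
key = derive_master_key("9153", salt)

# Deterministic: the same PIN and salt always recreate the same 32-byte
# key, so a client can re-derive it on a new device without the server
# ever learning the PIN.
assert key == derive_master_key("9153", salt)
assert len(key) == 32
```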
Server state comment aside, it seems your main complaint is about a pop-up PIN entry UI that can be opted out of? I get that it might seem annoying, but it feels like a fairly weak criticism of a messaging platform, certainly not one that should warrant an impression that Signal is "on the way out"?
These all seem like useful changes, but I've never received spam over Signal. I'm glad that solves a problem for someone.
What I have received were creepy messages from long-lost acquaintances who had been suddenly reminded, by virtue of my having installed the Signal app, that I existed and that I was attending a security-focused event at the time.
And last I heard, Signal was adamantly opposed to removing these notifications. That's a big problem, IMHO.
They usually oppose that because it helps them build engagement. People discovering each other helps increase app usage.
I don't really like it either, but all the other big players do this and it does really help. If Signal doesn't, it'll be even harder to get decent marketshare.
There's this weird situation where fighting something means adopting some of their tactics even if you don't agree with them, otherwise it's not viable to compete. I think Signal is still better than eg WhatsApp but they are pretty similar in some ways because less privacy-sensitive users expect certain features like auto discovery. And without them it'll always remain a fringe thing.
Is there a reason, when they have control of the client software and the protocol, that they don't introduce computational cost to establishing contact?
Like, if I send a message to a particular number, and their client receives a message from me for the first time, it responds with "prove that it's worth it for you to message me: find an `n` character string that together with `random string` hashes to 5 leading zeroes", or something like that.
Somehow I imagine that doing a few seconds of computation per initiated first message is unproblematic, but doing it for thousands or millions of numbers starts to be a problem.

Why is this not done?
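The challenge described above is essentially hashcash; a minimal sketch of both sides, with hypothetical strings and a deliberately low difficulty:

```python
import hashlib
from itertools import count

def solve(challenge: str, difficulty: int = 5) -> str:
    """Find a nonce such that sha256(challenge + nonce) starts with
    `difficulty` hex zeroes. Expected cost grows 16x per extra zero."""
    target = "0" * difficulty
    for n in count():
        nonce = str(n)
        if hashlib.sha256((challenge + nonce).encode()).hexdigest().startswith(target):
            return nonce

def verify(challenge: str, nonce: str, difficulty: int = 5) -> bool:
    """Checking a proof costs a single hash, regardless of difficulty."""
    return hashlib.sha256((challenge + nonce).encode()).hexdigest().startswith("0" * difficulty)

# Difficulty 3 (~4k hashes on average) returns almost instantly; 5 hex
# zeroes is on the order of a million hashes, i.e. seconds on a phone.
nonce = solve("random-server-string", difficulty=3)
assert verify("random-server-string", nonce, difficulty=3)
```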
If the server is mediating that proof-of-work system, then the server is building a list of contacts, and not doing that is the #2 goal for the whole Signal project.
If the client is doing it, you need to come up with a system that is resilient to people using a bunch of different devices, getting new phones, etc.
This is a theme about building Signal: they're doing everything on hard mode, because they start from the premise that they don't get to know everyone's contacts. Virtually every other mainstream messaging system, including the ground-up E2E encrypted ones, keep a complete plaintext contact database serverside.
The issue with these sorts of HashCash schemes is that legitimate users need to PoW their messages on a battery powered phone with a basic CPU and GPU. Spammers can use big, mains-powered GPUs, or better yet, botnets full of stolen CPU-time from mains-powered desktops/servers/etc.
I think with the current climate crisis we should be wary of introducing more unnecessary computation. The internet already uses a substantial amount of the world's energy.
We should try to bring this down, not up. And bitcoin showed how quickly the technology catches up with such requirements by using GPUs, ASICs etc.
This is great! I haven't ever received Signal spam but have been getting say 5 junk SMS and 15 junk calls each week. They must be doing something right.
Please let people allow Signal to automatically save their media to their photo roll (or whatever ios calls it). The number of people I have met who require this and left Signal _after starting to use it_ is astonishing.
This surprises me greatly, as it's my biggest annoyance with WhatsApp and Google Hangouts or whatever they call it now (I've not yet accepted the new app, so it still calls itself hangouts, but clearly is using the new chat app backend...)
I despise that it shows up in my main photos whenever someone sends a photo/gif/vid on these apps. Telegram segregates its local storage to a separate folder that Android doesn't seem to grab for the photo-roll and I want it this way; my photoroll should only be __my__ pictures, not random stuff my friends send me.
I talk to different people on different apps in very different ways; the ways I joke with my best friends is absolutely not appropriate for the ways I talk with the older people in my life, and vice versa. And it's absolutely not about me testing these waters; that's not for an App/OS to decide, that's for me to decide, and mixing these photos into one central location makes it more annoying and precarious for me to accidentally tap and share something I never meant to because the App/OS saved it to a central location that it never should have.
I don't even agree with "give people the option"; photo roll/gallery/whatever are for __my pictures__. Downloads are for just that: Downloads. Apps should stay away from the pre-defined user space and put things in correctly named locations.
I realize I'm "reeeeee-ing" here, but it's honestly a huge pet peeve of mine to see all the photos from the chat apps I use to keep in touch with a small handful of people dump themselves into a user space that isn't theirs.
Telegram does it right, and apparently Signal does too -- I cannot understand why other chat-apps feel the need to behave otherwise; finding said photos is a matter of a single system call, so it's not like discoverability is a challenge for them. The users __will learn__, it's not hard to summon the file manager even with locked down permissions.
I certainly turn it off the moment I set up WhatsApp on any device -- why would I want random memes mixed into my photos?
Do you think that's really the reason they are abandoning it? I think the bigger issue is probably that most of their friends are not on there and they use it less frequently until they abandon it.
But I have issues with the wisdom of such a feature, given that we're entering the age of client-side CSAM scanning.
I’m interested in applying to and working at Signal. But is there a chance I would get put on a list of some sort making my life more difficult by the government?
No one can give you a 100% answer, but I can relay some personal anecdotes which might help.
I have consulted for the Signal Foundation in the past. None of the employees or consultants I worked with indicated that they'd had any problems.
Around the 2010 timeframe, I hosted the RedPhone server backend on some spare colo hardware I owned. RedPhone, along with TextSecure, were the predecessors to Signal Messenger.
I have never had any (signal related) issues with the government.
First impressions? I left Signal years ago after using it for years, but I have one question: do they still actively hide that long-pressing the Send button lets you force-send an SMS instead of a Signal message, or do they show it to the user at first launch somehow?
Not looking forward to this... I use bots and I bet this will make them stop working at some inconvenient time.
I wish I could put up collateral, say stablecoin in a smart contract, where Signal can ban my account and keep the money if I abuse, but otherwise do not break my automation.
It's easy to get seduced by tall-sounding words like "cryptography", "censorship", "security", "anonymity", etc. Those are just buzzwords.
Try something that works seamlessly across platforms.
I've disabled the PIN reminders, as I store my PIN in my password manager.
You can sign up for Signal with a VOIP number to avoid sharing your real phone number. See: https://theintercept.com/2017/09/28/signal-tutorial-second-p...
Really, I don't want to give out a phone number to everyone.
[1]: https://element.io/