I guess I can copy over a comment I made when this previously made the rounds:
I have a few problems with this. The short summary of these claims is “APT checks signatures, therefore downloads for APT don’t need to be HTTPS”.
The whole argument relies on the idea that APT is the only client that will ever download content from these hosts. This is however not true. Packages can be manually downloaded from packages.debian.org and they reference the same insecure mirrors. At the very least Debian should make sure that there are a few HTTPS mirrors that they use for the direct download links.
Furthermore Debian also provides ISO downloads over the same HTTP mirrors, which are also not automatically checked. While they can theoretically be checked with PGP signatures it is wishful thinking to assume everyone will do that.
Finally the chapter about CAs and TLS is - sorry - baseless fearmongering. Yeah, there are problems with CAs, but deducing from that that “HTTPS provides little-to-no protection against a targeted attack on your distribution’s mirror network” is, to put it mildly, nonsense. Compromising a CA is not trivial and due to CT it’s almost certain that such an attempt will be uncovered later. The CA ecosystem has improved a lot in recent years, please update your views accordingly.
The other big problem is that people can see what you're downloading. Might not be a big deal but consider:
1. You're in China and you download some VPN software over APT. A seemingly innocuous call to a package server is now a clear violation of Chinese law.
2. Even in the US, it can leak all kinds of information about your work habits, what you're working on, etc.
3. If it's running on a server, it could leak what vulnerable software you have installed, or what versions of various packages you're running, making it easier to exploit known vulnerabilities.
Also, their claim that HTTPS doesn’t hide the hosts that you are visiting may soon no longer be true. Encrypted SNI is being standardized as an extension to TLS 1.3, so HTTPS will actually hide one’s internet browsing habits quite well. The only privacy holes left in the web’s stack are in DNS.
>Packages can be manually downloaded from packages.debian.org and they reference the same insecure mirrors.
That's a reasonable complaint. I think it would make sense for the individual packages to be signed as well (and checked during install). This way you'd get a warning if you install an untrusted package regardless of the source. I'm not sure why it doesn't work that way.
>Furthermore Debian also provides ISO downloads over the same HTTP mirrors, which are also not automatically checked. While they can theoretically be checked with PGP signatures it is wishful thinking to assume everyone will do that.
I do do that. If you care about such an attack vector, why wouldn't you? And if you don't, why should Debian care for you? There are plenty of mirrors for Debian installers; I often get mine from BitTorrent. Trusting a PGP sig makes much more sense than relying on HTTPS for that, IMO.
>Finally the chapter about CAs and TLS is - sorry - baseless fearmongering.
I don't think it's baseless (we have a long list of shady CAs, and I'm sure many government agencies can easily generate forged certificates), but it's rather off-topic. Their main argument is that the trust model of HTTPS doesn't make sense for APT; if that's true, then whether or not HTTPS is potentially hackable is irrelevant.
>Compromising a CA is not trivial and due to CT it’s almost certain that such an attempt will be uncovered later. The CA ecosystem has improved a lot in recent years, please update your views accordingly.
This is simply not true. Governments can simply compel your CA to do as they want. Not to mention that "uncovered later" is pretty damn worthless.
That said, I do agree that there should be HTTPS mirrors.
The justifications for why APT does not use HTTPS (by default; it is possible to add the HTTPS transport) are just mind-blowing. It is, however, not at all surprising considering how broken Debian's secure package distribution methodology is -- I'm saying this as someone who had to implement workarounds for it at a company that was willing to spend a significant amount of resources on making it work.
Here are some low-level gems:
1. I have installed package X. I want to validate that the files that are listed in a manifest for package X have not changed on a host.
APT answer: Handwave! This is not a valid question. If you are asking this question you already lost.
2. I want to have more than one version of a package in a distribution.
APT answer: Handwave! You don't need it. You can just have multiple distributions! It is because of how we sign things - we sign collections!
3. I want to have a complicated policy where some packages are signed, some are not signed, and some are signed with specific keys.
APT answer: Handwave! You should have an all-or-nothing policy! A nothing policy, actually, because we mostly just sign collections rather than individual packages.
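For what it's worth, point 1 has a partial answer in practice: dpkg keeps per-package md5sums manifests under /var/lib/dpkg/info/, and `dpkg --verify` or debsums compare installed files against them. It's MD5-only and the manifests sit on the same trust chain as the package itself, so take this as a sketch of the idea rather than a real integrity guarantee. A minimal version in Python, with file reading injected so the logic is testable:

```python
import hashlib

def parse_md5sums(manifest_text):
    """Parse a dpkg-style .md5sums manifest: '<md5>  <relative path>' per line."""
    entries = {}
    for line in manifest_text.splitlines():
        line = line.strip()
        if not line:
            continue
        digest, _, path = line.partition("  ")
        entries["/" + path] = digest
    return entries

def changed_files(manifest_text, read_bytes):
    """Return installed paths whose current MD5 differs from the manifest.

    read_bytes(path) -> bytes is injected so callers can supply real file
    I/O (lambda p: open(p, "rb").read()) or test fixtures.
    """
    return sorted(
        path
        for path, expected in parse_md5sums(manifest_text).items()
        if hashlib.md5(read_bytes(path)).hexdigest() != expected
    )
```

On a real Debian host you would feed it /var/lib/dpkg/info/&lt;package&gt;.md5sums; debsums exists precisely because APT itself doesn't answer this question.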
The arguments are correct. APT does not need HTTPS to be secure. That said, if APT were designed today I'm sure it would use HTTPS. It's now the default thing to do, and Let's Encrypt makes it free and easy.
However Debian, where APT is from, relies on the goodwill of various universities and companies to host their packages for free. I can see that they don't want to make demands on a service they get for free, when HTTPS isn't even necessary for the use case.
Also since APT and Debian was created in the pre-universal HTTPS days, it does things like map something.debian.org to various mirrors owned by different parties. That makes certificate handling complicated.
TM1: attacker does not possess zero-days for the installed software
TM2: attacker possesses specific (perhaps OS, perhaps library, perhaps userland) zero-days, usage of which (including unsuccessful attempts) should be minimized to avoid detection
in TM1: it's ok to use HTTP in the clear, as long as signatures are verified
in TM2: everything should be fetched over encrypted HTTPS, since HTTP would leak information about available attack surface
EDIT: not only would this increase security by not revealing what a user installs (perhaps download some noise as well such that it becomes harder to detect what a user is installing?), it could also improve security by turning the APT servers into honeypots, so that monitoring these can reveal zerodays...
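The parenthetical idea above (fetching noise alongside the real packages) could be sketched as below. To be clear, this is a hypothetical illustration, not anything APT implements, and the function name is made up:

```python
import random

def decoy_plan(real_packages, catalog, n_decoys=3, rng=random):
    """Mix the packages actually wanted with random decoys from the mirror's
    catalog, so a passive observer sees a superset of what was installed."""
    pool = [p for p in catalog if p not in set(real_packages)]
    decoys = rng.sample(pool, n_decoys)
    plan = list(real_packages) + decoys
    rng.shuffle(plan)  # don't let request ordering give the real packages away
    return plan
```

The sizes of the decoys would matter too, which ties into the transfer-size fingerprinting discussed elsewhere in this thread.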
>I can see that they don't want to make demands on a service they get for free, when HTTPS isn't even necessary for the use case
You could imagine a situation where HTTPS would be optional for APT mirrors. Then the package manager would have a config flag to use any mirror or only HTTPS-enabled mirrors (probably enabled by default). This would allow using HTTPS without making any demands on the organizations that host those mirrors: if they can, they enable it, but it would not be required. The HTTPS-enabled hosts could also provide plain HTTP for backwards compatibility.
The argument isn’t correct: what does a user do when the download is damaged by an injection? A re-download results in exactly the same tampered-with file.
Yeesh, as someone who has more than a few times had to troubleshoot HTTP payloads that were mangled by shitty ISP software injecting ads or notifications, I would love HTTPS as a default to prevent tampering. I get the arguments against it, but I have 100% seen Rogers in Canada fuck up HTTP payloads, regardless of MIME type, while rolling out "helpful" bandwidth notifications. Signatures will tell you "yes, this is corrupt", but end-to-end encryption means that the person in the middle can't even try.
Likewise, integrity of the download is the primary reason I’ve switched downloads to HTTPS too. The argument that signed downloads are enough fails to address what the user is supposed to do after an integrity check has failed: a re-download can result in the same tampered-with file. This isn’t hypothetical, btw; it happens in the real world. I’ve had ISPs in Ireland and South Africa damage downloads with their injections, and users don’t care that it’s their ISP; they get pissed off at you, unfortunately.
I myself use the HTTPS mirror provided by Amazon AWS (https://cdn-aws.deb.debian.org/). I do so because my ISP sometimes forwards me to its login page when I browse HTTP URLs. It also sometimes injects ads (yeah, it's really bad, but it does remind me that I'm being watched).
What a coincidence. Just earlier this week I was installing yarn in a docker container using their official instructions (https://yarnpkg.com/lang/en/docs/install/#debian-stable) and found out I had to install apt-transport-https for it to work.
Since the image was already apt-get install'ing a bunch of other packages at that point and everything seemed to work, the obvious question that popped in my head was: does this mean none of the other packages I've been downloading used https? That's what led me to this website.
If your personal ISP injected into HTTPS, it'd be broken too. So this is purely a complaint about the particular behavior of your ISP in that it serves HTTPS more faithfully than HTTP.
My corporate ISP hijacks HTTPS (MITM with self-signed CA), but not HTTP. Any system that uses any HTTPS security properties will verify certificates and fail on my work's network.
The argument about poorly behaved ISPs for one particular protocol but not the other cuts both ways — there are different kinds of poorly behaved ISP.
It didn't support it until not too long ago, so I had to set up my server to explicitly not redirect to HTTPS for one particular location, because people would need to install apt-transport-https for it.
One reason to prefer HTTPS is that in the event of a vulnerability in the client code, an attacker cannot trigger that vulnerability using a MITM attack if HTTPS is in use. One such vulnerability was recently found in apk-tools: https://nvd.nist.gov/vuln/detail/CVE-2018-1000849
> "Furthermore, even over an encrypted connection it is not difficult to figure out which files you are downloading based on the size of the transfer"
Is it really not difficult? I bet if you sorted all the ".deb" packages on a mirror by size, a lot of them would have a similar or the same size, so you wouldn't be able to tell them apart based on the size of the download alone.
Furthermore, when I update Debian I usually have to download updates for some N number of packages. I don't know if this is now done with a single keep-alive connection, but if it is, then figuring out what combination of data was downloaded gets a lot harder.
Finally, this dismisses out of hand a trivial attack (just sniff the URLs being downloaded with tcpdump) by pointing out that a much harder attack is theoretically possible for a really dedicated attacker.
Now if you use Debian your local admin can see you're downloading Tux Racer, but they're very unlikely to be dedicated enough to figure out from HTTPS transfer sizes what package you retrieved.
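The ambiguity described above is easy to sketch: matching an observed (encrypted) transfer size against the sizes listed in a Packages index often yields several candidates rather than one. The overhead range below is an assumption for illustration, not a measured figure:

```python
def candidates(observed_size, package_sizes, overhead=(350, 900)):
    """Return packages whose .deb size, plus a plausible per-request
    HTTP/TLS overhead, could explain an observed transfer size."""
    lo, hi = overhead
    return sorted(
        name
        for name, size in package_sizes.items()
        if size + lo <= observed_size <= size + hi
    )
```

With two similarly sized packages the observer only narrows things down to a set, which is the commenter's point; keep-alive or pipelined transfers blur it further.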
I'm somewhat surprised that no-one has (yet) linked to this post [1] by Joe Damato of Packagecloud, which digs into attacks against GPG signed APT repos, or the paper which initially published them [2].
The post makes clear in the third paragraph: "The easiest way to prevent the attacks covered below is to always serve your APT repository over TLS; no exceptions."
I've personally experienced this too: using apt in the presence of a captive portal replaces random bits of `/var/cache/apt` with HTML pages, breaking future updates until you manually find and fix the problem yourself.
One important factor this article left out is upgrades. If a given HTTPS implementation is broken because of what are now insecure protocols, insecure ciphers, etc., older systems can't update from a mirror that has moved to a 'secure' HTTPS configuration while they only support the 'vulnerable' one. And if HTTPS is left insecure, then it is not much different from using HTTP.
APT's methodology avoids this: since the current signing and protection mechanisms are file-based, the worst-case scenario is introducing a new file with a new cryptographic signature alongside the old schema, so that a system running the old security mechanism can still update.
In comparison, trying to run multiple HTTPS servers with different configurations for specific versions of the system being updated would be a significant engineering effort, especially for mirrors.
Huh? All you would do is configure the web server running your apt mirror site to serve the same content on both the HTTP and HTTPS ports. If the client wants to use TLS, they connect to HTTPS; if they want plain HTTP, they connect to HTTP. Both sites serve the same content, which is just a series of flat files. AFAIK, the client is responsible for determining the correct versions for the installed distro based on the indices.
Besides the privacy issue of sending package names clear-text, there is a second non-mitigated issue: Censorship.
An MitM could selectively block certain packages from being installed or updated. Imagine using this to prevent Bitcoin from being installed, to enforce a ban on crypto without backdoors, or to block torrent clients.
This doesn't work as well with the 'recognize the package by its size' method, because you need to see the entire transfer before you know the size. Given the need for ACKs in TCP, an MitM can't just buffer data until they have the entire package.
I bet there are many FLOSS advocates who don't read that page as being the result of a cost-benefit analysis. They read it as an inspiring story of the rebels winning one against the https Empire. Because I never see the caveat wrt apt that, "of course, we are a super edgy edge-case that should not be used as a model to rationalize a knee-jerk refusal to use SSL for common cases."
I say this because I've corresponded with such advocates about a completely common case for SSL -- setting up a Let's Encrypt certificate, say. The response I often get doesn't make any sense unless I assume they read a page like this and remembered the feels while forgetting all the relevant details that separate apt from their common case.
Even though the packages are signed cryptographically, there are possible risks when using an unencrypted connection.
A man-in-the-middle attack could work by simply serving you a signed but outdated package list, preventing your distribution from updating and leaving you vulnerable to security holes. It's the same attack an evil mirror could pull off as well.
So if you want to be really sure you should probably use two independent mirrors over an HTTPS connection.
"HTTPS does not provide meaningful privacy for obtaining packages. As an eavesdropper can usually see which hosts you are contacting, if you connect to your distribution's mirror network it would be fairly obvious that you are downloading updates."
It is a dangerous mistake to decide what kind of privacy people need. Privacy should be absolute and without conditions.
What if you live in Iran? Some Ubuntu packages are already inaccessible due to the government's pornography keyword censorship. E.g. I can't download "libjs-hooker" from this http link http://archive.ubuntu.com/ubuntu/pool/universe/n/node-hooker... from Iran. What if the government decides to censor the "tor" package?
Do we now have a custom domain name on a per article basis?
I find it strange to have a site that is just about one thing that is not that important to most people on a custom domain. If there were pages and pages of information then yes this might make sense but there isn't.
Coming soon...
howtotieyourownshoelaces.com
The article-per-domain premise reminds me of 1998, when everyone thought that instead of using search engines people would be typing in URLs, e.g. 'yescupofteaplease.com', so URLs like 'pets.com' were seen as goldmines-to-be.
As an exploit analyst currently focusing on network traffic, can we stop all this fascination with SSL/TLS? TLS is incredible, but let's use the right tool for the job. Contrary to Let's Encrypt's motto, applying TLS to _everything_ can be bad for security.
Let's Encrypt, when are you going to revoke placeimg.com's certificate? The site has been pushing Exploit Kit's malicious payloads since Jan 18 2019 via SSL. Many Flash/IE users are getting infected because most firewalls are unable to peer into SSL tunnels signed by you.
(To be fair, Let's Encrypt is not the only cert authority getting abused (Comodo, yes you))
>To mitigate this problem, APT archives includes a timestamp after which all the files are considered stale
How often is this, practically? If I'm understanding this right, each new timestamp would come only with a package upgrade, meaning the time period is quite a long time indeed, long enough for a replay attack to work. I would argue that there should be a mechanism requiring a signed the-latest-package-is-X message updated at least every day or so.
Edit: it looks like this is actually what's going on. The page wasn't clear, but it is a metadata "Releases" file that is timestamped, not the packages themselves.
The security repository generally serves up a field in its metadata saying that the data shouldn't be trusted for more than 7 days; at least, that was the duration when I encountered it in 2014 as part of my day-job work, and it's safe to assume it hasn't increased since.
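Mechanically, the staleness window is just date fields in the signed Release file. A fail-closed client-side check might look like the sketch below; the spec leaves expiry behaviour unspecified, so this is one possible policy rather than APT's actual logic, and the 7-day fallback is borrowed from the figure mentioned above:

```python
from datetime import datetime, timedelta, timezone

DATE_FMT = "%a, %d %b %Y %H:%M:%S %Z"  # e.g. "Sat, 19 Jan 2019 10:00:00 UTC"

def release_expired(release_text, now=None, max_age=timedelta(days=7)):
    """Treat a Release file as stale once Valid-Until has passed, falling
    back to Date + max_age when Valid-Until is absent (fail closed)."""
    fields = {}
    for line in release_text.splitlines():
        key, sep, value = line.partition(": ")
        if sep:
            fields[key] = value.strip()
    now = now or datetime.now(timezone.utc)
    for field, slack in (("Valid-Until", timedelta(0)), ("Date", max_age)):
        if field in fields:
            stamp = datetime.strptime(fields[field], DATE_FMT)
            return now > stamp.replace(tzinfo=timezone.utc) + slack
    return True  # no timestamp at all: refuse to trust it
```

A client applying something like this would refuse a replayed archive once the window lapses, at the cost of hard failures when a mirror is merely slow to sync.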
NicoJuicy | 7 years ago
There were some topics about it yesterday.
E.g. https://news.ycombinator.com/item?id=18948195
Some arguments of that blog post are also valid here.
jdamato | 7 years ago
APT's use of plain text HTTP (even with GPG) is vulnerable to several attacks outlined in this paper: https://isis.poly.edu/~jcappos/papers/cappos_mirror_ccs_08.p....
Yes, this paper is old, but APT is still vulnerable to most of these attacks. I would advise anyone wanting to use APT to do so only with TLS.
Spooky23 | 7 years ago
When I was a 19-year-old idiot, I was responsible for a mirror server. As a bad actor, I could easily have gotten access to a valid organizational cert.
pbhjpbhj | 7 years ago
In fact, is the FOSS community still running its own trust network? It used to be a thing at LUGs.
https://wiki.debian.org/SecureApt
xcaaa | 7 years ago
Hint: The Netherlands is a world leader in surveillance of its own citizens and inhabitants.
stefan_ | 7 years ago
You cannot get more security by adding a less secure mechanism to a better one. It's not additive.
mtgx | 7 years ago
Or it would use the much more modern and more secure Noise protocol, which I think QUIC will end up using through nQUIC:
https://dl.acm.org/citation.cfm?id=3284854
https://noiseprotocol.org
RidingPegasus | 7 years ago
Storm in a teacup.
[1]: https://blog.packagecloud.io/eng/2018/02/21/attacks-against-...
[2]: https://isis.poly.edu/~jcappos/papers/cappos_mirror_ccs_08.p...
3pt14159 | 7 years ago
One side: We want performance / caching!
Other side: We want security!
Both sides sometimes argue disingenuously. It's true that caching is harder with layered HTTPS and that performance is worse. It's also true that layering encryption is more secure. (It's what the NSA and CIA do. Are you smarter than them?)
Personally, I'd default to security because I'm a software dev. If I were a little kid on a shoddy, expensive third world internet connection I'd probably prefer the opposite.
I just wish it were up to me.
da_chicken | 7 years ago
This is what many mirrors already do:
http://mirrors.lug.mtu.edu/debian/
https://mirrors.lug.mtu.edu/debian/
mgliwka | 7 years ago
>To mitigate this problem, APT archives includes a timestamp after which all the files are considered stale[4].
Let's take a look at the repo spec then:
https://wiki.debian.org/DebianRepository/Format#Date.2C_Vali...
> The Valid-Until field may specify at which time the Release file should be considered expired by the client. Client behaviour on expired Release files is unspecified.
“Should”, “may”, and unspecified behaviour.
gralx | 7 years ago
Scroll to the end for a very simple how-to.