top | item 34588862

denom | 3 years ago

Alternatively, they could extract the old compression code and keep using it for repo tags created before the release date of the git algorithm update.

Isn’t that the only humane course given all that depends on this?


ecnahc515 | 3 years ago

That's my thought as well. They could also retroactively generate the source tarballs with the old method for every existing repository/tag on GitHub, store them, and serve those, generating on demand only for new tags. I doubt they'll do that, though they might, given that on-demand generation (rather than generating on push and storing the result) is what led to the problem in the first place.
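The underlying issue is that on-demand generation recompresses the archive each time, and the compressed bytes depend on the compressor's settings, not just the file contents. A minimal sketch of that effect (using Python's stdlib `gzip` with a made-up payload standing in for a `git archive` tar stream; the compression levels are illustrative, not GitHub's actual settings):

```python
import gzip
import hashlib

# Hypothetical file content standing in for a `git archive` tar stream.
payload = b"example source file contents\n" * 1000

# Same input, two compressor settings (mtime pinned to 0 so the gzip
# header itself is deterministic). The decompressed bytes are identical,
# but the compressed streams differ, so their checksums differ too.
a = gzip.compress(payload, compresslevel=1, mtime=0)
b = gzip.compress(payload, compresslevel=9, mtime=0)

assert gzip.decompress(a) == gzip.decompress(b) == payload
print(hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest())  # False
```

Generating once on push and storing the bytes sidesteps this entirely: the checksum is stable because the exact compressed stream is what gets served, not regenerated.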

Kwpolska | 3 years ago

That seems wasteful. Many projects do not actively advertise the GitHub tag downloads and instead publish their own stored, stable tarballs (or other distributions). And I suspect many users of those auto-generated downloads don't care about their checksums.