item 43606752

adwf|10 months ago

In the specific case here, 7z is your friend for all zips and compressed files in general, not sure I've ever used unzip on Linux.

Related to that, the Unix philosophy of simple tools that each do one job and do it well also applies here a bit. A more typical workflow would be one utility to tarball something, another to gzip it, and finally another to encrypt it, leading to file extensions like .tar.gz.pgp, all from piping commands together.
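
For concreteness, that pipeline might look something like this (a sketch; the directory name and GPG recipient are placeholders, and gpg assumes you have a key for that recipient):

```shell
# Each stage is one tool: tar bundles, gzip compresses, gpg encrypts
tar -cf - mydir | gzip | gpg --encrypt --recipient alice@example.com > mydir.tar.gz.pgp

# Unpacking reverses each stage of the pipe
gpg --decrypt mydir.tar.gz.pgp | gunzip | tar -xf -
```

Because each tool handles one stage, any stage can be swapped out independently, e.g. xz for gzip, or age for gpg.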

As for versioning, I'm not entirely sure why your Debian and Ubuntu installs both claim version 6.00, but that's not typical. If this is for a personal machine, I might recommend switching to a rolling-release distro like Arch or Manjaro, which at least gives up-to-date packages on a consistent basis, tracking the upstream version. However, this does come with its own set of maintenance issues and an increased expectation of managing it all yourself.

My usual bugbear complaint about Linux (or rather OSS) versioning is that people are far too reluctant to declare v1.00 of their library. This leads to major, useful libraries and programs being embedded in the ecosystem while only ever reaching something like v0.2 or v0.68, and staying that way for years on end, which can be confusing for people just starting out in the Linux world. They are usually very stable and almost feature-complete, but because they aren't finished to perfection according to the original design, people hold off on that final v1 declaration.

Squossifrage|10 months ago

Info-ZIP UnZip 6.00 was released in 2009 and has not been updated since. Most Linux distros (and Apple) just ship that 15-plus-year-old code with their own patches on top to fix bugs and improve compatibility with still-maintained but non-free (or less-free) competing implementations. Unfortunately, while the Info-ZIP license is pretty liberal when it comes to redistribution and patching, it makes it hard to fork the project; furthermore, anyone who wanted to do so would face the difficult decision of either dropping or trying to continue to support dozens of legacy platforms. Therefore, nobody has stepped up to take charge and unify the many wildly disparate mini-forks.

DonHopkins|10 months ago

The "Unix Philosophy" is a bankrupt, romanticized, after-the-fact rationalization to make up excuses and justifications for ridiculous ancient vestigial historical baggage, like the lack of shared libraries and decent scripting languages, where you had to shell out THREE heavyweight processes -- "[" and "expr" and a sub-shell -- with an inexplicable flurry of punctuation, `[ "$(expr 1 + 1)" -eq 2 ]`, just to test whether 1 + 1 = 2, even though the processor has single-cycle instructions to add two numbers and test for equality.

chubot|10 months ago

??? This complaint seems more than 20 years too late

Arithmetic is built into POSIX shell, and it's universally implemented. The following works in basically every shell, and starts 0 new processes, not 2:

    $ bash -c '[ $((1 + 1)) = 2 ]; echo $?'
    0
    $ zsh -c '[ $((1 + 1)) = 2 ]; echo $?'
    0
    $ busybox ash -c '[ $((1 + 1)) = 2 ]; echo $?'
    0
YSH (part of https://oils.pub/ ) has a more familiar C- or JavaScript-like syntax:

    $ ysh -c 'if (1 + 1 === 2) { echo hi }'
    hi
It also has structured data types like Python or JS:

    $ echo '{"foo": 42}' > test.json
    $ ysh
    ysh-0.28$ json read < test.json
    ysh-0.28$ echo "next = $[_reply.foo + 1]"
    next = 43
and floats, etc.

    $ echo "q = $[_reply.foo / 5]"
    q = 8.4
https://oils.pub/release/latest/doc/ysh-tour.html (It's probably more useful for scripting now, but it's also an interactive shell)

verandaguy|10 months ago

    > TWO heavyweight processes
If you're going to emphasize that it's two processes, at least make sure it's actually two processes. `[` is a shell builtin.
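
This is easy to check (a sketch, assuming bash; dash, zsh, and busybox ash also build `[` in):

```shell
# Ask the shell what '[' resolves to; bash reports it as a builtin,
# not the external /usr/bin/[ binary
bash -c 'type ['
# prints: [ is a shell builtin
```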

    > `eval` being heavy
If you want a more lightweight option, `calc` is available and generally better-suited.

    > inexplicable flurry of punctuation
It's very explicable; in fact, it's exceptionally well-documented. Shell scripting isn't syntactically easy, which is an artifact of its age plus standardization. The Bourne shell dates back to 1979, and POSIX has made backwards compatibility a priority between editions.

In this case:

- `[` and `]` delimit a test expression

- `"..."` ensures that the result of an expression is always treated as a single-token string rather than being split into multiple tokens on spaces, which is the default behaviour (and an artifact of sh and bash's basic type system)

- `$(...)` denotes that the command between the parens is run in a subshell, with the expression replaced by its output

- `-eq` is used for numerical comparison, since POSIX shells default to string comparison with the normal `=` equals sign (which is, again, a limitation of the type system and a practical compromise)
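
Putting those pieces together, here are the old `expr` form and the builtin-only POSIX form side by side (a sketch; `expr` still spawns an external process, while the `$((...))` arithmetic expansion is evaluated by the shell itself):

```shell
# Old form: the command substitution runs expr in a child process
[ "$(expr 1 + 1)" -eq 2 ] && echo "expr: equal"

# POSIX form: $((...)) is arithmetic expansion, handled in-shell
[ $((1 + 1)) -eq 2 ] && echo "builtin: equal"
```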

    > even though the processor has single cycle instructions to add two numbers and test for equality
I don't really understand what this argument is trying to prove; shell scripting languages are, for practical reasons, usually interpreted, and in the POSIX case they don't have to be fast, since they're mostly used to delegate work to other programs for performance. Their main priority is ease of interop with their domain.

If I wanted to test if one plus one equals two at a multi-terabit-per-second bandwidth I'd write a C program for it that forces AVX512 use via inline assembly, but at that point I think I'd have lost the plot a bit.

whatnow37373|10 months ago

Shell != Unix (philosophy), as I’m sure you are aware. The Unix philosophy is having a shell and being able to replace it, not its particular idiosyncrasies at any moment in time.

This is like bashing Windows for the look of its buttons.

eesmith|10 months ago

I realized the hype for the Unix Philosophy was overblown around 1993 when I learned Perl and almost immediately stopped using a dozen different command-line tools.

a-french-anon|10 months ago

I don't see what crusty implementation details have to do with a philosophy. In fact, UNIX itself is a poor implementation of the "UNIX" philosophy, which is why Plan 9 exists.

The idea of small composable tools doing one thing and doing it well may have been mostly an ideal (and now pretty niche), but I don't think it was purely invented after the fact. Just crippled by the "worse is better".

pjmlp|10 months ago

The "Unix Philosophy" is some cargo cult among FOSS folks who never used commercial UNIX systems; since Xenix, I haven't used one that doesn't have endless options in its man pages.

setopt|10 months ago

> Related to that, the Unix philosophy of simple tools that do one job and do it well, also applies here a bit. More typical workflow would be a utility to tarball something, then another utility to gzip it, then finally another to encrypt it. Leading to file extensions like .tar.gz.pgp, all from piping commands together.

I do this for my own files, but half of the time I zip something, it’s to send it to a Windows user, in which case zip is king.

issafram|10 months ago

FYI, the latest version of Windows 11 supports natively opening 7-Zip files.

aragilar|10 months ago

The issue in this case is that upstream is dead, so there are random patches floating around. The same thing happened to screen for a bit.

tecleandor|10 months ago

Was there some problem with 7z a few years ago? I feel like I've been actively avoiding it because I have the feeling I read something bad about it, but I can't remember what. I could also have mixed it up with something else; that sometimes happens to me.

oblio|10 months ago

Hard to say for sure. Did SourceForge put malware in their installers many millennia ago?

pxc|10 months ago

I came here to make the same recommendation. Just use p7zip for everything; no need to learn a bunch of different compression tools.
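
A sketch of that single-interface workflow, assuming the p7zip package provides the `7z` binary (paths and names are illustrative):

```shell
# Pack a directory into a .7z archive
7z a backup.7z mydir/

# Extract with the same verb regardless of format; 7z autodetects
# the container on extraction, zip included
7z x backup.7z
7z x somefile.zip
```

That autodetection is why one tool can cover zip, 7z, tar, gzip, and friends.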

setopt|10 months ago

If you use `atool`, there is no need to use different tools either – it wraps all the different compression tools behind a single interface (`apack`, `aunpack`, `als`) and chooses the right one based on file extensions.
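
For illustration, the atool interface looks like this (a sketch, assuming atool is installed; it dispatches to tar, gzip, unzip, 7z, etc. based on the extension):

```shell
apack backup.tar.gz mydir/   # picks tar+gzip from the .tar.gz extension
als backup.tar.gz            # lists contents via the right backend
aunpack backup.tar.gz        # extracts, guarding against tarbombs
```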