Zx 3.0 (item 28195580)

189 points | medv | 4 years ago | github.com | 163 comments
[+] another-dave|4 years ago|reply
First impression is that I wouldn't go near it until it settles down a bit — it was on v1.7 in May and now on version 3 already.

Great that it's under active development, but changes to my scripts are infrequent and slow. I wouldn't want to be touching an old script on a server somewhere saying "How do I do X in ZX again?" and everything I find online is now for v15 while I'm still on v3.

Maybe if it had plans for an LTS version I'd take a look at that stage.

To be honest though, I like doing things in bash. You can get pretty far and it's 100% portable. If something's too complicated for bash, it's probably no longer just a simple CLI script, in my experience.

[+] Stampo00|4 years ago|reply
If you wrote your script against v1.7 and they're now on v3, you can still run your script. The v1.7 source and binary haven't disappeared; they're still available and work exactly the same as when you wrote your script.
[+] SevenSigs|4 years ago|reply
> First impression is that I wouldn't go near it until it settles down a bit — it was on v1.7 in May and now on version 3 already.

Would you feel better if they called version three v1.9 instead?

[+] tgsovlerkhgsel|4 years ago|reply
As annoying as having to write "await" in front of everything is (probably should have a variant of $ that's synchronous), the ability to effortlessly implement parallelism is something that all other common scripting (and many "full") languages seem to lack.

Even Go, which prides itself on its good support for parallelism, falls short once you actually want to get results back.

If this were paired with some way to also display the status/output of multiple commands in parallel, this would be the ultimate toolkit to write scripts.

If you look at apt, for example: Does it really have to wait for the last package to be downloaded before installing the first? Is there a good reason, on modern SSD-based computers, to not parallelize the installation of multiple packages that don't depend on each other?

[+] q3k|4 years ago|reply
> Even Go, that prides itself on the good support for parallelism, fails once you actually want go get results back.

It's doable if you're willing to build a very thin abstraction over whatever you're trying to parallelize, keeping in mind all possible edge cases [1] and how you'd like to handle them.

A generic `run N things in parallel and collect their results` function would be nice, but would probably end up being quite unwieldy in comparison to just writing out a purpose-specific function for what you exactly need.

[1] - Error handling, timeouts/cancellation, maximum number of processes running in parallel, wait-for-full-join vs. returning results as they appear, introspection of currently running subordinates, subordinate output streaming/buffering/logging, ... A lot of things to consider when you're trying to build something universal, but easy to solve or ignore when you're building something purpose-specific with known constraints.
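For comparison, a generic version is easy to sketch in plain JavaScript. This hypothetical parallelMap helper handles exactly one edge case from the list (a cap on how many tasks run at once) and ignores the rest, which illustrates the comment's point about purpose-specific helpers:

```javascript
// Run `worker` over `items` with at most `limit` tasks in flight, returning
// results in input order (wait-for-full-join behavior). Plain Node, no deps.
async function parallelMap(items, worker, limit = 4) {
  const results = new Array(items.length);
  let next = 0;
  // Each lane repeatedly claims the next unprocessed index. Claiming is safe
  // without locks because JS only yields at the await.
  const lane = async () => {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  };
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, lane));
  return results;
}

// Example: at most two "tasks" in flight at a time.
parallelMap([1, 2, 3, 4], async (n) => n * 2, 2).then((r) => console.log(r));
```

Note that an error in any worker rejects the whole Promise.all (fail-fast), which is exactly the kind of policy decision the footnote says a universal version would have to expose.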

[+] otabdeveloper4|4 years ago|reply
Awaiting on every line is hardly parallelism.
[+] rurban|4 years ago|reply
using sh is usually preferred:

   somecmd & someothercmd & wait
Runs the two commands in parallel and waits for both; no need for await. To run them serially, replace & with && and skip the wait.
[+] reddit_clone|4 years ago|reply
Give Perl6/Raku a shot if you haven't yet.

IMO, it is pretty much the best-of-all-worlds language that is still accessible. (Meaning not exotic like Haskell or Erlang..)

The concurrency is beautifully painless.

[+] Siira|4 years ago|reply
Take a look at GNU Parallel. It’s magnificent.
[+] otabdeveloper4|4 years ago|reply
> await await await await

Oh yeah, right, because the first thing that comes to mind when writing bash scripts is "I sure wish there was more 'await' noise in all this code!"

Anyways: 'async' in the Python and Javascript sense was a mistake, future generations will think we were insane to adopt it.

https://journal.stuffwithstuff.com/2015/02/01/what-color-is-...

[+] eyelidlessness|4 years ago|reply
Given Promises and callbacks have always been pervasive in JS—that it’s always been heavily asynchronous—I can’t imagine thinking async/await was a mistake. It’s a drastic ergonomic improvement over both, and it also drastically improves debugging and error stacks.

Adding it to a language where IO is typically blocking with other ergonomic concurrency mechanisms, that I can imagine being somewhat controversial.

It would be good if async weren’t infectious, but as a sibling comment said that was already the case with Promises and callbacks.

It would maybe also be better if concurrency was the default behavior and keeping a reference to a promise was the explicit syntax, but I could see downsides to that. It potentially hides/encourages excessive use of expensive calls. And implicit concurrency is a potential source of many bugs.

[+] Vinnl|4 years ago|reply
I haven't looked into Zx yet, but at least for JavaScript that "what color is your function" article also applied before they added the syntactic sugar. If your function was async via a direct callback then the calling function would have to accept a callback too.
[+] laurent123456|4 years ago|reply
Those are small examples but for more complex scripts I expect you'd mostly have synchronous code, for example to process strings, validate input, etc. and the "await" will become the exception rather than the norm.
[+] cfeliped|4 years ago|reply
In my first job I tried using Python for scripts.

It was a terrible idea, as I now had 3 problems instead of 1:

1 - Writing the scripts, plus:

2 - Installing Python on every host that needed to run those scripts

3 - Keeping Python and the necessary dependencies up to date on each host / container.

(2) and (3) are trivial when done just once on your own computer, but end up being a huge time sink for larger heterogeneous environments with (possibly) network and security barriers. Moreover, if you are using containers, installing Python will make images unnecessarily large.

Granted, I'm not a JS person, but I imagine one will have the same issues.

Even though bash has its shortcomings, I really appreciate how low maintenance it is as long as you keep scripts small and sane.

[+] qalmakka|4 years ago|reply
That's why Perl is my go-to choice for glue scripts. It's almost always installed by default, it has stayed consistent for decades without any breaking changes, its regex support is by far the best around, and it's almost as good as the shell at gluing commands together.
[+] andrewl-hn|4 years ago|reply
JS is marginally better than Python for this, because tools for runtime and package version management are better for Node than for Python (proper lockfiles, local package folders, npm bundled with Node for installation, etc.).

Still, zx isn't as nice as Ruby, which has backticks for shelling out as part of the language. So you don't need a separate package installed, just the Ruby runtime.

But Perl is even better! Just like Ruby, it comes with backtick shell execution built-in, but unlike Ruby, Perl is ubiquitous and pretty strict about backward compatibility. So your scripts will just run on all the machines you have, no matter what distro you use or how old the machine is (within reason).

[+] gravypod|4 years ago|reply
Luckily, today you can sidestep many of these issues. With some open-source tooling you can bundle Python plus your modules into a single pex file, and get a flat file you just need to copy to your hosts.
[+] sumtechguy|4 years ago|reply
Bash just sort of happened to be the lowest common denominator in most Unix-style environments, because it ended up being the default. Each distro seems to be kind of unique, and on top of that they're very easy to customize. So on one distro you can assume a particular Python version, while on the next distro Python is totally optional. Mix in containers and Docker images and you have a new level of "what may or may not be there". You may say "oh, just use tech XYZ", but that may have exactly the same issue as Python. So either you "deal with it": install what is needed and keep it up to date. Or you "play the LCD game": use only the bare minimum and hope you can build something useful enough in bash, even when another language would be more appropriate.
[+] kjgkjhfkjf|4 years ago|reply
I'm having a hard time believing that Google uses this much internally. Is github.com/google just a place where Googlers can put personal projects to get additional exposure?
[+] Dobbs|4 years ago|reply
I was under the impression that `github.com/google` was largely projects being worked on by googlers where the code is owned by google, but the code isn't always being used by google. Say 20% time work, side projects built using google resources, etc.
[+] tym0|4 years ago|reply
The copyright of things Googlers build on their own time is usually owned by Google, so it ends up in the Google org if they want it on GitHub. They usually have a disclaimer like "this is not an official Google product..."
[+] tyingq|4 years ago|reply
I suppose it's easy to poke fun at combining the "best" of shell scripting and JS. But the examples of using await with child processes and pipes are pretty nice, and less verbose than the Perl or Python equivalents. It does seem well designed for anything where you're orchestrating parallel runs of commands.
[+] satyanash|4 years ago|reply
I personally find Ruby's built-in shell features more ergonomic:

    `cat package.json | grep name`

    branch = `git branch --show-current`
    `dep deploy --branch=#{branch}` if $?.success?

    [
      "sleep 1; echo 1",
      "sleep 2; echo 2",
      "sleep 3; echo 3",
    ].map { |cmd| Thread.new { %x(#{cmd}) } }.each(&:join)

    name = 'foo bar'
    `mkdir /tmp/#{name}`
[+] jitl|4 years ago|reply
What I don’t like about backticks in Ruby is that they “ignore” errors in commands you run. It’s up to the program author to remember to check $? for the last executed command’s exit status. And guess how many times the average Ruby script using this feature implements error handling? Usually it’s totally forgotten.

To be safe, abstractions that make it easy to shell out must also:

- escape all variable interpolations by default using “shellwords” or similar.

- throw exceptions on error by default.

Backticks in Ruby are very easy to use, but aren’t safe.

[+] chubot|4 years ago|reply
How do you do shell escaping of #{name} if it comes from user input?
[+] eurekin|4 years ago|reply
Does this carry over to jruby as well?
[+] antifa|4 years ago|reply
Crystal does this, too.
[+] azkae|4 years ago|reply
Somewhat related, if you are looking to do scripts in Python you should take a look at `sh`:

  import sh
  print(sh.ifconfig("eth0"))

https://amoffat.github.io/sh/
[+] jeswin|4 years ago|reply
Shameless plug:

If you want the exact opposite of this (that is, to use JS from the shell), try https://bashojs.org

For example:

  curl 'https://api.openweathermap.org/data/2.5/weather?q=Bangalore&appid=2a125de83c277f4ce0ace5ed482f22b9' | basho --json -j 'x.weather[0].description + ", feels like: " + Math.round(10 * (x.main.feels_like - 273.15))/10 + "°C"'
[+] miohtama|4 years ago|reply
Zx is good if JavaScript is the only language you know. If you're able to learn other programming languages, Python with packages like Shell and Plumbum might be a better choice for shell-scripting tasks that outgrow shell.

- Python code with Shell is easier to read and write

- No need to carry async/await keywords through the code

- Powerful plain-text manipulation in the stdlib

https://pypi.org/project/python-shell/

https://plumbum.readthedocs.io/en/latest/

[+] dogma1138|4 years ago|reply
Powershell is also available for Linux these days, and, well, Powershell is awesome.

Not sure how I feel about needing Node.js to run shell scripts…

[+] SahAssar|4 years ago|reply
> Python code with Shell is easier to read and write

That is pretty subjective

[+] alexghr|4 years ago|reply
Maybe, but the main advantage of `zx` (at least for me) is that I don't have to manage another language environment. I can keep everything contained within Node :)
[+] IshKebab|4 years ago|reply
Maybe but Python is sloooow and its type annotation situation is pretty poor.

I would recommend using Deno. It has the following advantages:

* Uses Typescript which gives you a great static typing system.

* Runs via V8 so it's about a million times faster than Python.

* Single statically linked binary with no project setup files required (package.json / requirements.txt) makes it about a million times easier to deploy than either Node or Python. Especially on Windows.

* You can still use third party libraries even without a project file and you get IDE integration.

It's the clear winner at this point. Someone has even made a version of zx for it:

https://deno.land/x/[email protected]

[+] gizdan|4 years ago|reply
Wow, Plumbum, despite its unfortunate name, looks awesome. I will definitely try it next time I write a script.
[+] hnarn|4 years ago|reply
> Bash is great, but when it comes to writing scripts, people usually choose a more convenient programming language. JavaScript is a perfect choice

um… what?

[+] mikevm|4 years ago|reply
Ugh, typing `await` for every command is terrible. JavaScript's async-by-default model is clearly not a good fit for this use case.
[+] TeMPOraL|4 years ago|reply
What does "DX" stand for in "tons of DX improvements"?
[+] ggambetta|4 years ago|reply
I thought this was going to be somehow related to the ZX Spectrum :(
[+] ilaksh|4 years ago|reply
I feel like the await keyword everywhere is a bit verbose. One might use something like the gpp preprocessor (hopefully the macros could be defined in a separate file):

        #!/usr/bin/env zx

        #define _ await $
        #define __ await Promise.all

        #define _1 process.argv[1]
        #define _2 process.argv[2]
        #define _3 process.argv[3]

        _`cat package.json | grep name`

        let branch = _`git branch --show-current`
        _`dep deploy --branch=${branch}`

        __([
          $`sleep 1; echo 1`,
          $`sleep 2; echo 2`,
          $`sleep 3; echo 3`,
        ])

        let name = 'foo bar'
        _`mkdir /tmp/${_1}`
[+] jpxw|4 years ago|reply
After reading about various vulnerabilities which are the direct result of backwards JS language features - most recently prototype pollution - I’d be hesitant to use JS for writing scripts.

Maybe I’m just being paranoid.

[+] shell0x|4 years ago|reply
This looks horrible to me. Mixing bash commands into JavaScript looks absolutely terrible, and running Node itself for scripting is a total mess (node_modules, versions).

I usually go with bash + set -euxo pipefail and if it becomes too complicated I switch to Python.

Go/Rust are the perfect tools when distributing tools as a single binary. Where in the world does JS fit in?

[+] lixtra|4 years ago|reply

   await $`dep deploy --branch=${branch}`
   …
   await $`mkdir /tmp/${name}`
Code like this [1] looks prone to shell injection.

[1] https://github.com/google/zx