I've been a die-hard Linux user for about a dozen years. Recently I had to do some development with MS Powershell. I was very reluctant at first, but after getting familiar with the technology, I almost fell in love.
"Cmdlets", basically commands used in Powershell, output "objects" instead of the streams of text used in a more classical shell. Powershell has built-in tools to work with these objects. For example, you can take the output from one Cmdlet, pipe it through `SELECT` with a list of fields specified, and get a stream of objects only containing those fields. Other operations can be performed against those objects as well, such as filtering and whatnot.
Back to normal nix commands, we're starting to see more and more commands introduce direct JSON support [1]. There are even tools to translate output from common commands into JSON [2]. We'll probably see `jq` shipped directly with modern distros soon. Eventually we'll reach a tipping point where it's expected that commands support JSON output. Tools like `awk`/`sed` might get updated with richer support for JSON. Finally, we'll have ubiquitous Powershell-like capabilities on every nix machine.
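What translator tools like jc [2] do can be sketched in a few lines of Python: take columnar command output and emit JSON (the input here is a made-up two-column listing, not real `ps` output, and real converters handle far messier formats):

```python
import json

# Fake columnar output, as a classic command might print it.
text = "PID CMD\n120 nginx\n981 sshd"

lines = text.splitlines()
headers = lines[0].split()
# Zip each data row against the header row to get one object per line.
rows = [dict(zip(headers, line.split())) for line in lines[1:]]

print(json.dumps(rows))
```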
Powershell _is_ available on Linux. The model of piping objects instead of JSON is both powerful and more efficient. (For example, there are no redundant keys as in a stream of JSON objects, so fewer bytes are moved, much like how CSV headers aren't repeated with every row. Plus, binary data is smaller than text.) But, most developers are hesitant to switch out their shell and existing workflows for a completely new tool, which is why Powershell will likely only be adopted by a small subset of sysadmins.
Though it's pretty immature, nushell has a similar idea, with its own internal data model being streams of structured, typed data: https://www.nushell.sh/
> But, most developers are hesitant to switch out their shell and existing workflows for a completely new tool, which is why Powershell will likely only be adopted by a small subset of sysadmins.
What I would like to see is some sort of stddata stream be offered by the kernel itself so devs won't have to switch their shells and object manipulation can be a standard.
Not only that, but pwsh support for objects doesn't stop at passing objects around and mapping properties to parameters. There are a number of mechanisms in place. All nix variants solve just one of those mechanisms.
IMO, Powershell should be added to ALL mainstream distros as a first-class citizen. There is no downside to that, given that MS is now a legit FOSS player and that anybody can fork it in case something goes wrong along the way...
Very neat project! I think visibility into intermediate stages of a pipeline can be enormously useful for data restructuring tasks, where the individual steps aren't necessarily "difficult" to understand, but it can be hard to keep track of what's going on without a good feedback loop.
That is very cool! For other readers, "process" means submitting jq-like document queries, and "various formats" means other JSON or JSON-like representations, such as BSON, Bencode[1], TOML, XML, and YAML. Thank you for sharing!
[1]: Bencode dates to 2001, prior to the huge popularity of JSON; it is an ASCII-coded dict+string+int+list format used in BitTorrent .torrent files.
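Bencode is simple enough that a toy decoder fits in a screenful of Python (a sketch only: real .torrent files are byte strings rather than ASCII `str`, and this skips all error handling):

```python
def bdecode(data, i=0):
    """Decode one Bencode value from position i; returns (value, next_index)."""
    c = data[i]
    if c == "i":                       # integer: i<digits>e
        end = data.index("e", i)
        return int(data[i + 1:end]), end + 1
    if c == "l":                       # list: l<items>e
        i += 1
        items = []
        while data[i] != "e":
            value, i = bdecode(data, i)
            items.append(value)
        return items, i + 1
    if c == "d":                       # dict: d<key><value>...e
        i += 1
        result = {}
        while data[i] != "e":
            key, i = bdecode(data, i)
            value, i = bdecode(data, i)
            result[key] = value
        return result, i + 1
    colon = data.index(":", i)         # string: <length>:<text>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start + length], start + length

value, _ = bdecode("d3:agei42e4:spaml1:a1:bee")
print(value)  # {'age': 42, 'spam': ['a', 'b']}
```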
Is there something similar for YAML? I've tried `yq` briefly, but weirdly enough it doesn't seem to accept standard input in the way that jq does (i.e. pipe in some JSON, and output some pretty JSON).
It shouldn't be too difficult to convert between YAML and JSON; funny that I couldn't easily find a lightweight converter. I think I will try to write one.
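For the trivially flat `key: value` subset, such a converter is a few lines of Python (a toy sketch: anything with nesting, lists, or quoting needs a real YAML parser such as PyYAML):

```python
import json

def flat_yaml_to_json(text):
    """Convert a flat "key: value" document to JSON (toy subset only)."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # skip blanks and comments
            continue
        key, _, value = line.partition(":")
        value = value.strip()
        if value.isdigit():                    # bare integers only
            value = int(value)
        result[key.strip()] = value
    return json.dumps(result)

print(flat_yaml_to_json("name: jq\nstars: 20000"))  # {"name": "jq", "stars": 20000}
```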
I don't know what the term would be (mental model?), but I just can't get jq to click, mostly because I only need it every once in a while. It's frustrating for me because it seems quite powerful.
I like jq for simple stuff which is what I mostly use it for. Whenever I have to dive into the documentation to do something complicated I die a bit on the inside.
I like how you have output for each step of the pipeline; however, it would be much better in terms of usability if you could dynamically load the result just below the query rather than opening a new page.
I would like to have a kind of "rosetta stone" where each of these examples is rewritten by passing the JSON to "gron" and then using the standard Unix tools.
I guess some of the examples would be simpler than the jq solution.
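For readers who haven't seen gron: it flattens JSON into line-oriented assignments that grep/sed/awk can chew on. Here is a simplified Python imitation of its output format (not gron itself; the sample data is made up):

```python
import json

def gron(value, path="json"):
    """Emit gron-style assignment lines for a parsed JSON value."""
    lines = []
    if isinstance(value, dict):
        lines.append(f"{path} = {{}};")
        for key, child in value.items():
            lines += gron(child, f"{path}.{key}")
    elif isinstance(value, list):
        lines.append(f"{path} = [];")
        for index, child in enumerate(value):
            lines += gron(child, f"{path}[{index}]")
    else:
        lines.append(f"{path} = {json.dumps(value)};")
    return lines

flat = gron({"user": {"name": "ann", "tags": ["a", "b"]}})
print("\n".join(flat))
```

Once the data is in this shape, a plain `grep name` or an awk one-liner does the querying, which is exactly the rosetta-stone appeal.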
God forbid that someone should ever write (- y + x) rather than (x - y), what a useless use of the plus sign!
Why have a problem with this? Catting a single file is a well-known idiom for outputting its contents into a stream, plays well with positioning in pipelines, and has the nice property that you can erase as much of the pipeline as you want, to be able to peek inside it at any point.
I was going to suggest that shells should allow specifying the input redirect before the command, but it turns out that "<input command" in bash already works. Does anyone know about other shells?
This is so much easier with Powershell. V7 is totally cross-platform, and I can't see why people have a problem using it, if nothing else then for the `ConvertFrom/To-Json/CSV/Whatever` cmdlets...
As someone fairly ignorant of Powershell, how well does it interact with the rest of the Unix-minded ecosystem? jq is nice because it's plainly compatible with being piped into another command like xargs. I don't want a better jq if it means relearning and remastering everything else along with it.
tlhunter | 6 years ago
[1] https://daniel.haxx.se/blog/2020/03/17/curl-write-out-json/
[2] https://github.com/kellyjonbrazil/jc
Freaky | 6 years ago
And back to nix commands, libxo is used by a chunk of the FreeBSD base tools to offer output in JSON, amongst other things: https://github.com/Juniper/libxo
Be nice to see more tools converted.
ramblerman | 6 years ago
Every time I open it up I just nope back out, and most of the docs I peeked at didn't draw me in.
gklitt | 6 years ago
Here's a demo of a prototype live programming environment I made for jq, which similarly shows step-by-step views of the data but also gives live feedback as you construct your pipeline:
https://twitter.com/geoffreylitt/status/1161033775872118789
By the end of that Twitter thread I ultimately morphed it into a tool for building interactive GUIs (e.g., get API data in JSON, use jq to morph it into the right shape for your UI, output to an HTML template).
dang | 6 years ago
I'm sure it's fine reading material, but if we allowed Show HN to be reading material, every submission would be a Show HN.
jzelinskie | 6 years ago
https://github.com/jzelinskie/faq
terenceng2010 | 6 years ago

Edit: the article does not have a <p> tag, so reader view is not triggered.
kylepdavis | 6 years ago

I liked jq, but for simple tasks I liked json, a similar npm package, a little bit better.
You can find more about it here: https://github.com/trentm/json
As a JS dev I tend to have node installed anyhow so I just use a shell alias to wrap ‘node -pe’ these days. It’s not really for shell scripts but it’s great for quick every day usage. Plus you can use JS if needed instead of their DSL.
Here is the code for the alias in my shell profile: https://github.com/KylePDavis/dotfiles/blob/master/.profile#...
oftenwrong | 6 years ago
https://github.com/tomnomnom/gron
Some options of gron I use often:
--stream, which treats the input as "JSON lines" format
--ungron, which converts from the flat format back to JSON
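As a sketch of what the --ungron direction has to do, here is a simplified Python rebuild of nested data from gron-style lines (this is not gron's actual implementation; it assumes parent containers appear before their children and that list items arrive in index order, which gron's output guarantees):

```python
import json
import re

def ungron(lines):
    """Rebuild a nested value from gron-style assignment lines (simplified)."""
    root = None
    for line in lines:
        path, _, raw = line.rstrip(";").partition(" = ")
        value = json.loads(raw)
        # Turn "json.user.tags[0]" into ["user", "tags", 0].
        tokens = [int(idx) if key == "" else key
                  for key, idx in re.findall(r"\.([A-Za-z_]\w*)|\[(\d+)\]", path)]
        if not tokens:              # the bare "json = ..." root line
            root = value
            continue
        current = root
        for token in tokens[:-1]:
            current = current[token]
        if isinstance(current, list):
            current.append(value)   # relies on in-order list indexes
        else:
            current[tokens[-1]] = value
    return root

nested = ungron([
    'json = {};',
    'json.user = {};',
    'json.user.name = "ann";',
    'json.user.tags = [];',
    'json.user.tags[0] = "a";',
])
print(nested)  # {'user': {'name': 'ann', 'tags': ['a']}}
```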
zackmorris | 6 years ago
https://github.com/soheilpro/catj/issues/7
moondev | 6 years ago
It can do everything with json as well and convert between
jolmg | 6 years ago
https://news.ycombinator.com/item?id=22626664
wonginator1221 | 6 years ago
One suggestion is to use with_entries as a replacement for the 'to_entries | map(...) | from_entries' pattern. For example, with an illustrative filter, `with_entries(.value += 1)` is equivalent to `to_entries | map(.value += 1) | from_entries`.
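For intuition, the entries round-trip corresponds to this plain-Python pattern (the `+= 1` filter and the sample object are just illustrative):

```python
obj = {"a": 1, "b": 2}

# to_entries: turn the object into a list of {key, value} records.
entries = [{"key": k, "value": v} for k, v in obj.items()]
# map(.value += 1): transform each entry.
mapped = [{"key": e["key"], "value": e["value"] + 1} for e in entries]
# from_entries: fold the entries back into an object.
result = {e["key"]: e["value"] for e in mapped}

print(result)  # {'a': 2, 'b': 3}
```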
kbrazil | 6 years ago

I created jello[0], which uses Python list and dict syntax to filter JSON. Here's a blog post[1] I wrote that shows how it can be used.
[0] https://github.com/kellyjonbrazil/jello [1] https://blog.kellybrazil.com/2020/03/25/jello-the-jq-alterna...
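The underlying pattern (stdin handling aside) is essentially plain Python over parsed JSON; conceptually something like this, where the data and the filter expression are made up for illustration and this is not jello's actual code:

```python
import json

raw = '{"services": [{"name": "web", "port": 80}, {"name": "db", "port": 5432}]}'
data = json.loads(raw)

# The kind of one-liner you'd hand to a Python-syntax filter tool:
result = [s["name"] for s in data["services"] if s["port"] < 1024]

print(result)  # ['web']
```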
[+] [-] jcims|6 years ago|reply
Rapzid | 6 years ago

Give something like this a go if you know JavaScript: https://www.npmjs.com/package/jsling
fokinsean | 6 years ago

With that said, this is a great overview!
inetknght | 6 years ago
s/rather than/in addition to/
For those of us with JavaScript disabled, the way the page works is perfectly fine as it is.
MichaelMoser123 | 6 years ago

Do you have some small example where they do such a UI properly, so that I can copy it? I am not much of an expert in JavaScript/CSS.