I think the reason I've been avoiding Powershell is specifically the pattern of `Verb-Noun -Option argument` the author brings up.
I'm not saying it's not "better" (however you measure that), but I think it's very wordy, and that's always been a turn-off and barrier to entry for me.
some examples:
pwd --> Get-Location
ls --> Get-ChildItem
cp --> Copy-Item
I totally understand how bash's sometimes seemingly random shortcuts and acronyms can ALSO be a huge barrier to entry (just as it was for me!), and the straightforwardness of Powershell is a lot clearer, but personally, now that I'm "in" on the shorthand of bash, it's been hard to start using Powershell.
Wordiness for some interesting reason makes it HARDER for me to remember stuff.
Kind of like how I struggled with the PMP exam and memorization, because "Integrated Change Control Management" became "Controlled Management Change Integration" or "Managed Variation Integration" or whatever in my mind.
"Retrieve-Child-Item" vs "Get-ChildItem", "Send-Command" vs "Invoke-Command", "Remove-Thingamajiggy" vs "Delete-Whatchamacallit"... my mind is seemingly better equipped to memorize "obscure but unique" gobbledygook than "meaningful-but-generic" verbiage :|
It also doesn't help that by default (at least in my experience), Powershell's tab completion is slow and annoying to use. I don't know if it's technically inferior, but it feels terrible.
It does the thing where you get to cycle through various options instead of completing to the longest common prefix, which is really hard to get used to after years and years of interfaces that do the other thing.
It's also difficult to form a repertoire of common shortcuts over time because so many commands share a prefix, so most often the shortest you will be typing is 5-6 characters before you get to anything unique.
Though I definitely agree that bash's abbreviations are a barrier to entry, I think a bigger impediment not just with bash but many languages (say, Haskell) is ungooglable operator syntax.
If you didn't know what it was, how would you figure out what `$(expression)` means in bash for example?
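For the record, a quick illustration of what `$(expression)` does (command substitution), which you would indeed have a hard time googling from the symbols alone:

```shell
# $( ... ) is "command substitution": the command runs in a subshell and its
# stdout is substituted into the surrounding expression.
kernel=$(uname -s)
echo "running on: $kernel"

# Unlike the older backtick form, $( ... ) nests cleanly:
parent=$(basename "$(dirname /usr/local/bin)")
echo "$parent"    # prints: local
```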
Commonly used terms/functions tend to become shorter and symbolized as notation, similar to how all the math notation came to be. A mathematician would be unproductive without those shorthand notations.
But I guess there is a balance. Not everyone wants to write matrix multiplication as A+.×B; A%*%B seems acceptable to some, but most nowadays write it as np.dot(A,B).
Fwiw, those aliases you mention all exist. I always do "ls" and "cd" for example. But what really annoys me is that there's no aliases for arguments. So "find . -name foo" becomes "ls -Recurse -Include foo" which makes my carpal tunnel just a little worse.
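One partial mitigation, for what it's worth: PowerShell accepts any unambiguous prefix of a parameter name, so while arguments have no formal aliases, they can usually be shortened anyway (a sketch; the prefixes assume Get-ChildItem's parameter set):

```powershell
# Unix:       find . -name foo
ls -Recurse -Include foo   # PowerShell, via the built-in ls alias for Get-ChildItem
ls -rec -inc foo           # same command; unambiguous parameter prefixes are accepted
```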
It's a shame because the rest of Powershell is so good.
More importantly, when you have to adhere to the "Generic Verb-Generic Noun" pattern, the namespace fills up quickly and now creating a new program and sharing it is stifled by two problems:
1. coming up with a name that won't clash with others in already-extremely-limited namespace
2. marketing of a generically-named thing.
In the Unix world, let's take top. After top came htop. How would that play out in the Powershell world? List-Processes -> SuperDuperList-Processes?
Microsoft may contribute to open source, but not all contributions are good. Powershell breaks with every significant open-source language style and is therefore typically a bad investment of time for a non-Windows developer.
MS technologies have sometimes been a bad investment for Windows developers too.
As others have said, aliases do exist, but I don't think you can deny that verb-noun makes learning Powershell a lot easier. For instance, if you want to get a user account, you can guess that Get-User exists, and you can therefore also assume that Set-User exists as a command. Tab completion helps a lot too with figuring out what options there are.
Lots of people here are suggesting the main benefit of PowerShell is its object model, and indeed that is very useful, but there are other great features as well. First and foremost, PowerShell basically has a command-line parameter framework built in. You also have a runtime backed by one of the best standard libraries out there (.NET), one you can easily reach into from anywhere in your PowerShell scripts. It also has a module ecosystem supporting development in either PowerShell or C# proper. And now with PowerShell Core it's cross-platform. PowerShell also supports pipelines, but I mention this last because it's obviously not a distinguisher from bash. It's really not even a contest: PowerShell is way more... well, powerful than bash. Since I've become proficient I would never go back to bash.
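The built-in parameter framework mentioned above deserves an example: a plain param() block gives a script typed, named, validated arguments with no hand-rolled parsing (a minimal sketch):

```powershell
# save as copy-stuff.ps1 and call e.g.:  .\copy-stuff.ps1 -Path data.txt -Force
param(
    [Parameter(Mandatory = $true)]   # prompts if omitted
    [string] $Path,

    [ValidateRange(1, 10)]           # rejects out-of-range values up front
    [int] $Retries = 3,

    [switch] $Force                  # boolean flag
)
"Processing $Path with $Retries retries (force: $Force)"
```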
This isn’t necessarily something I’m proud of, but at my old C# job I would do inline C# in Powershell as a very hackish “C# REPL” for prototyping and interactive testing. At the time C# Interactive in Visual Studio was unreliable and I found it easier to just copy-paste C# code into a Powershell script.
It seems that C# Interactive has gotten better (and since leaving that job I have switched to 100% F# for .NET stuff). But a more useful application is using Powershell to bundle a .NET class library into a flexible, low-weight, modular command line application for internal use. For instance, a C# library which does serious analytics on large data, and then a Powershell script that deals with easier annoyances like AWS authentication or FTP access, argument parsing, and so on. Obviously a real .exe is a better long-term solution but I found Powershell worked really well for rapidly sharing compiled .NET code into a tool that data scientists on my team could use.
I don't think it's a question of it being more powerful than bash. Why does it need to be? I'm not going to use shell script to write apps. There are better scripting languages, like Python and Node, for non-trivial apps.
I've been a PowerShell user for about 7 years now, or more. I have a lot of bindings and helpers in my PowerShell profile, and I use it every day for work.
That being said, I'll never use it on Linux as my shell. It's just too slow to start. Fish, nushell and bash all start instantaneously; PowerShell (legacy and Core) takes more than a second to start, even on beefy machines.
I've been looking at the PowerShell Core repo in hopes of fixing this with the .NET Core ReadyToRun profile; they seemed to have something like that in place, but it was disabled at the time.
Anyway, PowerShell is good, probably the best you can get on Windows (nushell also takes a bit to start, and it's still new). But on Linux you can do much better, even if that means having to struggle with Bash/Fish scripting.
For more complex scripts, a full language like Lua or Python is most likely better.
Also, last time I checked, the Docker container for PowerShell Core was easily over 100MB, if I remember correctly. That might work well for a dev machine, where you set it up once, but for CI it's not ideal.
PowerShell 7.2.0-preview.2 starts nearly instantaneously for me on Linux, while it's significantly slower on Windows.
The non-core version on Windows takes ages (up to 10 seconds) to start. It's helped a bit by configuring it with -NoLogo, but still extremely slow, especially as soon as there are any modules loaded in the profile.
Pipes in powershell are just as powerful as in bash.
Every curly-brace pair is a lambda function with a single implicit parameter, $_.
You can approach powershell very functionally and the abstractions work fine.
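For instance, the functional style described above looks like this in practice:

```powershell
# Each { ... } script block receives the current pipeline object as $_ :
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 50MB } |
    Sort-Object WorkingSet64 -Descending |
    ForEach-Object { '{0}: {1:N0} bytes' -f $_.ProcessName, $_.WorkingSet64 }
```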
It is however a minefield of pitfalls.
- Obscure error handling at the script root. Always put your code in a main function that is called from the script root.
- Permissive access to undefined variables and parameters.
- Unwinding nested attributes by referring to them at the root object level: what if you need an attribute from an object, but the attribute collides with a built-in method?
- Truthiness and equality are not interchangeable: -eq and Object.Equals produce vastly different results, because the former is not type-safe.
- Some operators are syntactically indistinguishable from flags and parameters: -join.
- Case insensitivity. Seriously...
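Two of those pitfalls in action (behavior as I understand it; worth verifying in your own shell):

```powershell
# -eq coerces the right operand to the type of the left operand:
"5" -eq 5        # True  (the integer 5 is converted to the string "5")
"5".Equals(5)    # False (.NET Equals is type-safe: string vs. int)

# And against an array, -eq acts as a filter rather than a boolean test:
@(1, 2, 3, 2) -eq 2    # returns 2, 2
```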
All in all, if you are mindful of the pitfalls, use Set-StrictMode early on, and don't run code at the script root, you are fine. Better than bash. Well-written PowerShell code is, in my opinion, far more readable and maintainable than well-written bash code, despite its shortcomings.
However, neither comes close to Python in terms of type safety, error handling, legibility, flexibility and maintainability, even for system scripting.
My favorite is that functions don't actually have return statements.
A return is just an exit for the function. If you try to assign the output of a function to a variable, you actually get everything the function wrote to the output stream over its duration, not just the value after return.
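Strictly speaking, the captured output is an array of objects rather than a string, but the pitfall is real; a minimal demonstration:

```powershell
function Get-Answer {
    "diagnostic message"   # silently emitted to the output stream
    return 42
}
$result = Get-Answer
$result.Count   # 2  -- the "return value" includes everything the function emitted
$result[0]      # diagnostic message
$result[1]      # 42
```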
Makes me wonder if there might ever be support for standardizing a json input/output schema for all the typical Linux userland stuff, /proc structures, and so on. Then a "jq" like utility would have more value. Having it co-exist with the current "just a bunch of text" isn't ideal though, as you'd be stuck with something clunky like adding a switch to every command. Or an environment variable that changes everything. Or a "j-" version of each.
The best approach I've heard is to "standardize" on doing it with IO descriptors, so we'd have stdin, stdout, stderr, stdjsin, stdjsout. Individual programs could check which file descriptors are open and read/write the format automatically. This may also be the best way to leverage the benefits of plain text and structured data and move between them quite naturally as the situation demands. It's also really not that hard to write a few basic utils or fork old ones to the new scheme as a proof of concept, but AFAIK no one ever has.
While we're here, I'd also like to see more experimentation on using file descriptors with GUI/TUI tools. I've had good luck using things like vim in the middle of a pipe for data that needs a human touch before being passed through. The suckless world uses dmenu quite a bit.
JFTR, stumbled over this recently: https://github.com/kellyjonbrazil/jc
"CLI tool and python library that converts the output of popular command-line tools and file-types to JSON or Dictionaries. This allows piping of output to tools like jq and simplifying automation scripts."
There was an endeavor sometime around 1998-2002 to reimplement a lot of coreutils, textutils, and other GNU-ish *utils in perl with regular, structured output formats (I believe XML at the time). Unfortunately, I don't remember the exact name and I don't think it was ever completed enough to be a contender.
Maybe a better idea would be something like the HTTP Accept header for the shell, where the user can set it once and any supporting application will respond in that format.
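That idea could be prototyped today with a convention as simple as an environment variable (SHELL_ACCEPT is a made-up name, purely a sketch):

```shell
# A cooperating tool checks the "accept" variable and emits that format:
list_users() {
    if [ "${SHELL_ACCEPT:-text}" = "json" ]; then
        printf '[{"name":"alice"},{"name":"bob"}]\n'
    else
        printf 'alice\nbob\n'
    fi
}

list_users                     # plain text, as today
SHELL_ACCEPT=json list_users   # structured output for jq and friends
```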
I think for the devops person, or someone doing server administration (sysadmins etc.), powershell everywhere must make things easier, even with the trade-offs. (As some have mentioned, it does have a bigger memory footprint, which may or may not matter depending on a host of factors.)
As a developer? I haven't found PowerShell more useful than zsh/bash or fish. (If you haven't, try fish: it has a lot of the benefits of PowerShell, its own scripting language that is more "language like", syntax-wise similar to some of the simpler constructs of Python, and via a simple plugin[0] you get bash compatibility too. And it's made for Unix-like environments.) I do like that PowerShell has a rich object data model; I just don't do that kind of thing in my shell. I mostly use aliases, shortcuts, and maybe some grepping. I don't do heavy-duty tasks from the command line where I'm not writing the logic in the first place, and I just find it easier to use the standard my team does (currently, that's JavaScript; with a shebang it executes just like a binary, and we can reliably say everyone has the same version of node).
Maybe in the future this will change, but personally I don't see the win in dividing my attention economy for it.
I agree it's more for a devops person, not developers so much. Specifically Windows techs who maybe don't use Linux very often. I've steered a couple of the new IT guys towards it when they ask about how to automate something. It's much easier to learn fresh than shell scripts, batch files or C#.
The only place where I use PowerShell as a developer is in Visual Studio, in NuGet Package Manager. Update-Package, Generate-Migration, Create-Database, etc..
I tried Powershell for cross-platform usage. While I didn't put it through its paces, I didn't have any issues on Linux. In the end, I ditched it:
- Powershell is too verbose. I might as well use Python or node. Hard to beat lodash and node.js for processing objects.
- Powershell starts slow, almost 2 sec. I use i3 and it's noticeable when I start a terminal with Powershell. Sure it's a one time cost but I'm nerdy like that.
- If I work on Windows I use virtual machines or WSL2 obviating the need for a cross platform shell.
It's funny how the article claims that treating everything like a string is a drawback, when it's often touted as a strength of bash.
The author of this piece claims: "Powershell [...] offers a series of commands useful for developing tools and automatisms that are very difficult to implement with simple strings." but as far as I can tell, they don't go on to actually explain any of these cases where Powershell is a more appropriate tool than traditional string-based shells.
Imagine you need to get the MAC address of a network adapter in Bash. One way to do it would be something like:
ifconfig eth0 | awk '/^[a-z]/ { iface=$1; mac=$NF; next } /inet addr:/ { print iface, mac }'
Approaches like this are tied to the way the MAC address is printed in the output of the ifconfig command, and there's no contract that says that can't change. In fact, there are probably versions of ifconfig out there where this won't work. In PowerShell you would do this...
This is far more readable if you pass this script on to someone else, it won't break if the way Get-NetAdapter's output gets rendered to the screen changes, and best of all, since I know the discoverability tricks in PowerShell, even though I've never done this before, I didn't have to go to stack overflow to find it.
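The PowerShell command itself was elided from the comment; given the Get-NetAdapter reference, it was presumably something along these lines (my guess, not the original):

```powershell
Get-NetAdapter | Select-Object Name, MacAddress
```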
The benefit of powershell - the rich object data model - isn't actually that useful in practice day to day.
The memory usage is ridiculous for a shell.
The killer though is a disregard for economy of expression. The example in the article can be expressed as just "find /path -mtime -3".
I have kept ps on Windows, I didn't go back to cmd.exe there, but WSL2 is still my default when opening Microsoft Terminal.
I can bring up 10-30 instances any day, in parallel, without a glitch, while still consuming a low amount of memory. It's higher than bash, but bash has essentially zero features compared to pwsh, so I guess it can afford to use less memory. Bash is ancient; memory was a much bigger problem back then. It's funny that in the age of Electron apps we talk about memory when your todo app takes more than anything else.
> The killer though is a disregard for economy of expression.
?
> the rich object data model - isn’t actually that useful in practice day to day.
It's the most useful everyday feature for me.
I love powershell because I have only a basic working knowledge of Linux commands. In bash this means I have to google a bit for even slightly complex things like 'change the extension of all .swp files under this directory'.
In powershell the short syntax tends to be noisy but I usually can start with the linux commands and muddle through with tab completion:
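The example itself was elided from the comment; for the '.swp' task quoted above, the muddle-through version might look like this (my sketch, not the commenter's actual command):

```powershell
# Rename-Item accepts a script block for -NewName, applied per piped file:
ls -Recurse -Filter *.swp |
    Rename-Item -NewName { $_.Name -replace '\.swp$', '.bak' }
```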
Almost always when I see PowerShell critiqued by people used to Linux and Bash, the criticisms leveled against if often add up to: "I am used to the workarounds for the limitations of my system. Sure, your system does not have these limitations, but what if I need the workarounds, like with my current system?"
This is like... AutoCAD before the year 2000. It was digital paper, and acted exactly like a drafting board with pens, rulers, and protractors. It was better than paper, but not by much! SolidWorks came out and blew it away. It was proper 3D, with constructive solid modelling. It could generate drawings of arbitrary projections or cross sections in under a second, saving months of time. Yet... people complained the same way. What if I need drafting tools? What if I want to draw lines manually? How do I draw "just" a circle? Why do I need to define what I'm doing in 3D? Just give me digital paper! That's what I want!
I made a comment on YC News nearly a year ago that I'm going to partially repeat below: https://news.ycombinator.com/item?id=23257776
PowerShell is more UNIX than UNIX.
Seriously. In UNIX, if you want to sort the output of "ps"... sss... that's hard. Sure, it has some built-in sorting capabilities, but they're not a "sort" command; they're random addons it has accumulated over time. It can order its output by some fields, but not others. It can't do complex sorts, such as "sort by A ascending, then by B descending". To do that, you'd have to resort to parsing its text output and feeding that into an external tool. Ugh.
Heaven help you if you want to sort the output of several different tools by matching parameters. Some may not have built-in sort capability. Some may. They might have different notions of collations or internationalisation.
In PowerShell, no command has built in sort, except for "Sort-Object". There are practically none that do built in grouping, except for "Group-Object". Formatting is external too, with "Format-Table", "Format-List", etc...
So in PowerShell, sorting processes by name is simply:
ps | sort ProcessName
And never some one-character parameter like it is in UNIX, where every command has different characters for the same concept, depending on who wrote it, when, what order they added features, what conflicting letters they came across, etc...
UNIX commands are more an accident of history than a cohesive, coherent, composable design. PowerShell was designed. It was designed by one person, in one go, and it is beautiful.
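The complex sort from above ("by A ascending, then by B descending") is likewise uniform, and works on the output of any command:

```powershell
# Sort processes by memory descending, then by name ascending:
Get-Process |
    Sort-Object @{ Expression = 'WorkingSet64'; Descending = $true },
                @{ Expression = 'ProcessName';  Descending = $false } |
    Format-Table ProcessName, WorkingSet64
```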
The acid test I give UNIX people, to see if they really understand how weak their classic bash tools are, is this:
Write me a script that takes a CSV file as an input, finds processes being executed by users given their account names and process names from the input file, and then terminates those processes. Export a report of what processes were terminated, with ISO format dates of when the processes were started and how much memory they used into a CSV sorted by memory usage.
Oh, there's a user called "bash", and some of the CSV input fields may contain multiple lines and the comma character. (correctly stored in a quoted string, of course!)
This kind of thing is trivial in PowerShell. See if you can implement this, correctly in bash, such that you never kill a process that isn't in the input list.
Give it a go.
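For reference, a hedged, untested sketch of how this might look in PowerShell (not the author's solution; it assumes the input CSV has UserName and ProcessName columns, and that Get-Process -IncludeUserName is available, which requires elevation):

```powershell
$targets = Import-Csv input.csv    # Import-Csv copes with quoted, multi-line fields

Get-Process -IncludeUserName |
    Where-Object {
        $proc = $_
        $targets.Where({ $_.UserName    -eq $proc.UserName -and
                         $_.ProcessName -eq $proc.ProcessName }).Count -gt 0
    } |
    Sort-Object WorkingSet64 -Descending |
    ForEach-Object {
        Stop-Process -Id $_.Id
        [pscustomobject]@{
            ProcessName = $_.ProcessName
            UserName    = $_.UserName
            StartTime   = $_.StartTime.ToString('o')   # ISO 8601
            MemoryBytes = $_.WorkingSet64
        }
    } |
    Export-Csv report.csv -NoTypeInformation
```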
...
After I posted the above, "JoshuaDavid" provided a correct Bash solution, which blew my mind because I just assumed it was borderline impossible: https://news.ycombinator.com/item?id=23267901
Note how complex his solution is, and that he had to resort to using "jq" to convert the output of "ps" to JSON for the processing!
The PowerShell equivalent, by contrast, is clear, readable, and easy to modify even for a junior tech.
What I didn't say in that thread was this: I didn't actually bother to work out the solution to my toy problem in PowerShell before JoshuaDavid posted his solution.
I made up the problem simply assuming that it's ludicrously difficult in bash -- without checking -- and I similarly assumed that it's trivial in PowerShell -- without bothering to check.
I was that confident.
Are you still that confident that Bash is superior to PowerShell? Or have you internalised its constraints, and are too used to drawing fiddly little lines on digital paper to realise that your tooling is hopelessly outmatched by solid modelling?
Well... if it's about complex scripting, you have Python or Perl, which are standard, well integrated, lightweight, powerful and preinstalled on Linux.
If it's for a day to day usage as a shell, bash / fish / zsh... are more concise and faster.
I wish there were a sane way to bootstrap powershell without having to use an existing binary release. It is not surprising though; you can't bootstrap bash using .NET Core either.
BoiledCabbage | 5 years ago
Essentially every command you'd use has a 2- or 3-character alias that is easy to remember and quick to type.
On top of that, they're almost programmatically named, so if you know a PowerShell command's full name you can almost certainly guess the alias.
Typing Get-Alias lists them all out.
https://docs.microsoft.com/en-us/powershell/module/microsoft...
rossvor | 5 years ago
It's perennially on my "to check out" list and I haven't actually played around with it, mainly because I'm lazy and it's not in the AUR.
So I can't comment on how ready it is to use today. But it looks interesting.
[1] https://relational-pipes.globalcode.info/v_0/index.xhtml
bloblaw | 5 years ago
https://helgeklein.com/blog/2014/11/hate-powershell/
I still use CMD.exe, so please get off my lawn ;-)
When I need the power of Powershell, I use C# (or even Python + Py2Exe if I'm deploying).
mjevans | 5 years ago
Did PowerShell devs learn nothing from what was horrid about PHP?
[0]: https://github.com/edc/bass
Seriously. In UNIX, if you want to sort the output of "ps"... that's hard. Sure, it has some built-in sorting capabilities, but they're not a "sort" command; they're a random add-on it has accumulated over time. It can order its output by some fields, but not others. It can't do complex sorts, such as "sort by A ascending, then by B descending". To do that, you'd have to resort to parsing its text output and feeding that into an external tool. Ugh.
Heaven help you if you want to sort the output of several different tools by matching parameters. Some may not have built-in sort capability. Some may. They might have different notions of collations or internationalisation.
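For comparison, the text-pipeline version of a two-key sort with coreutils, sketched here on made-up whitespace-delimited data rather than real ps output:

```shell
# Field 1 ascending, field 2 descending numeric. Fine for toy data;
# fragile once real fields contain spaces or locale-dependent collation kicks in.
printf 'b 2\na 1\na 3\n' | LC_ALL=C sort -k1,1 -k2,2nr
# a 3
# a 1
# b 2
```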
In PowerShell, no command has built-in sort, except for "Sort-Object". There are practically none that do built-in grouping, except for "Group-Object". Formatting is external too, with "Format-Table", "Format-List", etc...
So in PowerShell, sorting processes by name is simply:

    Get-Process | Sort-Object Name
The sort key is a named property, never some one-character parameter like it is in UNIX, where every command has different characters for the same concept, depending on who wrote it, when, what order they added features, what conflicting letters they came across, etc... UNIX commands are more an accident of history than a cohesive, coherent, composable design. PowerShell was designed. It was designed by one person, in one go, and it is beautiful.
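Even the multi-key sort mentioned above is one call; a sketch (property names come from Get-Process, and the hashtable form flips direction per key):

```powershell
# Company ascending, then working set descending -- per-key direction
# lives in Sort-Object, not in flags scattered across every command.
Get-Process |
    Sort-Object Company, @{ Expression = 'WorkingSet'; Descending = $true } |
    Format-Table Name, Company, WorkingSet
```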
The acid test I give UNIX people, to see if they really understand how weak the classic bash tools they use are, is this:
Write me a script that takes a CSV file as an input, finds processes being executed by users given their account names and process names from the input file, and then terminates those processes. Export a report of what processes were terminated, with ISO format dates of when the processes were started and how much memory they used into a CSV sorted by memory usage.
Oh, there's a user called "bash", and some of the CSV input fields may contain multiple lines and the comma character. (correctly stored in a quoted string, of course!)
This kind of thing is trivial in PowerShell. See if you can implement this, correctly in bash, such that you never kill a process that isn't in the input list.
Give it a go.
...
After I posted the above, "JoshuaDavid" provided a correct Bash solution, which blew my mind because I just assumed it was borderline impossible: https://news.ycombinator.com/item?id=23267901
Note how complex his solution is, and that he had to resort to using "jq" to convert the output of "ps" to JSON for the processing!
Compare to the solution in PowerShell, of which nearly half is just sample data: https://news.ycombinator.com/item?id=23270291
Clear, readable, and easy to modify even for a junior tech.
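Not the linked solution verbatim, but a hedged sketch of its shape; the column names UserName and ProcessName are assumed from the problem statement, and -IncludeUserName needs an elevated Windows session:

```powershell
# Import-Csv handles quoting, embedded commas and embedded newlines for us.
Import-Csv targets.csv | ForEach-Object {
    $row = $_
    Get-Process -Name $row.ProcessName -IncludeUserName -ErrorAction SilentlyContinue |
        Where-Object UserName -eq $row.UserName
} | ForEach-Object {
    $report = [pscustomobject]@{
        Name      = $_.Name
        StartTime = $_.StartTime.ToString('o')          # ISO 8601
        MemoryMB  = [math]::Round($_.WorkingSet64 / 1MB, 1)
    }
    Stop-Process -Id $_.Id
    $report
} | Sort-Object MemoryMB -Descending |
    Export-Csv report.csv -NoTypeInformation
```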
What I didn't say in that thread was this: I didn't actually bother to work out the solution to my toy problem in PowerShell before JoshuaDavid posted his solution.
I made up the problem simply assuming that it's ludicrously difficult in bash -- without checking -- and I similarly assumed that it's trivial in PowerShell -- without bothering to check.
I was that confident.
Are you still that confident that Bash is superior to PowerShell? Or have you internalised its constraints, and are too used to drawing fiddly little lines on digital paper to realise that your tooling is hopelessly outmatched by solid modelling?
[+] [-] knicknic|5 years ago|reply
Really, really miss bash's set -e.
Find it hard to set a script to abort with a stack trace.
Find it hard to deal with relative imports (this script imports a file in the same folder).
Dislike explaining the scoping rules.
Dislike explaining how your array is now a single object when you returned it from a function.
Absolutely love PowerShell's JSON support; miss native YAML support.
Love parameter globbing.
Love the integration of parameters with a script; dislike that auto-generated help can't be done via a single-line comment on a function.
[+] [-] Toniglandyl|5 years ago|reply
For day-to-day usage as a shell, bash / fish / zsh... are more concise and faster.
The example given in the article:

    # Get all files modified in the last 3 days
    Get-ChildItem -Path path -Recurse | Where-Object {
        $_.LastWriteTime -gt (Get-Date).AddDays(-3)
    }

is just:

    find path -mtime -3

in bash ...
The object thing is nice, but using strings as output is universal.