Ever wanted to know the progress of a long-running operation, such as copying a file or importing a MySQL database? Pipe Viewer (pv) is what you need: http://www.ivarch.com/programs/pv.shtml.
It lets you monitor the progress of any piped command and gives you time elapsed, speed, time left, and a progress bar (wget style).
Pipe Viewer dramatically increased my productivity for large-scale data processing. In particular, it lets you quickly know whether something will take 5 minutes or 2 hours, so you can plan accordingly. It's painful watching people try to do this without pv.
S3 Tools provides a great suite of utilities for interacting with S3, and is best used on an EC2 instance you are connected to via SSH. It is also ridiculously fast, much faster than interacting with S3 from a local FTP browser or even from Amazon's own S3 dashboard. For example, using Amazon's web-facing dashboard from my computer, it takes about 30-45 minutes to make 14,000 files on S3 public; with the command line tool running on one of my EC2 instances, the same files can be made public within minutes.
I assume this is because it is local network traffic. Anyway, if you are ever in a bind and need to move a bunch of files to S3, I highly recommend S3 Tools. It has saved me many times.
Along those lines, wget is the most powerful command line tool I've ever used. It is simply incredible when combined with S3 Tools, letting you grab gigabytes of images off a personal server or staging location and upload them to S3 very quickly.
And if you need to do more than move files around, you can manage even more aspects of AWS, including EC2 instances, from the command line using this powerful tool:
I prefer curl'ing icanhazip.com, and I make it a quick alias in my .bashrc:
alias myip="curl icanhazip.com"
ApacheBench (ab) is a decent alternative to siege, and knowing how to use tcpdump and netcat comes in handy for debugging. Other than that, my new favorite command line tool over the past several months has to be vagrant, which lets you script and streamline VM creation and builds from the command line. If I need to completely reproduce my production environment on a test box, it's my utility of choice.
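The basic vagrant loop is only a few commands (the box name below is just an example, not from the original comment):

```shell
vagrant init ubuntu/trusty64   # write a Vagrantfile referencing an example base box
vagrant up                     # create and boot the VM
vagrant ssh                    # open a shell inside it
vagrant destroy                # tear the VM down when finished
```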
My most used CLI tool outside of the default nuts and bolts is dtrx, the best and easiest file extractor for *nix. No more fiddling with flags or looking up man pages, and it handles archives that would otherwise dump files into the wrong directories or extract with the wrong permissions. It has saved me a ton of time over the years.
Wow! Just wow! Bookmarked! A mere upvote is not enough for this! Thank you!!!
Questions:
- Is there a site pointing out amazing command line tools like this? (their existence as opposed to usage examples like at http://www.commandlinefu.com/)
- Is there a site listing the OS X equivalents for Linux command line tools? (e.g. what's the OS X equivalent for dos2unix / flip?)
lftp is a practical SFTP, FTPS and FTP transfer program, including automatic upload/download resumption and synchronization (mirror) mode. Good for both interactive use and scripting.
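A typical resumable mirror run looks something like this (host and paths are examples):

```shell
# Mirror a remote directory locally over SFTP, continuing any partial files (-c)
lftp -e 'mirror -c /remote/dir /local/dir; quit' sftp://user@example.com
```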
curl -I and wget -S are particularly helpful when debugging redirects.
Sometimes I migrate URL schemes, and set up permanent redirects in my .htaccess files. Testing them in a browser is a real pain, because browsers cache the redirect (which is the point of having a permanent redirect), so even if you change the .htaccess, you still get the old response. And pressing the refresh button is no help, because that reloads the destination page, not the source of the redirect.
That's when a non-caching command line client saves your day.
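For example, to inspect the redirect with no browser cache in the way (URL hypothetical):

```shell
# -I asks for headers only, so you see the 301 and its Location header directly
curl -I http://example.com/old-page

# -L follows the redirect chain so you can also check the final destination
curl -I -L http://example.com/old-page

# wget -S prints the server's response headers as it fetches
wget -S -O /dev/null http://example.com/old-page
```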
Screen is a bit like a window manager for consoles. I use it to start multiple (software) servers without daemon mode. I can then switch between their outputs with screen.
Screen also detaches the console from your ssh session. In my case this means that the servers keep running if I lose my ssh connection to the (hardware) server.
It's a very handy tool and definitely belongs in the article's list of tools.
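A minimal detach/reattach workflow with screen looks like this (the session name is an example):

```shell
screen -S servers    # start a named session and launch your servers inside it
# ... press Ctrl-A d to detach; the session keeps running ...
screen -ls           # list sessions after reconnecting over ssh
screen -r servers    # reattach to the running session
```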
(Sort of) in the same vein, I've recently started using xmonad as my window manager and so far it's a lot more comfortable than the Ubuntu default. You may need to learn a teeny bit of Haskell to get up and running, but so far I've been OK copy-pasting from sample configs and muddling through.
I hate that the default escape key sequence clobbers Ctrl-A, though, so the first thing I always have to do when I log into a new server or account is this:
$ echo 'escape ^uU' > ~/.screenrc
Or I quickly start tearing my hair out and screaming profanities every time I try to do something.
screen is one of those things that's been on my todo list for way too long, along with its alternatives(?), tmux and byobu. Anyone that uses/used all three and can offer a comparison?
It's been mentioned on Hacker News a few times before, but my project PageKite (and showoff.io and localtunnel) is designed to help out with quick web demos and collaboration.
... answer a few questions and whatever is running on port 80 will be visible as https://SOMENAME.pagekite.me/ within moments, almost no matter what kind of network connection you have. :-)
There are also .deb and .rpm packages available for heavier users.
ngrep was news to me. I think I've heard of it once before, recently, but didn't really realize what it was. Having not tried it, it sounds like a really nice option when something like Wireshark or Ethereal would be overkill or just too much effort to bother with.
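For reference, a typical ngrep invocation looks like this (interface and match pattern are examples):

```shell
# Quietly (-q) print packets on any interface (-d any) whose payload matches GET, on port 80
ngrep -q -d any 'GET' port 80
```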
bretthopper | 14 years ago:
Copy a file:
Import a mysql db:
More tricks: http://blog.urfix.com/9-tricks-pv-pipe-viewer/
edit: To install on OS X just do
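The commands elided above were presumably along these lines (a sketch: the filenames, database names, and the Homebrew install are assumptions, not the original text):

```shell
# Copy a file with a progress bar: pv reads the source and reports as it writes
pv source.iso > /backup/source.iso

# Import a MySQL dump while watching how far along it is
pv dump.sql | mysql -u root -p mydb

# On OS X, pv is commonly installed via Homebrew
brew install pv
```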
NathanKP | 14 years ago:
S3 Tools: http://s3tools.org/s3tools
For managing even more aspects of AWS, including EC2 instances, from the command line: http://www.timkay.com/aws/
It's great for writing automated bash scripts to manage your instances via cron jobs.
genieyclo | 14 years ago:
http://brettcsmith.org/2007/dtrx/
nikcub | 14 years ago:
For example:
will start 10 concurrent download processes (n) and continue (c). lftp is full of hidden gems.
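The elided command is presumably lftp's pget, whose -n and -c flags match that description (the URL is hypothetical):

```shell
# Download one file over 10 parallel connections (-n 10), resuming if interrupted (-c)
lftp -e 'pget -n 10 -c http://example.com/big.iso; quit'
```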
vld | 14 years ago:
wget post-file.it --post-file /path/to/file