Do you have a favorite bash tip or trick to share with others? We'd like to collect a growing set of "best practices", common idioms, and useful shortcuts. Unfortunately we've been trashed too many times by people abusing this site, so we've had to disable general editing of this page. If you have a good tip or trick, e-mail it to us.

Thanks for your help with this effort, and remember: all submissions become the property of the authors and editors of "bash Cookbook", which is how the lawyers say it when we mean "we can't pay you with anything but our heartfelt thanks."

For more tips and tricks, follow us on Twitter.


Absolute Value
An easy way to take the absolute value of a variable is to use bash parameter expansion to remove the leading minus sign, if any. Just write:

    ${VAR#-}

Get to the bottom
Recipe 16.15 (page 383) has the alias:

    alias bot='cd $(dirname $(find . | tail -1))'

But I just wanted to get to the bottom of a tree without cd'ing into it first. So I changed it to:

    # cd to the bottom of a narrow but deep dir tree
    function bot {
        local dir=${1:-.}
        \cd "$(dirname "$(find "$dir" | tail -1)")"
    }

p.s. I've made a note of that for the next edition. -JP


simple array assignment
When you have a list of values in a shell variable and you want to put them into an array, it's easy - as long as the values have no embedded blanks. It goes like this:

    MYVALS="first second third fourth"
    MYRA=($MYVALS)

That's all there is to it. Then you can reference a piece as:

    echo ${MYRA[2]}

The echo will print third (remember, bash arrays are zero-based).

p.s. Thanks for the question, Jim.


the last argument
The arguments to a script (or function) are $1, $2, ... and can be referred to as a group by $* (or $@). But is there an easy way to refer to the last argument in the list? Try ${!#}, as in:

    echo ${!#}
    LAST=${!#}

capturing time output
The time keyword in bash will report on the real, user, and system time that a pipeline of commands consumes. How can I capture that output in a file or pipeline?

You'll notice the "problem" if you try a simple exercise like:

$ time ls >/dev/null 2>&1

real    0m0.005s
user    0m0.002s
sys     0m0.003s
$

We redirected both stdout and stderr, yet we still get output. Since time is a keyword, it's treated specially: the redirections apply to ls, not to the time keyword itself, so its report is unaffected. Only after the ls command completes is the time output written to the shell's stderr.

So we need to group the timed command (and its outputs) using a mechanism like parentheses. We don't need the overhead of a subshell, though, so we'll use braces to group it all together:

$ { time ls >/dev/null ; } 2>time.out
$ 

Similarly, we could use 2>&1 | somecmd ... if we wanted to pipe the time results somewhere.

p.s. Thanks, Josh, for asking. - Carl


readable dirs output
The dirs command output isn't very readable with many long pathnames. To make it more readable, I created a function (dirnl) that prints one pathname per line. Here's the bash function:

function dirnl() { for i in $(dirs); do echo $i; done }

Caveat: it does not work for pathnames with embedded blanks...but I don't use blanks in my filenames.


readable dirs output
...or you could just use the -p option on dirs:

alias dirs="dirs -p"

pipeline of commands
We discuss in the book (in the chapter on common mistakes) the fact that a pipeline of commands runs those commands in subshells. The result (or dilemma) is that whatever happens in those subshells (e.g. counting something) is lost to the parent shell script unless the parent captures output from the pipeline, but that isn't always easy or desirable.

The bash man page describes a feature of bash called "Process Substitution" that lets you substitute the output of a pipeline of commands (actually a list of commands) using <(list) as the syntax.

But notice how the feature is described:

The process list is run with its input or output connected to a FIFO or some file in /dev/fd. The name of this file is passed as an argument to the current command as the result of the expansion.
The <(...) is going to be replaced with the name of a fifo. So if you wrote:

    wc <(some commands)

the result would be:

    wc fifo

that is, the fifo filename is passed to the command. That's fine for commands like wc that can accept a filename. But what about a builtin like while?

It turns out that you can add the redirect from the fifo, but the space between the two less-than signs is crucial to distinguish it from "<<", the "here document" syntax.

So you can write:

  while read a b c
  do
    ...
  done < <(pipeline of commands)

closing stdin
When we write scripts that will run as (or launch) daemons we often remember to redirect stdout and stderr somewhere - typically a logfile. But we forget to add "<&-" on the command line to close stdin. If you don't close stdin but launch the script in the background over ssh, ssh won't "let go" of your session. Maybe you've seen that scenario? (yes, it's in the book: ch. 10, p.199) -- Carl


saving the working directory and cloning to another terminal window
I keep a lot of gnome-terminals open, each running bash. When I want to copy the working directory from one g-term window to another, I used to do a pwd in the first window, copy the directory with the mouse, and paste it after a cd in the second (and subsequent) windows. It is much easier to use these two aliases in .bashrc:

  alias sd="pwd > ~/.sd"
  alias ds='cd "$(cat ~/.sd)"'

(The single quotes on ds matter: they defer the command substitution until you actually run the alias, rather than running cat once when .bashrc is read. The inner double quotes also protect pathnames with blanks.)

Now I can type sd in the first directory, then ds in the second and subsequent directories. I am left-handed, so these keys are easy for me to type; if you are right-handed, then kl and lk might work better - and make me famous. :-) Keith Lofstrom