
On the CLI, sometimes a command I type takes a while to complete, and sometimes I know in advance that it will. I'm a bit confused about "backgrounding" and such in Linux.

What is the most common (or most user-friendly) way of telling the CLI that I don't want to wait and that it should give me back my prompt immediately? And if it could give me a progress bar or just a busy-spinner, that would be great!


9 Answers


Before running the command, you can append & to the command line to run it in the background:

long-running-command &

After starting a command, you can press Ctrl+Z to suspend it, and then type bg to put it in the background:

long-running-command
[Ctrl+Z]
bg
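
If you later want to check on the job or bring it back to the foreground, the usual job-control builtins apply (a quick sketch; %1 is just an example job number, use whatever number jobs reports):

jobs      # list background and suspended jobs with their job numbers
fg %1     # bring job 1 back into the foreground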
  • I used the command "find ~/ a.java &" but it didn't go into the background? Also, I tried Cmd+Z (on Mac)... it's not working. Commented Aug 5, 2015 at 10:08
  • @AbhimanyuAryan, maybe you want to debug your find ... command first, then run it in the background. In this case you were missing the -name option. Commented Aug 18, 2016 at 5:19
  • I hope you're okay from the volcano eruption today @Greg Hewgill Commented Dec 9, 2019 at 17:54
  • @KyleBridenstine Yeah, no problems here, it was like 800 km away. Thanks for the note. Commented Dec 9, 2019 at 18:40
  • re @abhimanyuaryan (though I know your comment was a VERY long time ago, so this is mostly for other/future readers): I think your find command did go into the background... if you think it didn't because it was generating lots of output, well, that's what backgrounded commands do. And maybe that output obscures your prompt, but try hitting return a few times, it's probably still there. (If it's more silent, then... it's probably running, too -- or it finished quickly, whether because of an error or because it was done.)
    – lindes
    Commented 2 days ago

This is my favorite of all, since apart from sending the process into the background, you don't have to worry about text output dirtying your terminal:

nohup command &

This not only runs the process in the background, it also writes a log (called nohup.out, in the current directory or, if that's not writable, in your home directory). And if you close the current shell or log out, the process is not killed, because nohup makes the child process ignore the hangup signal (SIGHUP) that would otherwise reach it when the parent shell exits (i.e. on logout or when the terminal is closed).

There is another command called disown, but that's more an extension of the other answers than a method in itself:

command & # our program is now running in the background
disown    # it is now detached from the shell; you can do whatever you want

Neither command lets you easily recover the process's output afterwards, unless you resort to some hackish way of getting it.
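
One way around that, if you don't mind deciding up front, is to pick the log file yourself when you start the command (a small sketch; my-output.log is just an example name):

nohup long-running-command > my-output.log 2>&1 &
tail -f my-output.log     # follow the output as it is written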

  • Sysadmin since '95. Never heard of disown until today, thanks! It doesn't work with every shell though (checked bash, zsh, tcsh; tcsh did not work).
    – yoonix
    Commented Dec 5, 2013 at 2:36
  • Using nohup in a job control shell is silly. Modern shells don't send HUP to background processes. Redirecting to a unique filename is a small price to pay.
    – chicks
    Commented Nov 4, 2015 at 16:47
    @chicks the use case here is to prevent stdout and stderr from polluting the terminal, with the bonus that it creates a log file automatically.
    – Braiam
    Commented Nov 4, 2015 at 16:50
  • command > file.log 2>&1 & disown is a bit longer than nohup command & but I guess that's what @chicks was suggesting. Commented Sep 1, 2016 at 7:33

This is probably what you want:

my_command > output.log 2>&1 &

This will start your command, redirecting both stdout and stderr to output.log (or whatever file you specify). If you don't care about storing the output at all, you can use /dev/null instead of an actual file.

& runs the command in the background so that you can continue entering commands while it is running. 2>&1 redirects stderr to stdout so that all output is captured.

Also, when you run a command like this, the shell prints a confirmation similar to [2] 1234. This means that your process is running in the background as job number 2 with process ID 1234, so you can kill it later if you wish with kill -9 1234.
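
If you want to find that process again later, or kill it by job number instead of PID, something like the following works in bash (the numbers are just examples):

my_command > output.log 2>&1 &
echo $!       # $! holds the PID of the most recent background command
jobs -l       # list jobs with both job numbers and PIDs
kill %2       # kill by job number...
kill 1234     # ...or by PID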

  • Thank you, 2>&1 is very important because the command can fail and try to output the error in some cases!
    – ericn
    Commented Jun 16, 2017 at 3:09
  • This one worked when the above answers didn't, thanks.
    – domagoj
    Commented Oct 1, 2018 at 7:30
  • This is what I needed. Commented Jan 28, 2021 at 6:03

Look into screen or tmux. An example with tmux:

$ tmux new -d 'longrunningcommand'

While the other answers using '&' to background will work, you have to redirect stdout (and stderr!). Without doing that, the output will go straight to your shell, mixing with whatever other output you may have.

Backgrounding will also fail if you're running a long command and log out or get disconnected. The system will kill your job.

If you aren't familiar with either screen or tmux, they basically allow you to completely detach from your shell. Instead of backgrounding your program, you background the whole shell. You can then switch back to it later, even from another computer. They both have a ton more features that you may or may not find useful beyond this use case.

Screen is the old tried and true program; tmux is much younger but has learned from screen's past.
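
For example, assuming tmux is installed, a named session makes it easy to find the job again later (the session name mywork is just an example):

tmux new -d -s mywork 'longrunningcommand'   # start the command in a detached session
tmux ls                                      # list running sessions
tmux attach -t mywork                        # reattach; press Ctrl-B then D to detach again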

  • This answer misses an actual answer and sounds like an RTFM.
    – Lloeki
    Commented Jun 20, 2017 at 12:54
  • No it's not; look here.
    – imbr
    Commented Apr 27, 2021 at 12:00

(For completeness, since this part has been answered already:) You put a command in the background by adding & after the command:

long_command with arguments > redirection &

I'm adding this answer to address the other part of your question:

There's no real equivalent of a spinner for showing the progress of background commands, but you can see the status of background commands by typing jobs or jobs -l. It'll show you your backgrounded commands, and whether they're running, stopped via a signal (e.g., with ^Z), or occasionally stopped because they're waiting for interactive input from you.
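
If you really do want a busy indicator, you can fake a crude one in the shell by polling the background job's PID (a rough sketch, not a built-in feature; long_command and redirection are the placeholders from above):

long_command with arguments > redirection &
pid=$!                                # $! is the PID of the job just started
while kill -0 "$pid" 2>/dev/null; do  # kill -0 only checks that the process still exists
    printf '.'
    sleep 1
done
echo ' done'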

  • long_command with arguments &> redirection & to redirect the stderr too
    – Ice-Blaze
    Commented Jan 11, 2017 at 7:24

You can run a program in the background using &. For example, if you wanted to run yum install XyZ, you could run:

yum install XyZ &

The stdout (output) from the program can be redirected with > to overwrite a file, or with >> to append to a file. For example, if you wanted to log yum's output in a file yum.log:

yum install XyZ > yum.log &

Or, if you wanted to append the output to an existing file called log:

yum install XyZ >> log &

Errors are printed to stderr rather than stdout, and can be redirected to a file in the same way, but using 2> (or 2>> to append):

yum install XyZ 2> errors
yum install XyZ 2>> errors

If you want to redirect both stderr and stdout, you can use &> (or &>> to append):

yum install XyZ &> output
yum install XyZ &>> output
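
Putting those pieces together, you can background the command and keep stdout and stderr in separate log files (the file names are just examples):

yum install XyZ > yum.log 2> yum-errors.log &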

You can run a command in the background simply by putting an & sign after it.

For example:

areallylong_command &

will run it in the background.

You can further redirect the stdout/stderr to appropriate files so that they don't appear on your terminal while you are doing something.

See this for more info: http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-3.html
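
If you later decide that you do want to block until the backgrounded command finishes, the shell's wait builtin does that (a small sketch; areallylong_command is the placeholder from above):

areallylong_command > out.log 2>&1 &
wait %1                               # or: wait <PID of the job>
echo "finished with exit status $?"   # $? is the exit status of the job waited for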


You can also pop a script/program into the background when you realize, after you have already started it, that it is going to run for a while.

Press Ctrl-Z, which will stop (suspend) the program.

Then enter bg (for background), which will resume the program in the background.

This method also works if you started a program that uses a GUI from the terminal but need to enter other commands through the CLI. Ctrl-Z and bg will give you back the terminal window's prompt.

If you need to bring the program from the background back to the foreground, use the jobs command to get its job number, then enter fg followed by that job number (e.g. fg %1). If you only have one job in the background, a simple fg will work.

Practice with vim: Ctrl-Z works great and so does fg. I usually run multiple vim edits at the same time, jumping between function edits.
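
A rough sketch of that multi-edit workflow (the file names and job number are only illustrative):

vim foo.c     # edit, then press Ctrl-Z to suspend
vim bar.c     # edit, then press Ctrl-Z to suspend
jobs          # lists both suspended vim sessions with their job numbers
fg %1         # jump back into foo.c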


Try using the 'screen' command before you start your long-running task; then, when you detach, you can reattach with 'screen -r -d' whenever you need to. I find that terminal sessions over ssh or other network connections can sometimes be broken by a bad connection to the server; running the task inside 'screen' fixes this issue.
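
A minimal screen workflow might look like this (the session name mytask is just an example; the detach keystroke is Ctrl-A followed by D):

screen -S mytask         # start a named screen session and run your long task inside it
# ... press Ctrl-A then D to detach ...
screen -r -d mytask      # reattach later, detaching any other attached client first if needed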

