Xargs

xargs (short for "extended arguments") is a command on Unix and most Unix-like operating systems used to build and execute commands from standard input. It converts input read from standard input into arguments to a command.

Some commands, such as grep and awk, can take input either as command-line arguments or from the standard input. However, others, such as cp and rm, can only take input as arguments, which is why xargs is necessary.

A port of an older version of GNU xargs is available for Microsoft Windows as part of the UnxUtils collection of native Win32 ports of common GNU Unix-like utilities. A ground-up rewrite named wargs is part of the open-source TextTools project. The xargs command has also been ported to the IBM i operating system.

Examples
One use case of the xargs command is to remove a list of files using the rm command. POSIX systems have a limit (ARG_MAX) on the maximum total length of the command line, so a command such as rm /path/* or rm $(find /path -type f) may fail with the error message "Argument list too long" (meaning that the exec system call's limit on the length of a command line was exceeded). The latter invocation is additionally incorrect, as the shell may expand globs appearing in find's output.

This can be rewritten using the xargs command to break the list of arguments into sublists small enough to be acceptable.
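A minimal sketch of such a pipeline, using a throwaway directory rather than a real path:

```shell
# Create a scratch directory with some files to remove (illustrative setup).
dir=$(mktemp -d)
touch "$dir/a.tmp" "$dir/b.tmp" "$dir/c.tmp"

# find prints one file name per line; xargs gathers the names into
# argument lists that fit within the system limit and runs rm on each list.
find "$dir" -type f -print | xargs rm

rmdir "$dir"    # succeeds only because the directory is now empty
```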

In the above example, the find utility feeds the input of xargs with a long list of file names; xargs then splits this list into sublists and calls rm once for every sublist.

Some implementations of xargs can also be used to parallelize operations, with the -P maxprocs argument specifying how many parallel processes should be used to execute the commands over the input argument lists. However, the output streams may not be synchronized. This can be overcome by directing each process's output to a separate file where possible, and then combining the results after processing. The following example queues 24 processes and waits on each to finish before launching another.
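A sketch of the parallel form (the file names and the choice of gzip are illustrative; -P, -n, -print0 and -0 are widespread extensions rather than POSIX):

```shell
# Set up a few files to compress (illustrative).
dir=$(mktemp -d)
for i in 1 2 3 4; do printf 'example data\n' > "$dir/file$i.log"; done

# -n 1 passes one file per gzip invocation; -P 24 keeps up to 24 gzip
# processes running at once.  As each process finishes, xargs waits on it
# and launches another until the input is exhausted.
find "$dir" -name '*.log' -print0 | xargs -0 -n 1 -P 24 gzip
```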

xargs often covers the same functionality as the command substitution feature of many shells, denoted by the backquote notation (`command`) or by $(command). xargs is also a good companion for commands that output long lists of files, such as find, locate and grep, but only if one uses -0 (or, equivalently, --null), since xargs without -0 deals badly with file names containing ', " and spaces. GNU Parallel is a similar tool that offers better compatibility with find, locate and grep when file names may contain ', " and spaces (newlines still require -0).

-I option: single argument
The xargs command offers options to insert the listed arguments at some position other than the end of the command line. The -I option to xargs takes a string that will be replaced with the supplied input before the command is executed; a common choice is %.

The string to replace may appear multiple times in the command part. Using -I at all limits xargs to processing one input line at a time.
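A sketch using % as the replacement string (the source and destination directories are created just for the example):

```shell
src=$(mktemp -d); dest=$(mktemp -d)
touch "$src/notes.txt~" "$src/draft.txt~"

# -I % makes xargs run cp once per input item, substituting the item
# for every occurrence of % in the command line.
find "$src" -type f -name '*~' -print0 | xargs -0 -I % cp -a % "$dest"
```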

Shell trick: any number
Another way to achieve a similar effect is to use a shell as the launched command, and deal with the complexity in that shell, for example:
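A sketch of this idiom, continuing the backup example (the directories are created for illustration, and the destination is passed through an exported variable rather than hard-coded):

```shell
src=$(mktemp -d); dest=$(mktemp -d)
export dest
touch "$src/a.txt~" "$src/b.txt~"

# The matched file names become the positional parameters of the inner
# shell; "for f" with no list iterates over exactly those parameters.
find "$src" -type f -name '*~' -print0 |
    xargs -0 sh -c 'for f; do cp -a "$f" "$dest"; done' sh
```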

The word sh at the end of the line is for the POSIX shell sh -c to fill in for $0, the "executable name" part of the positional parameters (argv). If it weren't present, the name of the first matched file would instead be assigned to $0, and that file would never be copied. One can also use any other word to fill in that blank, my-xargs-script for example.

Since cp accepts multiple files at once, one can also simply pass all of the arguments to a single cp via "$@", guarded so that cp runs only when at least one argument was supplied. Doing so is more efficient, since only one invocation of cp is done for each invocation of sh.
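Under the same illustrative setup, a sketch of the batched variant:

```shell
src=$(mktemp -d); dest=$(mktemp -d)
export dest
touch "$src/a.txt~" "$src/b.txt~" "$src/c.txt~"

# All names arrive as one batch of positional parameters; "$@" hands them
# to a single cp.  The $# test skips cp entirely when nothing matched.
find "$src" -type f -name '*~' -print0 |
    xargs -0 sh -c 'if [ $# -gt 0 ]; then cp -a "$@" "$dest"; fi' sh
```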

Separator problem
Many Unix utilities are line-oriented. These may work with xargs as long as the lines do not contain ', " or a space. Some Unix utilities can use NUL as a record separator, e.g. Perl (requires -0 and \0 instead of \n), locate (requires using -0), find (requires using -print0), grep (requires -z or -Z), and sort (requires using -z). Using -0 for xargs deals with the problem, but many Unix utilities cannot use NUL as a separator (for example ls, echo and which).

But often people forget this and assume xargs is also line-oriented, which is not the case (by default xargs separates on newlines and on blanks within lines; substrings containing blanks must be single- or double-quoted).

The separator problem is illustrated by creating a file named important_file, a file named not important_file, and a directory named 12" records, then attempting to remove the latter two through xargs. Running such a pipeline will cause important_file to be removed (because the name not important_file is split at the space, and its second half matches the wrong file), but will remove neither the directory called 12" records, nor the file called not important_file.
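The scenario can be reproduced as follows (in a throwaway directory; the failing commands are kept deliberately, with their errors suppressed):

```shell
cd "$(mktemp -d)"
# Make some targets to practice on:
touch important_file
touch 'not important_file'
mkdir -p '12" records'

# Naive pipeline: xargs splits its input on blanks, so
# "./not important_file" becomes the two arguments "./not" and
# "important_file" -- and the second one names the wrong file.
find . -name 'not*' | xargs rm 2>/dev/null || true

# The unmatched double quote in the directory name makes xargs abort
# before rmdir ever runs, so the directory survives.
find . ! -name . -type d | xargs rmdir 2>/dev/null || true
```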

The proper fix is to use the GNU-specific -print0 option of find, although tail (and other line-oriented tools in the pipeline) do not support NUL-terminated strings and must be dropped.

When using the -print0 option, entries are separated by a null character instead of an end-of-line, and xargs must be told to expect this with -0. The same effect can be achieved with a more verbose command in which tr translates newlines to NUL bytes, or, shorter, by switching xargs to (non-POSIX) line-oriented mode with the -d '\n' (delimiter) option,

but in general using -0 with -print0 should be preferred, since newlines in file names are still a problem for the -d '\n' approach.
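The variants side by side (GNU options; the set-up is illustrative):

```shell
cd "$(mktemp -d)"
touch important_file 'not important_file'

# Preferred: find emits NUL-terminated names, xargs -0 reads them.
find . -name 'not*' -print0 | xargs -0 rm

# Equivalent but more verbose: translate newlines to NULs by hand.
#   find . -name 'not*' | tr '\n' '\0' | xargs -0 rm
# Shorter, line-oriented (fails if a name contains a newline):
#   find . -name 'not*' | xargs -d '\n' rm
```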

GNU parallel is an alternative to xargs that is designed to have the same options, but is line-oriented. Thus, using GNU parallel instead, the above would work as expected.

For Unix environments where xargs supports neither the -0 nor the -d option (e.g. Solaris, AIX), the POSIX standard states that one can simply backslash-escape every character. Alternatively, one can avoid using xargs at all, either by using GNU parallel or by using the -exec ... + functionality of find.
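A sketch of the portable forms:

```shell
cd "$(mktemp -d)"
touch important_file 'not important_file'

# POSIX-portable: backslash-escape every character so that xargs's blank-
# and quote-processing cannot misinterpret the name.
find . -name 'not*' | sed 's/\(.\)/\\\1/g' | xargs rm

# Or avoid xargs entirely; -exec ... {} + batches arguments the same way.
#   find . -name 'not*' -exec rm {} +
```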

Operating on a subset of arguments at a time
One might be dealing with commands that can only accept one or maybe two arguments at a time. For example, the diff command operates on two files at a time. The -n option to xargs specifies how many arguments at a time to supply to the given command. The command will be invoked repeatedly until all input is exhausted. Note that on the last invocation one might get fewer than the desired number of arguments if there is insufficient input. Use -n 2 to break the input up into two arguments at a time.
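A minimal illustration, relying on echo being the default command run by xargs:

```shell
# Ten words in, two per invocation out: xargs runs echo five times.
printf '%s\n' 0 1 2 3 4 5 6 7 8 9 | xargs -n 2
# prints:
#   0 1
#   2 3
#   4 5
#   6 7
#   8 9
```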

In addition to running based on a specified number of arguments at a time, one can also invoke a command for each line of input with the -L option. One can use an arbitrary number of lines at a time, but one is most common. Here is how one might diff every git commit against its parent.
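The -L behaviour can be seen with plain echo; the git form (which assumes a repository is at hand) is shown as a comment:

```shell
# -L 1 invokes the command once per input LINE, keeping the fields of a
# line together (unlike -n 1, which would split them into separate runs).
printf 'a b\nc d\n' | xargs -L 1 echo pair:
# prints:
#   pair: a b
#   pair: c d

# Diff every commit against its parent(s):
#   git log --format="%H %P" | xargs -L 1 git diff
```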

Encoding problem
The argument separator processing of xargs is not the only problem with using the xargs program in its default mode. Most Unix tools that are often used to manipulate file names (for example sed, grep and sort) are text-processing tools. However, Unix path names are not really text. Consider a path name /aaa/bbb/ccc. The /aaa directory and its bbb subdirectory can in general be created by different users with different environments. That means these users could have different locale setups, and that means that aaa and bbb do not even necessarily have to have the same character encoding; for example, aaa could be in UTF-8 and bbb in Shift JIS. As a result, an absolute path name in a Unix system may not be correctly processable as text under a single character encoding, and tools which rely on their input being text may fail on such strings.

One workaround for this problem is to run such tools in the C locale, which essentially processes the bytes of the input as-is. However, this will change the behavior of the tools in ways the user may not expect (for example, some of the user's expectations about case-folding behavior may not be met).