Laziness is one of the hallmarks of a good sysadmin. The Father of Perl, Larry Wall says so. An economy of keystrokes is the mark of a keen mind that needs to get as much done as possible.
From time to time, one needs to change a single piece of text in a lot of files, and no one wants to manually open, edit and save each file individually with vi at the command line. An example of this would be the cloning of a set of application configuration files from one host to another. But repeatedly changing a host-specific string is mind-numbing.
There are several ways to make repeated updates easily and safely, and they’re worth committing to memory as additions to your sysadmin toolbox. The best way, I believe, is with Perl.
# perl -i.bak -pe 's/this/that/g' filename
This will edit the file “filename” in place and replace all occurrences of the text “this” with the text “that”. Furthermore, it will back up the original by copying it to “filename.bak”, appending the “.bak” extension you gave to the -i switch.
The backup extension works even if you specify the shell’s “*” wildcard, in which case Perl will search, replace and back up every file in the directory. Note that it makes a backup copy of every file, whether it actually changes it or not, so this doubles the number of files in the directory.
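You can see the litter for yourself in a scratch directory (file names invented for the demo); the file with no match still gets a backup:

```shell
tmp=$(mktemp -d) && cd "$tmp"
printf 'this one\n' > a.txt
printf 'no match\n' > b.txt
perl -i.bak -pe 's/this/that/g' *
ls -1    # a.txt, a.txt.bak, b.txt, b.txt.bak -- b.txt is backed up despite being untouched
```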
So if you wanted to economically perform a find-and-replace recursively on every file under a directory and all its subdirectories, something like this would suffice (use with caution):
# cd DIR
# perl -i.bak -pe 's/foo/bar/' `grep -lr foo *`
grep -l returns a list of files (with their paths) that contain the pattern “foo”, so the command limits itself to matching files only, rather than unnecessarily littering your filesystem with extraneous backup copies.
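A sketch of the recursive variant in a scratch directory (the directory layout and “.conf” names are just examples); only the file containing “foo” is edited and backed up:

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir sub
printf 'foo here\n'  > a.conf
printf 'no match\n'  > sub/b.conf
# grep -lr lists only files containing "foo"; perl edits and backs up just those
perl -i.bak -pe 's/foo/bar/' `grep -lr foo *`
ls -1 sub    # only b.conf -- no stray b.conf.bak
```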
Commit this one to your mental toolbox. You’ll always have something better to do than editing swags of files manually.
Matt Parsons is a freelance Linux specialist who has designed, built and supported Unix and Linux systems in the finance, telecommunications and media industries.
He lives and works in London.