May 09 2013
 

It’s a common task when writing shell scripts to source a set of environment variables from an external configuration file. The advantage is that the dynamic values are abstracted out of the script itself, which can then be version controlled more safely. In shell, it’s a simple matter of assigning values to variables in the config file and then sourcing it from within the executable script with the dot (“.”) operator. Perl will not accept this syntax, but it’s often useful to re-use shell config files. Here’s a simple and intuitive way to do so.

Short Answer

Iterate over each line in the shell config file and parse it like this:

      $$1 = $2 if /^([^=]+)=(.+)$/;

Long Example

In this example there is a file called “auth.cfg” which contains a username and password, assigned to the variables USER and PASSWD respectively:

auth.cfg:
USER="backup"
PASSWD="password123"

The code displayed in the “short answer” above can be expanded more readably into this:

   #!/usr/bin/perl

   my $cfgFile = "auth.cfg";
   open(CFG, '<', $cfgFile) || die "Cannot open config file $cfgFile\n";
   while (<CFG>) {
      if (/^([^=]+)=(.+)$/) {
         my ($name, $val) = ($1, $2);
         $val =~ s/^"(.*)"$/$1/;   # strip surrounding quotes, if present
         $$name = $val;            # use the config key as the name of a Perl variable
      }
   }
   close(CFG);
   print "USER = $USER\n";
   print "PASSWD = $PASSWD\n";

The only catch is that the config file must specify exactly the variable names the script expects, otherwise those variables will be unassigned and the script’s behaviour could be unpredictable. Some form of error checking is therefore advised; anything more thorough than the check below is left as an exercise for the reader.
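
As a starting point, a minimal sanity check might verify that the expected variables were actually set (assuming, as in this example, that USER and PASSWD are the required keys):

   foreach my $var (qw(USER PASSWD)) {
      die "Config file did not define $var\n" unless defined $$var;
   }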



Aug 13 2012
 

I had to edit a Silverlight manifest file, an XML file that was uneditable because its character set was UTF-16 and therefore, as far as the usual utilities were concerned, it was not strictly text. Not having had much experience with file encodings, I had to do some research. Fortunately, I found this rather good article which explains the whole Unicode character set business.

I had to edit multiple such files, so a script was in order to automate the process, but the Unicode handling was something I’d never dealt with. Here’s how I got around it.

First, here’s how to discern what character set you’re dealing with:

  # file -bi filename
application/xml; charset=utf-16le
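
(If you would rather check from inside Perl than shell out to file, peeking at the byte-order mark is a rough-and-ready alternative. This is just a sketch, and $filename is a placeholder:)

  open(my $fh, '<:raw', $filename) || die "Couldn't open file: $!";
  read($fh, my $bom, 2);                                # read the first two bytes
  print "Looks like UTF-16LE\n" if $bom eq "\xFF\xFE";  # FF FE is the UTF-16LE byte-order mark
  close($fh);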

It turns out you can convert between encodings using iconv, but I was concerned that data could be lost in the conversion if some characters had to be ignored, particularly when I got errors like this:

  # iconv -f UTF16LE -t ascii tmp
iconv: illegal input sequence at position 0

After much Googling, I found this Perl code, written by Enrique Nell, with which to process the Silverlight XML manifest.

I needed to import the Perl Encode module to use its encode and decode functions:

  use Encode qw(decode);

  open(my $in, '<:raw', $in_path) || die "Couldn't open file: $!";
  my $text = do { local $/; <$in> };   # slurp the whole file as raw bytes
  $text = decode('UTF-16LE', $text);   # $text now holds native Perl characters

Otherwise, the code referenced above is all that was needed.
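
For completeness, the write half of the round trip is symmetrical: make your edits on the decoded text, then encode back to UTF-16LE before writing through a raw filehandle. A minimal sketch, where the substitution and $out_path are illustrative:

  use Encode qw(encode);

  $text =~ s/OldValue/NewValue/g;         # edit the decoded text
  open(my $out, '>:raw', $out_path) || die "Couldn't write file: $!";
  print $out encode('UTF-16LE', $text);   # re-encode before writing
  close($out);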



May 01 2012
 

Laziness is one of the hallmarks of a good sysadmin. The Father of Perl, Larry Wall, says so. An economy of keystrokes is the mark of a keen mind that needs to get as much done as possible.

From time to time, one needs to change a single piece of text in a lot of files, and no one wants to manually open, edit and save each file individually with vi at the command line. An example of this would be the cloning of a set of application configuration files from one host to another. But repeatedly changing a host-specific string is mind-numbing.

There are several ways to make repeated updates easily and safely, and they’re worth committing to memory in order to add them to your sysadmin toolbox. The best way, I believe, is with Perl.

  # perl -i.bak -pe 's/this/that/g' filename

This will edit the file “filename” and replace all occurrences of the text “this” with the text “that”. It will also back up the original by appending the extension “.bak”, leaving a copy called “filename.bak”.
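
Under the hood, the -p switch wraps the -e expression in a read-print loop over every line of the input files, so the one-liner behaves roughly like this sketch (minus the file-swapping that -i handles):

  while (<>) {
     s/this/that/g;   # apply the substitution to the current line
     print;           # -p prints each line after the expression has run
  }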

The backup extension works even if you specify the shell glob “*”, in which case it will search and replace in, and back up, every file in the directory. It makes a backup copy of every file whether it edits it or not, so this will double the number of files in the directory.

So if you wanted to economically perform a find-and-replace recursively on every file under a directory and all its subdirectories, something like this would suffice, although use it with caution:

  # cd DIR
  # perl -i.bak -pe 's/foo/bar/' `grep -lr foo *`

The grep returns a list of the files (with paths relative to the current directory) which contain the pattern “foo”, so the edit limits itself to matching files only, rather than unnecessarily littering your filesystem with extraneous backup copies.
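
If you would rather stay entirely inside Perl, the same filter-then-edit behaviour can be sketched with the core File::Find module. The pattern “foo”, replacement “bar” and directory “DIR” below are placeholders, and this is an illustration rather than a drop-in tool:

  #!/usr/bin/perl
  use File::Find;

  find(sub {
     return unless -f $_ && -T _;           # plain text files only
     open(my $fh, '<', $_) or return;
     my @lines = <$fh>;
     close($fh);
     return unless grep { /foo/ } @lines;   # skip files with no match
     rename($_, "$_.bak") or return;        # keep a backup, as -i.bak would
     open(my $out, '>', $_) or return;
     print $out map { (my $l = $_) =~ s/foo/bar/g; $l } @lines;
     close($out);
  }, 'DIR');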

Commit this one to your mental toolbox. You’ll always have something better to do than editing swags of files manually.


Matt Parsons is a freelance Linux specialist who has designed, built and supported Unix and Linux systems in the finance, telecommunications and media industries.

He lives and works in London.