I write one-liner shell scripts all the time. One of the things that has always bugged me is having to write a perl script just for its parsing capabilities when the input is just a 3-column text file. I've always found it difficult to parse lines in Bash when I need multiple values from each line. When there's only one value per line, I typically use "| while read line; do". This week, I made good friends with the 'eval' bash command. Try this little exercise. I always end up using grocery lists for this type of stuff, since they're easy to keep in your head. Create a file (e.g. values.txt) with the following text:

apples .24 50
oranges .54 21
grapes .05 100

Now run the command:

$ awk '{printf "fruit[%d]=%s; value[%d]=%s; quantity[%d]=%s;\n", NR,$1,NR,$2,NR,$3} END {printf "count=%s\n", NR}' values.txt
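Here's the same command in self-contained form (the printf just recreates values.txt so you can paste it anywhere), along with what it prints on the file above:

```shell
# Recreate the sample file, then run the awk command from above
printf 'apples .24 50\noranges .54 21\ngrapes .05 100\n' > values.txt
awk '{printf "fruit[%d]=%s; value[%d]=%s; quantity[%d]=%s;\n", NR,$1,NR,$2,NR,$3} END {printf "count=%s\n", NR}' values.txt
# fruit[1]=apples; value[1]=.24; quantity[1]=50;
# fruit[2]=oranges; value[2]=.54; quantity[2]=21;
# fruit[3]=grapes; value[3]=.05; quantity[3]=100;
# count=3
```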

This prints out some handy bash assignment operations. If you eval the result of this command, you can loop over the arrays with a for loop, like so:

$ eval `awk '{printf "fruit[%d]=%s; value[%d]=%s; quantity[%d]=%s;\n", NR,$1,NR,$2,NR,$3} END {printf "count=%s\n", NR}' values.txt`; for num in `seq 1 $count`; do echo ${fruit[$num]} will cost \$`echo ${value[$num]} \* ${quantity[$num]} | bc -l`; done

In the old days (a week ago) I would have had to resort to a perl script to parse each line, just for a little math. This one-liner (even though it's really long) uses only shell scripting tools that I use on a regular basis. I can crank this bad boy out a whole lot faster than its equivalent in perl.