Taco Bell Programming is a great concept: take simple, known elements and combine them to get what you need.
Here’s a concrete example: suppose you have millions of web pages that you want to download and save to disk for later processing. How do you do it? The cool-kids answer is to write a distributed crawler in Clojure and run it on EC2, handing out jobs with a message queue like SQS or ZeroMQ.
The Taco Bell answer? xargs and wget. In the rare case that you saturate the network connection, add some split and rsync. A “distributed crawler” is really only like 10 lines of shell script.
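Here's a minimal sketch of that script, assuming a file urls.txt with one URL per line (the file name and the pages/ output directory are stand-ins):

```sh
# Feed URLs to wget in parallel: xargs hands each wget 100 URLs (-n 100)
# and keeps 8 wget processes running at once (-P 8).
# --force-directories mirrors each URL's path under pages/.
xargs -n 100 -P 8 \
    wget --quiet --tries=2 --timeout=30 \
         --force-directories --directory-prefix=pages \
    < urls.txt
```

And if one box does saturate its link, the split-and-rsync version is more of the same (this uses GNU split's -n option; the crawl1..crawl4 hostnames are made up):

```sh
# Shard the URL list into 4 pieces without breaking lines,
# then ship one shard to each crawler box to run the loop above.
split -n l/4 urls.txt shard.
i=1
for f in shard.*; do
    rsync "$f" "crawl$i:urls.txt"
    i=$((i+1))
done
```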