How the 9 major tenets of the Unix/Linux philosophy affect you


The Linux Philosophy is not just a historical curiosity — it’s more relevant today than ever.

When my article, The impact of the Linux philosophy, was originally published on Opensource.com, it drew some really good comments. One was that another operating system has just as much capability on the command line as Linux does; that commenter said you could just add this software to get these features, and that package if you want those features. That only makes my point: with Linux, it is all built in. You do not have to go elsewhere to get access to the power of Linux.

Many people left comments stating that they could see how it might be nice to know the Linux philosophy as a historical curiosity, but that it had little or no meaning in the context of daily operations in a Linux environment. I beg to differ. Here’s why.

Nine major tenets

There are nine major tenets to the Unix/Linux philosophy. These are described in detail by Mike Gancarz in his book, Linux and the Unix Philosophy.

  1. Small is Beautiful
  2. Each Program Does One Thing Well
  3. Prototype as Soon as Possible
  4. Choose Portability Over Efficiency
  5. Store Data in Flat Text Files
  6. Use Software Leverage
  7. Use Shell Scripts to Increase Leverage and Portability
  8. Avoid Captive User Interfaces
  9. Make Every Program a Filter

The Linux philosophy also includes 10 lesser tenets and some corollaries, which are important in their own right.

Let’s look at this quick command line program as an example that encompasses most of these nine major tenets.

who | awk '{print $1}' | sort | uniq

This little command line program performs the very simple task of listing all users who are currently logged in, while listing each logged-in user only once. It also eliminates extraneous data produced by the who command.
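To make this concrete, here is what a hypothetical run might look like; the usernames, terminals, and times are invented for illustration, and the lines beginning with $ are the commands entered:

$ who
david    pts/0        2023-01-15 09:15
david    pts/1        2023-01-15 09:20
jsmith   pts/2        2023-01-15 10:02

$ who | awk '{print $1}' | sort | uniq
david
jsmith

The raw who output contains terminals, login times, and a duplicate entry for the user with two sessions; the pipeline reduces all of that to one name per user.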

Make every program a filter

Each of the commands that make up this command line program is a filter, or, as I like to say, a transformer. That is, each command takes an input, usually from Standard Input, "filters" the data stream by making some change to it (transforming it), and then sends the resulting data stream to Standard Output. Standard Input and Standard Output are known collectively as STDIO.

The who command generates an initial stream of data. Each following command changes that data stream in some manner, taking the Standard Input and sending the modified data to Standard Output for the next command to manipulate. This ability to join a series of commands into pipelines is what imbues the command line with such amazing power.
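Because every command in the pipeline reads Standard Input and writes Standard Output, the data stream can be inspected at any point. One way to do that, as a quick sketch, is to splice in the tee command, which copies the stream to a file while passing it through unchanged (the file name here is arbitrary):

who | awk '{print $1}' | tee /tmp/usernames.txt | sort | uniq

The final output is unchanged, and /tmp/usernames.txt now contains the intermediate, unsorted list of names for examination.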

Small is beautiful and Each program does one thing well

These two tenets go hand in hand. Each of the commands in this program is fairly small, and each performs a specific task. The sort command, for example, does only one thing: it sorts the data stream sent to it via Standard Input and sends the results to Standard Output. It can perform numeric, alphabetic, and alphanumeric sorts in forward and reverse order, but it does nothing else. It only sorts, and it is very, very good at that. Because it is very small, having only 2614 lines of code as shown in Figure 1, it is also very fast.

Command    Source lines
who                 755
awk                3412
sort               2614
uniq                302

Figure 1: Each of these powerful programs does one thing and does it well using only small amounts of code.
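A few of sort's standard options illustrate how much flexibility fits within that single task; for example:

sort /etc/passwd               # alphabetic sort of whole lines (the default)
sort -r /etc/passwd            # the same sort in reverse order
sort -t: -k3 -n /etc/passwd    # numeric sort on the third colon-separated field, the UID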

Choose portability over efficiency and Use shell scripts to increase leverage and portability

In the long run, the portability of shell scripts can be worth far more than the perceived efficiency of writing a program in a compiled language, even before considering the time required to compile and test such a program, because shell scripts can run on many otherwise incompatible systems.
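As a minimal sketch of what such portability looks like, a script that sticks to POSIX shell syntax and standard utilities will run unchanged on virtually any Unix-like system, regardless of the underlying hardware:

#!/bin/sh
# Count the unique logged-in users. Only POSIX shell syntax and
# standard utilities are used, so this script runs unchanged on
# virtually any Unix-like system.
count=$(who | awk '{print $1}' | sort -u | wc -l)
echo "Unique logged-in users: $count"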

For example, I worked on the email system at the State of North Carolina for several years. I was responsible for a collection of Perl and Bash CGI programs that ran on a Red Hat Linux host. These programs enabled remote users in all of our 100 counties, and in hundreds of large and small agencies, to perform account maintenance for their users. I was asked to test the possibility of moving all of these programs to Red Hat running on an IBM Z-series mainframe.

I created a tarball from the existing programs, data, and configuration files, copied the tarball to a Red Hat instance on the mainframe, and extracted the files from it. I made one configuration change to the Apache httpd.conf file: I changed the IP address on which Apache listened. I started Apache, and everything worked as it was supposed to with no other changes required. This all took about ten minutes and is an excellent example of true portability.
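That single change was to the Listen directive in httpd.conf; it amounted to something like the following, with hypothetical addresses:

# Before, on the original Linux host:
Listen 192.168.10.5:80

# After, on the mainframe instance:
Listen 10.20.30.40:80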

Had those programs been compiled, I would have had to recompile them for the new architecture and remove any hardware-specific efficiencies that we might have programmed in.

Use software leverage

Software leverage means a couple of things to me. First, and in the context of this example, it means that by using four command line commands, we are leveraging the work of the programmers who created those commands, more than 7,000 lines of C code in all. That is code we do not have to create. We are leveraging the efforts of those other, under-appreciated programmers to accomplish the task we have set for ourselves.

Another aspect of software leverage is that good programmers write good code and great programmers borrow good code. Never rewrite code that has already been written.

One of the great advantages of Open Source software at all levels, from the kernel, through the GNU and other utilities, all the way up to complex applications, is that there is an incredibly large amount of well-written and tested source code out there that can do almost anything you want to do. Just find the pieces you need and include them in your own code. I reuse my own code many times over in different programs, and I also use a lot of code written by other folks when it meets my needs. Either way, it saves me a lot of work and keeps me from having to reinvent perfectly good code.
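In shell scripting, one common way to practice this kind of reuse is to keep tested functions in a library file and source it from each new script. A minimal sketch, with a hypothetical library path and function name:

#!/bin/sh
# mylib.sh is a hypothetical library of shell functions that were
# written and tested for earlier scripts.
. /usr/local/lib/mylib.sh

# Reuse the library's (hypothetical) log_message function instead
# of rewriting that logic here.
log_message "Nightly backup starting"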

Avoid captive user interfaces

A captive user interface (CUI) is one that commandeers the terminal session when it's started. It's not possible to communicate directly with the shell until you exit the program. The text editors Vim and Emacs are both excellent examples of CUIs, and the Midnight Commander file manager is another. You start each one and work within its interface until you're finished. That's not to say these aren't excellent programs; I use them every day to do the mundane tasks I must perform as a SysAdmin.

The command line program we wrote above consists of four small programs connected by pipes. We can easily add or remove programs in the pipeline, and if we need to do something that no one has ever considered before, we can rearrange the pipeline to do it. We're not constrained by the limits imposed by a captive user interface.
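For example, counting the unique users, or excluding a particular account, requires nothing more than adding or changing one filter in the pipeline:

who | awk '{print $1}' | sort | uniq | wc -l              # count the unique logged-in users
who | awk '{print $1}' | grep -v '^root$' | sort | uniq   # the same list, omitting root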

Impact

This article is not meant to be a programming tutorial. Rather, it is intended to illustrate how the Linux Philosophy impacts and informs the daily work of system administrators and developers.

We are the beneficiaries of decades of code that was well-designed, well-thought-out, and well-written by people who had a lot of skin in the game and actually knew what they were doing. The best code on the planet was written using these tenets.

The GNU Utilities alone represent a huge investment of time and effort by Richard Stallman and other programmers to provide open, free Unix utilities to anyone who wanted them. These GNU utilities were incorporated into the original Linux distribution by Linus Torvalds. Together, they constitute GNU/Linux and provide us with an operating system and utilities that are incredibly powerful and useful.

GNU/Linux gives us the free and open tools that enable us to perform incredibly complex and creative tasks, many of which cannot be performed with any other tools. Many of these small, "do one thing well" GNU utilities are designed around STDIO so that they can be strung together in ways that could never be imagined by the programmers of large, monolithic utilities. Indeed, the authors of the GNU utilities themselves could not have foreseen the virtually infinite combinations in which their programs would be used to perform unanticipated tasks. Yet these utilities, whose designs date back more than 50 years to the original Unix tools, are still in heavy daily use on computers around the world.


References

  1. Eric S. Raymond: The Art of Unix Programming, http://www.catb.org/~esr/writings/taoup/html/index.html
  2. Mike Gancarz: Linux and the Unix Philosophy, Digital Press, 2003, ISBN 1-55558-273-7
  3. Wikipedia: Unix philosophy, http://en.wikipedia.org/wiki/Unix_philosophy
  4. Oregon State University: Linux philosophy lecture slides, https://web.engr.oregonstate.edu/~traylor/ece474/beamer_lectures/linux_philosophy.pdf