So what is Unix?
Unix, or more correctly UNIX, was developed at AT&T Bell Labs by Ken Thompson, Dennis Ritchie, Douglas McIlroy, Joe Ossanna, and others around 1970. In the early 1970s there was no such thing as a personal computer, so universities and big companies tried to make the most of their hardware. One way was to let more than one user access a single room-sized computer at once. This pushed the development team to make Unix a multi-user, multi-tasking operating system (OS).
In the 1970s software was a small part of the cost of the computer you were using. It was common for the OS, source code included, to be handed over along with the hardware, something that would later change. It was in this environment that AT&T licensed Unix, source code and all, to government bodies as well as commercial firms. The year was 1974, just one year before the first ever PC hit the market. It would take only a few years before the computing industry was turned on its head and software became a commodity in its own right.
Over time new features were added and the source code was modified. Two main forks of the original AT&T code developed: the AT&T version, System V, and the University of California, Berkeley version, the Berkeley Software Distribution (BSD). Most current Unix and Unix-like OSes can be considered descendants of these two forks.
When personal computers came out in the late 1970s, Unix was never an option: it demanded too much of their hardware. OSes that were simple in design and had less functionality dominated this low-spec market, which meant that porting Unix to PCs would have to wait until PC processing power and memory grew enough to allow it. CP/M and then DOS took the lead. Both borrowed ideas from Unix and other minicomputer OSes, but heavily cut down to suit the PC specifications of the day.
At the same time, people who worked on Unix systems at work, or used them while studying at uni, wanted the benefits of Unix on their home PCs. This frustration led people to work on ports of their own. FreeBSD and Linux were very much part of this push, which happened just as cheap processors and memory chips capable of running Unix were finally arriving. It was the 1990s, and the time of PC Unix was dawning.
It's hard for us now to understand just what an effort it took for Unix to make it to the PC. Nowadays it's easy to run Unix on a standard PC, and even very low-spec machines will run it. More impressive still, embedded systems abound with Unix OSes, from mobile phones to modems. Times have changed since the mid 1990s. Apple has even proven with Mac OS X that, with a bit of effort, Unix is just as usable as any other system.
Big servers still mostly run Unix (the exceptions being specially designed OSes in extreme cases), so knowledge gained on a PC running Unix can be used across the whole board, from micro PCs to room-sized computers.
What Makes Up A Unix Distribution
In Unix, everything is a file. A good example of this is the /dev directory, which contains the files that represent "devices": from hard drives to mice, there is a file matching each hardware device. Files are stored in directories, which are arranged in a tree, or hierarchy, of levels. These levels are separated by the / (forward slash) character. A / used on its own denotes the root, or master, directory: the one from which all other directories branch off. The one directory that rules them all.
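The "everything is a file" idea can be seen directly from the shell. A quick sketch (the device names below exist on virtually every Unix, though the full listing of /dev varies by system):

```shell
# Device nodes live under /dev and behave like ordinary files.
# The leading "c" in the listing marks them as character devices.
ls -l /dev/null /dev/zero

# A path names a file by walking the tree from the root "/":
# /home/alice/notes.txt = notes.txt, inside alice, inside home, inside /.

# Writing to a device file uses the same syntax as writing to any file;
# /dev/null simply discards whatever it is given.
echo "discarded" > /dev/null
```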
On all Unix systems there is a special place for users' files, often called /home or, in Mac OS X's case, /Users. Another point to make here is that Unix is case sensitive. That means "Users" is a different name from "users".
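Case sensitivity is easy to demonstrate at the shell. A small sketch, run in a throw-away directory (note that Mac OS X's default filesystem is case-preserving but case-insensitive, so there the two names would collide):

```shell
cd "$(mktemp -d)"    # scratch directory so we do not clutter anything

# On a case-sensitive filesystem these are two distinct files:
touch users Users
ls -1                # lists both names separately

# File tests see them as different names too:
[ -f users ] && [ -f Users ] && echo "two distinct files"
```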
The command line is by far the dominant interface under Unix, and in a lot of ways this is why Unix has been so flexible. Simple text configuration files mean that a Unix distribution can be shaved down to its key components. Not needing a GUI for configuration takes less memory, yet the system can still be managed without a complicated setup.
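Plain-text configuration means ordinary text tools are all you need to inspect or change settings. A minimal sketch (app.conf and its keys are made-up illustrations, not a real system file):

```shell
cd "$(mktemp -d)"

# Write a toy configuration file -- just lines of text.
cat > app.conf <<'EOF'
# lines starting with # are comments
port=8080
log_level=info
EOF

# Read one setting back with standard filters, no GUI required:
grep '^port=' app.conf | cut -d= -f2    # prints 8080
```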
In a lot of ways Unix is both a state of mind and an OS. There is the Unix Philosophy, which describes the ideal Unix, and then there are the implementations of Unix. In some cases the two can be the same; in others they might be totally different. It really depends on what works, or sometimes just on what the creators of the code have chosen to do.
Mike Gancarz's UNIX Philosophy:
9 Greater Tenets
- Small is beautiful.
- Make each program do one thing well.
- Build a prototype as soon as possible.
- Choose portability over efficiency.
- Store data in flat text files.
- Use software leverage to your advantage.
- Use shell scripts to increase leverage and portability.
- Avoid captive user interfaces.
- Make every program a filter.
10 Lesser Tenets
- Allow the user to tailor the environment.
- Make operating system kernels small and lightweight.
- Use lowercase and keep it short.
- Save trees.
- Silence is golden.
- Think parallel.
- The sum of the parts is greater than the whole.
- Look for the 90-percent solution.
- Worse is better.
- Think hierarchically.
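Several of these tenets (small single-purpose programs, flat text, every program a filter) show up whenever small tools are chained together with pipes. A sketch of the classic word-frequency pipeline:

```shell
# Each program does one small job; pipes glue them into a larger tool:
#   sort groups identical lines together, uniq -c counts each run,
#   sort -rn orders by count (highest first), head keeps the top three.
printf 'apple\nbanana\napple\ncherry\napple\nbanana\n' \
  | sort | uniq -c | sort -rn | head -3
```

None of the four filters knows about the others; each reads text in and writes text out, which is exactly what makes the combination possible.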
Unix and its variants have been around since 1970. Unix is a multi-tasking, multi-user OS, which means it can handle more than one person using it at a time and can make it look like it's doing multiple things at once. In the early years Unix was just one OS, but it has since multiplied and produced offspring that may break with both the philosophy and the implementation to such an extent that they are not considered Unix or Unix-like variants (Cromix and Minix are examples of this).
Here, Unix will always refer to an OS that has at its core a command-line interface, from which it can be configured, and a hierarchical file system.
Unless stated otherwise Content of this page is licensed under Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License