Unix Client Server Technology
Understanding the Role of Unix
When discussing client-server technology it is important to understand UNIX, one of the first building blocks of networked computing and one whose influence continues today. UNIX is an operating system designed for multi-tasking by multiple users, and that design alone suggests the gains in productivity that networking through UNIX made possible. An operating system (OS) is a computer’s central program; every computer has one.
At their most general, operating systems perform basic functions such as recognizing keyboard input and sending output to the display screen. They manage data such as disk files and directories so that it can be manipulated, and they manage peripheral devices such as keyboards, printers, and disk drives. Larger operating systems have more responsibilities: they enforce security, controlling access through authorization, and they make sure that different users running programs simultaneously don’t interfere with each other.
The primary way users interact with operating systems is through commands, which the command processor or command line interpreter executes. The most popular contemporary way of interacting with an operating system, however, is through a graphical user interface (GUI), which allows the user to point and click on objects displayed on the monitor screen to input commands.
There are several types of operating systems: multi-user; multi-processing; multi-tasking; multi-threading; and real-time. A multi-user OS allows two or more users, sometimes thousands, to run programs simultaneously. Multi-processing OSs support the running of a single program on multiple CPUs, or central processing units (the brains of the computer). Multi-tasking OSs allow more than one program to run at the same time without the programs interfering with each other. Multi-threading allows parts of a single program to run simultaneously. Real-time OSs respond to input immediately. UNIX, as we will see, is a multi-user, multi-tasking system; it is not, however, a real-time OS.
The operating system also serves as a software platform on top of which application programs are written and run. The operating system you choose therefore determines which applications your computer can run.
This article will focus on the UNIX operating system, its components and functions, and its impact on and contributions to client-server technology.
General Overview and Structure
UNIX is a multi-user, multi-tasking operating system originally designed to be used primarily by programmers. It is the leading OS for workstations and servers (it is less popular on personal computers) because of its flexibility and portability. Flexibility allows UNIX to adapt easily to new demands; portability allows it to run on many kinds of hardware. UNIX was one of the first OSs written in the high-level C programming language, which increased its portability, since any machine with a C compiler could run UNIX. Compilers are programs that translate source code into object code, so the same UNIX source could be rebuilt for many different machines.
Most high-level programming languages have compilers developed for them, and the C language has several. This portability made UNIX popular with academic and government institutions during its early implementation. The combination of UNIX and client-server technology has been central to the evolution of networked, rather than individual, computing and to the development of the Internet.
UNIX has several identifying characteristics. For example, UNIX stores data in plain, unformatted, human-readable text such as ASCII (American Standard Code for Information Interchange), which is based on the English alphabet. It has a hierarchical file system, which organizes files in directories in an upside-down tree with the ‘root’ or primary directory at the top; files are the leaves, and the directories connecting them are the branches. UNIX environments also treat inter-process communication (IPC) devices as files.
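The treatment of IPC devices as files can be seen directly from the shell. The sketch below, a minimal example (the scratch directory comes from mktemp and is purely illustrative), creates a named pipe, an IPC device that lives in the file system, and passes a message through it just as if it were an ordinary file:

```shell
# Create a scratch directory and a named pipe (FIFO) inside it.
dir=$(mktemp -d)
mkfifo "$dir/fifo"

# A writer and a reader communicate through the FIFO using ordinary
# file redirection; the writer blocks until a reader opens the pipe.
echo "hello through a file" > "$dir/fifo" &
cat "$dir/fifo"     # prints: hello through a file

rm -r "$dir"        # clean up the scratch directory
```

Because the FIFO appears in the file system, any program that can read and write files can use it for inter-process communication without special code.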
Finally, UNIX enables multiple programs to be strung together using a command line interpreter or UNIX shell. These are programs that process and execute text entered by users in the context of a specific operating system or programming language. Commands are connected through pipes, temporary software connections between two commands or programs that pass the output of one command as the input to the next.
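A pipeline of this kind can be typed at any shell prompt. In the small example below, the output of printf becomes the input of sort, whose sorted output in turn feeds head:

```shell
# Print three fruit names, sort them, and keep the first two alphabetically.
printf 'cherry\napple\nbanana\n' | sort | head -n 2
# Output:
# apple
# banana
```

Neither sort nor head needs to know where its input came from or where its output goes; the shell and the pipes handle the connections.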
UNIX structure is often described as a series of concentric circles: hardware at the center, the kernel around it, utilities surrounding and interacting with the kernel, and shells communicating with the utilities.
The UNIX OS consists of the kernel or master control program and the services described above. The kernel controls program start and stop functions, and handles low-level tasks shared by multiple programs, such as file system management.
Most importantly, the kernel schedules hardware access so that programs accessing the same application or device simultaneously do not interfere with each other. The kernel manages a system’s entire resources and represents them coherently to all users. It allocates memory to each process, transfers data between different parts of the machine, and interprets and executes shell instructions. This model, however, was most effective for linear input and output computing; as modern computing introduced event-driven computing through GUIs, micro-kernels were developed to perform tasks through smaller utilities and to move services such as network protocols outside the kernel.
After logging into a UNIX system, the user interacts directly through shell programs. Commands are entered at a prompt visible on the screen where the cursor is. The shell is the command line interpreter and provides the user interface for UNIX: it identifies the programs to be run and, through mechanisms such as filename expansion and data streams, connects them to the kernel and to each other. Command files, or shell scripts, are files of commands written in the shell’s own language; the shell communicates their commands to the kernel and returns the processed results to the display screen.
There are several shells that accomplish this function, including the Bourne shell, the C shell, the Korn shell, the TC shell, and the Bourne Again shell (bash). UNIX also allows programmers to make any program their shell simply by specifying it. The Bourne shell is the standard shell for UNIX, and every UNIX or UNIX-like system contains a shell that is compatible with it. Desktop environments such as GNOME and KDE act as visual shells that support GUIs.
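A minimal Bourne-compatible script shows the basics: a variable, a loop, and ordinary commands, written in the same language used interactively at the prompt (the greeting and names below are purely illustrative):

```shell
#!/bin/sh
# Greet a fixed list of users; this runs identically under any
# Bourne-compatible shell (sh, bash, ksh, dash, ...).
greeting="Welcome"
for name in alice bob; do
    echo "$greeting, $name"
done
# Output:
# Welcome, alice
# Welcome, bob
```

Because the scripting language and the interactive language are the same, any line of this script could also be typed directly at the prompt.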
Utilities provide the primary interface between the user and UNIX. UNIX offers hundreds of utilities, which are also known as ‘commands’. These utilities cover universal functions like program support, file maintenance, sorting, editing, printing, and connection to online information. The utilities are modular and can be combined through programming to accomplish more complex tasks. As UNIX developed, it also added TCP/IP utilities for communicating over the Internet.
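This modularity means that a task no single utility performs, such as finding the most frequent word in a piece of text, can be assembled from several standard ones. A sketch:

```shell
# Split the text into one word per line, group and count identical words,
# then sort by count and keep the most frequent entry.
printf 'the cat and the dog and the bird\n' |
    tr ' ' '\n' | sort | uniq -c | sort -rn | head -n 1
# Output (leading whitespace varies by system): 3 the
```

Each utility does one small job; the pipeline composes them into a word-frequency tool without any new program being written.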
The focus on text and binaries has also universalized UNIX and disseminated its influence to UNIX-like systems such as Linux. The text-based focus is likewise central to application-layer Internet protocols and formats like FTP (File Transfer Protocol), HTTP (Hypertext Transfer Protocol), and XML (Extensible Markup Language), which are central to client-server technology and to any technology that must communicate over the Internet.
Impact / Contributions
As previously stated, UNIX has been instrumental in the movement towards networking. The client-server model is the most common model for interacting with the Internet; indeed, the Internet itself is a client-server model. This section will examine how UNIX facilitates client-server models, which communicate over the Internet. To understand how UNIX functions as a fundamental component of client-server networking, we will use the popular UNIX-like operating system Linux as an example. Like UNIX, Linux supports TCP/IP as its network transport system. TCP/IP is the basic collection of protocols that allow two programs to communicate over a network, regardless of location.
Therefore, it is central to any communication within client-server models, which are organized around network communication, and it is one of the underlying technologies that make the Internet possible. BSD socket interfaces allow network-based applications to be constructed independently of the underlying communication facilities. BSD sockets are APIs (Application Program Interfaces) that were developed for UNIX at the University of California at Berkeley; they are also known as Berkeley sockets. They function as the standard for network sockets, since most programming languages can call C-language APIs, and C is also the language of UNIX. However, building networked programs directly in C with the BSD socket APIs can be labor-intensive, and two alternatives have since been developed: RPC (Remote Procedure Call) and CORBA (Common Object Request Broker Architecture). But UNIX did it first.
Another important innovation by UNIX, one that influenced the explosion of such profitable web-based industries as e-commerce, was making the command line interpreter a user-level program. The UNIX shell used the same language for interactive commands as it used for scripting, so new commands could be added without changing the shell’s structure. UNIX’s command line syntax provided a model for creating pipelines, chains of producer-consumer processes, of the kind that form the bulk of e-commerce processing. E-commerce sites follow client-server models, and UNIX is often their backbone.
UNIX can be used to program both clients and servers. Its natural inclination towards networking, through its TCP/IP support and its standard shell, makes it a foundational operating system for client-server models, in which computers must network with each other and with the Internet to create cost- and labor-effective environments.