Client Server Features
Web Server Introduction
Web servers are the machines or programs that receive and fulfill requests over the Internet. They provide one of the endpoints in communication when users request an online service.
Clients, or users, request data through their web browsers. These requests are delivered to the server in HTTP (HyperText Transfer Protocol). The server then locates or generates the requested resource and returns its response, also formatted according to HTTP, usually as HTML (HyperText Markup Language). HTML is the basic language used to write web pages. It allows for hypertext features, such as linking to other web pages, images, or other media (video, audio).
The two features common to all web servers are HTTP and logging. HTTP provides the protocol according to which requests are processed and responses are retrieved and formatted. These responses are usually delivered in the form of HTML documents. Logging is the recording and storing of information about client requests and server responses. This recorded data is stored in log files. These files can provide an audit trail to uncover the source of an error. They can also be analyzed to better understand and predict client and server behavior.
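The two universal features described above, the HTTP exchange and the log entry it produces, can be sketched with Python's standard library. The one-page handler below is purely illustrative; real servers write the log to files rather than to a list.

```python
import http.client
import http.server
import threading

LOG = []  # in-memory stand-in for a server log file

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every GET with a small HTML document.
        body = b"<html><body><h1>Hello</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        LOG.append(fmt % args)  # record each request for the audit trail

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: send an HTTP request and read the HTML response.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/index.html")
resp = conn.getresponse()
html = resp.read()
conn.close()
server.shutdown()
```

After the exchange, `resp.status` is 200, `html` contains the HTML body, and `LOG` holds one entry for the request.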
This article will describe some of the more practical web server features. It will also discuss performance parameters and server load limits. Finally, it will discuss some of the causes of server overload, how to detect it and how to prevent it.
Common Practical Features
Some of the practical features employed by many web servers include the following: configurability; authentication; inclusion of programs that handle and generate dynamic content; module support; HTTPS support; content compression; virtual hosts; large file support; and bandwidth throttling.
Configurability refers to “the ability of a system to be adapted to new requirements and operating environments without change to the fundamental structure of the software.” (Ian Sommerville, 2006) This implies the ability of a system or program to rearrange or customize attributes and features depending on user requirements. Many web servers offer configuration files or external user interfaces to enhance adaptability. Configuration files are read at startup by operating systems and applications to define the operating environment. External user interfaces provide the combination of tools through which users interact with the server, including screen design, menus, and command language; they too can increase a server’s configurability.
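Reading a configuration file at startup can be sketched with Python's `configparser`; the section and option names below are illustrative, not taken from any real server.

```python
import configparser

# Hypothetical server configuration, as it might appear in a file
# read once at startup.
text = """
[server]
port = 8080
document_root = /var/www/html
"""

config = configparser.ConfigParser()
config.read_string(text)

# The server adapts its behavior to these values without code changes.
port = config.getint("server", "port")
root = config.get("server", "document_root")
```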
Servers also use a variety of authentication techniques to control access to web resources. These techniques include login requests that verify the user’s identity through a registered username and password. Servers also include authorization components, often as part of their operating system. Where authentication verifies who a user is, authorization determines which of the server’s data and devices that user may access.
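One common username-and-password scheme, HTTP Basic authentication, can be sketched as follows. The user name, password, and fixed salt are hypothetical demo values; real systems store a random salt per user.

```python
import base64
import hashlib
import hmac

# Store a salted hash of the password, never the password itself.
def hash_password(password, salt):
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

SALT = b"fixed-demo-salt"  # demo only; use a random per-user salt in practice
USERS = {"alice": hash_password("s3cret", SALT)}

def authenticate(header_value):
    # Parse an "Authorization: Basic <base64(user:password)>" header value.
    scheme, _, encoded = header_value.partition(" ")
    if scheme != "Basic":
        return False
    user, _, password = base64.b64decode(encoded).decode().partition(":")
    stored = USERS.get(user)
    return stored is not None and hmac.compare_digest(
        stored, hash_password(password, SALT))

token = base64.b64encode(b"alice:s3cret").decode()
ok = authenticate("Basic " + token)
```

The constant-time comparison (`hmac.compare_digest`) avoids leaking information through timing differences.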
Web servers must also provide module support. Modules are smaller parts of a program that carry out specific tasks; a program is made up of individual modules that are then linked, and each module contains one or more routines that perform a specific task. Module support allows server capabilities to be expanded by adding or modifying modules linked to the server software. It also allows the core server to load modules dynamically based on user specifications.
Servers also often provide HTTPS support. HTTPS refers to a secure Internet connection. Using https in the URL indicates HTTP use with a different default TCP (Transmission Control Protocol) port. This default port (443 instead of the standard port 80) functions like a regular port; that is, in networks it refers to a specific endpoint of a connection. Port numbers designate function: for example, port 80 is used for plain HTTP traffic. HTTPS uses a different port (443) and adds a layer of encryption and authentication between HTTP and TCP. This is used for secure transactions, such as payments.
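The scheme-to-port mapping can be sketched in a few lines; this is a simplified illustration, not a full URL resolver.

```python
from urllib.parse import urlsplit

# The URL scheme selects the default TCP port: 80 for http, 443 for https.
def default_port(url):
    parts = urlsplit(url)
    if parts.port is not None:      # an explicit :port in the URL wins
        return parts.port
    return {"http": 80, "https": 443}[parts.scheme]
```

So `default_port("https://example.com/pay")` is 443, while a plain `http://` URL resolves to 80 unless the URL names a port explicitly.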
Content compression is employed by web servers to reduce the size of responses, thereby reducing bandwidth usage. Data compression stores more data in fewer bits by encoding it, using an encoding understood by both the browser and the server. Compression is achieved through, for example, ‘gzip’ encoding. Gzip is free software used for file compression.
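A small gzip sketch shows the saving on repetitive HTML; the page content is illustrative.

```python
import gzip

# A repetitive HTML page compresses well. The server would send the
# compressed bytes with a "Content-Encoding: gzip" response header.
html = b"<html><body>" + b"<p>hello</p>" * 200 + b"</body></html>"
compressed = gzip.compress(html)

# The browser decompresses transparently before rendering.
restored = gzip.decompress(compressed)
```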
Virtual hosting is a shared web hosting service. Web hosting services allow individuals or organizations, such as governments or corporations, to make their web sites accessible through the World Wide Web. They supply space on a server owned by a hosting company, along with data centers that provide Internet connectivity. Virtual hosting is a method by which a web server hosts more than one domain name on the same computer at the same IP address, making hosting more economical.
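Name-based virtual hosting can be sketched as a lookup from the request's Host header to a document root; the domain names and paths below are hypothetical.

```python
# One server, one IP address, several domain names: the Host header
# sent by the browser selects the site to serve.
VHOSTS = {
    "www.alpha.example": "/srv/alpha",
    "www.beta.example": "/srv/beta",
}

def document_root(host_header, default="/srv/default"):
    host = host_header.split(":")[0].lower()  # drop any :port suffix
    return VHOSTS.get(host, default)
```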
Some of the other features a web server might offer are large file support and bandwidth throttling. Large file support means that the server must be able to deliver files bigger than 2 GB, even on 32-bit operating systems. Bandwidth throttling limits the amount of data that can be transmitted over a certain amount of time. This improves quality of service by preventing server crashes and stabilizing the transfer of data.
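Bandwidth throttling is often implemented with a token bucket; the sketch below is a simplified, deterministic version in which time is passed in explicitly, and the rates and sizes are illustrative.

```python
# Token bucket: tokens accumulate at `rate` bytes per second up to
# `capacity`; sending n bytes consumes n tokens, or the send is deferred.
class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate            # refill rate, bytes per second
        self.capacity = capacity    # burst ceiling, bytes
        self.tokens = capacity
        self.last = 0.0

    def allow(self, n, now):
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False

bucket = TokenBucket(rate=1000, capacity=1000)  # cap at ~1000 bytes/second
```

With this bucket, an 800-byte send succeeds immediately, a second 800-byte send at the same instant is refused, and it succeeds again after one second of refill.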
Client Server Performance Parameters
Web server programs are expected to serve multiple requests simultaneously over various TCP/IP connections. Client loads vary, and so do the requests per client. Taking that into consideration, the performance parameters of web servers include the following: the number and type of requests handled per second; latency, measuring in milliseconds how long it takes to complete each new connection or request; and throughput, the amount of data transmitted in response to a request, measured in bytes per second. Throughput depends on, among other things, file size and available network bandwidth. Performance is also determined by concurrency level, the server’s ability to give multiple users simultaneous access to specific files, in this case those that make up web pages. Finally, the server model used to execute web server programs (for example, process-based, thread-based, or event-driven) determines scalability. Scalability is a system property that refers to a system or network’s ability to manage increasing workloads well and to expand gracefully.
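The two headline metrics, latency and throughput, can be sketched around a stand-in request handler; the handler and its millisecond of simulated work are hypothetical.

```python
import time

def handle_request():
    time.sleep(0.001)        # stand-in for the work of building a response
    return b"x" * 4096       # a 4 KB response body

start = time.perf_counter()
body = handle_request()
elapsed = time.perf_counter() - start

latency_ms = elapsed * 1000          # time to complete the request
throughput = len(body) / elapsed     # bytes delivered per second
```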
Load Limits: Causes and How to Avoid Overload
Web servers have load limits because they can support only a fixed number of concurrent client connections (each identified by a client IP address and port) and can process only a certain number of requests per second; when the system becomes overloaded, it becomes unresponsive. Load limits are determined by the server’s settings, whether the content retrieved is static or dynamic, the HTTP request types, and whether or not the content is cached. Cached content is a duplicate of original data, stored where it is easier to access than the original, which might be difficult or time-consuming to retrieve. Caching reduces the need to re-compute or re-fetch frequently accessed data, which reduces the load and increases the server’s speed.
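Response caching can be sketched with a memoized render function; the page renderer below is hypothetical.

```python
import functools

calls = 0  # counts how often the expensive render actually runs

@functools.lru_cache(maxsize=128)
def render_page(path):
    # Hypothetical expensive page render, computed at most once per path.
    global calls
    calls += 1
    return f"<html><body>page for {path}</body></html>"

first = render_page("/index.html")
second = render_page("/index.html")  # served from the cache, no recompute
```

Both calls return the same document, but the render ran only once: the second request cost the server nothing but a cache lookup.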
Overload can be caused by a variety of factors, from too much traffic at one point in time to computer worms generating excessive traffic. The partial unavailability of poorly maintained servers in need of upgrades or maintenance is another cause of overload.
In order to prevent overload, network traffic must be managed through a variety of methods. Web sites protect themselves through the use of firewalls. Firewalls are security devices that block traffic and permit connections only within the parameters of the company’s or individual’s IT security policy; they shield against bad IP sources or bad IP patterns. HTTP request managers redirect or rewrite requests with bad HTTP patterns. Bandwidth management and traffic shaping help stabilize network usage. Bandwidth management measures and controls the amount of data, or traffic, that travels over a network link; overloading links, or even filling them to capacity, affects performance and causes network congestion. Traffic shaping helps control traffic to optimize server performance, as defined by low latency (speed) and low bandwidth use (economy). By classifying requests and enforcing policies, traffic shaping mechanisms reduce network congestion and improve quality of service (QoS).
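A firewall-style source filter can be sketched as a check against blocked networks; the address range below is a documentation example, not a real policy.

```python
import ipaddress

# Requests from blocked networks are rejected before reaching the
# application. 203.0.113.0/24 is a documentation range used for examples.
BLOCKED = [ipaddress.ip_network("203.0.113.0/24")]

def allowed(source_ip):
    addr = ipaddress.ip_address(source_ip)
    return not any(addr in net for net in BLOCKED)
```

A real firewall applies many such rules, on ports and protocols as well as addresses, but the shape of the decision is the same.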