CST426 - CLIENT SERVER ARCHITECTURE (Dept. of Comp. Sc., MLM College of Engineering)
Module – 1 (Introduction)
Client/Server Architecture:
The client/server architecture is based on hardware and software components that interact to form
a system. This system includes three main components:
1. Clients
2. Servers
3. Communication middleware
Client Server Computing: In client server computing, the clients request a resource and the server
provides that resource. A server may serve multiple clients at the same time while a client is in
contact with only one server. Both the client and server usually communicate via a computer
network but sometimes they may reside in the same system.
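To make the request/response pattern concrete, below is a minimal sketch in Python using only the standard socket and threading modules. The host address, port number, and message format are illustrative assumptions, not part of any particular system.

```python
# Minimal client/server sketch: one server serves many clients
# concurrently; each client is in contact with only one server.
import socket
import threading

HOST, PORT = "127.0.0.1", 5000   # server and client may share one machine

def handle_client(conn: socket.socket) -> None:
    """Serve one client: receive its request and provide the resource."""
    with conn:
        request = conn.recv(1024)                  # client's request
        conn.sendall(b"resource for " + request)   # server's response

def server() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        while True:                                # one server, many clients
            conn, _addr = srv.accept()
            threading.Thread(target=handle_client, args=(conn,),
                             daemon=True).start()

def client(name: bytes) -> bytes:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))                  # a client contacts one server
        cli.sendall(name)
        return cli.recv(1024)
```

Running server() in one process and client(b"report.pdf") in several others would show a single server handling multiple clients at the same time.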
• All the required data is concentrated in a single place, i.e. the server, so it is easy to protect the data and provide authorisation and authentication.
• The server need not be located physically close to the clients, yet the data can be accessed efficiently.
• It is easy to replace, upgrade or relocate the nodes in the client/server model because all the nodes are independent and request data only from the server.
• All the nodes, i.e. clients and the server, need not be built on similar platforms, yet they can easily facilitate the transfer of data.
The general forces that drive the move to client/server computing are:
• The demand for end user productivity gains based on the efficient use of data resources.
1. GUI: A GUI builder that supports multiple interfaces. A Graphical User Interface (GUI) provides the user with a graphical means to interact with the system, and can be a combination of both hardware and software. Through the GUI, the user interacts with the software. A GUI typically consumes more resources than a CLI (command-line interface). With advancing technology, programmers and designers create complex GUI designs that work with greater efficiency, accuracy and speed.
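As a minimal illustration, the sketch below builds a two-widget GUI with Python's standard tkinter toolkit; the window title and label text are arbitrary choices for the example.

```python
# Minimal GUI sketch: the user interacts with the software through
# graphical widgets (a label and a button) instead of typed commands.
import tkinter as tk

root = tk.Tk()
root.title("GUI example")

label = tk.Label(root, text="Click the button")
label.pack(padx=20, pady=10)

def on_click() -> None:
    label.config(text="Button clicked")   # the GUI reacts to user input

tk.Button(root, text="Click me", command=on_click).pack(pady=10)
root.mainloop()                           # hand control to the event loop
```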
5. Debugging tools: Debugging is the process of finding and fixing errors or bugs in the source
code of any software. When software does not work as expected, computer programmers
study the code to determine why any errors occurred. Bugs and errors happen in computer
programming because it is an abstract and conceptual activity. Computers manipulate data
in the form of electronic signals. Programming languages abstract this information so
humans can interact with computers more efficiently. Any type of software has several
layers of abstraction, with different components communicating for an application to work
correctly. When errors occur, finding and resolving the issue can be challenging. Debugging
tools and strategies help to fix problems faster and improve developer productivity. As a
result, both software quality and the end-user experience improve.
Error identification
Developers, testers, and end-users report bugs they discover while testing or using the
software. Developers then locate the exact line of code or the code module causing the bug. This can
be a tedious and time-consuming process.
Error analysis
Coders analyse the error by recording all program state changes and data values. They also
prioritize the bug fix based on its impact on software functionality. The software team also
identifies a timeline for bug fixing depending on development goals and requirements.
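A small sketch of these two activities, assuming Python: the standard logging module records program state changes and data values for error analysis, and the built-in pdb debugger pauses execution at the point where the bug is suspected. The average function is purely hypothetical.

```python
# Debugging sketch: logging for error analysis, pdb for error identification.
import logging
import pdb

logging.basicConfig(level=logging.DEBUG)

def average(values):
    # Error analysis: record the program state and data values.
    logging.debug("average() called with values=%r", values)
    if not values:
        # Error identification: pause here and inspect interactively why
        # the input is empty (breakpoint() does the same in Python 3.7+).
        pdb.set_trace()
    total = sum(values)
    logging.debug("intermediate total=%r", total)
    return total / len(values)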
Client/Server security
Client/server security uses various authorization methods to make sure that only valid users and
programs have access to information resources such as databases.
• Access control mechanisms must be set up to ensure that properly authenticated users are
allowed access only to those resources that they are entitled to use.
• Such mechanisms include password protection, encrypted smart cards, biometrics, and firewalls.
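The sketch below illustrates password-based authentication followed by an authorization check, using only Python's standard library. The user table, salt handling, and resource names are simplified assumptions; a production system would store per-user salts and use a dedicated password-hashing library.

```python
# Authentication (who are you?) and authorization (what may you use?).
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 makes brute-force guessing expensive.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

SALT = os.urandom(16)                                  # demo-only shared salt
USERS = {"alice": hash_password("correct horse battery staple", SALT)}
PERMISSIONS = {"alice": {"sales_db"}}                  # entitled resources

def authenticate(user: str, password: str) -> bool:
    stored = USERS.get(user)
    if stored is None:
        return False
    # Constant-time comparison resists timing attacks.
    return hmac.compare_digest(stored, hash_password(password, SALT))

def authorize(user: str, resource: str) -> bool:
    return resource in PERMISSIONS.get(user, set())
```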
Desktops are the front-end system devices, the ones that deal most directly with user input. They
are also the least secure environments in client-server models. Clients connect to servers and these
connections, if left open or unsecured, provide entry points for hackers and other intruders who
may use data for nefarious purposes. Aside from physical client security in the form of disk drive
locks or diskless workstations that prohibit the loading of unauthorized software or viruses,
accessibility to all files stored on a workstation operating system is the other gaping security hole in
clients.
For example, the machine assumes that whoever turns on the computer is the owner of all the files
stored on it. They even have access to configuration files. This could result in sabotage or the leaking
of sensitive data. The transmission of corrupted data may also occur on the level of the operating
system, outside the realm of client-server application security, as data is transferred to different
tiers of the architecture.
However, the primary culprits of breaching client security are not hackers or viruses, but the users
themselves. The front-line of defence in client security is user identification and authentication. The
easiest way to gain illegal access to computers is to get users’ login ID and passwords. Sometimes
users pick short or easily guessed passwords or share their passwords with others.
Password management provides a security measure for this by requiring a minimum number of
characters in passwords, checking passwords for guessability, and regularly asking users to change
their passwords. For example, more organizations are adopting policies of 'passphrases' rather than
passwords, since passphrases are longer and harder to identify or guess.
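A sketch of such password-management checks in Python; the minimum length and the tiny common-password list are illustrative assumptions only.

```python
# Password policy sketch: length check, guessability check,
# and a nudge toward multi-word passphrases.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def check_password(candidate: str, min_length: int = 12) -> list[str]:
    """Return a list of policy violations; an empty list means acceptable."""
    problems = []
    if len(candidate) < min_length:
        problems.append(f"shorter than {min_length} characters")
    if candidate.lower() in COMMON_PASSWORDS:
        problems.append("found in common-password list")
    if " " not in candidate and len(candidate) < 20:
        problems.append("consider a multi-word passphrase instead")
    return problems

print(check_password("letmein"))                        # three violations
print(check_password("correct horse battery staple"))   # passes
```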
According to 'networkworld.com', which conducts tests on new technology to compile its 'best
products' issue, the best client-side security product is McAfee's Secure Web Gateway. In its testing,
it deflected most spyware attacks. The system contains a scheme that proactively detects and blocks
spyware, and it is updated daily. Gateways are nodes on a network that create entrances to other
networks and route traffic from workstations to broader networks. Therefore, securing the gateways
will prevent malware from ever reaching the client. OmniQuad's AntiSpy Enterprise also won high
marks from 'networkworld.com' testers.
Security is often thought of only in terms of protecting software. However, any security plan should
be implemented hierarchically at every level. Servers must be located in secure, access-controlled
environments, and only authorized personnel should be allowed to supervise and administer them.
Essentially, server security is the controlling of access to the database server itself. The server must
be attached to a stable power supply that provides backup power if there is a problem with the
supply. This enables the server to shut down in a way that protects data and causes the least
amount of damage. Servers should also comply with business standards in password policy to protect
database access.
Encryption also protects data through advanced DES (Data Encryption Standard) mechanisms or
cryptograms. The degree of encryption depends on government standards. Database servers should
not be visible to the world. (Web servers, however, are, and require specific security measures,
since they support anonymous connections.)
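DES itself is now considered too weak for new systems; as a hedged, modern illustration of the same idea, this sketch uses the third-party cryptography package (an assumption, not mentioned in the text), whose Fernet recipe is AES-based.

```python
# Symmetric encryption sketch using the `cryptography` package
# (pip install cryptography). Fernet is AES-based; DES, named in the
# text above, is obsolete and shown here only by analogy.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # the key must be stored and shared securely
cipher = Fernet(key)

token = cipher.encrypt(b"sensitive customer record")   # ciphertext
plain = cipher.decrypt(token)                          # back to plaintext
assert plain == b"sensitive customer record"
```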
For security and performance issues, the database backend should never be on the same machine as
the web server with its open connections. To secure the database, the server should be configured
to accept only trusted IP addresses. If the database is a backend for a web server, the IP address of
the web server should be the only address that can access the database server. Another security gap
in servers emerges from increasingly dynamic applications that allow on-line upgrades and can
infiltrate the database server.
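A minimal sketch of the trusted-IP idea in Python. In practice this filtering is usually done by a firewall or the database server's own configuration (e.g. an allowlist file); the addresses and port below are hypothetical.

```python
# Accept connections only from trusted IP addresses; everything else
# is rejected at accept time.
import socket

TRUSTED = {"10.0.0.5"}             # hypothetical web server address

def serve_trusted(port: int = 5432) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("0.0.0.0", port))
        srv.listen()
        while True:
            conn, (ip, _port) = srv.accept()
            if ip not in TRUSTED:
                conn.close()       # drop untrusted sources immediately
                continue
            # ... handle the trusted connection here ...
            conn.close()
```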
Networks are vulnerable to intruders who ‘sniff’ or eavesdrop on networks that can contain sensitive
company information, passwords, and other potential company weaknesses. Secure networks
should conform to four principles that form a 'trusted computing base' (TCB). These are:
1) Identification and authorization,
2) Discretionary control,
3) Audit, and
4) Object reuse.
Identification determines the user’s identity. The user is then authenticated through a password or
the completion of a registration form or some other access-controlling barrier. Authentication also
ensures the identity stays consistent across time. Authorization defines what the user is allowed to
do, what processes users have access to. Discretionary access control (DAC) is a security system that
gives users, processes, and devices specified permissions to gain access to system resources in
clearly defined ways.
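A minimal sketch of DAC as described above: each resource carries an access-control list mapping users to permitted operations. The resource and user names are hypothetical.

```python
# Discretionary access control: per-resource permissions granted
# to specific users in clearly defined ways.
ACL = {
    "payroll.db": {"alice": {"read", "write"}, "bob": {"read"}},
    "config.ini": {"alice": {"read"}},
}

def allowed(user: str, resource: str, operation: str) -> bool:
    return operation in ACL.get(resource, {}).get(user, set())

assert allowed("bob", "payroll.db", "read")        # granted
assert not allowed("bob", "payroll.db", "write")   # denied
```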
Audits are systematic evaluations of the security of a company’s information systems. Audits
examine the most secure physical configuration of hardware and software connections, how
information is handled, and user practices. Object reuse concerns a storage medium that has
contained one or more objects: network security is protected by ensuring that all residual data from
previous objects is removed before the storage can be reassigned.
Organizational Expectations
Among the many advantages client/server systems provide are cooperative processing and flexible
end-user applications. Other features that make client/server systems more convenient for users are
system openness, the use of a common database and the presence of a database management system.
Organizations, with their often-disparate computer platforms, are especially well-positioned to
benefit from the flexibility and adaptability offered by the Client/Server infrastructure.
Client/Server computing opens the door to previously unavailable corporate data. End users can
manipulate and analyse such data on an ad hoc basis by means of the hardware and the software
tools that are commonly available with client server environments. Quick and reliable information
access enables end users to make intelligent decisions. Consequently, end users are more likely to
perform their jobs better, provide better services, and become more productive within the
corporation.
Organizations that face problems with their internal data management typically favour the
introduction of Client/Server computing. Providing data access is just the first step in information
management. Providing the right data to the right people at the right time is the core of decision
support for MIS departments. As competitive conditions change, so do the companies’ internal
structure, thus triggering demands for information systems that reflect those changes. Client/Server
tools such as Lotus Notes are designed exclusively to provide corporations with data and forms
distribution, and work group support, without regard to geographical boundaries. These workgroup
tools are used to route the forms and data to the appropriate end users and coordinate employee
work. The existence and effective use of such tools allows companies to re-engineer their
operational processes, effectively changing the way they do business.
New strategic opportunities are likely to be identified as organizations restructure. By making use of
such opportunities, organizations enhance their ability to compete by increasing market share
through the provision of unique products or services. Proper information management is crucial
within such a dynamic competitive arena. Therefore, improved information management provided
by a Client/Server system means that such systems could become effective corporate strategic
weapons.
As new and better services are provided, customer satisfaction is likely to improve. Client/ Server
systems enable the corporate MIS manager to locate data closer to the source of data demand, thus
increasing the efficiency with which customer enquiries are handled.
(i) Offload work to server: Database and communications processing are frequently offloaded to a
faster server processor. Some applications processing also may be offloaded, particularly for a
complex process, which is required by many users. The advantage of offloading is realized when the
processing power of the server is significantly greater than that of the client workstation. Separate
processors best support shared databases or specialized communications interfaces. Thus, the client
workstation is available to handle other client tasks. These advantages are best realized when the
client workstation supports multitasking or at least easy and rapid task switching.
(ii) Reduce total execution time: The server can perform database searches, extensive calculations,
and stored procedure execution in parallel while the client workstation deals directly with the
current user needs. Several servers can be used together, each performing a specific function.
Servers may be multiprocessors with shared memory, which enables programs to overlap the LAN
functions and database search functions. In general, the increased power of the server enables it to
perform its functions faster than the client workstation. In order for this approach to reduce the
total elapsed time, the additional time required to transmit the request over the network to the
server must be less than the saving. High-speed local area network topologies operating at 4, 10, 16,
or 100Mbps (megabits per second) provide high-speed communications to manage the extra traffic
in less time than the savings realized from the server. The time to transmit the request to the server,
execute the request, and transmit the result to the requestor, must be less than the time to perform
the entire transaction on the client workstation.
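The rule in this paragraph is just an inequality: offloading pays off only when transmit time plus server execution time plus reply time is less than the time to run the whole transaction on the client. A toy Python check, with purely illustrative timings:

```python
# Offloading pays off when t_send + t_server + t_return < t_client.
def offload_pays_off(t_send: float, t_server: float, t_return: float,
                     t_client: float) -> bool:
    return t_send + t_server + t_return < t_client

# e.g. 5 ms each way on a fast LAN plus 40 ms on a fast server,
# versus 200 ms for the whole transaction on the client workstation:
print(offload_pays_off(0.005, 0.040, 0.005, 0.200))   # True: offload wins
```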
(iii) Use a multitasking client: As workstation users become more sophisticated, the capability to be
simultaneously involved in multiple processes becomes attractive. Independent tasks can be
activated to manage communications processes, such as electronic mail, electronic feeds from news
media and the stock exchange, and remote data collection (downloading from remote servers).
Personal productivity applications, such as word processors, spreadsheets, and presentation
graphics, can be active. Several of these applications can be dynamically linked together to provide
the desktop information-processing environment. Functions such as Dynamic Data Exchange (DDE)
and Object Linking and Embedding (OLE) permit including spreadsheets dynamically into word-
processed documents. These links can be hot so that changes in the spreadsheet cause the word-
processed document to be updated, or they can be cut and paste so that the current status of the
spreadsheet is copied into the word-processed document.
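A minimal sketch of a multitasking client in Python: a daemon thread stands in for a background communications task (such as an electronic mail feed) while the main thread remains free for the user's current work. The polling loop and messages are illustrative only.

```python
# Multitasking client sketch: background communications task plus
# foreground user work, running concurrently.
import threading
import time

def mail_feed() -> None:
    while True:
        time.sleep(2)                      # poll interval (illustrative)
        print("background: new mail checked")

threading.Thread(target=mail_feed, daemon=True).start()

for step in range(3):                      # foreground work continues
    print(f"foreground: user task step {step}")
    time.sleep(1)
```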
Systems developers appreciate the capability to create, compile, link, and test programs in parallel.
The complexity introduced by the integrated CASE environment requires multiple processes to be
simultaneously active so the workstation need not be dedicated to a single long-running function.
Effective use of modern CASE tools and workstation development products requires a client
workstation that supports multitasking.
Rapid changes have occurred in computer technology, resulting in systems of increased capabilities,
and organizations must be prepared to absorb all these new products. For the organizations using
Client/Server systems the environment is heterogeneous, whereas the users'
prime concern is to achieve maximum functionality. Every Client/Server system should give equal
importance to the developers’ and users’ requirements. For the users, this means the realization of a
single-system-image. “A single system-image is the illusion, created by software or hardware, that
presents a collection of resources as one, more powerful resource.” SSI makes the system appear
like a single machine to the user, to applications, and to the network. With it all network resources
present themselves to every user in the same way from every workstation and can be used
transparently after the user has authenticated himself/herself once. The user environment, with a
desktop and often-used tools such as editors and mailers, is also organized in a uniform way. The
workstation on the desk appears to provide all these services. In such an environment the user need
not bother about how the processors (both the client and the server) are working, where data
storage takes place, or which networking scheme has been selected to build the system.
But as more companies follow the trend towards downsized Client/Server networks, some find the
promise elusive. Security, scalability and administration costs are three of the key issues. For
example, the simple addition of a new user can require the definition to be added to every server in
the network. Some of the visible benefits due to single-system image are as given below:
• Facilitates process migration across workstations transparently along with load balancing.
Downsizing: The downward migration of business applications, often from mainframes to PCs, is
driven by the low cost of workstations; today's workstations are as powerful as last decade's
mainframes. As a result, clients gain computing power for less money, the system provides better
performance, and the organization has the flexibility to make other purchases or to increase overall
benefits.
Rightsizing: Moves Client/Server applications to the most appropriate server platform; in that case
servers from different vendors can co-exist, and the network is known as the 'system'. Getting the
data from the system no longer refers to a single mainframe. As a matter of fact, we probably don't
know where the server physically resides.
Upsizing: The bottom-up trend of
networking all the stand-alone PCs and workstations at the department or work group level. Early
LANs were implemented to share hardware (printers, scanners, etc.). But now LANs are being
implemented to share data and applications in addition to hardware. Mainframes are being replaced
by less expensive PCs on networks. This is called computer downsizing. Companies implementing
business process reengineering are downsizing organizationally. This is called business downsizing.
All this would result in hundreds of smaller systems, all communicating to each other and serving the
need of local teams as well as individuals working in an organization. This is called cultural
downsizing. The net result is distributed computer systems that support decentralized decision-
making. This is the client/server revolution of the nineties.
The centralized network has complete leverage to control the processes and activities.