WEB & DATABASE SECURITY Final-1-13 Unit-1
UNIT – I: The Web Security, the Web Security Problem, Risk Analysis and Best Practices
Cryptography and the Web: Cryptography and Web Security, Working Cryptographic Systems
and Protocols, Legal Restrictions on Cryptography, Digital Identification
UNIT – II: The Web's War on Your Privacy, Privacy-Protecting Techniques, Backups and
Antitheft, Web Server Security, Physical Security for Servers, Host Security for Servers,
Securing Web Applications
UNIT – III: Database Security: Recent Advances in Access Control, Access Control Models for
XML, Database Issues in Trust Management and Trust Negotiation, Security in Data
Warehouses and OLAP Systems
UNIT – IV: Security Re-engineering for Databases: Concepts and Techniques, Database
Watermarking for Copyright Protection, Trustworthy Records Retention, Damage Quarantine
and Recovery in Data Processing Systems, Hippocratic Databases: Current Capabilities and Future Trends
UNIT – V: Privacy in Database Publishing: A Bayesian Perspective, Privacy-
enhanced Location-based Access Control, Efficiently Enforcing the Security and Privacy Policies
in a Mobile Environment.
Text Books:
1. Web Security, Privacy and Commerce, Simson Garfinkel, Gene Spafford, O'Reilly.
2. Handbook of Database Security: Applications and Trends, Michael Gertz, Sushil Jajodia
Reference Books:
1. Michael Gertz and Sushil Jajodia (Editors), Handbook of Database Security: Applications
and Trends, ISBN-10: 0387485325, Springer, 2007.
2. Osama S. Faragallah, El-Sayed M. El-Rabaie, Fathi E. Abd El-Samie, Ahmed I. Sallam,
and Hala S. El-Sayed, Multilevel Security for Relational Databases.
3. Bhavani Thuraisingham, Database and Applications Security: Integrating Information
Security and Data Management, CRC Press, Taylor & Francis Group, 2005.
Course Outcomes:
1. To understand the concepts of web security and cryptographic system.
2. To learn privacy protection techniques and web server security concepts.
3. To understand access control models in XML, web server security and security in data
warehouses.
4. To learn the techniques and concepts of re-engineering security for databases.
5. To explore trends in database publishing and security in mobile environments.
INDEX
S. No. Unit Topic Page No
1. I INTRODUCTION TO DATABASE SECURITY 1
2. I RISK ANALYSIS 2
3. I WEB CRYPTOGRAPHY AND ITS SECURITY 3
4. I WORKING CRYPTOGRAPHIC SYSTEMS AND 5
PROTOCOLS
5. I LEGAL RESTRICTIONS ON CRYPTOGRAPHY, 9
DIGITAL IDENTIFICATION
6. II PRIVACY AND PRIVACY PROTECTING 16
TECHNIQUES
7. II BACKUPS AND ANTITHEFT, WEB SERVER 28
SECURITY
8. II PHYSICAL SECURITY OF THE SERVERS 30
9. II HOST SECURITY FOR SERVERS 32
10. II SECURING WEB APPLICATIONS 36
11. III RECENT ADVANCES IN ACCESS CONTROL 40
12. III ACCESS CONTROL MODELS FOR XML 42
13. III DB ISSUES IN TRUST MANAGEMENT AND TRUST 46
NEGOTIATION
14. III SECURITY IN DATA WAREHOUSE SYSTEMS AND 50
OLAP SECURITY
15. IV SECURITY RE-ENGINEERING FOR DBS: 54
CONCEPTS AND TECHNIQUES
16. IV DB WATERMARKING AND COPYRIGHT 58
PROTECTION
17. IV TRR AND DQR IN DATA PROCESSING SYSTEMS 62
18. IV HIPPOCRATIC DATABASE 64
19. V FUTURE TRENDS IN DATABASE PUBLISHING – 66
BAYESIAN PERSPECTIVE
20. V PRIVACY ENHANCED LBAC 70
21. V EFFICIENTLY ENFORCING THE SECURITY AND 72
PRIVACY POLICIES IN MOBILE ENVIRONMENT
UNIT-1
Introduction to Database Security
Database security refers to the range of tools, controls, and measures designed to establish and preserve
database confidentiality, integrity, and availability. This article focuses primarily on confidentiality, since
it is the element that is compromised in most data breaches.
Database security is a complex and challenging endeavor that involves all aspects of information security
technologies and practices. It is also naturally at odds with database usability: the more accessible and usable
the database, the more vulnerable it is to security threats; the more invulnerable the database is to threats, the
more difficult it is to access and use. (This paradox is sometimes referred to as Anderson's Rule.)
Why is it important?
By definition, a data breach is a failure to maintain the confidentiality of data in a database. How much harm a
data breach inflicts on your enterprise depends on a number of consequences or factors:
Compromised intellectual property: Your intellectual property—trade secrets, inventions, and proprietary
practices—may be critical to your ability to maintain a competitive advantage in your market. If that
intellectual property is stolen or exposed, your competitive advantage may be difficult or impossible to
maintain or recover.
Damage to brand reputation: Customers or partners may be unwilling to buy your products or services (or
do business with your company) if they don't feel they can trust you to protect your data or theirs.
Business continuity (or lack thereof): Some businesses cannot continue to operate until a breach is resolved.
Fines or penalties for non-compliance: The financial impact of failing to comply with global regulations
such as the Sarbanes-Oxley Act (SOX) or the Payment Card Industry Data Security Standard (PCI DSS),
industry-specific data privacy regulations such as HIPAA, or regional data privacy regulations such as
Europe's General Data Protection Regulation (GDPR) can be devastating, with fines in the worst cases
exceeding several million dollars per violation.
Costs of repairing breaches and notifying customers: In addition to the cost of communicating a breach to
customers, a breached organization must pay for forensic and investigative activities, crisis management,
triage, repair of the affected systems, and more.
Risk Analysis:
A security risk assessment identifies, assesses, and implements key security controls in applications. It
also focuses on preventing application security defects and vulnerabilities. Carrying out a risk
assessment allows an organization to view the application portfolio holistically, from an attacker's
perspective. It supports managers in making informed decisions about resource allocation, tooling, and
security control implementation. Thus, conducting an assessment is an integral part of an organization's risk
management process.
Factors such as size, growth rate, resources, and asset portfolio affect the depth of risk assessment
models. Organizations can carry out generalized assessments when experiencing budget or time
constraints. However, generalized assessments don't necessarily provide the detailed mappings between
assets, associated threats, identified risks, impact, and mitigating controls.
If generalized assessment results don't provide enough of a correlation between these areas, a more
in-depth assessment is necessary.
1. Identification. Determine all critical assets of the technology infrastructure. Next, identify sensitive
data that is created, stored, or transmitted by these assets. Create a risk profile for each.
2. Assessment. Administer an approach to assess the identified security risks for critical assets. After
careful evaluation and assessment, determine how to effectively and efficiently allocate time and
resources towards risk mitigation. The assessment approach or methodology must analyze the
correlation between assets, threats, vulnerabilities, and mitigating controls.
3. Mitigation. Define a mitigation approach and enforce security controls for each risk.
4. Prevention. Implement tools and processes to minimize threats and vulnerabilities from occurring in
your firm's resources.
Identify assets (e.g., network, servers, applications, data centers, tools, etc.) within the organization.
Create risk profiles for each asset.
Understand what data is stored, transmitted, and generated by these assets.
Assess asset criticality with regard to business operations. This includes the overall impact to revenue
and reputation, and the likelihood of the asset's exploitation.
Measure the risk ranking for assets and prioritize them for assessment.
Apply mitigating controls for each asset based on assessment results.
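The assessment and prioritization steps above can be sketched as a simple scoring model. The asset names, the 1-5 likelihood and impact scales, and the risk_score helper are illustrative assumptions, not part of any standard methodology:

```python
# Illustrative risk-ranking sketch: risk = likelihood x impact.
# The assets, the 1-5 scales, and the scoring rule are hypothetical examples.

def risk_score(likelihood: int, impact: int) -> int:
    """Simple qualitative risk score on a 1-25 scale."""
    return likelihood * impact

# Hypothetical asset risk profiles: (asset, likelihood, impact)
profiles = [
    ("public web server", 4, 4),
    ("customer database", 3, 5),
    ("internal wiki",     2, 2),
]

# Rank assets by score, highest risk first, to prioritize mitigation effort.
ranked = sorted(profiles, key=lambda p: risk_score(p[1], p[2]), reverse=True)

for asset, likelihood, impact in ranked:
    print(f"{asset}: {risk_score(likelihood, impact)}")
```

A real methodology would also map each asset to its threats, vulnerabilities, and mitigating controls, as the assessment step requires; this sketch covers only the ranking.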
Cryptography and Web Security:
Increasingly, systems that employ cryptographic techniques are used to control access to computer
systems and to sign digital messages. Cryptographic systems have also been devised to allow the
anonymous exchange of digital money and even to facilitate fair and unforgeable online voting.
Authentication:
Digital signatures can be used to identify a participant in a web transaction or the author of an email
message; people who receive a message that is signed by a digital signature can use it to verify the
identity of the signer. Digital signatures can be used in conjunction with passwords and biometrics or
as an alternative to them.
Authorization:
Whereas authentication is used to determine the identity of a participant, authorization techniques
are used to determine if that individual is authorized to engage in a particular transaction.
Cryptographic techniques can be used to distribute a list of authorized users that is all but
impossible to falsify.
Confidentiality:
Encryption is used to scramble information sent over networks and stored on servers so that
eavesdroppers cannot access the data's content. Some people call this quality "privacy," but most
professionals reserve that word for referring to the protection of personal information (whether
confidential or not) from aggregation and improper use.
Integrity:
Integrity methods are used to verify that a message has not been modified while in transit. Often, this is
done with digitally signed message digest codes.
Current practices, however, dictate that it is better to use algorithms that are specifically designed to
assure integrity for this purpose, rather than relying on integrity as a byproduct of other algorithms.
Using separate algorithms allows finer control of the underlying processes. Using separate
algorithms for confidentiality, authentication, and integrity also minimizes the impact of any legal
restrictions that apply to cryptography, because these restrictions are usually aimed at confidentiality
but not other cryptographic practices.
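As a concrete illustration of an integrity-specific algorithm, the following sketch uses Python's standard hmac and hashlib modules (HMAC-SHA256) to compute and verify a keyed message digest; the key and message contents are placeholder values:

```python
import hashlib
import hmac

# Shared secret key (placeholder); in practice it is distributed securely.
key = b"shared-secret-key"
message = b"Transfer $10 to Alice"

# Sender computes a keyed digest (HMAC-SHA256) and sends it with the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the digest and compares in constant time.
def verify(key: bytes, message: bytes, tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)                           # unmodified message passes
assert not verify(key, b"Transfer $9999 to Mallory", tag)  # tampered message fails
```

Note that hmac.compare_digest is used instead of == to avoid leaking information through comparison timing.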
Nonrepudiation means adding assurance mechanisms to verify the identity and intent of the user.
This is needed so the user cannot claim, after the fact, that she did not actually conduct the
transaction. This claim may be phrased as a denial that the activity was ever conducted, or it may be
a claim that someone else was using her account. Although nonrepudiation is often listed as one of
the advantages of public key technology, the "nonrepudiation" provided by this technology is not
true nonrepudiation. Public key technology can prove that a certain private key was used to create a
digital signature, but it cannot prove the intent of the key's user.
"Nonrepudiation," as the term is commonly used by cryptographers, analysts, and even lawmakers, is
simply not possible. Even if the cryptography is perfect, the person's computer might be infected with
a virus that causes it to behave in a manner other than what's intended. Smart cards and biometrics do
not solve the nonrepudiation problem either: you might insert your smart card into a reader, thinking
you are signing an electronic check to subscribe to a magazine, only to discover that a hostile ActiveX
control has noticed the insertion and used your smart card to authorize the transfer of $1,000 out of
your bank account. Or a crook may force your signature at gunpoint. People can always repudiate
something that a computer has done on their behalf.
WORKING CRYPTOGRAPHIC SYSTEMS AND PROTOCOLS:
A cryptographic system is a collection of software and hardware that can encrypt or decrypt information. A
typical cryptographic system is the combination of a desktop computer, a web browser, a remote web server,
and the computer on which the web server is running.
Cryptographic protocols and algorithms are difficult to get right, so do not create your own. Instead, where
you can, use protocols and algorithms that are widely used, heavily analyzed, and accepted as secure. When
you must create anything, give the approach wide public review and make sure that professional security
analysts examine it for problems. In particular, do not create your own encryption algorithms unless you are
an expert in cryptology, know what you're doing, and plan to spend years in professional review of the
algorithm. Creating encryption algorithms (that are any good) is a task for experts only.
A number of algorithms are patented; even if the owners permit "free use" at the moment, without a signed
contract they can always change their minds later, putting you at extreme risk. In general, avoid all
patented algorithms: in most cases there's an unpatented approach that is at least as good technically,
and by using it you avoid a large number of legal problems.
Another complication is that many countries regulate or restrict cryptography in some way. A survey of legal
issues is available at the "Crypto Law Survey" site, http://rechten.kub.nl/koops/cryptolaw/.
Often, your software should provide a way to reject "too small" keys, and let the user set what "too
small" means. For RSA keys, 512 bits is too small for use. There is increasing evidence that 1024 bits for RSA
keys is not enough either; Bernstein has suggested techniques that simplify brute-forcing RSA, and other
work based on it (such as Shamir and Tromer's "Factoring Large Numbers with the TWIRL Device") now
suggests that 1024-bit keys can be broken in a year by a $10 million device. You may want to make 2048
bits the minimum for RSA if you really want a secure system.
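A minimal sketch of the "reject too-small keys" advice above, with a user-settable minimum; the function name and defaults are illustrative assumptions, not any library's API:

```python
# Configurable minimum key-size check, following the advice above:
# 512-bit RSA is broken, 1024-bit is doubtful, 2048-bit is a reasonable floor.

DEFAULT_MIN_RSA_BITS = 2048

def accept_rsa_key(bits: int, minimum: int = DEFAULT_MIN_RSA_BITS) -> bool:
    """Reject RSA keys smaller than the configured minimum size."""
    return bits >= minimum

assert not accept_rsa_key(512)             # always too small
assert not accept_rsa_key(1024)            # below the 2048-bit default
assert accept_rsa_key(2048)                # meets the default floor
assert accept_rsa_key(1024, minimum=1024)  # user explicitly lowered the minimum
```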
Cryptographic Protocols:
When you need a security protocol, try to use standard-conforming protocols such as IPSec, SSL (soon to be
TLS), SSH, S/MIME, OpenPGP/GnuPG/PGP, and Kerberos. Each has advantages and disadvantages; many
of them overlap somewhat in functionality, but each tends to be used in different areas:
Internet Protocol Security (IPSec). IPSec provides encryption and/or authentication at the IP
packet level. However, IPSec is often used in a way that only guarantees authenticity of two
communicating hosts, not of the users. As a practical matter, IPSec usually requires low-level
support from the operating system (which not all implement) and an additional keyring server that
must be configured. Since IPSec can be used as a "tunnel" to secure packets belonging to multiple
users and multiple hosts, it is especially useful for building a Virtual Private Network (VPN) and
connecting a remote machine. As of this time, it is much less often used to secure communication
from individual clients to servers. The new version of the Internet Protocol, IPv6, comes with
IPSec "built in," but IPSec also works with the more common IPv4 protocol. Note that if you use
IPSec, don't use the encryption mode without the authentication, because the authentication also acts
as integrity protection.
Secure Socket Layer (SSL) / TLS. SSL/TLS works over TCP and tunnels other protocols using
TCP, adding encryption, authentication of the server, and optional authentication of the client (but
authenticating clients using SSL/TLS requires that clients have configured X.509 client certificates,
something rarely done). SSL version 3 is widely used; TLS is a later adjustment to SSL that
strengthens its security and improves its flexibility. Currently there is a slow transition going on from
SSLv3 to TLS, aided because implementations can easily try to use TLS and then back off to SSLv3
without user intervention.
Unfortunately, a few bad SSLv3 implementations cause problems with the backoff, so you may need
a preferences setting to allow users to skip using TLS if necessary. Don't use SSL version 2; it has
some serious security weaknesses.
SSL/TLS is the primary method for protecting http (web) transactions. Any time you use an https://
URL, you're using SSL/TLS. Other protocols that often use SSL/TLS include POP3 and IMAP.
SSL/TLS usually use a separate TCP/IP port number from the unsecured port, which the IETF is a
little unhappy about (because it consumes twice as many ports; there are solutions to this). SSL is
relatively easy to use in programs, because most library implementations allow programmers to use
operations similar to the operations on standard sockets like SSL_connect(), SSL_write(),
SSL_read(), etc. A widely used OSS/FS implementation of SSL (as well as other capabilities) is
OpenSSL, available at http://www.openssl.org.
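The SSL_connect()/SSL_write()/SSL_read() calls above belong to OpenSSL's C API; in Python, a roughly equivalent client setup uses the standard ssl module. The following is a sketch assuming a modern Python (3.7+); no network connection is actually made, and the hostname in the commented-out section is a placeholder:

```python
import socket  # used only in the commented-out connection example below
import ssl

# Build a client-side TLS context with certificate verification enabled.
# create_default_context() loads the system CA certificates and sets sane defaults.
context = ssl.create_default_context()

# SSLv2/SSLv3, which the text warns against, are already rejected by modern
# Python; here we additionally raise the protocol floor to TLS 1.2.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Hostname checking and certificate validation are on by default:
assert context.check_hostname is True
assert context.verify_mode == ssl.CERT_REQUIRED

# To actually connect (not executed here), one would wrap a TCP socket:
# with socket.create_connection(("example.org", 443)) as sock:
#     with context.wrap_socket(sock, server_hostname="example.org") as tls:
#         tls.sendall(b"GET / HTTP/1.0\r\nHost: example.org\r\n\r\n")
```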
OpenPGP and S/MIME. There are two competing, essentially incompatible standards for securing
email: OpenPGP and S/MIME. OpenPGP is based on the PGP application; an OSS/FS
implementation is GNU Privacy Guard from http://www.gnupg.org. Currently, their certificates are
often not interchangeable; work is ongoing to repair this.
SSH. SSH is the primary method of securing "remote terminals" over an internet, and it also includes
methods for tunnelling X Windows sessions. However, it's been extended to support single sign-on
and general secure tunnelling for TCP streams, so it's often used for securing other data streams too
(such as CVS accesses). The most popular implementation of SSH is
OpenSSH http://www.openssh.com, which is OSS/FS. Typical use of SSH allows the client to
authenticate that the server is truly the server, after which the user enters a password to authenticate the
user (the password is encrypted and sent to the other system for verification). Current versions of
SSH can store private keys, allowing users to avoid entering the password each time. To prevent man-in-
the-middle attacks, SSH records keying information about servers it talks to; that means that typical
use of SSH is vulnerable to a man-in-the-middle attack during the very first connection, but it can
detect problems afterwards. In contrast, SSL generally uses a certificate authority, which eliminates
the first-connection problem but requires special setup (and payment!) to the certificate authority.
Kerberos. Kerberos is a protocol for single sign-on and authenticating users against a central
authentication and key distribution server. Kerberos works by giving authenticated users "tickets",
granting them access to various services on the network. When clients then contact servers, the
servers can verify the tickets. Kerberos is a primary method for securing and supporting
authentication on a LAN, and for establishing shared secrets (thus, it needs to be used with other
algorithms for the actual protection of communication). Note that to use Kerberos, both the client and
server have to include code to use it, and since not everyone has a Kerberos setup, this has to be
optional - complicating the use of Kerberos in some programs. However, Kerberos is widely used.
AES Standard    Key Size (bits)    Block Size (bits)    Number of Rounds
AES-128         128                128                  10
AES-192         192                128                  12
AES-256         256                128                  14
Digital Signature Algorithm:
Digital Signature Algorithm (DSA) is also a public key algorithm, used only for digitally signing
documents. This scheme is suitable for achieving authentication before a message or document is shared
(Forouzan, 2011). On receiving a digitally signed document, the recipient becomes confident that the sender
was genuine and that the document was not altered during transmission. Digital signatures are applied
in software distribution, financial transactions, and for documents that might be tampered with. To verify the
document, the receiver performs the following steps:
1. Decrypts the digital signature using the sender's public key to recover the message digest.
2. Generates a message digest for the received message using the same algorithm used by the sender.
3. If the two message digests do not match, the message is considered to have been compromised.
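The digest-comparison part of the steps above can be sketched with the standard hashlib module. The public-key operation of step 1 is not in the Python standard library, so recovering the signed digest is simulated here by hashing the original document directly; the document contents are invented examples:

```python
import hashlib

# Sketch of steps 2-3 above: recompute the digest of the received document
# and compare it with the digest recovered from the signature. The public-key
# decryption of step 1 is simulated by hashing the original document.

original = b"Pay 100 EUR to Bob"                    # the document the sender signed
signed_digest = hashlib.sha256(original).digest()   # stands in for step 1

def digests_match(received_document: bytes, signed_digest: bytes) -> bool:
    """Step 2: recompute the digest; step 3: compare with the signed digest."""
    recomputed = hashlib.sha256(received_document).digest()
    return recomputed == signed_digest

assert digests_match(original, signed_digest)                    # untampered document
assert not digests_match(b"Pay 9000 EUR to Bob", signed_digest)  # altered document
```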
Hash functions:
Hash functions, or one-way functions, are used in public-key cryptography for implementing protocols
(Alawida et al., 2021). Hash functions do not need any key. They are easily computable but hard to
reverse. For example, f(x) can be computed easily, but computing x from f(x) would take many years,
even for all the computers of the world working collectively. The value of f(x) is a fixed-length hash value
computed from x, the plaintext. Neither the contents of the plaintext nor its length can be obtained from it.
Hash functions are used to verify the integrity of documents and for the encryption of passwords. Even a
small change in the contents can be easily detected, because the hash values of the two versions will be
completely different.
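The fixed-length output and the sensitivity to small changes described above can be demonstrated with the standard hashlib module (SHA-256 is an illustrative choice of hash function):

```python
import hashlib

# A one-character change in the plaintext produces a completely different digest.
h1 = hashlib.sha256(b"transfer 100").hexdigest()
h2 = hashlib.sha256(b"transfer 101").hexdigest()
assert h1 != h2

# The digest length is fixed (64 hex characters for SHA-256),
# regardless of the length of the input.
assert len(h1) == len(h2) == 64
assert len(hashlib.sha256(b"x" * 100_000).hexdigest()) == 64
```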
Cryptographic protocol:
Cryptography addresses the issues of integrity, authentication, privacy, and nonrepudiation. Cryptographic
algorithms on their own are of academic importance (Schneier, 2007); applying these algorithms alone cannot
guarantee that the goals of cryptography are achieved. Well-defined policies and agreements between the parties
involved in the communication are also required in order to make cryptography a reliable technology for
achieving its goals, so that it can solve real problems in completing online tasks between trusted parties.
A cryptographic protocol is a distributed algorithm designed to precisely describe the interactions between
two or more parties with the objective of implementing certain security policies. It follows some series of
steps in exact sequence. Every step must be completely executed without any alteration in the agreed-upon
sequence. It must be complete and able to finish a task. At least two parties are required. Any single party
executing a series of steps to complete a task is not a protocol. Every party must know, understand, and
follow it. They must not be able to do something beyond the specified agreement.
Arbitrated Protocols:
Arbitrated protocols use a trusted third party called an arbitrator. The arbitrator has no vested interest and
cannot favor any of the involved parties. Such protocols are used to complete tasks between two or more
parties not trusting each other.
Adjudicated Protocols:
Arbitrated protocols can be implemented with two sub-protocols to reduce the cost of third-party
involvement. A non-arbitrated protocol is used at the first level and is executed for each task. At the
second level, an arbitrated protocol is used, which is executed only when disputes occur between the
involved parties during the task.
Self-Enforcing Protocols:
These protocols require no arbitrator to complete tasks or to resolve disputes. The protocol itself ensures that
there is no dispute between the involved parties. One party can detect whenever the other party is trying to
play smart and the task is stopped immediately. It is ideal that every protocol should be self-enforcing.
Similar to the attacks on Cryptographic algorithms and techniques, protocols can also be attacked by the
cheaters.
Types of Protocols:
A key exchange protocol is required for two parties to reach agreement on a shared secret key. Either one
party can authenticate the other, or both parties can authenticate each other. The protocol can provide for the
generation of a random key: one party can generate the key and send it to the other party, or both parties can
participate in the key generation.
Such a protocol is used by the involved parties to agree on a shared key by exchanging messages through a
public channel in such a way that the key is not revealed to any unauthorized party. This protects only against
passive attacks.
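The kind of key exchange described above can be illustrated with a toy Diffie-Hellman exchange (a standard technique, though not named in the text). The small prime below is for illustration only; real deployments use standardized groups of 2048 bits or more (e.g. RFC 3526). As noted above, the unauthenticated exchange resists only passive eavesdroppers, not active attackers:

```python
import secrets

# Toy Diffie-Hellman key exchange over a public channel.
p = 2_147_483_647        # public prime modulus (2**31 - 1, toy-sized)
g = 5                    # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent (kept secret)
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent (kept secret)

A = pow(g, a, p)         # Alice sends A over the public channel
B = pow(g, b, p)         # Bob sends B over the public channel

# Each side combines its own secret with the other's public value;
# both compute g**(a*b) mod p, so the results agree.
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)

assert alice_key == bob_key   # shared secret, never sent on the wire
```

An eavesdropper sees only p, g, A, and B; recovering the shared key from these requires solving the discrete logarithm problem, which is infeasible at real-world key sizes.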
Identification protocols are required to ensure the identity of both parties when they are online for a task.
Genuine possession of their private keys needs to be verified. Identification by such protocols
may be judged at three levels: (1) who the user is, using biometrics; (2) what the user possesses, using
hardware tokens; (3) what the user knows, using secret keys or passwords. Some popular
protocols are the zero-knowledge protocol, the Schnorr protocol, the Guillou-Quisquater protocol,
witness-hiding identification protocols, etc.
In the absence of any digital signature scheme, the two parties can share a password, which is comparatively
less powerful.
Issues in Cryptography
In symmetric cryptography, if the key is lost, communication cannot be completed. This creates the issue of
secure key distribution, which may involve the sender and the receiver communicating directly,
via a trusted third party, or via an existing cryptographic medium (Sharma et al., 2021). The
issue of key distribution is to be dealt with delicately: keys must be stored, used, and destroyed
securely.
Cryptography only transforms plaintext; it never hides it (Rahmani et al., 2014). One weakness of
cryptography is that if any third party detects the presence of an encrypted message, it can make
attempts to break into it out of curiosity. As a consequence, the attacker may uncover the secret and
modify or misuse the information.
Legal Restrictions on Cryptography:
The legal landscape of cryptography is complex and constantly changing. In recent years the legal
restrictions on cryptography in the United States have largely eased, while the restrictions in other countries
have increased somewhat.
Patents applied to computer programs, frequently called software patents, have been accepted by the
computer industry over the past thirty years, some grudgingly and some with great zeal.
The doctrine of equivalence holds that if a new device operates in substantially the same way as a patented
device and produces substantially the same result, then the new device infringes the original patent. As a
result of this doctrine, which is one of the foundation principles of patent law, a program that implements a
patented encryption technique will violate that patent, even if the original patent was on a machine built
from discrete resistors, transistors, and other components. Thus, the advent of computers that were fast
enough to implement basic logic circuits in software, combined with the acceptance of patent law and
patents on electronic devices, assured that computer programs would also be the subject of patent law.
Although new encryption algorithms that are protected by patents will continue to be invented, a wide
variety of unpatented encryption algorithms now exist that are secure, fast, and widely accepted.
Furthermore, there is no cryptographic operation in the field of Internet commerce that requires the use of a
patented algorithm. As a result, it appears that the overwhelming influence of patent law in the fields of
cryptography and e-commerce has finally come to an end.
Until very recently, many nontechnical business leaders mistakenly believed that they could achieve
additional security for their encrypted data by keeping the encryption algorithms themselves secret. Many
companies boasted that their products featured proprietary encryption algorithms. These companies refused
to publish their algorithms, saying that publication would weaken the security enjoyed by their users.
Today, most security professionals agree that this rationale for keeping an encryption algorithm secret is
largely incorrect. That is, keeping an encryption algorithm secret does not significantly improve the security
that the algorithm affords. Indeed, in many cases, secrecy actually decreases the overall security of an
encryption algorithm.
There is a growing trend toward academic discourse on the topic of cryptographic algorithms. Significant
algorithms that are published are routinely studied, analyzed, and occasionally found to be lacking. As a
result of this process, many algorithms that were once trusted have been shown to have flaws. At the same
time, a few algorithms have survived the rigorous process of academic analysis. Some companies think they
can short-circuit this review process by keeping their algorithms secret. Other companies have used
algorithms that were secret but widely licensed as an attempt to gain market share and control. But
experience has shown that it is nearly impossible to keep the details of a successful encryption algorithm
secret. If the algorithm is widely used, then it will ultimately be distributed in a form that can be analyzed
and reverse-engineered.
Regulation of Cryptography by International and National Law:
In the past 50 years there has been a growing consensus among many governments of the world on the need
to regulate cryptographic technology. The original motivation for regulation was military. During World
War II, the ability to decipher the Nazi Enigma machine gave the Allied forces a tremendous advantage; Sir
Harry Hinsley estimated that the Allied "ULTRA" project shortened the war in the Atlantic, Mediterranean,
and Europe "by not less than two years and probably by four years." As a result of this experience,
military intelligence officials in the United States and the United Kingdom decided that they needed to
control the spread of strong encryption technology, lest these countries find themselves in a future war in
which they could not eavesdrop on the enemy's communications.
Export controls in the United States are enforced through the Defense Trade Regulations (formerly known as
the International Traffic in Arms Regulations, ITAR). In the 1980s, any company wishing to export a machine
or program that included cryptography needed a license from the U.S. government. Obtaining a license
could be a difficult, costly, and time-consuming process.
Following the 1992 compromise, the Clinton Administration made a series of proposals designed to allow
consumers the ability to use full-strength cryptography to secure their communications and stored data,
while still providing government officials relatively easy access to the plaintext, or the unencrypted data.
These proposals were all based on a technique called key escrow. The first of these proposals was the
administration's Escrowed Encryption Standard (EES), more commonly known as the Clipper chip.
In 1997, an ad hoc group of technologists and cryptographers issued a report detailing a number of specific
risks regarding all of the proposed key recovery, key escrow, and trusted third-party encryption schemes:
The potential for insider abuse: There is fundamentally no way to prevent the compromise of the system by
authorized individuals who abuse or misuse their positions. Users of a key recovery system must trust that
the individuals designing, implementing, and running the key recovery operation are indeed trustworthy. An
individual, or set of individuals, motivated by ideology, greed, or the threat of blackmail, may abuse the
authority given to them.
The creation of new vulnerabilities and targets for attack: Securing a communications or data storage
system is hard work; the key recovery systems proposed by the government would make the job of security
significantly harder because more systems would need to be secured to provide the same level of security.
Scaling might prevent the system from working at all: The envisioned key recovery system would have to
work with thousands of products from hundreds of vendors; it would have to work with key recovery agents
all over the world; it would have to accommodate tens of thousands of law enforcement agencies, tens of
millions of public-private key pairs, and hundreds of billions of recoverable session keys.
The difficulty of properly authenticating requests for keys: A functioning key recovery system would deal
with hundreds of requests for keys every week, coming from many different sources. How could all these
requests be properly authenticated?
The cost: Operating a key recovery system would be incredibly expensive. These costs include the cost of
designing products, engineering the key recovery center itself, actual operation costs of the center, and (we
hope) government oversight costs. Invariably, these costs would be passed along to the end users, who
would be further saddled with "both the expense of choosing, using, and managing key recovery systems
and the losses from lessened security and mistaken or fraudulent disclosures of sensitive data."
The Digital Millennium Copyright Act:
There is a huge and growing market in digital media. Pictures, e-books, music files, movies, and more
represent great effort and great value. Markets for these items are projected to be in the billions of dollars per
year. The Internet presents a great medium for the transmission, rental, sale, and display of this media.
However, because the bits making up these items can be copied repeatedly, it is possible that fraud and theft
can be committed easily by anyone with access to a copy of a digital item.
International agreements on the control of cryptographic software date back to the days of COCOM
(Coordinating Committee for Multilateral Export Controls), an international organization created to control
the export and spread of military and dual-use products and technical data.
The Council of Europe is a 41-member intergovernmental organization that deals with policy matters in
Europe not directly applicable to national law. On September 11, 1995, the Council of Europe adopted
Recommendation R (95) 13 Concerning Problems of Criminal Procedure Law Connected with Information
Technology, which stated, in part, that "measures should be considered to minimize the negative effects of
the use of cryptography on the investigation of criminal offenses, without affecting its legitimate use more
than is strictly necessary."