
UNIVERSITY OF MARYLAND
UNIVERSITY COLLEGE



Professor Sam Chaudhuri
Foundations of Information Technology
Course MSIT610 Section 9042

Safe As A House: Methods of Enhancing Security Using Information Technology

December 07, 2003

Robert D. Betterton &
Mark Thomson





ABSTRACT:

Security procedures have existed since the beginning of recorded history. With the relatively recent explosion in the capabilities of information technology, the discipline provides unique opportunities to improve security. This paper surveys current techniques in which information technology is being used to enhance security. Biometrics, cryptography, signal and emanation interception and jamming, frequency hopping, collection management, and predictive systems are considered, and the impacts associated with information-technology enabled or enhanced applications are assessed.

INTRODUCTION

Security in its broadest sense is the sum of all measures used to prevent an undesirable outcome. The undesirable outcome depends on the person using the security measure, and may range from preventing the interception of a message containing sensitive information, to identifying an unauthorized person in a secure facility, to using predictive techniques to guess where a determined adversary will strike next and preventing that strike.

Security measures can be broadly categorized as passive or active. Passive security measures are put in place to "harden" targets or sites, making them more difficult to monitor, attack, or disrupt with little or no action on the part of the defending activity. Active security measures are put in place to act or react against a threat, and require constant monitoring or action to implement. Both active and passive security measures are valid ways to address security threats, and some types of security measures fit both categories.

An active measure, such as the interception of a signal, is still considered a security measure. Although it may appear that an active measure is not "enhancing" security because it is not always in place and requires initiation or management, it does enhance security: the active measure allows a response before a security breach occurs, and helps prevent an undesirable outcome.

TRADITIONAL SECURITY MEASURES

Traditional security measures are those that existed before the current IT revolution. These measures have relied largely on physical structures and on manual, labor-intensive systems, and they are generally reactive rather than proactive approaches. Examples of traditional security measures include:

  • Controlled access architecture
  • Guards at access points
  • Manual locks
  • Security cameras, with manual review of what the cameras are monitoring
  • Manual fingerprinting by law enforcement agencies
  • Wiretapping or search warrants where suspicion exists
  • Collection of intelligence by human agents
  • Analysis of intelligence by human analysts

APPLICATIONS OF INFORMATION TECHNOLOGY TO SECURITY

Information technology has dramatically increased the range of actions that can be taken to enhance security. Information technology is used to do one of two things:

  1. Enhance traditional techniques: fields that could be practiced without IT, but not as efficiently.
    1. Cryptography,
    2. Fingerprint identification,
    3. Security clearance management.
  2. Enable non-traditional techniques: fields that are either virtually impossible without IT or completely dependent on its existence.
    1. Signal hopping,
    2. Retinal identification,
    3. Predictive threat techniques.

BIOMETRICS

Biometrics is the use of the physical features of an individual as a means of identification (Schneier, 2001). Everybody uses biometrics in day-to-day situations; recognizing someone's face in a crowd or voice on a phone is a form of biometrics. Information technology enables more sophisticated biometric identification systems to be created. Any unique physical feature can serve as the basis of a biometric identification system if the technology currently exists to use that feature to identify an individual. Some features are more easily discernible or more distinctive than others.

For instance, a fingerprint is an excellent way of distinguishing an individual. The height of an individual, by contrast, while it could theoretically be used as a biometric identifier, would be of limited use: too many people are exactly the same height, or so close that an automated system could not tell them apart. Interestingly, in the identification of human remains, dental records and intact dentition are considered a more accurate method of identification than DNA sampling.

Almost all biometric identification systems suffer from a problem of "acceptability": the willingness of the average person to submit to the scans or intrusive actions required to make the system function properly.

According to Deborah Russell and G. T. Gangemi, "surveys indicate that in order of effectiveness, biometric devices rank as follows, from the most secure to least secure" (Russell, Gangemi, 1991):

  1. Retina pattern devices
  2. Fingerprint devices
  3. Handprint devices
  4. Voice pattern devices
  5. Keystroke pattern devices
  6. Signature devices

In order of personal acceptance, the order is just the opposite:

  1. Keystroke pattern devices
  2. Signature devices
  3. Voice pattern devices
  4. Handprint devices
  5. Fingerprint devices
  6. Retina pattern devices

Specific features of each device are:

Retinal identification systems:

Each individual has a unique pattern of blood vessels in the eye that can be used as a means of identification. Retinal identification systems "use an infrared beam to scan your retina, measuring the intensity of light as it is reflected from different points." (Russell, Gangemi, 1991) The different light intensities create a pattern that is used much as a fingerprint is: as a unique form of identification.

Retinal scanning is highly accurate. According to the GAITS company website, "The iris can have more than 250 distinct features... the probability of two irises being alike is approximately 1 in 10^78 (the population of Earth is approximately 10^10)."

However, some people are significantly afraid of allowing retinal identification systems to project light into their eyes, which limits the technology's acceptability to the public. Some highly secure installations, notably military sites, do use them.

Fingerprint identification systems:

Each individual has a pattern of features on the tips of the fingers that uniquely identifies that individual. Fingerprint recognition systems scan fingerprints into a computer using a glass plate and reflected light, digitize the captured print, and determine whether it matches a database of "trusted" fingerprints before allowing access.
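
A minimal sketch of the match-against-the-database step, assuming each print has already been reduced to a numeric feature vector and that a simple similarity threshold decides access (both are illustrative simplifications; real systems match minutiae far more carefully):

    import math

    # Hypothetical enrolled templates: user -> feature vector from a scan.
    TRUSTED_DB = {
        "alice": [0.31, 0.87, 0.44, 0.12],
        "bob":   [0.90, 0.05, 0.66, 0.73],
    }

    THRESHOLD = 0.95  # assumed similarity cutoff for granting access

    def similarity(a, b):
        """Cosine similarity between two feature vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    def authenticate(scanned):
        """Return the best-matching user, or None if nothing is close enough."""
        best_user, best_score = None, 0.0
        for user, template in TRUSTED_DB.items():
            score = similarity(scanned, template)
            if score > best_score:
                best_user, best_score = user, score
        return best_user if best_score >= THRESHOLD else None

    print(authenticate([0.30, 0.88, 0.45, 0.11]))  # -> alice

The same match-then-threshold pattern underlies most of the devices described below; only the feature being extracted differs.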

Despite long-term use by law enforcement agencies, which transitioned from a cumbersome manual system to an automated fingerprint database system between the 1960s and 1990s, fingerprint devices are still not widely accepted, possibly because of the public association between being fingerprinted and being arrested.

Handprint recognition systems:

Handprint recognition systems operate on the unique measurements and proportions of the hand. Their methodology is similar to that of fingerprint devices: they sample and quantize the hand's geometry and then determine whether it matches the "trusted" database.

However, handprint devices are subject to significantly more variation, due to swelling, injuries, dirt, or other changes in hand geometry that do not significantly degrade fingerprint device performance.

Voiceprint recognition systems:

Voiceprint recognition systems work by recording the unique characteristics of a particular individual saying a specific phrase. The spoken phrase is analyzed for its components, and access to the secure area is allowed if the voiceprint is within a certain range of the recorded voiceprint. These systems suffer significant performance degradation under conditions affecting the voice, such as laryngitis.

Keystroke recognition systems:

Keystroke recognition systems are relatively new. They analyze an individual's unique pattern of striking keys and the minute, but measurable, differences in timing between keystrokes.
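
A sketch of the timing comparison, assuming a stored profile of average inter-key delays and a single tolerance value (both hypothetical; real systems model many more features, such as key dwell times and their statistical variance):

    # Enrolled profile: average delays (seconds) between successive keystrokes
    # when the user types a known phrase. All values are invented.
    PROFILE = [0.21, 0.14, 0.33, 0.18, 0.25]
    TOLERANCE = 0.05  # assumed per-interval deviation allowed

    def delays_from_events(timestamps):
        """Convert raw key-press timestamps into inter-key delays."""
        return [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]

    def matches_profile(timestamps):
        delays = delays_from_events(timestamps)
        if len(delays) != len(PROFILE):
            return False
        # Accept only if every inter-key delay is close to the enrolled average.
        return all(abs(d - p) <= TOLERANCE for d, p in zip(delays, PROFILE))

    # The enrolled rhythm passes; a noticeably different rhythm fails.
    print(matches_profile([0.00, 0.22, 0.35, 0.69, 0.88, 1.12]))  # True
    print(matches_profile([0.00, 0.40, 0.55, 1.20, 1.30, 1.90]))  # False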

Signature recognition systems:

Signature recognition systems are also relatively new. They analyze individual signatures for either features found during the process of signing (speed, pressure, etc.) or features found in the finished signature, such as the specific geometry of the handwriting used to form certain letters.

Signature recognition probably has the highest public acceptance rate of all widely used biometric technologies, since it requires an artifact (a signature) that people constantly provide in other situations and do not mind giving.

Facial recognition systems:

Facial recognition systems are relatively new. They analyze individual faces for unique characteristics (such as distances and ratios of significant features) to create a unique template.

Facial recognition is not currently a highly accurate tool, although in conjunction with other recognition systems it can be a good secondary method of identifying individuals. Since facial recognition is not intrusive, it can be performed by a video surveillance system. It has the odd characteristic of being highly accepted in some situations (because no intrusive test is needed to identify personnel) and strongly opposed in others (civil libertarians, for instance, are extremely concerned about constant monitoring).

BIOMETRIC SYSTEMS AND MONITORING

Biometric systems are being used and developed to create surveillance systems that provide for public safety in many places: airports, police stations, train stations, roads, access ways, and so on. These systems tend to use nonintrusive scanning, especially facial recognition, to pick out suspicious people and activity, but they have also been put to more novel purposes. For example, a system exists that scans for person-sized objects at the bottom of a pool that have not moved for an extended period or otherwise show signs of trouble. Since lifeguards are fallible, this system helps them pinpoint possible problems, supplementing human abilities.

CRYPTOGRAPHY

Cryptography refers to the science and practice of creating "secret" languages. These languages are distortions of the true language, such that nobody except the initiator and, hopefully, the recipient can understand the message being transmitted. In other words, a cryptographic system will "...disguise confidential information in such a way that its meaning is unintelligible to an unauthorized eavesdropper."

TRADITIONAL CRYPTOGRAPHY

Cryptography translates a "plain text" message, something with meaning to an unauthorized viewer, into something unintelligible to an unauthorized viewer. It uses a "key," or a transformative tool, which acts on the plain text to change it into something different, a cryptogram. The authorized viewer, the intended receiver, will be able to use another key to transform the message into a coherent result. (Piper, Murphy, 2002)

[Figure: the encryption and decryption process described above, from plain text through key to cryptogram and back]


"Modern encryption algorithms tend to operate on bits rather than...letter substations" (Piper, Murphy, 2002). Thus, an IT tool is indispensable for creating a coded message in modern days. Only an IT tool could have the processing power to operate on huge amounts of bits and return a usable message, consistent with the cryptographic intent of creating an unintelligible (to the outside user) message, in a reasonable period of time.

QUANTUM CRYPTOGRAPHY

Quantum cryptography is a rather new field in security systems and is primarily theoretical at this time; however, some experimental systems were built in the early 1990s, and some systems are going live this year. Quantum cryptography nonetheless remains an immature technology. Scientists have been working on the concepts behind it for three decades, and after a long journey from chalkboard to lab to working prototype, the field is on the verge of a breakout.

Quantum theory remains as shocking today as it was when Bohr first proposed the quantum theory for the atom early in the last century. The notion that very small things, such as atoms and molecules, do not behave in the same way as macroscopic matter was then, and remains today, nearly incomprehensible to the human mind. We rely on experience and observation to develop our intuition, and most of us have never observed the behavior of individual atomic or subatomic particles.

But a few have explored the world of the very small. And among those few, a handful of visionaries have been able to fathom ways to use the discontinuous (quantum) behavior of these small particles to our advantage. Quantum cryptography is one example of applying a deep understanding of quantum physics to create a novel technology of potentially enormous significance.

Stephen Wiesner introduced the ideas behind quantum cryptography in a proposal called "Conjugate Coding" in the early 1970s; his work was eventually published in 1983 in Sigact News. In 1984, Bennett and Brassard, who were familiar with Wiesner's ideas, published ideas of their own: "BB84," the first quantum key exchange protocol. In 1991 the first experimental prototype based on the Bennett-Brassard protocol became operable, over a distance of 32 centimeters. More recently, fiber optic cable systems have been tested successfully over kilometer distances (Wikipedia, The Free Encyclopedia, 2003, and Business Week, 2003). These quantum fiber optic systems are showing promise for this new type of cryptography.

Quantum cryptographic systems take advantage of Heisenberg's uncertainty principle, according to which measuring a quantum system in general disturbs it and yields incomplete information about its state before the measurement. Eavesdropping on a quantum communication channel therefore causes an unavoidable disturbance, alerting the legitimate users. (Brassard, 1994)

Basic thoughts:

In its basic form, a quantum cryptography system is one in which the sender and receiver can tell whether the key transmission has been intercepted, because of the quantum properties of the photons involved: measuring a photon changes its properties. In one scheme, the sender transmits a large number of photons of varying spin. The receiver then contacts the sender over an insecure phone line to say which photons arrived correctly, as many will be lost. The photons received are then used to encrypt the main message. Even if an eavesdropper knows which photons are being used for the encryption, they have no way of knowing which spin those photons have, that is, whether each represents a 0 or a 1. With a long code, the number of possible combinations of 0s and 1s would be impractical to test, and so the main message cannot be decrypted.

Thus, the advantage of quantum cryptography over traditional key exchange methods is that the exchange of information can be shown to be secure in a very strong sense, without making assumptions about the intractability of certain mathematical problems. Even when assuming hypothetical eavesdroppers with unlimited computing power, the laws of physics guarantee (probabilistically) that the secret key exchange will be secure, given a few other assumptions. (Ford, 2003).

The protocol, an example:

The general protocol for agreeing on a secret key, as described by Bennett et al. [1991], uses the polarization of photons as its unit of information (see Henle, BB84 Demo, 2003, for an online demonstration). Polarization can be measured in three different conjugate bases: rectilinear (horizontal or vertical), circular (left-circular or right-circular), and diagonal (45 or 135 degrees). Only the rectilinear and circular bases are used in the protocol (Ford, 2003).

Suppose Alice wants to send a message to Bob. Both have devices that can generate pulses of light in any of the different polarizations, and devices that detect the polarization of light. The steps are as follows (a toy simulation appears after the list):

  1. The light source, often a light-emitting diode (LED) or laser, is filtered to produce a polarized beam in short bursts with a very low intensity. The polarization in each burst is then modulated randomly to one of four states (horizontal, vertical, left-circular, or right-circular) by the sender, Alice.
  2. The receiver, Bob, measures the photon polarizations in a random sequence of bases (rectilinear or circular).
  3. Bob tells the sender publicly what sequence of bases he used.
  4. Alice tells the receiver publicly which bases were correctly chosen.
  5. Alice and Bob discard all observations not from these correctly chosen bases.
  6. The observations are interpreted using a binary scheme: left-circular or horizontal is 0, and right-circular or vertical is 1.
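
A small simulation of steps 1 through 6 under idealized assumptions (no noise and no eavesdropper; the representation of bases and bits is invented for illustration):

    import random

    N = 16  # number of photon bursts in this toy run

    # Step 1: Alice picks a random bit and a random basis for each burst.
    alice_bits = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.choice("RC") for _ in range(N)]  # R=rectilinear, C=circular

    # Step 2: Bob measures each photon in a randomly chosen basis.  When the
    # bases match, he reads Alice's bit correctly; otherwise his result is random.
    bob_bases = [random.choice("RC") for _ in range(N)]
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Steps 3-5: the bases are compared publicly; mismatched positions are discarded.
    keep = [i for i in range(N) if alice_bases[i] == bob_bases[i]]

    # Step 6: the surviving observations become the shared secret key bits.
    alice_key = [alice_bits[i] for i in keep]
    bob_key = [bob_bits[i] for i in keep]
    assert alice_key == bob_key  # always holds in this noise-free simulation
    print(alice_key)

On average, half of the bursts survive the sifting, which is why the protocol must send many more photons than the number of key bits it needs.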

This protocol is complicated by the presence of noise, which may occur randomly or may be introduced by eavesdropping. When noise exists, polarizations observed by the receiver may not correspond to those emitted by the sender. To deal with this possibility, Alice and Bob must ensure that they possess the same string of bits, removing any discrepancies. This is generally done using a binary search with parity checks to isolate differences; by discarding the last bit with each check, the public discussion of the parity is rendered harmless. In the Bennett et al. [1991] protocol, this process is as follows (a sketch of the binary parity search in step 4 appears after the list):

  1. The sender, Alice, and the receiver, Bob, agree on a random permutation of bit positions in their strings (to randomize the location of errors).
  2. The strings are partitioned into blocks of size k (k ideally chosen so that the probability of multiple errors per block is small).
  3. For each block, Alice and Bob compute and publicly announce parities. The last bit of each block is then discarded.
  4. For each block for which their calculated parities are different, Alice and Bob use a binary search with log(k) iterations to locate and correct the error in the block.
  5. To account for multiple errors that might remain undetected, steps 1-4 are repeated with increasing block sizes in an attempt to eliminate these errors.
  6. To determine whether additional errors remain, Alice and Bob repeat a randomized check:
    1. Alice and Bob agree publicly on a random assortment of half the bit positions in their bit strings.
    2. Alice and Bob publicly compare parities (and discard a bit). If the strings differ, the parities will disagree with probability 1/2.
    3. If there is disagreement, Alice and Bob use a binary search to find and eliminate it, as above.
  7. If there is no disagreement after l iterations, Alice and Bob conclude that their strings agree, with a probability of error of at most 2^-l.
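
A sketch of the binary parity search used in step 4, assuming exactly one error in the block (each round halves the search range, giving the log(k) iterations mentioned above; the discarding of announced bits is omitted for brevity):

    def parity(bits):
        return sum(bits) % 2

    def find_error(alice_block, bob_block):
        """Locate the single position where the two blocks differ by
        repeatedly comparing the parities of halves (a binary search)."""
        lo, hi = 0, len(alice_block)
        while hi - lo > 1:
            mid = (lo + hi) // 2
            # Alice announces the parity of her left half; Bob compares.
            if parity(alice_block[lo:mid]) != parity(bob_block[lo:mid]):
                hi = mid  # the error is in the left half
            else:
                lo = mid  # parities agree, so the error is in the right half
        return lo

    alice = [1, 0, 1, 1, 0, 0, 1, 0]
    bob = [1, 0, 1, 1, 1, 0, 1, 0]  # bit 4 was flipped in transit
    i = find_error(alice, bob)
    bob[i] ^= 1  # Bob corrects the located bit
    print(i, bob == alice)  # 4 True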

Implementation:

Early Years (1990s) --

At least three experimental apparatuses have been built to implement quantum key distribution, in addition to the original 32 centimeter implementation shown by Bennett, Bessette, Brassard, Salvail, and Smolin. A prototype built in Geneva follows the original Bennett protocol: it uses four different polarization states (see the example above) to carry the quantum information over more than one kilometer of optical fiber (Muller, Breguet, and Gisin, 1993). Another prototype, built independently by British Telecom in association with the Defence Research Agency, works by phase modulation over a distance of 10 kilometers of fiber; it is described in a sequence of two papers (Townsend, Rarity, and Tapster, 1993). Yet another experimental demonstration used Einstein-Podolsky-Rosen entangled pairs sent over kilometers of fiber (Rarity, Owens, and Tapster, 1994).

Today (2003) --

A Swiss firm, ID Quantique, introduced the first commercial quantum cryptography products last summer. Sometime this summer, MagiQ Technologies in New York City is expected to unveil its Navajo quantum cryptographic system. Several communications companies are currently testing Navajo on their networks, and researchers in the field say the U.S. government could already be using quantum cryptography to secure communications. (Salkever, 2003)

The U.S. Defense Department is funding numerous quantum cryptography experiments as part of its $20.6 million quantum information initiative at the Defense Advanced Research Projects Agency (DARPA). MagiQ estimates that the market for quantum cryptography will hit $200 million within the next few years; it sells its quantum cryptography units for $50,000 apiece. (Salkever, 2003)

BBN is building a DARPA-funded test network that will allow multiple parties to tap into a fiber-optic cable loop secured by quantum cryptography. Under DARPA sponsorship, and together with academic colleagues at Harvard University and Boston University, BBN Technologies is building the world's first Quantum Key Distribution (QKD) network. "Rather than having one link protected by quantum cryptography, we imagined a big service where everyone could connect to everybody else," explains Chip Elliott of the Cambridge (Mass.) labs of BBN Corp., a Verizon (VZ) subsidiary.

Problems:

Like all systems, quantum cryptography has its problems. The bursts of single photons move too slowly to be an effective means of real-time data exchange; once errors are factored in, most quantum encryption systems move data at a rate of 1,000 bits per second or less, 1/10,000 the transmission speed of today's fastest systems.

The proposed solution is BBN's Quantum Network, which will be optical and built with new, very fast (femtosecond) entangled-photon sources and novel network protocols that marry QKD (Quantum Key Distribution) with classical cryptography. The expectation is a network that can distribute keying material securely at speeds of up to millions of bits per second, orders of magnitude better than the current point-to-point speeds of a few thousand bits per second. A key distributed via quantum cryptography would be all but impossible to steal. If a bank pairs a quantum cryptography system with a classical encryption system, the quantum unit can be automated to pass fresh, secret keys from the sender to the receiver with assurance that no one has read those keys, as often as several times a second, without slowing the data transmission. Since the key exchange is automated with quantum crypto, it is also much easier to work with than existing key-exchange mechanisms, which require more human intervention.
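
A sketch of that marriage of QKD and classical cryptography: bits agreed upon over the quantum channel seed a conventional symmetric cipher that carries the bulk data. The key derivation and the toy stream cipher below are assumptions for illustration, built only from Python's standard library, and are not BBN's design:

    import hashlib

    def derive_key(qkd_bits):
        """Condense sifted, reconciled QKD bits into a fixed-size session key."""
        raw = "".join(str(b) for b in qkd_bits).encode()
        return hashlib.sha256(raw).digest()

    def keystream_crypt(key, data):
        """Toy stream cipher: SHA-256 in counter mode generates a keystream.
        (Illustrative only; a deployed system would use a vetted cipher.)"""
        out = bytearray()
        counter = 0
        while len(out) < len(data):
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(b ^ k for b, k in zip(data, out))

    session_key = derive_key([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1])  # from QKD
    ct = keystream_crypt(session_key, b"wire transfer: $1,000,000")
    pt = keystream_crypt(session_key, ct)  # symmetric: the same call decrypts
    print(pt)

Because the quantum channel can refresh session_key several times a second, a compromise of any one key exposes only a fraction of a second of traffic.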

Final thoughts:

The computer world just might be witnessing a new and intriguing phase in the history of cybersecurity. While the concept and execution of quantum cryptography remain complex, the technology, even in its immature state, appears ready for prime time.

SIGNAL INTERCEPTION, ANALYSIS, AND PREVENTION

Signals from one piece of electronic equipment to another can take many forms. Some of the more common types include radio signals, e-mail, and Internet traffic. Security measures dealing with signal interception are concerned with either intercepting a signal or preventing that interception.

SIGNAL INTERCEPTION

Traditional signal interception means getting a firm intercept of a transmission. Usually this applies to radio communications, and usually it means discovering the frequency on which the transmission is being sent and covertly recording it. The content of the message may be unintelligible (such as a foreign language or a transmission using key words), but if the receiver can record the transmission or understand the words being used, valuable data will potentially have been given away. Information technology has enhanced signal interception by allowing extremely rapid scanning of numerous channels without human input into the scanning process. An automated tool performs the scanning and uses algorithms to determine which channels are "interesting" and should be forwarded to a human operator for more detailed analysis.
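
A sketch of that automated triage step, assuming per-channel energy readings from one sweep and a fixed threshold as the "interesting" heuristic (real systems apply far more sophisticated detection criteria):

    # Hypothetical one-sweep readings: channel frequency (MHz) -> measured energy.
    sweep = {146.52: 0.02, 162.40: 0.91, 243.00: 0.05, 462.56: 0.77}

    ENERGY_THRESHOLD = 0.5  # assumed cutoff separating noise from activity

    def interesting_channels(readings):
        """Flag channels whose energy suggests an active transmission,
        for forwarding to a human operator."""
        return [freq for freq, energy in readings.items()
                if energy >= ENERGY_THRESHOLD]

    print(interesting_channels(sweep))  # [162.4, 462.56]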

TRAFFIC ANALYSIS

Traffic analysis goes hand in hand with signal interception. It refers to the continual analysis of the patterns of messages rather than the messages themselves. Patterns such as which frequency transmitted when and how many times, the apparent information content of those transmissions, and what communications happen after a certain kind of transmission is received can often reveal a significant amount of information that an intercept of an encoded channel cannot. Information technology allows rapid or simultaneous scanning of multiple channels and aggregation of the "interesting" channels for further analysis by either an information system or a human operator. (Schneier, 2001)

FREQUENCY-HOPPING

Frequency hopping is a technique designed to defend against signal interception and traffic analysis. In a frequency-hopping system, multiple "hops," or changes of frequency, are made each second, preventing an opponent from intercepting a channel or communication with a simple scanning or traffic analysis tool. Essentially, frequency hopping is based on a "key," which informs the sending and receiving stations what frequency to start at, how often to hop, and which frequency to hop to next.

As an example:

  Second 1      Second 2      Second 3      Second 4
  Freq. 0.083   Freq. 9.247   Freq. 5.437   Freq. 1.219

A transmitter and receiver use this simple four-frequency sequence, changing frequencies only once every second (real frequency-hopping transmission devices change many times per second).

Even if an interceptor with automated equipment tried to get a fix on the signal, say, starting at 0.001 and going up 0.001 every second, they would catch at most 0.17 seconds of the first second of the transmission. By the time the enemy system realized it was on a valid, "interesting" channel, the transmission would have hopped to the second frequency. Real frequency hopping happens so rapidly that even this much interception is unlikely, frustrating efforts to intercept the signal or perform traffic analysis, even if an unencoded transmission is sent over the frequency-hopping channel.
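
A sketch of how a shared key can drive the hop schedule: both stations seed the same pseudorandom generator and so derive identical frequency sequences without ever transmitting them. Seeding a PRNG here stands in for the spreading-code hardware of real systems:

    import random

    def hop_sequence(shared_key, hops):
        """Derive a sequence of channel frequencies from a shared key.
        Both stations run this independently and stay in lockstep."""
        rng = random.Random(shared_key)  # deterministic for a given key
        return [round(rng.uniform(0.0, 10.0), 3) for _ in range(hops)]

    transmitter = hop_sequence("shared-secret-key", 4)
    receiver = hop_sequence("shared-secret-key", 4)
    print(transmitter)              # four pseudorandom frequencies, 0-10 MHz
    print(transmitter == receiver)  # True: both sides hop identically

An interceptor without the key sees only brief, apparently unrelated bursts scattered across the band.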

E-MAIL AND INTERNET INTERCEPTION (ECHELON AND CARNIVORE)

The basic tools of the information age, e-mail, telephones, the Internet, and so forth, produce signals that are not susceptible to traditional interception techniques. To respond to the possibility of an undesirable outcome originating with the use of these tools, national intelligence agencies have created programs to monitor them, intercept suspicious traffic, and provide the means for a response. Two current programs demonstrate these systems: CARNIVORE and ECHELON, which attempt to use the basic tools of information technology to monitor e-mail and Internet traffic for suspicious activity.

CARNIVORE is run by the U.S. Federal Bureau of Investigation and is essentially an e-mail sniffer. It monitors e-mail traffic and zeroes in on words of interest, such as (presumably) "jihad," "bomb," "hijack," and so forth. According to the What You Need to Know About Networking website, "Carnivore does not work like a single-phone line wiretap; it must be installed on the public Internet where it filters through many otherwise uninvolved people's data to get to the subject(s) of interest." CARNIVORE has to be installed at a central location on a network where numerous messages are routed, such as at a large ISP. Once installed, it acts much as a wiretap on an individual's phone line would, except that it accesses numerous messages to get at the individuals, and messages, the current search is interested in. However, the FBI claims that CARNIVORE zeroes in only on packets that are from, or to, its intended wiretap candidate, and is not a wholesale e-mail filtering system.
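
A toy illustration of keyword filtering restricted to a wiretap target, in the spirit of the FBI's description above; this is conceptual only, and in no way the actual CARNIVORE implementation (the addresses and watch words are invented):

    WATCH_WORDS = {"jihad", "bomb", "hijack"}      # presumed terms of interest
    TARGET_ADDRESSES = {"suspect@example.com"}     # hypothetical authorized target

    def flag_message(sender, recipient, body):
        """Examine a message only if it involves a wiretap target,
        then check it for words of interest."""
        if sender not in TARGET_ADDRESSES and recipient not in TARGET_ADDRESSES:
            return None  # uninvolved traffic passes through unexamined
        hits = WATCH_WORDS & set(body.lower().split())
        return hits or None

    print(flag_message("suspect@example.com", "a@b.com", "the bomb is ready"))
    print(flag_message("c@d.com", "a@b.com", "lunch at noon"))  # None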

Famously, Earthlink has allowed CARNIVORE to be used at one of its data centers. The FBI, however, has fairly strict rules on privacy and wiretapping and is theoretically somewhat constrained in indiscriminate use of CARNIVORE.

ECHELON is a multinational program run by the intelligence agencies of the United States, Great Britain, Canada, Australia, and New Zealand, and led by the United States' secretive National Security Agency (NSA). ECHELON intercepts a huge number of communications every day; the exact amount is unknown, but up to 3 billion a day has been estimated, across all modern types: e-mail, telephone calls, Internet downloads, satellite transmissions, and so forth. "The system gathers all of these transmissions indiscriminately, then sorts and distills the information through artificial intelligence programs." (Schneier, 2001). The major difficulty with this kind of system is the sheer volume of interception; if the analysis program fails to flag a message as threatening, further investigation by a human is probably not going to happen. The stunning 2001 terrorist attacks in New York City proved that no matter how much raw intelligence one gathers, such as with ECHELON, an enemy may, through purposeful action or sheer luck, evade detection.

SECURITY CLEARANCE MANAGEMENT

A security clearance is an indication that an individual's background, character, and actions have been checked and that the individual is now "cleared," or authorized to view, handle, store, or otherwise be responsible for sensitive information. With a clearance, an individual can enter secure areas, use classified information (up to the level of the clearance), and perform effectively on the job. Without one, the individual cannot enter some areas or use some classifications of information, is not an asset to some types of organizations, and may have to move to other types of work.

This program has been in existence for some time. Prior to the explosive growth of information technology, managing the sheer volume of required clearances was difficult. Each individual requiring a clearance had to have some form of investigation performed, an adjudication (a judgment on that individual's trustworthiness) made, and a clearance granted. When hundreds of thousands of people enlisted or entered government service each year, or many individuals came up for periodic reinvestigation, clearances were often delayed, especially when investigative resources were backlogged, as they usually were.

Information technology significantly enhances the management of this enormously complex program. Basic information technology tools have greatly improved the tracking of personnel requiring clearances, their security status, and the state of the investigations pertaining to them. Tools used to support this program include:

  1. Initiation and tracking of databases - to track security candidates and the status of their investigations,
  2. Automated checks of databases - to perform low-level checks for possible security problems such as arrests, bounced checks, or enrollment in psychological or medical counseling (counseling checks are subject to rules on what is accessible at these levels of investigation). Some low-level clearances, particularly confidential clearances, rely entirely on automated checks of existing records for "flags" that tell an adjudicator an investigative agent needs to look at an individual or situation more closely (see the sketch after this list).
  3. Automated forms - replacing the manual paper forms that were a major source of frustration for candidates.
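
A sketch of the automated record check from item 2, with entirely hypothetical record sources and flag rules:

    # Hypothetical external record databases keyed by candidate name.
    ARREST_RECORDS = {"j.doe": ["misdemeanor 1998"]}
    FINANCIAL_FLAGS = {"a.smith": ["bounced check 2001"]}

    def automated_check(candidate):
        """Collect flags for an adjudicator; an empty list means no
        automated findings, not a completed investigation."""
        flags = []
        for source, records in (("arrest", ARREST_RECORDS),
                                ("financial", FINANCIAL_FLAGS)):
            for item in records.get(candidate, []):
                flags.append(f"{source}: {item}")
        return flags

    print(automated_check("j.doe"))    # ['arrest: misdemeanor 1998']
    print(automated_check("b.jones"))  # [] -> no flags found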

PREDICTIVE THREAT SYSTEMS

Predicting where a threat will occur, and removing the threat or neutralizing the danger, is worth more than any reactive system ever could be. Systems are being developed to take advantage of information technology and potential predictive techniques to identify threats before they materialize.

Such systems can generally be characterized as one of two types, machine based or human based:

  • Machine based, requiring little human intervention. An example is the U.S. government's attempt to build the "Total Information Awareness" program (TIA for short), which would use data mining techniques to bring together numerous individual transactions and pieces of data. The hope was that unforeseen connections among known terrorists might emerge; for instance, they might tend to have accounts at Chase Manhattan, come from Botswana, and construct bombs from fertilizer. Using this kind of seemingly unrelated information, generated by day-to-day transactions, the system could flag an individual with an account at Chase Manhattan, naturalized from Botswana, who recently bought 100 pounds of fertilizer, alerting law enforcement agencies to investigate his current actions (a toy sketch of this kind of rule-based flagging appears after this list).

    The problem with TIA is that it raised specters of George Orwell's 1984, with Big Brother watching all citizens. Many people worldwide value privacy over safety, especially if they are unconvinced that the system really would be used to increase safety, rather than becoming just another law enforcement "tool" used even in the absence of corroborating evidence.

  • Human based, requiring active human input but still requiring IT to implement. The best example of a failed system in this category is the U.S. government's recent attempt to build a "terrorist futures" market, in which interested parties could buy and sell options on where and what terrorist attacks would take place. The system was designed around the concept that any given individual could be wrong in a given situation, but that in the aggregate, well-informed people with a profit motive would predict terrorist attacks more accurately. This is the same profit motive and aggregate mentality that currently drives the stock market (the invisible hand), but the apparent ghoulishness of profiting from predictions of death and suffering meant the system was doomed before it was ever fully implemented.

    The concept remains the same for any predictive technique: use IT to bring together numerous interested people to make decisions or guesses, and give them something they want in return, such as money or recognition.
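
A toy sketch of the machine-based approach described above: aggregate routine records and flag individuals matching a pattern distilled from known cases. The data and the rule are invented for illustration; TIA's actual methods were never made public:

    # Invented transaction records aggregated from separate sources.
    PEOPLE = [
        {"name": "X", "bank": "Chase Manhattan", "origin": "Botswana",
         "purchases": ["100 lb fertilizer"]},
        {"name": "Y", "bank": "First National", "origin": "Canada",
         "purchases": ["garden hose"]},
    ]

    def matches_threat_profile(person):
        """Apply a (hypothetical) pattern distilled from known cases."""
        return (person["bank"] == "Chase Manhattan"
                and person["origin"] == "Botswana"
                and any("fertilizer" in p for p in person["purchases"]))

    alerts = [p["name"] for p in PEOPLE if matches_threat_profile(p)]
    print(alerts)  # ['X'] -> forwarded to law enforcement for investigation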


CONCLUSION

Information technology has opened huge areas of opportunity to improve security systems. Dozens of existing systems are available to help react to unforeseen security incidents and to prevent them in the future. While civil liberties are a potential casualty of the use of these systems, progress in this area is unavoidable, especially as new technologies are developed. Close attention to these technologies will help any reader know exactly what systems are available to enhance their security.

REFERENCES:

  1. ASIS International, 2003
    Retrieved November 25, 2003, from http://www.asisonline.org
  2. Brassard, G. and Salvail, L., "Secret-key reconciliation by public discussion",
    Advances in Cryptology, Eurocrypt '93 Proceedings, May 1993
  3. Brassard, Gilles, "A Bibliography of Quantum Cryptography", 1994, Université de Montréal.
    Retrieved November 25, 2003, from http://www.cs.mcgill.ca/~crepeau/CRYPTO/Biblio-QC.html
  4. Defense Security Service, 2003
    Retrieved November 30, 2003, from http://www.dss.mil
  5. Ford, James, "Quantum Cryptography Tutorial", 2003, Dartmouth College
    Retrieved November 30, 2003, from http://www.cs.dartmouth.edu/~jford/crypto.html#1
  6. Frequency Hopping, Wikipedia, The Free Encyclopedia, 2003
    Retrieved November 30, 2003, from http://en.wikipedia.org/wiki/Frequency_hopping
  7. GAITS website (biometrics),
    Retrieved November 24, 2003, from http://www.gaits.com
  8. General source references (not directly quoted) Security Management Online:
    Retrieved November 25, 2003, from http://www.securitymanagement.com
  9. Henle, Fred, "BB84 Demo", 2003, Mercersburg Academy
    Retrieved November 24, 2003, from http://monet.mercersburg.edu/henle/bb84/
  10. Muller, A., Breguet, J. and Gisin, N., "Experimental demonstration of quantum cryptography using polarized photons in optical fibre over more than 1 km",
    Europhysics Letters, vol. 23, no. 6, 20 August 1993, pp. 383 - 388.
  11. Piper F., Murphy S., Cryptography: A Very Short Introduction, 2002, pp. 7-10, 60,
    Oxford University Press
  12. Quantum Cryptography, Wikipedia, The Free Encyclopedia, 2003,
    Retrieved November 30, 2003, from http://en2.wikipedia.org/wiki/Quantum_cryptography
  13. Rarity, J. G., Owens, P. C. M. and Tapster, P. R., "Quantum random number generation and key sharing",
    Journal of Modern Optics, vol. 41, no. 12, December 1994, pp. 2435 - 2444.
  14. Russell D., Gangemi G. T. Sr., Computer Security Basics, 1991, pp. 248, 250,
    O'Reilly & Associates, Inc.
  15. Salkever, Alex, "A Quantum Leap in Cryptography", 2003, Business Week online, SECURITY NET
    Retrieved November 30, 2003, from http://www.businessweek.com/technology/content/jul2003/tc20030715_5818_tc047.htm
  16. Schneier B., Secrets & Lies: Digital Security in a Networked World, 2001,
    pp. 34, 35, 141; John Wiley & Sons, Inc.
  17. Shenk, David, "Watching You", George Steinmetz (photographs),
    National Geographic, Vol. 204, no. 5, November 2003, pp. 2-29.
  18. Townsend, P. D., Rarity, J. G. and Tapster, P. R., "Single photon interference in a 10 km long optical fibre interferometer",
    Electronics Letters, vol. 29, no. 7, April 1993, pp. 634 - 635.
  19. Townsend, P. D., Rarity, J. G. and Tapster, P. R., "Enhanced single photon fringe visibility in a 10 km-long prototype quantum cryptography channel",
    Electronics Letters, vol. 29, no. 14, 8 July 1993, pp. 1291 - 1293
  20. U.S. Department of Homeland Security,
    Retrieved November 29, 2003, from http://www.dhs.gov/dhspublic/index.jsp
  21. Virginia Commonwealth Preparedness,
    Retrieved November 26, 2003, from http://www.commonwealthpreparedness.virginia.gov/SecureVa/vathreat.cfm
