
Vulnerability of high technology systems

  • Fragility of computerized information systems
  • Risks to computers and computer records
  • Inadequate safeguards against electronic crimes
  • Inadequate prevention of computer disasters
  • Vulnerability of telecommunications systems

Nature

Computers are widely used in private, commercial, and governmental operations to store vast quantities of information. In many cases the information constitutes a record vital to the continued functioning of organizational systems, particularly in the case of personal data, stock data, and financial or accounting records. Computer systems are vulnerable in a number of ways. Data may be lost through equipment failure, electrical surges, operator error, software error, overheating or inadequate cooling, theft or computer viruses. Equipment may be damaged in fire, flood, earthquake or any of a number of natural or man-made disasters. Programmers, operators or maintenance personnel may damage hardware, software or databases. Data may be damaged in transit over telecommunications lines or when storage media, such as tapes or disks, are shipped. The losses include the replacement or repair of equipment, software or data; lost business or other transactions; and lost employee time.

The destruction of a computer, its files and its backup media through accident can be a disaster. A business's ability to carry on operations can be halted by fire or explosion in a computer room. The records and files of a mainframe, mini- or microcomputer can be destroyed through operator error, electrical surges (such as those from lightning), radiation from microwave ovens and similar devices, or physical abuse. A single process failure can cascade through a networked system that handles its computing routines in serial fashion.
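
That cascade can be pictured with a minimal Python sketch (the stage names are hypothetical and purely illustrative): because each routine consumes the output of the one before it, a failure in any single stage stops every stage downstream.

    def read_orders():
        return ["order-1", "order-2", "order-3"]

    def update_inventory(orders):
        raise RuntimeError("disk failure while writing inventory records")

    def bill_customers(inventory):
        return ["invoice for " + item for item in inventory]

    def run_pipeline():
        orders = read_orders()
        inventory = update_inventory(orders)  # the failure occurs here ...
        return bill_customers(inventory)      # ... so billing never runs

    try:
        run_pipeline()
    except RuntimeError as exc:
        print("entire pipeline halted:", exc)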

Vulnerability increases with the quality and performance of modern technology. The higher performance of most technological advances relies on reducing the margins of error that a system can tolerate without breakdown. Accidents and management mistakes may still occur, but their effects now have more costly systemic consequences. The "leanness" of systems, in which redundancies, backup procedures and check systems have been eliminated in the name of efficiency, also makes them highly vulnerable. Compounding the structural fragility of computer systems is the fact that the extent of their interconnectedness with modern life is largely invisible and unpredictable. The problem is only recognized as such once the relationships have already been disrupted.
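
A back-of-envelope Python sketch makes the cost of that leanness concrete, assuming (illustratively) that each independent copy of a component fails in a given year with probability 5%: the service is lost only if every copy fails at once, so each redundant copy that is eliminated multiplies the outage probability twentyfold.

    p = 0.05  # assumed annual failure probability of one copy (illustrative)

    for copies in (1, 2, 3):
        # An outage requires every copy to fail at the same time.
        outage = p ** copies
        print(copies, "copies -> outage probability", f"{outage:.4%}")

The arithmetic also exposes the catch: it assumes failures are independent, an assumption that a shared computer room, power feed or software fault quickly invalidates.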

The FBI lists three levels of vulnerability risk for computer systems exposed to outside interference:

  • High – a vulnerability that allows an intruder to immediately gain privileged access (e.g. sysadmin or root) to the system. An example would be a vulnerability in which a sequence of instructions is sent to a machine by an unauthorized user and the machine responds with a command prompt.
  • Medium – a vulnerability that allows an intruder immediate, but unprivileged, access to the system, giving the intruder the opportunity to continue the attempt to gain root access. An example would be a configuration error that allows an intruder to capture the password file.
  • Low – a vulnerability that provides information to an intruder which could lead to further compromise attempts or a denial-of-service (DoS) attack. Note that while a DoS attack is rated low in threat potential, the frequency of this type of attack is very high. DoS attacks against mission-critical nodes are not included in this rating; any attack of that nature should instead be treated as a "High" threat.
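
For illustration only, the three-level scheme might be written as a small decision function; this is a sketch, not an FBI specification, and the parameter names are hypothetical.

    from enum import Enum

    class Risk(Enum):
        LOW = "Low"
        MEDIUM = "Medium"
        HIGH = "High"

    def rate(privileged_access=False, unprivileged_access=False,
             dos_on_mission_critical=False):
        # A DoS attack on a mission-critical node is escalated straight to High.
        if privileged_access or dos_on_mission_critical:
            return Risk.HIGH
        # Unprivileged access still lets the intruder work toward root.
        if unprivileged_access:
            return Risk.MEDIUM
        # Information leakage only, e.g. reconnaissance for a later attack.
        return Risk.LOW

    print(rate(privileged_access=True))        # remote command prompt -> High
    print(rate(unprivileged_access=True))      # captured password file -> Medium
    print(rate(dos_on_mission_critical=True))  # DoS on a critical node -> High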

Incidence

Most large organizations using computers have experienced disastrous hardware or software failures. Organizations are necessarily reluctant to reveal the insecurity or inadequacy of their systems because of their need to maintain security, competitive advantage or public confidence. The most visible disasters tend to occur in the public sector, as in the cases of the London Ambulance Service and the London Stock Exchange's Taurus project. Private sector examples include the repeated failures of AT&T's long-distance telecommunications system in 1990. At that time it took two million lines of computer code to keep the system operational, yet those millions of lines were brought down by just three lines of faulty code. In May 1998, 90% of all pagers in the USA went dead for a day or longer because of the failure of a single satellite. Late in 1997, Internet email could not be delivered to the appropriate addresses because bad information from a single central source corrupted servers across the network.

One study by the US Department of Defense concluded that only 30 of the 17,000 computers then used by the DOD met minimum standards for protection from attack. The computers were vulnerable to a broad range of high-tech hit-and-run spying techniques, such as "spoof" programs, which appear to be conducting routine activities while actually collecting passwords or other useful information, or the insertion of undetectable instructions into software which might order the alteration or destruction of highly classified data.

Claim

People have not been eliminated from technology. They still figure everywhere: they maintain the equipment; they design, sell, buy, repair and operate it. Human beings make errors, and they make unique and unpredictable errors, particularly in unique situations. As new technologies are developed and new people are introduced to old ones, errors will happen in spite of any so-called safeguards, especially the "foolproof" ones.

To err is human, but to really foul things up requires a computer.

Our ability as an economy and as a society to deal with disruptions and breakdowns in our critical systems is minuscule. Our worst-case scenarios have never envisioned multiple, parallel systemic failures. Just-in-time inventory has led to just-in-time provisioning. Costs have been squeezed out of all of our critical infrastructure systems repeatedly over time on the strength of the ubiquity and reliability of these integrated systems. The human factor, found costly, slow and less reliable, has been purged over time from our systems. Single, simple failures can be dealt with; complex, multiple failures have been considered too remote a possibility, and therefore too expensive to plan for.

Counter-claim

Computers are unreliable, but humans are even more unreliable.

Broader

Narrower

Aggravates

Cybercrime
Terrorism

Aggravated by

Hyperefficiency

Reduced by

Related

Strategy

Value

Risk
Inadequacy
Crime
Vulnerability
Disaster

Reference

SDG

  • Sustainable Development Goal #3: Good Health and Well-being
  • Sustainable Development Goal #9: Industry, Innovation and Infrastructure
  • Sustainable Development Goal #10: Reduced Inequality
  • Sustainable Development Goal #13: Climate Action

Metadata

Database
World problems
Type
(D) Detailed problems
Subject
  • Cybernetics » Systems
  • Informatics, classification » Informatics
  • Information » Documentation
  • Information » Information
  • Societal problems » Crime
  • Societal problems » Emergencies
  • Societal problems » Inadequacy
  • Societal problems » Prevention
  • Societal problems » Protection
  • Societal problems » Vulnerability
  • Technology » Electronics
  • Technology » Technology
  • Transportation, telecommunications » Telecommunications
Content quality
Presentable
Language
English
Last update
May 20, 2022