Throughout history a basic problem has been to acquire sufficient information to generate effective change. The individual wishing to become expert in some field of knowledge had to buy information; the government wishing to understand even the rudiments of the structure of its society had to buy information through the census. Increasing amounts of money continue to be paid for data acquisition on the assumption that data constitute information. But data become information only at the point when the exposed individual is changed, and the capacity to be changed is strictly finite. In conditions of data paucity, almost all data acquired can be transformed into information and used to procure effective change. But when the supply of data far outruns the processing capability, most data are literally worthless.
Systems for the collection of data can acquire a momentum of their own, such that resources continue to be deployed to gather, order and store data beyond any currently foreseeable need. In addition to an often costly diversion of resources from projects of higher priority, such systems gradually come to be seen as ends in themselves. This obstructs the formulation of more fundamental questions which would call for the collection of new kinds of data, or for the ordering of what has been collected in new ways. For individuals forced to consider more information and opportunities than they can effectively process, data overload leads to anxiety, stress, alienation, and potentially dangerous errors of judgement.
World book titles increased 132% between 1950 and 1979. At any given time 3,000 million titles may be in print. Estimates suggest that there are about 80,000 regular scientific and technical journals out of a total of about 150,000 journals with valid information content. As an example, it took 31 years, from 1907 to 1938, before Chemical Abstracts reached its millionth abstract; the fifth million was reached in 3 years and 4 months. According to the area of science chosen and the method used, an annual growth rate of 4 to 8% is encountered, with a doubling period of from 10 to 15 years. A spectacular example of the ability of information to reach people is the over 3,500% increase in television receivers since 1950, totalling some 500 million receivers in present use; this indicates an appetite for information that will be increasingly served by 24-hour programming, hundreds of television broadcasting stations, and thousands of dependent companies producing informational, educational or documentary programmes and films. In the decade from 1979 to 1989 the National Space Science Data Centre of the USA accumulated some 6,000,000,000,000, or six trillion, bytes of information. That is about double the amount of information contained in all of the Library of Congress's 19 million books. The Magellan space probe of Venus will increase this by an additional three trillion bytes. The Hubble Space Telescope, scheduled to be launched in 1990, will generate several trillion bytes every year. If the Earth Observing System is launched, it will generate a trillion bytes of information every day. A host of other space probes are planned.
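The relationship between an annual growth rate and its doubling period can be checked directly. The following is a minimal sketch (the 4% and 8% rates are from the text above; the function name is illustrative, not from any source):

```python
import math

def doubling_period(annual_growth_rate):
    """Years for a quantity growing at a fixed annual rate to double:
    solve (1 + r)^t = 2 for t."""
    return math.log(2) / math.log(1 + annual_growth_rate)

for rate in (0.04, 0.08):
    print(f"{rate:.0%} annual growth -> doubles in {doubling_period(rate):.1f} years")
# 4% annual growth -> doubles in 17.7 years
# 8% annual growth -> doubles in 9.0 years
```

Note that 4% growth implies a doubling period of about 17.7 years and 8% about 9 years; the 10-to-15-year range quoted above corresponds more closely to rates of roughly 5 to 7%, which is consistent with the passage's caveat that the figures vary with the area of science and the method of measurement.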
Data are an excrescence, the very latest kind of pollution. Nothing can be done about the management of information and knowledge towards the regulation of society as long as the problem is approached in data-processing terms. Data are worthless until mechanisms are developed to transform them into information, and to enable the use of that information to innervate society.