Meaning from Chaos: Gleick’s The Information
Diana Ascher
IS 291B—Special Topics in Theory of Information Studies
Prof. J.F. Blanchette
3/8/13
Meaning from Chaos: Gleick’s The Information
As I have evolved, so has my understanding of the three laws.
~ V.I.K.I.
I, Robot
Introduction
Gleick’s The Information: A History, a Theory, a Flood can certainly be examined in multiple ways. His intended audience, the interested general public, will glean different lessons than will the philosophy scholar or the engineer. While it is tempting to craft an analysis according to author James Gleick’s trifurcated delineation, I find it more interesting to examine the history, theory, and proliferation of information through the lens of decision making within a new scientific paradigm. In this paper I explore the way in which Gleick traces a shift in scientific paradigm that has redefined chaos and its relationship to the meaning of information.
Storytelling
Through a mostly (but not strictly) chronological account, Gleick smoothly guides the reader, who may as easily be a member of the general lay public as of the academic community, through the history of various communications technologies, with the aim of explaining how we conceptualize and talk about information today. Beginning with African talking drums (13) and ending with our current digital systems of communication (419), Gleick highlights the integral role played by scientific innovators in communications and technology and explains how their contributions have had far-reaching effects that pervade our world.
Gleick offers memorable vignettes that demonstrate the amorphous state of our understanding of information. For example, the telegraph led people to think of the message as something abstract rather than physical: “A message had seemed to be a physical object. That was always an illusion; now people needed consciously to divorce their conception of the message from the paper on which it was written.” (151)
Further, Gleick explains that the advent of the telephone made skyscrapers feasible, because messages could be conveyed without the physical constraints associated with human couriers. (192) Examples such as this make the content particularly digestible: most readers 1) know what communication, skyscrapers, telegraphs, and couriers are, and 2) have never considered a connection between communication and skyscrapers. The example demonstrates the physical consequences of communications technology, from its earliest forms to the present, in a memorable and relatable way.
Gleick’s examples reflect a history of counterintuitive realizations that not only made new technologies and products possible, but also shaped our evolving understanding and definition of information. For example, the repetition in African talking drum messages may seem inefficient. However, Gleick says, “redundancy—inefficient by definition—serves as the antidote to confusion.” (25) Redundancy provides clarification of meaning in the drum messages.
In similar fashion, each paradigm shift in the definition and conceptualization of information and meaning coincides with a counterintuitive conundrum. The entropy underpinning Claude Shannon’s information theory presents one such conceptual problem. It would seem that information is an antidote to entropy. However, Gleick explains that uncertainty is better conceived as a cognitive state: it resides in the mind of the information receiver and is not “a property of material things in themselves.” (272) When thought of in this way, information is both objectively quantifiable and subjectively meaningful.
Foundational concept
Gleick relates the evolution of our understanding of information and meaning in light of emerging theories and technologies over time. Primary among these influences is Shannon’s information theory, which treats the assignment of meaning as a choice from among a range of possible meanings embedded within symbols, actions, and other manifestations of humans’ desire to communicate.
At first, such a correlation seems counterintuitive: doesn’t information reduce chaos? However, Shannon’s theory frames entropy as an indication that information is present: information serves to narrow down the possible meanings of a message. Narrowing the possible meanings clarifies the message and thereby facilitates better-informed decision making.
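To make the quantification concrete, Shannon’s measure of this uncertainty is the entropy of the distribution over possible messages (a standard formulation, paraphrased here rather than quoted from Gleick):

$$H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i$$

For a message drawn from two equally likely alternatives, H = 1 bit; receiving the message eliminates exactly that much uncertainty. A more skewed source (say, probabilities 0.9 and 0.1) has an entropy of only about 0.47 bits, so its messages carry less information. This is the sense in which entropy measures how much a message can narrow the range of possibilities.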
Shannon’s work on quantifying information is the basis for Gleick’s discussion of message transmission and the role of randomness in discerning meaning. Shannon looked at communication as the successful transmission of a message from one location to another and held that the meaning associated with the message was irrelevant to the engineering problem. The proposition that communication is divorced from meaning served many functions, the most notable of which was that it enabled scientific theorists to reevaluate chaos and to think differently about information and meaning. The idea that randomness factors into order opened the floodgates for different understandings of how meaning is constructed.
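Shannon’s own schematic of a communication system, rendered here in simplified plain-text form, makes the meaning-free framing concrete: every component operates on signals, and none needs to know what the message means.

```
information source -> transmitter -> [ channel ] -> receiver -> destination
                                          ^
                                          |
                                     noise source
```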
Shannon quantified information with an equation that bears a strong resemblance to Boltzmann’s equation for entropy. He considered information and entropy not as opposites, as the Maxwell’s Demon thought experiment might suggest, but as essentially interchangeable concepts. Equating information and entropy led to the redefinition of chaos as a state of maximum information, and thus as the source of new information.
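The resemblance is visible in the two formulas themselves (standard forms, not reproduced from Gleick’s text):

$$H = -\sum_{i} p_i \log_2 p_i \quad \text{(Shannon)} \qquad\qquad S = k_B \ln W \quad \text{(Boltzmann)}$$

When all W configurations are equally probable, each p_i = 1/W and Shannon’s H reduces to log_2 W, so the two expressions differ only by the constant k_B and the base of the logarithm. This is why a maximally disordered source is also a maximally informative one.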
Shannon’s mathematical approach to information forms a basis for understanding complex systems that are governed by deterministic laws but generate unpredictable outcomes. However, the ways in which these systems become unpredictable (chaotic) are, indeed, predictable. In other words, we cannot predict the specific behavior of the system, but we can predict how the system will become chaotic. Or, as Dan Ariely might say, such systems are predictably irrational.
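A minimal sketch of this idea in Python, using the logistic map, a standard textbook example of deterministic chaos rather than one drawn from The Information: the update rule is completely deterministic, yet two trajectories that begin almost identically soon diverge beyond recognition.

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n). At r = 4 the map is chaotic.
def logistic_trajectory(x0, r=4.0, steps=41):
    """Iterate the logistic map from initial condition x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)  # two starting points that differ
b = logistic_trajectory(0.200001)  # by only one part in a million

for n in (0, 10, 20, 30, 40):
    print(f"step {n:2d}: |a - b| = {abs(a[n] - b[n]):.6f}")
```

Although no shortcut predicts where either trajectory will be after many steps, the route by which the map becomes chaotic as r increases (a cascade of period doublings) is itself precisely characterized, which is the sense in which the onset of unpredictability is predictable.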
Unpredictability creates noise. Noise has been targeted for filtering in many disciplines, including audio storage and transmission, atlas representation, telephony, and mathematics. Noise is also bound up with probability: choosing from among the possible meanings of a message is a matter of assigning probabilities to each contending meaning and filtering out those that do not conform to the rules of the system. This choice relates to my work in decision making and the evaluation of truth: there are thresholds for a decision maker’s confidence that a statement is true, just as there are decision rules in a communication system for assessing the likelihood that a received message is the true representation of what was transmitted. Again, the meaning or significance of these decisions is irrelevant to the assessment of confidence or probability. Rather, the weighting of each possible outcome and the quantity of information determine the choice. This is decision analysis and information theory rolled into one.
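A toy sketch of this weighting-and-thresholding logic, with the candidate meanings, probabilities, and confidence threshold all invented for illustration:

```python
# Hypothetical candidate interpretations of a noisy message, with the
# probability assigned to each by some upstream model of the channel.
candidates = {
    "attack at dawn": 0.72,
    "attack at noon": 0.20,
    "a tack at dawn": 0.08,
}

CONFIDENCE_THRESHOLD = 0.5  # decision rule: commit only above this probability

def choose_meaning(candidates, threshold):
    """Pick the most probable interpretation if it clears the threshold."""
    best, prob = max(candidates.items(), key=lambda item: item[1])
    if prob >= threshold:
        return best
    return None  # below threshold: withhold judgment, request retransmission

print(choose_meaning(candidates, CONFIDENCE_THRESHOLD))  # -> attack at dawn
```

As in the discussion above, the rule never consults what the message means; it weighs only the probabilities and the threshold.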
Critique
While Gleick’s explanation of the evolution of information and meaning is interesting and relevant to my own work, I notice little, if any, reference to societal, economic, or cultural influences. Granted, Gleick situates the paradigm shifts within the greater context of history, but he does not address in any depth how the social context was or was not ripe for new theoretical approaches to the timeless problem of defining, representing, communicating, and assigning meaning to information.
I find this omission ironic: Gleick laments the loss of meaning in contemplating communication systems, yet sacrifices the social context surrounding the evolution of information theory to focus on a strictly scientific problem situation.
Perhaps including a counterbalancing example for each instance of technological progress enabled by a paradigm shift would present a more well-rounded context. As a side note, the inclusion and exclusion of various contextual information is itself an example of Gleick’s making order from chaos with decision rules he established for his book.
Conclusion
Since information is omnipresent, more accessible than ever, and constantly increasing, Gleick presents the reader with an outlook on how we will manage and interpret information in the future. He calls for the return of meaning to how we conceptualize the communication of information.
Gleick’s information flood may be interpreted as an overwhelming, chaotic amount of information from which we must ferret out knowledge. Alternatively (and preferably, in my opinion), the flood represents an opportunity to glean new knowledge. This is one of the underlying drivers of big data and the prospect of uncovering previously indiscernible patterns from the massive data stores we continue to accumulate.
Additionally, order can be observed within this torrent of bits because we no longer confine our thinking to the individual. Now we tend to examine systems as groups of individual units that behave according to general rules. We focus on symmetric recursiveness and filter out noise to associate meaning with message, much like Eglash with his fractals and Daston & Galison with their atlas representations of nature. (See my forthcoming term paper.) In doing so, we make choices along the way about what is included in and excluded from consideration in order to reconstruct meaning on the receiving end of the communication system.
Message meaning is uncertain. How do we make decisions when meanings are unclear and must be determined under uncertainty? We rely on the probabilities we assign to the possible outcomes and act in the manner likely to be most beneficial.
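In standard decision-theoretic terms (my gloss, not Gleick’s), this amounts to choosing the action whose expected utility over the possible meanings m of the message is highest:

$$a^{*} = \arg\max_{a} \sum_{m} p(m)\, u(a, m)$$

The probabilities p(m) come from the noisy channel; the utilities u(a, m) are where meaning, set aside in Shannon’s engineering problem, re-enters the analysis.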
References
Gleick, J. (2011). The information: A history, a theory, a flood. London: Fourth Estate.