Chaotic and Strangely Attractive
University of North Carolina at Asheville
There is evidently a strong human tendency to conceptualize reality in terms of objects. This may in part be due to the influence of language. For example, when infants acquire names for things they immediately begin to form hard and fast object categories from which accurate predictions about real physical objects are made (Gelman & Markman, 1986). Members of each such category seem to acquire an underlying "essence." That the formation of firm, linguistically bound classifications occurs so early in life suggests that the tendency to form them is rooted in the biology of the brain itself. Evidently there has been an evolutionary advantage to seeing the world in terms of crisply defined objects. Such an advantage is not difficult to imagine if one thinks about the early history of our species. There are obvious benefits to living in a world which yields clear distinctions, for example, between lions and gazelles, or between ripe red apples and bitter green ones. Monkeys, of course, can do this too. But monkeys cannot distinguish similar but distinct types of flint tools, or variations in the appearance of grain that indicate resistance to blight, and they certainly cannot pass such distinctions on to their offspring.
Thus the world presents itself in terms of objects and categories of objects. Processes and patterns of activity, on the other hand, are seen only as through a glass darkly. This is especially true when it comes to the processual aspects of our own inner lives. Emotion, memory, and thought are such processes, and can only awkwardly be construed as having fixed form, shape, or permanence. Consciousness itself is such a process, or at least those aspects or reflections of it that can be sensed. If one thinks of consciousness in terms of metaphors such as light or clear water, then it has no discernible substance other than that which is disclosed by it, such as stones seen at the bottom of a brook, to use Sartre's analogy. A more process-oriented image, however, would be that of a shaft of light falling across a vortex of airborne dust. The light itself is both emptiness and illumination, but the vortex of dust that gives it definition is awhirl with motion. In the following paragraphs I will try to show that consciousness is similarly awhirl with motion, both at the level of experience and at the level of the neurological events which undergird experience.
States of consciousness.
In the mid-1970s the psychologist Charles Tart (1972, 1975) showed that consciousness can be usefully understood from a systems perspective. In particular he argued that ordinary waking consciousness, or "ordinary reality" as it has come to be called, can be seen as a discrete state of consciousness surrounded by a potentially large number of alternative or "altered" states, as suggested by a well-known passage from William James:
Our normal waking consciousness...is but one special type of consciousness, whilst all about it, parted from it by the filmiest of screens, there lie potential forms of consciousness entirely different. We may go through life without suspecting their existence; but apply the requisite stimulus, and at a touch they are all there in all their completeness. (James, 1902/1929, p. 378)
Tart's basic notion is that a state of consciousness is composed of a harmonious set of psychological functions. These include memory, cognition, sense of humor, sense of self, external perception (exteroception), perception of internal body states (interoception), and so on. Together they form a gestalt-like whole, or in other words a working system. Ordinary waking consciousness is one such system. Dream sleep is another, though in fact several states may be accessible in dreams (Krippner, 1994; Tart, 1969; LaBerge, 1988). Others include an unknown number of ecstatic states that can erupt spontaneously into ordinary consciousness, plus states that are accessible through meditation, the shamanic trance, hypnosis, and myriad drug-induced states, as well as ordinary non-dream sleep.
Tart conceptualized states of consciousness as discrete. At first glance this idea may seem questionable, but it is useful in contrasting different states, and in fact it seems to have some empirical justification. For instance, falling asleep is accompanied by a slow vertical eye roll which marks an abrupt transition of awareness away from exteroceptive stimulation and the outside world and into an internal state of reverie. One abruptly loses awareness of external stimulation such as light or sound (Dement, 1972). The onset of dream sleep is likewise abrupt. Drug intoxication, brought on, for example, by alcohol or marijuana, seems gradual, but it is often heralded by a distinct moment when one first feels intoxicated.
Tart described several types of processes that stabilize a state of consciousness. One of the most important of these is loading stabilization, or in plain English keeping a person busy with activities that support the desired state. For certain ecstatic states this might include chanting, or the repetition of a prayer or mantra, but in the case of ordinary reality it usually means staying productively busy. Anything from doing dishes to mowing lawns, paying bills, building houses, or writing books will do it. Such productive busyness is the passion of our civilization, and we tend to look unkindly on anyone or anything that is not so occupied.
Transitions between states of consciousness can be brought about by disrupting the stabilizing processes and bringing to bear positive patterning forces in the direction of the new desired state. For example, to go to sleep we must quit doing dishes, mowing lawns, and paying bills, and move to a dark and relatively quiet place where natural biological tendencies toward rest and sleep can have their play. The technologies of consciousness, found for example in yoga or shamanism, involve many patterning techniques designed both to disrupt ordinary consciousness and to move the practitioner toward extraordinary states (Combs, 1993, 1995).
To my mind Tart's ideas are of the first order, but they can benefit from more recent advances in the sciences of complexity, which yield more dynamic and fluid conceptions of the nature of systems. For example, a state of consciousness can be reconceptualized as an attractor. Speaking informally, an attractor is a condition to which a system is drawn by its own nature. If a cup is placed slightly tilted on a table, it will roll about in a spiral until it comes to rest standing up. This latter condition is termed a static attractor, because it represents the static position to which the cup is disposed. More interesting are cyclic, or fixed-cycle, attractors. The human heart, for instance, runs through its cycle many times each minute. The moon passes through its various phases each month. These, and many others, are instances of systems that naturally settle into predictable cyclic routines. Most interesting, however, is the class of attractors that are neither fixed nor precisely predictable. These are termed strange or chaotic attractors.
On close inspection the cyclic rhythm of the human heart is found not to be precise, like the motions of a clock, but only approximately so. Its global form is well known and easily recognized, but the precise action of an individual heart differs from beat to beat, thus defying exact prediction. Moreover, it is unlikely that the heart ever, in the strictest sense, repeats itself the same way twice. This situation of global familiarity combined with non-predictability, along with the idea that the system never exactly repeats itself, is exactly what defines a chaotic system, one whose action is described by a strange attractor. Though it is often difficult to completely satisfy these criteria in particular instances (Rapp, 1993), systems of this general type are found abundantly in nature, including biological systems such as the human brain (e.g., Basar, 1990; Pribram, 1994). Here I will refer to them as chaotic or chaos-like.
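The two features just named, a familiar global form combined with exact unpredictability, rest on what is technically called sensitive dependence on initial conditions. A minimal sketch can make this concrete; the illustration is my own, using the logistic map, a standard textbook model of chaos rather than anything drawn from the sources above. Two trajectories that begin a billionth apart soon bear no resemblance to one another, even though both remain within the same familiar range:

```python
# Sensitive dependence on initial conditions in the logistic map,
# x_next = r * x * (1 - x), with r = 4 (a fully chaotic regime).
def logistic_orbit(x0, r=4.0, steps=60):
    """Iterate the logistic map and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.400000000)  # two orbits beginning one part
b = logistic_orbit(0.400000001)  # in a billion apart
gaps = [abs(x - y) for x, y in zip(a, b)]

# The gap grows roughly geometrically until the two orbits decorrelate
# entirely, yet each remains confined to the familiar interval [0, 1].
print("initial gap:", gaps[0])
print("largest gap:", max(gaps))
```

The global form (the interval the orbit visits) stays recognizable, while beat-to-beat prediction fails, which is just the heart-like situation described above.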
States of consciousness as chaotic attractors.
I want to suggest that a state of consciousness is a chaotic attractor, one that brings together the many elements that form it in a way that coalesces them into a unified pattern of activity. The tendency of an attractor to draw its various constituents into a coherent configuration can be seen in the tendency to "fall" into a state of consciousness such as dream or non-dream sleep once we tip into its basin (to use systems terms). This tendency of the attractor to capture consciousness in a particular state goes beyond Tart's original patterning forces and represents an intrinsic dynamic of the system itself. When sleep begins to overtake consciousness we drift off swiftly, carried away into its attractor basin. The swiftness of the transition depends on the steepness of the basin's slope. If we have gone without sleep for a long time the passage may be quick, but if we are worried, or have had too much coffee, it may be slow and troubled.
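The image of steep and shallow basins can be given a rough numerical form. The sketch below is purely illustrative; the quadratic "basin" and its parameters are my own assumptions, not anyone's model of sleep onset. A state variable relaxes toward the bottom of its basin, and the steeper the basin, the swifter the capture:

```python
# Relaxation into an attractor basin: x is drawn toward the bottom (x = 0)
# of a quadratic basin U(x) = k * x**2 / 2. The parameter k plays the role
# of basin steepness: the larger k is, the swifter the capture.
def steps_to_settle(k, x0=1.0, lr=0.1, tol=0.01):
    """Descend the basin's slope and count steps until |x| < tol."""
    x, steps = x0, 0
    while abs(x) >= tol:
        x -= lr * k * x   # move downhill along the basin wall
        steps += 1
    return steps

shallow = steps_to_settle(k=1.0)  # worried, over-caffeinated: slow capture
steep = steps_to_settle(k=5.0)    # exhausted: swift capture
print(shallow, steep)
```

With these particular numbers the shallow basin takes 44 steps to capture the state and the steep one only 7, the numerical analogue of drifting off slowly versus being carried away at once.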
Why view consciousness as a chaotic attractor? The answer is that consciousness as a total event is, as William James (1890/1981) pointed out a hundred years ago, a constantly changing process, clearly not static and not even following a fixed cycle, but nevertheless one that has an identifiable global character, at least for each individual. Memories come and go, thoughts pass through the mind only to disappear and return again later, moods are continually changing, and alertness and energy levels vary from hour to hour. These are the elements of a kind of mental soup, or more accurately a kind of mental weather, with equivalents of the latter's constantly fluctuating temperature, humidity, wind, barometric pressure, and so on.
It is not surprising that weather is chaotic. The elements that compose it, such as temperature, oscillate in an identifiable cycle from day to day, but they cannot be predicted with precision. What is more, it is unlikely that temperature fluctuations ever follow exactly the same course on any two days. Much the same can be said about mental weather. It is formed of the interaction of elements such as moods, thoughts, memories, and so on. These are Tart's original psychological functions. For some, such as moods, there is already empirical evidence that they are chaotic (e.g., Combs, Winkler, & Daley, 1994; Sacks, 1973/1990; Winkler & Combs, 1993), while virtually all are consistent with the general description, above, of chaotic processes. As a group their interaction, like the interaction of the elements of the weather, yields an exquisitely complex process fabric that we know as consciousness. This fabric is far too complicated to describe in detail, but efforts have been made to conceptualize it mathematically as a grand chaotic attractor. The chaos mathematician Ben Goertzel (1994) recently developed the broad conception of a mathematical expression, which he calls the cognitive equation, that represents the entire process structure of an individual's mental life.
Goertzel imagines this structure as operating on two levels, that of the mind and that of the brain. He observes that "the brain, like other extremely complex systems, is unpredictable on the level of detail but roughly predictable on the level of structure. This means that the dynamics of its physical variables display a strange attractor with a complex structure of 'wings' or 'compartments'" (p. 157). These wings or compartments are in effect small attractors that reside in the larger attractor of the overall neurological activity of the brain. They might, for example, be associated with individual states of consciousness. For the sake of clarity I will continue to speak of them as separate, understanding that they are always part of the larger event which is the total process structure of the brain.
Goertzel views mental activity as running on top of the brain process, creating the second level of the system. The mental level is somewhat less finely detailed, however, and more generalized than the neurological level of activity. "If physical level attractors are drawn in ball- point pen, process [mental] level attractors are in magic marker" (p. 158). Nevertheless, the same overall process structure is apparent at both levels.
Conceptualizing states of consciousness as chaotic attractors takes us a good way down the road toward understanding their internal dynamics. For one thing, many complex chaotic systems are self-organizing. For example, a living cell is composed of a rich and complex matrix of chemical cycles which self-organize in such a way as to regulate the overall activity of the cell. In 1974 the biologists Maturana, Varela, and Uribe carried this notion further, suggesting that the total ongoing product of this matrix of activity is no less than the cell itself. In other words, the principal activity of a living cell, when all its complex metabolic activities are summed up, is the continuing creation of itself. These authors termed this process autopoiesis, or self-creation. Living cells are autopoietic systems. So, it turns out, are ecologies, as are many other complex systems, such as the international economy, and even human societies (Laszlo, Csányi, Combs, & Artigiani, in press).
I believe that consciousness, and in particular a state of consciousness, is also an autopoietic event (Combs, 1993, 1994, 1995). Like a living cell, it is made up of complex processes that interact in such a fashion as to create, as a net result, the state itself. It is not hard to see how this works. When we are depressed, for instance, we tend to selectively recall unhappy events (Bower, 1981), while these in their turn contribute to the mood of depression. On the other hand, a good mood facilitates happy memories, which support a positive disposition. Beyond this, the very context created by a particular state of consciousness conspires to support each of its own elements. Consider, for example, marijuana intoxication (Tart, 1971). A decline in short-term memory contributes to an inability to sustain concentration for significant periods of time, while also supporting a style of cognition that relies more on intuition and imagination than discursive thought. This in turn fosters the style of inane yet imaginative humor so characteristic of the state.
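The mood-memory loop just described can be caricatured with two mutually producing variables. The toy model below is my own construction, not drawn from Bower (1981): mood biases recall and recall feeds mood, and whichever sign the loop begins with, it settles into a self-sustaining state of that sign, a miniature autopoietic attractor:

```python
import math

def settle(mood, gain=2.0, steps=50):
    """Iterate the mood <-> memory loop until it stabilizes."""
    for _ in range(steps):
        recall = math.tanh(gain * mood)   # mood biases which memories surface
        mood = math.tanh(gain * recall)   # recalled memories feed the mood
    return mood

print(settle(0.05))   # a faint good mood grows into a stable positive state
print(settle(-0.05))  # a faint bad mood deepens in just the same way
```

The point of the sketch is only that mutual support between elements, once the loop's gain is high enough, maintains the state as its own net product.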
This kind of self-regulating, essentially autopoietic activity is seen at all levels of the system. For instance, beliefs rush to support each other even if they have to be invented on the spot, and even if they fail to form a logically coherent cloth. It is well known in social psychology that disproving a valued belief, say, that the Second Coming is at hand, does not eliminate it. Rather, it is modified, extended, and reissued with new force (e.g., Festinger, Riecken, & Schachter, 1956).
Interestingly, psychological stages of development, like states of consciousness, also appear to be autopoietic. Each is constructed and supported by a process fabric. In the case of Piaget's developmental epistemology, this fabric is woven of behavioral and mental schemata (Flavell, 1963; Gruber & Voneche, 1977), which wind together in a mutually supportive fashion. For instance, the schema or concept of conservation, which specifies that matter does not appear or disappear out of nowhere, is supported by the schema of reversibility, the ability to perform mental operations in reverse. If a child is presented with the puzzling observation that water poured from a tall, thin glass into a wide glass does not fill the latter to the level of the original, he or she can imagine the water poured back into the tall glass and note that its level returns to the original height. The point here is simply that the mental structures which undergird consciousness, whether they be states of consciousness or developmental stages, seem to represent a single common regimen, that of an autopoietic process matrix.
The edge of chaos.
The basic idea here is that consciousness comes in units, termed states, each woven of psychological processes, or functions, such as memory, emotion, cognition, one's sense of humor, one's sense of self, and so on. Here I want to momentarily consider these elements in a more formal fashion, as simply forming a self-organizing assembly of constituents that interact with each other in a rich variety of ways. The European systems theorist George Kampis (1991) has pondered such systems in detail. He terms them component-systems, and argues that they produce new and creative combinations that cannot, even in principle, be predicted by computational procedures such as those performed by computers. The reason for this fundamental creativity is that the interactions of the component parts of these systems (processes, to be more accurate) tend during their ordinary activity to build novel structures (processes) while at the same time destroying some of those already present. These new elements in turn interact with each other and with the previously existing ones to produce further components not foreseeable from the original constituents. The net result is a rolling autopoietic event in which old structures are destroyed and genuinely new ones routinely come into existence.
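Kampis's picture of components that create and destroy one another can be sketched in the abstract. In the deterministic toy below, which is my own construction and not Kampis's formalism, the "components" are strings; each generation, neighboring components fuse to produce new ones while the oldest decay, so the population soon consists entirely of members that did not exist at the start:

```python
def generation(components):
    """One round of a toy component-system: adjacent components interact
    to produce new components, while the two oldest decay away."""
    produced = [a + b[0] for a, b in zip(components, components[1:])]
    return components[2:] + produced

soup = ["a", "b", "c", "d"]
for _ in range(3):
    soup = generation(soup)

print(soup)  # none of the original four components survive unchanged
```

After three generations the soup holds eleven components, none of them among the originals: a crude image of a rolling process that continually remakes its own constituents.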
It is not hard to imagine how this works in day-to-day reality. The thoughts, memories, and feelings of one moment combine to create the thoughts, memories, and feelings of the next, and so on for each moment of our lives. Thus, like Heraclitus' river, our inner lives are constantly in flux, never the same twice. But wait a moment; this is precisely what one would expect of a chaotic system. In this regard, it turns out that Goertzel's (1994) notion of the cognitive equation, a chaotic attractor, has a great deal in common with Kampis' idea of a component-system. Indeed, Goertzel was both influenced and encouraged by Kampis' thinking in the development of his own work.
In an interesting twist on this whole line of thought, Goertzel considers the problem of how such a network of interacting processes could find the limited though real constancy that characterizes our inner lives. To this end he offers the "productivity hypothesis," according to which "within a high degree of approximation, every mental process X which is not a pattern in some other mental process, can be produced by applying some mental process Y to some mental process Z, where Y and Z are patterns in some other mental process" (p. 157). In other words, any mental process that stands alone can be produced by the interaction of other mental processes, themselves the products of still other mental processes. Thus, for example, particular ideas and beliefs are the products of the interaction of other ideas and beliefs, themselves dependent upon still other interactions. In a different example, a particular emotion may result from the interaction of certain memories with certain thoughts. Alternatively, the interaction of the emotion with the memories might create the thoughts. Whether the combination of the emotion and the thoughts would actually recreate the memory is perhaps more questionable, but there is considerable evidence that memory is much more a product of construction than would have been guessed a few decades ago (e.g., Loftus & Hoffman, 1989; Winograd & Neisser, 1992).
Goertzel is a computationalist, and does not agree with Kampis that the creativity of such a system as the human mind cannot, in principle, be modeled by a computer program. The question of whether the properties of consciousness can be represented computationally is the source of considerable debate today (e.g., Churchland, 1984; Dennett, 1991; Penrose, 1994), but it need not distract us at the moment. Here I want to return to Goertzel's notion that the cognitive equation, which represents the complex process fabric of consciousness, is chaotic. The idea that the processes which configure consciousness are chaotic is a broad one, and does not mean that they exhibit noticeable chaotic properties at every moment. Indeed, they do not. Even a relatively simple chaotic function such as the Lorenz attractor does not necessarily appear chaotic on brief observation. More to the point, psychological processes (e.g., Combs, Winkler, & Daley, 1994; Combs, 1995) as well as the neurological events that undergird them (e.g., Basar, 1990; Pribram, 1994) seem to move back and forth between relatively chaotic patterns of activity and more cyclic or even static ones. For example, moods can oscillate in a regular and predictable rhythm over the course of the day. Such a pattern, however, could not last indefinitely without perturbations contributing to more chaotic periods of activity. Moods are an obvious example, but there is every reason to suspect that functions such as memory, thought, dreams, and general arousal also exhibit periods of constant or rhythmic activity, and other periods of more chaotic change. If this is correct, then the overall process fabric of consciousness would likewise be expected to exhibit periods of calm, periods of more or less regular oscillation, and pronounced periods of chaotic activity. The long view of such a regimen would be that of a very complex mathematical attractor such as that suggested by the cognitive equation. A shorter view, however, would disclose a system moving in and out of chaos.
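That alternation between rhythmic and chaotic regimes can again be illustrated with the logistic map; the illustration is mine, and the cognitive equation itself is of course far richer. Nudging a single control parameter moves the very same system from a steady two-beat rhythm into chaos:

```python
def orbit_tail(r, x0=0.2, transient=500, length=12):
    """Iterate the logistic map past its transient, then record a stretch."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    tail = []
    for _ in range(length):
        x = r * x * (1.0 - x)
        tail.append(x)
    return tail

def period_two_gap(tail):
    """How far the orbit is from repeating itself every second step."""
    return max(abs(a - b) for a, b in zip(tail, tail[2:]))

print(period_two_gap(orbit_tail(r=3.2)))  # essentially zero: a regular two-beat rhythm
print(period_two_gap(orbit_tail(r=3.9)))  # large: the same system gone chaotic
```

Nothing about the system's equation changes between the two runs; only a parameter shifts, which is the sense in which a single process fabric can show periods of regular oscillation and periods of chaos.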
Such systems on the edge of chaos are not unknown in the sciences of complexity. They may in fact turn out to be common, if not universal, among self-regulating systems such as living cells, ecologies, and brains. Discussing the evolution of biological systems, Stuart Kauffman (1993) recently observed that "selection achieves and maintains complex systems poised on the boundary or edge between order and chaos. These systems are best able to coordinate complex tasks and evolve in a complex environment" (p. xv).
The ability of a system to move in and out of chaos gives it a creative advantage. It is capable of shifting from a steady or cyclic routine to one that generates novel emergent properties, whether these be original ideas or perceptions, new patterns of behavior, or novel emotional responses. Moreover, there is a tenacious resilience to a chaotic regimen that is absent in routinized behavior. This is seen in the excited dance of both prizefighters and neurons. We meet challenge effectively, not by assuming a fixed posture, but by moving in a dynamic cross-step that absorbs blows, counterpoises against shifting ground, and yields new and unexpected rejoinders. From a theoretical point of view, chaos protects a system from getting stuck in small grooves, or attractors, and thus failing to find larger, more effective outcomes. For instance, a memory search can be thought of as a trip through neural "state space" in search of the correct memory attractor (Abraham, 1995; Freeman, 1991; Skarda & Freeman, 1987). If the system settles into the attractor of a wrong solution, subsequent recall will be incorrect. What is needed is a process that keeps it from settling down too quickly in the first attractor basin that comes along. This process is chaos. One can easily imagine it operating in a similar fashion during the search for a solution to a mathematical or linguistic problem, or during a quest for the right artistic expression. Chaos is the antidote to stasis and stagnation.
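The role of chaos in keeping a memory search from lodging in the first basin it meets can be mimicked with noise in a toy energy landscape. The sketch is entirely my own and only an analogy; real neural search, as in Freeman's models, is dynamical rather than random. Here a greedy descent settles into a shallow "wrong memory" well, while a search that keeps jittering eventually samples the deeper, correct basin:

```python
import random

# A toy "state space" of 100 states: a shallow wrong-memory well at
# index 10 (depth 1.0) and a deep correct-memory well at index 70 (depth 0.0).
def energy(i):
    return (i - 10) ** 2 * 0.01 + 1.0 if i < 40 else (i - 70) ** 2 * 0.01

def greedy(start):
    """Always step downhill; halts in the first basin encountered."""
    i = start
    while True:
        best = min((n for n in (i - 1, i + 1) if 0 <= n <= 99), key=energy)
        if energy(best) >= energy(i):
            return i
        i = best

def jittery(start, steps=500, seed=1):
    """Keep proposing random states; remember the best one sampled."""
    rng = random.Random(seed)
    best = start
    for _ in range(steps):
        candidate = rng.randrange(100)
        if energy(candidate) < energy(best):
            best = candidate
    return best

stuck = greedy(8)    # lodges in the shallow well near index 10
found = jittery(8)   # the jitter escapes that basin
print(stuck, energy(stuck))
print(found, energy(found))
```

The greedy search is the fixed posture; the jittery one is the dynamic cross-step, and it ends in a deeper basin than the greedy search can ever reach from this starting point.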
References

Abraham, F.D. (1995). Dynamics, bifurcations, self-organization, chaos, and mind. In R. Robertson & A. Combs (Eds.), Proceedings of The Society for Chaos Theory and the Life Sciences. Lawrence Erlbaum.
Basar, E. (Ed.). (1990). Chaos in brain function. Berlin: Springer-Verlag.
Bower, G.H. (1981). Mood and memory. American Psychologist, 36, 129-148.
Churchland, P.S. (1984). Matter and consciousness. Cambridge, MA: Bradford Books, MIT Press.
Combs, A. (1993). The evolution of consciousness: A theory of historical and personal transformation. World Futures: The Journal of General Evolution, 38, 43-62.
Combs, A. (1994). Psychology, chaos, and the process nature of consciousness. In F. Abraham and A. Gilgen (Eds.). Chaos Theory in Psychology. Westport, CT: Greenwood Pub.
Combs, A. (1995). The radiance of being. Edinburgh, Scotland: Floris.
Combs, A., Winkler, M., & Daley, C. (1994). A chaotic systems analysis of circadian rhythms in feeling states. The Psychological Record, 44, 359-368.
Dement, W.C. (1972). Some must watch while some must sleep: Exploring the world of sleep. New York: W.W. Norton.
Dennett, D.C. (1991). Consciousness explained. Boston: Little, Brown.
Festinger, L., Riecken, H.W., Jr., & Schachter, S. (1956). When prophecy fails. Minneapolis: University of Minnesota Press.
Flavell, J. H. (1963). The developmental psychology of Jean Piaget. New York: Van Nostrand.
Freeman, W.J. (1991, February). The physiology of perception. Scientific American, 78-85.
Gelman, S.A., & Markman, E.M. (1986). Categories and induction in young children. Cognition, 23, 183-209.
Goertzel, B. (1994). Chaotic logic. New York: Plenum.
Gruber, H.E., & Voneche, J.J. (Eds.). (1977). The essential Piaget. New York: Basic Books.
James, W. (1890/1981). The principles of psychology. Cambridge, MA: Harvard University Press.
James, W. (1902/1929). The varieties of religious experience. New York: Modern Library.
Kampis, G. (1991). Self- modifying systems in biology and cognitive science. New York: Pergamon.
Kauffman, S.A. (1993). The origins of order. New York: Oxford University Press.
Krippner, S. (1994). Waking life, dream life, and the construction of reality. Anthropology of Consciousness, 5(3), 17-23.
LaBerge, S. (1988). Lucid dreaming in western literature. In J. Gackenbach & S. LaBerge (Eds.), Conscious mind, sleeping brain (pp. 11-26). New York: Plenum.
Laszlo, E., Csányi, V., Combs, A. L., & Artigiani, R. (in press). The evolution of cognitive maps: New paradigms for the 21st century. London: Adamantine Press.
Loftus, E.F., & Hoffman, H.G. (1989). Misinformation and memory: The creation of new memories. Journal of Experimental Psychology: General, 118, 100-104.
Maturana, H.R., Varela, F.J., & Uribe, R. (1974). Autopoiesis: The organization of living systems, its characterization and model. Biosystems, 5, 187-196.
Penrose, R. (1994). Shadows of the mind. Oxford: Oxford University Press.
Pribram, K.H. (Ed.). (1994). Origins: Brain and self organization. Hillsdale, NJ: Lawrence Erlbaum.
Rapp, P. (1993). Chaos in the neurosciences: Cautionary tales from the frontier. Biologist, 40(2), 89-94.
Sacks, O. (1973/1990). Awakenings. New York: Harper.
Skarda, C.A., & Freeman, W.J. (1987). How brains make chaos in order to make sense of the world. Behavioral and Brain Sciences, 10(2), 161-195.
Tart, C.T. (1969). The high dream. In C. Tart (Ed.), Altered states of consciousness (pp. 171-176). New York: Doubleday.
Tart, C.T. (1971). On being stoned. Palo Alto, CA: Science and Behavior Books.
Tart, C.T. (1972). States of consciousness and state- specific sciences. Science, 176, 1203- 1210.
Tart, C.T. (1975). States of consciousness. New York: E.P. Dutton.
Winograd, E., & Neisser, U. (Eds.). (1992). Affect and accuracy in recall. Cambridge: Cambridge University Press.
Winkler, M., & Combs, A. (1993, July). A chaotic systems analysis of individual differences in affect. Paper presented at the 24th Interamerican Congress of Psychology, Santiago, Chile.