Wild Computing -- copyright Ben Goertzel, © 1999


Chapter 4:
A Fourfold Model of Information Space

1. The Four Levels of Being

I have posited a fairly close structural and dynamic mapping between the Internet, the human brain, and mind in general. To make this mapping explicit, I will now venture into the domain of philosophy, and present a "fourfold model of information space", applying to digital and neural information equally, and derived substantially from the four "levels of being" described by the contemporary philosopher Kent Palmer (1997): pure being, process being, hyper being and wild being ... the latter category being the source of the name for this book. Internet intelligence is bringing us a new kind of computing which spans all philosophical levels.

A brief note on the role of philosophy in this kind of inquiry may be appropriate here. Cognitive science is generally defined as an interdisciplinary study involving a number of fields, including psychology, computer science, linguistics and philosophy. However, transmission of ideas from the technical disciplines to philosophy is generally much more common than transmission in the reverse direction. I believe that cognitive science, and perhaps science in general, would benefit from paying more serious attention to philosophical concepts. Of course, philosophy cannot solve scientific problems, but it can point science in interesting and useful directions, saving time that would otherwise be spent exploring conceptually barren "dead ends." To make a moderately loose analogy, one might say that the relation between philosophy and science is like the relationship between design and coding in software engineering. Design doesn't tell you how to code, but it gives you the structure according to which the coding will proceed. While coding, you will often discover new things that force you to go back and re-design, but this doesn't mean that the design process was useless. Similarly, philosophy doesn't tell you how to do science, it just gives structures according to which science can proceed; and when the empirical or mathematical results feed back and modify the philosophy, this is merely part of the form-generation process.

Palmer's four levels of being are:

  1. static being, that which is present, existent

  2. process being, that which is dynamic, changing, becoming

  3. hyper being, emergent, autopoietic being, that which achieves semi-stasis emergent from dynamics

  4. wild being, a meta-field of overlapping fields of stasis and dynamics, involving interaction of numerous emergent beings

These are extremely general concepts which, nevertheless, can be quite useful for conceptualizing particular situations, such as the human brain, and the Internet.

Traditional science focuses on static being and process being. Differential equations, as used in physics, represent the most refined possible understanding of process being. Chaos and complexity science, on the other hand, represent an understanding of hyper being. The notion of an attractor, a structure emergent from complex or chaotic dynamics, is moving toward hyper being -- and the notion of a class of attractors, a type of attractor, is hyper being par excellence. The word "hyper" here evokes the fact that this is neither just being, nor just becoming, but is rather being overlaid on becoming -- becoming becoming being! A given type of attractor -- e.g. a spiral, say -- may come out of an unlimited number of dynamical systems; it is a form, a definite entity, which exists tenuously as an emergent layer atop an underlying reality of process. An autopoietic system is also an example of hyper being; and it is no coincidence that autopoietic systems are full of archetypal attractor structures -- this is the compositionality of hyper being, the tendency of emergent forms to combine to form yet greater emergent forms, a tendency which culminates in the dual network structure of the mind.
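To make the notion of an attractor emerging from dynamics slightly more concrete, here is a minimal numerical sketch in Python. The logistic map and the parameter value r = 3.2 are illustrative choices only, not anything specific to the systems discussed in this book: many different starting points settle onto the same small set of values, a form that exists only as a regularity atop the underlying process.

```python
# Minimal illustration of an attractor emerging from a dynamical process.
# The logistic map x -> r*x*(1-x) with r = 3.2 is an arbitrary illustrative
# choice; almost every starting point settles onto the same period-2 cycle.

def logistic(x, r=3.2):
    return r * x * (1 - x)

def settle(x0, steps=1000):
    """Iterate the map from x0 and return the pair of values it settles onto."""
    x = x0
    for _ in range(steps):
        x = logistic(x)
    a = logistic(x)
    b = logistic(a)
    return sorted((round(a, 6), round(b, 6)))

# Many different initial conditions, one emergent form:
for x0 in (0.1, 0.25, 0.5, 0.77, 0.9):
    print(x0, "->", settle(x0))
# Every line prints (up to rounding) the same pair of values: the attractor
# is a property of the dynamics, not of any particular starting state.
```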

Wild being, finally, is hardly confronted by modern science at all, though it is discussed extensively, under various names, in existentialist, postmodern and Eastern philosophy. Every empirical scientist has had numerous interactions with this type of being, in the form of raw data. Raw data, from a fairly uncontrolled experiment, will have all sorts of interesting patterns popping out of it, but none of them quite strongly enough to be ascertained as definitely existent. In order to do away with this field of possibilities, the scientist focuses the experiment, and thus achieves hyper being -- a definite trend, a pattern, emergent from the dynamical uncertainties of the actual phenomenon being studied.

In the mind, one may say that wild being plays around the edges of the dual network, arising as different parts of the dual network transform each other and reinforce or decrease one another's structural rigidity. Wild being plays with autopoietic structures as process being plays with rigid, static reality.

2. Internet Information Space

In this section I will use these philosophical ideas to give a fairly concrete model of Net information space, based on the intuitive concept of the Net as a self-organizing agent system. The four levels of being, we shall see, provide an abstraction of the Internet on a par with the neural net abstraction of the human brain, avoiding engineering specifics, but acknowledging the asynchronous nature of Internet communication, and the heterogeneity of the population of computational processes comprising the Net. The ideas of this section owe a great deal to the "agent-data-event" model of Internet information space developed by John Pritchard, and the presentation "Thinking Through Cyberspace" given (and placed on the Web) by Kent Palmer in 1997.

In an Internet context, the role of static being is fulfilled by nodes and connections -- by the physical substrate of the Net: machines with processors and RAM, and the cables running between them. This is what is present, what is given. Of course, it is also changing, but it is changing on a much slower scale than the information within the Net, so from the point of view of an initial conceptual model, it may be approximated as static.

The feedback between the static realm and the dynamic realm existing within it is interesting, but is different in nature from the faster feedback occurring within the static realm. It seems to be the case on general, system-theoretic grounds that, for interesting things to evolve and grow in a world (real or virtual), there must be a part of the world which is not allowed to evolve and grow, but is fixed.

The dynamic, process aspect of Net reality comes in two forms: the information flowing back and forth across inter-computer connections, and the computational processes carried out at individual nodes of the network. Hyper being, on the other hand, is represented by agents living within the Internet. Agents "surf" on the dynamics of node-resident processes and information flow. Their computation does not reside entirely at any particular node, and would not exist without the continual flow of data, both within individual machines and among different machines.

The distinction between an agent and a computational process resident at a node is fuzzy rather than crisp, but is nevertheless important (much like the distinction between ``living'' and ``nonliving'' matter in biology). An ``agent'' is, in essence, a computational process whose activity is not confined to any single node: it surfs on node-resident processes and on the information flowing between machines, rather than residing wholly at any one of them.

It almost goes without saying that these ``agents'' may be either human or artificial; this distinction will not be an important one here, as we will seek a framework that allows us to characterize human and artificial agents in a uniform way, as parts of an overall matrix of agent-event determination. The distinction between human and artificial agents is a fuzzy one anyway, as all artificial agents are at present designed by humans and hence reflect human goals and biases; and all human interactions with the Internet occur through the medium of artificial computing ``agents'' such as Web browsers and search engines, which increase in their intelligence and autonomy each year. Rather than distinguishing between human and artificial agents, a better approach would be to say that each agent is characterized by a certain amount of ``human-directedness''; but for the present, this concept will not play a role in our considerations either. An agent is defined behaviorally: in terms of the activities that it carries out at each time, based on the information it has received in the past.

Agents do things, they are the nexus of intelligence in information space; nodes, on the other hand, are static and dynamic data resources. Network data is information available via network activity. One might call a static resource a store node and a dynamic resource a processing node; some nodes can be both store nodes and processing nodes. An agent retrieves information by calling on a node. Recursive retrieval occurs when a processing node calls on other nodes for processing or storage retrieval; when recursive retrieval becomes sufficiently complex it achieves the level of hyper being rather than dynamic being. Currently, recursive retrieval plays only a minor role in the Internet economy, but this can be expected to change: in the future, the Website that accepts your money and provides you with your service may not actually do any of the work that you've paid for -- it may subcontract out the work to the highest bidder from a number of artificial agents representing other machines.
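The vocabulary of the last few paragraphs -- agents, store nodes, processing nodes, recursive retrieval -- can be summarized in a small schematic sketch. The class names and the simple "subcontracting" logic below are illustrative inventions, not a description of any actual Internet software:

```python
# Schematic sketch of the agent / node vocabulary used above.
# All class and method names are illustrative only.

class StoreNode:
    """A static data resource."""
    def __init__(self, data):
        self.data = data
    def retrieve(self, key):
        return self.data.get(key)

class ProcessingNode:
    """A dynamic resource that may call on other nodes: recursive retrieval."""
    def __init__(self, subcontractors):
        self.subcontractors = subcontractors   # other store or processing nodes
    def retrieve(self, key):
        # Recursive retrieval: delegate to whichever node can answer.
        for node in self.subcontractors:
            result = node.retrieve(key)
            if result is not None:
                return result
        return None

class Agent:
    """Defined behaviorally: its next action depends on the information it has
    received so far, not on residing at any particular node."""
    def __init__(self):
        self.history = []
    def act(self, node, key):
        result = node.retrieve(key)
        self.history.append((key, result))
        return result

# Usage: an agent calls on a processing node, which subcontracts the work out.
store = StoreNode({"price": 42})
broker = ProcessingNode([store])
agent = Agent()
print(agent.act(broker, "price"))   # -> 42, obtained via recursive retrieval
```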

This terminology may easily be related to more standard Internet terminology. Events, in the Internet context, are information transactions, carried out by agents or by processes resident at individual nodes. When an event has an effect on data form or content one may say that it is a transformative event, and that the agent effecting the event is an actor. When an event has no effect on the state of network information except to provide an agent with a local copy of some data, then the agent has effected a retrieval event. Transformative events have effects on network data subsequently available as processor or agent events, and so we may say that agents behave as distributed processors in their effects on nodes (remembering the fuzziness of the distinction between node-resident processing and agent action).

Examples of events propagated by agents are replies to email or additions of hyperlinks in Web pages. Much of the activity carried out by artificial agents is currently retrieval-oriented, but as the Internet develops, transformative events will become more and more common, and the distinction between the two types of events may ultimately vanish. For instance, to an intelligent search engine, each query represents a piece of information regarding the humanly-perceived structure of the Web, and thus potentially leads to modifications in the search engine database; a prototypical retrieval event, searching for a Web page, then becomes a transformative event as well.
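The blurring of the two event types can be made concrete in a minimal sketch: a query not only returns results (a retrieval event) but also nudges the weights the engine keeps on its pages (a transformative event). The scoring and update rules below are toy inventions for illustration, not how any particular search engine actually works:

```python
# Toy sketch: a retrieval event (a query) that is also transformative,
# because answering it updates the engine's own database of weights.
# Scoring and update rules are invented for illustration only.

class ToySearchEngine:
    def __init__(self, pages):
        self.pages = pages                        # url -> set of terms
        self.weight = {url: 1.0 for url in pages}

    def search(self, query_terms):
        query = set(query_terms)
        scored = []
        for url, terms in self.pages.items():
            overlap = len(query & terms)
            if overlap:
                scored.append((overlap * self.weight[url], url))
        scored.sort(reverse=True)
        hits = [url for _, url in scored]
        # Transformative side-effect: pages returned for this query gain a
        # little weight, recording the humanly-perceived structure of the Web.
        for url in hits:
            self.weight[url] *= 1.05
        return hits

engine = ToySearchEngine({
    "a.html": {"internet", "mind"},
    "b.html": {"internet", "brain", "mind"},
})
print(engine.search(["internet", "mind"]))   # a retrieval event ...
print(engine.weight)                          # ... that has changed the database
```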

It is interesting to observe that the dynamics of a network of agents in information space is essentially nondeterministic, in the sense that there is no way to determine the overall state of the system of agents at any given time. One might think that, since we are dealing with digital systems at this point in the evolution of computer technology, the dynamics of an Internet agent system could be understood by experimentation -- by manipulating an active network of data, involving continuous agent activity and continuous external modification of data. Such experimentation would require the collection of a periodic global state image, however, and the collection of this state image, in practice, would inadvertently alter the very network being measured, by interfering with system operations (much as, in quantum physics, measuring a system inevitably disturbs its state). Halting the network of agents to measure the overall state is not a viable option either: this is difficult without building a system that synchronizes all activity, and synchronization cannot be imposed on agent communication and retrieval if the network is to remain what it is by definition -- an asynchronous, packet-switching network. The asynchronous network does employ lazy synchronization, in the form of collection or set ordering for the encapsulation of messages into packets; but many such connections exist simultaneously, which implies that node reading, processing and writing operations (effects) occur in effectively random order relative to one another. In short, the very nature of the Internet makes global state measurement difficult, and results in a system that is in practice globally nondeterministic.
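This kind of nondeterminism can be felt even in a toy setting on a single machine. In the sketch below (an illustration only, with made-up agent names), a handful of asynchronous "agents" write to a shared log with uncoordinated timing; the interleaving of their events generally differs from run to run, so no snapshot reflects a single well-defined global order.

```python
# Toy illustration of the nondeterminism described above: several asynchronous
# "agents" update shared state, and the interleaving differs from run to run.
import threading
import random
import time

log = []
lock = threading.Lock()

def agent(name, steps=3):
    for i in range(steps):
        time.sleep(random.uniform(0, 0.01))   # uncoordinated, asynchronous timing
        with lock:
            log.append(f"{name}:{i}")         # a local "event" hitting shared state

threads = [threading.Thread(target=agent, args=(f"agent{k}",)) for k in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(log)   # the order of events generally differs on every run
```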

What we have, then, is a vision of the Net as a complex system of agents passing messages around a graph of nodes, interacting with the computational processes and the static data stored at nodes, and giving rise to dynamical patterns that can be understood only partially by any particular observer. Understanding and utilizing these subtle dynamical patterns becomes an essential part of software practice.

And here we get to the crux of the matter. Up till now, the focus of writers and thinkers and programmers in the area of Internet intelligence has been on the intelligence of individual agents. But, this is a limited approach. What is more interesting and ultimately more fruitful is to think of the intelligence of the overall dynamic graph that is the Internet -- of the patterns of emergent computation binding together the different agents and computational processes on the net. And this brings us beyond the level of hyper being into the more intriguing territory of wild being. Wild being is what happens when a population of intelligent agents interact within a richly fluctuating network of information. It is intrinsically social, in the sense that it is made of interacting subjective perspectives; it is social in the same sense that the individual human brain is, being made up of the interacting subjective perspectives of the numerous sophisticated and specialized modules within it.

3. Net Versus Brain

It is simple enough to see that the Net, as a whole, has the very same network structure that modern AI theorists, with their neural nets and semantic networks, have simulated within individual serial computers for the purpose of modeling brain and mind processes. The Internet's nodes are more complex than individual neurons, having more of the computational power of neuronal modules. And, the packets sent around the Net are more complex than the jolts of charge sent around the brain -- they are more similar, perhaps, to coherent, nonlinear spatially-distributed electromagnetic waves passing through the brain. But, in essence, they are quite similar systems: they are self-organizing networks, in which information is carried and computation is carried out both locally and in a global, distributed fashion. While the early cyberneticists set out to build a brain based on the neuron level of abstraction, what has happened is that we have built a brain based on the neuronal module level of abstraction. Adapting our mathematics and design methodologies to this fact is not a trivial task, but nor is it an impossible one. "The network is the computer is the mind" is the important thing -- and not so much what kind of computer or what kind of network happens to be involved.

Analysis in terms of the levels of being allows us to take this analogy a little deeper. In the brain, static being corresponds to the wiring of the brain, the layout of neurons, given at birth via evolution and pre-natal self-organization, and evolved to a lesser degree during the first years of life (Edelman, 1988). Dynamic being corresponds to electricity coursing through the brain, from neuron to neuron; and to the flow of neurotransmitters and other associated chemicals through the brain, and from the brain to the body. Hyper being corresponds to neural attractors, and classes of neural attractors -- structures that emerge from the dynamics of neural subnetworks of the brain. Different neural subsystems have different attractors, and communicate with each other by methods such as frequency synchronization (Koch, 1994), which depend sensitively on attractor structure. Furthermore, though this is as yet little understood, there are mechanisms for shifting attractors from one part of the brain to another. For instance, when an animal loses part of its visual cortex, another part of its brain may reshape itself into an image of the lost region. Or, in some cases, the processes enabling knowledge of a skill may move from one part of the brain to another as other skills accumulate, taking up room.

And wild being, finally, corresponds to the overall dynamics of the brain, which is a matter of overlapping subnetworks, each one imposing its own view on the remainder of the brain via complex, organized activity. Monitoring the brain's global activity is very difficult precisely because it is "wild" in this sense. Each EEG wave has numerous possible generators, and is usually a nonlinear superposition of different generators. The activity in any one part of the brain is not comprehensible in terms of that part alone, but only in terms of the views of that part of the brain held by the other parts of the brain that interact with it.

We see, then, that there is a harmony between the brain and the Internet on a fairly deep philosophical level. In both cases, we have a fixed underlying substrate consisting of nodes and links -- neurons and synapses on the one hand, computers and cables on the other. The granularity is finer in the brain; but on the other hand, if one accepts the "Neural Darwinist" postulate of brain modularization, then the basic module of brain structure is not the neuron but the neuronal group, which is more comparable in scale to a single PC. In both cases, one has two kinds of dynamics constituting the process level of being -- dynamics within the individual neuron (as modeled by the Hodgkin-Huxley equation) or neuronal module, together with electrical and chemical flow between neurons; and computation within the individual computer, together with flow of sound and light along cables between computers.

On the hyper being level, in both cases, one has complex entities that span numerous nodes and links to carry out their work. Also, in both cases, many of these entities have a certain fixed location around which their distributed operation centers -- e.g. the home node of a Web spider; the physical location of a neuronal group with a certain attractor. There is a greater potential variety to the types of emergent entities -- computational agents -- in the Internet than in the brain. On the other hand, precisely because the range of emergent entities is more restricted in the brain, the crucial "compositionality" by which emergent entities build together to form more and more complex emergent entities may come more easily in the brain than in the Net.

In both cases, one has the potential for the emergence of wild being from the interpenetration of numerous emergent entities. The challenge for Internet intelligence engineers is to replicate the property of the brain whereby a population of emergent "hyper being" entities arises whose members interact and combine into composites richly enough to give rise to wild being.

This is an issue which will recur throughout this book. The WebMind architecture, to be described below, represents a solution to this problem via a proprietary software architecture. The Agents Interaction Protocol, also described below, represents a sketch of a public-domain solution to the problem, a solution which is neutral with regard to the specific implementation of agents, consisting of a general "glue" language that makes it easier for different kinds of agents to act cooperatively as composites, thus contributing to the leap from hyper being to wild being.

Finally, in both cases, one has the potential for a "looping back" by which the wild dynamics of the whole loops back and affects the most basic, static level of being. However, the feedback works quite differently in the two examples. In the case of the Net, such feedback occurs when high-volume regions of the computer network get more cable, more computers, etc. It also occurs when systems break down due to excessive demand. In the case of the brain, on the other hand, this happens in youth when stimulation of brain regions causes growth in those regions; and in adulthood when drug use causes brain connections to atrophy. This is an area where the Net excels over the brain: the human brain has hardware which grows in youth, and then decays throughout adulthood, whereas the Net will be able to continue growing on the static level even as it develops through successive stages of maturity on the higher levels of being. This is a capability that will have to be managed carefully, but that will doubtless have tremendous consequences.

4. The Psynet in Biological and Computer Networks

In the psynet model of mind, the two main processes of mind are said to be evolution and autopoiesis. Autopoiesis, we have said, is closely aligned with hyper being -- it is the emergence of "semi-stable" structures from underlying fluctuations. Ecological evolution, acting on the field of autopoietic subsystems of the dual network, is both process being and wild being. It is a process creating new forms toward specific purposes, and it is also an aim-transcendent generative process, producing archetypal and incidental forms which are retained in the mind and serve as seeds for future self-organization of autopoietic structures.

The Internet itself has a naturally hierarchical structure, expressed in the domain name system and in the patterns of network connectivity. Individual machines naturally cluster with the other machines on their intranet, then with the other machines in their nation or continent, and lastly with other machines around the world. This clustering is a consequence of limited network bandwidth, and may disappear as engineering improves; but it is a present fact, and a fact which may help rather than hinder the development of intelligence. The relative but not complete isolation of intranets provides an analogue of the clustering of neuronal modules into meta-clusters in the brain, and (as we shall see later in the discussion of Webmind) provides a lower-level impetus for the formation of autopoietic agent systems on the intranet level.

And, of course, there is a heterarchical structure to the Net as well. This is the obvious sprawling hyperlink structure of the Web, which is traversed and exploited by Web agents such as spiders, e-mail agents, e-commerce agents, etc. The heterarchy is indeed aligned with the hierarchy, in that machines within the same hierarchical domain are likely to deal with similar topics and experience similar phenomena, thus being strongly heterarchically interlinked. One has, in the Net of today, a germinal dual network. The primitive Internet agents currently in action do their thing in the context of this dual network. And, when these agents become sufficiently numerous and sophisticated and intercoordinated to lock into a pattern of globally emergent intelligence, this dual network structure will no doubt be a part of it.
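The alignment of hierarchy and heterarchy can be made concrete in a small sketch: hosts are grouped hierarchically by their domain suffixes, while hyperlinks supply a heterarchical graph laid over the same hosts. The hostnames and links below are made-up examples, intended only to show the two structures coexisting.

```python
# Sketch of the Net's germinal dual network: a hierarchy read off the domain
# name system, plus a heterarchical hyperlink graph over the same hosts.
# Hostnames and links are made-up examples.
from collections import defaultdict

hosts = ["cs.uni-a.edu", "math.uni-a.edu", "lab.co-b.com", "www.co-b.com"]

# Hierarchy: cluster hosts by their parent domain (intranet-level grouping).
hierarchy = defaultdict(list)
for h in hosts:
    parent = ".".join(h.split(".")[1:])   # e.g. "uni-a.edu"
    hierarchy[parent].append(h)

# Heterarchy: hyperlinks cut across, but partly align with, the hierarchy.
hyperlinks = {
    "cs.uni-a.edu": ["math.uni-a.edu", "lab.co-b.com"],
    "lab.co-b.com": ["www.co-b.com", "cs.uni-a.edu"],
}

print(dict(hierarchy))
# {'uni-a.edu': ['cs.uni-a.edu', 'math.uni-a.edu'],
#  'co-b.com': ['lab.co-b.com', 'www.co-b.com']}
print(hyperlinks)   # associative links overlaying the hierarchical clusters
```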

What the psynet model claims, in philosophical terms, is that the dual network structure is an incomparably effective means for giving rise to hyper being and wild being. The dual network is a compositional structure of hyper beings -- emergences upon emergences upon emergences -- which lends itself naturally to evolution, in both its directed and free-flowing, generative, wild aspects. That the brain and the Net both manifest dual network structures on the static being level reflects feedback processes taking place over a long time scale -- it reflects the fact that both of these systems have physically evolved in order to be more intelligent, one over millennia of biological advance and the other over years of engineering advance. The brain is the only current example of Wild Computing; but we are about to have another.