Saturday, May 19, 2007

Complexity…

Complexity theory is rather young. But in the short time it has been around, it has yielded a number of interesting parallels with living systems. Like possibly modeling how cells differentiate during development. Or modeling how the initial stages of life on Earth may have unfolded.

Regarding cell differentiation, it's interesting that the genetic information in the cells of a fetus is roughly identical, and yet the cells specialize in different ways to form muscles, organs, bones, nerves, etc. So genetic information alone does not explain how cells differentiate. Biologists conclude that different genes are active in each cell type, but how? Stuart Kauffman, a biologist and notable complexity aficionado, observed that cell differentiation is akin to "state cycles" within dynamical models.

What is a state cycle? Think of a formula that derives its next position from the previous one. At some point a new position will land on a point in its past "trajectory", and from then on the formula will keep cycling back through the same spots. In complexity theory, the set of states the system settles into is referred to as an "attractor" and the loop as a state cycle. Kauffman's inference is that a developing cell may be nudged from one attractor to another by an internal perturbation (an agitation or disturbance) or by a perturbation from a neighboring cell.
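
To make that concrete, here is a small Python sketch of my own (not code from Kauffman's book) of a tiny random Boolean network of the kind he studied: each node reads a couple of other nodes and applies a random on/off rule. Because the number of possible states is finite and the update rule is deterministic, any starting state must eventually revisit a state it has seen before and fall into a state cycle. The network size and wiring below are arbitrary choices for illustration.

```python
import random

# A tiny random Boolean network in the spirit of Kauffman's models
# (illustrative only; N and K are my own choices, not from the book).
N, K = 8, 2
random.seed(1)

# Each node reads K randomly chosen nodes and applies a random Boolean rule,
# stored as a lookup table over the 2**K possible input combinations.
inputs = [random.sample(range(N), K) for _ in range(N)]
rules = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Compute the next network state from the current one."""
    next_state = []
    for node in range(N):
        idx = 0
        for src in inputs[node]:
            idx = (idx << 1) | state[src]
        next_state.append(rules[node][idx])
    return tuple(next_state)

# Iterate from a random starting state until a state repeats.
state = tuple(random.randint(0, 1) for _ in range(N))
seen = {state: 0}
t = 0
while True:
    state = step(state)
    t += 1
    if state in seen:
        print(f"Entered a state cycle of length {t - seen[state]} "
              f"after a transient of {seen[state]} steps.")
        break
    seen[state] = t
```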

Regarding the initial stages of life, Kauffman theorizes that catalytic molecular reactions in the "primordial soup" grew at a natural rate. At some point they reached critical mass, known in complexity terms as a "phase transition", and closure suddenly produced autocatalytic sets. An autocatalytic set, by definition, is a collection of molecular reactions able to catalyze the set's own production (sound vaguely familiar? Recall autopoiesis).

To illustrate the point, Kauffman's book, At Home in the Universe, uses an analogy with 400 buttons and some thread. Visualize randomly connecting two buttons with thread, then another random two, and so on. At first you're likely to thread fresh buttons; in time you'll add connections to buttons already threaded. As the ratio of threads to buttons approaches 0.5, the system hits a phase transition: pick up one button and a giant conglomeration comes up with it. Plotted as a graph, the largest connected cluster barely grows at first, since a new thread rarely lands on buttons that are already well connected; it spikes suddenly at the phase transition; then it levels off as new connections only marginally add to the conglomeration.
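
For the curious, here is a rough simulation of the buttons experiment, again a sketch of my own rather than anything from the book: it adds random threads one at a time to 400 buttons, tracks the largest connected cluster with a simple union-find, and prints its size as the thread-to-button ratio climbs. In a typical run the largest cluster stays relatively small until the ratio approaches 0.5, then grows rapidly, which is exactly the random-graph phase transition Kauffman is leaning on.

```python
import random

# Buttons-and-thread analogy: 400 "buttons", threads added at random,
# largest connected cluster tracked as the thread-to-button ratio grows.
N = 400
random.seed(2)

parent = list(range(N))          # union-find structure for clusters
size = [1] * N                   # size of the cluster rooted at each node

def find(x):
    """Find the root of x's cluster, with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    """Merge the clusters containing a and b."""
    ra, rb = find(a), find(b)
    if ra == rb:
        return
    if size[ra] < size[rb]:
        ra, rb = rb, ra
    parent[rb] = ra
    size[ra] += size[rb]

largest = 1
for thread in range(1, N + 1):   # add up to one thread per button
    a, b = random.sample(range(N), 2)
    union(a, b)
    largest = max(largest, size[find(a)])
    if thread % 50 == 0:
        print(f"ratio {thread / N:.2f}: largest cluster = {largest} buttons")
```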

Kauffman excitedly proclaims "we the expected". You will probably see holes in the analogy; Kauffman addresses a few in the book, and I won't belabor the point here.

Actually, in the two examples what I found was surface complexity traded in for deep simplicity. It briefly made me think of a seed, which I once imagined waiting for some wafting animating force, and now see as just an autocatalytic state cycle: leaching water and minerals kick off the animation.

And yet, the complexity with which these images can be manipulated within my little pea brain astounds me. In the end, I just feel more connected to everything else. Common principles, added complexity, different expression. More beauty in the world.

4 comments:

Paul said...

This is the first I've heard of this, so it's not so easy to get a handle on. I find myself wondering whether it's more of a theory or more of a hypothesis - whether there's empirical evidence to support it at this point, especially with regard to cell differentiation.

n2 said...

PAUL: I think "complexity", in general, has theory status. Stuart Kauffman's work is more of a modeling exercise or hypothesis within that theory. In the book, published in 1996, he does approach the hypothesis from a couple of angles in order to offer some support.

I picked up the book based on a citation in another book I was reading.

The whole field of complexity is rather young as I noted. I gave it blogspace because it provided some insights (not to be confused with answers) into questions I've asked myself.

Vincent said...

I'm finding the question more fruitful than any hypothetical answer, but you have sketched both, for which I am grateful.

It's an instance of the mysteries which never come to an end as far as I am concerned. That's how I felt, also, about Chaos theory at the time. With the Mandelbrot set, for example, the mathematics was simple enough: the result infinite in complexity.

I remember once hearing that to model the flows of traffic, say cars on a motorway, it takes a great deal of computing power. Why was I so surprised? You need a brain to drive a car. Every driver is aware of what the other drivers are doing. Somehow it may be like this with cells growing into a fetus; but then it changes over time as growth and development slow down and maturity is reached . . . then old age.

Till now I never was confronted with the open questions: how do cells "know" what to grow into? How does DNA give them the architectural plan, or their lines to rehearse, or however we want to visualise this?

n2 said...

VINCENT: "It's an instance of the mysteries which never come to an end as far as I am concerned."

I'm coming around to that view too. Which is why I value your perspective. When questions beget more questions, and mysteries more mysteries, you reach a point where you set them aside to simply enjoy who you are and listen to what your body is telling you.