what is intelligence
The following is a collection of my thoughts and reflections on the concepts presented in the book "What Is Intelligence" - an attempt to understand life, evolution, computing and minds using ideas from AI.
origins of life
Any relatively complex organism depends on certain biological prerequisites. I skipped most of this part.
life as computation
But what is life? A fundamental aspect of life is that it is dynamic. It does things. And so a growing number of people claim that life, at some level, consists of computation.
A few concepts are needed to get at this claim. The first is that of a Turing Machine - a machine capable of carrying out any algorithmic sequence of steps. Turing came to this idea while trying to determine whether there exists a program that could decide whether an arbitrary mathematical statement is true or false. To attack this problem, he envisioned a machine consisting of a single read/write head, reading from an infinite tape of symbols. The machine is given a "program" -> a set of rules for how to process each element on the tape. Given enough time and tape, a machine formulated this way is capable of making any calculation that could be made by hand. Now suppose the "program" (the set of instructions telling it how to process the symbols on the tape) is itself written on the tape. There then exists a fixed set of instructions one could give the machine such that it can compute any arbitrary program placed on the tape. This is what is called a "Universal Turing Machine": a machine that can simulate any other Turing Machine.
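To make this concrete, here is a minimal sketch of a Turing Machine in Python. The rule table (a toy binary-increment machine) and all names are my own illustration, not from the book:

```python
# A Turing machine: a finite rule table reads and writes symbols on an
# unbounded tape, moving one cell at a time. The machine below increments
# a binary number in place.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=10_000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    tape = dict(enumerate(tape))            # sparse tape; blank cells are "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += {"L": -1, "R": 1}[move]     # move the read/write head
    cells = [tape.get(i, "_") for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip("_")

# Binary increment: walk right to the end of the number, then carry 1s
# back to the left until a 0 (or a blank) absorbs the carry.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}

print(run_turing_machine(rules, "1011"))  # 1011 + 1 = 1100
```

Making this universal would mean fixing one rule table that reads both a description of another machine and that machine's input from the tape; the rules above are just one special-purpose machine.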
Programs can output other programs, leading to further and further computation. Turing also demonstrated that, given an arbitrary input program, it is impossible in general to predict whether that program will ever come to a definitive end (the halting problem).
And the key insight behind what Turing demonstrated is that computation is a universal concept: it can be instantiated in many different substrates, and a single universal machine can perform arbitrary computations.
replication
Now a second aspect of living things is that they must be able to replicate themselves. This is required for the process of natural selection and evolution to proceed. How could this come about organically via computation? Von Neumann presented an abstract framework for this. He proposed a "cellular automaton" that consists of three components. The first is a description of itself - instructions for what the machine is and how to construct it. A Universal Constructor then executes that description, building a new machine. A Universal Copy machine then copies the description onto the new machine. The new machine now contains both the machinery and the description for how to replicate, and so the whole process can continue indefinitely.
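A toy sketch of the three components, heavily simplified (von Neumann's actual construction was a 29-state cellular automaton; the dictionaries and functions below are my own stand-ins):

```python
# Von Neumann's replication scheme in miniature: a machine carries a
# description of itself; a constructor builds a new body from the
# description; a copier attaches a verbatim copy of the description.

def construct(description):
    """Universal constructor: build a machine body from its description."""
    return {"body": description.upper()}          # stand-in for "executing" the plan

def copy(description):
    """Universal copier: duplicate the description verbatim."""
    return str(description)

def replicate(machine):
    child = construct(machine["description"])             # build the body
    child["description"] = copy(machine["description"])   # attach the plan
    return child

parent = {"body": "SELF-REPLICATOR", "description": "self-replicator"}
child = replicate(parent)
grandchild = replicate(child)
print(grandchild == child == parent)  # True: replication can continue indefinitely
```

The crucial trick is that the description is both executed (by the constructor) and copied uninterpreted (by the copier), so the offspring inherits the ability to replicate.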
complexity & dynamic stability
The concepts above give us a foundation to work with. The key point is that computation is fundamental to the rise of life (self-replicators). But now we come to a puzzle: why did things become more complex as life evolved? In the book, the author gives an example in which "artificial life" was simulated in a program. Random sequences of bytes were initialized. At random times those bytes could be "mutated" (simulating the random mutation central to evolutionary theory). Pairs of byte sequences were then joined together and executed. In this programming language code and data share the same tape, so each execution could potentially modify its own code.
The author shows several figures demonstrating that, while at first very little happens (mutations accumulate but the code does not meaningfully change), eventually the programs become longer and more complex. There is a sudden phase transition in which the average length of computation grows dramatically. Basically, the tape begins to reproduce itself; random mutations and computations give rise to purpose: self-replicators trying to survive and "beating" the sequences that are less apt at doing so. One way of looking at this is that at first, when we look at the sequences of bytes, we have no way of determining causality; everything looks completely random. But after a while some sort of "agency" emerges -> bytes trying to preserve themselves.
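The outer loop of such an experiment can be sketched as follows. This is my own stripped-down stand-in, not the author's system: the real language is Brainfuck-like with multiple heads, whereas here I interpret a single made-up opcode (0x01 = "copy the byte after it one cell to the right") just to show code and data sharing one tape:

```python
import random

# Artificial-life outer loop: a population of random byte tapes is
# mutated at random, then random pairs are joined, executed as one
# program (which may rewrite itself), and split apart again.

TAPE_LEN = 16

def execute(tape, max_steps=32):
    """Run the tape as code. Opcode 0x01 copies its operand one cell right."""
    tape = tape[:]
    pc = 0
    for _ in range(max_steps):
        if pc >= len(tape) - 2:
            break
        if tape[pc] == 0x01:
            tape[pc + 2] = tape[pc + 1]   # self-modification: code edits the tape
        pc += 1
    return tape

def step(population, mutation_rate=0.01):
    # random "cosmic ray" mutations
    for tape in population:
        for i in range(len(tape)):
            if random.random() < mutation_rate:
                tape[i] = random.randrange(256)
    # pair random tapes, run the pair as one program, split them again
    random.shuffle(population)
    for a in range(0, len(population) - 1, 2):
        joined = execute(population[a] + population[a + 1])
        population[a], population[a + 1] = joined[:TAPE_LEN], joined[TAPE_LEN:]

random.seed(0)
pop = [[random.randrange(256) for _ in range(TAPE_LEN)] for _ in range(64)]
for _ in range(100):
    step(pop)
```

In the book's actual system the instruction set is rich enough that, after enough mutations and pairings, tapes arise that copy themselves wholesale; this sketch only shows the mutate-pair-execute-split structure that makes that possible.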
But why should things become less random? The second law of thermodynamics tells us that systems trend towards states of higher entropy. Seeing the emergence of such complex life would then seem counter-intuitive.
So let's take a step back and consider a very simple case: billiard balls in an ideal world where all collisions are elastic and there is no friction or air resistance. Here, the state of "max entropy" is one in which the balls are all bouncing around randomly. If we saw all the balls in a perfect configuration right before the "break", we would guess, statistically, that we are at the "beginning" of the simulation, because it is extremely unlikely for the randomness of the balls to ever end up back in such a configuration.
So time has lost all its meaning in this equilibrium -> any two moments or configurations are equally likely to occur. Additionally, no "work" can be done. Work is a directed transfer of energy, and extracting it requires a gradient: a system going from a state of low entropy towards higher entropy.
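A back-of-the-envelope count makes the asymmetry vivid. Discretizing the table into cells (the numbers below are arbitrary assumptions of mine), we can compare how many microstates put all 15 balls inside a small rack-sized region versus anywhere on the table:

```python
from math import comb

# Counting microstates: the racked "ordered" configurations are a
# vanishing fraction of all configurations, which is why randomness
# essentially never wanders back into them.

table_cells = 900   # e.g. a 30 x 30 grid over the table (assumption)
rack_cells = 16     # a small region about the size of the rack (assumption)
balls = 15

ordered = comb(rack_cells, balls)   # states with every ball inside the rack
total = comb(table_cells, balls)    # all possible placements of the balls

print(f"ordered / total = {ordered / total:.3e}")  # astronomically small
```

With these made-up numbers the ordered states are rarer than one in 10^30; any reasonable discretization gives the same qualitative answer, which is the statistical content of the second law.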
This brings us back to the puzzle of life: at a local level, it looks like the second law of thermodynamics has been violated. Of course, when we zoom out to the level of the universe (and include the Sun, from which energy is being taken), we see that overall entropy is still increasing: free energy from the Sun is consumed to extract resources, creating more entropy in the world overall. But at a local level, say just a nation, we see complexity increasing. The author gives us the idea of "dynamic stability": complex self-replicators are more dynamically stable than other arrangements of matter. Take a stone, for example. Over time it just continuously degrades; slow processes attack the stone so that it loses more and more of itself. Something like DNA, however, is more stable precisely because it is able to replicate itself. It is antifragile. So at the "pattern level", the things that are "stable" in the sense of the 2nd law are the patterns that can replicate themselves accurately. Better self-replicators survive, and this connects Darwin's principle of natural selection with the 2nd Law of thermodynamics.
This still might not fully explain why things generally become more complex. One easy way to see it: if self-replicators are in general more stable, we will keep accumulating more and more of them over time. Those computations will merge and "evolve" with each other, leading to more and more complexity. In the artificial life environment created by the author, complexity could be defined as the average number of operations executed, and this naturally increases as the number of self-replicators and co-self-replicators grows.
fitness & survival
Whether or not concepts like "heat" or "concentration" are "real" does not really matter in the evolutionary sense. They are real to the extent that they allow the organism to accurately predict the future, because accurately predicting the future is what leads to survival.
The same is true for humans: the things we perceive in the world, the people we consider "beautiful" (seeing youth in the eyes, being attracted to women with "x" quality), are largely a function of reproductive potential. And so, based on the Fitness Beats Truth theorem (see Hoffman), the likelihood that an organism sees some relatively high-dimensional reality 100% accurately goes to zero in the limit. In the most simple case, take a 1-dimensional phenomenon.
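A toy simulation in the spirit of Hoffman's setup (the payoff curve, the "prefer more" truth strategy, and all numbers are my own assumptions): the true state is a 1-dimensional quantity, the payoff is non-monotonic in it, and a perceiver tuned to payoff outcompetes one that sees the true quantity and simply prefers more of it:

```python
import random

# Fitness vs truth: two strategies repeatedly choose between two
# resources. "Truth" perceives the real quantity; "Fitness" perceives
# only the payoff. Because payoff is non-monotonic, truth's accurate
# picture does not translate into survival.

def payoff(x):
    # Peaked at x = 50: too little or too much of the resource is bad.
    return max(0.0, 1.0 - ((x - 50) / 30) ** 2)

def truth_strategy(a, b):
    return a if a > b else b                    # sees quantities, takes "more"

def fitness_strategy(a, b):
    return a if payoff(a) > payoff(b) else b    # sees only fitness payoffs

random.seed(0)
truth_total = fitness_total = 0.0
for _ in range(10_000):
    a, b = random.uniform(0, 100), random.uniform(0, 100)
    truth_total += payoff(truth_strategy(a, b))
    fitness_total += payoff(fitness_strategy(a, b))

print(fitness_total > truth_total)  # True: the fitness-tuned perceiver wins
```

The fitness perceiver can never do worse here (it always picks the higher-payoff option of the pair), so over many trials accurate perception of quantity loses to perception organized around fitness.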
compression
We must model a joint compressed distribution P(X, H, O), or o = f(x, h).
cause & effect
We should go back now and think about the definition of life again - the things that life wants to do. It only makes sense to think of life as purposeful and agentic.