Greg Egan’s 1994 novel Permutation City, which features unlikeable characters, wooden dialogue, and a depressing storyline, is one of the most thought-provoking works of science fiction ever written. It’s basically a book-length expansion of Egan’s “Dust Theory”. The Church-Turing thesis (in its physical version) implies that I couldn’t know whether I’m made of real atoms or of accurate computer simulations of atoms. The Dust Theory extends this to the case where the output of the atom simulation undergoes a permutation – I still couldn’t tell what’s happening in the “basement”. Since any pattern of sufficient length can be permuted into a simulation of my atoms, and therefore into my subjective experience, I can never discern from the “inside” whether I’m made of atoms, of simulated atoms, or of a random pattern of black-and-white flowers in a field on a small planet orbiting Betelgeuse.
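To make the “any pattern can be permuted into a simulation” step concrete, here is a minimal Python sketch – my own construction, nothing from the novel. It builds a permutation that rearranges an arbitrary source pattern into a chosen target; this works whenever the source contains at least as many copies of each symbol as the target needs. The strings and the helper name find_permutation are made up for illustration.

```python
from collections import defaultdict

def find_permutation(source: str, target: str) -> list[int]:
    """Return indices p with ''.join(source[i] for i in p) == target.

    Works whenever source has at least as many of each symbol as target.
    """
    pools = defaultdict(list)          # symbol -> unused positions in source
    for i, symbol in enumerate(source):
        pools[symbol].append(i)
    return [pools[symbol].pop() for symbol in target]

dust = "0110100110010110"   # arbitrary "dust": any pattern with enough 0s and 1s
mind = "1111000010101010"   # stand-in for a snapshot of a simulated mind
perm = find_permutation(dust, mind)
assert "".join(dust[i] for i in perm) == mind
```

Note where the descriptive work went: entirely into the permutation itself. That is exactly the loophole the next paragraph pokes at.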
My argument against the Dust Theory is that it does not explain anything. I believe I’m made of atoms because that explains a lot – that is, it compresses a description of my perceptions given my actions. (This is an informal paraphrase of Solomonoff induction.) In fact, I believe I’m Manuel, who is such-and-such a type of guy, because it explains an awful lot of the stuff I’m perceiving and doing. A world model whose “basement” holds not physical atoms but atoms simulated on a small Turing machine has about the same Kolmogorov complexity as the original model, so my take on that is “who knows?”. But if a theory makes it necessary to specify an extra permutation at the end ... if that permutation is to be Martin-Löf random, its complexity is about equal to the length of the string to be scrambled. Whoa, that’s a lot of extra bits! Under the Solomonoff prior, each extra bit halves a theory’s prior probability, so that’s pretty much off the table.
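In case the accounting is unclear, here it is in a few lines of Python. The 2^-L prior over L-bit programs is the standard Solomonoff weighting; the concrete bit counts (1,000 bits for a physics program, a 1,000,000-bit perceptual history) are invented for illustration.

```python
def log2_prior(length_bits: int) -> int:
    """Solomonoff-style prior over programs: an L-bit program gets
    weight 2**-L, i.e. a log2-prior of -L."""
    return -length_bits

physics  = 1_000       # assumed: bits to encode the "atoms" world-program
scramble = 1_000_000   # assumed: a Martin-Löf random permutation of a
                       # 1,000,000-bit history costs about that many bits

penalty = log2_prior(physics) - log2_prior(physics + scramble)
print(f"the scrambled theory is 2**{penalty} times less probable a priori")
```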
That’s also why I don’t buy into the “We are in the digits of pi” theory. Granted, pi itself has a small Kolmogorov complexity, but in order to explain my perceptions and actions – N bits in total – one would have to specify a region that lies some 2^N digits after the decimal point. That’s much more costly (about N bits for the index alone) than the “atom” or “Turing machine” based theories above (on the order of K(N) bits), and the theory is therefore, by the lights of Solomonoff induction, stillborn.
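The same back-of-the-envelope in code, with N chosen arbitrarily. Here log2(N) serves as a crude stand-in for K(N) – it is an upper bound up to an additive constant, and for a round number like 10^6 the true K(N) is smaller still.

```python
import math

N = 1_000_000                            # assumed size of my perceptions, in bits

pi_index_cost = N                        # bits to address a region near digit 2**N
turing_cost   = math.ceil(math.log2(N))  # ~bits to spell out N itself,
                                         # a crude stand-in for K(N)

print(pi_index_cost, turing_cost)        # 1000000 vs. 20
```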
One of the reasons Egan’s Dust Theory is appealing at first glance is that he introduces it through permutations of low Kolmogorov complexity which nevertheless look “complex” to the human mind. (The general case, which he – I think – doesn’t explicitly state, is known as the pseudo-random number generator.) The big step from there to arbitrarily complex permutations – almost all seemingly random patterns cannot be created with a pseudo-random number generator, by a simple counting argument – is swept under the argumentative rug. I admit the sweeping is probably not done deliberately, as Egan doesn’t seem to know about Solomonoff induction.
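For contrast with the arbitrary permutation above, here is the kind of low-complexity permutation Egan trades on, sketched in Python: a small seed fully determines a shuffle that looks random to the eye, while the counting argument in the comments shows almost all permutations are beyond the reach of any such generator. The deck-sized n = 52 and the seed are just familiar examples.

```python
import math
import random

n = 52                            # a deck-sized example
rng = random.Random(0xC0FFEE)     # a small seed pins down the whole shuffle
perm = list(range(n))
rng.shuffle(perm)                 # looks random, but is describable by the seed
                                  # plus the (short, fixed) generator code

# Counting argument: a generator with a 32-bit seed reaches at most 2**32
# distinct permutations, but there are 52! of them -- about 2**226:
print(math.log2(math.factorial(n)))   # ~225.58
```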
For the record, I do believe in Tegmark’s mathematical universe theory. I also believe my laptop’s hard disk contains mostly random data (courtesy of 7-Zip, Matroska, and others – well-compressed data is nearly indistinguishable from noise). And, yes, I also believe a tiny fraction of myself is in a field of flowers somewhere (not Betelgeuse). More on this soon, hopefully, in a post I’ve been struggling to write for two years.
2 comments:
Hi, did you ever write the follow-up? The Dust Theory is fascinating, with people split on its feasibility.
No, never wrote it. But it's still on my to-do list. I can't commit, but I will probably post it here no later than 06/2011.