Gravity, randomness, data compression and the speed of light

Newton's laws stood unshaken for about two centuries. When it was discovered that light travels at a fixed, maximum speed, the theory of gravity was no longer consistent with the theory of light, and it became important to harmonize the two. This article presents some analogies to help in understanding this idea.

One of the main ideas of this article is that lookahead (in a game like checkers) is really a form of data compression: a weak strategy in checkers represents less information, and a good strategy represents more. But it takes more energy to look ahead, and energy is mass. Therefore we have the following:

energy = mass = lookahead = information = momentum = strategy

Analogous concepts from entropy and randomness

Doppler effect and the game of checkers

If a checkers program does no lookahead, then it won't play very well. The more lookahead, the better it plays. The distance to a win becomes shorter, but computing it takes longer. Space is smaller, but time runs more slowly. That would be a Lorentz transformation.

All the program is really doing is sorting the list of moves so that it can choose the best one. That is implicit in the calculus of variations, although it is not usually stated that way: one has to be able to find out which path is the most stable, in order to choose that one.
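
To make this concrete, here is a minimal sketch in Python of lookahead as move sorting. The toy game (Nim: take one to three stones, and whoever takes the last stone wins) is chosen only so the code is self-contained; nothing here is taken from a real checkers engine.

    def moves(stones):
        """Legal moves in toy Nim: take 1, 2 or 3 stones."""
        return [m for m in (1, 2, 3) if m <= stones]

    def negamax(stones, depth):
        """Score the position for the player to move, searching `depth` plies."""
        if stones == 0:
            return -1                  # no stones left: the player to move has lost
        if depth == 0:
            return 0                   # search horizon reached: no information
        return max(-negamax(stones - m, depth - 1) for m in moves(stones))

    def sorted_moves(stones, depth):
        """Sort the move list best-first; deeper search gives a better sorting."""
        return sorted(moves(stones), key=lambda m: negamax(stones - m, depth - 1))

    print(sorted_moves(10, 1))         # shallow lookahead: the ordering is uninformed
    print(sorted_moves(10, 9))         # deep lookahead: the winning move (2) comes first

With no lookahead the ordering carries no information; with full lookahead the move list is perfectly sorted, and further search cannot improve it. That saturation is the theme this article keeps returning to.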

List sorting version of the twin paradox

Suppose a pair of twins, X and Y, are each sorting lists, and X's list is more scrambled. Y will see X as an entity that is scrambled in time, so that it is harder for Y to add up X's age. X isn't actually any younger; it is just harder for Y to figure out how old X is. So Y interprets this by saying that X is younger, because that is as far as Y got in his attempt to sort the list.

This is like a search engine. If the entries were not sorted, a search would take forever and you would give up. Since they are sorted, you receive more results in a given time. So the twin paradox would be like two search engines, each trying to figure out the sorting ability of the other.
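
A minimal Python sketch of the search-engine half of the analogy: binary search on a sorted list against a linear scan of a scrambled copy of the same data. The sizes and timings are illustrative only, not a benchmark.

    import bisect, random, time

    data = list(range(1_000_000))          # sorted list
    scrambled = data[:]
    random.shuffle(scrambled)              # same content, random order
    queries = random.sample(data, 200)

    t0 = time.perf_counter()
    for q in queries:
        bisect.bisect_left(data, q)        # about 20 comparisons per query
    t1 = time.perf_counter()
    for q in queries:
        scrambled.index(q)                 # about 500,000 comparisons on average
    t2 = time.perf_counter()

    print(f"sorted:    {t1 - t0:.4f} s")
    print(f"scrambled: {t2 - t1:.4f} s")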

Sorting empty spacetime

We can form an analogy with sorting a three- or four-dimensional list. The time dimension would also be randomized, so that one would have the impression that each event was unrelated to the previous one. The analogue of the law of addition of velocities is as follows: if the list is in random order, then the slightest effort to sort it will have an effect; but if the list is nearly sorted, then no amount of sorting will change it very much.
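
The saturation can be shown directly. In the Python sketch below, one pass of bubble sort removes a few hundred inversions from a random list, but can remove at most the handful that remain in a nearly sorted one (the exact counts vary from run to run):

    import random

    def inversions(a):
        """Count out-of-order pairs (O(n^2), fine for a demonstration)."""
        return sum(a[i] > a[j] for i in range(len(a)) for j in range(i + 1, len(a)))

    def bubble_pass(a):
        """One left-to-right pass; each swap removes exactly one inversion."""
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]

    random_list = list(range(300))
    random.shuffle(random_list)
    nearly_sorted = list(range(300))
    for _ in range(5):                     # scramble with five adjacent swaps
        i = random.randrange(299)
        nearly_sorted[i], nearly_sorted[i + 1] = nearly_sorted[i + 1], nearly_sorted[i]

    for name, lst in (("random", random_list), ("nearly sorted", nearly_sorted)):
        before = inversions(lst)
        bubble_pass(lst)
        print(f"{name}: {before} -> {inversions(lst)} inversions")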

Here is another way of looking at it: empty space corresponds to a randomized list, massive objects correspond to partially sorted lists, and compact stars to almost completely sorted lists. In the theory of gravity, one cannot tell the difference between mass, energy, motion, and other variables. Here mass, energy and motion are replaced by sorting or scrambling: the only thing that can happen in this interpretation is that a list is sorted or scrambled.

List sorting and the Hamiltonian

So the sorting of lists (or data compression, or information crushing, or sphere packing) leads to structured spacetime, and this is equivalent to mass, energy, momentum, pressure, and anything else that might warp spacetime. Here we give some examples. To make them easier to understand, we first notice that the history of science is really a form of animism; almost all the laws say "nature obeys this or that". This really means that there is a little genie telling nature what to do. Scientists don't say that it is animism, but it seems very much like it. That is how they design cellphones and computers, and they work.

So, in keeping with animism, we will say that there is a genie who is trying to minimize the action at all times. This is usually stated as "nature obeys the principle of least action".

Example - the principle of least time

Here is an example. The principle of least time was one of the first laws of modern animism. We might imagine that inanimate nature is a movie in spacetime. But if all of the time frames are out of order, then the genie can only guess how to minimize the time. The more ordered they are, the better he can do. And if they are fully in order, then he can do his job perfectly.

Example - the principle of least energy

Here is another example. The principle of least energy has also been very important historically. Most of the universe, the stars and galaxies, consists of hydrogen. But what if the energy levels of hydrogen were randomized? The genie wouldn't know how to do things properly; he would be wasting energy. We can think of the list of energy levels as the Hamiltonian operator. If this operator is diagonalized, then it is simply a list, and this list must be in order. Then the genie knows how to minimize the energy.
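
In code, "diagonalizing the Hamiltonian to obtain an ordered list" looks like the following Python sketch. The matrix is an arbitrary symmetric one standing in for a real Hamiltonian; NumPy's `eigh` already returns the eigenvalues (the energy levels) in ascending order.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 6))
    H = (A + A.T) / 2                      # symmetrize: a toy "Hamiltonian"

    energies, states = np.linalg.eigh(H)   # eigenvalues come back sorted ascending
    print("energy levels:", np.round(energies, 3))
    print("ground state: ", round(energies[0], 3))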

Example - the principle of least action

Finally, we move to the action principle, an important law of modern science. Here the list has to be ordered in both energy and time. If it is not, then the genie won't know how to minimize the action. The more ordered it is, the better he will be able to minimize it.

Neural network analogy

[Image: Schwarzschild black hole. A galaxy passing behind a strange star. Even pictures like this should remain self-consistent if the viewer is moving near the speed of light, which would cause yet more distortion of spacetime.]

Each computer has a certain amount of memory. This is also true for neural networks. These are functions that can be trained to perform certain tasks. People sometimes use the word 'mass' for the amount of memory that a neural network contains. It takes longer to train a more massive network, but more information can be crushed into it.

One may think of a computer program as having a fundamental complexity: the amount of memory required to store the shortest equivalent program. This is analogous to crushing mass into the smallest possible radius; one might call it the Schwarzschild radius of the program. Yet writing the program that fits into the smallest memory is a daunting task. One could write another program to search for a solution, one that performs a lookahead procedure.
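
The length of the shortest equivalent program (Kolmogorov complexity) is uncomputable, but a general-purpose compressor gives a crude upper bound on it. A Python sketch, with zlib standing in for the lookahead procedure:

    import random
    import zlib

    structured = bytes(i % 7 for i in range(10_000))                   # highly regular
    random_data = bytes(random.randrange(256) for _ in range(10_000))

    print("structured:", len(zlib.compress(structured, 9)), "bytes")   # crushes to almost nothing
    print("random:    ", len(zlib.compress(random_data, 9)), "bytes")  # stays near 10,000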

If a network has not been trained, even a small amount of training will have an effect. But if the network has been completely trained, then training it further has no effect (see the sketch after the table below). Here is a table:

computer program                     | massive object
------------------------------------ | ----------------------------------------
least number of bits                 | entropy of the object (also in bits)
smallest amount of memory            | Schwarzschild radius
huge energy to do lookahead          | huge energy to melt and reorganize mass
program distilled and crystallized   | structure of spacetime crystallized
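
The saturation of training, promised above, can be sketched in a few lines of Python: gradient descent on a one-parameter model (learning y = 3x) improves the loss a great deal at first and almost not at all once trained.

    data = [(x, 3.0 * x) for x in range(1, 6)]      # samples of the target y = 3x
    w, lr = 0.0, 0.01

    def loss(w):
        return sum((w * x - y) ** 2 for x, y in data) / len(data)

    prev = loss(w)
    for step in range(1, 31):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
        if step % 5 == 0:
            cur = loss(w)
            print(f"step {step:2d}: loss {cur:.6f} (improved by {prev - cur:.6f})")
            prev = cur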

People have also written about fusing information in dreams: one wakes up and suddenly has the answer to a puzzle. Similarly, the information in neural memory is fused or melted. This is analogous to melting massive objects, thus enabling them to restructure themselves.

Summary

Here is a summary of the ideas:

near a black hole                              | ordinary spacetime
---------------------------------------------- | ----------------------------------------
maximum data compression                       | nonmaximal data compression
sorted list                                    | partially sorted list
infinitely trained network                     | partially trained network
information completely melted and fused        | information only partly fused
infinite lookahead (more doesn't improve it)   | one-ply lookahead (more improves it)
more training does nothing                     | more training does something
Lorentz transform does nothing                 | Lorentz transform does something

Newton's law in a new light

In effect, Newton's original law says that the list we are sorting is infinitely complex: one may continue to sort it forever and keep improving it. But the maximum speed of light says that the complexity is only finite, and that the list is thoroughly sortable. There are actual distance functions that measure how sorted a list is, along with many related distance functions.
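
One such distance function is the Kendall tau distance: the number of out-of-order pairs (inversions), here normalized by the maximum possible count n(n-1)/2. A short Python sketch:

    def kendall_tau_distance(a):
        """0.0 for a sorted list, 1.0 for a fully reversed one."""
        n = len(a)
        inv = sum(a[i] > a[j] for i in range(n) for j in range(i + 1, n))
        return inv / (n * (n - 1) / 2)

    print(kendall_tau_distance([1, 2, 3, 4]))   # 0.0
    print(kendall_tau_distance([4, 3, 2, 1]))   # 1.0
    print(kendall_tau_distance([2, 1, 3, 4]))   # one inversion out of six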

A universal interpretation of Newton's law also implies that planets can have infinite mass, and that gravity can be infinite. Another interpretation is that a game of Monopoly goes on forever, and that therefore increasing the lookahead indefinitely will continue to improve the skill of the player.

Sorting in warped spacetime

Sorting is sometimes done on a computer so that it goes faster. But if time slows down, then the computer doesn't do any good. One still has to sort the list completely, but one slows down as the list approaches sortedness. There are many sorting algorithms; this one might be called "hyperbolic sort", because it becomes more or less hopeless to sort the list completely. The same is true for scrambling: as the list becomes more scrambled, the program runs faster, but the list keeps getting longer and longer, making the goal impossible.
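
A Python sketch of such a "hyperbolic sort": an ordinary bubble sort, but with a simulated clock that runs more slowly as the number of remaining inversions shrinks. The cost model (one pass costs 1/inversions units of time) is invented purely for illustration.

    import random

    def inversions(a):
        return sum(a[i] > a[j] for i in range(len(a)) for j in range(i + 1, len(a)))

    lst = list(range(40))
    random.shuffle(lst)
    elapsed = 0.0
    while (inv := inversions(lst)) > 0:
        elapsed += 1.0 / inv               # passes cost more as the list gets sorted
        for i in range(len(lst) - 1):      # one ordinary bubble pass
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]
        print(f"inversions {inv:4d}  simulated clock {elapsed:.4f}")

Most of the simulated time is spent removing the last few inversions, just as the text describes.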

This slowdown has "profound implications" for checkers programs, and for other programs like Deeper Blue. If the idea is to speed up the lookahead (which takes more energy), then sooner or later the energy will be enough to warp spacetime (because energy = mass), thus slowing the program down. So there is a maximum skill level for checkers-playing programs inherent in nature.

Container theory or relational spacetime

[Image: Conway's Game of Life.]

One may have heard of sparse matrix multiplication. Suppose you have a pair of million-by-million matrices, and each one has only three non-zero entries. Why use up all that energy multiplying them entry by entry? There are more efficient methods. In the same way, the vacuum itself can be stored in packed form, so that empty space is, in effect, nonexistent.
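
A Python sketch with SciPy: a million-by-million matrix with three non-zero entries is stored and multiplied without ever touching the zeros. The particular entries are arbitrary.

    import numpy as np
    from scipy.sparse import csr_matrix

    n = 1_000_000
    rows = np.array([0, 42, 999_999])
    cols = np.array([7, 42, 0])
    vals = np.array([1.0, 2.0, 3.0])
    A = csr_matrix((vals, (rows, cols)), shape=(n, n))   # stores only 3 entries

    x = np.ones(n)
    y = A @ x                          # cost scales with the non-zeros, not with n*n
    print(A.nnz, y[0], y[42], y[999_999])                # 3 1.0 2.0 3.0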

This is like Conway's Game of Life. Why compute a million pixels when there are only a few gliders? Why even store the vacuum at all? Is there a relational database that would express it in a more compact form? So we conserve the energy for situations in which there is more energy: we do "sphere packing" on the computational procedures themselves. But this is not possible for compact stars, because they are already packed. So this reinforces the idea that computation = mass = Lorentz transformons = information = capacity of a relational database.
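
A Python sketch of "not storing the vacuum": the Game of Life represented as a set of live cells only, so a five-cell glider evolves without the surrounding emptiness ever being computed.

    from collections import Counter

    def step(live):
        """One generation, touching only live cells and their neighbours."""
        counts = Counter((x + dx, y + dy)
                         for (x, y) in live
                         for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                         if (dx, dy) != (0, 0))
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):                 # a glider repeats its shape every 4 steps,
        glider = step(glider)          # shifted one cell diagonally
    print(sorted(glider))

Only the five live cells are ever stored; the vacuum costs nothing.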
