3

When trying to solve cosmology questions, physicists frequently turn to computer simulations of the universe (albeit massively simplified ones) in order to verify or disprove their hypotheses. This got me thinking.

My question is about the theoretical maximum possible complexity of these systems.

Let me give an example. Imagine a tennis ball bouncing on a flat surface. If you want to accurately simulate and measure every single facet of the collision, right down to the atomic and quantum effects, you could simply take a real tennis ball and drop it onto your surface. In this case the universe is "simulating" the collision for you.

Would it be possible to simulate this same event just as accurately using a computer? Is there a theoretical reason why the computer would need to have more mass than the two colliding objects? (in this case a tennis ball and the planet!)

Now, I have always assumed that the answer to this question is "yes, you need a more massive computer to simulate any object with total physical accuracy". If that were not the case, there would be no reason why a computer less massive than the universe could not simulate the entire universe with total accuracy, which seems counterintuitive to me.

Urb
James
    Simulating a tennis ball collision by treating it as a collection of atoms is way beyond any computational power we have or are likely ever to have. We have to take shortcuts like using fewer atoms or describing the overall statistical effect of that many atoms in a collision. – Brandon Enright Dec 13 '13 at 17:47
  • Of course, if I'm just being a jerk, I can certainly conceptualize systems that have arbitrarily large masses and that can be simulated by arbitrarily small computers -- take, for instance, a $0$ K ensemble of $N$ non-interacting fermions stored in a harmonic well. – Zo the Relativist Dec 13 '13 at 17:49
  • But that's just boring, Jerry! Simulating no interaction is not challenging! – James Dec 13 '13 at 17:56
  • The question is boring and of no physical consequence. You ask if we can compute something in the least intelligent way we might approach it and we obviously can't, but so what? – dmckee --- ex-moderator kitten Dec 13 '13 at 17:58
  • No dmckee, I wasn't asking if we could do it; I was asking if there is a physical principle at work meaning that it is impossible. I'm sorry if my question offended you! – James Dec 13 '13 at 18:00
  • I don't think people are being quite fair to the question. It's not about whether or not we could do it now (or whether it would be practical to do it in the future). The question is whether or not such a simulation can exist in principle, where the simulation apparatus is smaller than the objects being simulated. I suspect the answer is "Yes, but not if you require real time simulation." My intuition is that the information content of the simulated system should not be compressible in general, but you might be able to "trade time for space." But IANAP, so maybe an expert can answer. – Aaron Golden Mar 28 '14 at 01:47
  • More on simulating macroscopic systems: http://physics.stackexchange.com/q/8895/2451 – Qmechanic Jul 25 '14 at 19:15

2 Answers

1

The precise answer to your question can be found in section 2 of quant-ph/9908043 (Seth Lloyd, "Ultimate physical limits to computation"), titled "Entropy limits memory space".

From that paper I can extract a heuristic summary answering your question of why we need massive computers to simulate massive things:

  1. Before simulating any system to arbitrarily high accuracy, you first need to store all the information that describes it.

  2. The amount of information you can store is limited by the number of degrees of freedom of your computer.

  3. This number of accessible states can be determined from the entropy of your computer.

  4. This entropy is determined by the mass of your computer.

  5. Hence the amount of things you can simulate and store in a theoretically "ultimate computer" is limited by the computer's mass.

Once again, you can read about the details of any of these steps in the paper cited.
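To get a feel for the scale of this limit, here is a back-of-the-envelope sketch (my own illustration, not a calculation from the paper) using the Bekenstein bound, one form of the entropy limit in step 4, applied to a hypothetical 1 kg, 1-litre "ultimate laptop":

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bound_bits(mass_kg, radius_m):
    """Upper bound on the information (in bits) that a sphere of the
    given radius containing the given mass-energy can hold:
    I <= 2*pi*E*R / (hbar*c*ln 2), with E = m*c^2."""
    energy = mass_kg * c ** 2
    return 2 * math.pi * energy * radius_m / (hbar * c * math.log(2))

# A hypothetical 1 kg computer occupying 1 litre (sphere of radius ~6.2 cm)
volume = 1e-3                                    # m^3
radius = (3 * volume / (4 * math.pi)) ** (1 / 3) # ~0.062 m
bits = bekenstein_bound_bits(1.0, radius)
print(f"{bits:.2e} bits")  # on the order of 1e42 bits
```

The point of the sketch is step 5 in miniature: the bound scales linearly with the mass-energy of the computer, so doubling the mass doubles the ceiling on storable information. Real hardware sits dozens of orders of magnitude below this ceiling, which is why the limit never shows up in everyday computing.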

Urb
zzz
  • #5 seems to be a very strange conclusion to me. My dual-core laptop weighs 5.4 lbs, but a newer model weighs 3.4 lbs and possesses a bigger processor, larger hard drive, and more RAM. – Kyle Kanos Jul 25 '14 at 01:33
  • Unless entropy is inversely related to mass, then their conclusions suggest that the bigger your computer, the larger the amount of things you can simulate. That is in direct conflict with experience. – Kyle Kanos Jul 25 '14 at 01:36
  • 1
    #5 says that even if you get the most advanced technology possible, the laws of thermodynamics yield an upper limit on the amount of information you can store in a computer of a given mass. What you gave is a comparison between two things, neither of which saturates the upper limit given, and therefore it doesn't say anything about the limit. – zzz Jul 25 '14 at 01:38
  • "yields an upper limit on the amount of information you can store in a computer of a given mass" If this is true, then the larger the mass, the larger the upper limit. But computers are getting lighter; despite that, we can simulate more with lighter computers. What part of that is unclear? – Kyle Kanos Jul 25 '14 at 01:40
  • 2
    Yes computers are getting lighter, and storing more information, but this mass to information ratio is nowhere near the limit proved in the results cited. – zzz Jul 25 '14 at 01:45
  • "5. Hence the amount of things you can simulate and store is limited by your computer's mass." Seems rather unambiguous to me that there is a correlation between mass and amount of information stored. – Kyle Kanos Jul 25 '14 at 01:46
  • 1
    There is a correlation between the maximum amount of information you can possibly store and your computer's mass. – zzz Jul 25 '14 at 01:46
  • 1
    In other words, your new, lighter, computer is getting better at reaching that maximum amount than your old, heavy computer, but it will never reach the ratio computed in the result cited. – zzz Jul 25 '14 at 01:48
  • I'm glad you're now accepting that the correlation exists, because we can move back to my original point: your #5 still contradicts experience; smaller computers can simulate more than their heavier predecessors. – Kyle Kanos Jul 25 '14 at 01:55
  • 1
    You have never experienced using a computer which saturates the maximum ratio referred to by #5, and given by the literature cited, therefore #5 cannot possibly contradict experience. It may contradict intuition. – zzz Jul 25 '14 at 01:57
  • #5 as you wrote it, suggests no such saturation point; it simply gives a correlation between mass & "amount of things we can simulate and store." Perhaps the paper suggests this saturation point, but your comment in no way suggests this. – Kyle Kanos Jul 25 '14 at 01:59
  • 1
    "the amount of things you can simulate and store is limited by your computer's mass". Not correlated, limited. In my humble opinion limited clearly suggests the existence of a limit, to be saturated. – zzz Jul 25 '14 at 02:00
  • Limit also means a restriction on the size; as written it could read as I interpreted it: more mass = more entropy (i.e., the two are correlated). – Kyle Kanos Jul 25 '14 at 02:04
  • 1
    A restriction is exactly what it is, is a "saturation point" not a restriction? And yes indeed more mass = more entropy, as you might know from freshman physics mass and entropy are obviously correlated, but you won't get to test this correlation until you can build computers that are good enough to saturate this limit. – zzz Jul 25 '14 at 02:06
  • So if more mass = more entropy, why are computers getting smaller yet able to simulate more? Your conclusion #5 suggests that a smaller laptop (less entropy) should simulate less than a larger one. – Kyle Kanos Jul 25 '14 at 02:08
  • 1
    No my conclusion states that a smaller ideal laptop, i.e. one which can saturate the proposed information to mass limit, can simulate less than a larger ideal laptop. – zzz Jul 25 '14 at 02:11
  • I do not see any mention of "ideal" anywhere in your post. Perhaps you should amend it to mean exactly what you mean, rather than assume people are telepathic. Perhaps also explain more clearly where the assumptions are coming from. – Kyle Kanos Jul 25 '14 at 02:13
  • 1
    "limit". What do you think the limit limits? Also, which assumptions are not clear? – zzz Jul 25 '14 at 02:16
  • Assuming you mean in the context of your question, the mass limits the entropy and thus the ability to store information. It says nothing about ideal laptops, only that the relation exists. I'd say statements 2, 4, & 5 could be cleared up (particularly since you disagree with my interpretation of 5, you might want to consider the re-writing). It's your choice to amend it, you really don't have to, but my recommendation is to do so. – Kyle Kanos Jul 25 '14 at 02:20
-2

The more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa. --Heisenberg, uncertainty paper, 1927

On the macro level (stuff we can see and touch) it is easy to make these predictions; that is what mechanical engineers do. However, as you get down to the atomic and sub-atomic levels, the predictions become more difficult, not because of the complexity of the system, but because quantum-level particles cannot be accurately predicted or tracked. Instead, you have to calculate the probabilities of where they are likely to go. Stephen Hawking's book "The Grand Design" goes into much more detail about how this works.
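To put a rough number on why quantum-level tracking fails, here is a small sketch (my own illustration, not from the book) of the uncertainty relation Δx·Δp ≥ ħ/2 for an electron pinned down to about an atom's width:

```python
# Heisenberg's relation: delta_x * delta_p >= hbar / 2.
# Confining an electron to ~1 angstrom forces a large,
# irreducible spread in its velocity.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg

dx = 1e-10                 # position uncertainty: ~1 angstrom
dp_min = hbar / (2 * dx)   # minimum momentum uncertainty, kg*m/s
dv_min = dp_min / m_e      # corresponding velocity uncertainty, m/s
print(f"{dv_min:.2e} m/s")  # roughly 6e5 m/s, i.e. hundreds of km/s
```

So even in principle, a simulator cannot assign each electron a definite trajectory; it can only propagate probabilities, which is exactly the regime the answer above describes.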

The Feynman sum-over-paths quantum theory takes this idea a step further and says that particles like electrons don't even follow a single path, but follow ALL possible paths at the same time!

"Thirty-one years ago, Dick Feynman told me about his ‘‘sum over histories’’ version of quantum mechanics. ‘‘The electron does anything it likes,’’ he said. ‘‘It just goes in any direction at any speed, ... however it likes, and then you add up the amplitudes and it gives you the wavefunction.’’ I said to him, ‘‘You’re crazy.’’ But he wasn’t." --Freeman Dyson, 1980

"The electron is a free spirit. The electron knows nothing of the complicated postulates or partial differential equation of nonrelativistic quantum mechanics. Physicists have known for decades that the ‘‘wave theory’’ of quantum mechanics is neither simple nor fundamental. Out of the study of quantum electrodynamics (QED) comes Nature’s simple, fundamental three-word command to the electron: ‘‘Explore all paths.’’ The electron is so free-spirited that it refuses to choose which path to follow—so it tries them all."

From: Edwin F. Taylor, Stamatis Vokos, and John M. O’Meara (Department of Physics, University of Washington, Seattle, Washington 98195-1560) and Nora S. Thornber (Department of Mathematics, Raritan Valley Community College, Somerville, New Jersey 08876-1265), "Teaching Feynman’s sum-over-paths quantum theory" (received 30 July 1997; accepted 25 November 1997).
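Feynman's "add up the amplitudes" prescription can be illustrated with a toy lattice model. This is my own sketch in made-up units (ħ = m = Δt = 1), not a correct propagator calculation: a particle starts and ends at x = 0, every sequence of intermediate lattice positions counts as one "path", and each path contributes a phase exp(iS).

```python
import cmath
import itertools

def action(path):
    # Discretized free-particle action: sum of (dx/dt)^2 / 2 over the
    # segments of a piecewise path (units with hbar = m = dt = 1).
    return sum(0.5 * (b - a) ** 2 for a, b in zip(path, path[1:]))

def amplitude(n_steps=4, positions=range(-2, 3)):
    """Sum exp(i*S) over every lattice path from x=0 back to x=0.

    Each choice of intermediate positions is one 'history'; the
    electron 'tries them all' and the phases are added up.
    """
    total = 0j
    for mid in itertools.product(positions, repeat=n_steps - 1):
        path = (0, *mid, 0)              # fixed endpoints
        total += cmath.exp(1j * action(path))
    return total

amp = amplitude()
print(abs(amp) ** 2)  # relative probability for this start/end pair
```

Nothing here singles out one path: the straight-line history and the wildest zigzags all enter the sum, and what survives is the interference of their phases. That interference, not any choice of trajectory, is what a path-integral simulation has to compute.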

Urb