
Does there exist a large-scale, scalable tool (engine) for simulating the universe that incorporates both quantum mechanics and cosmology, i.e. both the micro and macro scales?

(It would be best if this tool simulated the entire universe using quantum mechanics only, without any corrections to the model that come from the macro scale)

Additional resources (from @Qmechanic):

  1. How many bits are needed to simulate the universe?

  2. Simulate the universe?

tesgoe
  • No, and it would be darn near impossible to do so for many reasons (computational requirements being key). You might be interested in this arXiv paper. – Kyle Kanos Nov 14 '14 at 21:43
  • Also, why do you expect the entire universe can be modeled by quantum mechanics alone? – Kyle Kanos Nov 14 '14 at 21:47
  • According to Douglas Adams, the Earth was created as the computer to answer just one such question (the Ultimate Question of Life, the Universe, and Everything). I suppose that by extrapolation, the universe is actually a giant simulator of the universe - and you would need something that big to do a half-decent job... – Floris Nov 14 '14 at 21:47
  • Somebody already started the analog simulator. Alas, we seem to be inside it! – Jon Custer Nov 14 '14 at 21:48
  • Kyle, thank you for the paper. I will read it today. As for your question, I learnt from other discussions that the macro-scale effects could be derived from the micro-scale ones. If that is not the case, please tell me why. As for the computational requirements, they would probably be very high, but we have finally reached the point where we can actually scale things, process in parallel, exploit in-memory storage, and balance traffic, as well as having tools for unsupervised learning. We could use a number of machines for that. I consider trying things like that feasible. – tesgoe Nov 14 '14 at 22:56
  • I shall ask another question: is it possible that the macro-scale effects cannot be derived from the micro-scale ones? But this is beyond the scope of the question here. In building the tool, we would just need a combination of models and some scalable ensemble learning (exploiting a number of models): learning, re-learning, and not getting stuck in local minima. – tesgoe Nov 14 '14 at 23:01
  • @tesgoe: No, we do not have the capability to do what you think can be done. The trillion-body problem was numerically modeled about two years ago (a hydrodynamic evolution of the universe). The number of particles in the universe exceeds $10^9$ by about 71 orders of magnitude. What you propose is beyond absurd at this point in time. – Kyle Kanos Nov 15 '14 at 01:24
  • $\uparrow$ that should be $10^{12}$ (trillion), not $10^9$ (billion), and 68 orders of magnitude. – Kyle Kanos Nov 15 '14 at 01:54
  • @kyle-kanos, thank you for the great insights.

    The observable universe is finite (even though the curvature is approximately flat and we therefore expect the universe as a whole to be infinite), because its elements are within 13.8 billion light years. Could we just model part of it, instead of the entire universe?

    I assume that you are using the model (definition) of a particle from here http://t.co/C8cJ4rkI9O (Frank Heile, PhD in Physics from Stanford), which makes it $10^{80}$ particles (and the $10^{12}$ and $10^{68}$ above; a quick arithmetic check is sketched after these comments).

    – tesgoe Nov 15 '14 at 08:43
  • Had to divide my comment into two. @kyle-kanos, isn't it possible to simulate the observable universe with a smaller number of particles (in order to make the computation feasible), since it is isotropic and homogeneous? More precisely, does there exist a physics engine that takes a number of particles and a topology as inputs and simulates their behavior? – tesgoe Nov 15 '14 at 08:45
  • Related: http://physics.stackexchange.com/q/8895/2451 , http://physics.stackexchange.com/q/110854/2451 and links therein. – Qmechanic Nov 15 '14 at 09:31
  • Update: based on the insight from @dirk-bruere, we will need to wait for quantum computers before we are able to run such simulations. – tesgoe Nov 15 '14 at 11:29
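
A quick order-of-magnitude check of the counts discussed above (a minimal arithmetic sketch using the round figures $10^{80}$ and $10^{12}$ quoted in the comments):

```python
import math

particles_in_observable_universe = 1e80   # rough figure quoted in the comments
largest_n_body_simulation = 1e12          # the "trillion body problem"

gap = math.log10(particles_in_observable_universe / largest_n_body_simulation)
print(f"shortfall: roughly {gap:.0f} orders of magnitude")   # ~68
```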

1 Answer

Nobody has managed to simulate anything except the most trivial quantum systems on a classical computer. One of the reasons people are so keen to develop quantum computers is that they are ideally suited to simulating quantum systems and real-life problems that involve QM, most notably things in chemistry and biology. Quantum computing promises to be exponentially faster than classical computers for these tasks. Another approach is to use existing QM systems to simulate other QM systems, i.e. something like a quantum analog computer.
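
As a rough sketch of why this blows up (assuming nothing beyond NumPy; the helper below is an illustration, not any particular library's API): a brute-force statevector simulation of $n$ qubits has to store $2^n$ complex amplitudes, so memory and time double with every extra qubit.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 2x2 unitary `gate` to qubit `target` of an n-qubit statevector."""
    psi = state.reshape([2] * n)                       # one axis per qubit
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)                  # restore the axis order
    return psi.reshape(-1)

n = 20                                                 # 2**20 amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                         # start in |00...0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

for q in range(n):                                     # uniform superposition over all basis states
    state = apply_single_qubit_gate(state, hadamard, q, n)

print(f"{n} qubits -> {state.size:,} amplitudes, {state.nbytes / 2**20:.0f} MiB")
```

Already at 20 qubits the state alone takes 16 MiB, and every additional qubit doubles it.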

  • What is the reason that we cannot run such a simulation? Only the computational resources? Cannot unitary transformations and information storage (qubits) be simulated on classical machines? – tesgoe Nov 15 '14 at 10:02
  • Quantum systems can be simulated on a classical computer, but beyond the simplest cases the computational requirements explode (see the sketch after these comments). A qubit is not simply a 1 or a 0, or a number in between. http://en.wikipedia.org/wiki/Qubit –  Nov 15 '14 at 11:20
  • Thank you @dirk-bruere, will dig deeper into it. I think this (together with comments) answers my question. – tesgoe Nov 15 '14 at 11:27
  • While the answer is not known, I think it's expected not to be the case that quantum computers are generally exponentially faster than classical ones. In particular, I think it's expected that NP is not contained in BQP (the class of problems solvable on a quantum machine in polynomial time with bounded error). Obviously, since we don't even know whether P equals NP, there is some uncertainty here. –  Mar 19 '16 at 18:48
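
To put rough numbers on that "explosion" (a back-of-the-envelope sketch; the qubit counts below are arbitrary illustrations): a full $n$-qubit statevector needs $2^n$ complex amplitudes at roughly 16 bytes each.

```python
# Memory needed just to store a full n-qubit statevector (16 bytes per amplitude).
for n in (10, 20, 30, 40, 50, 80):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits: 2^{n} amplitudes ~ {amplitudes * 16:.3e} bytes")
```

By around 50 qubits the state alone needs petabytes of memory, which is why the answer above points to quantum hardware (or quantum analog simulators) rather than classical machines for non-trivial quantum systems.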