37

What is the most efficient way of storing data that has been hypothesized? Is it theoretically possible to store more than one bit per atom?

Until Floris' comment, I hadn't considered that efficiency is dependent on what you're trying to optimize. When I wrote the question, I had matter efficiency in mind. Are there other orthogonal dimensions that one may want to optimize?

And to be clear, by matter efficiency, I mean representing the most data (presumably bits) with the least amount of matter.

Tyler
  • 489
  • 4
    Define efficient. How many different states can an atom have? – Floris Nov 04 '14 at 04:24
  • Thought provoking question. See edit. – Tyler Nov 04 '14 at 04:29
  • 1
    You might think of efficiency as "per unit mass" (favoring lighter atoms), "energy cost to read / write a bit" (favoring low energy stable states that can be flipped), etc... – Floris Nov 04 '14 at 04:54
  • Those would be interesting, but I was mostly interested in the hypothetical of the most information that could be stored. For example, I'd like to know if it could ever be possible to store all $2^{(108072024)}$ bits required to generate every picture possible at a 1080p resolution. Since all storage requires matter, and at that range we're way over the number of atoms in the universe, I'm wondering if there could ever be a way to store ~$2^{(108072024)}/10^{80}$ bits per atom. I just tried to generalize the question to make it more useful to others. Should I make it more specific? – Tyler Nov 04 '14 at 05:00
  • 5
    The easy way to store that information is to compress it. "All possible 1080p images" is actually pretty good compression - from a brief algorithmic description anyone could generate (compute on the fly) any of these images. Just like "$\pi$" is a really efficient way to store a specific sequence of billions of numbers. There is data, but no entropy. – Floris Nov 04 '14 at 05:16
  • 1
    At some point you have to deal with Shannon's Information Theory (which is very closely tied to quantum mechanics). Noise is the limiting factor. – Hot Licks Nov 05 '14 at 16:58

5 Answers

57

It sounds as though you may be groping for the Bekenstein bound (see the Wikipedia page of that name) from the field of black hole thermodynamics. This bound postulates that the total maximum information storage capacity (in bits) for a spherical region of space of radius $R$ containing total energy $E$ is:

$$I\leq \frac{2\,\pi\,R\,E}{\hbar\,c\,\log 2}\tag{1}$$

where $I$ is the number of bits contained in quantum states of that region of space. If you try to shove too much energy into a region of space, a Schwarzschild horizon and a black hole will form, so the Bekenstein bound implies a maximum information storage capacity for a region of space independent of $E$; the limit is reached when $E$ is so high that $R$ becomes the Schwarzschild radius (the radius of the event horizon) of the resulting black hole. From this notion we get:

$$E\leq \frac{R\,c^4}{2\,G}\tag{2}$$

to prevent a black hole forming, with equality holding at the threshold of creating a horizon. From (1) and (2) we find:

$$I\leq \frac{\pi\,R^2\,c^3}{\hbar\,G\,\log\,2}\tag{3}$$

which is indeed the entropy of a black hole in bits: this is Hawking's famous formula $I = A/(4\,\log 2)$ bits, where $A$ is the area of the black hole's Schwarzschild horizon, expressed in Planck units. Bekenstein derived these bounds by (1) postulating that the second law of thermodynamics stays true for systems containing black holes and then (2) showing that the second law can be made "safe" only if these bounds hold; otherwise, one can imagine thought experiments that would violate the second law by throwing things into black holes. More on the grounding of the bounds can be found on the Scholarpedia page for the Bekenstein bound.

One gets truly colossal storage capacities from these formulas. For a region of space of 5 centimetres radius (the size of a tennis ball), we get $4.3\times10^{67}$ bits from (3). This is to be compared with the estimated total storage of Earth's computer systems of about $10^{23}$ bits in 2013 (see the Wikipedia Zettabyte page). A one and a half kilogram human brain can store about $3\times 10^{42}$ bits, and the mass of Earth roughly $10^{75}$ bits. These last two are more indicative of "normal" matter, because the tennis ball example assumes we've packed in so much energy that a black hole is about to form; the tennis ball would have to be made of ultracompressed matter, like neutron star material.

From the human brain example, let's assume we have $(1500/12)\times 10^{24}$ atoms (roughly Avogadro's number times the number of moles of carbon in that mass). The informational content worked out above would then amount to roughly $10^{16}$ bits per atom.
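
These figures are easy to reproduce. Here is a minimal Python sketch using standard SI constants; the 10 cm brain radius is my rough assumption, inserted only to show where the $3\times10^{42}$ figure comes from:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant (J s)
c    = 2.99792458e8     # speed of light (m/s)
G    = 6.67430e-11      # gravitational constant (m^3 kg^-1 s^-2)

def holographic_bits(R):
    """Maximum bits in a sphere of radius R at the black hole limit, Eq. (3)."""
    return math.pi * R**2 * c**3 / (hbar * G * math.log(2))

def bekenstein_bits(R, M):
    """Bekenstein bound, Eq. (1), for rest energy E = M c^2 inside radius R."""
    return 2 * math.pi * R * M * c**2 / (hbar * c * math.log(2))

print(f"tennis ball (R = 5 cm): {holographic_bits(0.05):.1e} bits")  # ~4.3e67

brain_bits = bekenstein_bits(0.10, 1.5)   # assumed: ~10 cm radius, 1.5 kg
atoms = (1500 / 12) * 6.022e23            # rough count of carbon atoms
print(f"brain: {brain_bits:.1e} bits, {brain_bits / atoms:.1e} bits per atom")
```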

None of these bounds say anything about the actual realisation of data storage. But it would be trivial, theoretically, to store more than one bit per atom by choosing an element with, say, three or four stable isotopes and lining its atoms up in a lattice. You code your data by placing the appropriate isotope at each given position in the lattice, and retrieve your bits by reading which isotope is present at each position. For example, silicon has three stable isotopes: code your message in a silicon lattice like this, and your storage is $\log_2 3 \approx 1.58$ bits per atom.
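
A toy sketch of this base-3 isotope coding follows; the routines are illustrative assumptions only, not a fabrication recipe (and leading zero bits are not preserved in this simple scheme):

```python
ISOTOPES = ["Si-28", "Si-29", "Si-30"]  # silicon's three stable isotopes

def encode(bits):
    """Interpret a bit string as an integer and write it out in base 3."""
    n = int(bits, 2)
    lattice = []
    while n:
        n, d = divmod(n, 3)
        lattice.append(ISOTOPES[d])
    return lattice or [ISOTOPES[0]]

def decode(lattice):
    """Read the isotope at each lattice site back into an integer."""
    n = 0
    for site in reversed(lattice):
        n = 3 * n + ISOTOPES.index(site)
    return n

lattice = encode("1011011001")           # 10 bits fit in 7 lattice sites
assert decode(lattice) == 0b1011011001
print(10 / len(lattice))                 # ~1.43, approaching log2(3) ~ 1.58
```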


Edit in answer to question by OP:

"since this is, as far as I can tell, relativistic/macro-scale physics, is there room for significant change when/if quantum physics is incorporated? (I.e. will this likely stay the same or change when the unifying theory is discovered? Or is it entirely independent of the unifying theory problem?)"

Yes, it is macro-scale physics, but it will not improve when quantum effects are incorporated IF the second law of thermodynamics applies to black hole systems, and my feeling is that many physicists who study quantum gravity believe it does. Macroscopic ensembles of quantum systems still heed the second law when you measure the entropy of mixed states with the von Neumann entropy: this is the quantum extension of the Gibbs entropy. And if you're talking about the second law, you are always talking about ensemble / large-system behaviour: entropy fluctuates up and down, negligibly for macro systems but significantly for systems of small numbers of quantum particles. If you think about it, though, it is the macro behaviour that is probably most interesting to you, because you want to know how much information is stored on average per quantum particle.

As I understand it, a great deal of quantum gravity theory is grounded on the assumption that black hole systems do indeed follow the second law. In causal set theory, for example, the assumed "atoms" of spacetime causally influence one another, and you of course have entangled pairs of these atoms that straddle the Schwarzschild horizon: one member of the pair is inside the black hole and therefore cannot be probed from the outside, whilst the other member is in our universe, entangled and thus causally linked to the pair member inside the black hole which we cannot see. The outside-horizon pair member observable in our universe therefore has "hidden" state variables, encoded in the state of the pair member inside the horizon, that add to its von Neumann entropy as we would perceive it outside the horizon. This is why causal set theory foretells an entropy proportional to the horizon area (the famous Hawking equation $S = k\,A/4$): it is the area that is proportional to the number of such pairs straddling the horizon.


Links with Jerry Schirmer's Answer after discussions with Users Charles and user27542; see also Charles's question "How big is an excited hydrogen atom?"

Jerry Schirmer correctly (IMO) states that one can theoretically encode an infinite number of bits in the eigenstates of an excited hydrogen atom; this is, of course, if we can measure energy infinitely precisely and tell the states apart. Since the spacing between neighbouring energy levels varies as $1/n^3$, where $n$ is the principal quantum number, we'd need to be willing to wait longer and longer to read our code as we try to cram more and more data into our hydrogen atom. Even if we are willing to do this, the coding scheme does not even come close to violating the Bekenstein bound, because the size of the higher-energy orbitals increases, theoretically without bound, with the principal quantum number. I calculate the mean radius $\bar{r}$ of an orbital with principal quantum number $n$ in my answer here, and the answer is $\bar{r}\approx n\,(n+\frac{3}{2})\,a \sim n^2$. Also, the angular momentum quantum numbers are bounded by $\ell \in\{0,\,1,\,\cdots,\,n-1\}$ and $m\in\{-\ell,\,-\ell+1,\,\cdots,\,\ell-1,\,\ell\}$, so the total number of eigenstates with principal quantum number $n$ is $1+3+5+\cdots+(2n-1) = n^2$, and the total number $N(n)$ of energy eigenstates with principal quantum number $n$ or less is $N(n)=1^2+2^2+\cdots+n^2 \approx n^3/3$. So $\bar{r}\propto n^2$ and $N \propto n^3$, thus $N\propto \bar{r}^{3/2}$ and $I = \log_2 N \approx A + \frac{3}{2}\,\log_2\,\bar{r}$, where $I$ is the encodable information content in bits and $A$ is a constant.
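
To make the scaling concrete, here is a small counting sketch using the formulas just derived; it is pure arithmetic, no physical modelling:

```python
import math

a0 = 5.29177e-11  # Bohr radius (m)

def states_up_to(n):
    """N(n) = 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6, roughly n^3/3."""
    return n * (n + 1) * (2 * n + 1) // 6

def mean_radius(n):
    """r_bar ~ n(n + 3/2) a0, the mean orbital radius derived above."""
    return n * (n + 1.5) * a0

for n in (10, 100, 1000):
    bits = math.log2(states_up_to(n))
    print(f"n = {n:4d}: {bits:5.1f} encodable bits, radius ~ {mean_radius(n):.1e} m")
# Bits grow as ~3 log2(n) while the radius grows as n^2, so the scheme
# stays far inside the Bekenstein bound's area scaling.
```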

  • 7
    This is exactly what I wanted. (Well, to be honest, I didn't know what I wanted. But I do now and this is it.) Thanks for the exceptionally thorough answer. – Tyler Nov 04 '14 at 06:54
  • @Tyler Thanks. Also see Jerry Schirmer's good answer: it would be interesting to see his answer to my question, as he's a real relativist (I'm only an amateur one). I suspect that the answer will be that yes it gainsays the Bekenstein bound but this is one of many things that a full theory of black holes might resolve. So I actually don't disagree with his answer: in any case, it is clear that theoretically the storage capacity of an atom is very much greater than one bit, and that one could even foresee technology surpassing the 1 bit per atom mark. – Selene Routley Nov 04 '14 at 06:59
  • So, in other words, no; it won't ever be possible to store all the images in uncompressed form. By that formula, it would require $6.6\times 10^{8426761} \text{ly}^3$ or $10^{8426729}$ times the volume of the universe. – Tyler Nov 04 '14 at 16:44
  • @Rod, since this is, as far as I can tell, relativistic/macro-scale physics, is there room for significant change when/if quantum physics is incorporated? (I.e. will this likely stay the same or change when the unifying theory is discovered? Or is it entirely independent of the unifying theory problem?) – Tyler Nov 05 '14 at 01:32
  • 1
    @Tyler Please see my edit at the end of the answer. – Selene Routley Nov 05 '14 at 07:14
  • 2
    @Tyler I've added a bit more that explains how Jerry's great answer and mine are perfectly in agreement: the H atom gets bigger the more information we try to cram into it. – Selene Routley Nov 06 '14 at 06:08
  • I just want to point out that the claim that the human brain can store $3 \times 10^{42}$ bits is incorrect, assuming you are talking about meaningful information. We don't even know how the brain encodes most of its information nor how sensitive each and every synapse is to small variations in potentiation, much less precisely how many functionally discrete states a brain can freely move itself between. It's trivial to know what the answer is for a computer, given that the storage is separate from the processing and the storage can only exist in a very easily quantifiable number of states. – forest Mar 07 '18 at 10:02
  • This is an upper bound given by the Bekenstein Bound - it's the maximum potential storage in a region of space with a given energy content. What number of bits is accessible to the human brain's thought mechanisms is quite another matter as you say. This is quite different from the information needed to specify the total quantum state. – Selene Routley Mar 07 '18 at 21:05
21

Assuming infinite precision in measurement, an infinite number of bits can be stored in a single atom.

Take the information you want to store, encode it into a string, and then calculate the Gödel number of the string. Call that number $n$. Then, excite a hydrogen atom to exactly the $n^{\rm th}$ energy level.

In practice, the properties of a real hydrogen atom will make this completely impractical, of course, but we're just talking pure theory.
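
A minimal sketch of the bookkeeping, assuming perfect state preparation and measurement, and using the plain integer encoding RBarryYoung suggests in the comments rather than Gödel's original arithmetisation:

```python
def message_to_level(message: bytes) -> int:
    """Map a message to the principal quantum number to excite the atom to."""
    return int.from_bytes(b"\x01" + message, "big")  # marker byte keeps n >= 1

def level_to_message(n: int) -> bytes:
    """Recover the message from a (perfectly measured) energy level."""
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return raw[1:]                                   # strip the marker byte

n = message_to_level(b"hi")
assert level_to_message(n) == b"hi"
print(n)  # 92265: even a two-byte message needs the 92265th energy level
```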

Zo the Relativist
  • 41,373
  • 2
  • 74
  • 143
  • +1 Most interesting and a neat, clean example. I'm not sure how this fits with the Bekenstein bound (you'd be better qualified than I to comment) if the latter indeed holds; maybe the H energy states would undergo a change in a full quantum description of spacetime such that the infinite number of very high $n$ ones actually become finite in number. I guess this comment would just be one of the many loose ends needing to be resolved with a full resolution of the black hole information paradox. – Selene Routley Nov 04 '14 at 06:50
  • @WetSavannaAnimalakaRodVance: well, you're kind of cheating the Bekenstein bound with this example, since you're basically utilizing a zero entropy state of a single atom to store your information. There's no reason to expect the thing to collapse to a black hole, since it is still just a hydrogen atom with a very nearly ionized electron. – Zo the Relativist Nov 04 '14 at 07:12
  • yes, if you are allowed to get rid of noise you can also compute using neural networks over the reals, which are also impractical but demonstrably more powerful than a Turing computer. The Bekenstein bound is related to digital storage, which is the only paradigm we know to work above thermal noise (I mean, you could code using codes beyond binary, such as using $n$ states of an atom, but there is an upper limit to $n$; $n$ cannot be arbitrary as in your example). –  Nov 04 '14 at 07:16
  • 4
    The size of the excited atom is going to be very big. If you measure efficiency as "bits per atom", surely it's the way to go. If you measure efficiency as "bits per volume of space", not at all. – user27542 Nov 04 '14 at 08:45
  • 1
    an atom occupies a volume in space; the larger the number of bits, the larger the atom (because the more excited the state, the larger the electron cloud becomes). Whatever system you would like to imagine will fall short of the ultimate computer, defined as the most efficient one you can have (which includes all possible degrees of freedom allowed by the laws of physics) just before it collapses into a black hole. –  Nov 04 '14 at 09:31
  • Aren't Gödel numbers non-unique? In which case, the received information from observing the $n$th level could mean a multitude of things. – Kyle Kanos Nov 04 '14 at 19:10
  • @KyleKanos: no, the whole point of Gödel numbers is that they're unique -- given a coding scheme, the relation between the number and the string is one-to-one (this is different from hashcodes). The fact that the atom would get arbitrarily large is a complaint about this. – Zo the Relativist Nov 04 '14 at 20:11
  • 1
    Since you are starting with bits, there's no real need to go to string encoding and then Gödel encoding; you can just say "the integer value of the binary string of bits", which is unambiguous. – RBarryYoung Nov 05 '14 at 15:14
  • @WetSavannaAnimalakaRodVance: Doesn't the radius scale as $n^2$? But in any case having $n^3$ distinguishable states isn't a problem, since they only contain $\log_2(n^3)\sim k\log n$ bits of information. – Charles Nov 05 '14 at 21:47
  • @RBarryYoung Actually, in modern explanations of Gödel's theorem and uncomputability, you simply take the Gödel number to be exactly as you say - the integer value of the binary string of (say ASCII) characters encoding a mathematical proposition. Gödel worked his encoding scheme out before computers, and before the concept of an array in memory and its binary value had become internalised by scientists the world over. – Selene Routley Nov 05 '14 at 22:36
  • @WetSavannaAnimalakaRodVance: I don't think it scales as $n$ but rather as $n^2$ (making the volume scale as $n^6$), see http://physics.stackexchange.com/q/144819/2818 – Charles Nov 06 '14 at 00:47
  • @Charles Deary me I'm having a truly bad hair day. See my final calculation at the end of my updated answer. – Selene Routley Nov 06 '14 at 06:03
  • @user27542 Please see the end of my updated answer. I think your explanation solves the problem and Jerry's encoding wouldn't even come near to violating the Bekenstein bound, even if we could measure energy with infinite precision. – Selene Routley Nov 06 '14 at 06:05
  • $\delta E_n \gg \Delta E_n$. The proposed system would require measurement better than theory allows. – user121330 May 24 '16 at 16:55
2

Actually, theorists have worked out the ultimate computer, assuming that black holes exist. The basic procedure for estimating the maximum amount of information that can be stored in a given amount of space comes down to how much information you can store at the smallest possible scales, as you keep concentrating matter and energy to code for bits. Information and entropy are related, so the ultimate limit is believed to be the entropy of a black hole: as you keep miniaturising and the information density grows larger, the system will eventually collapse into a black hole. See for instance this article for a popular-science viewpoint: http://www.nytimes.com/library/national/science/090500sci-physics-computers.html

1

I think that, in theory, the smallest scale at which we can store information is one particle, or one Planck length; that's the limit of quantum theory. Maybe we could store information in a sheet of something that has one slot per 1×1 Planck-length square, with the state of the particle in each slot being one bit of information we can store.

Thaina
  • 898
  • But how much in a unit: one bit or more? One of the main reasons I'm asking this is that I don't see why a limitation to two states would be inherent to the physics. And if it's not, then what's the limit? – Tyler Nov 04 '14 at 04:43
  • I guess if a Planck length is the smallest scale at which you could store something, and you had a 3D lattice of this something whose slots can only be "full" or "empty", then that would imply a binary system. – Tyler Nov 04 '14 at 04:47
  • I might be wrong (it has been years since I read about this in depth), but before you would be able to manipulate bits at the Planck scale, your system would collapse into a black hole, which is why that is nowadays considered the ultimate limit. See the link in my answer for more details. –  Nov 04 '14 at 05:32
-1

I can also think of a method that can store infinite data in a couple of objects/atoms/whatever, depending on how well you can measure. Just measure the rotation of one object relative to another object on the same axis. Simple example: take two disks that can spin on the same axis, each with an indicator marking which side is up. The angle between these two indicators lies between 0 and 360 degrees, and depending on how accurately you can measure it, you can store arbitrarily much data this way.
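
A back-of-envelope sketch of how the capacity of this scheme scales with angular measurement resolution; the resolutions below are arbitrary examples:

```python
import math

def angle_bits(delta_degrees):
    """Bits storable in one relative angle readable to resolution delta."""
    return math.log2(360.0 / delta_degrees)

for delta in (1.0, 1e-6, 1e-12):
    print(f"resolution {delta:g} deg -> {angle_bits(delta):.1f} bits")
# 8.5, 28.4 and 48.4 bits: capacity grows only logarithmically with precision,
# and in practice noise sets a floor on delta.
```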

Ivo
  • 99
  • 3