39

Does information itself have any detectable mass? If so, how can the mass of information be measured, if that is possible at all? Mathematically, is it possible for information itself to have mass, and what equation would pose or demonstrate this? If there is a practical, physical way to measure whether information itself has mass, how can such a measurement be achieved? Furthermore, if information does have mass, wouldn't information necessarily contain energy as well, as per mass-energy equivalence?

Brando
  • 649
  • 3
    I've deleted a number of comments that were either discussion or attempting to answer the question. Please keep comments focused on suggesting/requesting improvements or clarifications of the question. Thanks! – tpg2114 Oct 07 '20 at 00:37
  • 1
    As per Carlo Roveli, information does not have or need mass. A black hole can evaporate and lose mass but it still stores information that is released in a white hole much into the future – Jus12 Nov 26 '23 at 07:16

12 Answers

30

Information does not have mass, but the physical materials containing that information may. For example, if one wishes to store information robustly, one may choose a structure which represents that information in a way that carries a considerable amount of energy. That energy has mass by relativistic principles. However, it is not the information storage that causes the mass, but merely the physical properties of the medium.

As a concrete example, consider an object which can encode information (such as a magnetic tape). We have two identical tapes. Onto one tape, we load random noise, with no information content. Onto the other tape, we load information encrypted with a symmetric encryption key (we encrypt it here because one principle of encryption is that the encrypted results are indistinguishable from random bits unless you have the decryption key, making the comparison more clear). The energy and mass of these tapes will be identical, while one carries information and the other does not. The only way to distinguish them is to read their contents and decrypt the data.

As a more extreme step, consider what would happen if we destroyed the key. If information did indeed have mass, the second tape would be obliged to lose mass when we destroy the key.

Cort Ammon
  • 48,357
  • 46
    But even the random noise is information (which may or may not be useless) - it still follows that when you read the data multiple times, you get the same results. You're considering a stronger meaning of information - something that happens to be useful. Throwing away the encryption key is not really that different from the information no longer being useful, but that doesn't really change the property that when you read the information, it's still there. Compare this to an entropic system, where each read will give you different results - that's closer to what "no information" means. – Luaan Oct 05 '20 at 06:51
  • 4
    @Luaan True, there are multiple definitions of "information," which can complicate the issue. Here I was using the computer science definition (which I believe is the same as they use in signal processing). In that definition, a number derived from a random number generator is defined to have no information content. – Cort Ammon Oct 05 '20 at 07:06
  • 1
    For completeness, I think you need to demonstrate that the encryption key is also massless. I'm not entirely happy using a CS analogue, and think it would be more useful to demonstrate that information can move at precisely the speed of light rather than being retarded by some minuscule amount. – Mark Morgan Lloyd Oct 05 '20 at 08:32
  • 1
    Final paragraph is the clincher! If we burned a copy of MacBeth and then burned another copy where all the letters had been randomly jumbled before printing, but was otherwise identical, it is inconceivable that we would get a different amount of energy out. – Oscar Bravo Oct 05 '20 at 09:01
  • 8
    I don’t think using the CS theory of information works here. – Tim Oct 05 '20 at 09:32
  • @Luaan In that case, you can consider that all physical objects store some amount of information. All videotapes store the same amount of information and have the same mass; so do all USB drives, and all pieces of paper. Only a small fraction of that information is accessible to humans - the rest of it can't be changed without destroying the object. Is the information the same thing as the object? – user253751 Oct 05 '20 at 09:53
  • 16
    @CortAmmon "In that definition, a number derived from a random number generator is defined to have no information content." Huh? The information content of output from an ideal stochastic RNG is at a maximum, because it is unpredictable. – rexkogitans Oct 05 '20 at 13:47
  • 1
    @CortAmmon your answer would be more convincing if you compared a tape loaded with information with a clear tape. I admit that I do not know enough physics related to the recording technology to say if the recording process deposits a measurable amount of energy on the tape. – Vorbis Oct 05 '20 at 14:02
  • 2
    It's like saying "Does sound have mass?". Information is not a physical entity, but technically you do need mass to read the information. Computers do it by creating high voltage (1) and low voltage (0), so they are actually using electrons to read that information. No one can read the information directly from the EM waves themselves. – Jdeep Oct 05 '20 at 15:04
  • 1
    "Destroying the key" in a sense where the information is irretrievably lost is not as simple as it sounds. It is actually difficult to do this. The black hole information paradox has connections to this problem. – rnva Oct 05 '20 at 15:55
  • @Vorbis I thought about going down that road, but it seemed more difficult. A tape which has more potential energy because of the data it has on it would have more mass. So I wanted a tape which should have the same amount of potential energy. Interestingly enough, many modern hardware protocols like SONET do indeed have some guarantees about the energy (such as a DC offset of 0), but that gets into the weeds terribly quickly. – Cort Ammon Oct 05 '20 at 17:29
  • @MarkMorganLloyd I did think about the key. Assuming the encryption key is massless is a bit of a circular argument. However, the argument can be further completed by pointing out that the key can be used to encrypt an arbitrary amount of data. If the change in mass of the key is important, its importance changes based on how many bits you write, asymptotically approaching zero. I would argue this shows that the mass of the key cannot be the clincher arguing that its mass is why information has mass. If so, then its mass would be dependent on a completely separate process! – Cort Ammon Oct 05 '20 at 17:31
  • @CortAmmon I'm fairly happy with your logic, but there's still a temptation to argue that as the amount of encrypted information increased towards infinity the mass of the key decreased towards zero, and/or that you cannot determine accurately both the amount of information (which I think is equivalent to decryption) and its mass. But I'm well outside my area of speciality, so really ought to shut up. – Mark Morgan Lloyd Oct 05 '20 at 17:51
  • @MarkMorganLloyd One certainly can argue that way. There's always multiple models to describe what we see, with their own quirks. Personally, the idea that the mass of a key changes based on actions that are not based on the key (perhaps based on a copy that is destroyed in the writing process), is as bothersome as the retrocausality that arises from the delayed choice quantum eraser if you choose to think of it as a classical system. But you can approach them either way. – Cort Ammon Oct 05 '20 at 18:04
  • 21
    This answer can be improved by explicitly defining "information". The answer seems to implicitly rely on a nonstandard definition that many people would disagree with (including computer scientists). – usul Oct 05 '20 at 19:45
  • 1
    "Information does not have mass, but the physical materials containing that information may" Information doesn't exist outside its physical instantiation, and the energy required for its manipulation has a lower limit. See: https://en.wikipedia.org/wiki/Landauer%27s_principle – spacetyper Oct 06 '20 at 07:32
  • 1
    @spacetyper That corner case is where it gets really interesting. That's where we have to be really cautious in splitting hairs between the information and the processing of that information. It's known, for instance, that Landauer's principle can be violated by using reversible computing, because it was never intended to apply in that case. – Cort Ammon Oct 06 '20 at 18:57
  • @spacetyper That is how much energy it takes to delete information. If you think about it that way, wouldn't you intuitively say that a blank tape has more mass, because you put energy in when you deleted the data? – user253751 Oct 07 '20 at 11:53
  • @user253751 Whenever you "create" information and have to store it somewhere, you have to overwrite/"delete" whatever information was preexisting on the medium (or create more medium). That's why Maxwell's demon doesn't work (it has to store the entropy it extracts from the gas into its memory), and your cassette requires energy both to "delete" and "add" data (it's "writing" either way). However, I suspect some patterns of bits have more energy than others (alignment of magnetizations), and I further suspect that, on average, more (computational) entropy/info => more energy/mass. – HTNW Oct 07 '20 at 22:49
  • @HTNW We assume that blank media are in a known state. Flipping a bit does not necessarily use energy according to Landauer. Only resetting a bit to a known state does. If you know the medium contains only 0s, then instead of forcing bits to 1, you can flip the bits that you want to be 1. The result will be incorrect if the medium wasn't actually blank. – user253751 Oct 13 '20 at 14:22
  • “Onto one tape, we load random noise, with no information content.“ Experimental noise is a perfectly respectable genre of music. – Jackson Walters Dec 03 '20 at 04:31
  • I think we're missing the potential energy that is stored in the tape when the information is written to it. That has its own small mass, the rest is the mass of the medium. – MatrixManAtYrService Mar 30 '22 at 03:21
28

Claude Shannon proposed the idea of information entropy, which is essentially about how much uncertainty you have about different outcomes. For example, when I read 100 bytes from a hard drive, I would (almost always) expect to get the same 100 bytes over and over again; if I read 100 bytes from a random number generator over and over, I would (almost always) expect to get a different 100-byte sequence each time. In the first case, there is information - in the second, there is no information.

As it turns out, thermodynamic entropy is a kind of information entropy. I'm not going to dwell much on that because even giving a good explanation of thermodynamic entropy is tricky. But one can imagine a scenario where you can convert information into free energy. Consider two pistons, opposite each other in one chamber. There is a single molecule of "working fluid" between the two, and a removable partition. If you know which side of the partition the molecule is on, you can open and close the partition accordingly and produce useful work. In fact, this has been demonstrated in an experiment (though obviously not in any practical way). If you want to know more, have a look at Szilard's Engine. Note that what we have done is convert information to energy (regardless of how efficient the process actually is!).

Does this qualify as "information has energy"? Some claim it does, some don't. It's definitely weird to think about :)
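Both points can be illustrated numerically. Below is a minimal Python sketch (the helper `shannon_entropy_bits` and the constants are my own illustration, not part of the original answer): it computes the entropy of a perfectly predictable read versus a uniformly random one, and the maximum Szilard-engine work obtainable per known bit.

```python
import math
from collections import Counter

def shannon_entropy_bits(data: bytes) -> float:
    """Shannon entropy, in bits per symbol, of a byte string."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# A fully predictable read carries no surprise per symbol...
predictable = b"\x42" * 100
print(shannon_entropy_bits(predictable))  # 0.0

# ...while a uniform spread over all 256 byte values is maximally surprising.
uniform = bytes(range(256))
print(shannon_entropy_bits(uniform))  # 8.0

# Szilard's engine: knowing one bit lets you extract at most k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
work_per_bit = k_B * T * math.log(2)
print(f"{work_per_bit:.3e} J")  # ~2.87e-21 J
```

At room temperature that is a few zeptojoules per bit, which is why the demonstration experiment mentioned above is nowhere near practical.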

Luaan
  • 6,369
  • 5
    The somewhat hidden essence of this answer may be that entropy and energy are closely related and connected, but quite different concepts. Energy has mass; entropy doesn't. Information is related to entropy, not to energy. – fraxinus Oct 05 '20 at 08:15
  • 1
    @fraxinus Essentially, yes. It can be involved in energy changes (e.g. storing information always requires a theoretical minimum of energy and having information allows you to extract energy), but saying it is energy seems to me to be quite a stretch. E.g. knowing the position of all the atoms in a glass of water does not make it cold; but if you had the means to act on that information, you could extract the energy from the heat without violating thermodynamics. But there are people who would equate the two. – Luaan Oct 05 '20 at 14:10
25

Yes

The mass of information can be inferred from the Bekenstein Bound. However, it depends on the spatial extent of the information: a larger space requires less mass per bit. But don't worry...information is very "light": we can store up to $10^{43}$ bits per kilogram within a sphere of 1 meter radius.

Note that if the universe is really a simulation running on "God's computer", Bekenstein gives us a lower bound on its hardware specs. ;)
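As a back-of-the-envelope check of that figure, here is a sketch (the helper name `bekenstein_bits` is mine, and I assume the bound in the common form $I \le 2\pi R M c/(\hbar \ln 2)$ bits):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(radius_m: float, mass_kg: float) -> float:
    """Maximum information (bits) in a sphere of given radius and mass,
    per the Bekenstein bound I <= 2*pi*R*M*c / (hbar * ln 2)."""
    return 2 * math.pi * radius_m * mass_kg * c / (hbar * math.log(2))

# ~2.6e43 bits for 1 kg in a 1 m sphere, matching the figure above.
print(f"{bekenstein_bits(1.0, 1.0):.2e}")
```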

Computation

If we consider information in terms of irreversible computation (the usual kind done by computers), then we must also account for the Landauer Limit. This one is more difficult, because we can't tie it directly to mass. It implies an energy expenditure to "perform" the computation, but really, it is only an entropy expenditure which can apparently be "paid for" by non-energy conserved quantities, like angular momentum (spin).

A more direct bound is provided by the Margolus-Levitin Theorem. This puts an upper bound of roughly $6\times10^{33}$ on the number of "operations" per second per joule of energy, which we can think of as the limit on "producing" new information via computation. Via mass-energy equivalence, we could also state this bound as a kind of "operations per second per kilogram" limit, which indirectly implies that computations also have mass.
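Both limits are easy to evaluate numerically. The sketch below (helper names are mine) assumes the standard forms $E \ge k_B T \ln 2$ per erased bit (Landauer) and at most $4E/h$ orthogonal state changes per second (Margolus-Levitin):

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def landauer_energy(T: float) -> float:
    """Minimum energy (J) to erase one bit at temperature T (K)."""
    return k_B * T * math.log(2)

def max_ops_per_second(E_joules: float) -> float:
    """Margolus-Levitin bound: max orthogonal state changes per second
    for a system of average energy E, i.e. 4E/h."""
    return 4 * E_joules / h

print(f"{landauer_energy(300):.2e} J per bit erased")    # ~2.87e-21 J
print(f"{max_ops_per_second(1.0):.2e} ops/s per joule")  # ~6.0e33
```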

*** EDIT ***

Encoded Information

Mr. Anderson's answer gives a very nice link (please upvote for this alone) to a paper by Dr. Vopson which describes the process of encoding a bit onto a storage medium. This is perhaps the most natural and intuitive notion of "information" with which most people are familiar. Vopson argues that the fact that the state persists without further energy input is due to the fact that the system actually increases in mass like so:

In this paper a radical idea is proposed, in which the process of holding information indefinitely without energy dissipation can be explained by the fact that once a bit of information is created, it acquires a finite mass, mbit. This is the equivalent mass of the excess energy created in the process of lowering the information entropy when a bit of information is erased.

The increase in mass depends on the temperature of the system, but he claims that:

...at room temperature (T = 300K), the estimated mass of a bit is ∼ $3.19×10^{-38} kg$.

He then goes on to propose an experiment whereby a 1 TB storage device is erased and then written to, magnifying this tiny mass by about $1\times10^{12}$. Unfortunately, this only brings the "informational mass" into the range of $10^{-25}$ kg, which is roughly the weight of 60 H atoms.
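Vopson's per-bit figure can be reproduced from $m_{\rm bit} = k_B T \ln 2 / c^2$. This sketch (the helper name is mine) assumes 1 TB $= 8\times10^{12}$ bits:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
c = 2.99792458e8    # speed of light, m/s

def vopson_bit_mass(T: float) -> float:
    """Vopson's proposed mass per stored bit: m = k_B * T * ln(2) / c^2."""
    return k_B * T * math.log(2) / c**2

m_bit = vopson_bit_mass(300.0)
print(f"{m_bit:.2e} kg per bit")  # ~3.19e-38 kg, as quoted above

# A full 1 TB device (8e12 bits) would gain only ~2.6e-25 kg,
# within the 1e-25 kg range mentioned above.
print(f"{m_bit * 8e12:.1e} kg per terabyte")
```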

Embodied Information

However, I will argue that the Bekenstein Bound referenced above is not about this so-called "encoded information", but rather refers to information which is intrinsic to a physical system. More precisely, I believe it refers to the amount of information required to clone the quantum system, if quantum cloning were possible. The bound explicitly defines an entropy, which is proportional to the number of microstates of the system.

So what does it mean for information to "have mass"? The Bekensteinian version is perhaps disappointing compared to the Vopsonian version. My interpretation is that for the universe to have information, it must have energy. I presume that an empty universe with no energy also has no information. But this also applies to a subregion of the universe as well. Information can only exist in a region which also contains energy. Furthermore, that energy has properties by which we can describe it. It has degrees of freedom. And those degrees of freedom result in an ensemble of possible microstates for that quantity of energy. The information embodied by the energy simply encodes which microstate corresponds to the quantity of energy.

Thus, a photon flying through space might be encoding information, if, for instance, a human selected it amongst a population of photons because of one or more of its properties. But regardless of the encoding, it also embodies information about its frequency, polarization, direction, etc.

More importantly, I will claim that all encoded information ultimately derives from embodied information via a selection process whereby particular microstates are chosen to represent information and other microstates are designated as "noise" and systematically filtered out or suppressed.

So ultimately, the idea that "information has mass" just boils down to the fact that information requires energy to exist, and mass and energy are equivalent. Boring, huh? The trick is that information does not have a fixed mass, but depends on energy density and particle number.

  • 3
    "a larger space requires less mass per bit" So this mass is not a property of information itself. – user76284 Oct 05 '20 at 21:09
  • 4
    @user76284 if you reduce the mass to 0, then you also eliminate the information capacity. Empty space holds no information. Mass is a necessary, but not sufficient, property of information. The mass also needs to be somewhere to have info. Interestingly, one could argue that mass and space are the minimum requirements to build a Turing machine. That is likely not a coincidence. – Lawnmower Man Oct 05 '20 at 21:14
  • 1
    What I mean is that the Bekenstein bound does not define the mass of, say, 1 bit. – user76284 Oct 05 '20 at 21:51
  • 3
    Nor does QED say the position of one electron. An electron may trade its location for momentum or vice versa, but that does not cause you to say that an electron has neither position nor momentum. In the same way, 1 bit may trade space for mass. In computing, we call it a space vs. time tradeoff, but perhaps physics has it right by calling it a space vs. energy tradeoff. I can store pi as an extended floating point value over space, or as an algorithm which requires an expenditure of energy to "read". Do you want to know where pi is at, or how fast it's moving, to mix metaphors? – Lawnmower Man Oct 05 '20 at 21:57
  • Lawnmower Man, the question is asking whether information has mass. Information does not have mass. Something which carries information may have mass. – user76284 Oct 05 '20 at 21:58
  • 3
    What is the difference? If you could point to some massless information, we could settle this dispute immediately. – Lawnmower Man Oct 05 '20 at 21:59
  • 2
    The question asks whether information itself has any detectable mass. "Is there a practical, physical way to measure if information itself has mass, how can this measurement be achieved?" – user76284 Oct 05 '20 at 22:00
  • Is the Bekenstein bound the mass of the information, or just the minimum amount of mass you need to store the information? There is a minimum size of garage I need to store a car, but that doesn't mean the garage is the car. – user253751 Oct 06 '20 at 13:56
  • @user253751 I can't say for sure. I think there are two kinds of information in play: embodied and encoded. Mr. Anderson cites Vopson, who describes encoded information and postulates an actual mass per bit. I think the Bekenstein bound is about embodied information, which I interpret as what you need to clone the quantum system. I know cloning is impossible, but each state embodies information nonetheless. Vopson's mass-energy-information equivalence principle is saying that info doesn't "need" mass or "energy"; but rather, that it is equivalent to those. I don't know what that means. ;) – Lawnmower Man Oct 06 '20 at 19:29
  • An intuitive argument would be helpful here. I'm sure I'm not the only one who is confused by this answer. Why would information have mass? I think you're saying it's because there is a cost for reading the information? – Steven Sagona Oct 07 '20 at 00:27
9

Yes. Indirectly.

First, what is information? It is the ability to make predictions.

Second, entropy $S$ (dimensionless) is a measure of unavailable information (for more, see this answer), and we know entropy is proportional to energy. So information and energy should also be related - Landauer's principle (per bit):

$k_B T\,[{\rm J}]\cdot \ln 2 \le E$

Now, you might consider that at rest $E=mc^2$ and so "deduce the mass" of a bit of information, like Vopson, or simply recognise that (with $S$ in nats)

$E=k_B T\,[{\rm J}]\cdot S\,[{\rm nats}]$

Entropy can be considered dimensionless; mass-energy isn't. This all expands on Luaan's answer. Also, this is the 2010 experiment which demonstrated that work can be extracted from information.
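The unit bookkeeping above can be sketched in a few lines of Python (the helper names are mine, not the answer's): one bit is $\ln 2$ nats, $E = k_B T S$ gives its Landauer energy, and $E/c^2$ gives the Vopson-style "mass":

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
c = 2.99792458e8    # speed of light, m/s
LN2 = math.log(2)

def bits_to_nats(bits: float) -> float:
    """One bit of entropy is ln(2) nats."""
    return bits * LN2

def entropy_energy(T: float, S_nats: float) -> float:
    """E = k_B * T * S, with S dimensionless (in nats)."""
    return k_B * T * S_nats

# Landauer energy of one bit at room temperature:
E = entropy_energy(300.0, bits_to_nats(1.0))
print(f"{E:.2e} J")  # ~2.87e-21 J

# "Deducing the mass" of a bit via E = m c^2, as Vopson does:
print(f"{E / c**2:.2e} kg")  # ~3.19e-38 kg
```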

Chris
  • 17,189
Mr Anderson
  • 1,399
7

For an experimental physicist, as I am, the question sounds like comparing apples and oranges.

Why do we have basic units? So that we can measure and compare apples with apples and oranges with oranges.

What are the units of information?

Does information have mass?

There are various unit systems for mass, in addition to kilograms.

For information there is a so-called natural information unit; according to the Wikipedia article:

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the bit. This unit is also known by its unit symbol, the nat. The nat is the coherent unit for information entropy. The International System of Units, by assigning the same units (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1. Physical systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy in nats.

If a physical variable may be said to be related to information entropy, it is energy per kelvin, but the relationship just allows one to see thermodynamic entropy in units of nats.

So, imo, information entropy and mass have no connection.

anna v
  • 233,453
  • 1
    From the fact that we have nits or bits it does not follow that they have no mass. After all, there is a dimensionless way of measuring oxygen (1 molecule of oxygen, 6.0221409*10^23 molecules of oxygen, etc.), yet oxygen has mass. – Maja Piechotka Oct 08 '20 at 00:51
    @MaciejPiechotka Oxygen has mass, and 1 molecule of oxygen has mass, but the number 1 has no mass, and unless a second number is known in units of mass, the mass of 1 molecule of oxygen is unknown. Mass is an attribute and has to be given in its units, in whatever system. 1 apple has mass, but the number 1 does not give the mass of the apple. – anna v Oct 08 '20 at 03:34
  • @annav okay so the question is asking: what's the weight of information in kg/bit? The fact that you don't know the mass of oxygen in kg/molecule doesn't mean it doesn't have any. – user253751 Oct 28 '20 at 13:18
4

Information does not have mass. Photons carry information, and they are massless. One could ask whether data has mass, but the answer to that is also no.

Natsfan
  • 2,652
2

Information itself does not have mass.

A simple example here is having a series of coins, where I lay them down heads/tails based on a binary 1/0 state. I am able to essentially convey any information I want (that we are currently able to express electronically). I could describe the entire content of Wikipedia if I had enough coins.
But mass-wise, this is no different than if I had a bunch of coins without arranging them based on some binary logic.
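As an illustrative sketch of that coin encoding (the helpers `to_coins`/`from_coins` are hypothetical names of my own), each character's bits map to heads and tails and back:

```python
def to_coins(text: str) -> str:
    """Encode text as a row of coins: H = binary 1, T = binary 0."""
    return "".join("H" if bit == "1" else "T"
                   for ch in text
                   for bit in format(ord(ch), "08b"))

def from_coins(coins: str) -> str:
    """Decode a row of coins back into text, 8 coins per character."""
    bits = "".join("1" if c == "H" else "0" for c in coins)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

row = to_coins("Hi")
print(row)  # THTTHTTTTHHTHTTH
assert from_coins(row) == "Hi"
# The same 16 coins, shuffled into a meaningless order, weigh exactly the same.
```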

That being said, there are ways to cheat the system in your favor here. If, for example, I placed the coins representing binary 1 on a raised platform, then those coins would have more potential energy, and you could argue that this information has energy, which in turn means it can have mass (good old $E=mc^2$).
It's a stretch, but it's technically correct.

But then again, I could also arrange these coins on varying platforms without conveying any information, so the existence of the information still doesn't really force the amount of mass/energy contained in the system to be any different.


However, we as humans are pretty much unable to perceive anything massless, so you can argue that information, which is inherently intended for humans to perceive, indirectly requires something that has mass.

That's a different question from what you asked, though.


Also, just as a thought experiment: if you consider that information could have mass, have you then also considered that it could have negative mass?

I'm thinking of carving a message into a stone tablet here. You are effectively removing bits of stone, thus lowering the mass rather than increasing it. If we were to hypothetically conclude that information has mass, it would seem contradictory that carving information into a stone tablet would then lower its mass.

Flater
  • 530
  • 2
    But mass-wise, this is no different than if I had a bunch of coins without arranging them - citation? How do you know? Analogy: before special relativity, people would have falsely asserted based on common sense that if A and B are each traveling at 1 m/s toward each other, then their relative speed is 2 m/s. Also: Imagine the coins were mounted vertically on frictionless axles. Then Landauer's principle implies (I believe) that spinning every coin to its reverse orientation requires strictly less energy than e.g. spinning them to all face heads. – usul Oct 07 '20 at 03:36
1

The Bekenstein bound sets a minimum energy per bit of storage in a confined region, and so a minimum mass. This is essentially due to the fact that a wavepacket that fits in the region has a minimum possible momentum and hence energy. Information is physical: to store information, something must be there.

We can also understand the minimum energy per bit directly in terms of the minimum energy for a given rate of change. In units with $h=2$, special-relativistic energy $E$ equals the maximum rate of distinct (orthogonal) state changes (for a long evolution), and each bit change is a distinct change. Thus if we have energy $E$ in a region of radius $R$, the fastest all the energy can come out is in time $t=R/c$, and the maximum number $N$ of bits that can change when that happens is $N=Et=ER/c$. This must be all the bits that were within the region. The minimum energy per bit is thus $E/N=c/R$ in these units, i.e. $hc/2R$ in ordinary units, so the minimum mass is $h/2Rc$.
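Numerically, for a region of radius 1 m, this sketch (the helper name is mine) evaluates $h/2Rc$:

```python
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

def min_mass_per_bit(R: float) -> float:
    """Minimum mass per stored bit in a region of radius R: m = h / (2*R*c)."""
    return h / (2 * R * c)

# ~1.1e-42 kg per bit for a 1 m region.
print(f"{min_mass_per_bit(1.0):.2e} kg")
```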

0

No. As a trivial counterexample, take any system that has two different states of the same energy.

0

At the crux of the question is another question:

Can information exist without a medium for it to be held in/on as well as a means for interpreting what is in/on the medium?

I'm going to say no. You may have a different opinion.

Must the medium have mass?

Probably yes. In the case of bare photons, maybe it's debatable.

Must the means for interpreting have mass?

Well, interpreting something implies a "who" or "what" to do the interpreting. So, some system with some complexity must exist. I'm going to make the bold claim that such a system almost surely has mass.

The final question:

Is the concept of information separate from the means which makes it exist?

This is a pretty deep question and I haven't read nearly enough philosophy textbooks to attempt an answer.

Anyway, my point is: the answer to "Does information have mass?" is going to entirely depend on how you answer all the questions above, and I don't think these are answerable without opinion (especially not the final one).

Bamboo
  • 163
0

It costs energy to create information, and therefore the energy of "information creation" has a mass.

Reading information also costs energy, and therefore has a mass associated with it.

But I don't think there are any current models that explicitly require that the information that was encoded with energy (and therefore mass) itself has mass. You can certainly encode information in the frequencies/amplitudes/polarizations of photons. While it costs energy to send them out into space, they will certainly exist without any mass while they propagate through space.

(Also, one thing to point out is that the creation of all information does NOT necessarily cost the same energy. For example, suppose I lay out a bunch of colored M&Ms on a table inside a van while it's driving, and the M&Ms are bumping around. It's going to be MUCH harder for me to keep all of the M&Ms separated by color than to let them mix. The energy it requires to keep things from moving toward their natural, most likely "mixed-up" state is related to the "information entropy" - and the more likely it is for things to get mixed up, the more energy it will cost to keep them in a particular state.)

-5

Yes, information does have mass, because if it did not, that would mean information can exist without mass; but if there is no mass, there is no information, so the only other option is that information does have mass. And you can measure it - the measurements exist today: computer bits. Computer bits measure how much information something stores; for example, one kilobyte is roughly the amount of information in half a novel.

  • 1
    Information has mass because otherwise it could exist without mass? I'm sorry, but that's gobbledygook. – ZeroTheHero Oct 05 '20 at 23:47
  • Cars have snake tails because if they did not, that means cars could exist without snake tails but if there is no snake tail there is no car so the only other option is cars do have snake tails and you can see them and the pictures exist today! – user253751 Oct 06 '20 at 13:58