-7

Over a packet-switched network, such as the internet, data is sent as packets of bits; with Wi-Fi (802.11g and so on), those bits are sent wirelessly via radio waves.

My question is this:

Radio waves are light; light has no mass. How can data be sent via radio waves if radio waves have no mass, hence, can't carry data?

This also applies to digital-to-analog (or analog-to-digital) TV setups. An analog TV unit uses a digital converter box to receive radio waves. How can the antenna receive radio waves and send them to the box, which extracts data from them, if radio waves are light with no mass?

The data would have to be converted to some light-based form, like irradiance or radiative flux values, or else it seems impossible to me. But I'll bite ... what is wrong here?

Source 1: http://en.wikipedia.org/wiki/Radio_wave

Source 2: http://en.wikipedia.org/wiki/Bit#physical_representation

  • 15
    Why do you think data has mass?? – Bubble Oct 03 '13 at 20:33
  • Read the link. Physically implemented packet-switched bits are high and low voltage specifications through channels of flow on an electric circuit within an electric field; electrons have mass. Light, aside from radiative flux/irradiance, has no "states" of mass as illustrated in a circuit's flow, logic gates, and bit usage. The mass would have to somehow be converted to a state of corresponding, mass-lacking analog light values before being transferred digitally to the mass of electrons once again. – Andy Harglesis Oct 03 '13 at 20:34
  • 6
    It takes energy to transmit information. And, indeed, both electrons and photons have (and are) energy. – Dmitry Brant Oct 03 '13 at 20:41
  • So @MarkMitchison, you are telling me that the "data" is encoded in a different energy form, which can then be decoded appropriately and translated digitally into the circuit's logic itself? – Andy Harglesis Oct 03 '13 at 20:45
  • 4
    Yup, I sure am. – Mark Mitchison Oct 03 '13 at 20:46
  • Can you explain how light holds states? Is it irradiance or radiative flux that makes this possible? Can you tell me exactly how the data is encoded into a different form of energy, such as light, and then back into another, such as "bits" in a circuit? – Andy Harglesis Oct 03 '13 at 20:49
  • And no, I am not talking about digital-to-analog or analog-to-digital conversion; I'm talking about the energy changes themselves. – Andy Harglesis Oct 03 '13 at 20:52
  • You've heard of antennas, right? – Mark Mitchison Oct 03 '13 at 20:54
  • Yes, but you did not answer my other questions above in regards to light's "states". – Andy Harglesis Oct 03 '13 at 20:55
  • 2
    The encoding is not a property of the light. It is a property of the system. Think about it for a minute and it should be obvious. Storage is also not a property of light, or magnetism or any other physical mechanism; it is a property of the system. – dmckee --- ex-moderator kitten Oct 03 '13 at 23:43
  • @AndyHarglesis How about this for light's states, in terms of the sun: night and day. It's about that simple, but there are ways to manipulate light other than on and off; you can change frequencies, phases, or polarization, for example... – user6972 Oct 04 '13 at 05:05
  • Andy, if you want to know how electromagnetic radiation can carry information, look up amplitude modulation and frequency modulation. – rurouniwallace Oct 07 '13 at 21:50

4 Answers

4

In order to store data in a bit (or information, the two are usually not distinguished in physics) you just need to have access to two distinguishable states. For example, it should be obvious to you that you could shine a laser pointer at me, and by flicking it on and off, transmit a message in Morse code (hopefully not blinding me in the process). There is no requirement for the "on" and "off" (0 and 1, whatever) states to be different in energy, or cost energy to make. Of course, it almost always will cost you some energy to encode the data in your data carrier of choice. Indeed, it costs energy to generate the radio waves or laser beam in these examples. This is because data must be carried by something physical, which means the data carrier is associated with an energy, whether that be the energy of a photon or the energy stored in the mass of an atom or electron etc.
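
To make the laser-pointer example concrete, here is a minimal Python sketch of that on/off scheme (on-off keying). The 8-bits-per-character, most-significant-bit-first convention is an arbitrary choice that sender and receiver must share; it is not a property of the light itself:

```python
# Minimal on-off keying (OOK) sketch: the "laser" is modelled as a list of
# intensity samples, one per bit period (1 = on, 0 = off). The encoding is
# a convention shared by sender and receiver, not a property of the light.

def encode_ook(message: str) -> list[int]:
    """Map each character to 8 intensity samples, most significant bit first."""
    samples = []
    for char in message:
        for i in range(7, -1, -1):
            samples.append((ord(char) >> i) & 1)
    return samples

def decode_ook(samples: list[int]) -> str:
    """Group the samples back into bytes and recover the characters."""
    chars = []
    for i in range(0, len(samples), 8):
        byte = 0
        for bit in samples[i:i + 8]:
            byte = (byte << 1) | bit
        chars.append(chr(byte))
    return "".join(chars)

print(decode_ook(encode_ook("hi")))  # -> "hi"
```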

  • So data is just any way of representing a state; light still must have a way to hold "states" in an energy. How does it do that? Irradiance? – Andy Harglesis Oct 03 '13 at 20:45
  • 1
    I'd prefer to say that having more than one different state allows you to represent data/information. But basically yes. There are lots of ways of encoding data in light, since there are many different ways of tuning the light pulse. You could use intensity/amplitude, frequency, or digital schemes like PCM. Frankly I last learned about this in high school, you might be better off asking on EE.SE – Mark Mitchison Oct 03 '13 at 20:52
  • So light's states are not held in terms of irradiance or radiative flux? I didn't ask about frequencies, intensity, or pulse-code modulation; I asked specifically how the encoding of multiple values works with light, in terms of photons. – Andy Harglesis Oct 03 '13 at 21:00
  • Actually I did answer your question @AndyHarglesis. The irradiance of an EM wave is given by the amplitude squared, so the multiple states for encoding data by amplitude modulation (AM) are given by the possible values of the electric field amplitude, or irradiance if you prefer thinking of it that way (see the sketch below). – Mark Mitchison Oct 03 '13 at 21:16
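
To make that last comment concrete, here is a minimal Python sketch of binary amplitude modulation, where the receiver tells the two states apart by the measured irradiance (the mean of the amplitude squared). The carrier frequency, amplitude levels, and decision threshold are arbitrary illustrative choices:

```python
import numpy as np

# Binary amplitude modulation sketch: bit 1 -> full-amplitude carrier,
# bit 0 -> half-amplitude carrier. The receiver recovers each bit from the
# irradiance, i.e. the mean squared amplitude over one bit period.

CARRIER_HZ = 10.0        # illustrative carrier frequency
SAMPLES_PER_BIT = 100    # samples per bit period

def modulate(bits):
    t = np.arange(len(bits) * SAMPLES_PER_BIT) / SAMPLES_PER_BIT
    amplitude = np.repeat([1.0 if b else 0.5 for b in bits], SAMPLES_PER_BIT)
    return amplitude * np.sin(2 * np.pi * CARRIER_HZ * t)

def demodulate(signal):
    bits = []
    for i in range(0, len(signal), SAMPLES_PER_BIT):
        irradiance = np.mean(signal[i:i + SAMPLES_PER_BIT] ** 2)
        # full amplitude -> mean square 0.5; half amplitude -> 0.125
        bits.append(1 if irradiance > 0.3 else 0)
    return bits

print(demodulate(modulate([1, 0, 1, 1, 0])))  # -> [1, 0, 1, 1, 0]
```
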
4

I'd like to add some details to Mark Mitchison's spot-on answer. There really, truly is in principle no lower bound to the cost in energy, work, or "mass" you have to supply to send and receive information between one physical system and another. Or, more precisely, there is no cost that arises from the nature of information itself. Costs in transmission arise from the fact that, as Mark concisely puts it:

In our universe information has to be written in some kind of "real ink" and that ink is the states of physical systems

Information in practice cannot be some kind of abstract, disembodied knowledge or a sequence of symbols, even though we often (very helpfully) treat it as such in probability and information theory. In the example in your comments:

Physically implemented packet-switched bits are high and low voltage specifications through channels of flow on an electric circuit within an electric field; electrons have mass.

the sent information is encoded in the physical states of massive particles, electrons. But it can equally well be encoded in the states of massless things like photons.

So the reasons why information sending and receiving "costs" something are slightly indirect, and they arise from, in rough order of descending fundamental importance:

  1. Landauer's principle, and, equivalently, the second law of thermodynamics;

  2. Signal to noise limitations in the particular physical system whose states you choose to "write" your information in;

  3. Dissipation in the particular physical system whose states you choose to "write" your information in;

Points 2. and 3. depend on the system you work with, so they are not as fundamental as the first. Notice that I haven't mentioned things like the Heisenberg uncertainty principle or other quantum-mechanical limitations. These show up in point 2. above; in principle, you can make them as small as you like by encoding your information in "bigger and bigger" reversible (lossless) classical systems, so that, for example, the HUP accounts for less and less of the total noise limiting your signal transmission. Of course, in practice none of our technology so far is lossless or reversible, so now and hitherto this means building bigger and more dissipative systems, and the limits in 2. and 3. are very real. Quantum information transmission technology, which makes use of reversible state machines, may make these limits less important in the future.

Landauer's Principle:

The fundamental cost in shifting information from one system to another is that we have to make room for it at the receiver end! In other words, we have to encode it in the states of, say, electron spins at the receiver end. The sending and encoding we can do without energy loss: the loss arises when we ask what becomes of the information contained in the former electron spins, before we wrote over them with the newly received information.

The microscopic laws of physics are reversible, so in principle a closed physical system's total state at any time is related to its state at any other time, before or after, by a one-to-one mapping. So the former electron spins have to show up encoded somehow in the state of the physical system around our receiver. This means the thermodynamic entropy of these surroundings continually rises as the information arrives and writes over our receiver's electron spins. Eventually one must do work, as dictated by the second law of thermodynamics, to throw this entropy out of the system, as otherwise the system will simply stop functioning. A side note here: chemists sometimes talk about the Gibbs (or other, e.g. Helmholtz) free energy as the enthalpy less the work needed to expel the excess entropy the reaction products have relative to the reactants.

How do we show all this is true (at least by a physicist's, as opposed to a mathematician's, proof)? There is a way to account for the Maxwell Daemon and the Szilard Engine that makes these two thought experiments comply with the second law of thermodynamics in the long term, and that is through Landauer's Principle: the idea that the merging of two computational paths, or the erasing of one bit of information, always costs useful work, an amount given by $k_B\,T\,\log 2$, where $k_B$ is Boltzmann's constant and $T$ is the temperature of the system doing the computation.
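
As a quick worked example of how small that bound is (a short Python sketch; the 300 K operating temperature is an assumed value):

```python
import math

# Landauer's limit: the minimum work to erase one bit is k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann's constant in J/K
T = 300.0            # assumed temperature in kelvin (roughly room temperature)

limit = k_B * T * math.log(2)
print(f"Erasing one bit at {T:.0f} K costs at least {limit:.2e} J")
# -> about 2.87e-21 J, far below what today's electronics dissipate per bit
```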

This argument was finalised by Charles Bennett; an excellent reference is his paper "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21(12), 1982.

Bennett invented perfectly reversible mechanical gates ("billiard ball computers") whose state can be polled without the expenditure of energy and then used such mechanical gates to thought-experimentally study the Szilard Engine and to show that Landauer's Limit arises not from the cost of finding out a system's state (as Szilard had originally assumed) but from the need to continually "forget" former states of the engine.

Probing this idea more carefully, as also done in Bennett's paper: one can indeed conceive of simple, non-biological finite state machines that realise the Maxwell Daemon - this has actually been done in the laboratory - and as the Daemon converts heat to work, it must record a sequence of bits describing which side of the Daemon's door (or engine's piston, for an equivalent discussion of the Szilard engine) the molecules were on. So we get the same situation as our electron-spin receiver above: for a finite-memory machine, one eventually needs to erase the memory so that the machine can keep working, hence ultimately the need to throw thermodynamic entropy out of the Maxwell Daemon.

Signal to Noise Limitations

You might like to see my answer here to Maximum theoretical bandwidth of fibre-optics. Here the energy requirements to send information are indeed lower bounded by quantum mechanics and the statistics of counting photons: you must send a minimum number of photons to represent a symbol with acceptably low uncertainty. If you could send the knowledge of a system's state noiselessly, then you could theoretically transmit anything you like, even at symbol rates approaching zero bits per second. Given a description of an information-sending channel that includes the realisable signal-to-noise ratio, the Shannon-Hartley form of the noisy channel coding theorem (see also here) indirectly puts lower bounds on the energy needed to send information in the face of noise. These remarkable theorems show that there is always a way to send information in an arbitrarily noisy environment, but our bandwidth is limited by the noise, given the available energy supply for sending information. Thus, although the Voyager 1 probe is still in contact with Earth, its transmission rate is exceedingly low.
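
To see how the Shannon-Hartley bound $C = B \log_2(1 + S/N)$ trades signal-to-noise ratio against bit rate, here is a small Python sketch; the bandwidth and SNR figures are illustrative assumptions, not Voyager's actual link budget:

```python
import math

# Shannon-Hartley channel capacity: C = B * log2(1 + S/N), with B the
# bandwidth in hertz and S/N the linear (not dB) signal-to-noise ratio.

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# A comfortable terrestrial link: 1 MHz of bandwidth, strong signal.
print(capacity_bps(1e6, 1000.0))  # ~9.97e6 bits/s

# A deep-space-like link: same bandwidth, signal buried near the noise floor.
print(capacity_bps(1e6, 1e-3))    # ~1.4e3 bits/s - still nonzero, just slow
```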

2

I want to correct a misapprehension here, though mass has nothing to do with carrying information.

Radio waves are light; light has no mass. How can data be sent via radio waves if radio waves have no mass, hence, can't carry data?

The misapprehension is the statement "light has no mass". The true statement is "a photon has no mass".

Two photons can very well carry a mass: the measure of their added four-vectors. An invariant mass exists even for massless particles. For two photons with energies $E_1$ and $E_2$ whose momenta subtend an angle $\theta$, it is:

$$m^2 c^4 = (E_1 + E_2)^2 - \left|\vec{p}_1\,c + \vec{p}_2\,c\right|^2 = 2\,E_1 E_2\,(1 - \cos\theta)$$

And this is just for two photons. An electromagnetic wave will always diverge; there is a $1/r^2$ fall-off in the energy density of the wave (laser light diverges less), and there will be angles between the multitude of photons. Thus an electromagnetic wave can be characterized by a total invariant mass, but that has nothing to do with information, of course. The Heisenberg uncertainty principle also ensures that there will always be some divergence.
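
For concreteness, here is a small Python sketch of the two-photon invariant mass in natural units ($c = 1$); the photon energies and the angle are arbitrary illustrative values:

```python
import math

# Invariant mass of two photons in natural units (c = 1): each photon has
# E = |p|, so m^2 = (E1 + E2)^2 - |p1 + p2|^2 = 2 * E1 * E2 * (1 - cos(theta)).

def two_photon_mass(E1: float, E2: float, theta: float) -> float:
    return math.sqrt(2 * E1 * E2 * (1 - math.cos(theta)))

print(two_photon_mass(1.0, 1.0, math.pi))  # back-to-back photons: m = 2.0
print(two_photon_mass(1.0, 1.0, 0.0))      # parallel photons:     m = 0.0
```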

0

You're thinking too strictly about data. Data can be stored in all sorts of ways. Take the following byte of data, for example:

10110100

That's eight binary bits: one byte. You could represent this byte in all sorts of ways. On a magnetic disc, with regions magnetized in opposite directions. On a CD, with material that reflects a laser differently. With creek pebbles in sidewalk squares (a pebble in a square represents 1, no pebble represents 0). All of these are different media. The only important thing is that the transmitter (or writer) and receiver (or reader) both agree on the system. If you don't know to read the sidewalk for pebbles, you're never going to get the message.
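
For instance, here is a minimal Python sketch of reading that byte back out of the pebble medium; the pebble-means-1, most-significant-bit-first convention is exactly the shared system described above:

```python
# The byte 10110100, "stored" as pebbles in eight sidewalk squares.
# Reading it back only works because writer and reader share the convention:
# pebble present = 1, empty square = 0, most significant bit first.

pebbles = [True, False, True, True, False, True, False, False]

byte = 0
for square_has_pebble in pebbles:
    byte = (byte << 1) | int(square_has_pebble)

print(byte, bin(byte))  # -> 180 0b10110100
```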

So with light, you could do Morse-code-style on/off (beep - beep beep - beep - -), or you could use frequency changes, amplitude changes, color changes, whatever.

Data, therefore, can involve lots of mass (pebbles), very little mass (a hard drive platter), or no mass at all (light, a.k.a. energy). Of course, energy and mass are but different manifestations of the same thing.