
For example, if one measurement gives a position with twice the uncertainty of another, how much less information about position are you getting? In other words, if uncertainty doubles, is the information gained cut in half? What is the relationship? And does infinite uncertainty equate to zero information?

Edit: David, I am referring to measurement in terms of the uncertainty principle. If you choose to increase the frequency of the photon used to locate an electron, the uncertainty of momentum will obviously increase. My question is: if the uncertainty in an observable rises or falls between two measurements, what effect does that have on the information obtained about each observable? Put explicitly, if you do a measurement that gives you a more precise particle location, doesn't that mean you are getting more information about that observable? If you tell me my seat in a hockey arena is “section 104/row E/seat 16”, isn't that more information than telling me my seat is “somewhere in the stadium”?

4 Answers


Suppose the probability of measuring value $x_i$ is $P(x_i)$. The uncertainty of $x$ is then measured by the Shannon entropy (which is not quite the same as entropy in thermodynamics): $$ H(X)=-\sum_i P(x_i)\log P(x_i) $$ (If the base of the logarithm is $2$, the entropy is measured in bits; if it is $e$, in nats.)

After the measurement $A$, the distribution of the same quantity is given by the conditional probability $P(x_i|A)$, so that the uncertainty is: $$ H(X|A)=-\sum_i P(x_i|A)\log P(x_i|A) $$ The information (gained in the measurement) is then defined as the reduction in the uncertainty: $$ I(X|A) = H(X) -H(X|A). $$
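As a concrete illustration, here is a minimal Python sketch of these formulas. The prior and posterior distributions are made-up toy numbers, not anything derived from a real measurement:

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum_i P(x_i) log P(x_i); base 2 gives bits, base e gives nats."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical prior: the position is equally likely to be in any of 8 bins.
prior = [1 / 8] * 8                 # H(X) = 3 bits
# Hypothetical posterior after measurement A: narrowed down to 2 bins.
posterior_after_A = [1 / 2, 1 / 2]  # H(X|A) = 1 bit

info_gained = shannon_entropy(prior) - shannon_entropy(posterior_after_A)
print(f"I(X|A) = {info_gained:.1f} bits")  # 2.0 bits
```

This also answers the scaling question: for a uniform distribution, doubling the width adds one bit of entropy, so doubling the uncertainty subtracts a fixed one bit from the information gained rather than cutting it in half.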

A more detailed discussion of these concepts can be found in books on information theory, e.g., *Elements of Information Theory* by Cover and Thomas.

Roger V.

"Uncertainty" is an ambiguous term. For a standard measurement in physics, there are two recognized components of error.

Accuracy is a measure of how closely a measurement matches the "known" value. Obviously, the true value of a measurement isn't normally known, but in situations such as determining the concentration of a chemical in a chemistry experiment, you would deliberately mix a solution of known concentration, called the "standard", and measure that concentration to ensure that your measurement device/method was giving a true reading.

Precision is a measure of how many significant figures you can assign to a given measurement, and it is set by the device used to make the measurement. With a meter stick marked in centimeters but not millimeters, you can measure lengths to the nearest centimeter and estimate the fraction of a centimeter beyond the last mark. Thus, a line 65.3 centimeters long would be reported as 0.653 meters, where the least significant digit (the "3") is understood to be an estimated value. You wouldn't report the length as 0.6530 meters, because the rules of precision allow you to report only the first estimated digit. Likewise, with a meter stick marked in both centimeters and millimeters, you would report the line as 0.6532 meters, where again the least significant digit (the "2") is understood to be estimated. Note that when precision is properly reported, it is easy to discern that the first measurement was taken with a device marked in centimeters while the second was taken with a device marked in millimeters.
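A small Python sketch of this reporting rule (the function name and the division values are illustrative, not a standard API):

```python
import math

def report_length(reading_m: float, smallest_division_m: float) -> str:
    """Format a length so the final digit is the single estimated digit:
    one decimal place beyond the device's smallest marked division."""
    decimals = 1 - int(math.floor(math.log10(smallest_division_m)))
    return f"{reading_m:.{decimals}f} m"

print(report_length(0.653, 0.01))    # centimeter-marked stick -> 0.653 m
print(report_length(0.6532, 0.001))  # millimeter-marked stick -> 0.6532 m
```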

Regarding the "uncertainty" in the measurements described above, BOTH measurements give an answer within the precision of the device that was used. EACH answer is valid "information" in the context of accuracy and precision. Whether one answer can be considered as "better" or more valid than the other often depends on how precise an answer you need for a given application or experiment. This, of course, means that there is no strictly defined relation between "uncertainty" and "information", as both of these terms are ambiguous within the context of established physics norms.

David White
  • could you please respond to my edit of the question? Your answer isn't what I was after. – user21909 Aug 09 '18 at 00:51
  • @user21909, I can only answer your questions that are "classical". For a quantum mechanical answer, I will have to defer to the experts on this site. While my answer wasn't what you were after, I'm hoping that it prompted you to rephrase in a specific enough way for someone else to provide the kind of answer that you actually are looking for. – David White Aug 09 '18 at 01:13

“Data is not information. Information is not knowledge. Knowledge is not wisdom. Wisdom is not truth. Truth is not beauty. Beauty is not love. Love is not music. Music is THE BEST.” Frank Zappa

If you measure something, you get data. The amount of information in the data can be estimated from how much it reduces your uncertainty. If before the data came in you had no idea what to expect, and then you found out, that's something. But if you were pretty certain what you'd get and then you got it, that's much less. And if you were sure what the data would be and turned out to be wrong, you got more information than if you simply hadn't known.

If the data was the outcome of a horse race that you had money on, then the actual message is what you care about. If your horse really won but they didn't announce it, you lose your money anyway. But what if the thing you really care about is a general principle? You have an idea of what's producing the measurements, and each measurement helps persuade you that you're right or wrong. If you understand the main things, and there are various little things with small effects, you can accept some small errors. But once your big effects are accepted, somebody else might come along with theories about the small errors, and to him, those are the important things. "One man's meat is another man's poison. One man's data is another man's experimental error."

So what decides whether it's information or noise is what you care about.

How much information you get basically comes down to how surprised you are when you get it. Or maybe it's better to say: how much your uncertainty is reduced.

This is hard to quantify in a meaningful way, but there are ways to quantify it that you might find meaningful in specific circumstances.
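One standard way to put a number on that "surprise" is the self-information (surprisal) of an outcome, $-\log_2 p$. A minimal sketch with made-up probabilities:

```python
import math

def surprisal_bits(p: float) -> float:
    """Self-information of an outcome you assigned probability p, in bits."""
    return -math.log2(p)

print(surprisal_bits(0.5))   # 1.0 bit  : a coin flip you couldn't call
print(surprisal_bits(0.99))  # ~0.01 bit: you expected it and got it
print(surprisal_bits(0.01))  # ~6.6 bits: you were sure, and wrong
```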

J Thomas

We understand that a quantity $q$ has a true value $q_{true}$. However, we can know only a set of values within which $q_{true}$ lies. The extent of that set is called the uncertainty $\delta q$. Since we cannot know where in the set the true value lies, we often use the average as our best guess $q_{best}$; other best guesses include the median or the geometric mean.

To report a quantity, the formalism $q = q_{best}\pm\delta q$ is used. This report requires two pieces of information:

  1. The best estimate, $q_{best}$
  2. The uncertainty, $\delta q$

Having a better estimate or a smaller uncertainty does not change the number of pieces of information required to describe the quantity completely. A smaller uncertainty lies within the extent of a larger one, so the one implies the other. Giving the row and seat number without the stadium is meaningless; a valid uncertainty is a single piece of information.
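For illustration, a minimal Python sketch of producing such a report from repeated readings (the readings are invented, and using the standard deviation of the mean as $\delta q$ is one common convention among several):

```python
import statistics

readings = [9.78, 9.82, 9.80, 9.79, 9.83]  # hypothetical repeated measurements

q_best = statistics.mean(readings)
# Standard deviation of the mean, one common choice for the uncertainty:
delta_q = statistics.stdev(readings) / len(readings) ** 0.5
print(f"q = {q_best:.2f} ± {delta_q:.2f}")  # the two pieces of information
```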

> if uncertainty doubles, is the information gained cut in half?

No. There are still only two pieces of information about the quantity.

> Also, does infinite uncertainty equate to 0 information?

No. The true value still lies within the extent of the set of values, and a best estimate can still be made. Some would call that "stating the obvious."

The Uncertainty Principle must be considered carefully. In this case "uncertainty" carries quantum mechanical connotations that may modify the meaning used above. The Principle connects the uncertainties (previously considered independent) of two quantities: $$x_{true} = x_{best}\pm \delta x$$ $$p_{true} = p_{best}\pm \delta p$$ There are now four pieces of information, but the Uncertainty Principle applies a twist. At the limit set by $\delta x\,\delta p \ge \hbar/2$, $$\delta x \propto \frac{1}{\delta p}.$$

The Uncertainty Principle introduces a constraint on the uncertainties. If the constant of proportionality is known, the constraint reduces the number of independent pieces of information to three. If the constant must be reported and is exact, the count is four; if the constant is itself uncertain, five.
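In the standard form of the principle the "constant" is a lower bound, $\delta x\,\delta p \ge \hbar/2$. A quick numerical sketch of that constraint (the confinement length is an arbitrary atomic-scale example):

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_uncertainty(delta_x: float) -> float:
    """Smallest delta_p allowed by delta_x * delta_p >= hbar/2 (SI units)."""
    return HBAR / (2 * delta_x)

# Confining a particle to ~0.1 nm (roughly atomic size):
print(min_momentum_uncertainty(1e-10))  # ~5.3e-25 kg*m/s
```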

Uncertainty does not create or destroy information. Reducing it just polishes the existing information.

RussellH