4

"Field" is a name for associating a value with each point in space. This value can be a scalar, a vector, a tensor, etc. I read the Wikipedia article and got that much, but then it goes into more unfamiliar concepts.

My question is how to interpret a basic field. Let's say there is a field with momentum and energy. Does that mean that any "object" which can interact with that field can borrow momentum and energy from the field and give them back as well, much like an electron in some (not its own) electromagnetic field? Now how do I extend this understanding to statements like:

"$U_\lambda$ is the radiation energy density per unit wavelength of a thermodynamic equilibrium radiation field trapped in some cavity."

Does that mean that the "numbers" that make up the field at each point in space stay constant with temperature or time (or both; I'm not sure)? And if there are objects in that field which can interact with it, for example electrons, then they can take energy from that field or lose energy to it. Then the author calls this field a "photon gas" without explanation. So does that mean a bag of photons in a cavity is mathematically equivalent to specifying some numbers in space? Or something else?

yayu
  • 4,822

3 Answers

5

I'll start here -- "So does that mean a bag of photons in a cavity is mathematically equivalent to specifying some numbers in space?"

It means much more than that. First, you define a field -- in the electromagnetic case, it's a set of vectors everywhere in space. Then you allow it to be dynamic; that is, you write a Lagrangian for it that leads to classical equations of motion. Then you quantize it, which leads to (1) the idea of a vacuum, that is, a state of the field that contains no excitations, and (2) particles -- excitations of the field that can carry momentum and energy around.

So a photon gas is much more than just a simple set of numbers everywhere. The field is dynamic. It's not in the vacuum state and so it contains particles. And finally, it is in a very precise kind of state - the photons are distributed in energy according to the Bose-Einstein distribution at a particular temperature.
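To make the last point concrete, here is a small numerical sketch (my own illustration, not part of the answer) of the Bose-Einstein occupancy of a single photon mode. Since photons have zero chemical potential, the mean number of photons in a mode of energy $E$ at temperature $T$ is $1/(e^{E/kT}-1)$:

```python
import math

def bose_einstein_occupancy(energy_ev: float, temperature_k: float) -> float:
    """Mean photon number in a mode of given energy at temperature T.
    Photons have zero chemical potential, so n = 1/(exp(E/kT) - 1)."""
    K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K
    return 1.0 / math.expm1(energy_ev / (K_B_EV * temperature_k))

# A visible-light photon (~2 eV) in a room-temperature (300 K) cavity is
# almost never excited; a microwave photon (~1e-4 eV) is highly occupied.
print(bose_einstein_occupancy(2.0, 300.0))   # astronomically small
print(bose_einstein_occupancy(1e-4, 300.0))  # ~258
```

This is why a room-temperature cavity glows in the infrared and microwave but not in the visible: the thermal state populates low-energy modes heavily and high-energy modes essentially not at all.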

How did it get into that equilibrium state? To answer that, you need to extend your field from the plain EM one to include coupling to other matter fields, say the electron's. So the electron will have its own kind of field, and your combined Lagrangian will contain a term that couples the two fields together so that they can interact. This is how "objects" like electrons can borrow and lend energy to the EM field: through the coupling between the electron field (which, by the way, is a spinor field, not a vector field) and the EM field.

You can have fields for all the other kinds of particles and have interaction terms for all the other known forces. So now you can have a cavity made of matter that interacts with the EM field. And through complicated dynamics that depend crucially on the Second Law of Thermodynamics, the photon gas can come into equilibrium with the matter cavity, all this mediated by the couplings in the Lagrangian.

dbrane
  • 8,770
3

Think about this: a function that maps points on a 2D space to numbers can describe the shape of terrain, but I wouldn't say that it is the terrain. In the same way, a mapping of points to objects (scalars, vectors, tensors, etc.) is the mathematical description of a field, but if you think of the field as just the mapping, you're missing out.

Fields can have various physical properties. For example, just as a particle can have a certain amount of energy, so can an electromagnetic field. The difference is, since the field is spread throughout space, so is the energy; therefore, it makes more sense to talk about the density of energy rather than the amount. Same applies for momentum, or any other physical quantity carried by the field.

Just as the field could be described by a mapping of points to vectors, so the energy density can be described by a mapping of points to numbers. Given the vector value of the field at any point, you can calculate the numeric value of the energy density at that point. But remember that these numbers (i.e. the mapping) are just a mathematical description of the energy density.
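As a concrete sketch of that calculation (my illustration, not part of the answer): given the electric and magnetic field vectors at a point, the standard formula $u = \tfrac{\epsilon_0}{2}|\mathbf{E}|^2 + \tfrac{1}{2\mu_0}|\mathbf{B}|^2$ produces the energy density number at that point:

```python
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
MU0 = 1.25663706212e-6   # vacuum permeability, H/m

def em_energy_density(E, B):
    """u = (eps0/2)|E|^2 + |B|^2/(2 mu0), in J/m^3.
    E in V/m and B in tesla, each given as a 3-vector of floats."""
    e2 = sum(c * c for c in E)
    b2 = sum(c * c for c in B)
    return 0.5 * EPS0 * e2 + b2 / (2.0 * MU0)

# For a plane wave with E = 100 V/m and B = E/c, the electric and
# magnetic contributions are equal:
c = 299792458.0
print(em_energy_density((100.0, 0.0, 0.0), (0.0, 100.0 / c, 0.0)))
```

The point is exactly the one made above: the number $u(x)$ is computed from the vector value of the field at $x$; it is a description derived from the field, not an independent field of its own.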

Now, you may notice that the mapping that describes the energy density ($u(x)$) satisfies the naive definition of a mathematical description of a field: it associates a number with each point in space. But physicists wouldn't normally call that mapping a "field," because in a sense, it's not really independent. Mathematically, you can calculate $u(x)$ from $A(x)$; physically, the energy "field" is completely determined by the EM field. In physics, we tend to reserve the term "field" to talk about something that can't be obtained by a simple calculation from some other field.

Does that mean that the "numbers" that make up the field at each point in space stay constant with temperature or time (or both; I'm not sure)?

I'm not sure how you got that from the quote you listed... no, the numbers that make up the mathematical description of the EM field do not stay constant with either time or temperature. In fact, one of the things that characterizes a physical field is that it has dynamics - mathematically, this means that the numbers (or whatever) making up the field change with respect to time and space, but in a predictable manner which can be described with differential equations.

But there are things you can calculate from a field that do stay constant. For instance, you can calculate the total energy stored in the field by calculating the energy density and then integrating it over the volume of the field. You could also calculate the temperature of the field, by some mathematical procedure. In many cases, these quantities are more closely based on the manner in which the field changes than the actual values that describe it. (In fact, in some sense, it turns out that you can describe a field by the way that the numbers change, just as well as you can with the numbers themselves. Read up on the Fourier transform and momentum space if you are interested.)
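The "integrate the energy density over the volume" step can be sketched numerically (my own illustration, not from the answer). For the thermal radiation field of the question, integrating Planck's spectral energy density $u_\nu = \frac{8\pi h\nu^3}{c^3}\frac{1}{e^{h\nu/kT}-1}$ over all frequencies reproduces the known total $u = aT^4$ with $a = 4\sigma/c$:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 299792458.0      # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_u_nu(nu, T):
    """Spectral energy density of blackbody radiation, J/(m^3 Hz)."""
    return (8 * math.pi * H * nu**3 / C**3) / math.expm1(H * nu / (K_B * T))

def total_energy_density(T, n=20000):
    """Integrate u_nu over frequency with the midpoint rule."""
    nu_max = 50 * K_B * T / H  # spectrum is negligible beyond this
    d = nu_max / n
    return d * sum(planck_u_nu((i + 0.5) * d, T) for i in range(n))

T = 300.0
u_numeric = total_energy_density(T)
u_exact = 4 * 5.670374419e-8 / C * T**4  # radiation constant a = 4*sigma/c
print(u_numeric, u_exact)  # both ~6.1e-6 J/m^3
```

The total (per unit volume) depends only on the temperature, even though the field values themselves fluctuate in time and space, which is the distinction being drawn above.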

David Z
  • 76,371
1

dbrane and DZ have given Useful conventional Answers.

dbrane's characterization of the vacuum as "contains no excitations", however, is not very helpful because there is no clear Correspondence with the classical idea of no excitations, the everywhere zero classical field. Local measurements of the vacuum in general give non-zero results.

IMO (not a conventional Answer, so Useful only with care for the purposes of exams, etc.) it's more helpful to think of particles as modulations of the vacuum, which is abstractly constructed as a Poincaré invariant state over an algebra of observables. An alternative perspective is that a free field can be constructed from its Wightman functions, which are essentially correlations between the measured local values of the field when the localizations are at space-like separation [it's more usual to construct the Wightman functions from the quantum field, $W(x_1,x_2,...x_n)=\left<0\right|\hat\phi(x_1)\hat\phi(x_2)...\hat\phi(x_n)\left|0\right>$, but the other way round works too; note that, as generalized correlations, the Wightman functions are functions on the (symmetrized) space $M\oplus (M\times M)\oplus(M\times M\times M)\oplus...$, not on Minkowski space $M$ simpliciter]. The Wightman functions are not zero for the vacuum state.

Non-vacuum states have Wightman functions that are different from the vacuum state. In particular, they are not Poincaré invariant (which is why we can say that they "carry momentum and energy around"). Nonetheless, they are systematic deformations of the vacuum state, and hence of the Wightman functions, constructed by the action of the quantum field on the vacuum vector, which I choose to call modulations because I wish to emphasize the signal processing aspect of quantum field theory. [This doesn't make noncommutativity of measurements go away, but the relationship between quantum fields and classical signal processing is substantially different from the relationship between quantum mechanics and classical particle physics.] Although there is a continuum of possible modulations, there is also a discrete structure of states that can be constructed by the action of $1, 2, ...$ field operators on the vacuum vector, which we call the number of particles in the state. Unless there are superselection rules in place, we can construct (1) linear compositions of vectors and (2) linear compositions of states (superpositions and mixtures, respectively).

Note that by the vacuum state I mean a linear map from the space of operators $\omega_0:\mathcal{A}\rightarrow\mathbb{C};\omega_0(\hat A)=\left<0\right|\hat A\left|0\right>$, where I've used the vacuum vector $\left|0\right>$ to construct the vacuum state $\omega_0$. This is now a fairly universal distinction in mathematical physics, but not, I think, in Physics generally.

Now to your Question's more specific points. Interacting quantum fields defined in terms of deformed Lagrangians and Hamiltonians are only as well-defined as the degree of your acceptance of the mathematics of renormalization. From moment to moment, unitary evolution of a state preserves the Hilbert space norm (by definition), but may or may not conserve the number of particles in the state, nor even the energy and momentum if we construct an ad-hoc Hamiltonian (a thermal field in a box is an ad-hoc system, insofar as the Hamiltonian is not translation or boost invariant). Unitary evolutions are generalizations of sinusoidal motion to higher dimensional spaces, so insofar as we think of sinusoidal motion as borrowing potential energy to create kinetic energy, sure, it's "borrowing", but there are other, and I think better ways to think about such models.

From a field perspective that starts from the vacuum, a thermal state is a thermodynamic limit of mixtures of $0,1,2,...,n$ particle states (thermodynamic in the sense that $n\rightarrow\infty$), with weight for different states that is determined by the energy. The energy and the thermal state constructed using it are invariant under translations and under rotations, but not under boosts. Thermal states are special because the limit is not in the Hilbert space of bounded states. I regret that I don't have the time (I'm not sure if it's an hour, a week, or a Ph.D. thesis) to construct a field theoretic analysis of your thermal field in a box example, I'll have to leave it to you, expanding upon the principled approach I've laid out above.
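The "mixtures of $0,1,2,\dots,n$ particle states with weight determined by the energy" can be sketched for a single cavity mode (my illustration, not Peter Morgan's). In a thermal state the probability of finding $n$ photons in a mode of energy $E$ is $p_n \propto e^{-nE/kT}$, a geometric distribution whose mean reproduces the Bose-Einstein occupancy:

```python
import math

def thermal_mode_weights(energy_over_kT, n_max):
    """Probabilities p_n of finding n photons in one cavity mode in a
    thermal (Gibbs) state: p_n proportional to exp(-n E/kT)."""
    x = math.exp(-energy_over_kT)
    return [(1 - x) * x**n for n in range(n_max + 1)]

weights = thermal_mode_weights(1.0, 50)
mean_n = sum(n * p for n, p in enumerate(weights))
# The mean occupancy reproduces the Bose-Einstein value 1/(e^{E/kT}-1):
print(mean_n, 1 / math.expm1(1.0))
```

Note that this is a mixture over particle-number states, not a superposition: the thermal state assigns classical probabilities to the $n$-particle sectors, which is the distinction between (1) and (2) drawn two paragraphs above.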

Yayu, I've sent you to my papers before, so I won't do so again. I've riffed on your interestingly asked Question more than I've Answered it; I think of it as more up to you to make it Useful rather than Useful in itself. Best wishes.

Peter Morgan
  • 9,922