I'm currently a junior at San Diego State University, and in my intro physics courses I learned that antennas work best at 1/2 wavelength. I've read wimpy explanations that it is so the antenna can "feel" the whole wave, but I don't find that convincing. Basically, how can you prove or show that 1/2 wavelength is the best? Keep in mind that this semester I am embarking on my E&M classes. Thanks!
-
Er, the best version of the simplest possible design (one conducting stick) may well be a half-wavelength center-driven one, but that is hardly all there is to antenna design. – dmckee --- ex-moderator kitten Jul 16 '14 at 02:58
2 Answers
The fraction of a wavelength used in antenna design is driven by the need to transfer maximum power to and from the antenna, which is done by matching the feed impedance to the antenna. At the standard $\lambda/2$ value the antenna is resonant and its input impedance has no reactance, just real resistance, making the antenna easier to match to.
This is the reason why $\lambda/2$ is so popular. There are, however, many valid reasons to choose $3\lambda/5$, $\lambda/4$, or even lengths longer than $\lambda$; see this answer: What happens when length of antenna >> lambda
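To put numbers on "easier to match", here is a minimal sketch, assuming a 50 Ω feed and textbook approximate feedpoint impedances (the ~73 Ω resonant-dipole value is standard; the short-dipole entry is a rough illustrative guess), of how much incident power each antenna actually accepts:

```python
# Sketch only: the feedpoint impedances are textbook approximations,
# and the short-dipole value is a rough illustrative figure.
Z0 = 50.0  # source / transmission-line impedance in ohms (common convention)

antennas = {
    "resonant half-wave dipole": 73 + 0j,     # purely resistive near resonance
    "exact lambda/2 dipole":     73 + 42.5j,  # residual inductive reactance
    "short dipole (~lambda/10)": 2 - 900j,    # tiny R, large capacitive X
}

for name, ZL in antennas.items():
    gamma = (ZL - Z0) / (ZL + Z0)      # voltage reflection coefficient
    accepted = 1 - abs(gamma) ** 2     # fraction of incident power accepted
    print(f"{name:26s} |Gamma| = {abs(gamma):.3f}, power accepted = {accepted:.1%}")
```

The resonant half-wave dipole accepts roughly 96% of the incident power with no matching network at all, while the electrically short dipole reflects nearly everything; that is the practical content of "resonance makes matching easy".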
I've read wimpy explanations that it is so it can "feel" the whole wave.
Yes. That sucks. I'll provide a rough explanation that might be more satisfying. Displacement current is primarily why antennas work. In structures typically smaller than $\lambda/10$, normal lumped-element approximations are valid. However, when the structure approaches $\lambda/10$ or larger, the fields start to take on a more significant role, because they couple to the structure and induce current flows. Antennas pick up this field and allow those signals to be conducted to amplifiers (with proper impedance matching, hopefully), letting the radio detect a signal.
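For a feel of what that $\lambda/10$ threshold means physically, here is a minimal sketch (example frequencies chosen arbitrarily) of the size at which a structure stops behaving as a lumped element:

```python
c = 299_792_458.0  # speed of light, m/s

# Rule of thumb: structures larger than ~lambda/10 start to couple
# noticeably to the field and act like antennas.
for f, label in [(1e6, "1 MHz (AM)"), (100e6, "100 MHz (FM)"), (2.4e9, "2.4 GHz (Wi-Fi)")]:
    lam = c / f
    print(f"{label:16s} lambda = {lam:9.3f} m, lambda/10 = {lam / 10:.4f} m")
```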
EDIT: $\lambda/10$ is just a rule of thumb, but it relates directly to the distance over which a charge is accelerated. If that distance is small relative to the wavelength, then the fields are small and can be ignored. Rather than go into a mathematical explanation using the infinitesimal dipole model, I'll try to explain qualitatively since you're just starting physics. Imagine you're a sprinter who runs at top speed when your shoes meet the tarmac perfectly at a stride of $\lambda/2$. At this stride your shoes hit perfectly and 100% of your energy is transferred to the ground. If you stretch out to a larger stride, say $0.75\lambda$, you need a slightly stiffer shoe to match the tarmac and transfer 100% of your energy. It works fine, as long as you have the right shoes to match the conditions and optimize your energy transfer. The same is true for $0.25\lambda$.
However, as your room to run shrinks to, say, $\lambda/10$, you can't even get close to making one stride; you can't even start to take a step before your foot has to stop again. In terms of charge, just as you start to move that packet of energy, you have to stop again because there's no room. Trying to move in a space that is too small doesn't allow the field to build at that frequency, so any radiation is very weak.
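The same point can be made with the textbook radiation resistance of an electrically short dipole, $R_{rad} \approx 20\pi^2 (L/\lambda)^2\ \Omega$ (valid only for $L \ll \lambda$): shrink the element and the radiation resistance, hence the power radiated for a given current, collapses. A quick sketch:

```python
import math

def short_dipole_r_rad(l_over_lambda: float) -> float:
    """Textbook small-antenna formula, R_rad = 20*pi^2*(L/lambda)^2 ohms,
    valid only for L << lambda (triangular current distribution)."""
    return 20 * math.pi ** 2 * l_over_lambda ** 2

for denom in (50, 20, 10):
    print(f"L = lambda/{denom}: R_rad ≈ {short_dipole_r_rad(1 / denom):.3f} ohm")
# A half-wave dipole, by contrast, has R_rad ≈ 73 ohm (from the full
# sinusoidal-current analysis; the small-antenna formula does not apply there).
```

Even at $\lambda/10$ the radiation resistance is only about 2 Ω, so ohmic losses and mismatch swamp the radiated power.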
-
Why does the radiation couple to the antenna better only when the structure is $>\lambda/10$? Where does this effect come from? – aPhysicist Jul 16 '14 at 14:46
-
@aPhysicist First, that's an engineering rule of thumb and not a hard physical limit. If the structure is "small enough" compared to the wavelength, you won't get a standing wave across the "antenna" and it won't act like an antenna (the potential drop across the antenna from the standing wave is what drives the antenna). This is why I tried to stay away from engineering terms (impedance, matching and so on) and explained it in terms of the underlying physics. – user3814483 Jul 16 '14 at 14:57
-
Oh, okay, that makes more sense. But does the antenna length depend on the intensity of the EM wave? Wouldn't a super intense wave move the electrons up the antenna more quickly, so they run out of room before they can replicate the waveform and send that signal down the line? – aPhysicist Jul 16 '14 at 23:00
-
@aPhysicist No. The antenna length is independent of signal strength. There is only so much room for the charge to move in; a stronger signal doesn't change that. – user6972 Jul 17 '14 at 05:47
Given that this was a physics class, it is likely that the complexities of antenna design were understated or ignored entirely. Indeed, antenna design is part art (experience) and part sophisticated analytical and numerical modeling.
With the above in mind, the "best" simple antenna one can construct is either a half-wavelength ($\lambda/2$) dipole antenna (center-fed) or a quarter-wavelength ($\lambda/4$) monopole antenna. The physical intuition is that the amplitude difference of the standing wave between the two ends of the antenna is at a maximum.
For instance, think about the standing wave along a quarter-wave antenna (at a fixed snapshot in time). This characteristic length corresponds to having a node at one end and an antinode at the other. (For a monopole antenna, the circuit is completed by a separate ground plane.)
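As a minimal sketch of that picture (frequency chosen arbitrarily), the idealized standing-wave current on a quarter-wave monopole, $I(z) = I_0 \sin(k(h - z))$, has its antinode at the feed and its node at the tip:

```python
import math

f = 100e6            # example frequency, chosen arbitrarily: 100 MHz
c = 299_792_458.0    # speed of light, m/s
lam = c / f
h = lam / 4          # quarter-wave monopole height
k = 2 * math.pi / lam

# I(z) = I0 * sin(k * (h - z)): maximum (antinode) at the feed z = 0,
# zero (node) at the tip z = h.
for i in range(6):
    z = h * i / 5
    print(f"z = {z:5.3f} m  I/I0 = {math.sin(k * (h - z)):.3f}")
```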
Note that the $\lambda$ here is not the wavelength in free space, but rather the wavelength of the propagating wave in the antenna.
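In practice this means shortening the free-space half-wavelength by a velocity factor. A minimal sketch, assuming the common thin-wire rule-of-thumb value of ~0.95 (real values depend on conductor diameter and surroundings):

```python
c = 299_792_458.0    # speed of light in vacuum, m/s
f = 100e6            # design frequency, chosen arbitrarily: 100 MHz

# Assumption: ~0.95 is a rule-of-thumb velocity factor for a thin wire in
# air (it also absorbs the capacitive "end effect"); it is not exact.
velocity_factor = 0.95

half_wave_free_space = c / (2 * f)
physical_length = velocity_factor * half_wave_free_space
print(f"free-space lambda/2    : {half_wave_free_space:.3f} m")
print(f"practical dipole length: {physical_length:.3f} m")
```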

-
+1 for the "theory isn't practice". Keep in mind that nowadays there are so many antennas that do not follow the lambda/2 "rule", since they achieve a matching by using active (and passive) electronic components – Noldor130884 Jul 16 '15 at 06:21