I have seen it remarked in some problem sets that if you have an electromagnetic wave traveling in the $x$-direction with its $y$-coordinate given as
$y(x,t)=y_0\sin (\omega t +kx)$
and you want to build an antenna to receive the wave, the antenna must be of length $2y_0$. I want to know how much truth there is to this statement.
The physical picture is quite nice: the electric field is "waving in space," so when it hits a conductor, the electrons accelerate along the conductor and you can measure the current to get the frequency. But "real" E/M waves don't have a transverse spatial displacement the way a string does; a solution to Maxwell's equations for traveling waves looks something like
$\vec{E}=E_0\sin (\omega t+kx)\hat{y}$
(taking the simplest possible conditions). The physical picture still works, since the electric field is "waving" along the conductor, but the length argument no longer makes any sense. And actually, don't dipole antennas work the *other* way, maximizing the voltage difference across the two ends by aligning themselves parallel to the direction of propagation (the $x$-axis here)?
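To make that concern concrete, here is a minimal sketch (assuming a straight receiving wire held at a fixed position $x_0$ and oriented along $\hat{y}$): the force on a conduction electron of charge $-e$ in the field above is

$\vec{F}(t)=-e\vec{E}(x_0,t)=-eE_0\sin (\omega t+kx_0)\hat{y}$

Everything here is a field amplitude in volts per metre oscillating in time at the antenna's location; nothing with the dimensions of a length, let alone $2y_0$, appears in the expression.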
So is the simple picture we paint for students completely incorrect, or is there any validity to it?
EDIT: As it turns out, the only examples of this misunderstanding I can easily find are ones I have reservations about posting because of their relationship to graded problem sets. So I will leave this up for a few days to see whether I can attract someone with a simple explanation of antenna design to answer it. If not, I will answer it myself.