Electromagnetic waves propagate quite differently in conductors than in dielectrics or in vacuum. If the resistivity of the conductor is sufficiently low (that is, if it is a sufficiently good conductor), the oscillating electric field of the wave gives rise to an oscillating conduction current that is much larger than the displacement current. In this case, the wave equation for an electric field $\vec{E}(x, t) = E_y(x, t)\hat{\jmath}$ propagating in the $+x$-direction within a conductor is
$$\frac{\partial^2 E_y(x, t)}{\partial x^2} = \frac{\mu}{\rho} \frac{\partial E_y(x, t)}{\partial t}$$
where $\mu$ is the permeability of the conductor and $\rho$ is its resistivity.

(a) A solution to this wave equation is
$$E_y(x, t) = E_{\max} e^{-k_{\mathrm{C}} x} \cos\left(k_{\mathrm{C}} x - \omega t\right)$$
where $k_{\mathrm{C}} = \sqrt{\omega\mu/2\rho}$. Verify this by substituting $E_y(x, t)$ into the wave equation above.

(b) The exponential factor shows that the electric field decreases in amplitude as it propagates. Explain why this happens. (Hint: The field does work to move charges within the conductor. The current of these moving charges causes $i^2R$ heating within the conductor, raising its temperature. Where does the energy to do this come from?)

(c) Show that the electric-field amplitude decreases by a factor of $1/e$ in a distance $1/k_{\mathrm{C}} = \sqrt{2\rho/\omega\mu}$, and calculate this distance for a radio wave with frequency $f = 1.0\ \mathrm{MHz}$ in copper (resistivity $1.72 \times 10^{-8}\ \Omega\cdot\mathrm{m}$; permeability $\mu = \mu_0$). Since this distance is so short, electromagnetic waves of this frequency can hardly propagate into copper at all. Instead, they are reflected at the surface of the metal. This is why radio waves cannot penetrate through copper or other metals, and why radio reception is poor inside a metal structure.
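As a numerical sanity check on this problem (not a substitute for the symbolic substitution asked for in part (a)), the sketch below evaluates both sides of the wave equation with finite differences at an arbitrary sample point, and then computes the $1/k_{\mathrm{C}}$ penetration distance of part (c) for 1.0 MHz waves in copper. The sample point and step sizes are arbitrary choices, not part of the problem.

```python
import math

# Parameters from the problem statement
rho = 1.72e-8             # resistivity of copper (ohm*m)
mu = 4 * math.pi * 1e-7   # permeability mu_0 (T*m/A)
f = 1.0e6                 # frequency (Hz)
omega = 2 * math.pi * f
k_c = math.sqrt(omega * mu / (2 * rho))

E_max = 1.0  # arbitrary amplitude (V/m)

def E(x, t):
    """Proposed solution E_y(x, t) = E_max e^{-k_C x} cos(k_C x - omega t)."""
    return E_max * math.exp(-k_c * x) * math.cos(k_c * x - omega * t)

# Part (a): compare both sides of d^2E/dx^2 = (mu/rho) dE/dt numerically,
# using central difference quotients at one (arbitrary) sample point.
x, t = 2e-5, 3e-7    # sample point inside the conductor
hx, ht = 1e-9, 1e-12  # small steps for the difference quotients

lhs = (E(x + hx, t) - 2 * E(x, t) + E(x - hx, t)) / hx**2
rhs = (mu / rho) * (E(x, t + ht) - E(x, t - ht)) / (2 * ht)
print(f"LHS = {lhs:.5e}, RHS = {rhs:.5e}")  # should agree closely

# Part (c): 1/e penetration distance, 1/k_C = sqrt(2*rho/(omega*mu))
delta = 1.0 / k_c
print(f"1/k_C = {delta:.2e} m")  # about 6.6e-5 m for 1.0 MHz in copper
```

The tiny penetration distance (tens of micrometers) is exactly the point of the final paragraph of the problem: at radio frequencies the wave is attenuated within a hair's breadth of the copper surface.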