
### Is photocurrent dependent on frequency?

Posted: Wed May 21, 2014 9:25 pm
There's an interactive simulation of the photoelectric effect here ( http://phet.colorado.edu/en/simulation/photoelectric ). It lets you shine a light onto a metal surface and vary the wavelength (and hence the frequency) of the light, its intensity, and the metal the light is shone on. It also lets you apply a voltage, and it displays the resulting photocurrent.

In that simulation, when the wavelength of incident light is decreased (with all other factors held constant), the current increases to a maximum and then decreases.

Why is this the case? Why does photocurrent depend on the frequency of light in the first place, and why does the current reach a maximum and then decrease as the wavelength is decreased further?

### Re: Is photocurrent dependent on frequency?

Posted: Thu May 22, 2014 9:48 am
G'day A.S.H.,

First, it's an animation, so it shows the output of an algorithm someone has written, which is inaccessible to us. Let's hope it behaves like the real photoelectric effect, which is what we'll discuss.

Intensity I is measured in W/m^2. If you keep I constant and decrease the wavelength, each photon carries more energy, so fewer photons arrive per second. Usually, one photon ejects one electron; the photon's remaining energy goes into the kinetic energy of the electron and some heat in the target. So, once the photon energy is well above the work function, we'd expect the photocurrent to fall with decreasing wavelength at constant I, because of the lower rate of photon arrival.
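To make that concrete, here's a rough sketch of the argument in numbers. The intensity, illuminated area, and work function (roughly that of sodium) are all assumed values, and the one-photon-one-electron current model is the naive idealization described above, not the simulation's actual algorithm:

```python
# Back-of-envelope model: photon arrival rate at constant intensity,
# and a one-photon-one-electron photocurrent above threshold.
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
e = 1.602e-19    # elementary charge, C
I = 1.0          # intensity, W/m^2 (assumed)
A = 1e-4         # illuminated area, m^2 (assumed)
phi = 2.3 * e    # work function, J (roughly sodium; assumed)

def photon_rate(wavelength):
    """Photons per second hitting the target: power I*A divided by hc/lambda."""
    return I * A * wavelength / (h * c)

def current(wavelength):
    """Naive model: one electron per photon above threshold, zero below."""
    photon_energy = h * c / wavelength
    if photon_energy < phi:
        return 0.0
    return e * photon_rate(wavelength)

for lam in (200e-9, 400e-9, 600e-9):
    print(f"{lam * 1e9:.0f} nm: {current(lam):.3e} A")
```

With these numbers the threshold wavelength is about 540 nm, so 600 nm light ejects nothing, while between 200 nm and 400 nm the current is larger at the longer wavelength, since the same power is delivered by more photons.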

It's also worth noting that, because shorter wavelengths give the electrons more kinetic energy, the electrons travel faster on average, so for the same photocurrent there are fewer electrons in view on the screen at any instant.
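A quick check of that speed effect, using the non-relativistic relation v = sqrt(2*KE/m) and the same assumed work function as before (roughly sodium):

```python
import math

# Maximum electron speed from the excess photon energy (hc/lambda - phi),
# non-relativistic. Work function is an assumed value, roughly sodium.
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
e = 1.602e-19    # elementary charge, C
m_e = 9.109e-31  # electron mass, kg
phi = 2.3 * e    # work function, J (assumed)

def max_speed(wavelength):
    """Fastest ejected electron: all excess photon energy goes to KE."""
    ke = h * c / wavelength - phi
    return math.sqrt(2 * ke / m_e) if ke > 0 else 0.0

print(f"400 nm: {max_speed(400e-9):.2e} m/s")
print(f"200 nm: {max_speed(200e-9):.2e} m/s")
```

Halving the wavelength from 400 nm to 200 nm roughly doubles the excess energy here, so the fastest electrons are noticeably quicker, and the same current corresponds to fewer electrons in flight at once.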

The animation doesn't plot i versus lambda (a pity), but if you're keen you could make such a plot yourself and see whether these effects explain everything. Or write to the author.

Joe