Is photocurrent dependent on frequency?
Posted: Wed May 21, 2014 9:25 pm
There's an interactive simulation of the photoelectric effect here ( http://phet.colorado.edu/en/simulation/photoelectric ). It lets the user shine light onto a metal surface and vary the wavelength (and so the frequency) of the light, its intensity, and the metal it strikes. It also lets the user apply a voltage and provides a reading of the photocurrent.
In that simulation, when the wavelength of the incident light is decreased (with all other factors held constant), the current increases until it reaches a maximum and then decreases.
Why is this the case? Why does the photocurrent depend on the frequency of the light in the first place, and why does the current reach a maximum and then decrease as the wavelength is decreased further?
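For concreteness, here is a minimal sketch of the setup I have in mind. It is not the PhET model itself (I don't know its internals); it only assumes that "intensity" means a fixed beam power and, purely for illustration, that every photon above the work function ejects one electron. The power value and the choice of copper's work function are hypothetical. Under those assumptions the photon arrival rate, and hence the current, falls as the wavelength gets shorter, since each photon carries more energy; the initial rise toward the maximum would come from whatever wavelength dependence the simulation gives the ejection probability, which is not modeled here.

```python
# Minimal illustrative sketch (not the PhET model): photocurrent vs wavelength
# at fixed beam power, assuming one ejected electron per above-threshold photon.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
e = 1.602e-19   # elementary charge, C

P = 1e-3        # beam power in watts (hypothetical value)
W = 4.7 * e     # work function, ~4.7 eV (roughly copper; illustrative)

def photocurrent(wavelength_m):
    """Idealized photocurrent: one electron per photon above threshold."""
    photon_energy = h * c / wavelength_m
    if photon_energy < W:
        return 0.0                           # below threshold: no electrons ejected
    photons_per_second = P / photon_energy   # fewer photons per watt at shorter wavelengths
    return e * photons_per_second            # current = ejected charge per second

for nm in (150, 200, 250, 264, 300):
    print(f"{nm} nm -> {photocurrent(nm * 1e-9):.3e} A")
```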