I think we're going to go back and forth here; as long as no one gets angry, why not.
The beauty of electric motors/generators is their reversibility. If only I could crank an engine backwards, feed it exhaust (water and carbon dioxide), and get gasoline and air back out!
Anyway.....electric motors supply a torque dependent on current and reach a speed dependent on voltage. An electric (DC, for now) generator, like cars used to have, generates a voltage based on speed and amperage based on torque.
The magnetic coils in the alternator, the windings, create a magnetic field. I believe it's Faraday's law that says V = -dΦ/dt, where Φ is the magnetic flux through the loop (the field strength in teslas times the loop area it passes through), t is seconds, and V is volts. Translation: the negative instantaneous change in flux equals the induced voltage. If you imagine a magnetic field coming up from the floor, and a loop held in your hand, parallel to the floor, then as you turn it, the amount of "floor" you can see through the loop changes with the angle. This is why the voltage graph for a generator looks like a sine wave.
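The floor-through-the-loop picture can be put in numbers. This is just a toy model; the turn count, field strength, loop area, and 60 Hz spin rate are all made up, not any real alternator:

```python
import math

# Toy spinning loop: N turns of area A (m^2) in a uniform field B (tesla),
# spinning at omega rad/s. Flux through the loop is Phi = N*B*A*cos(omega*t),
# so Faraday's law gives EMF = -dPhi/dt = N*B*A*omega*sin(omega*t) -- a sine wave.
def emf(t, turns=100, B=0.05, area=0.01, omega=2 * math.pi * 60):
    return turns * B * area * omega * math.sin(omega * t)

period = 2 * math.pi / (2 * math.pi * 60)  # one full revolution, seconds
print(emf(0))           # loop flat: flux at its max, momentarily not changing, EMF is 0
print(emf(period / 4))  # quarter turn later: flux changing fastest, EMF at its peak
```

Note that the peak voltage scales with omega, which is the "spin it faster, get more voltage" point.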
Anyway, voltage is defined as simply a potential difference...a circuit need not exist and a current need not flow. However, hook the generator up to a circuit and the voltage will drive a current. As the voltage rises, the amperage in the loop goes up. The flow of electrons creates a secondary magnetic field which, I believe, opposes the first (Lenz's law), creating the load we would see.
Because of this, if I do spin the generator faster, the rate of change of the field through the loop will be of higher magnitude, creating more voltage, and thereby more current. However, the alternator would then be putting out something like 4x the voltage at 3,000 rpm that it does at idle, which isn't a pleasant situation for the car. The regulator reduces or increases the voltage according to the situation. I was wrong earlier, I think, about it turning on and off. Rather, it changes the strength of the magnetic field in the windings.
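A regulator that trims the field strength rather than switching on and off can be sketched like this. Every constant here is invented purely for illustration:

```python
# Toy model: unregulated output voltage scales with rpm times field current.
K = 0.002  # volts per (rpm * field-amp) -- made-up constant

def unregulated_volts(rpm, field_amps):
    return K * rpm * field_amps

def regulated_field(rpm, target_volts=14.0, max_field=7.0):
    # Field current needed to hit the target, capped by what the coil can carry.
    return min(max_field, target_volts / (K * rpm))

for rpm in (750, 3000):
    f = regulated_field(rpm)
    print(rpm, round(f, 2), round(unregulated_volts(rpm, f), 1))
```

With these numbers the field is already maxed out at idle and the output falls short of 14 V, while at 3,000 rpm the regulator backs the field current off and holds 14 V; the output stays flat instead of quadrupling with rpm.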
This is all DC stuff, but most of it should apply to AC (slightly different, as I believe the current and voltage can be out of phase).
If indeed the amperage increases with RPM, it is one of two things: 1) the alternator was unable to produce 14 V at idle, or 2) there is a management system in effect that does not allow the alternator to run at 14 V, or does not place as large a load on it.
RPM can only affect voltage, which by itself has nothing* to do with the torque load on the engine.
The idea that more RPM is needed for more amps goes against casual observation. I'll go back to the headlight example....turn on the headlights and the tach will show fewer RPM. If more RPM were needed, it would have to go up by the previous logic.
I've tried this with headlights and stereos.
And just to be a picky little one: the drag on the engine does not increase so that the alternator can supply more power to its load; it's the decreased resistance and the accompanying rise in current that creates the greater drag.
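That last point can be put in numbers too. With the system held near 14 V, switching on the headlights adds a parallel path, which lowers the total resistance, which raises the current and power drawn, which raises the torque the engine must supply at the very same rpm. The resistances and the efficiency below are invented for illustration:

```python
import math

V = 14.0  # system voltage, held roughly constant by the regulator

def drag_torque(load_ohms, rpm, efficiency=0.6):
    power = V * (V / load_ohms)          # P = V * I, with I = V / R
    omega = rpm * 2 * math.pi / 60       # shaft speed in rad/s
    return power / (efficiency * omega)  # mechanical torque needed to deliver P

base = 1.0                               # ohms: everything-else load (made up)
with_lights = 1 / (1 / base + 1 / 1.8)   # headlights (~1.8 ohm guess) in parallel
print(drag_torque(base, 800))            # N*m of drag before the lights
print(drag_torque(with_lights, 800))     # more drag at the SAME 800 rpm
```

Same rpm, same voltage, lower resistance: the drag goes up, which is exactly why the tach dips when the lights come on.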
tomas