I have read (though maybe not understood) all the above.
Here's my question. A given motor is characterised thus: stall current, off-load rpm, off-load current, and max rpm (mechanical, i.e. "centrifugal" or bearing limits).
So far so good. To drive the motor at its off-load rpm (which we assume is lower than max rpm), we feed it a voltage sufficient to reach that rpm (DC for now). So why is the motor sinking so much less than the stall current? Well, at stall we have, say, 24 V across a few tenths of an ohm, so hundreds of amps. But at off-load rpm we see 5 A? Doing the arithmetic, the motor acts as if it were run on 1-2 V. What gives? It's the back EMF the motor generates, say 22 V, pushing the other way and leaving only 1-2 V of excess.

Now take the point where the motor is loaded and throttled to half rpm: the back EMF falls and the load current rises.

Now apply the brakes. The back EMF is (pardon my guess) about a quarter at this rpm, so say 6 V. Hmm, how do we charge a 24 V battery from that? (Yes, I know, an inverter of some kind, but let's keep this at the FET/diode/switch level.)
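To put rough numbers on that arithmetic, here's a minimal sketch in Python. The 0.2 Ω winding resistance is an assumption (the "few tenths of an ohm" above); everything else follows from the question's figures.

```python
# Back-of-envelope arithmetic for the question's figures.
V_SUPPLY = 24.0   # V
R_WINDING = 0.2   # ohms -- assumed "a few tenths of an ohm"

# At stall the back EMF is zero, so the whole supply drops across the winding.
i_stall = V_SUPPLY / R_WINDING
print(f"stall current: {i_stall:.0f} A")              # 120 A

# At off-load rpm we measure 5 A, so only a small excess voltage is doing work.
i_off_load = 5.0
v_excess = i_off_load * R_WINDING                     # 1.0 V
back_emf = V_SUPPLY - v_excess                        # 23.0 V, near the 22 V guess
print(f"excess: {v_excess:.1f} V, implied back EMF: {back_emf:.1f} V")

# Braking from about a quarter of full speed: back EMF is roughly a quarter too.
print(f"back EMF at ~1/4 speed: {back_emf / 4:.1f} V")  # ~6 V, well short of 24 V
```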
You probably need to get a book out on the subject or the internet equivalent.
You can make a fairly reasonable model of a motor by combining a DC voltage source with a resistance and an inductance in series. The voltage of the source is proportional to the rotational speed (scale factor Kv), and the torque generated by the motor is proportional to the winding current (scale factor Kt; in a brushless motor that winding is the stator, in a brushed motor it's the armature). That pretty much does all you need.
When the motor is spinning with no load, the voltage source (the "back EMF") roughly equals the power supply voltage, so almost no current flows. At stall the back EMF is zero, so the current is the supply voltage divided by the winding resistance.
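A minimal sketch of that model in code (steady state only, ignoring the inductance). Note the convention: Kv here is taken as volts per rad/s, though datasheets often quote the reciprocal (rpm per volt), and all the numbers are assumed for illustration.

```python
# Steady-state DC motor model: back-EMF source (KV * speed) behind a winding resistance.
V_SUPPLY = 24.0    # V
R_WINDING = 0.2    # ohms (assumed)
KV = 0.09          # back-EMF constant, V per rad/s (assumed)
KT = 0.09          # torque constant, N*m per A (numerically equals KV in SI units)

def steady_state(speed_rad_s):
    """Return (back_emf, current, torque) at a given rotational speed."""
    back_emf = KV * speed_rad_s
    current = (V_SUPPLY - back_emf) / R_WINDING
    torque = KT * current
    return back_emf, current, torque

print(steady_state(0.0))     # stall: EMF 0 V, 120 A, maximum torque
print(steady_state(255.0))   # near no-load: EMF ~23 V, ~5 A, almost no torque
```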
With wound-field rotors (dynamos and alternators), the voltage factor Kv varies with the field current: higher field current gives more torque per amp but a lower no-load speed. With permanent-magnet motors Kv doesn't vary.
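A quick illustration of that trade-off, assuming (purely for the sketch) that the flux, and hence Kv, scales linearly with field current:

```python
# Wound field: KV (and KT with it) scale with field flux, trading speed for torque.
V_SUPPLY = 24.0
KV_PER_FIELD_AMP = 0.09   # V per rad/s per amp of field current (assumed, linear)

for i_field in (0.5, 1.0, 2.0):
    kv = KV_PER_FIELD_AMP * i_field   # stronger field -> more torque per amp
    no_load_speed = V_SUPPLY / kv     # speed at which back EMF meets the supply
    print(f"field {i_field:.1f} A -> Kv {kv:.3f} V/(rad/s), "
          f"no-load speed {no_load_speed:.0f} rad/s")
```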