Let's start with the basics: the torque produced by an induction motor is primarily determined by the phase current. At base speed, usually corresponding to a frequency of 50 Hz in the UK, the motor is designed to draw rated current, and hence produce rated torque, at the rated voltage.
As the frequency decreases, the phase currents rise for a fixed voltage, primarily because the back EMF falls. So, to stay at rated current, the applied voltage needs to be reduced as the frequency is reduced. Simple, open-loop VFDs normally use a fixed V/f curve. This works quite well, but tends to go wrong at low frequencies and voltages, where the winding resistance becomes relatively more significant.
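Just to illustrate the idea, here's a minimal sketch of a V/f law with a fixed low-frequency boost to offset the winding resistance. The parameter values are assumed examples, not taken from any particular drive:

```python
# Minimal sketch of an open-loop V/f law with low-frequency voltage boost.
# All parameter values are illustrative assumptions, not from a real drive.

V_RATED = 400.0   # rated voltage (V), assumed
F_BASE = 50.0     # base frequency (Hz), UK mains
V_BOOST = 15.0    # fixed low-frequency boost (V), assumed

def vf_voltage(f_hz: float) -> float:
    """Return the commanded voltage for a demanded frequency."""
    if f_hz >= F_BASE:
        # Above base speed the voltage is held at rated (field-weakening region).
        return V_RATED
    # Constant V/f below base speed, plus a boost so the stator IR drop
    # doesn't starve the motor of flux at low frequency.
    return V_BOOST + (V_RATED - V_BOOST) * (f_hz / F_BASE)

for f in (2.0, 10.0, 25.0, 50.0):
    print(f"{f:5.1f} Hz -> {vf_voltage(f):6.1f} V")
```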
Vector, or field-oriented, control is different. Essentially, the three-phase rotating current vector is transformed into a quasi-stationary two-phase vector via the Clarke/Park transforms. The resulting current vector has two components: D (direct) and Q (quadrature). The D current represents the field flux linkage and the Q current represents torque. So torque, and other motor parameters, can be controlled explicitly, and features like torque boost at low speed, or slowing the motor under overload conditions to prevent stalling, become possible.
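A quick numeric sketch of those transforms (amplitude-invariant convention; the balanced test currents below are just an assumed example). The point is that for a balanced set of phase currents the D/Q result is constant, so the controller regulates DC quantities:

```python
import math

def clarke(ia: float, ib: float, ic: float) -> tuple[float, float]:
    """Amplitude-invariant Clarke transform: three phase currents -> alpha/beta."""
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (ib - ic) / math.sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha: float, i_beta: float, theta: float) -> tuple[float, float]:
    """Park transform: rotate alpha/beta into the rotating D/Q frame at angle theta."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Balanced three-phase currents sampled at angle theta (assumed example).
theta = math.radians(40.0)
ia = math.cos(theta)
ib = math.cos(theta - 2.0 * math.pi / 3.0)
ic = math.cos(theta + 2.0 * math.pi / 3.0)
print(park(*clarke(ia, ib, ic), theta))  # ~ (1.0, 0.0): all flux, no torque current
```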
Vector control can be open loop (no sensors) or can use current sensors. A limitation of open-loop vector control is that the motor must already be rotating, i.e. there is a minimum speed for control. This isn't normally a problem for machine drives. Vector control with sensors can provide full torque at zero speed. That's important for electric vehicle drives, which is where I cut my teeth on VFD design; it's the equivalent of slipping the clutch on a hill start.
I'm mystified as to how incorrect currents/voltages could cause an induction motor to overspeed. Surely rotation speed is driven solely by the frequency, less an allowance of a few percent for slip?
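For reference, the shaft speed follows directly from the supply frequency and pole-pair count; a quick sanity check (the pole count and slip figure are assumed examples):

```python
def shaft_speed_rpm(f_hz: float, pole_pairs: int, slip: float) -> float:
    """Shaft speed of an induction motor: synchronous speed less slip."""
    n_sync = 60.0 * f_hz / pole_pairs  # synchronous speed in rpm
    return n_sync * (1.0 - slip)

# 4-pole (2 pole-pair) motor on 50 Hz with 3% slip -- assumed example figures.
print(shaft_speed_rpm(50.0, 2, 0.03))  # 1455 rpm, just under the 1500 rpm synchronous speed
```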
Andrew