Current is basically proportional to torque – with a small additional magnetisation current – and the output voltage is roughly proportional to speed. Assuming you have set up the VFD with the correct nameplate info (particularly the number of poles or base speed and the phase-phase voltage), there isn't a whole lot to get wrong.
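To put a rough sketch on those proportionalities (the 400 V / 50 Hz / 4-pole nameplate figures below are just assumed examples, not anything from your drive), a constant-V/f scaling below base speed looks like this in Python:

    V_BASE = 400.0   # assumed phase-phase volts at base speed
    F_BASE = 50.0    # assumed base frequency, Hz
    POLES = 4        # assumed pole count

    def output_voltage(f_out):
        # Output voltage rises roughly linearly with output frequency (constant V/f),
        # capped at the nameplate voltage above base speed.
        return V_BASE * min(f_out / F_BASE, 1.0)

    def synchronous_rpm(f_out):
        # Synchronous speed from output frequency and pole count.
        return 120.0 * f_out / POLES

    for f in (5, 25, 50):
        print(f"{f:>3} Hz -> {synchronous_rpm(f):6.0f} rpm sync, ~{output_voltage(f):4.0f} V")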
You can develop a significant motor current (torque) at low speed, but the corresponding motor voltage will be low, so the corresponding real power at either the input or the output will also be low. The efficiency of a modern VFD will be in the high 90s (percent) at rated output and, obviously, zero at stall.
VA is specified because motors are fairly inductive (poor power factor), so the phase current isn't in phase with the drive voltage. The shaft power is developed by the real power (W), while the reactive power is basically recirculated. The power factor at the VFD input is fairly close to unity (not least due to legislation), so the big discrepancy between VA and W only really arises on the output.
If the VFD load were purely resistive, the VA and W would measure the same. Conversely, if it were purely inductive, the VA would be high and the W would be zero – and the VFD input power would only have to cover resistive losses.
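To put some illustrative numbers on the VA-versus-W point (the 400 V, 10 A, 0.85 power factor figures are just assumed examples):

    import math

    V_LL = 400.0     # assumed phase-phase volts at the drive output
    I_PHASE = 10.0   # assumed RMS phase current
    PF = 0.85        # assumed cos(phi) at the drive output; the drive input is close to 1.0

    S = math.sqrt(3) * V_LL * I_PHASE      # apparent power, VA
    P = S * PF                             # real power, W (this is what develops shaft power)
    Q = S * math.sin(math.acos(PF))        # reactive power, var (recirculated, not delivered)

    print(f"S = {S:.0f} VA, P = {P:.0f} W, Q = {Q:.0f} var")
    # PF = 1 (purely resistive): P equals S.  PF = 0 (purely inductive): P is zero.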
When you are designing a VFD, the DC bus voltage is generally well defined and the main critical parameter is the output phase current. With IGBTs, it is the average AC phase current that matters, because the conduction loss is roughly the saturation voltage times the current. With a FET output stage it would be the RMS current, since the loss goes as current squared times on-resistance. In practice, 200V and 400V class VFDs use IGBTs.
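A very rough sketch of that average-versus-RMS distinction (the device figures are made-up examples, and this ignores switching losses and the fact that each device only conducts for part of the cycle):

    import math

    I_RMS = 10.0                      # assumed sinusoidal phase current, RMS
    I_PEAK = I_RMS * math.sqrt(2)
    I_AVG = 2.0 * I_PEAK / math.pi    # average of the rectified sine

    VCE_SAT = 1.8    # V, rough IGBT conduction drop (example figure only)
    R_DS_ON = 0.05   # ohm, rough FET on-resistance (example figure only)

    p_igbt = VCE_SAT * I_AVG          # IGBT conduction loss scales with average current
    p_fet = R_DS_ON * I_RMS ** 2      # FET conduction loss scales with RMS current squared

    print(f"avg {I_AVG:.1f} A vs rms {I_RMS:.1f} A -> IGBT ~{p_igbt:.1f} W, FET ~{p_fet:.1f} W")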
Murray