item 43070990


viernullvier | 1 year ago

It's because the default "analog output" PWM mode of a microcontroller only gives a rough approximation of the signal the servo actually requires. For a servo, the duty cycle is (almost) irrelevant; the 0-100% scale has no meaning here. What matters is the absolute width of the control pulses, typically between 1 and 2 milliseconds - the gaps between them can be arbitrarily long within a certain range.
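A minimal sketch of the distinction, assuming the common hobby-servo convention of a roughly 1.0-2.0 ms pulse (the exact endpoints and function names here are illustrative, not from any particular servo or library):

```python
def servo_pulse_us(angle_deg, min_us=1000, max_us=2000):
    """Map a servo angle (0-180 deg) to a control pulse width in microseconds.

    The servo reads the pulse *width*, not the duty cycle. The 1000/2000 us
    endpoints are an assumption; real servos vary.
    """
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    return min_us + (max_us - min_us) * angle_deg / 180

def duty_cycle(pulse_us, frame_us):
    """Duty cycle that the same pulse width yields at a given frame period."""
    return pulse_us / frame_us

# The same 1.5 ms "center" pulse gives a different duty cycle depending on
# the frame period, which is why a plain 0-100% PWM scale is meaningless:
center = servo_pulse_us(90)                  # 1500 us
print(duty_cycle(center, 20000))             # 50 Hz frame
print(duty_cycle(center, 10000))             # 100 Hz frame
```

Both frame periods command the same servo position, even though the duty cycles differ by a factor of two.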

If you think about it, it actually makes a tiny bit of sense. First, it is fail-safe: breaking the control line or shorting it to ground will not move the servo to 0%, and shorting it to signal level will not move it to 100% - it just doesn't move at all and stops applying force. Any sentient being within the movement range will definitely prefer that to random movements. Second, it can actually be pretty precise: the driver circuit can be completely analog, so it doesn't have to be limited by arbitrary digital quantization steps. All it needs to do is check whether the current encoder value is above or below the target and apply power to the motor accordingly.
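That last sentence is a bang-bang control loop. A toy sketch of the idea (names and the deadband value are hypothetical, and a real analog driver does this with a comparator rather than code):

```python
def drive(encoder, target, deadband=2):
    """One step of the compare-and-drive loop: return the motor drive sign
    based only on whether the encoder reading is above or below the target.
    The deadband stops the motor from chattering around the setpoint."""
    error = target - encoder
    if abs(error) <= deadband:
        return 0          # close enough: stop applying force
    return 1 if error > 0 else -1

# Simulate the mechanism stepping toward the target until it settles:
encoder = 0
for _ in range(100):
    encoder += drive(encoder, target=50)
print(encoder)  # settles within the deadband of 50
```

No quantized position table is involved; the loop only ever asks "above or below?", which is why the precision is limited by the encoder and comparator rather than by any digital step size.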
