Hi everybody!
I'm trying to use the QDEC driver on the XMEGA128A1. Everything is fine when using 16-bit resolution, where the result is a signed integer in the range -32768..32767. When I try to use 32-bit resolution, the result is correct only while the counter is decrementing from its start value. When it is incrementing, the first value after 0 is 65535, and the counting seems to be wrong from there on too. I have no idea what could be wrong; the test program is practically the same as the one in the demos.
I'll be glad for any tip. Thank you, Tomas.