Hey all,
I was wondering whether the resolution of the GetTickCount timer function is as coarse on Windows CE as it is on the desktop, where it's only good to about 55 ms on Windows 9x and roughly 10-16 ms on NT/2000/XP.
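For what it's worth, here's the quick test I've been using on the desktop to see the effective granularity: it spins until the tick value changes and records the smallest step it ever observes. I haven't tried it on a PPC, so treat it as a sketch:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD minStep = 0xFFFFFFFF;
    int i;

    for (i = 0; i < 20; i++) {
        DWORD start = GetTickCount();
        DWORD now;
        do {
            now = GetTickCount();    /* busy-wait for the next tick */
        } while (now == start);
        if (now - start < minStep)   /* keep the smallest step seen */
            minStep = now - start;
    }
    printf("Smallest GetTickCount step: %lu ms\n", (unsigned long)minStep);
    return 0;
}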
I ask because I want to decouple my game physics from the frame rate, so the game can render at whatever FPS a given PPC can manage (newer PPCs would run it more smoothly) while the speed of gameplay stays the same; something like the loop sketched below. I've noticed that most PPC developers don't do this. Is it because the timer can't handle it? I read in the docs somewhere that it has a resolution of 1 ms, but that sounds too good to be true. Has anyone had any experience with this?
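To be concrete, this is the kind of fixed-timestep loop I mean (UpdatePhysics and RenderFrame are just placeholder stubs standing in for the real routines). The physics always advances in fixed 20 ms steps, while rendering runs as fast as the hardware allows:

#include <windows.h>

/* Placeholder hooks -- stand-ins for the real physics and drawing code. */
static void UpdatePhysics(void) { /* advance the simulation by one step */ }
static void RenderFrame(void)   { /* draw the current state */ }

#define PHYSICS_STEP_MS 20   /* fixed 50 Hz simulation rate */

int main(void)
{
    DWORD previous = GetTickCount();
    DWORD accumulator = 0;

    for (;;) {
        DWORD now = GetTickCount();
        accumulator += now - previous;   /* milliseconds since last pass */
        previous = now;

        /* Run however many fixed steps the elapsed time covers: a slow
           PPC renders fewer frames but simulates the same game time. */
        while (accumulator >= PHYSICS_STEP_MS) {
            UpdatePhysics();
            accumulator -= PHYSICS_STEP_MS;
        }

        RenderFrame();   /* draw as often as the device allows */
    }
    return 0;   /* unreachable; the loop runs until the app is killed */
}

A fast PPC just gets more RenderFrame calls in between physics steps, so the gameplay speed never changes.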
I don't suppose the StrongARM processors have high-resolution performance counters like the Pentium's RDTSC? I couldn't find anything about it in the SDK docs.
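The closest thing I've found in the Win32 API is QueryPerformanceCounter, which does exist on Windows CE, but as far as I can tell the resolution is up to the OEM and the fallback may be nothing better than the millisecond tick. Something like this should reveal what a given device actually provides (again untested on a PPC, so it's only a sketch):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, before, after;

    /* QueryPerformanceFrequency returns FALSE when no high-resolution
       counter exists; on CE the OEM decides what backs the counter. */
    if (!QueryPerformanceFrequency(&freq)) {
        printf("No performance counter on this device.\n");
        return 0;
    }
    printf("Counter frequency: %I64d Hz\n", freq.QuadPart);

    QueryPerformanceCounter(&before);
    Sleep(10);                          /* a known ~10 ms interval */
    QueryPerformanceCounter(&after);
    printf("~10 ms measured as %I64d ticks\n",
           after.QuadPart - before.QuadPart);
    return 0;
}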
Thanks for any input