Bypass the 15ms inaccuracy

As Tim Mangan recently blogged, the system timer on a multiprocessor Windows machine ticks at a 15ms interval. The effect of this 15ms is that a time measurement or calculation in your program can be off by up to 15ms. This is fine for most operations, but not when you need accuracy to the millisecond.

Although the best solution, as Tim Mangan wrote, is to change the system timers themselves, a workaround is available.

In winmm.dll (the Windows multimedia library) an API is available to request a minimum resolution for periodic timers. When this API is called with an argument of 1, the timer resolution changes from 15ms to 1ms.

When accuracy is needed, the API “timeBeginPeriod” is called with the minimum resolution in milliseconds as its argument. The timers will now be more accurate.
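
In C, the request could look like the minimal sketch below (the function name is just an example, and the #pragma line is the MSVC way of linking against winmm.lib):

#include <windows.h>
#include <mmsystem.h>                   /* timeBeginPeriod lives in winmm.dll */
#pragma comment(lib, "winmm.lib")

/* Request a 1ms timer resolution instead of the default 15ms.
   Returns TIMERR_NOERROR on success, TIMERR_NOCANDO if the value is out of range. */
MMRESULT RequestMillisecondTimer(void)
{
    return timeBeginPeriod(1);
}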

Next, when the system time needs to be read, the API “timeGetTime” is called. This API will return the system time (the number of milliseconds since Windows was started).
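
A sketch of such a measurement in C; DoWork is only a placeholder for the code being measured:

#include <windows.h>
#include <mmsystem.h>                   /* timeGetTime */
#pragma comment(lib, "winmm.lib")

/* Measure how long DoWork() takes, in milliseconds. */
DWORD MeasureDoWork(void (*DoWork)(void))
{
    DWORD start = timeGetTime();        /* ms since Windows was started */
    DoWork();
    return timeGetTime() - start;       /* elapsed time in ms */
}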

When the accuracy is no longer needed, the request can be ended with the API “timeEndPeriod”, called with the same value that was passed to “timeBeginPeriod”.
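
Putting the three calls together, a minimal sketch of the complete flow could look like this; Sleep(50) is just a stand-in for the real work being measured:

#include <stdio.h>
#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

int main(void)
{
    /* Start of the section that needs millisecond accuracy. */
    if (timeBeginPeriod(1) == TIMERR_NOERROR)
    {
        DWORD start = timeGetTime();

        Sleep(50);                                   /* stand-in for the real work */

        printf("Elapsed: %lu ms\n", (unsigned long)(timeGetTime() - start));

        /* The extra accuracy is no longer needed: release the request. */
        timeEndPeriod(1);
    }
    return 0;
}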

The timer resolution is a system-wide setting, so changing it affects the whole machine. This is important because a higher resolution costs more CPU cycles, which consume more power. On a laptop this will drain the battery a lot faster!

Ingmar Verheij