Remember, what you measure is one thing and how often you show it is another. You can measure with millisecond accuracy but only update the screen at decisecond (100 ms) intervals. It's not like anyone will be able to read the digits at 1000 Hz, especially as the screen itself only refreshes at maybe 50 Hz...
However, once you press stop (or whatever you do when you want the exact measurement), you read the clock at that precise moment, at full precision.
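Here's a minimal sketch of that separation in Python (the `Stopwatch` class and the 2-second "stop" condition are just placeholders for illustration): the clock is read at full precision, while the display loop refreshes only every 100 ms.

```python
import time

class Stopwatch:
    """Measures with a monotonic clock; the display decides its own rate."""

    def __init__(self):
        self.start_time = None

    def start(self):
        # time.monotonic() is unaffected by system clock adjustments,
        # which makes it the right source for measuring durations.
        self.start_time = time.monotonic()

    def elapsed(self):
        return time.monotonic() - self.start_time


watch = Stopwatch()
watch.start()
while watch.elapsed() < 2.0:          # stand-in for "until stop is pressed"
    # Display loop: refresh at 10 Hz (deciseconds), even though the
    # underlying measurement is far more precise.
    print(f"\r{watch.elapsed():5.1f} s", end="", flush=True)
    time.sleep(0.1)

# On "stop", read the clock at that precise moment, at full precision,
# independently of when the display last refreshed.
print(f"\rfinal: {watch.elapsed():.6f} s")
```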
For example, it's common to do the opposite in clocks with minute accuracy. Even though you only show minutes, you update the screen every second, to prevent odd effects when a timer event is delayed. (Which happens; a timer is not an exact tick, it's more like "wait at least this long, then get back to me when you have time for it".) With only one event per minute, a delayed event can land in the next minute, and a whole minute gets skipped on the display.
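One way to sketch that in Python, using a simple once-per-second polling loop as a stand-in for real timer events (the exact mechanics will differ in a GUI toolkit or embedded system):

```python
import time

last_shown = None
while True:  # Ctrl-C to quit
    # Recompute the displayed value every second, even though it only
    # changes once per minute. If one tick arrives late, the next one
    # still lands well inside the current minute, so no minute is skipped.
    shown = time.strftime("%H:%M", time.localtime())
    if shown != last_shown:
        print(shown)               # redraw only when the minute changes
        last_shown = shown
    time.sleep(1.0)                # "wait at least this long", not an exact tick
```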
Another example, which I often use, is status displays that show "x objects of y done". These can get quite flickery if the operations are quick, which I solve by updating the screen only once per second. Sure, the numbers will "skip", but they will be readable and not cause epileptic seizures.
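A sketch of that throttle (`process_item` and `total` are hypothetical stand-ins for the real work): the counter advances on every item, but the screen is redrawn at most once per second, plus one final draw so it ends on the true count.

```python
import time

def process_item(i):
    time.sleep(0.001)              # stand-in for the real, fast operation

total = 5000
last_draw = 0.0
for done in range(1, total + 1):
    process_item(done)
    now = time.monotonic()
    if now - last_draw >= 1.0 or done == total:
        # Redraw at most once per second; the numbers "skip", but the
        # display stays readable. Always draw the final state too.
        print(f"\r{done} objects of {total} done", end="", flush=True)
        last_draw = now
print()
```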
So, think about disconnecting the measurement from the display, and do each as well as you can according to its own needs.