It is critical to understand the difference between Accuracy, Resolution and Precision in measurement instruments. As usual, an excellent article can be found on Wikipedia.
In short, Precision quantifies the instrument's ability to produce the same result over and over again for a given input condition. Resolution defines the "fineness" of the numerical representation of the result. Accuracy, on the other hand, quantifies the deviation of the measurement from "reality" or, better put, from a given reference value. The three metrics are independent of one another for a given instrument and may vary over time.
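The distinction can be made concrete with a small sketch. The readings and the reference value below are made up for illustration: a stable 5.000 V reference measured repeatedly by a meter that displays three decimal places.

```python
import statistics

reference = 5.000  # hypothetical reference value in volts
readings = [5.012, 5.011, 5.013, 5.012, 5.012,
            5.011, 5.013, 5.012, 5.011, 5.012]

resolution = 0.001                                      # smallest displayed step (1 mV)
precision = statistics.stdev(readings)                  # spread of repeated readings
accuracy_error = statistics.mean(readings) - reference  # deviation from the reference

print(f"resolution:     {resolution} V")
print(f"precision (sd): {precision:.4f} V")
print(f"accuracy error: {accuracy_error:+.4f} V")
```

Note how the three metrics come apart: the readings repeat within less than the display resolution (good precision), yet they all sit about 12 mV away from the reference (poor accuracy).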
The characteristics of both accuracy and precision differ depending on the type of instrument being looked at. For a voltage measurement, for example, the accuracy error can have a component that depends on the measured input (like a percentage of the reading) and one that is independent of it (like an offset).
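This two-part error model can be sketched as follows; the specification numbers are invented, not taken from any real datasheet:

```python
# Hypothetical voltage-measurement error model: a gain component
# proportional to the reading plus an input-independent offset.
gain_error = 0.001    # 0.1 % of reading (assumed spec)
offset_error = 0.002  # 2 mV, independent of the input (assumed spec)

def worst_case_error(reading_v: float) -> float:
    """Worst-case accuracy error in volts for a given reading."""
    return abs(reading_v) * gain_error + offset_error

print(worst_case_error(1.0))    # offset dominates at small inputs
print(worst_case_error(100.0))  # gain error dominates at large inputs
```

At 1 V the offset term is the larger contributor; at 100 V the percentage term dwarfs it, which is why datasheets typically quote both.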
For a frequency measurement, accuracy is potentially unreliable, depending on the quality of the time base and its stabilization, but precision can be expected to be fairly good once the time base has settled.
Understanding these instrument specifics can greatly help in developing a good feel for how to interpret a measured value.