This is called "aliasing".
Cheers,
Could you explain this phenomenon? I have never heard of "aliasing" before. I have heard of quantization distortion during the analog-to-digital conversion of much higher frequency analog signals.
I thought I did explain part of this. Be aware, I've been out of the lab for about ten years now, and some equipment has no doubt gotten "smarter".
But way back when our labs were starting to convert from analog scopes and other measuring equipment to digital, we were taught by the equipment reps about what they termed "aliasing".
Digital equipment does not have a direct link between the input and the display that you see. The input being measured is "sampled". Several functional blocks inside the equipment each handle part of the job, passing data between them to get the overall measurement done.
The A-to-D converters have a clock frequency (sometimes interactive with the time-base selection on scopes). This freezes the reading on the converter long enough for it to be passed on to a buffer for computational analysis, or to a display processor. So the actual voltage level being measured is sampled at some rate and interval. The display processor has its own update interval, which may or may not be in sync with the voltage sampling.
The display has a response time. LCD displays were often slow, which doesn't matter much, as the human eye is slow too. So sample rates can be higher than the display rates. If the voltage you are measuring has a periodic component, it may or may not be in sync with the display interval or the sampling rate of the A-to-D converters. So, measuring a signal or a changing input level accurately depends on the frequency response and clock frequencies of the sampling equipment, distributed among the functional blocks.
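To put a number on the "in sync or not" problem: the classic textbook case is sampling a sine wave at less than twice its frequency (the Nyquist rate). Here's a rough Python sketch of my own (the 9 Hz / 10 Hz numbers are just an illustration, not from any particular instrument):

```python
import math

f_signal = 9.0    # true input frequency, Hz
f_sample = 10.0   # sampling rate, Hz -- too slow; Nyquist would need > 18 Hz
f_alias = abs(f_sample - f_signal)   # the 9 Hz input masquerades as 1 Hz

# Sample the real 9 Hz signal and a 1 Hz sine at the same instants.
n_points = 20
signal = [math.sin(2 * math.pi * f_signal * n / f_sample) for n in range(n_points)]
alias = [math.sin(2 * math.pi * f_alias * n / f_sample) for n in range(n_points)]

# At these sample instants the 9 Hz wave is indistinguishable from a
# phase-inverted 1 Hz wave (9 = 10 - 1, hence the sign flip):
assert all(abs(s + a) < 1e-9 for s, a in zip(signal, alias))
```

The samples are real, but the waveform they appear to trace out is not the one on the input: that's the alias.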
An example. Let's say your display is updated once per second, and you feed in a test signal that changes from 12 V to 13 V every second. Your display could read either 12 V or 13 V, depending on how the two independent time bases synchronize or "beat" against each other. What you see is an "alias" of the actual signal or level on the input line. Equipment like this may still be perfectly adequate for the task, provided the operation isn't critical to whether the line sits at 12 or 13 V.
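That beat effect is easy to simulate. A hypothetical Python sketch of the same 12 V / 13 V setup (times are kept in integer tenths of a second to sidestep floating-point clutter, and the 1.9 s display period is just my stand-in for a slightly-off clock):

```python
def input_level(t_tenths):
    """Square-wave line level: 12 V during even seconds, 13 V during odd ones."""
    return 12.0 if (t_tenths // 10) % 2 == 0 else 13.0

# Display update exactly in step with one full signal period (2.0 s = 20
# tenths): it always catches the same phase, reads a steady 12 V, and
# never reveals that the line swings to 13 V at all.
in_sync = [input_level(n * 20) for n in range(5)]

# Display clock slightly fast (1.9 s = 19 tenths): each reading lands a
# little earlier in the waveform, so the displayed value flips between
# 12 and 13 on a slow ~20-sample "beat" -- an alias of the real 1 s
# square wave, not a faithful picture of it.
drifting = [input_level(n * 19) for n in range(21)]
```

Neither readout is "wrong" sample by sample; both are just poor representations of what the line is actually doing.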
Generally, you want all the internal housekeeping and transfer of data between the functional blocks of the equipment to be as fast as possible, or at least faster than the subject being tested. But speed costs money, and cheaper equipment is usually cheaper for a reason. So you have to be wary that your equipment isn't presenting an alias of what is really on the input line.
It gets far more complicated with oscilloscopes, where you can control sampling time bases, buffer size allotments, etc.
Does this help?