Q. How should the input integration time
parameter (int.t) be set on the UDM meter?
The UDM samples the input signal at a sampling frequency of 256 to 2560 Hz, which corresponds to a measuring window (sampling period) of 100 to 1000 ms. The measuring error is minimised when the sampling period is an integer multiple of the input waveform period. The meter analyses the zero crossings of the waveform to determine its period and, if int.t is set to Auto, adjusts its sampling period accordingly.
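The zero-crossing analysis described above can be illustrated with a minimal sketch (the UDM's actual firmware algorithm is not published, so this shows the general technique, not the meter's code). The hypothetical function estimate_period below derives the waveform period from the mean spacing of its rising zero crossings:

    import numpy as np

    def estimate_period(samples, fs):
        """Estimate the waveform period in seconds from rising zero
        crossings. Illustrative sketch only, not the UDM firmware;
        assumes a reasonably clean signal with no DC offset."""
        negative = np.signbit(samples)
        # Indices where the signal goes from negative to non-negative
        rising = np.where(negative[:-1] & ~negative[1:])[0]
        if len(rising) < 2:
            raise ValueError("need at least two rising zero crossings")
        # Mean crossing spacing, converted from samples to seconds
        return np.mean(np.diff(rising)) / fs

    # Example: a clean 50 Hz sine sampled at 2560 Hz
    fs = 2560.0
    t = np.arange(0.0, 0.5, 1.0 / fs)
    print(estimate_period(np.sin(2 * np.pi * 50.0 * t), fs))  # ~0.020 s

Distortion or a DC offset shifts these crossings, which is exactly why a manual int.t is recommended in the cases below.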
If the waveform is distorted or has a DC component, the zero crossings cannot be analysed correctly, and a measurement error could be introduced. To avoid this, the sampling period must be set manually (int.t parameter). For DC measurements, a manually set integration time can likewise improve accuracy when a ripple is superimposed on the signal. In every case, int.t must be an integer multiple of the period of the waveform or of the ripple, as in the example below.
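As a worked example of that rule, the sketch below lists the int.t values that are integer multiples of a given fundamental period and still fit inside the meter's 100 to 1000 ms window (valid_integration_times is a hypothetical helper, not part of any UDM configuration interface):

    import math

    def valid_integration_times(fundamental_hz, t_min_s=0.100, t_max_s=1.000):
        """List int.t candidates: integer multiples of the waveform (or
        ripple) period within the 100-1000 ms window quoted above.
        Illustrative helper, not a UDM API."""
        period = 1.0 / fundamental_hz
        n_min = math.ceil(t_min_s / period - 1e-9)   # small tolerances guard
        n_max = math.floor(t_max_s / period + 1e-9)  # against float rounding
        return [n * period for n in range(n_min, n_max + 1)]

    # 50 Hz mains (20 ms period): 100 ms, 120 ms, ..., up to 1000 ms
    for t in valid_integration_times(50.0)[:3]:
        print(f"{t * 1e3:.0f} ms")

For 60 Hz mains (16.67 ms period), the first valid value is 6 × 16.67 ms = 100 ms.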