Introduction
Two of the most important parameters to specify for a Digital Storage Oscilloscope (DSO) are the length of its acquisition memory and the amount of processing RAM that can be applied to calculating answers from the raw data. The amount of acquisition memory in many cases determines the fidelity with which the scope can record a signal. But recording the signal is only the first step. The key to finding signal aberrations, characterizing circuit performance and making the wide variety of measurements that have made digital scopes popular is the processing horsepower of the oscilloscope.
Capturing a Signal
The maximum time window that can be captured by a digital oscilloscope using a sampling period Δt is:
$$\text{Time Window} = \Delta t \times \text{Acquisition Memory Length}$$
Where "Acquisition Memory Length" is the number of samples that can be captured in the data acquisition memory. Since the acquisition memory length is fixed, the only way to capture longer time windows is to make the period between samples longer (see Figure 1). For example, a scope with 100k points of memory and a sampling period of 2 nsec (500 MS/s) can capture a total time window of 0.2 msec at that sampling rate. If the user wanted to see a 4 msec signal using 100k samples, the points would have to be stretched farther apart to 40 nsec per sample (25 MS/s). This means the accuracy of timing measurements is degraded by a factor of 20 and many signal details are lost. Any frequency above 12.5 MHz (one half the sample rate) will be aliased. Many DSO users believe the ADC alone determines a scope's usable sampling rate; they don't realize the acquisition memory length also plays a vital role. The current state of the art for acquisition memory is 2 million sample points per channel. In the example just given, a full 4 msec can be captured at 2 nsec per sample using 2 million points. The bottom line is that a scope putting 2 million points on a signal will give you 20 times better timing accuracy, a much better view of your signal and more usable bandwidth than one which uses 100k points on the same signal.
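The arithmetic above can be sketched in a few lines of Python; the figures (100k and 2 million points, 2 nsec sampling, a 4 msec window) come straight from the example:

```python
# Sketch of the time-window arithmetic described above:
# Time Window = sample period x acquisition memory length.

def time_window(sample_period_s, memory_points):
    """Longest capture possible at a given sampling period."""
    return sample_period_s * memory_points

def required_period(window_s, memory_points):
    """Sampling period forced by a fixed memory length."""
    return window_s / memory_points

# 100k points at 2 ns/sample (500 MS/s) covers only 0.2 ms.
print(time_window(2e-9, 100_000))        # 0.0002 s = 0.2 ms

# Capturing a 4 ms window with only 100k points stretches the sample
# period to 40 ns (25 MS/s); anything above 12.5 MHz aliases.
dt = required_period(4e-3, 100_000)
print(dt, 0.5 / dt)                      # 4e-08 s, Nyquist 12.5 MHz

# With 2 million points, the full 4 ms fits at 2 ns/sample.
print(required_period(4e-3, 2_000_000))  # 2e-09 s
```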
How To Use Long Memory To Spot Signal Irregularities
One of the prime purposes of an oscilloscope is to troubleshoot problems, and the toughest problems are ones which occur infrequently. Scope vendors have been working hard to help the engineer with this task. One recently introduced scope has a chip set that can quickly acquire many triggers and display a view of them in color persistence mode. Less frequent events come out in a different color than common events, but few analysis tools are available. There is a further limitation that only 500 points can be acquired per trigger. This means the signal must have a short, simple shape, or the sampling rate must be reduced to record long events (with the danger that signal details and glitches will be missed between samples). Long memory can be used in a different way to attack this problem. Suppose the symptom is occasional misbehavior of a clock. The nature of the problem is unknown, so there is no a priori knowledge that would allow the engineer to set up a special trigger (based on amplitude, rise time, width, etc.). The user can simply use auto trigger, acquire 2 million samples of continuous clock data (per trigger) and then histogram the pulse amplitudes, widths, rise times, areas or other parameters of interest. A single trigger with 2 million data points holds as much information as 4,000 triggers of 500 points each. In just a few triggers the user gets enough data to see the nature of the irregularities.
With this method there is measurable information on the number of occurrences of each type of wrongly shaped clock pulse. Figure 3 shows what the results might look like if the clock synchronizer occasionally chopped a clock pulse. There are rare pulses which are very short, each followed by a second pulse with a glitch. The histogram of 993 sweeps quickly acquires 7046 pulses. The lowest width is 7.4 nsec, the average is 50 nsec and the highest is 56.2 nsec. Note that the vertical scale is logarithmic. There are 12 bad clocks with 7.4 nsec width, 12 with 56.2 nsec width and 7012 with the normal 50 nsec width. The user can measure the ratio of good to bad pulses, make an adjustment and see if it has a measurable effect, or use the data in the histogram to set up a special trigger based on width in order to troubleshoot the problem.
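The histogramming step can be sketched with synthetic data. This is a hypothetical reconstruction, not the scope's actual firmware: the pulse widths and counts only loosely follow the Figure 3 example, and the jitter values are invented for illustration.

```python
import numpy as np

# Hypothetical sketch of the long-memory histogram approach: measure
# the width of every clock pulse in one long record, then histogram
# the widths. Populations loosely follow the Figure 3 example.
rng = np.random.default_rng(0)
widths_ns = np.concatenate([
    rng.normal(50.0, 0.2, 7022),   # normal 50 ns clock pulses
    rng.normal(7.4, 0.1, 12),      # chopped (very short) pulses
    rng.normal(56.2, 0.1, 12),     # pulses stretched by a glitch
])

counts, edges = np.histogram(widths_ns, bins=100, range=(0, 60))

# The rare populations stand out even though they are ~1000x less
# frequent -- this is why Figure 3 uses a logarithmic vertical scale.
short = counts[edges[:-1] < 20].sum()
print("pulses measured:", widths_ns.size)
print("short (chopped) pulses:", short)
```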
The Effects of Long Memory on Frequency Domain Measurements
One of the most common options in digital scopes is FFT (Fast Fourier Transform) capability. Since the Fourier transform in a DSO comes from a set of discrete points (with sampling period Δt), the information in the frequency domain is also a discrete set of points (whose spacing is Δf). The resolution in the frequency domain is determined by two factors: the frequency span being measured and the number of points within that span. Nyquist's theorem determines the range of frequencies that can be measured: they range from DC to one half the sampling rate at which the data was captured. A Fourier transform of an array of N time domain data points produces N/2 frequency domain points within the range of frequencies between DC and the Nyquist frequency, so the frequency resolution of the FFT is
$$\Delta f = \frac{\tfrac{1}{2} \times \text{Sampling Rate}}{\tfrac{1}{2} \times \text{Number of points input to the FFT algorithm}}$$
The two factors of 1/2 cancel giving a resolution equal to the Sampling Rate divided by the number of points input to the FFT. Obviously it is important to capture the data at a high sampling rate. Long data acquisition memory plays an important role here since it allows the scope to have a fast sampling rate for a longer period of time. It is also clear that we can put more data points into the FFT algorithm if we capture more points in a long memory scope. But just capturing the points isn't enough. The DSO needs to have the processing horsepower to actually compute the FFT on a long data array.
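The bin-spacing relationship is easy to confirm numerically; a minimal sketch using NumPy's FFT helpers, with the 500 MS/s rate and the point counts from the example that follows:

```python
import numpy as np

# Sketch: the FFT bin spacing equals Sampling Rate / number of points,
# and the highest bin is the Nyquist frequency (half the sample rate).
fs = 500e6          # 500 MS/s sampling rate

for n_points in (10_000, 500_000):
    freqs = np.fft.rfftfreq(n_points, d=1.0 / fs)
    df = freqs[1] - freqs[0]
    print(f"{n_points:>7} points -> bin spacing {df:.0f} Hz, "
          f"top frequency {freqs[-1] / 1e6:.0f} MHz")
```

With 10,000 points the bins are 50 kHz apart; with 500,000 points they shrink to 1 kHz, the factor-of-50 improvement discussed below.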
As an example, one recently introduced digital scope can capture up to 500,000 points of data on a signal, but the FFT processing in the scope is limited to the first 10,000 points captured. This loses a factor of 50 in the resolution of the FFT compared to a scope which can process the complete 500,000 points--a tremendous loss in frequency information. Why would a vendor do this? The answer lies in the next important facet of memory in a DSO. An FFT calculation is complex and may require ten times as much RAM in the processing memory as the number of points input to the FFT algorithm; performing an FFT on a 500,000 point waveform may require 5 Mbytes of RAM. You also need a fast, powerful processor and numerical coprocessor to handle long data arrays. Both the RAM and the processor/coprocessor add considerable cost to a scope.
Figure 4 shows the difference made by this trade off between price and performance. On the left, an FFT is performed on the first 10,000 points of a waveform. On the right, 1,000,000 samples are captured on the same waveform and an FFT is performed on the entire record. Both sets of data are captured at a 500 MS/s sampling rate (so the highest frequency component measured is 250 MHz). The difference in frequency resolution is a factor of 100 (50 kHz vs 500 Hz). The frequency peaks on the bottom of the left screen image are very broad; in fact there is only a single point on each of them. On the right, the peaks are seen more accurately as being very narrow. In fact, the first peak is really two peaks at closely spaced frequencies. Those two peaks could not be resolved by the 10,000-point FFT.
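The same effect can be reproduced with synthetic data. This sketch is not the Figure 4 measurement itself; the two tone frequencies (10.000 and 10.005 MHz, only 5 kHz apart) and the simple peak-counting heuristic are assumptions chosen to make the resolution difference visible:

```python
import numpy as np

# Two tones 5 kHz apart: resolvable with 500 Hz bins (1,000,000-point
# FFT at 500 MS/s) but merged into one broad peak with 50 kHz bins
# (10,000-point FFT).
fs = 500e6                      # 500 MS/s
f1, f2 = 10.000e6, 10.005e6     # hypothetical closely spaced tones

def count_peaks(n_points):
    t = np.arange(n_points) / fs
    signal = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
    spectrum = np.abs(np.fft.rfft(signal))
    # Count local maxima that rise above half the tallest peak.
    big = spectrum > 0.5 * spectrum.max()
    local_max = np.r_[False, (spectrum[1:-1] > spectrum[:-2]) &
                             (spectrum[1:-1] > spectrum[2:]), False]
    return int(np.sum(big & local_max))

print(count_peaks(10_000))      # short record: the tones merge
print(count_peaks(1_000_000))   # long record: two distinct peaks
```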
Capturing a Sequence of Signals with Minimum Dead Time
Many digital scopes have a mode which allows the user to segment the data acquisition memory into separate pieces. Each time the scope triggers, data is written into one of the segments of the memory and the scope quickly rearms itself for the next trigger. This mode is useful for capturing signals where the dead time between events would otherwise use up the scope's memory recording uninteresting information. An example would be 5 nsec wide laser pulses occurring 200 times per second. If a scientist is examining a sample with a short lifetime, he would want to capture every laser pulse, but not the dead time between pulses. This mode is also useful for recording short, irregularly spaced events such as intermittent failures in a circuit set to run an overnight test, or capturing the effects of lightning bursts on communications lines. The user cannot predict when the event will occur and it is important not to miss any occurrences.
It is easy to see how the benefits of long memory can be applied to a fast capture-and-store mode like the one described above. The power of this triggering mode comes from dividing the acquisition memory into separate blocks which each record one event. A longer memory allows the user to have more blocks and gives the flexibility to put a larger amount of memory into each block (thereby getting more points on each event of interest). A LeCroy 9300 series scope can record as many as 2,000 separate triggers in Sequence mode and use a total memory length of up to 8,000,000 points. Also, each trigger has its own time stamp to mark when the event occurred. In contrast, competitors with 500,000 data acquisition points can only capture 910 events with a smaller amount of memory for each event, and do not save the time of occurrence.
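The memory-per-segment trade-off reduces to simple division; a minimal sketch using the figures quoted above:

```python
# Sketch of the segmented ("Sequence" mode) memory trade-off: the total
# acquisition memory is divided into one block per trigger.

def points_per_segment(total_memory, segments):
    """Samples available for each captured event."""
    return total_memory // segments

# 8,000,000 total points split across 2,000 triggers still leaves
# 4,000 samples on every event.
print(points_per_segment(8_000_000, 2_000))   # 4000

# A 500,000-point memory split across 910 events leaves ~549 samples each.
print(points_per_segment(500_000, 910))       # 549
```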
How Long Memory Affects Peak Detection
In Peak Detect mode a digital scope will keep its ADC running at the full sampling rate even on slow timebases. As shown in the first part of this article, storing every sample in this situation would require more memory than is available. Suppose the timebase is 2 msec/div (20 msec for the full screen) and the sampling rate of the ADC is 500 MS/s (2 nsec per sample). In order to store the entire 20 msec of data the scope would require 10,000,000 memory points. In Peak Detect mode, a scope with 100,000 memory points would have its ADC sample and measure all 10,000,000 points but only store 100,000 of them. The 100,000 that are stored are chosen by looking at small bins of data and saving the maximum and minimum (peak) values from each. In this example, the ratio between the number of measurements by the ADC and the number of memory points that can be stored is 100:1, so the scope examines each group of 200 points as they are acquired and saves only 2 of them (the maximum and minimum values). Even though the ADC is running at its full rate, the timing resolution of the scope is severely compromised because the user does not know whether the peaks occurred at the beginning, middle or end of the group from which they were saved. Normally it is not possible to perform math operations on peak detect waveforms because of this uncertainty in the time at which the samples were taken.
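The min/max decimation can be sketched as follows. This is a simplified software model of the idea, not a scope's actual hardware implementation; the spike position and noise level are invented for illustration:

```python
import numpy as np

# Sketch of peak-detect decimation: the ADC measures every sample,
# but only the min and max of each bin are stored. A bin size of 200
# gives the 100:1 compression in the example above.

def peak_detect(samples, bin_size):
    """Keep only the min and max of each bin of `bin_size` samples."""
    bins = samples.reshape(-1, bin_size)
    # Interleave min/max so stored points stay in rough time order.
    return np.stack([bins.min(axis=1), bins.max(axis=1)], axis=1).ravel()

rng = np.random.default_rng(1)
raw = rng.normal(0.0, 0.01, 1_000_000)  # low-level noise floor
raw[123_456] = 1.0                      # one narrow spike the ADC catches

stored = peak_detect(raw, 200)
print(stored.size)                      # 10000 points kept of 1,000,000
print(stored.max())                     # the spike survives: 1.0
```

Note that a single-sample spike survives the 100:1 compression, whereas ordinary decimation (keeping every 100th sample) would almost certainly miss it.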
A long memory scope brings three benefits to Peak Detect mode. The first is that the longer memory can record normal data at the full sampling rate for a longer period of time, so the user isn't forced to resort to peak detect in the first place. Secondly, when in peak detect mode, the longer memory scope can save a larger proportion of the data (perhaps 1,000,000 out of 10,000,000 samples rather than only 100,000). A third benefit is found only in LeCroy scopes: in peak detect mode a 9300 scope will allocate half of its memory to storing the data peaks as described above and the other half to sampling data in the normal fashion. This allows the customer to view all the peaks and still be able to perform math. An example might be an engineer who is recording voltage on channel 1 and current on channel 2. On a third trace he displays power (channel 1 × channel 2), which is the real waveform of interest. But it is also important for him to know if there are any spikes in the current or voltage waveforms. With a 9300 series scope, he can do the whole job.

In buying $5,000 computers we have become experts at checking the amount of RAM, how much cache memory there is for the processor and the amount of local RAM on the video board. Those memories are very important to the power of the computer. The same is true with digital scopes. There is data acquisition memory, processing RAM, display memory and storage memory for both waveforms and front panel setups. The power of the scope is in its ability to capture, view, measure, analyze and archive signals--all of which are tied to its memory.