Five Ways to Shave Test Time
By Doug Rathburn, Keithley Instruments, Inc.

If your rack-and-stack instruments seem slow, it may be the default settings you are using.

In manufacturing, any bottleneck is objectionable, but it really stands out in a test station at the end of a production line. This puts a lot of heat on the engineer responsible for test station throughput. Those who find themselves in that position may need to look for ways of shaving test time. The author, an application engineer, has found that the problem is often a matter of the test instrument being used with its factory default settings. There are five widely used settings that can be adjusted to speed up measurements, but they must be balanced against accuracy requirements. The five instrument settings involve:

1. signal integration period
2. auto-zero function
3. triggering functions
4. digital filtering
5. auto-ranging

Since most production test systems are automated under PC control, data traffic on the external (usually IEEE-488) bus should also be considered. The way the system is programmed to use the external data bus has a significant effect on test cycle time. (While the tips in this article are aimed at rack-and-stack instruments connected on a GPIB bus, most of them also apply to stand-alone bench-top instruments in a wide variety of applications.)

Default Instrument Settings

Knowing that usability from the front panel is important, most instrument manufacturers choose default settings that are user-friendly. Generally, this means that any instrumentation or data acquisition hardware configured with a front panel will run relatively slowly as shipped from the factory. While an instrument's speed and accuracy specifications may be publicized and well known, the manufacturer has to reckon with what the user sees on the panel. For example, if an instrument took 2000 readings per second straight out of the box, your eyes would not be able to distinguish the data on the panel display. This would disturb many users; some might even think the instrument is defective. To a test engineer craving high throughput, user-friendly default settings chosen to keep front panel readings legible can be frustrating. Fortunately, the five settings listed earlier can be used to raise the sample rate.

Signal Integration Period: A major component of total test time is how long it takes the analog-to-digital converter (ADC) to acquire the data. For integrating ADCs, which are common in most rack-and-stack hardware, the acquisition time is typically expressed as a number of power line cycles (NPLC). This unit is used because line cycle noise is periodic, so integrating the signal over complete line cycles allows that noise to be averaged out of the digitized data. Most instruments are shipped from the factory with NPLC set to 1.0, i.e., the ADC integrates each reading over one full power line cycle (16.67 ms at 60 Hz, 20 ms at 50 Hz).
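To make the NPLC trade-off concrete, here is a minimal sketch of how one might shorten the integration period over GPIB from a PC using Python and PyVISA. The GPIB address and the exact SCPI command are assumptions for illustration; the syntax varies by instrument model, so consult the manual for the instrument in your rack.

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")  # hypothetical GPIB address

# NPLC -> integration time: 1.0 PLC is 16.67 ms at 60 Hz (20 ms at 50 Hz).
# Dropping from the factory default of 1.0 to 0.1 cuts the ADC's acquisition
# time roughly tenfold, at the cost of less line-cycle noise rejection.
dmm.write(":SENSe:VOLTage:DC:NPLCycles 0.1")  # assumed SCPI syntax

print(dmm.query(":READ?"))  # take one reading with the faster setting

Whether 0.1 PLC (or even 0.01 PLC, where supported) is acceptable depends on how much line-frequency noise your measurement can tolerate, which is exactly the speed-versus-accuracy balance the article describes.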
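As noted earlier, traffic on the IEEE-488 bus also affects cycle time. One common tactic is to buffer readings inside the instrument and fetch them in a single block transfer instead of issuing one query per reading. The sketch below illustrates the idea under assumed SCPI TRACe/TRIGger subsystem commands; the command names here are assumptions modeled on common SCPI usage, not a specific instrument's documented syntax.

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")  # hypothetical GPIB address

N = 100
dmm.write(f":TRIGger:COUNt {N}")        # assumed: take N triggered readings
dmm.write(f":TRACe:POINts {N}")         # assumed: size the internal buffer
dmm.write(":TRACe:FEED SENSe")          # assumed: route readings to buffer
dmm.write(":TRACe:FEED:CONTrol NEXT")   # assumed: start filling the buffer
dmm.write(":INITiate")                  # start the measurement sequence

# One bulk transfer replaces N separate query round-trips on the bus.
readings = dmm.query_ascii_values(":TRACe:DATA?")
print(len(readings), "readings fetched in a single bus transaction")

The saving comes from eliminating per-reading bus handshaking and command parsing, which often dominates cycle time once the integration period itself has been shortened.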