AVS1996 Session VT-MoA: Calibration and Precision Gauging
Monday, October 14, 1996 1:30 PM in Room 104A/B
Monday Afternoon
1:30 PM (Invited)
VT-MoA-1 Accurate Vacuum Measurements; How and Why
C. Tilford (National Institute of Standards & Technology) The increasing economic importance and complexity of vacuum processing is focusing attention on the importance of product quality and efficient control of manufacturing processes. Attainment of these goals, and the acquisition of reliable data in all aspects of vacuum science and technology, require reliable measurements. The definition of reliable depends on the application, but generally it comes down to the question of traceability, i.e., how are vacuum measurements related to the accepted system of physical measurements (the Système International)? Although vacuum measurement research has been, from the beginning, fundamental to the development of vacuum science and technology, the infrastructure to support traceable measurements is still in an early stage of development, so this question has been little explored. In the broadest and most practical sense this exploration must extend far beyond a discussion of primary standards and calibration laboratories. This talk will discuss traceability, primary standards and calibrations, but it will also discuss the issues of "alternate" calibration techniques, instrument performance and proper application of instruments (i.e., what is the relationship between the instrument reading and the quantity of interest).
2:10 PM
VT-MoA-3 Fabrication and Test of an Automatic Static Expansion System for Vacuum Gauge Calibration
M. Hirata (Electrotechnical Laboratory, Japan) An automatic static expansion system was fabricated and tested. It is suitable for the calibration of gauges such as spinning rotor gauges and diaphragm gauges in the medium and high vacuum ranges. It consists of four vacuum chambers of 6 L, 0.15 L, 11 L and 160 L, connected in series with pneumatic valves and evacuated by three turbo-molecular pumps. The largest chamber, whose ultimate pressure is 1.5x10^-7 Pa, is the main chamber for the calibration. A standard low pressure from 10^2 to 10^-5 Pa is generated in this chamber by multiple expansions, via valve operations, of test gas packed in the smallest chamber. A primary diaphragm gauge (FS 133 kPa), calibrated against a deadweight tester over the pressure range 5x10^3 to 10^5 Pa, is attached to the 6 L chamber to measure the initial pressure before expansion. This gauge is also used to measure the volume ratios of the chambers by expanding gas packed in the 6 L chamber into the other chambers. The expansion ratio from the 0.15 L chamber to the 11 L chamber is 60, and to the 11 L and 160 L chambers is 900. Valve operations and precise data acquisition are performed automatically by a personal computer in order to avoid erroneous operations, which decrease the reliability of the calibration and damage gauges and pumps through overstress. The uncertainty of the system for calibrating gauges is estimated at 0.5 to 2%. Details of the system and calibration results of gauges are presented.
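As a rough illustration of the static-expansion principle behind this standard, the sketch below divides a measured starting pressure by the volume (expansion) ratio of each step. The 900 ratio is quoted in the abstract; the starting pressure and the way repeated expansions are chained are assumptions for illustration, not the laboratory's actual procedure.

```python
# Minimal sketch of static expansion: each expansion of a packed gas sample
# into a larger, evacuated volume divides the pressure by the measured
# volume ratio for that step (ideal gas, isothermal).

def expanded_pressure(p_initial_pa, expansion_ratios):
    """Pressure after a sequence of static expansions."""
    p = p_initial_pa
    for ratio in expansion_ratios:
        p /= ratio
    return p

# Example (illustrative): gas packed in the 0.15 L chamber at 100 kPa, read
# by the FS 133 kPa diaphragm gauge, then expanded into the 11 L + 160 L
# volume with the quoted ratio of 900.
p_std = expanded_pressure(100e3, [900])
print(f"Generated standard pressure: {p_std:.3e} Pa")  # ~1.1e2 Pa
# Repeating the pack-and-expand cycle reaches the lower decades of the
# 10^2 to 10^-5 Pa range.
```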
2:30 PM
VT-MoA-4 A Novel Primary Pressure Standard for Calibration in the mTorr Range
L. Hinkle, D. Surette (MKS Instruments, Inc.) Primary standards for the measurement of gas pressure in the range from 1 to 100 mTorr (~0.1 to 10 Pa) have received a significant amount of attention in recent years. The interest in calibration capability in this vacuum range is, to some degree, driven by the increasingly stringent requirements of the semiconductor processing industry. A novel, primary technique for generating known pressures throughout the mTorr range is presented here. The standard is based on calculating the gas pressure required to restore a diaphragm to a null position while the diaphragm is tilted to various, measured inclinations. With this as a basis, the calculations are relatively simple; there are no gas-property-dependent effects; and the system design is simple, robust, and easily automated. A formal evaluation of uncertainty for the system developed in this facility is reviewed and compared with other primary standards. For the 1 to 100 mTorr range, there are numerous practical advantages of this standard relative to liquid manometers, deadweight testers, volume expansion systems, and conductance-based systems. It is intended that this technique will enable more widespread capability for the calibration and verification of vacuum gauging in a pressure range of critical interest to many processes.
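One plausible reading of the tilt-null principle, sketched below, is a simple force balance: tilting the diaphragm adds a gravitational force component normal to it, and the nulling gas pressure is that force divided by an effective area. The effective mass, effective area and angle below are hypothetical values, and the actual MKS standard may differ in detail from this simplification.

```python
import math

G = 9.80665  # standard gravity, m/s^2

def null_pressure(m_eff_kg, a_eff_m2, tilt_rad):
    """Gas pressure that restores the diaphragm null at a given tilt.

    Assumes the restoring pressure balances the gravitational force
    component m_eff * g * sin(theta) acting normal to the diaphragm.
    """
    return m_eff_kg * G * math.sin(tilt_rad) / a_eff_m2

# Hypothetical example: 50 mg effective mass over 5 cm^2, tilted 10 degrees.
p = null_pressure(50e-6, 5e-4, math.radians(10))
print(f"{p:.3f} Pa")  # ~0.17 Pa, i.e. roughly 1.3 mTorr
```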
2:50 PM
VT-MoA-5 Measurement Performance of Capacitance Diaphragm Gages and Alternative Low-pressure Transducers
A. Miiller (National Institute of Standards & Technology) The measurement performance of low-pressure transducers is influenced by several factors. The most important of these for capacitance diaphragm gages (CDGs) are short-term instabilities in the zero-pressure readings, long-term shifts in the transducer calibration with time, and the effect of thermal transpiration at pressures below 100 Pa. A study of 29 CDGs of the type currently being used by calibration laboratories as transfer standards has shown that zero instabilities are strongly correlated with changes in room temperature. Repeat calibration data accumulated at NIST during the past 18 years have been analyzed. The analysis of nearly 300 calibration records indicates that the shifts in CDG response function with time are highly gage dependent and differ significantly for gages with different full-scale ranges, the largest shifts occurring for gages with the lowest full-scale range. Several hybrid CDG systems have been developed at NIST, using thermoelectric heating/cooling modules to regulate the CDG temperature. The data obtained demonstrate that this approach can improve the zero stability of CDGs and, when control is maintained near room temperature, minimize the effect of thermal transpiration. Low-pressure transducers based on other technologies (e.g., quartz-spiral tube, resonant structures, etc.), which have become available in ever-decreasing full-scale ranges (10 kPa or less), may provide a viable alternative to the CDG for selected measurement applications. Limited data obtained at NIST on their measurement performance will be presented as well.
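To make the thermal transpiration effect concrete, the sketch below evaluates the well-known molecular-flow limit, where the ratio of gauge pressure to chamber pressure approaches the square root of the temperature ratio. The 45 °C head temperature is an assumed example of a typical heated CDG; intermediate pressures require an interpolating correction (e.g., of the Takaishi-Sensui type), which is not reproduced here.

```python
import math

def transpiration_limit(t_gauge_k, t_chamber_k):
    """Limiting p_gauge / p_chamber ratio in the molecular-flow regime."""
    return math.sqrt(t_gauge_k / t_chamber_k)

# Assumed example: a CDG head regulated at 45 C reading a chamber at 23 C.
print(f"{transpiration_limit(318.15, 296.15):.4f}")  # ~1.036, a ~3.6% effect
```

Regulating the hybrid CDGs near room temperature, as described above, drives this ratio toward unity.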
3:10 PM
VT-MoA-6 Cold-Cathode Gauges for Ultra-High Vacuum Measurements
B. Kendall (ELVAC Laboratories); E. Drubetsky (Televac Division of The Fredericks Company) Eleven cold-cathode gauges have been evaluated on an ion-pumped UHV system operating at pressures down to the 10^-11 Torr range. The test gauges included magnetrons, inverted magnetrons and double inverted magnetrons from three different manufacturers, as well as experimental variable-geometry gauges built especially for this project. Spinning rotor and extractor gauges were used for calibration. The investigation covered output linearity over the 10^-10 Torr to 10^-4 Torr range, stability over periods up to 25,000 hours of low-pressure operation, tests for discontinuities in the current-pressure characteristics, stray magnetic field measurements, susceptibility to external magnetic fields, outgassing effects, and starting behavior at very low pressures. Our conclusion is that modern cold-cathode gauges are capable of giving far more accurate results than were possible with earlier Penning-type designs. Because of their extremely low outgassing rates and their relative freedom from X-ray and electron-stimulated desorption errors, they may in practice give results at low pressures which are more accurate than those obtained with typical hot-cathode gauges.
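One common way to quantify output linearity over many decades, not necessarily the authors' analysis, is to fit the discharge current to a power law I = k·p^n on log-log axes and inspect the exponent and residuals. The sketch below uses synthetic data purely for illustration.

```python
import numpy as np

def fit_power_law(pressures_torr, currents_a):
    """Least-squares fit of log10(I) = n*log10(p) + log10(k)."""
    n, log_k = np.polyfit(np.log10(pressures_torr), np.log10(currents_a), 1)
    return n, 10 ** log_k

# Synthetic (illustrative) current-pressure characteristic, I = k * p**n.
p = np.logspace(-10, -4, 7)   # 1e-10 ... 1e-4 Torr
i = 2.0 * p ** 1.1
n, k = fit_power_law(p, i)
print(f"exponent n = {n:.2f}, coefficient k = {k:.2f}")  # n ~ 1.10
```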
3:30 PM
VT-MoA-7 Linearization of the Spinning Rotor Gage Response for Pressures up to 100 Pa
J. Looney, J. Setina (National Institute of Standards & Technology) The Spinning Rotor Gage (SRG) determines the local gas pressure by measuring the rate of deceleration (or torque) on a freely suspended rotor. As the torque on a freely rotating sphere at low pressures (< 0.1 Pa) is directly proportional to the local gas density, the SRG exhibits a linear pressure response in this regime. However, in the high-pressure limit (~100 Pa) the torque on a freely rotating sphere depends only on the gas viscosity and geometrical factors, and hence the SRG loses all sensitivity to changes in the ambient pressure. In between these two limits, SRG controllers determine the pressure by the use of a linearization algorithm which is based on the work of Lindenau and Fremerey [1]. We have investigated the response of the SRG for pressures up to 100 Pa and have developed an improved algorithm for linearization of the SRG response which is straightforward to implement. The details of the observed gage behavior in the transitional regime, the accuracy of the linearization algorithm, and the overall accuracy of pressure measurements using an SRG in this regime will be discussed. [1] B. E. Lindenau and J. K. Fremerey, J. Vac. Sci. Technol. A 9 (1991) 2737.
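For context, the sketch below evaluates the standard molecular-regime SRG relation that the linearization extends toward 100 Pa. The rotor parameters and accommodation coefficient are typical assumed values, not those of the NIST gages, and the residual drag and transitional-regime corrections are omitted.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def srg_pressure_molecular(dcr, rotor_radius_m, rotor_density_kg_m3,
                           sigma_eff, molar_mass_kg, temperature_k):
    """Pressure from the relative deceleration rate DCR = -(dw/dt)/w,
    valid only in the molecular regime (below roughly 0.1 Pa)."""
    c_mean = math.sqrt(8 * R * temperature_k / (math.pi * molar_mass_kg))
    return (math.pi * rotor_radius_m * rotor_density_kg_m3 * c_mean
            / (10 * sigma_eff)) * dcr

# Assumed example: 4.5 mm steel rotor, nitrogen at 296 K, DCR = 4e-5 s^-1.
p = srg_pressure_molecular(4e-5, 2.25e-3, 7700, 1.0, 28.0e-3, 296)
print(f"{p:.3f} Pa")  # ~0.10 Pa
```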
3:50 PM
VT-MoA-8 A Precision Gas Flowmeter for Vacuum Calibration
P. Levine, J. Sweda (Lockheed Martin Missiles and Space) The Primary Standards Laboratory at Lockheed Martin Missiles and Space has developed an orifice-flow vacuum calibration station for in-house calibration of spinning rotor and ion gages. A flowmeter supplies gas to a vacuum chamber partitioned by an orifice of calculable conductance. The pressure generated within the chamber is defined by the flowrate and the orifice conductance for molecular flow conditions. The orifice conductance is calculated from dimensional measurements; the flowrate is determined by measuring the pressure drop at constant volume within the flowmeter. The design, development and operation of the flowmeter are fully explained and illustrated. Chamber pressures as predicted from flowrate measurements are compared to those measured using a spinning rotor gage calibrated at the National Institute of Standards and Technology. Data for the extension of the calibration range beyond that of spinning rotor gages will be presented. An analysis of uncertainties inherent to the flowmeter is also presented.
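The sketch below illustrates the relations an orifice-flow standard rests on: the generated pressure is the throughput divided by the molecular-flow conductance of the orifice (taken here as the ideal thin-aperture value C = c̄A/4, with no transmission-probability correction), and the throughput follows from the pressure drop at constant volume. The numerical values are illustrative assumptions, not the Lockheed Martin system's parameters.

```python
import math

R = 8.314462618  # J/(mol K)

def orifice_conductance_m3_s(diameter_m, molar_mass_kg, temperature_k):
    """Molecular-flow conductance of an ideal thin orifice, C = c_mean/4 * A."""
    c_mean = math.sqrt(8 * R * temperature_k / (math.pi * molar_mass_kg))
    area = math.pi * (diameter_m / 2) ** 2
    return c_mean / 4 * area

def throughput_pa_m3_s(volume_m3, dp_dt_pa_s):
    """Constant-volume flow measurement: Q = V * dp/dt."""
    return volume_m3 * dp_dt_pa_s

# Assumed example: 10 mm orifice, N2 at 296 K, 1 L flowmeter volume
# whose pressure falls at 0.05 Pa/s.
C = orifice_conductance_m3_s(10e-3, 28.0e-3, 296)
Q = throughput_pa_m3_s(1e-3, 0.05)
print(f"C = {C*1000:.2f} L/s, generated pressure = {Q / C:.2e} Pa")
```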
4:10 PM
VT-MoA-9 International Comparison of Leak Standards using Calibrated Capillary Leaks
S. Tison (National Institute of Standards & Technology); M. Bergoglio, G. Rumiano (Istituto di Metrologia "G. Colonnetti", Italy); P. Mohan, A. Gupta (National Physical Laboratory, India) Primary leak standards are maintained in industrial countries for the calibration of leak artifacts in support of national industries that require leakage quantification. Leak or low-flow quantification is a requirement for many applications in nondestructive testing of pressure vessels, chemical or nuclear containment vessels, sealed electronic devices, and vacuum systems. To support the requirements for quantitative leak testing, national metrological institutes such as the National Institute of Standards and Technology (NIST), the Istituto di Metrologia "G. Colonnetti" (IMGC), and the National Physical Laboratory, India (NPL) maintain primary leak standards for the calibration of leak artifacts. A comparison of the primary leak standards of these laboratories has been performed over a range of 7x10^-12 to 3x10^-9 mol/s with helium, and over more limited ranges with nitrogen and argon, by repeated calibration of two metal capillary leaks. Results of the comparisons of the primary standards show that the three laboratories agree to within a few percent over the range of the study.
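For readers more used to pV-throughput units, the sketch below converts the molar leak rates quoted above to Pa·L/s via the ideal gas law, Q = ṅRT. The 23 °C reference temperature is an assumption, not taken from the comparison.

```python
R = 8.314462618  # J/(mol K)

def mol_s_to_pa_l_s(molar_flow_mol_s, temperature_k=296.15):
    """Ideal-gas conversion: Q [Pa L/s] = n_dot * R * T * 1000."""
    return molar_flow_mol_s * R * temperature_k * 1000.0

for n_dot in (7e-12, 3e-9):
    print(f"{n_dot:.0e} mol/s  ->  {mol_s_to_pa_l_s(n_dot):.2e} Pa L/s")
# 7e-12 mol/s -> ~1.7e-5 Pa L/s;  3e-9 mol/s -> ~7.4e-3 Pa L/s
```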
4:30 PM
VT-MoA-10 Calibration of an Axial-symmetric Transmission Gauge in Ultrahigh and Extreme High Vacua
H. Akimichi, T. Arai, K. Takeuchi, Y. Tuzi (ULVAC Corporation, Japan); I. Arakawa (Gakushuin University, Japan) An axial-symmetric transmission gauge (AT gauge) is a new extractor-type ionization gauge with a Bessel-box-type energy filter. The filter, placed between the ionizer and the ion detector, eliminates the effects of electron stimulated desorption (ESD) ions and of the soft X-rays emitted from the ionizer surface. The optimization of the structures of the ionizer and the filter was carried out by comparing the results of computer simulation and experiment. At present, it is estimated that the gauge is applicable to pressure measurement at 10^-12 Pa or lower. We describe here the sensitivities of an AT gauge for hydrogen, measured by the conductance modulation method (CMM) with a variable conductance over the pressure range from 10^-10 to 10^-6 Pa. The CMM was originally developed for the measurement of pumping speed and was later used for gauge calibration. The apparent sensitivity of the gauge, averaged over the whole pressure range, was about (2.4±0.2)x10^-3 Pa^-1. It was observed that the sensitivity increased with decreasing pressure. This pressure dependence is presumed to be caused by outgassing from the ion detector (CERATRON) and the ionizer. The outgassing rate from the ion detector was much higher than that from the ionizer, and the main gas component released by ion bombardment was methane. As the outgassing rate depends on the electron current and the ion current, the rates from the ionizer and from the ion detector can be estimated separately through measurements at various electron currents. The intrinsic sensitivity of the gauge, corrected for the effects of outgassing, is (2.3±0.1)x10^-3 Pa^-1 and is constant over the pressure range from 10^-10 to 10^-6 Pa.
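The sensitivity values quoted above follow the standard ionization-gauge definition S = I_ion / (I_e · p), in Pa^-1. The sketch below evaluates that definition with hypothetical currents and pressure; the outgassing correction applied in the paper (separating ionizer and ion-detector contributions by varying the electron current) is not reproduced here.

```python
def gauge_sensitivity_per_pa(ion_current_a, electron_current_a, pressure_pa):
    """Ionization-gauge sensitivity S = I_ion / (I_e * p), in Pa^-1."""
    return ion_current_a / (electron_current_a * pressure_pa)

# Hypothetical example: 2.4e-12 A of collected ions at 1 mA emission
# and 1e-6 Pa of hydrogen.
S = gauge_sensitivity_per_pa(2.4e-12, 1e-3, 1e-6)
print(f"S = {S:.2e} Pa^-1")  # 2.40e-03 Pa^-1, the order quoted above
```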