Webster E.,New Zealand Institute for Industrial Research and Development | White D.R.,New Zealand Institute for Industrial Research and Development
Metrologia | Year: 2015

The inhomogeneities within a thermocouple influence the measured temperature and are typically the largest component of measurement uncertainty. Currently there is no accepted best practice for measuring the inhomogeneities or for forecasting their effects on real-world measurements. The aim of this paper is to provide guidance on the design and performance assessment of thermocouple inhomogeneity scanners by characterizing the qualitative performance of the various designs reported in the literature, and developing a quantitative measure of scanner resolution. Numerical simulations incorporating Fourier transforms and convolutions are used to gauge the levels of attenuation and distortion present in single- and double-gradient scanners. Single-gradient scanners are found to be far superior to double-gradient scanners, which are unsuitable for quantitative measurements due to their blindness to inhomogeneities at many spatial frequencies and severe attenuation of signals at other frequencies. It is recommended that the standard deviation of the temperature gradient within the scanner be used as a measure of the scanner resolution and spatial bandwidth. Recommendations for the design of scanners are presented, and include advice on the basic design of scanners, the media employed, operating temperature, scan rates, construction of survey probes, data processing, gradient symmetry, and the spatial resolution required for research and calibration applications. © 2015 BIPM & IOP Publishing Ltd.
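The frequency-domain comparison of the two scanner types can be illustrated with a small sketch. Note this is not the paper's model: the Gaussian gradient profile, widths, and separation below are illustrative assumptions. A single-gradient scanner acts as a low-pass filter on the Seebeck inhomogeneity, while a double-gradient scanner, which differences two opposing gradients separated by a distance d, is additionally blind at spatial frequencies that are multiples of 1/d.

```python
import math

def single_gradient_response(f, w):
    """|H(f)| for one Gaussian temperature gradient of rms width w (metres),
    at spatial frequency f (cycles/metre): a plain low-pass roll-off.
    The Gaussian profile is an illustrative assumption."""
    return math.exp(-2.0 * (math.pi * f * w) ** 2)

def double_gradient_response(f, w, d):
    """|H(f)| for two opposing gradients separated by d (metres): the
    difference of two shifted copies multiplies the single-gradient
    response by 2|sin(pi f d)|, producing nulls (blind spots) at f = n/d."""
    return 2.0 * abs(math.sin(math.pi * f * d)) * single_gradient_response(f, w)
```

At f = 1/d the double-gradient response is zero regardless of the inhomogeneity amplitude, which is the blindness at many spatial frequencies noted in the abstract; the single-gradient scanner only attenuates smoothly.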

White D.R.,New Zealand Institute for Industrial Research and Development | Tew W.L.,U.S. National Institute of Standards and Technology
International Journal of Thermophysics | Year: 2010

In 2006, the CIPM clarified the definition of the kelvin by specifying the isotopic composition of the water to be used in the realization of the triple point. At the same time, the Consultative Committee for Thermometry gave recommended values for the isotopic correction constants to be used for water departing from the specified composition. However, the uncertainties in the values for the correction constants were undesirably large due to unresolved differences between the data sets from which the values were determined. This paper derives improved values of the constants by considering additional data from isotopic fractionation measurements and the heats of fusion and freezing points of the relevant water isotopologues. Values of the corrections determined from the expanded data are A_D = 671(10) μK, A_18O = 603(3) μK, and A_17O = 60(1) μK. A typical correction made with these values lies just within the expanded uncertainty (k = 2) of the corrections made with the older values, but has about half the uncertainty. © Springer Science+Business Media, LLC 2010.
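The correction enters as a linear combination of the isotopic deltas relative to VSMOW (Vienna Standard Mean Ocean Water), weighted by the constants above. A minimal sketch, treating the linear form as illustrative of the CCT-recommended equation:

```python
# Correction constants from the paper, in microkelvin per unit isotopic
# delta relative to VSMOW.
A_D, A_18O, A_17O = 671.0, 603.0, 60.0

def tpw_isotopic_correction_uK(delta_D, delta_18O, delta_17O):
    """Correction (microkelvin) applied to a triple-point-of-water
    realization for water departing from the VSMOW composition.
    The deltas are dimensionless isotopic abundances relative to VSMOW."""
    return A_D * delta_D + A_18O * delta_18O + A_17O * delta_17O
```

Water of exactly VSMOW composition (all deltas zero) needs no correction; isotopically depleted waters have negative deltas and hence a negative correction of tens of microkelvin.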

White D.R.,New Zealand Institute for Industrial Research and Development
Metrologia | Year: 2012

This paper discusses the effects of non-linearity, some of the mechanisms responsible for non-linearity, and methods for measuring non-linearity in Johnson noise thermometry. Mechanisms considered include quantum tunnelling, bipolar junction transistor and junction field-effect transistor amplifiers, feedback, clipping, output-stage crossover, quantization and dither. It is found that even- and odd-order effects behave differently in correlator-based noise thermometers, with the dominant even-order effects contributing as intermodulation products whereas the dominant odd-order contributions are third-order and at the same frequencies as the parent signals. Possible test methods include the use of discrete tones, changes in spectral shape, and direct measurement using reference noise powers. For correlators operated at constant noise power, direct measurement of non-linearity using reference noise powers enables corrections to be made with negligible additional uncertainty and measurement time. © 2012 BIPM & IOP Publishing Ltd.
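The distinct behaviour of odd-order products can be checked numerically. The sketch below, with arbitrary illustrative frequencies and amplitudes, passes two unit tones through a weak cubic non-linearity y = x + ε·x³ and projects out the third-order intermodulation product at 2f1 − f2, whose amplitude works out to (3/4)·ε:

```python
import math

def tone_amplitude(signal, f, fs):
    """Amplitude of the component at frequency f (Hz) by projection onto a
    sinusoid; exact when f spans an integer number of cycles of the record."""
    n = len(signal)
    c = sum(x * math.cos(2 * math.pi * f * k / fs) for k, x in enumerate(signal))
    s = sum(x * math.sin(2 * math.pi * f * k / fs) for k, x in enumerate(signal))
    return 2.0 * math.hypot(c, s) / n

fs, n, f1, f2, eps = 1000, 1000, 50.0, 70.0, 0.01
x = [math.sin(2 * math.pi * f1 * k / fs) + math.sin(2 * math.pi * f2 * k / fs)
     for k in range(n)]
y = [v + eps * v ** 3 for v in x]          # weak third-order non-linearity
im3 = tone_amplitude(y, 2 * f1 - f2, fs)   # intermodulation product at 30 Hz
```

The cubic term also modifies the amplitudes at f1 and f2 themselves (by a factor 1 + 2.25·ε for unit tones), illustrating the abstract's point that odd-order contributions appear at the same frequencies as the parent signals.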

White D.R.,New Zealand Institute for Industrial Research and Development
Metrologia | Year: 2016

Measurement uncertainty is a measure of the quality of a measurement; it enables users of measurements to manage the risks and costs associated with decisions influenced by measurements, and it supports metrological traceability by quantifying the proximity of measurement results to true SI values. The Guide to the Expression of Uncertainty in Measurement (GUM) ensures uncertainty statements meet these purposes and encourages the world-wide harmony of measurement uncertainty practice. Although the GUM is an extraordinarily successful document, it has flaws, and a revision has been proposed. Like the already-published supplements to the GUM, the proposed revision employs objective Bayesian statistics instead of frequentist statistics. This paper argues that the move away from a frequentist treatment of measurement error to a Bayesian treatment of states of knowledge is misguided. The move entails changes in measurement philosophy, a change in the meaning of probability, and a change in the object of uncertainty analysis, all leading to different numerical results, increased costs, increased confusion, a loss of trust, and, most significantly, a loss of harmony with current practice. Recommendations are given for a revision in harmony with the current GUM and allowing all forms of statistical inference. © 2016 BIPM & IOP Publishing Ltd.
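One concrete instance of the "different numerical results" claim is the Type A evaluation of n repeated readings: the frequentist standard uncertainty of the mean is s/√n, whereas the objective-Bayesian treatment of GUM Supplement 1 gives the standard deviation of a scaled-and-shifted t posterior, larger by the factor √((n − 1)/(n − 3)). A minimal sketch of that comparison:

```python
import math
import statistics

def type_a_frequentist(data):
    """Frequentist standard uncertainty of the mean: s / sqrt(n)."""
    return statistics.stdev(data) / math.sqrt(len(data))

def type_a_objective_bayes(data):
    """Standard deviation of the scaled-and-shifted t posterior with
    n - 1 degrees of freedom (GUM Supplement 1 treatment); needs n > 3."""
    n = len(data)
    return statistics.stdev(data) / math.sqrt(n) * math.sqrt((n - 1) / (n - 3))
```

For n = 5 the Bayesian value is √2 ≈ 1.41 times the frequentist one, so the two treatments disagree most where laboratories most often operate: small numbers of repeats.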

Saunders P.,New Zealand Institute for Industrial Research and Development
International Journal of Thermophysics | Year: 2014

A detailed analysis of the double-wavelength radiation thermometry technique for determining the thermodynamic temperature is presented. This technique provides an alternative method to absolute filter radiometry without the requirement of traceability to the watt. The analysis derives an algebraic expression for the uncertainties in the temperatures measured with the double-wavelength technique, which shows that the optimum strategy is to employ one narrowband and one broadband spectral responsivity, and that the center wavelengths do not need to be widely separated. With current best estimates for signal and spectral responsivity measurements, it is shown that the double-wavelength method can achieve total uncertainties only about four times larger than the current best absolute radiometric methods. Improvements in the signal measurement in the future could possibly reduce the total uncertainty to a level comparable to absolute radiometry. © 2014 Springer Science+Business Media New York.
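The paper's method pairs one narrowband and one broadband spectral responsivity; the sketch below reduces this to the simplest two-monochromatic-wavelength (ratio) case under the Wien approximation, purely to show how a temperature follows from a signal ratio with no traceability to the watt. All wavelengths and temperatures are illustrative values, not the paper's.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, T):
    """Spectral radiance under the Wien approximation (up to a constant
    factor that cancels in the ratio): lam^-5 * exp(-C2 / (lam * T))."""
    return lam ** -5 * math.exp(-C2 / (lam * T))

def temperature_from_ratio(ratio, lam1, lam2):
    """Invert the Wien-approximation radiance ratio L(lam1)/L(lam2) for T."""
    return C2 * (1.0 / lam1 - 1.0 / lam2) / math.log((lam2 / lam1) ** 5 / ratio)
```

Because only the ratio of the two signals enters, any common absolute scale factor cancels, which is the sense in which the technique avoids absolute filter radiometry.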

White D.R.,New Zealand Institute for Industrial Research and Development
International Journal of Thermophysics | Year: 2015

It is well known that a single negative-temperature-coefficient thermistor can be linearized over a narrow temperature range by connecting a single resistor in parallel with the thermistor. With the linearizing resistor properly chosen for the operating temperature, the residual errors are proportional to the cube of the temperature range and have a peak value of about 0.2 °C for a 30 °C range. A greater range of temperatures can be covered or greater linearity be achieved by cascading thermistor–resistor combinations. This paper investigates the limits of the linearity performance of such networks by using interpolation to model their behavior. A simple formula is derived for estimating the residual non-linearity as a function of the number of thermistors, the temperature range covered by the network, and the constant characterizing the exponential temperature dependence of the thermistors. Numerical simulations are used to demonstrate the validity of the formula. Guidelines are also given for circuit topologies for realizing the networks, for optimizing the design of the networks, and for calculating the sensitivities to relative errors in the component values. © 2015, Springer Science+Business Media New York.
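The single-thermistor case in the first sentence can be sketched as follows. The parallel-resistor formula is the standard inflection-point choice for an exponential NTC characteristic; R0, B, and T0 are illustrative nominal values, not taken from the paper:

```python
import math

def ntc_resistance(T, R0=10_000.0, B=3500.0, T0=298.15):
    """NTC thermistor: R = R0 * exp(B * (1/T - 1/T0)), T in kelvin."""
    return R0 * math.exp(B * (1.0 / T - 1.0 / T0))

def linearizing_parallel_resistor(Tc, R0=10_000.0, B=3500.0, T0=298.15):
    """Parallel resistor placing the inflection point of the combined
    resistance at the centre temperature Tc (standard textbook choice)."""
    return ntc_resistance(Tc, R0, B, T0) * (B - 2.0 * Tc) / (B + 2.0 * Tc)

def network_resistance(T, Rp, **kw):
    """Thermistor in parallel with the linearizing resistor Rp."""
    R = ntc_resistance(T, **kw)
    return R * Rp / (R + Rp)
```

Placing the inflection at the centre of the operating range makes the second-order error vanish there, leaving the cubic residual the abstract describes.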

White D.R.,New Zealand Institute for Industrial Research and Development | Mason R.S.,New Zealand Institute for Industrial Research and Development
International Journal of Thermophysics | Year: 2011

Conventionally, the metal fixed points of indium, tin, zinc, aluminum, and silver are realized with two solid-liquid interfaces: one on the inner surface of the outside wall of the crucible and one around the thermometer well. Investigations into the practicality of inducing a single interface around the thermometer well suggest that the Gibbs-Thomson effect, due to the interface-surface curvature associated with multiple small nucleation sites on the thermometer well, is capable of causing variations of several hundred microkelvin. Fixed-point initiation methods must therefore ensure that a complete solid-liquid interface is formed around the thermometer well. The experiments show that a single interface is satisfactory, and that there is no evidence for dendrite growth causing thermal coupling between the furnace and the thermometer. © 2010 Springer Science+Business Media, LLC.
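The order of magnitude of the Gibbs-Thomson shift for a curved solid-liquid interface can be estimated from ΔT ≈ 2γT_m/(ρ_s L_f r). The material values below are rough illustrative numbers for tin (interfacial energy, melting point, solid density, latent heat), not data from the paper:

```python
def gibbs_thomson_depression(r, gamma=0.06, T_m=505.08, rho_s=7.3e3, L_f=6.0e4):
    """Approximate melting-point depression (kelvin) at an interface of
    curvature radius r (metres): dT = 2 * gamma * T_m / (rho_s * L_f * r).
    gamma (J/m^2), rho_s (kg/m^3), and L_f (J/kg) are rough values for tin."""
    return 2.0 * gamma * T_m / (rho_s * L_f * r)
```

With these numbers, interface radii of a fraction of a millimetre give depressions of a few hundred microkelvin, which is consistent in scale with the variations reported above; the 1/r dependence is why many small nucleation sites matter.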

White D.R.,New Zealand Institute for Industrial Research and Development | Fischer J.,Physikalisch - Technische Bundesanstalt
Metrologia | Year: 2015

The 26th General Conference on Weights and Measures (CGPM) will redefine the kelvin in 2018 by fixing the value of the Boltzmann constant, if the plans of the International Committee for Weights and Measures (CIPM) are fulfilled. This will replace the current definition of the kelvin, based on the triple point of water (TPW), which has been in use since 1954. The change in definition is not expected to make an immediate or significant difference to measurement practice, except perhaps at very low and very high temperatures, but it will enable much greater improvements in practical thermometry in the long term.

Saunders P.,New Zealand Institute for Industrial Research and Development
AIP Conference Proceedings | Year: 2013

The majority of general-purpose low-temperature handheld radiation thermometers are severely affected by the size-of-source effect (SSE). Calibration of these instruments is pointless unless the SSE is accounted for in the calibration process. Traditional SSE measurement techniques, however, are costly and time consuming, and because the instruments are direct-reading in temperature, traditional SSE results are not easily interpretable, particularly by the general user. This paper describes a simplified method for measuring the SSE, suitable for second-tier calibration laboratories and requiring no additional equipment, and proposes a means of reporting SSE results on a calibration certificate that should be easily understood by the non-specialist user. © 2013 AIP Publishing LLC.
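In the common direct method, the SSE is simply the instrument signal as a function of source aperture diameter, normalized to the signal at a reference diameter. A minimal sketch of that definition (the paper's simplified procedure and certificate reporting format are not reproduced here):

```python
def sse_curve(signals, d_ref):
    """Direct-method size-of-source effect: measured signal versus source
    diameter, normalized to the signal at the reference diameter d_ref.
    `signals` maps aperture diameter (e.g. mm) -> measured signal."""
    s_ref = signals[d_ref]
    return {d: s / s_ref for d, s in sorted(signals.items())}
```

A thermometer whose SSE curve rises above 1 for diameters larger than the calibration source reads high on large targets, which is why an uncorrected calibration is of little value to the end user.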

White R.,New Zealand Institute for Industrial Research and Development
Accreditation and Quality Assurance | Year: 2011

The latest version of the International Vocabulary of Metrology gives a meaning of measurement restricted to quantities that can be represented by numerical values and placed in an ordinal sequence. This restrictive definition fits poorly with both the colloquial and the wider scientific understanding of measurement. This paper suggests an extension to the metrological definition of measurement, based on the measurement classification scheme of Stevens, to incorporate non-numerical and nominal measurements. The more inclusive definition and the classification scheme offer insights into the utility, metrological traceability, and limitations of measurements and uncertainty treatments, and enable clarification of other measurement-related definitions. © 2010 Springer-Verlag.
