Measurement Laboratory

Taiyuan, China


News Article | May 17, 2017
Site: www.prnewswire.com

"Dr. Locascio brings just what our research enterprise needs to thrive in this increasingly competitive environment," said University of Maryland President Wallace D. Loh. "In addition to her great scholarship and innovation, she has a remarkable ability to create strategic partnerships. Laurie will lead us into a vibrant research future." Locascio is currently acting as the Principal Deputy Director and Associate Director for Laboratory Programs at NIST providing leadership and operational guidance for NIST's seven scientific and mission-focused laboratories. "It is an honor to be joining one of the nation's top research universities to spearhead transformative work," said Locascio. "I look forward to leading UMD's research enterprise of more than a half a billion dollars directed at groundbreaking work taking place across campus." Previously, Locascio directed the Material Measurement Laboratory (MML), one of NIST's largest scientific labs, overseeing 1,000 research staff in eight locations around the U.S. and a $170M annual budget. As MML Director, she recruited top talent, fostered excellence, and built a collegial and collaborative workplace. She implemented strategic partnerships with universities, industry, and other government labs, including a partnership with UMD's Institute for Bioscience and Biotechnology Research at Shady Grove. Prior to that, Locascio served as chief of the Biochemical Sciences Division in the MML. Locascio's most recent honors and awards include the 2017 American Chemical Society Earle B. Barnes Award for Leadership in Chemical Research Management, and the 2017 Washington Academy of Sciences Special Award in Scientific Leadership. A Fellow of the American Chemical Society, of the American Institute for Medical and Biological Engineering, and of the Washington Academy of Sciences, she has published 115 scientific articles and has been awarded 11 patents. Locascio received a B.S. in chemistry from James Madison University, a M.S. in bioengineering from the University of Utah, and a Ph.D. in toxicology from the University of Maryland, Baltimore. About University of Maryland The University of Maryland, College Park is the state's flagship university and one of the nation's preeminent public research universities. A global leader in research, entrepreneurship and innovation, the university is home to more than 37,000 students, 9,000 faculty and staff, and 250 academic programs. Its faculty includes three Nobel laureates, three Pulitzer Prize winners, 56 members of the national academies and scores of Fulbright scholars. The institution has a $1.9 billion operating budget, and secures $560 million annually in external research funding. For more information about the University of Maryland, College Park, visit www.umd.edu. To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/university-of-maryland-names-laurie-e-locascio-vice-president-for-research-300459259.html


That's why the properties of light reflected at different angles and intensities are enormously important, for example, to the way cars are painted, fabrics are made, plastics and coatings are colored, printed materials are produced, optical systems are engineered, and remote-sensing images are created and interpreted, to name only a few. Moreover, instruments deployed for many public service and national security uses—such as satellite monitoring of weather and natural disasters, or detecting activity by potential adversaries—contain components that must be precisely calibrated against a known standard to ensure accurate characterization of the effects of light reflectance and scattering.

NIST maintains the national scale for reflectance, and now is about to launch a dramatically improved system for measuring the intensity and spectrum of light reflected and scattered off samples as large as 30 cm square in practically any direction. Called the Robotic Optical Scattering Instrument (ROSI), it will offer new capabilities that are in increasing demand by industry and science, but were not previously available at NIST.

"We began work to develop this system years ago," says project leader Heather Patrick of NIST's Physical Measurement Laboratory, "arriving at progressively more sophisticated designs before the current version of ROSI. By the middle of 2017 we expect to make the first set of functions available to customers. About a year later, all of the system's features will be fully operational."

Perhaps most importantly, ROSI will permit both in-plane and out-of-plane measurements. (See diagram below.) In the former, the light source, the sample, and the receiver are all in the same plane; in the latter, the receiver is in a different plane. Out-of-plane measurements were not possible with ROSI's predecessor, NIST's workhorse Spectral Tri-function Automated Reference Reflectometer (STARR).

"But they are important to anyone who does reflectance measurements in the field, such as those who study ocean color or infrared monitoring of heat signatures," says NIST scientist Catherine Cooksey. "In addition, they are important to 'gonioapparent' coatings—that is, coatings that reflect different colors depending on the direction of viewing or illumination, like iridescent colors. For example, car paint that seems to change colors as you move around the car."

In addition, ROSI extends the range of wavelengths from the ultraviolet to the near-infrared, and can provide 100 times more incident light than STARR, enabling detailed measurements of samples with low reflectance and opening new possibilities for research.

The ROSI system combines three integrated, fully automated components (see diagram). One is a laser-based light source that can be tuned to a specific desired color, intensity, and polarization before it is focused on the sample to be studied. The beam makes a spot 1 cm in diameter when shining exactly perpendicular to the surface of the sample, but broadens to an ever-wider and fainter ellipse as the angle between the source and sample becomes increasingly oblique. The sample is mounted on the end of ROSI's second major component, a 6-axis robotic arm that can move the sample into almost any angle with respect to the beam. The third component is the receiver, which detects the amount of light scattered from the sample at a specific viewing angle. The receiver can be moved around the axis of the robot arm, a design that facilitates out-of-plane measurements.
That capacity is critically important for characterizing the materials that satellite systems such as the long-running Landsat series—which maps changes in the global landscape—use to calibrate their on-board sensors. Those devices receive reflected and scattered light arriving from a wide array of angles, and the quality of observations depends on understanding the way that those angles affect the signal. The impact is large: since Landsat 8's launch in 2013, more than 30 million images have been downloaded from the program site. NASA provides many of its other satellite projects with reflectance measurements, and NASA's reflectance scale is traceable to the national reflectance scale through yearly NIST calibrations.

ROSI is designed to make three kinds of measurements. The least complex involves mirror samples, in which nearly all the incident light is reflected at one angle. That is the first capability that will come online later this year. The second kind goes by the name of bidirectional reflectance distribution function (BRDF), which basically means that both the angle at which the incident light strikes the sample and the angle between the sample and the receiver can be separately adjusted to measure how the changes modify the properties of the reflected/scattered light. Finally, ROSI will be able to produce "hemispherical" measurements in which light reflected from the sample is recorded at numerous points constituting a complete hemisphere and producing a comprehensive data set.

"This new facility offers NIST's customers an expanded capability, and NIST itself with extended research potential," Cooksey says. "Previously, we could only measure within a single plane. Now we can measure full hemispherical space above a sample point by point with significantly increased intensity of incident light. This increases the types of materials we can now measure, such as coatings with interesting BRDFs or very dark, black samples."

Explore further: Understanding the impact of snow's reflectance
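For readers unfamiliar with the quantity, the BRDF has a compact definition: reflected radiance divided by incident irradiance, in units of inverse steradians. The sketch below is a hypothetical illustration, not NIST's data pipeline; the function name and the numbers are invented for the example. It shows how a single BRDF value can be estimated from the incident beam power, the power collected at the receiver, the receiver's solid angle, and the viewing angle.

```python
import numpy as np

# Hypothetical single-configuration BRDF estimate (not ROSI software).
# For a goniometric setup, the BRDF can be approximated as
#     f_r = P_reflected / (P_incident * Omega_receiver * cos(theta_r))
# where Omega_receiver is the solid angle subtended by the receiver
# and theta_r is the viewing angle measured from the sample normal.

def brdf_estimate(p_incident_w, p_reflected_w, receiver_solid_angle_sr, theta_r_deg):
    """Single-configuration BRDF estimate, in inverse steradians."""
    cos_theta = np.cos(np.radians(theta_r_deg))
    return p_reflected_w / (p_incident_w * receiver_solid_angle_sr * cos_theta)

# Example: a near-Lambertian white sample viewed 45 degrees off-normal.
# A perfectly diffuse reflector of reflectance 0.99 would give
# f_r = 0.99 / pi, roughly 0.315 per steradian.
f_r = brdf_estimate(p_incident_w=1.0e-3,           # 1 mW incident beam (assumed)
                    p_reflected_w=2.2e-9,           # power at the receiver (assumed)
                    receiver_solid_angle_sr=1.0e-5, # assumed receiver aperture
                    theta_r_deg=45.0)
print(f"Estimated BRDF: {f_r:.3f} 1/sr")            # ~0.311, close to 0.99/pi
```

Full BRDF characterization, as the article describes, repeats this measurement over many incidence and viewing geometries, including the out-of-plane ones that ROSI adds.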


News Article | August 29, 2016
Site: phys.org

It won't be a minute too soon. The ampere (A) has long been a sort of metrological embarrassment. For one thing, its 70-year-old formal definition, phrased as a hypothetical, cannot be physically realized as written: "The ampere is that constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed 1 meter apart in vacuum, would produce between these conductors a force equal to 2 × 10⁻⁷ newton per meter of length."

For another, the amp's status as a base unit is problematic. It is the only electrical unit among the seven SI base units, so you might logically expect that all other electrical units, including the volt and the ohm, would be derived from it. But that's not the case. In fact, the only practical way to realize the ampere to a suitable accuracy now is by measuring the nominally "derived" volt and ohm using quantum electrical standards and then calculating the ampere from those values.

In 2018, however, the ampere is slated to be redefined in terms of a fundamental invariant of nature: the elementary electrical charge (e). Direct ampere metrology will thus become a matter of counting the transit of individual electrons over time.

One promising way to do so is with a nanoscale technique called single-electron transport (SET) pumping. Specially adapted at NIST for this application, it involves applying a gate voltage that prompts one electron from a source to tunnel across a high-resistance junction barrier and onto an "island" made from a microscopic quantum dot. The presence of this single extra electron on the dot electrically blocks any other electron from tunneling across until a gate voltage induces the first electron to move off the island, through another barrier, and into a drain. When the voltage returns to its initial value, another electron is allowed to tunnel onto the island; repeating this cycle generates a steady, measurable current of single electrons.

There can be multiple islands in a very small space. The distance from source to drain is a few micrometers, and the electron channels are a few tens of nanometers wide and 200 nm to 300 nm long. And the energies involved are so tiny that the device has to be cooled to about 10 millikelvin in order to control and detect them reliably.

Conventional, metallic SET devices, says NIST quantum-ampere project member Michael Stewart, can move and count single electrons with an uncertainty of a few parts in 10⁸—in the uncertainty range of other electrical units—at a rate of tens of millions of cycles per second. "But the current in a single SET pump is on the order of picoamperes [10⁻¹² A]," he says, "and that's many orders of magnitude too low to serve as a practical standard."

So Stewart, colleague Neil Zimmerman, and co-workers are experimenting with ways to produce a current 10,000 times larger. By using all-silicon components instead of conventional metal/oxide materials, they believe that they will be able to increase the frequency at which the pump can be switched into the gigahertz range. And by running 100 pumps in parallel and combining their output, the researchers anticipate getting to a current of about 10 nanoamperes (10⁻⁹ A). Another innovation under development may allow them to reach a microampere (10⁻⁶ A), in the range that is needed to develop a working current standard.
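The scaling arithmetic here is simple: an ideal pump transfers exactly one elementary charge per cycle, so the output current is I = N·e·f for N parallel pumps switched at frequency f. A minimal sketch, with values taken from the figures quoted above:

```python
E_CHARGE = 1.602176634e-19  # elementary charge e, in coulombs

def set_pump_current(n_pumps, frequency_hz):
    """Ideal single-electron-pump current: one electron per pump per cycle."""
    return n_pumps * E_CHARGE * frequency_hz

# One metallic pump at tens of MHz sits in the picoampere range:
print(set_pump_current(1, 50e6))    # ~8e-12 A
# 100 silicon pumps in parallel at 1 GHz reach the 10-nanoampere scale:
print(set_pump_current(100, 1e9))   # ~1.6e-8 A
```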
"At present, we are testing three device configurations of different complexity," Stewart says, "and we're trying to balance the fabrication difficulties with how accurate they can be." In addition to its use as an electrical current standard, a low-uncertainty, high-throughput SET pump would have two other significant benefits. The first is that it might be combined with ultra-miniature quantum standards for voltage or resistance into a single, quantum-based measurement suite that could be delivered to factory floors and laboratories. The overall effort to provide such standards for all the SI base units is known as "NIST-on-a-Chip," and is an ongoing priority of NIST's Physical Measurement Laboratory. The other advantage is that an SET pump could be used in conjunction with voltage and resistance standards to test Ohm's Law. Dating from the 1820s, it states that the amount of current (I) in a conductor is equal to the voltage (V) divided by the resistance (R): I=V/R. This relationship has been the basis for countless millions of electrical devices over the past two centuries. But metrologists are interested in testing Ohm's law with components which rely on fundamental constants. An SET pump could provide an all-quantum mechanical environment for doing so. In a separate effort, scientists at NIST's Boulder location are experimenting with an alternative technology that determines current by measuring the quantum "phase-slips" they engender while traveling through a very narrow superconducting wire. That work will be the subject of a later report.


News Article | February 27, 2017
Site: phys.org

Solitons can arise in the quantum world as well. At most temperatures, gas atoms bounce around like billiard balls, colliding with each other and rocketing off into random directions, following the rules of classical physics. Near absolute zero, however, certain kinds of atoms suddenly start behaving according to the very different rules of quantum mechanics, and begin a kind of coordinated dance. Under pristine conditions, solitons can emerge inside these ultracold quantum fluids, surviving for several seconds.

Curious about how solitons behave in less-than-pristine conditions, scientists at NIST's Physical Measurement Laboratory, in collaboration with researchers at the Joint Quantum Institute (JQI), have added some stress to a soliton's life. They began by cooling down a cloud of rubidium atoms. Right before the gas could take on uniform properties and become a homogeneous quantum fluid, a radio-frequency magnetic field coaxed a handful of these atoms into retaining their classical, billiard-ball-like state. Those atoms are, in effect, "impurities" in the atomic mix.

The scientists then used laser light to push apart atoms in one region of the fluid, creating a solitary wave of low density—a "dark" soliton. In the absence of impurities, this low-density region pulses stably through the ultracold fluid. But when atomic impurities are present, the dark soliton behaves as if it were a heavy particle, with lightweight impurity atoms bouncing off of it. These collisions make the dark soliton's movement more random. This effect is reminiscent of Einstein's 1905 predictions about randomized particle movement, dubbed Brownian motion.

Guided by this framework, the scientists also expected the impurities to act like friction and slow down the soliton. But surprisingly, dark solitons do not completely follow Einstein's rules. Instead of dragging down the soliton, collisions accelerated it to the point of destabilization. The soliton's speed limit is set by the speed of sound in the quantum fluid, and upon exceeding that limit it exploded into a puff of sound waves.

This behavior made sense only after researchers changed their mathematical perspective and remembered to treat the soliton as though it has a negative mass. This is a quirky phenomenon that arises for certain collective behaviors of many-particle systems. Here the negative mass is manifested by the soliton's darkness—it is a dip in the quantum fluid rather than a tall tsunami-like pulse. Particles with negative mass respond to friction forces opposite to their ordinary cousins, speeding up instead of slowing down.

"All those assumptions about Brownian motion ended up going out the window. None of it applied," says Hilary Hurst, a graduate student at JQI and lead theorist on the paper. "But at the end we had a theory that described this behavior very well, which is really nice." Lauren Aycock, lead author on the paper, lauded what she saw as particularly strong feedback between theory and experiment, adding that "it's satisfying to have this kind of successful collaboration, where measurement informs theory, which then explains experimental results."

Solitons in the land of ultracold atoms are intriguing, say Aycock and Hurst, because they are as close as you can get to observing the interface between quantum effects and the ordinary physics of everyday life. Experiments like this may help answer a deep physics riddle: where is the boundary between classical and quantum?
In addition, this result may cast light on a similar problem with solitons in optical fibers, where random noise can disrupt the precise timing needed for communication over long distances.

Explore further: Ultra-cold atoms may wade through quantum friction

More information: Lauren M. Aycock et al. Brownian motion of solitons in a Bose–Einstein condensate, Proceedings of the National Academy of Sciences (2017). DOI: 10.1073/pnas.1615004114
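The sign flip is easy to see in a toy Langevin model of Brownian motion. This is a deliberately simplified sketch, not the paper's actual simulation: in m dv/dt = -γv + noise, dividing the drag term by a negative mass turns damping into anti-damping, so "friction" feeds energy into the object until it hits a speed limit.

```python
import numpy as np

# Toy model (illustration only): a Brownian particle obeying
#     m dv/dt = -gamma * v + noise.
# For m > 0 the drag term opposes the velocity; for m < 0 its sign
# flips, so collisions accelerate the object instead of slowing it.

rng = np.random.default_rng(0)

def langevin_final_speed(mass, gamma=1.0, noise=0.05, v0=0.1, dt=1e-3, steps=5000):
    v = v0
    for _ in range(steps):
        v += (-gamma * v / mass) * dt + noise * np.sqrt(dt) * rng.normal() / abs(mass)
        if abs(v) > 1.0:   # crude stand-in for the sound-speed limit,
            break          # where a dark soliton decays into sound waves
    return abs(v)

print("positive mass:", langevin_final_speed(mass=+1.0))  # decays toward the noise floor
print("negative mass:", langevin_final_speed(mass=-1.0))  # grows until the cutoff
```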




News Article | February 23, 2017
Site: phys.org

At roughly 11 feet on a side (3.3 x 3.3 x 3.4 meters) and almost 20,000 pounds (about 9,000 kilograms), the device – a model called the Xenos, made by the German company Zeiss – takes up roughly half the volume of the laboratory space. With less than 100 mm (not quite 4 inches) of overhead clearance, it nearly scrapes the ceiling. Getting the instrument into the underground lab on NIST's Gaithersburg, Md., campus took "some creativity and a lot of patience," says Vincent Lee of NIST's Physical Measurement Laboratory (PML). He and his colleagues knew it was too big for the route they normally use to install heavy equipment. "So we had to improvise and lower it down the ventilation shaft in the building," Lee says. They also had to knock a wall out of the room and bridge a half-meter gap between the outside and inside floors.

The Xenos came to NIST to help scientists make a measurement of "big G," the universal constant of gravitation that has eluded precise measurement for centuries. When that experiment is complete, however, the researchers hope to incorporate the instrument into their growing fleet of CMMs, capable of making some of the most precise dimensional measurements in the world.

Coordinate measuring machines like the Xenos use touch probes to measure the distances between points on an object in three dimensions, with billionths-of-a-meter sensitivity for the most accurate machines. Customers who rely on NIST for this kind of measurement include manufacturers of ultra-precision parts, such as bearings for aircraft engines, test artifacts for other classes of measurement machines, and parts or structures for high-accuracy systems. Other customers come from the automotive and electronics industries, and from laboratories that perform calibrations for their own clientele.

With the addition of the Xenos, NIST's dimensional metrology group now possesses four CMMs in the ultra-high-accuracy class. This newest machine also has the potential to expand NIST's measurement capability, since its work volume (the area accessible to the probe) is more than twice that of NIST's other CMM systems – 1.5 x 0.9 x 0.7 meters, about the size of a washer and dryer side by side. Also, the Xenos has a probe head that can move in all three dimensions, meaning that, unlike with CMMs that move the table, sensitive parts like those for the big G experiment are less likely to be disturbed during measurement.

So far, tests of the system's performance are "promising," Lee says, "but there are a lot of other things we need to learn before we design and perform measurements for the big G experiment." One current challenge is controlling the CMM's environment. Pockets of hot or cool air in the room can warp the machine or even the part being measured. To ensure that the temperature is evenly distributed, the laboratory uses a system that pushes air from the ceiling down through vented floor tiles. But the Xenos CMM is so big that, like a pebble stuck in a garden hose, it restricts this flow, preventing the air from circulating optimally. Lee is currently exploring several solutions to ameliorate the problem.

The big G experiment will start this spring and should be complete within two years. "After that, we plan to start pressing the Xenos machine into service for calibrations," Lee says. Meanwhile, he and PML staff will continue to gain a greater understanding of the Xenos in order to realize its fullest potential.
The team says that it will take years of learning the machine's quirks, making careful comparisons with NIST's other CMMs and length-measuring machines, and performing carefully executed experiments to fully assess the capabilities of the new machine. "It was really a big effort to get it to where it is right now," Lee says. "And I'm not surprised it will take the same, or even more, to really understand the CMM's potential."

Explore further: Solving the mystery of the big G controversy
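A quick order-of-magnitude check shows why those air-temperature pockets matter so much at this level of accuracy. The values below are assumed, typical material properties, not NIST's error budget; linear thermal expansion follows ΔL = αLΔT.

```python
# Rough illustration (assumed values): thermal expansion of a steel part
# compared with the nanometer-level sensitivity of an ultra-accurate CMM.

ALPHA_STEEL = 11.5e-6   # 1/K, typical coefficient for carbon steel
LENGTH = 0.5            # m, a half-meter-long part
DELTA_T = 0.1           # K, a modest warm-air pocket

dL = ALPHA_STEEL * LENGTH * DELTA_T
print(f"{dL*1e9:.0f} nm of expansion")   # ~575 nm, vast next to nm-level probing
```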


News Article | February 23, 2017
Site: phys.org

A current case in point is the burgeoning growth of additive manufacturing (AM)—the industrial equivalent of 3-D printing, in which complex structures are built up by the successive addition of layers rather than assembled from separate components or machined down from a solid block of material. AM is already in use for fabricating a wide range of devices, from medical implants to multi-material electronic components, precision fluid conduits, lamp constituents, fiber-optic connectors, and more. But the method poses problems for defect detection and quality control: the exact dimensions and fit of a device's internal features cannot readily be evaluated without destroying the device.

As a result, many manufacturers have turned to a technology called x-ray computed tomography (CT), long used in medical imaging but increasingly employed over the past 15 years to examine the dimensional characteristics of commercial products. At present, however, there are very few agreed-upon standards to evaluate a CT instrument's performance or verify the accuracy of its images.

That's why NIST entered into a Cooperative Research and Development Agreement (CRADA) with North Star Imaging (NSI) of Minnesota, a manufacturer of industrial digital x-ray and CT systems, which has loaned a CT unit to NIST for the three-year duration of the CRADA. During that time, NIST researchers can use the CT system to test measurements of candidate reference artifacts that could eventually be employed in standardized testing and calibration; at the same time, the NSI system can be characterized by exacting procedures at the nation's standards laboratory.

"Right now, we're mainly involved in developing very well described reference artifacts," says project scientist Meghan Shilling of NIST's Physical Measurement Laboratory. "We take an artifact designed to evaluate the performance of a CT system and measure it using our tactile-probe coordinate measuring machines, which have extremely well-established measurement accuracy.

"Then we put the artifacts in the CT system, measure them, and see how the data compare. One person on our team, who is part of the Engineering Laboratory at NIST, is making metal test structures using additive manufacturing, into which he intentionally leaves some voids, which can also be imaged using the CT system. At the same time, we're also working on characterizing North Star's machine, giving them technical feedback that may help improve their system design."

"The CRADA has been extremely valuable for NSI in characterizing the system for use in the refinement and enhancement of our CT system designs," says Tucker Behrns, Engineering Manager at NSI. "We have been able to gather a wealth of information through working alongside the NIST team while gaining unbiased feedback with a focus on metrological implications. The unique measurement knowledge and skills we have access to as a result of this agreement have allowed us to gain great depth in our understanding of the critical aspects of the machine function and performance."

A concurrent goal is to assist in the development of performance evaluation standards that can be promulgated worldwide. "Both NIST and NSI are active in standards organizations, including the International Organization for Standardization (ISO) and the American Society of Mechanical Engineers," Shilling says.
"Both are in the process of putting together standards for specifying CT systems. The only performance evaluation document that exists now for CT dimensional metrology is a German guideline, and the team that put together the guideline is also involved in drafting the ISO standard. Eventually, we also hope to be able to disseminate best practices and lessons learned about techniques and artifacts." CT works by projecting x-rays of appropriate energies through an object at successively varying angles. Different kinds of materials absorb or scatter more or fewer x-rays; so measuring the x-rays transmitted through a multi-featured object at different angles reveals its inner structure. In a typical medical CT scan, an x-ray source rotates continuously around the body, building up 2-D or 3-D images which reveal circulatory problems, tumors, bone irregularities, kidney and bladder stones, head injuries and many other conditions. X-ray CT for manufactured objects uses exactly the same principles. In the NSI instrument at NIST, a sample/test object is placed on a stage between the x-ray source and a detector plate. The sample revolves in a series of small angular increments around its vertical axis, and the x-ray beam passes through it, taking one frame of data at each position. Each measurement produces a single 2-D slice. Computer software integrates all of the slices and builds up a 3-D image. However, there are many complicating factors. For one thing, samples may contain both soft polymer parts and multiple hard metallic sections laid down in layers of melted or sintered powders. Each kind of material has an inherent attenuation coefficient (the ease with which x-rays pass through the material), that is dependent on the material composition and density as well as the energy spectrum of the x-ray source. NIST provides tables of x-ray mass attenuation coefficients for elements with atomic numbers from 1 to 92 for specific x-ray energies. But calculating the attenuation coefficient for multi-element compounds, such as plastics combined with metal, using a spectrum of x-ray energy, is a challenge. "We are able to vary the voltage and the current in the x-ray source," Shilling says, "and we can place various filters in front of the beam to adjust the x-ray spectrum that travels on to the target test object. So the system is very capable of measuring materials from plastics to steel." Depending on the customer's needs and the degree of detail that is wanted, a measurement run can range from half an hour to four hours or more. But how can the accuracy of those images be objectively evaluated? And what are the optimal ways to measure different materials and configurations? The answers are slowly emerging from scores of trials, and "developing the right settings is a bit of an art," Shilling says. Aside from adjusting the voltage and current in the x-ray beam and the filter material, both the distance between the x-ray source and the sample, and the sample and the detector, can be adjusted to achieve various effects. At the same time, Shilling and colleagues are also investigating aspects of the instrument that could potentially lead to measurement errors. "For example," she says, "as the vertical axis of the rotary table spins, we want to see how much the sample may move in other directions—up and down or side to side. That can affect the quality of the results. What we've been doing most recently is to characterize those motions on the most important axes of the machine." 
That effort requires sensitive capacitance gauges and laser interferometers that can detect extremely tiny changes in position. Those and other measurements will continue for about one more year under the terms of the CRADA.

"At NSI," Behrns says, "we have seen a substantial increase in the use of additive manufacturing for production components across many of the major markets we serve. As our customers continue to expand the application of this technology, we believe that CT will play a crucial role in the identification and measurement of internal structures which is not possible with traditional methods. Working with NIST has allowed us to accelerate the advancement of CT measurement technology so that we can continue to improve our ability to serve this rapidly expanding market."
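As promised above, here is a simplified sketch of the compound-attenuation calculation, with placeholder values rather than data from the NIST tables. For a mixture, the mass attenuation coefficient at a given photon energy is approximately the mass-weighted sum of the elemental coefficients, and transmission then follows the Beer-Lambert law.

```python
import numpy as np

# Simplified compound attenuation at a single x-ray energy (illustrative
# values only). The mixture rule gives the compound's mass attenuation
# coefficient, and transmission follows I/I0 = exp(-(mu/rho) * rho * x).

def mixture_mu_over_rho(weight_fractions, elemental_mu_over_rho):
    """Mass attenuation coefficient of a compound, cm^2/g."""
    return sum(w * m for w, m in zip(weight_fractions, elemental_mu_over_rho))

# Example: a two-element alloy region. The coefficients are placeholders;
# real values come from the NIST attenuation tables at the chosen energy.
mu_rho = mixture_mu_over_rho(weight_fractions=[0.9, 0.1],
                             elemental_mu_over_rho=[0.20, 0.16])  # cm^2/g
density = 4.4      # g/cm^3, assumed alloy density
thickness = 0.5    # cm of material along the beam path

transmission = np.exp(-mu_rho * density * thickness)
print(f"Fraction of x-rays transmitted: {transmission:.3f}")   # ~0.65
```

A real industrial source emits a spectrum of energies, so this calculation has to be repeated and weighted across the spectrum, which is exactly the complication the article describes.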


News Article | February 17, 2017
Site: www.scientificcomputing.com

What if there were a wearable fitness device that could monitor your blood pressure continuously, 24 hours a day? Unfortunately, blood pressure (BP) measurements currently require the use of a cuff that temporarily stops blood flow. So a wearable BP “watch” using today’s technology would squeeze your wrist every few minutes, making it impractical to use – not to mention annoying. A better method might gauge subtle pressure changes at the surface of your skin above one of the main wrist arteries – the radial artery – without regularly cutting off your circulation.

But before scientists can create this new technology, they need to understand what the pressure inside a blood vessel “looks” like on the surface of the skin. And to do that, they must make a physical model that can be used to test wearable devices in a laboratory.

NIST’s Physical Measurement Laboratory (PML) is currently collaborating with Tufts University’s School of Medicine to develop just such a model, a blood pressure wrist “phantom” – essentially a fake arm that mimics the mechanical properties of blood pulsing through an artery surrounded by human tissue. “The phantom will give us very precise measurements – say, for example, what is actually the force on the blood vessel wall? And what is the force on the soft tissue and the skin?” says Tufts University School of Medicine assistant professor Mohan Thanikachalam, who is collaborating with NIST on this work. “I think it will help us tremendously in terms of optimizing our technology” for wearable BP devices.

The NIST-Tufts blood pressure phantom consists of a slab of squishy silicone, which stands in for human tissue, sitting on top of a metal plate, the stand-in for bone. A pliable tube runs through the silicone to mimic an artery, through which fluid flows via a mechanical heart pump.* The materials were carefully selected to match the properties of skin, soft tissues, bone, and artery walls, the researchers say. But unlike actual live human tissue, the phantom can easily have sensors running through it, measuring the pressure changes that occur each time water is pumped through the tube.

“One of the things we want to understand is, if a sensor is sitting up here on top of the silicone, what is it really seeing?” says Zeeshan Ahmed, lead researcher for the NIST PML team. “Is it just seeing the primary wave from a pulse of fluid going through? Is it seeing a lot of reflection waves, when the primary wave bounces off the metal plate? How does the pressure change over the time it takes for each pulse of water to pass through the artificial artery?”

The sensors they are currently testing are thin optical fibers containing Bragg gratings, structures designed to block a specific frequency, or color, of light from passing through them. When the pressure changes inside the Bragg grating, so does the color of light that is blocked. Researchers can use this change in color to identify the pressure that was applied to the fiber.** The final phantom will likely incorporate about a half dozen of these Bragg sensors, running through the silicone and over its top, as well as inside and outside the artificial artery.

Presently, the NIST team is conducting preliminary tests to gauge the performance of their sensors using a prototype without the mock artery. Instead of pumping water through a tube, they apply pressure to the silicone by crushing it with weights.
For example, to mimic a BP of 140/60, they use masses of about 1 to 1.8 kilograms (kg, equivalent to approximately 2 to 4 pounds). So far, they have found that their sensors are able to detect pressures from 170 millimeters of mercury (mmHg, equivalent to about 22.7 kilopascals, kPa, or about 3¼ pounds per square inch, psi) down to 60 mmHg (about 8 kPa, or a little more than 1 psi) with a resolution of 2 mmHg (about 270 Pa, or less than 0.04 psi).*** In terms of weight, this means that they are measuring masses of about 1 kg with a resolution of just 20 grams.

Also promising, Ahmed says, is that the results are reproducible: each time the silicone is crushed, it springs back to its original form, so the results are the same no matter how many times the experiment is run. The NIST team, which includes Kevin Douglass, is currently preparing to test the sensors’ ability to measure pressures that change over time, using a universal testing machine that they call “the crushinator.” If all goes well, the collaboration could potentially have a working prototype sometime this year.
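The mass-to-pressure correspondence above follows from routine unit conversions. The sketch below uses an assumed contact area, since the article does not give one, and reproduces masses in the same range as the roughly 1 kg to 1.8 kg quoted.

```python
# Unit-conversion sketch (assumed contact area, not the NIST team's numbers):
# blood pressure in mmHg to pascals, and the dead-weight mass whose weight
# applies that pressure over a given area.

G = 9.80665            # m/s^2, standard gravity
PA_PER_MMHG = 133.322  # pascals per millimeter of mercury

def mmhg_to_pa(p_mmhg):
    return p_mmhg * PA_PER_MMHG

def mass_for_pressure(p_mmhg, contact_area_m2):
    """Mass in kg whose weight applies the given pressure over the area."""
    return mmhg_to_pa(p_mmhg) * contact_area_m2 / G

AREA = 9.5e-4  # m^2, an assumed contact disc about 3.5 cm across
print(f"140 mmHg = {mmhg_to_pa(140)/1000:.1f} kPa -> {mass_for_pressure(140, AREA):.2f} kg")
print(f" 60 mmHg = {mmhg_to_pa(60)/1000:.1f} kPa -> {mass_for_pressure(60, AREA):.2f} kg")
```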


News Article | December 13, 2016
Site: phys.org

Additive manufacturing (AM) is a high-priority technology growth area for U.S. manufacturers. Innovative AM processes that fabricate parts layer-by-layer directly from a 3-D digital model have great potential for producing high-value, complex, individually customized parts. Companies are beginning to use AM as a tool for reducing time to market, improving product quality, and reducing the cost to manufacture products. Metal-based AM parts are already in use in a number of applications, including automotive engines, aircraft assemblies, power tools, and manufacturing tools.

In support of the development of polymer-based additive manufacturing, the National Institute of Standards and Technology (NIST) has released the Measurement Science Roadmap for Polymer-Based Additive Manufacturing, a guide that identifies future desired capabilities, challenges, and priority R&D topics in polymer-based AM. The report is the result of the "Roadmap Workshop on Measurement Science for Polymer-Based Additive Manufacturing," held June 9-10, 2016, at the NIST campus in Gaithersburg, Maryland. The workshop brought together nearly 100 AM experts from industry, government, national laboratories, and academia to identify measurement science challenges and associated R&D needs for polymer-based AM systems. The workshop was hosted by NIST and sponsored by the National Science Foundation's Division of Civil, Mechanical and Manufacturing Innovation and NIST's Material Measurement Laboratory.

Additive manufacturing is an important research priority for NIST and a key component of MML's Five-Year Strategic Plan. By identifying high-priority goals and challenges in polymer-based AM, the report can serve as a roadmap for R&D, standards development, and other future efforts. It includes detailed analyses of the complexities surrounding material characterization, process modeling, in situ measurement, performance, and other cross-cutting challenges for polymer-based AM. As such, the report can help guide public and private decision-makers interested in furthering the capabilities of polymer-based AM and accelerating its more widespread use, and can contribute to a robust national research agenda for polymer-based AM.


News Article | February 27, 2017
Site: phys.org

Typically, calibrating flowmeters entails flowing a gas stream through the meter under test and then into a collection tank during a measured time interval. The accuracy of the flowmeter calibration factor depends on a low-uncertainty measurement of the mass collected in the tank. The quantity of collected gas is commonly determined using (1) the tank's precisely known volume multiplied by (2) the change in density of the gas in the collection tank before and after the filling process. The density determination requires measuring the pressure and the average temperature of the collected gas.

Unfortunately, the average temperature of the collected gas is tough to pin down. When pressurized gas flows into a large tank, the flow generates a non-uniform temperature distribution throughout the collection tank. Soon after the flow stops, the warmest gas ends up near the top of the tank and the coolest gas ends up near the bottom. This situation makes it difficult to measure the average temperature by conventional means: a prompt reading of a few thermometers is inherently inaccurate, and the temperature gradients in big tanks persist for hours or days.

To circumvent the temperature-gradient problem, NIST calibrates many small flowmeters, one at a time, and then uses them in parallel to calibrate larger meters. The small meters are calibrated using a small collection tank that is thermostatted to quickly eliminate temperature gradients. However, the multiple calibrations are time-consuming and labor-intensive, and therefore expensive.

Two years ago, scientists at NIST's Physical Measurement Laboratory attacked this problem successfully by devising and demonstrating the use of "acoustic thermometry" to accurately and rapidly measure the average temperature. They proved the principles using pure argon gas in a small tank. Now, they are scaling up acoustic thermometry using a large high-pressure spherical vessel as the collection volume. Since the term "large high-pressure spherical vessel" is a mouthful, it was affectionately renamed the Big Blue Ball.

"We are working toward a way to calibrate meters for large flows at high pressures, such as those used to measure natural gas flowing inside interstate pipelines," says Michael Moldover, leader of NIST's Fluid Metrology Group. "The Big Blue Ball allows us to scale up the proof-of-principle tests by a factor of 20 in pressure, from 0.35 MPa to 7 MPa (3.5 atmospheres to 70 atm), and by a factor of 6 in volume, from 300 liters to 1800 liters. Eventually, the volume will be scaled up by another factor of 3, or even 10."

The blue ball is on loan to NIST's Gaithersburg, Md., campus, thanks to a Cooperative Research and Development Agreement (CRADA) with Colorado Engineering Experiment Station, Inc. (CEESI), an independent laboratory that calibrates flowmeters, including those used in natural gas pipelines. Ultimately, Moldover's group expects, CEESI and other calibration laboratories will use the technique at their own sites for much larger tanks and meters.

"I doubt there is another organization in the world that could do what NIST is doing," says Eric Harman, CEESI natural gas/multiphase engineer. "The benefit to the natural gas industry will be immense. It's critical that large natural gas meters are calibrated accurately and every energy dollar is accounted for using the best technology available. Moldover and his group are redefining that standard to the best technology possible. This is a game changer."
The NIST method is based on a fundamental physical principle: when a sound wave travels through a gas with regions at different temperatures, the sound wave's average speed is determined by the average temperature of the gas. Using this scheme, the very difficult task of measuring temperature is replaced by the much simpler one of measuring the speed of sound waves as they move from transmitter to receiver.

Because the physics in the Big Blue Ball is identical to that used for the proof-of-principle tests, scale-up should be straightforward. However, Moldover's group is moving carefully to identify potential measurement problems at increased volume and pressure. So far, the researchers have brought the pressure in the Big Blue Ball up to 2 MPa (20 atm) on the way to 7 MPa (70 atm). They anticipate obstacles.

"For example, a sound generator and sound detector that work well at a pressure of a few atmospheres might not perform well at 70 atmospheres," Moldover says. "When scaling up, we're exposing our generator and detector to high-speed flow and to rapid pressure changes; these stresses will knock the transducers around a bit. We'll see what happens. At NIST, we go beyond proof-of-principle to solve engineering problems that a user might encounter—or at least we want to suggest plausible solutions."

His group's proof-of-principle demonstration used pure argon gas. But when they filled the blue ball with compressed air and checked its volume using microwave resonances, the results disagreed with predictions. The trouble, it appears, arose because the air had too much moisture, which increased the air's dielectric constant and decreased the microwave resonance frequencies from the expected values. When they dried the air, they got the volume that they expected. "Clearly, that's a very significant factor," Moldover says. "If you want to properly calibrate your volume using microwaves, you have to think seriously about the water content."

"Thank goodness NIST is ironing out some of the potential scale-up pitfalls," Harman says. "Uncovering hidden landmines before you step is often the difference between success and failure. As U.S. calibration facilities integrate NIST's microwave and acoustic resonance techniques, knowing that we have to measure humidity ahead of time makes our job much easier."

NIST does not have the infrastructure required to test really large flowmeters of the sort used in interstate pipelines, where flow rates reach 5 m³/s at pipeline pressures up to 7 MPa. However, NIST's CRADA partner, CEESI, has a calibration facility located next to a pipeline, with collection vessels of 20 cubic meters. Thus, the lessons learned from the Big Blue Ball will reach industry.

"While the U.S. energy sector will benefit greatly from NIST's new technology," Harman says, "the transportation, manufacturing, and aerospace industries stand to benefit as well. Temperature uncertainty problems aren't just limited to large-scale primary calibrations; small and mid-size calibrations face the same temperature uncertainty issues. Air, oxygen, nitrogen, argon, carbon dioxide, hydrogen, and helium calibrations aren't immune to temperature stratification. CEESI is thrilled that NIST is taking the Big Blue Ball and running with it."

Explore further: Weighing gas with sound and microwaves
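In the ideal-gas limit, the physics behind the method is compact: the speed of sound in a monatomic gas such as argon depends only on temperature, so a measured transit time yields a path-averaged temperature directly. The sketch below uses a made-up path length and transit time; NIST's actual analysis includes real-gas corrections at high pressure.

```python
# Simplified acoustic thermometry (ideal-gas approximation).
# For an ideal monatomic gas, c^2 = gamma * R * T / M, so
#     T = M * c^2 / (gamma * R).

R = 8.314462618        # J/(mol*K), molar gas constant
M_ARGON = 0.039948     # kg/mol, molar mass of argon
GAMMA_MONATOMIC = 5.0 / 3.0

def avg_temperature_from_sound_speed(c_m_per_s, molar_mass=M_ARGON, gamma=GAMMA_MONATOMIC):
    """Path-averaged gas temperature implied by a measured speed of sound."""
    return molar_mass * c_m_per_s**2 / (gamma * R)

# Example: a 1.5 m transmitter-to-receiver path traversed in 4.65 ms
# (made-up numbers) gives c of about 322.6 m/s and T of about 300 K.
c = 1.5 / 4.65e-3
print(f"c = {c:.1f} m/s -> T = {avg_temperature_from_sound_speed(c):.1f} K")
```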
