
Developing a good, high-resolution 3-D map is a long, tedious and expensive process: a vehicle scans the surrounding environment from ground level up to the top of roofs or trees, while an aerial perspective is added using a drone. But a new approach, in which the terrestrial vehicle and drone are operated in tandem, has now been developed as part of a European project called mapKITE. EPFL researchers are involved in the consortium, which is funded by the H2020 program, and have designed some of the key components of this breakthrough technology. These include technical features – such as the target – that allow the drone to 'latch' virtually onto the vehicle.

One look at the current approach to 3-D mapping shows why combining terrestrial and aerial techniques makes sense. For example, to map out a long corridor like a road, river or railway, the drone has to work segment by segment, following markers on the ground. For control reasons, it has to remain within eyeshot of the drone operator, and to ensure its sensors are precisely aimed it has to be able to 'see' a certain number of ground control points. Another drawback is that with aerial mapping the direction of the drone's sensor must be repeatedly corrected in poorly textured environments (e.g. snow, sand or water). And at ground level, it takes just a tree, bridge or vehicle to block the image. Then there's the problem of ensuring the data collected from the air is compatible and consistent with that collected on the ground.

mapKITE harnesses the advantages of the two techniques – and does away with their drawbacks – by combining them. The researchers equipped the drone with remote detection instruments and a navigation, steering and control system. The terrestrial vehicle, which is manned, also has a real-time navigation system. A positioning system in the vehicle constantly calculates its route while at the same time generating a series of reference points for the drone by converting terrestrial navigation data (time, position, speed and attitude parameters) into aerial commands (altitude and route). This mechanism creates a 'virtual cord' that causes the drone to constantly follow the vehicle and operate at the same scale.

The tandem concept goes beyond just having the drone tail the vehicle. The real value of the virtual cord derives from two features. The first is an optical target developed by EPFL's Geodetic Engineering Laboratory (TOPO). The target is a fractal design attached to the vehicle's roof that allows the drone to optically calculate its distance from the vehicle in real time (and more accurately during post-processing). This means the drone knows its relative location at all times without using satellite navigation instruments and can conduct data fusion without relying on terrestrial targets. "Through this tandem approach, mapKITE also complies with European regulations, since the drone can land autonomously on the vehicle if anything goes wrong or if its batteries need to be changed," said Jan Skaloud, a senior scientist at TOPO.

The second key feature of the virtual cord is the use of signals from the European global navigation system Galileo – a first at this level of research. Galileo, which went live in December 2016, provides higher quality signals than the American GPS system and offers unique features that reduce errors in calculating terrestrial positions. In mid-March, the tandem was tested at the BCN Drone Center, near Barcelona.
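The article doesn't specify how the terrestrial navigation data are converted into aerial commands; the sketch below is a minimal, hypothetical illustration of the idea (invented function names, a flat-earth approximation and a fixed flying height are all assumptions, not mapKITE's actual algorithm).

```python
import math
from dataclasses import dataclass

@dataclass
class VehicleFix:
    """One terrestrial navigation sample: time, position, speed, heading."""
    t: float        # seconds
    lat: float      # degrees
    lon: float      # degrees
    speed: float    # m/s
    heading: float  # degrees, clockwise from north

def waypoint_from_fix(fix: VehicleFix, altitude_agl: float = 100.0,
                      lead_time: float = 2.0):
    """Project the vehicle's position ahead by `lead_time` seconds along its
    heading and command the drone to that point at a fixed altitude.
    Flat-earth approximation, adequate only for short distances."""
    d = fix.speed * lead_time                    # meters to project forward
    hdg = math.radians(fix.heading)
    dlat = d * math.cos(hdg) / 111_320.0         # meters -> degrees latitude
    dlon = d * math.sin(hdg) / (111_320.0 * math.cos(math.radians(fix.lat)))
    return {"t": fix.t, "lat": fix.lat + dlat, "lon": fix.lon + dlon,
            "alt_agl": altitude_agl}

# Example: a vehicle driving north at 15 m/s yields a waypoint slightly ahead.
wp = waypoint_from_fix(VehicleFix(t=0.0, lat=41.6, lon=1.9, speed=15.0, heading=0.0))
print(wp)
```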
The results were spectacular: the system generated 3-D maps with a resolution of one centimeter, which is much more precise than systems like Google Street View. "With a target that's only 90 centimeters across, the images taken by the drone at a height of 100 meters yield an error in drone-to-target distance of less than 1%, while at a height of 50 meters the error is less than 0.25%," said Davide Cucci, a postdoc at TOPO. Potential applications for this technology are numerous – especially in map-making, as this instrument can be used to create 3-D models of long corridors. It could also be effective in inspecting and monitoring buildings and other structures in cities. Future developments are sure to emerge as well.
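Taking the quoted figures at face value, and assuming the percentages are relative to the drone-to-target distance itself (the article doesn't say), the absolute error bounds work out as follows:

```python
# Rough interpretation of the quoted figures; the reference for the
# percentage is an assumption, not stated in the article.
for height_m, rel_err in [(100, 0.01), (50, 0.0025)]:
    print(f"at {height_m} m: < {height_m * rel_err * 100:.1f} cm error")
# at 100 m: < 100.0 cm error
# at 50 m:  < 12.5 cm error
```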


News Article | May 8, 2017
Site: www.prweb.com

World-renowned innovators from the higher education, government and business sectors will gather at an international summit in Hong Kong this summer to explore how to enhance universities' impact through innovative research and teaching for the benefit of society and mankind. The inaugural Times Higher Education (THE) Innovation & Impact Summit, co-hosted by The Hong Kong Polytechnic University (PolyU) and THE, will run from 31 May to 2 June 2017 at Hotel ICON in Hong Kong. A key celebratory event of PolyU's 80th anniversary, the Summit will feature keynote speeches and a series of thought-provoking panel discussions under the theme of "Powering Universities' Economic and Social Impact through Innovative Research and Teaching".

PolyU Showcase

The Summit will commence with a PolyU Showcase on 31 May, highlighting four areas in which PolyU innovations have been creating impact for the economy and society while strengthening university-industry partnerships. The four sessions will feature:

1. Space, aviation and railway: presentations on how PolyU ventured into space, developed an optical fibre sensing system for railway monitoring, and created technologies for enhancing aircraft maintenance, repair and overhaul. Visits: Aviation Services Research Centre and University Research Facility in 3D Printing.

2. Sustainable urban development: PolyU will share how it develops and implements new technologies and solutions for sustainable urban development and smart cities, and will open up discussion of the role of higher education institutions in changing and sustaining the landscapes of urban development. Visits: Underground Utility Survey Laboratory, Construction Virtual Prototyping Laboratory and Fire Engineering Laboratory.

3. Human-centered innovation: the Showcase will feature the application of human-centered innovation in genetics research in healthcare, advanced solutions for the garment industry, and research on Asian head and face shapes that is changing the world's perception of sizing, benefiting a host of industries that create products such as sunglasses, helmets and more. Tour: SizeChina, the radiation-free scoliosis scan system, and the Public Design Lab.

4. Hospitality: pioneering teaching innovations in global hotel and tourism management will be highlighted, along with Chinese wine appreciation at Vinoteca Lab. Visits: state-of-the-art facilities in the School of Hotel and Tourism Management, the Designer Suite by Vivienne Tam and other facilities in Hotel ICON.

In the two-day Summit (1-2 June), a series of keynote addresses and panel discussions will cover a wide spectrum of topics, including:

- Universities innovate to address the grand challenges
- Translating research into business: turning ideas into impact
- Creating and empowering entrepreneurs and future leaders
- Calculating the economic and social impact of higher education

These discussions will serve to inspire participating universities and organisations on how to address global shifts in the economic, social and environmental landscape through creating and enhancing impact with higher education, research and innovation.
World-class speakers, including high-profile innovators, entrepreneurs and policy-makers, will join, among them Charles Chen Yidan, founder of the Yidan Prize and core founder of Tencent Holdings Limited; Hermann Hauser, co-founder of Amadeus Capital Partners; Candace Johnson, founder or co-founder of SES, Loral-Teleport Europe, Europe Online, VATM, GTWN, OWNSAT and Success Europe; Greg Simon, former executive director of the White House Cancer Moonshot Task Force; and Nicholas Yang, Secretary for Innovation and Technology, HKSAR government. Other influential higher education figures addressing the Summit include leaders of renowned institutions from Australia, England, Finland, France, Hong Kong, India, Israel, Japan, Singapore, Spain and the United States, to name just a few. It will be an incredible fusion of East and West. For registration, details of the PolyU Showcase and the Summit, and the full list of speakers, please visit the official website: https://goo.gl/kedVru


News Article | May 23, 2017
Site: www.eurekalert.org

New research in The FASEB Journal suggests the potential of tendon stem cells to improve healing and treatment for acute tendon injuries and chronic tendon disease.

New research published online in The FASEB Journal suggests that tendon stem cells (TSCs) may be able to significantly improve tendon healing by regulating inflammation, which contributes to scar-like tendon healing and chronic matrix degradation. This has implications for the treatment of acute tendon injuries and chronic tendon disease. "Inflammation plays a critical role in acute and chronic tendon injuries and healing," said Chang H. Lee, Ph.D., a researcher involved in the work and an assistant professor at the Regenerative Engineering Laboratory (Columbia University Irving Medical Center, New York). "Our findings represent an important foundation for the development of a new treatment that would regulate overwhelmed inflammation for tendon ruptures and tears, tendonitis, tendinopathy, and other tendon injuries and diseases."

In their study, Lee and colleagues used both in vitro human models and in vivo rat models. In vitro, isolated TSCs were stimulated with proinflammatory cytokines (proteins that can influence interactions between cells), and the expression of genes involved in inflammatory regulation was measured. In vivo, the researchers evaluated inflammatory responses by TSCs, including infiltration of macrophages (white blood cells that consume damaged or dead cells) and expression of anti-/proinflammatory cytokines, at different time points. Connective tissue growth factor (CTGF) was used in both models to stimulate the anti-inflammatory roles of TSCs. The researchers found that CTGF stimulation induced TSCs' production of anti-inflammatory cytokines, consequently leading to improved tendon healing and matrix remodeling. "Many would have predicted that tendon healing is inflammation-linked," said Thoru Pederson, Ph.D., Editor-in-Chief of The FASEB Journal, "but that the anti-inflammatory roles of TSCs could be so potent, and so amplifiable, is a striking finding."

The FASEB Journal is published by the Federation of American Societies for Experimental Biology (FASEB). It is the world's most cited biology journal according to the Institute for Scientific Information and has been recognized by the Special Libraries Association as one of the top 100 most influential biomedical journals of the past century. FASEB is composed of 30 societies with more than 125,000 members, making it the largest coalition of biomedical research associations in the United States. Its mission is to advance health and welfare by promoting progress and education in biological and biomedical sciences through service to its member societies and collaborative advocacy.

Details: Solaiman Tarafder, Esther Chen, Yena Jun, Kristy Kao, Kun Hee Sim, Jungho Back, Francis Y. Lee, and Chang H. Lee. Tendon stem/progenitor cells regulate inflammation in tendon healing via JNK and STAT3 signaling. FASEB J. doi: 10.1096/fj.201700071R


News Article | May 25, 2017
Site: www.sciencedaily.com

New research published online in The FASEB Journal suggests that tendon stem (TSCs) may be able to significantly improve tendon healing by regulating inflammation, which contributes to scar-like tendon healing and chronic matrix degradation. This has implications for the treatment of acute tendon injuries and chronic tendon disease. "Inflammation plays a critical role in acute and chronic tendon injuries and healing," said Chang H. Lee, Ph.D., a researcher involved in the work and an assistant professor at the Regenerative Engineering Laboratory (Columbia University Irving Medical Center, New York). "Our findings represent an important foundation for the development of a new treatment that would regulate overwhelmed inflammation for tendon ruptures and tears, tendonitis, tendinopathy, and other tendon injuries and diseases." In their study, Lee and colleagues used both in vitro human models and in vivo rat models. In vitro, isolated TSCs were stimulated with proinflammatory cytokines (proteins that can influence interactions between cells), and the expression of genes involved in inflammatory regulation was measured. In vivo, the researchers evaluated inflammatory responses by TSCs, including infiltration of macrophages (white blood cells that consume damaged or dead cells) and expression of anti-/proinflammatory cytokines, at different time points. Connective tissue growth factor (CTGF) was used in both models to stimulate the anti-inflammatory roles of TSCs. The researchers found that CTGF stimulation induced TSCs' production of anti-inflammatory cytokines, consequently leading to improved tendon healing and matrix remodeling. "Many would have predicted that tendon healing is inflammation-linked," said Thoru Pederson, Ph.D., Editor-in-Chief of The FASEB Journal, "but that the anti-inflammatory roles of TSCs could be so potent, and so amplifiable, is a striking finding."


News Article | February 16, 2017
Site: www.eurekalert.org

Headaches and backaches also come with the job, which lacks ergonomic guidelines.

COLUMBUS, Ohio--Getting a tattoo may hurt, but giving one is no picnic, either. That's the finding of the first study ever to directly measure the physical stresses that lead to aches and pains in tattoo artists--workers who support a multibillion-dollar American industry, but who often don't have access to workers' compensation if they get injured. Researchers at The Ohio State University measured the muscle exertions of 10 central Ohio tattoo artists while they were working, and found that all of them exceeded maximums recommended to avoid injury, especially in the muscles of their upper back and neck. In the journal Applied Ergonomics, the researchers presented their findings and offered some suggestions on how tattoo artists can avoid injury.

The study was unique, explained Carolyn Sommerich, director of the Engineering Laboratory for Human Factors/Ergonomics/Safety at Ohio State. She and former master's student Dana Keester spent a summer "hanging out in tattoo parlors with our EMG equipment, cameras and a tripod," observing artists who agreed to work while wearing electrodes that precisely measured their muscle activity. The electrodes gathered data for 15 seconds every 3 minutes for the entirety of each tattoo session. Though a single tattoo session can last as long as 8 hours depending on the size and complexity of the tattoo, the sessions used in the study lasted anywhere from 1 to 3 hours. In addition, the researchers used a standardized observational assessment tool to assess each artist's posture every five minutes and took a picture to document each observation. To the researchers' knowledge, this is the first time that anyone has gathered such data from tattoo artists at work.

To Keester, some reasons for the artists' discomfort were immediately obvious. She noted that they sit for prolonged periods of time, often taking a posture just like the one immortalized in Norman Rockwell's painting "Tattoo Artist"--they perch on low stools, lean forward, and crane their neck to keep their eyes close to the tattoo they're creating. All 10 tattoo artists exceeded recommended exertion limits in at least one muscle group. Most notable was the strain on their trapezius muscles--upper back muscles that connect the shoulder blades to either side of the neck, a common site for neck/shoulder pain. Some exceeded limits by as much as 25 percent, putting them at high risk for injury. Those findings mesh well with a prior survey of tattoo artists that Keester carried out at the Hell City Tattoo Festival in Columbus, Ohio, in 2014. Among the 34 artists surveyed, the most common complaints were back pain (94 percent), headache (88 percent), neck pain (85 percent) and eye pain (74 percent).

Tattoo artists suffer ailments similar to those experienced by dentists and dental hygienists, the researchers concluded. Like dental workers, tattoo artists perform detailed work with their hands while leaning over clients. But, unlike dental workers, tattoo artists in the United States lack a national organization that sets ergonomic guidelines for avoiding injury. One of the main problems is that the industry doesn't have specialized seating to support both the artist and the client, said Sommerich. "There's no such thing as an official 'tattoo chair,' so artists adapt dental chairs or massage tables to make a client comfortable, and then they hunch over the client to create the tattoo," Sommerich said.
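The study reports exertions relative to recommended maximums; as a rough sketch of that kind of analysis (hypothetical %MVC values, sample counts and limit values, not the paper's data), one might summarize intermittent EMG samples per muscle group and flag exceedances:

```python
# Illustrative only: hypothetical EMG summaries (% of maximum voluntary
# contraction, %MVC) for one artist, sampled 15 s every 3 min as in the study.
# The recommended limits below are placeholders, not the paper's values.
samples = {
    "upper trapezius": [18, 22, 25, 21, 24],   # %MVC per 15-s sample
    "forearm extensor": [9, 11, 10, 12, 8],
}
recommended_limit = {"upper trapezius": 20, "forearm extensor": 14}  # %MVC

for muscle, vals in samples.items():
    mean_exertion = sum(vals) / len(vals)
    limit = recommended_limit[muscle]
    excess = (mean_exertion - limit) / limit * 100
    status = f"exceeds limit by {excess:.0f}%" if excess > 0 else "within limit"
    print(f"{muscle}: mean {mean_exertion:.1f} %MVC, {status}")
```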
Adding to the problem is the fact that many tattoo artists are independent contractors who rent studio space from shop owners, so they're not covered by workers' compensation if they get hurt on the job, Keester said. Despite these challenges, the Ohio State researchers came up with some suggestions that may help artists avoid injury. Artists could experiment with different kinds of chairs for themselves, and try to support their back and arms. They could change positions while they work, take more frequent breaks and use a mounted magnifying glass to see their work instead of leaning in. They can also consider asking the client to move into a position that is comfortable for both the client and the tattoo artist, Sommerich added. "If the client can stand or maybe lean on something while the artist sits comfortably, that may be a good option," she said. "Switch it up once in a while." In the United States, tattooing is a $2.3 billion industry. A 2016 Harris Poll found that a third of Americans have at least one tattoo, and an IBISWorld report estimated that the industry is growing at around 13 percent per year. The National Institute for Occupational Safety and Health provided funding for Keester's graduate studies.


News Article | February 21, 2017
Site: www.eurekalert.org

Injecting large amounts of offshore wind power into the U.S. electrical grid is manageable, will cut electricity costs, and will reduce pollution compared to current fossil fuel sources, according to researchers from the University of Delaware and Princeton University who have completed a first-of-its-kind simulation with the electric power industry. The researchers consulted with PJM Interconnection -- a grid operator supplying electricity to more than 60 million people in 14 states -- to develop a computer model that simulates how the electric grid would respond to injections of wind power from offshore wind farms along the East Coast at five build-out levels, between 7 and 70 gigawatts of installed capacity. The two-part study is published in the journal Renewable Energy.

One hurdle grid operators face is how to integrate increasing amounts of naturally fluctuating offshore wind into a network that has to deliver reliable power to customers, 24-7. The UD and Princeton team showed conservatively that, with some upgrades to transmission lines but without any need for added storage, the PJM grid can handle over 35 gigawatts of offshore wind--that's 35 billion watts--enough to power an estimated 10 million homes. They also found that the PJM grid could in the future handle twice that amount, up to 70 gigawatts, as wind forecasting improves, allowing the power operator to better predict and harness more wind.

"Our goal was to replicate this very human-made energy system under all kinds of scenarios," said Cristina Archer, associate professor of physical ocean science and engineering at the University of Delaware. "What would you do as a grid operator if you thought it was going to be windy today and it isn't, or if the wind storm arrives earlier than expected? We simulated the entire PJM grid, with each power plant and each wind farm in it, old and new, every five minutes. As far as we know, this is the first model that does this."

From her office in UD's Harker Interdisciplinary Science and Engineering Laboratory, Archer led the team's efforts to generate realistic offshore wind forecasts based on real wind farm data from land-based systems, which colleagues at Princeton then incorporated into their model of the PJM electric power system. The team used stochastic modeling, running hundreds of forecasts with various tweaks in conditions, to realistically represent the fluctuating and sometimes unpredictable behavior of wind. The model of PJM, called SMART-ISO, created at Princeton, is designed to handle both the variability and uncertainty of growing inputs of offshore wind energy, simulating what happens over an extensive power grid with more than 60,000 miles of transmission lines.

"The uncertainty of wind will require that we develop strategies to minimize the need for spinning reserve," said Warren Powell, professor and lead researcher at Princeton in charge of the SMART-ISO model, referring to electric generators that need to keep "spinning" and be ready for any electricity shortage. "Although we found that reserves were needed -- 21 percent of the 70 gigawatt wind capacity -- there are a number of strategies that could be investigated to better handle the variability as wind grows in the future." The first U.S. offshore wind farm, consisting of five wind turbines at Block Island, Rhode Island, with a generating capacity of 30 megawatts, had not been built yet when the researchers began their study five years ago.
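The article describes the stochastic approach only at a high level; as a toy illustration (invented numbers and error model, not SMART-ISO), the sketch below perturbs a wind forecast with random errors and sizes a reserve from the resulting scenarios. For scale, the 21 percent figure quoted above works out to roughly 14.7 gigawatts of reserves against 70 gigawatts of wind capacity.

```python
import random

random.seed(1)

# Hypothetical day-ahead forecast of offshore wind output, GW per hour.
forecast_gw = [20, 25, 30, 28, 22, 18]

def scenario(forecast, err_sd=0.15):
    """One stochastic realization: scale each hour by a random forecast-error
    factor (the 15% standard deviation is an assumption, not a study value)."""
    return [max(0.0, f * random.gauss(1.0, err_sd)) for f in forecast]

# Generate many scenarios and size the reserve to cover the worst hourly
# shortfall in 95% of them -- one simple way to trade cost against reliability.
shortfalls = []
for _ in range(500):
    s = scenario(forecast_gw)
    shortfalls.append(max(f - a for f, a in zip(forecast_gw, s)))
shortfalls.sort()
print(f"reserve covering 95% of scenarios: {shortfalls[int(0.95 * len(shortfalls))]:.1f} GW")
```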
The 70 gigawatts offshore modeled in this study would be almost equal to the total U.S. wind power capacity installed on land through the end of 2016. Archer says that adding more offshore wind farms would lower consumers' electricity costs and reduce pollution by replacing coal and natural gas power plants. "We saw up to a 50 percent reduction in carbon and sulfur dioxide and up to a 40 percent reduction in nitrogen oxides emissions at the highest build-out level, a 70-gigawatt set of wind farms. Plus, the costs of electricity would go down every month except in July when air conditioning is at a peak," Archer said. "Wind power is a very good idea -- for people's health and their wallets." The research was supported by the U.S. Department of Energy. The two-part study is published in Renewable Energy. Part I, "The Challenge of Integrating Offshore Wind Power in the U.S. Electric Grid: Wind Forecast Error," was written by Cristina Archer, H. P. Simao, Willett Kempton, Warren Powell and M. J. Dvorak. Part II, "The Challenge of Integrating Offshore Wind Power in the U.S. Electric Grid: Simulation of Electricity Market Operations," was written by H.P. Simao, Warren Powell, Cristina Archer and Willett Kempton.


News Article | February 23, 2017
Site: phys.org

A current case in point is the burgeoning growth of additive manufacturing (AM)—the industrial equivalent of 3-D printing, in which complex structures are built up by the successive addition of layers, instead of either being assembled from separate components or starting with a solid block of material from which material is successively removed, sometimes using a number of machining tools, to produce the final part. AM is already in use for fabricating a wide range of devices, from medical implants to multi-material electronic components, precision fluid conduits, lamp constituents, fiber-optic connectors, and more.

But the method poses problems for defect detection and quality control: the exact dimensions and fit of a device's internal features cannot readily be evaluated without destroying the device. As a result, many manufacturers have turned to a technology called x-ray computed tomography (CT), long used in medical imaging but increasingly employed over the past 15 years to examine the dimensional characteristics of commercial products. At present, however, there are very few agreed-upon standards to evaluate a CT instrument's performance or verify the accuracy of its images.

That's why NIST entered into a Cooperative Research and Development Agreement (CRADA) with North Star Imaging (NSI) of Minnesota, a manufacturer of industrial digital x-ray and CT systems, which has loaned a CT unit to NIST for the three-year duration of the CRADA. During that time, NIST researchers can use the CT system to test measurements of candidate reference artifacts that could eventually be employed in standardized testing and calibration; at the same time, the NSI system can be characterized by exacting procedures at the nation's standards laboratory.

"Right now, we're mainly involved in developing very well described reference artifacts," says project scientist Meghan Shilling of NIST's Physical Measurement Laboratory. "We take an artifact designed to evaluate the performance of a CT system and measure it using our tactile-probe coordinate measuring machines, which have extremely well-established measurement accuracy. Then we put the artifacts in the CT system, measure them, and see how the data compare. One person on our team, who is part of the Engineering Laboratory at NIST, is making metal test structures using additive manufacturing, into which he intentionally leaves some voids, which can also be imaged using the CT system. At the same time, we're also working on characterizing North Star's machine, giving them technical feedback that may help improve their system design."

"The CRADA has been extremely valuable for NSI in characterizing the system for use in the refinement and enhancement of our CT system designs," says Tucker Behrns, Engineering Manager at NSI. "We have been able to gather a wealth of information through working alongside the NIST team while gaining unbiased feedback with a focus on metrological implications. The unique measurement knowledge and skills we have access to as a result of this agreement have allowed us to gain great depth in our understanding of the critical aspects of the machine function and performance."

A concurrent goal is to assist in the development of performance evaluation standards that can be promulgated worldwide. "Both NIST and NSI are active in standards organizations, including the International Organization for Standardization (ISO) and the American Society of Mechanical Engineers," Shilling says.
"Both are in the process of putting together standards for specifying CT systems. The only performance evaluation document that exists now for CT dimensional metrology is a German guideline, and the team that put together the guideline is also involved in drafting the ISO standard. Eventually, we also hope to be able to disseminate best practices and lessons learned about techniques and artifacts." CT works by projecting x-rays of appropriate energies through an object at successively varying angles. Different kinds of materials absorb or scatter more or fewer x-rays; so measuring the x-rays transmitted through a multi-featured object at different angles reveals its inner structure. In a typical medical CT scan, an x-ray source rotates continuously around the body, building up 2-D or 3-D images which reveal circulatory problems, tumors, bone irregularities, kidney and bladder stones, head injuries and many other conditions. X-ray CT for manufactured objects uses exactly the same principles. In the NSI instrument at NIST, a sample/test object is placed on a stage between the x-ray source and a detector plate. The sample revolves in a series of small angular increments around its vertical axis, and the x-ray beam passes through it, taking one frame of data at each position. Each measurement produces a single 2-D slice. Computer software integrates all of the slices and builds up a 3-D image. However, there are many complicating factors. For one thing, samples may contain both soft polymer parts and multiple hard metallic sections laid down in layers of melted or sintered powders. Each kind of material has an inherent attenuation coefficient (the ease with which x-rays pass through the material), that is dependent on the material composition and density as well as the energy spectrum of the x-ray source. NIST provides tables of x-ray mass attenuation coefficients for elements with atomic numbers from 1 to 92 for specific x-ray energies. But calculating the attenuation coefficient for multi-element compounds, such as plastics combined with metal, using a spectrum of x-ray energy, is a challenge. "We are able to vary the voltage and the current in the x-ray source," Shilling says, "and we can place various filters in front of the beam to adjust the x-ray spectrum that travels on to the target test object. So the system is very capable of measuring materials from plastics to steel." Depending on the customer's needs and the degree of detail that is wanted, a measurement run can range from half an hour to four hours or more. But how can the accuracy of those images be objectively evaluated? And what are the optimal ways to measure different materials and configurations? The answers are slowly emerging from scores of trials, and "developing the right settings is a bit of an art," Shilling says. Aside from adjusting the voltage and current in the x-ray beam and the filter material, both the distance between the x-ray source and the sample, and the sample and the detector, can be adjusted to achieve various effects. At the same time, Shilling and colleagues are also investigating aspects of the instrument that could potentially lead to measurement errors. "For example," she says, "as the vertical axis of the rotary table spins, we want to see how much the sample may move in other directions—up and down or side to side. That can affect the quality of the results. What we've been doing most recently is to characterize those motions on the most important axes of the machine." 
That effort requires sensitive capacitance gauges and laser interferometers that can detect extremely tiny changes in position. Those and other measurements will continue for about one more year under the terms of the CRADA. "At NSI," Behrns says, "we have seen a substantial increase in the use of additive manufacturing for production components across many of the major markets we serve. As our customers continue to expand the application of this technology, we believe that CT will play a crucial role in the identification and measurement of internal structures, which is not possible with traditional methods. Working with NIST has allowed us to accelerate the advancement of CT measurement technology so that we can continue to improve our ability to serve this rapidly expanding market."
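Returning to the attenuation question raised above: at a single x-ray energy, the mass attenuation coefficient of a compound or mixture is commonly estimated from elemental values by a mass-weighted sum (the mixture rule), which is what makes the NIST elemental tables useful. The snippet below illustrates the idea with placeholder coefficient values, not tabulated ones; transmission through a thickness t then follows the Beer-Lambert law, I = I0·exp(−(μ/ρ)·ρ·t).

```python
# Mixture rule: (mu/rho)_compound = sum_i w_i * (mu/rho)_i, where w_i is the
# mass fraction of element i. Coefficients below are placeholders; real values
# come from the NIST tables at a chosen x-ray energy.
def compound_mu_rho(mass_fractions, mu_rho_elements):
    """Mass attenuation coefficient (cm^2/g) of a mixture at one energy."""
    assert abs(sum(mass_fractions.values()) - 1.0) < 1e-6
    return sum(w * mu_rho_elements[el] for el, w in mass_fractions.items())

# Hypothetical example: a polymer-metal composite treated as 70% carbon-like
# polymer and 30% aluminum, at some fixed energy.
mu_rho = {"C": 0.20, "Al": 0.35}   # cm^2/g, illustrative numbers only
w = {"C": 0.70, "Al": 0.30}
print(f"compound mu/rho ~ {compound_mu_rho(w, mu_rho):.3f} cm^2/g")
```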


News Article | October 5, 2016
Site: www.nanotech-now.com

NIST-made 'sun and rain' used to study nanoparticle release from polymers

If the 1967 film "The Graduate" were remade today, Mr. McGuire's famous advice to young Benjamin Braddock would probably be updated to "Plastics ... with nanoparticles." These days, the mechanical, electrical and durability properties of polymers--the class of materials that includes plastics--are often enhanced by adding miniature particles (smaller than 100 nanometers, or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water--and if so, what might be the health and ecological consequences?

In a recently published paper, researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts.

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV-reflective material that bears a casual resemblance to the Death Star in the film "Star Wars." For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit), with one group done in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity).

To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed "NIST simulated rain." Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff--with any loose nanoparticles--was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run, and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence of silicon and in what amounts. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure.

Both sets of coating samples--those weathered in very low humidity and the others in very humid conditions--degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.
"These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them," said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research. This project resulted from a collaboration between NIST's Engineering Laboratory and Material Measurement Laboratory. It is part of NIST's work to help characterize the potential environmental, health and safety (EHS) risks of nanomaterials, and develop methods for identifying and measuring them. For more information, please click If you have a comment, please us. Issuers of news releases, not 7th Wave, Inc. or Nanotechnology Now, are solely responsible for the accuracy of the content.




Yet as large as these databases are, they contain just a fraction of the information and knowledge needed to rapidly discover or design new materials that can have a transformative impact on advancing technologies that solve pressing social and economic problems. Part of the obstacle is that databases lack the ability to collect and interpret visual data such as graphs and images from countless scientific studies, handbooks and other publications. This limitation creates a bottleneck that often slows the materials discovery process to a crawl.

That will soon change. The University at Buffalo has received a $2.9 million National Science Foundation (NSF) grant to transform the traditional role of a database as a repository for information into an automated computer laboratory that rapidly collects, interprets and learns from massive amounts of information. The lab, which also will conduct large-scale materials modeling and simulations based upon untapped troves of visual data, will be accessible to the scientific community and ultimately speed up and reduce the cost of discovering, manufacturing and commercializing new materials—goals of the White House's Materials Genome Initiative.

"This pioneering and multidisciplinary approach to advanced materials research will provide the scientific community with tools it needs to accelerate the pace of discovery, leading to greater economic security and a wide range of societal benefits," said Venu Govindaraju, PhD, UB's vice president for research and economic development. Govindaraju, SUNY Distinguished Professor of Computer Science and Engineering, is the grant's principal investigator. Co-principal investigators, all from UB, are: Krishna Rajan, ScD, Erich Bloch Endowed Chair of the Department of Materials Design and Innovation (MDI); Thomas Furlani, PhD, director of the Center for Computational Research; Srirangaraj "Ranga" Setlur, principal research scientist; and Scott Broderick, PhD, research assistant professor in MDI.

The award, from NSF's Data Infrastructure Building Blocks (DIBBS) program, draws upon UB's expertise in artificial intelligence, specifically its groundbreaking work that began in the 1980s to enable machines to read human handwriting, work that has saved postal organizations billions of dollars in the U.S. and worldwide. UB will use the DIBBS grant to create what it's calling the Materials Data Engineering Laboratory at UB (MaDE @UB). The lab will introduce the tools of machine intelligence—such as machine learning, pattern recognition, materials informatics and modeling, high-performance computing and other cutting-edge technologies—to transform data libraries into a laboratory that not only stores and searches for information but also predicts and processes information to discover materials that transform how society addresses climate change, national security and other pressing issues.

"Essentially, we're creating a system—a smart robot—with cognitive skills for scientific interpretation of text, graphs and images," said Rajan of MDI, a collaboration between UB's School of Engineering and Applied Sciences and the College of Arts and Sciences launched in 2014 to apply information science methods to advanced materials research. He added: "This machine intelligence driven approach will open a new trajectory of data-intensive materials science research impacting both computational and experimental studies."
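The article doesn't detail MaDE @UB's methods; as a generic illustration of the materials-informatics step it describes (learning to predict a property from data already in a library), here is a minimal sketch using scikit-learn, with made-up composition descriptors and property values that are not from any real database.

```python
# Minimal materials-informatics sketch: learn a property from composition
# descriptors. All features and target values are invented for illustration.
from sklearn.ensemble import RandomForestRegressor

# Each row: hypothetical descriptors for one material, e.g. mean atomic number,
# mean atomic radius (angstroms), electronegativity difference.
X = [
    [13.0, 1.43, 0.0],   # "Al-like"
    [26.0, 1.26, 0.2],   # "Fe-like"
    [22.4, 1.32, 1.1],   # "oxide-like"
    [14.0, 1.17, 0.8],   # "Si-like"
]
y = [660.0, 1538.0, 1843.0, 1414.0]  # invented target, e.g. melting point (C)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
candidate = [[20.0, 1.30, 0.9]]  # a new, hypothetical composition
print(f"predicted property: {model.predict(candidate)[0]:.0f}")
```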
The lab builds upon significant investments UB has made in recent years to build a hub for advanced manufacturing in Western New York.
