BAODING, China, Nov 23, 2016 /PRNewswire/ -- Yingli Green Energy Holding Company Ltd. today announced that The Clean Production Evaluation Index System for PV Cells (the "System") was recently officially released by the National Development and Reform Commission ("NDRC"), the Ministry of Environmental Protection, and the Ministry of Industry and Information Technology ("MIIT"). The System was jointly developed by the China National Institute of Standardization ("CNIS"), Yingli, the World Wide Fund for Nature ("WWF"), and other leading Chinese PV companies. The establishment of the System was one of the commitments Yingli made when it joined the WWF Climate Savers Program in 2013 as the first Chinese company and the first solar PV manufacturer to do so. To promote clean production in China's PV industry, CNIS, Yingli, WWF and other leading PV companies began developing the System in September 2013. Drawing on the extensive clean production technology and practices at Yingli and other tier-one solar PV companies, the participants researched and reviewed current practices and finalized the System design in June 2016. As the first evaluation system of its kind for China's solar PV industry, the System establishes comprehensive evaluation indexes for the production of solar PV cells, covering the production process, equipment used, energy consumption, comprehensive utilization of resources, pollution emissions, product features, and clean production management. The System is a significant step forward for the promotion of clean production among China's solar PV cell manufacturers and will support improved energy conservation, pollution prevention, and sustainable development, leading to a more environmentally friendly Chinese solar PV industry. "Technical standards are one of the most important measures to promote supply-side structural reform, optimize industrial restructuring, and enhance the capability of independent innovation," said Mr. Wang Bohua, Secretary General of the China Photovoltaic Industry Association. "The establishment of the System could help China's solar PV cell manufacturers promote cleaner production, improve energy conservation and reduce emissions. The System is crucial for the sustainable development of China's solar PV cell production and the construction of an environmentally friendly society." "The official release of this evaluation index system is a very good example of how business can drive change to make the solar PV industry greener," said Ms. Chen Xin, Co-Acting Director of the Climate & Energy Program of WWF China. "In line with the goals of the Climate Savers Program, WWF is committed to encouraging more industry leaders to engage with climate and energy issues and to positively influence markets, industries, and policies through their technical expertise and innovation." "The establishment of the System is essential for the Chinese solar PV industry to transition to a greener economy and grow stronger," said Mr. Gao Dongfeng, Director of the Branch of Resource and Environment of CNIS. "We hope that China's solar PV industry will take this opportunity to accelerate technical innovation, strengthen itself, and lead the advancement of clean production worldwide." "The System sets a series of world-leading indicators for the production of solar PV cells.
Once these indicators are achieved, the System can help China's solar PV cell manufacturers reduce their energy consumption by approximately 20% and cut emissions of nitrogen oxide and chemical oxygen demand by 14% each, shortening the energy recovery period of a solar PV panel from 1.17 years to 1 year, a reduction of nearly 15%," said Mr. Song Dengyuan, Chief Technology Officer of Yingli. "It will make the solar PV industry greener and promote the concept of clean production throughout the industry." Yingli Green Energy Holding Company Limited (NYSE: YGE), known as "Yingli" or "Yingli Solar", is one of the world's leading solar panel manufacturers. Yingli's manufacturing covers the photovoltaic value chain from ingot casting and wafering through solar cell production and solar panel assembly. Headquartered in Baoding, China, Yingli has more than 30 regional subsidiaries and branch offices and has delivered more than 17 GW of solar panels to customers worldwide. For more information, please visit www.yinglisolar.com and join the conversation on Facebook, Twitter and Weibo. Climate Savers is a global program that positions companies as leaders of the low-carbon economy. Member companies take on two commitments: to become the best in reducing greenhouse gas emissions, and to influence market or policy developments by promoting their vision, solutions and achievements. By 2015, 30 well-known global corporations had joined the program, including Yingli, HP, Johnson & Johnson, Volvo Group, Coca-Cola, and Vanke. About the Branch of Resource and Environment of CNIS: The Branch of Resource and Environment of the China National Institute of Standardization is committed to resource and environment standardization research and to providing policy research and technical services for energy conservation and emission reduction. Its major tasks cover standardization for energy saving, low-carbon development, resource recycling, the environmental protection industry, water conservation, and renewable energy. It has led the drafting of more than 300 national standards, including the general formulation principles for cleaner production assessment indicator systems; it operates the secretariats of the relevant national technical committees, the Administration Center of China Energy Label, the Management Office for the National Subsidy Program for End-use Products, and the Management Office for the R&D Project of 100 Energy Efficiency Standards, as well as the secretariats of the China Alliance of Green Product Promotion and the China Alliance of Green Design and Manufacturing Industry Innovation; and it built and operates the AQSIQ Engineering Laboratory for Energy Efficiency and Water Efficiency, which is in charge of energy efficiency testing of energy-using products and water efficiency testing of water-using products. After 30 years of development, the Branch of Resource and Environment has established itself as a comprehensive resource and environment standardization research and development base incorporating standardization research, testing, management, consulting and services. It has been providing a full range of standardization support and services for the sustainable development of China, and will continue to contribute to building a resource-conserving, environmentally friendly society and promoting the development of a circular economy. This press release contains forward-looking statements.
These statements constitute "forward-looking" statements within the meaning of Section 21E of the Securities Exchange Act of 1934, as amended, and as defined in the U.S. Private Securities Litigation Reform Act of 1995. These forward-looking statements can be identified by terminology such as "will," "expects," "anticipates," "future," "intends," "plans," "believes," "estimates," "target" and similar statements. Such statements are based upon management's current expectations and current market and operating conditions, and relate to events that involve known or unknown risks, uncertainties and other factors, all of which are difficult to predict and many of which are beyond Yingli Green Energy's control, which may cause Yingli Green Energy's actual results, performance or achievements to differ materially from those in the forward-looking statements. Further information regarding these and other risks, uncertainties or factors is included in Yingli Green Energy's filings with the U.S. Securities and Exchange Commission. Yingli Green Energy does not undertake any obligation to update any forward-looking statement as a result of new information, future events or otherwise, except as required under applicable law. For further information, please contact: To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/yingli-leads-the-development-of-the-first-clean-production-evaluation-system-for-chinas-pv-industry-300367833.html


News Article | August 23, 2016
Site: www.greencarcongress.com

With funding from the Bioenergy Technologies Office (BETO), Pacific Northwest National Laboratory (PNNL) has been working with industry partner LanzaTech to convert alcohols derived from captured carbon monoxide, a byproduct in the production of steel, into synthetic paraffinic kerosene, a non-fossil-based jet fuel. The technology not only provides a viable source of sustainable jet fuel but also reduces the amount of greenhouse gases emitted into the atmosphere. The team recently reached a significant milestone on the project, producing over five gallons of synthetic paraffinic kerosene in a lab environment. Five gallons is the quantity needed for “fit-for-purpose” testing. The accomplishment is part of the technology transfer process moving from bench top at PNNL to piloting at Freedom Pines, Georgia. With tight ASTM fuel specification requirements, both the demonstration of the process and the fit-for-purpose testing need to take place before the technology can be adopted commercially. The process works in two stages. First, LanzaTech captures waste carbon from refineries and manufacturing plants and feeds the CO-rich gas to microbes that consume the gas and produce ethanol. By itself, ethanol is unsuitable for jet engines because it contains oxygen and simply doesn’t have the energy needed. Even after removing the oxygen, the resulting molecule, ethylene, is a gas and doesn’t have the properties needed for modern turbofan engines. Consequently, during the second stage of the process, the ethanol is run through a PNNL-developed catalyst that converts it to jet fuel by removing the oxygen and combining the resulting hydrocarbons, a process known as dehydration-oligomerization. However, ethanol dehydration-oligomerization technologies are typically difficult to control. To overcome the challenge, PNNL borrowed technology it developed to convert methanol to gasoline and created a new, specialized catalyst. The catalyst first removes water from the ethanol (dehydration), leaving behind ethylene. The small ethylene hydrocarbons are then combined (oligomerization) to form hydrocarbon chains large enough for jet fuel without forming aromatics that lead to sooting when burned. The fuel meets all the specifications required for use in commercial aviation—not an easy thing to do. Producing 5 gallons of jet fuel at PNNL’s Bioproducts, Sciences, and Engineering Laboratory demonstrates the catalyst is ready for the intermediate stage in testing. Researchers will now seek to produce 2,000 gallons of jet fuel. As PNNL successfully meets milestones, LanzaTech will scale up further. In 2011, PNNL, LanzaTech and Imperium jointly responded to a DOE call for proposals for a new aviation biofuel research project to convert biomass-derived alcohols to drop-in renewable jet fuel. Also in 2011, LanzaTech received a US$3-million contract from the United States Federal Aviation Administration (FAA), through the Department of Transportation’s John A. Volpe Center, to accelerate commercial availability of alcohol-to-jet (ATJ) renewable drop-in aviation fuel. (Earlier post.) LanzaTech has also been working with the UK’s largest bank, HSBC, and Virgin Atlantic on the development of the process that captures waste gases from industrial steel production and ferments them to ethanol, which is then chemically converted for use as jet fuel. (Earlier post.)
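To make the two-stage chemistry concrete, the rough mass-balance sketch below follows ethanol through dehydration and oligomerization. The molecular weights are standard, but the assumption that six ethylene units join into a single C12 jet-range hydrocarbon, and the neglect of catalyst selectivity and process losses, are illustrative simplifications rather than PNNL or LanzaTech process data.

```python
# Rough mass balance for the two-stage ethanol-to-jet route described above.
# Illustrative only: real yields depend on catalyst selectivity and process losses.

MW_ETHANOL = 46.07      # g/mol, C2H5OH
MW_ETHYLENE = 28.05     # g/mol, C2H4 (ethanol minus water)
MW_WATER = 18.02        # g/mol
MW_C12_OLEFIN = 168.32  # g/mol, C12H24 (six ethylene units joined, assumed chain length)

def jet_fuel_from_ethanol(kg_ethanol: float) -> dict:
    """Theoretical masses through dehydration, then oligomerization."""
    mol_ethanol = kg_ethanol * 1000 / MW_ETHANOL
    # Dehydration: C2H5OH -> C2H4 + H2O (1:1)
    mol_ethylene = mol_ethanol
    kg_water = mol_ethanol * MW_WATER / 1000
    # Oligomerization: 6 C2H4 -> C12H24 (assumed product)
    mol_c12 = mol_ethylene / 6
    kg_c12 = mol_c12 * MW_C12_OLEFIN / 1000
    return {"kg_ethylene": mol_ethylene * MW_ETHYLENE / 1000,
            "kg_water_removed": kg_water,
            "kg_jet_range_hydrocarbon": kg_c12,
            "mass_yield": kg_c12 / kg_ethanol}

print(jet_fuel_from_ethanol(100.0))
# Roughly 61% of the ethanol mass ends up in the hydrocarbon; the rest leaves as water.
```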


News Article | February 21, 2017
Site: www.eurekalert.org

Injecting large amounts of offshore wind power into the U.S. electrical grid is manageable, will cut electricity costs, and will reduce pollution compared to current fossil fuel sources, according to researchers from the University of Delaware and Princeton University who have completed a first-of-its-kind simulation with the electric power industry. The researchers consulted with PJM Interconnection -- a grid operator supplying electricity to more than 60 million people in 14 states -- to develop a computer model that simulates how the electric grid would respond to injections of wind power from offshore wind farms along the East Coast at five build-out levels, between 7 and 70 gigawatts of installed capacity. The two-part study is published in the journal Renewable Energy. One hurdle grid operators face is how to integrate increasing amounts of naturally fluctuating offshore wind into a network that has to deliver reliable power to customers 24/7. The UD and Princeton team showed conservatively that, with some upgrades to transmission lines but without any need for added storage, the PJM grid can handle over 35 gigawatts of offshore wind--that's 35 billion watts--enough to power an estimated 10 million homes. They also found that the PJM grid could in the future handle twice that amount, up to 70 gigawatts, as wind forecasting improves, allowing the power operator to better predict and harness more wind. "Our goal was to replicate this very human-made energy system under all kinds of scenarios," said Cristina Archer, associate professor of physical ocean science and engineering at the University of Delaware. "What would you do as a grid operator if you thought it was going to be windy today and it isn't, or if the wind storm arrives earlier than expected? We simulated the entire PJM grid, with each power plant and each wind farm in it, old and new, every five minutes. As far as we know, this is the first model that does this." From her office in UD's Harker Interdisciplinary Science and Engineering Laboratory, Archer led the team's efforts to generate realistic offshore wind forecasts based on real wind farm data from land-based systems, which colleagues at Princeton then incorporated into their model of the PJM electric power system. The team used stochastic modeling, running hundreds of forecasts with various tweaks in conditions, to realistically represent the fluctuating and sometimes unpredictable behavior of wind. The model of PJM, called SMART-ISO, created at Princeton, is designed to handle both the variability and uncertainty of growing inputs of offshore wind energy, simulating what happens over an extensive power grid with more than 60,000 miles of transmission lines. "The uncertainty of wind will require that we develop strategies to minimize the need for spinning reserve," said Warren Powell, professor and lead researcher at Princeton in charge of the SMART-ISO model, referring to electric generators that need to keep "spinning" and be ready for any electricity shortage. "Although we found that reserves were needed -- 21 percent of the 70 gigawatt wind capacity -- there are a number of strategies that could be investigated to better handle the variability as wind grows in the future." The first U.S. offshore wind farm, consisting of five wind turbines at Block Island, Rhode Island, with a generating capacity of 30 megawatts, had not yet been built when the researchers began their study five years ago.
The 70 gigawatts of offshore wind modeled in this study would be almost equal to the total U.S. wind power capacity installed on land through the end of 2016. Archer says that adding more offshore wind farms would lower consumers' electricity costs and reduce pollution by replacing coal and natural gas power plants. "We saw up to a 50 percent reduction in carbon and sulfur dioxide and up to a 40 percent reduction in nitrogen oxides emissions at the highest build-out level, a 70-gigawatt set of wind farms. Plus, the costs of electricity would go down every month except in July when air conditioning is at a peak," Archer said. "Wind power is a very good idea -- for people's health and their wallets." The research was supported by the U.S. Department of Energy. The two-part study is published in Renewable Energy. Part I, "The Challenge of Integrating Offshore Wind Power in the U.S. Electric Grid: Wind Forecast Error," was written by Cristina Archer, H. P. Simao, Willett Kempton, Warren Powell and M. J. Dvorak. Part II, "The Challenge of Integrating Offshore Wind Power in the U.S. Electric Grid: Simulation of Electricity Market Operations," was written by H. P. Simao, Warren Powell, Cristina Archer and Willett Kempton.
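The 21 percent reserve figure above comes out of detailed SMART-ISO runs, but the basic idea of sizing spinning reserve against wind forecast error can be illustrated with a toy Monte Carlo calculation like the one below. The assumed forecast, error spread and 99th-percentile coverage criterion are arbitrary choices for illustration, not parameters from the study.

```python
# Toy illustration of sizing spinning reserve against offshore wind forecast error.
# Assumed numbers (forecast, error spread, 99th-percentile criterion) are NOT from the study.
import numpy as np

rng = np.random.default_rng(0)

installed_gw = 70.0      # build-out level considered in the paper
forecast_gw = 30.0       # hypothetical day-ahead wind forecast for one hour
error_std_frac = 0.15    # assumed forecast error spread, as a fraction of capacity

# Simulate many possible realizations of actual wind output for that hour.
errors = rng.normal(0.0, error_std_frac * installed_gw, size=100_000)
actual = np.clip(forecast_gw + errors, 0.0, installed_gw)

# Shortfall that other generators must cover when wind underperforms its forecast.
shortfall = np.maximum(forecast_gw - actual, 0.0)

# Size reserve to cover the shortfall in 99% of simulated outcomes.
reserve_gw = np.percentile(shortfall, 99)
print(f"Reserve needed: {reserve_gw:.1f} GW "
      f"({100 * reserve_gw / installed_gw:.0f}% of installed capacity)")
```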


News Article | February 16, 2017
Site: www.eurekalert.org

Headaches and backaches also come with the job, which lacks ergonomic guidelines. COLUMBUS, Ohio -- Getting a tattoo may hurt, but giving one is no picnic, either. That's the finding of the first study ever to directly measure the physical stresses that lead to aches and pains in tattoo artists--workers who support a multibillion-dollar American industry, but who often don't have access to workers' compensation if they get injured. Researchers at The Ohio State University measured the muscle exertions of 10 central Ohio tattoo artists while they were working, and found that all of them exceeded maximums recommended to avoid injury, especially in the muscles of their upper back and neck. In the journal Applied Ergonomics, the researchers presented their findings and offered some suggestions on how tattoo artists can avoid injury. The study was unique, explained Carolyn Sommerich, director of the Engineering Laboratory for Human Factors/Ergonomics/Safety at Ohio State. She and former master's student Dana Keester spent a summer "hanging out in tattoo parlors with our EMG equipment, cameras and a tripod," observing artists who agreed to work while wearing electrodes that precisely measured their muscle activity. The electrodes gathered data for 15 seconds every 3 minutes for the entirety of each tattoo session. Though a single tattoo session can last as long as 8 hours depending on the size and complexity of the tattoo, the sessions used in the study lasted anywhere from 1 to 3 hours. In addition, the researchers used a standardized observational assessment tool to assess each artist's posture every five minutes and took a picture to document each observation. To the researchers' knowledge, this is the first time that anyone has gathered such data from tattoo artists at work. To Keester, some reasons for the artists' discomfort were immediately obvious. She noted that they sit for prolonged periods of time, often taking a posture just like the one immortalized in Norman Rockwell's painting "Tattoo Artist"--they perch on low stools, lean forward, and crane their neck to keep their eyes close to the tattoo they're creating. All 10 tattoo artists exceeded recommended exertion limits in at least one muscle group. Most notable was the strain on their trapezius muscles--upper back muscles that connect the shoulder blades to either side of the neck, a common site for neck/shoulder pain. Some exceeded limits by as much as 25 percent, putting them at high risk for injury. Those findings mesh well with a prior survey of tattoo artists that Keester carried out at the Hell City Tattoo Festival in Columbus, Ohio, in 2014. Among the 34 artists surveyed, the most common complaints were back pain (94 percent), headache (88 percent), neck pain (85 percent) and eye pain (74 percent). Tattoo artists suffer ailments similar to those experienced by dentists and dental hygienists, the researchers concluded. Like dental workers, tattoo artists perform detailed work with their hands while leaning over clients. But, unlike dental workers, tattoo artists in the United States lack a national organization that sets ergonomic guidelines for avoiding injury. One of the main problems is that the industry doesn't have specialized seating to support both the artist and the client, said Sommerich. "There's no such thing as an official 'tattoo chair,' so artists adapt dental chairs or massage tables to make a client comfortable, and then they hunch over the client to create the tattoo," Sommerich said.
Adding to the problem is the fact that many tattoo artists are independent contractors who rent studio space from shop owners, so they're not covered by workers' compensation if they get hurt on the job, Keester said. Despite these challenges, the Ohio State researchers came up with some suggestions that may help artists avoid injury. Artists could experiment with different kinds of chairs for themselves, and try to support their back and arms. They could change positions while they work, take more frequent breaks and use a mounted magnifying glass to see their work instead of leaning in. They can also consider asking the client to move into a position that is comfortable for both the client and the tattoo artist, Sommerich added. "If the client can stand or maybe lean on something while the artist sits comfortably, that may be a good option," she said. "Switch it up once in a while." In the United States, tattooing is a $2.3 billion industry. A 2016 Harris Poll found that a third of Americans have at least one tattoo, and an IBISWorld report estimated that the industry is growing at around 13 percent per year. The National Institute for Occupational Safety and Health provided funding for Keester's graduate studies.
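As a rough illustration of the kind of analysis behind those exertion findings, the sketch below follows the study's sampling cadence (a 15-second EMG window every 3 minutes) and counts how often sampled muscle activity exceeds a recommended limit. The 20 percent MVC threshold and the simulated readings are hypothetical placeholders, not values from the paper.

```python
# Illustrative check of how often sampled muscle exertion exceeds a recommended limit.
# The 20% MVC threshold and the simulated readings are placeholders, not study data.
import numpy as np

rng = np.random.default_rng(1)

SESSION_MIN = 120        # a 2-hour tattoo session (study sessions ran 1 to 3 hours)
SAMPLE_EVERY_MIN = 3     # one 15-second EMG window every 3 minutes, as in the study
LIMIT_PCT_MVC = 20.0     # hypothetical limit, % of maximum voluntary contraction

n_windows = SESSION_MIN // SAMPLE_EVERY_MIN
# Simulated mean trapezius activity per window, as % MVC (placeholder distribution).
trapezius_pct_mvc = rng.normal(loc=18.0, scale=6.0, size=n_windows).clip(min=0)

over_limit = trapezius_pct_mvc > LIMIT_PCT_MVC
print(f"{n_windows} windows sampled; {over_limit.sum()} exceeded the "
      f"{LIMIT_PCT_MVC:.0f}% MVC limit ({over_limit.mean():.0%} of the session); "
      f"peak window activity was {trapezius_pct_mvc.max():.1f}% MVC")
```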


News Article | October 26, 2016
Site: www.eurekalert.org

Computers will not only collect and store data, they will 'interpret' and 'learn' from data, accelerating the discovery of new materials. BUFFALO, N.Y. -- Scientists are using supercomputers and other technologies to create ever-growing libraries of data on the properties of metals, polymers, ceramics and other materials. Yet as large as these databases are, they contain just a fraction of the information and knowledge needed to rapidly discover or design new materials that can have a transformative impact on advancing technologies that solve pressing social and economic problems. Part of the problem is that databases lack the ability to collect and interpret visual data such as graphs and images from countless scientific studies, handbooks and other publications. This limitation creates a bottleneck that often slows the materials discovery process to a crawl. That will soon change. The University at Buffalo has received a $2.9 million National Science Foundation (NSF) grant to transform the traditional role of a database as a repository for information into an automated computer laboratory that rapidly collects, interprets and learns from massive amounts of information. The lab, which will also conduct large-scale materials modeling and simulations based upon untapped troves of visual data, will be accessible to the scientific community and will ultimately speed up and reduce the cost of discovering, manufacturing and commercializing new materials -- goals of the White House's Materials Genome Initiative. "This pioneering and multidisciplinary approach to advanced materials research will provide the scientific community with tools it needs to accelerate the pace of discovery, leading to greater economic security and a wide range of societal benefits," said Venu Govindaraju, PhD, UB's vice president for research and economic development. Govindaraju, SUNY Distinguished Professor of Computer Science and Engineering, is the grant's principal investigator. Co-principal investigators, all from UB, are: Krishna Rajan, ScD, Erich Bloch Endowed Chair of the Department of Materials Design and Innovation (MDI); Thomas Furlani, PhD, director of the Center for Computational Research; Srirangaraj "Ranga" Setlur, principal research scientist; and Scott Broderick, PhD, research assistant professor in MDI. The award, from NSF's Data Infrastructure Building Blocks (DIBBS) program, draws upon UB's expertise in artificial intelligence, specifically its groundbreaking work that began in the 1980s to enable machines to read human handwriting. The work has saved postal organizations billions of dollars in the U.S. and worldwide. UB will use the DIBBS grant to create what it's calling the Materials Data Engineering Laboratory at UB (MaDE @UB). The lab will introduce the tools of machine intelligence -- such as machine learning, pattern recognition, materials informatics and modeling, high-performance computing and other cutting-edge technologies -- to transform data libraries into a laboratory that not only stores and searches for information but also predicts and processes information to discover materials that transform how society addresses climate change, national security and other pressing issues.
"Essentially, we're creating a system -- a smart robot -- with cognitive skills for scientific interpretation of text, graphs and images, " said Rajan of MDI, a collaboration between UB's School of Engineering and Applied Sciences and the College of Arts and Sciences launched in 2014 to apply information science methods to advanced materials research. He added: "This machine intelligence driven approach will open a new trajectory of data-intensive materials science research impacting both computational and experimental studies." The lab builds upon significant investments UB has made in recent years to build a hub for advanced manufacturing in Western New York. These efforts include UB's New York State Center of Excellence in Materials Informatics (CMI), the UB Community of Excellence in Sustainable Manufacturing and Advanced Robotic Technologies (SMART), partnering with Buffalo Manufacturing Works (BMW) and the Digital Manufacturing and Design Innovation Institute (DMDII) and other endeavors.


News Article | February 23, 2017
Site: phys.org

A current case in point is the burgeoning growth of additive manufacturing (AM)—the industrial equivalent of 3-D printing in which complex structures are built up by the successive addition of layers, instead of either assembling them from separate components or starting with a solid block of material from which material is successively removed, sometimes using a number of machining tools, to produce the final part. AM is already in use for fabricating a wide range of devices from medical implants to multi-material electronic components, precision fluid conduits, lamp constituents, fiber-optic connectors, and more. But the method poses problems for defect detection and quality control: The exact dimensions and fit of a device's internal features cannot readily be evaluated without destroying the device. As a result, many manufacturers have turned to a technology called x-ray computed tomography (CT), long used in medical imaging but increasingly employed over the past 15 years to examine the dimensional characteristics of commercial products. At present, however, there are very few agreed-upon standards to evaluate a CT instrument's performance or verify the accuracy of its images. That's why NIST entered into a Cooperative Research and Development Agreement (CRADA) with North Star Imaging (NSI) of Minnesota, a manufacturer of industrial digital x-ray and CT systems, which has loaned a CT unit to NIST for the three-year duration of the CRADA. During that time, NIST researchers can use the CT system to test measurements of candidate reference artifacts that could eventually be employed in standardized testing and calibration; at the same time, the NSI system can be characterized by exacting procedures at the nation's standards laboratory. "Right now, we're mainly involved in developing very well described reference artifacts," says project scientist Meghan Shilling of NIST's Physical Measurement Laboratory. "We take an artifact designed to evaluate the performance of a CT system and measure it using our tactile-probe coordinate measuring machines, which have extremely well-established measurement accuracy. "Then we put the artifacts in the CT system, measure them, and see how the data compare. One person on our team, who is part of the Engineering Laboratory at NIST, is making metal test structures using additive manufacturing, into which he intentionally leaves some voids, which can also be imaged using the CT system. At the same time, we're also working on characterizing North Star's machine, giving them technical feedback that may help improve their system design." "The CRADA has been extremely valuable for NSI in characterizing the system for use in the refinement and enhancement of our CT system designs," says Tucker Behrns, Engineering Manager at NSI. "We have been able to gather a wealth of information through working alongside the NIST team while gaining unbiased feedback with a focus on metrological implications. The unique measurement knowledge and skills we have access to as a result of this agreement have allowed us to gain great depth in our understanding of the critical aspects of the machine function and performance." A concurrent goal is to assist in the development of performance evaluation standards that can be promulgated worldwide. "Both NIST and NSI are active in standards organizations, including the International Organization for Standardization (ISO) and the American Society of Mechanical Engineers," Shilling says. 
"Both are in the process of putting together standards for specifying CT systems. The only performance evaluation document that exists now for CT dimensional metrology is a German guideline, and the team that put together the guideline is also involved in drafting the ISO standard. Eventually, we also hope to be able to disseminate best practices and lessons learned about techniques and artifacts." CT works by projecting x-rays of appropriate energies through an object at successively varying angles. Different kinds of materials absorb or scatter more or fewer x-rays; so measuring the x-rays transmitted through a multi-featured object at different angles reveals its inner structure. In a typical medical CT scan, an x-ray source rotates continuously around the body, building up 2-D or 3-D images which reveal circulatory problems, tumors, bone irregularities, kidney and bladder stones, head injuries and many other conditions. X-ray CT for manufactured objects uses exactly the same principles. In the NSI instrument at NIST, a sample/test object is placed on a stage between the x-ray source and a detector plate. The sample revolves in a series of small angular increments around its vertical axis, and the x-ray beam passes through it, taking one frame of data at each position. Each measurement produces a single 2-D slice. Computer software integrates all of the slices and builds up a 3-D image. However, there are many complicating factors. For one thing, samples may contain both soft polymer parts and multiple hard metallic sections laid down in layers of melted or sintered powders. Each kind of material has an inherent attenuation coefficient (the ease with which x-rays pass through the material), that is dependent on the material composition and density as well as the energy spectrum of the x-ray source. NIST provides tables of x-ray mass attenuation coefficients for elements with atomic numbers from 1 to 92 for specific x-ray energies. But calculating the attenuation coefficient for multi-element compounds, such as plastics combined with metal, using a spectrum of x-ray energy, is a challenge. "We are able to vary the voltage and the current in the x-ray source," Shilling says, "and we can place various filters in front of the beam to adjust the x-ray spectrum that travels on to the target test object. So the system is very capable of measuring materials from plastics to steel." Depending on the customer's needs and the degree of detail that is wanted, a measurement run can range from half an hour to four hours or more. But how can the accuracy of those images be objectively evaluated? And what are the optimal ways to measure different materials and configurations? The answers are slowly emerging from scores of trials, and "developing the right settings is a bit of an art," Shilling says. Aside from adjusting the voltage and current in the x-ray beam and the filter material, both the distance between the x-ray source and the sample, and the sample and the detector, can be adjusted to achieve various effects. At the same time, Shilling and colleagues are also investigating aspects of the instrument that could potentially lead to measurement errors. "For example," she says, "as the vertical axis of the rotary table spins, we want to see how much the sample may move in other directions—up and down or side to side. That can affect the quality of the results. What we've been doing most recently is to characterize those motions on the most important axes of the machine." 
That effort requires sensitive capacitance gauges and laser interferometers that can detect extremely tiny changes in position. Those and other measurements will continue for about one more year under the terms of the CRADA. "At NSI," Behrns says, "we have seen a substantial increase in the use of additive manufacturing for production components across many of the major markets we serve. As our customers continue to expand the application of this technology, we believe that CT will play a crucial role in the identification and measurement of internal structures which is not possible with traditional methods. Working with NIST has allowed us to accelerate the advancement of CT measurement technology so that we can continue to improve our ability to serve this rapidly expanding market."
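For readers who want to see how the mixture-rule calculation mentioned above works in practice, the short sketch below combines elemental mass attenuation coefficients into a compound value at a single x-ray energy and then applies the Beer-Lambert law to estimate transmission. The coefficient, composition, density and thickness values are illustrative placeholders rather than NIST table entries, and a real calculation would integrate over the source's full energy spectrum, which is precisely what makes the problem hard.

```python
# Sketch of the additivity (mixture) rule for mass attenuation coefficients at one
# x-ray energy, followed by Beer-Lambert transmission through a part of known thickness.
# Coefficient, composition, density and thickness values are placeholders, not NIST data.
import math

def mixture_mass_attenuation(mass_fractions, elemental_mu_rho):
    """(mu/rho)_mix = sum_i w_i * (mu/rho)_i, with w_i the elemental mass fractions."""
    assert abs(sum(mass_fractions.values()) - 1.0) < 1e-6
    return sum(w * elemental_mu_rho[el] for el, w in mass_fractions.items())

# Hypothetical polymer-metal composite at a single (monochromatic) x-ray energy.
mu_rho = {"C": 0.15, "H": 0.29, "O": 0.16, "Al": 0.20}      # cm^2/g, placeholder values
composition = {"C": 0.50, "H": 0.07, "O": 0.23, "Al": 0.20}  # mass fractions, placeholder

mu_rho_mix = mixture_mass_attenuation(composition, mu_rho)   # cm^2/g
density = 1.8       # g/cm^3, placeholder bulk density of the composite
thickness = 2.0     # cm of material along the beam path

mu_linear = mu_rho_mix * density                  # 1/cm
transmission = math.exp(-mu_linear * thickness)   # Beer-Lambert law, I/I0
print(f"(mu/rho)_mix = {mu_rho_mix:.3f} cm^2/g, transmission = {transmission:.1%}")
```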


News Article | October 3, 2016
Site: www.cemag.us

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-ft.) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film Star Wars. For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 °C (122 °F), with one group in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity). To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed “NIST simulated rain.” Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff — with any loose nanoparticles — was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run, and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence and amount of silicon. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure. Both sets of coating samples — those weathered in very low humidity and the others in very humid conditions — degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters. “These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them,” says NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research. This project resulted from a collaboration between NIST’s Engineering Laboratory and Material Measurement Laboratory. It is part of NIST's work to help characterize the potential environmental, health and safety (EHS) risks of nanomaterials, and develop methods for identifying and measuring them.
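Because one day in the SPHERE corresponds to roughly 10 to 15 days outdoors, the 100-day exposure can be translated into equivalent outdoor weathering with a one-line calculation, sketched below using only the acceleration range stated in the article.

```python
# Translate accelerated SPHERE exposure into equivalent outdoor weathering time,
# using the 10-15x acceleration range stated in the article.

SPHERE_DAYS = 100
ACCELERATION_RANGE = (10, 15)   # one SPHERE day ~ this many outdoor days

for factor in ACCELERATION_RANGE:
    outdoor_days = SPHERE_DAYS * factor
    print(f"x{factor}: {outdoor_days} outdoor days (~{outdoor_days / 365:.1f} years)")
# x10: 1000 outdoor days (~2.7 years)
# x15: 1500 outdoor days (~4.1 years)
```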


News Article | October 5, 2016
Site: www.nanotech-now.com

NIST-made 'sun and rain' used to study nanoparticle release from polymers. Abstract: If the 1967 film "The Graduate" were remade today, Mr. McGuire's famous advice to young Benjamin Braddock would probably be updated to "Plastics ... with nanoparticles." These days, the mechanical, electrical and durability properties of polymers--the class of materials that includes plastics--are often enhanced by adding miniature particles (smaller than 100 nanometers, or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water--and if so, what might be the health and ecological consequences? In a recently published paper, researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts. In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film "Star Wars." For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit), with one group in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity). To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed "NIST simulated rain." Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff--with any loose nanoparticles--was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run, and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence and amount of silicon. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure. Both sets of coating samples--those weathered in very low humidity and the others in very humid conditions--degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.
"These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them," said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research. This project resulted from a collaboration between NIST's Engineering Laboratory and Material Measurement Laboratory. It is part of NIST's work to help characterize the potential environmental, health and safety (EHS) risks of nanomaterials, and develop methods for identifying and measuring them. For more information, please click If you have a comment, please us. Issuers of news releases, not 7th Wave, Inc. or Nanotechnology Now, are solely responsible for the accuracy of the content.


News Article | November 4, 2016
Site: www.materialstoday.com

Scientists are already using supercomputers and other technologies to create ever-growing libraries of data on the properties of metals, polymers, ceramics and other materials. Yet as large as these databases are, they contain just a fraction of the information and knowledge needed to rapidly discover or design new materials that could have a transformative impact on advancing technologies that solve pressing social and economic problems. One of the reasons for this is that current databases lack the ability to collect and interpret visual data such as graphs and images from countless scientific studies, handbooks and other publications. This limitation creates a bottleneck that often slows the materials discovery process to a crawl. This could, however, soon change. The University at Buffalo (UB) has received a $2.9 million grant from the US National Science Foundation (NSF) to transform the traditional role of a database, from a repository for information to an automated computer laboratory that rapidly collects, interprets and learns from massive amounts of information. The lab, which will also conduct large-scale materials modeling and simulations based upon untapped troves of visual data, will be accessible to the scientific community, and ultimately speed up and reduce the cost of discovering, manufacturing and commercializing new materials. These are all goals of the US government's Materials Genome Initiative. "This pioneering and multidisciplinary approach to advanced materials research will provide the scientific community with tools it needs to accelerate the pace of discovery, leading to greater economic security and a wide range of societal benefits," said Venu Govindaraju, UB's vice president for research and economic development. Govindaraju, a professor of computer science and engineering, is the grant's principal investigator. Co-principal investigators, all from UB, are: Krishna Rajan, chair of the Department of Materials Design and Innovation (MDI); Thomas Furlani, director of the Center for Computational Research; Srirangaraj ‘Ranga’ Setlur, principal research scientist; and Scott Broderick, research assistant professor in MDI. The award, from NSF's Data Infrastructure Building Blocks (DIBBS) program, draws upon UB's expertise in artificial intelligence, specifically its ground-breaking work that began in the 1980s to enable machines to read human handwriting. This work has saved postal organizations billions of dollars in the US and worldwide. UB will use the DIBBS grant to create what it's calling the Materials Data Engineering Laboratory at UB (MaDE @UB). This lab will utilize the tools of machine intelligence, including machine learning, pattern recognition, materials informatics and modeling, high-performance computing and other cutting-edge technologies. Its aim is to transform data libraries into a facility that not only stores and searches for information, but also predicts and processes information to discover materials that transform how society addresses climate change, national security and other pressing issues. "Essentially, we're creating a system – a smart robot – with cognitive skills for scientific interpretation of text, graphs and images," said Rajan of MDI, a collaboration between UB's School of Engineering and Applied Sciences and the College of Arts and Sciences. The MDI was launched in 2014 to apply information science methods to advanced materials research.
"This machine intelligence driven approach will open a new trajectory of data-intensive materials science research impacting both computational and experimental studies," added Rajan. The lab builds upon significant investments UB has made in recent years to build a hub for advanced manufacturing in Western New York. This story is adapted from material from the University at Buffalo, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier. Link to original source.
