Joint Center for Artificial Photosynthesis

California, California, United States


News Article | May 11, 2017
Site: www.chromatographytechniques.com

The hunt for “needle in a haystack” materials that could help efficiently produce fuel from just water, sunlight and carbon dioxide has, over four decades, yielded only 16 prospects, none of which led to the creation of a commercially viable solar fuels generator. But the number of materials that could serve as catalysts for creating solar fuel is now significantly expanding thanks to the development of a new “discovery pipeline” by a team of researchers in California—a breakthrough that puts more options on the table for scientists trying to develop a renewable energy source. In two years, the research team identified nearly double the number of applicable materials using a new method developed through a partnership between the Joint Center for Artificial Photosynthesis at Caltech and Lawrence Berkeley National Laboratory’s Materials Project. The scientists say they have also discovered, but not yet reported, dozens of additional materials that may be capable of splitting water using energy from the sun. After splitting water, the extracted hydrogen atoms can be used to create hydrogen gas or combined with carbon dioxide to create hydrocarbon fuel.

High-throughput theory

By mimicking the natural process of photosynthesis, scientists aim to convert and store the energy of the sun for on-demand use in cost-effective and scalable systems. But since water doesn’t separate into hydrogen and oxygen in the presence of sunlight, a material known as a photoanode is needed to facilitate that reaction. John Gregoire, principal investigator and research thrust coordinator with the Joint Center for Artificial Photosynthesis, said researchers face a significant challenge when trying to identify potential photoanodes. While metal oxides are “very promising materials” for photoanodes due to their stability, they usually don’t absorb visible light.
“The trick is finding the special kind of metal oxides that absorb visual light or, more technically, have a band gap energy in the visible range,” Gregoire told Laboratory Equipment. “These are really kind of needle in a haystack materials. There are many known metal oxides, but very few of them exhibit all the necessary properties to be solar fuels photoanodes.” For example, there are only 16 known oxide photoanodes, despite the thousands of metal oxides chemists work with on a daily basis. To accelerate the process of identifying potential photoanodes, Gregoire and a team of researchers co-led by Lawrence Berkeley National Laboratory’s Jeffrey Neaton and Qimin Yan developed a discovery pipeline that integrates theory and experiment. The researchers started by selectively mining a database of roughly 66,000 compounds with a well-defined hypothesis and identified 174 potentially promising vanadates, which contain vanadium, oxygen and one other element. They then screened those vanadates using a computational method aimed at predicting which materials would exhibit the properties of a photoanode. “High-throughput theoretical materials discovery has been on the rise recently, but a shortcoming is that such efforts do not involve experiments and therefore are limited by the accuracy of the computational methods,” Neaton, also a principal investigator with the Joint Center for Artificial Photosynthesis, told Laboratory Equipment. The center is a U.S. Department of Energy Innovation Hub dedicated to the development of solar fuels. But, in the integrated discovery pipeline developed by Gregoire and Neaton, promising materials identified by the computational methods were passed directly to experiment, where researchers measured the materials’ optical and photocatalytic properties. 
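The screening step described above can be sketched in a few lines. This is a toy illustration only, not JCAP's actual code: the compounds, their band-gap values, and the 1.2–2.8 eV visible-absorption window are all assumed for demonstration.

```python
# Toy sketch of a band-gap screening filter: keep only candidate metal
# oxides whose computed band gap falls in the visible range. The data
# below are illustrative placeholders, not real screening results.

candidates = [
    ("FeVO4", 2.1),   # (formula, computed band gap in eV)
    ("BiVO4", 2.4),
    ("V2O5", 2.3),
    ("TiO2", 3.2),    # gap too wide: absorbs only ultraviolet light
    ("Fe2O3", 2.2),
]

def absorbs_visible(gap_ev, lo=1.2, hi=2.8):
    """True if the band gap lies in an (assumed) visible-light window."""
    return lo <= gap_ev <= hi

# Survivors of this filter would move on to experimental screening.
photoanode_candidates = [f for f, gap in candidates if absorbs_visible(gap)]
print(photoanode_candidates)
```

In the actual pipeline, criteria like this were applied across roughly 66,000 database entries to narrow the field to 174 candidate vanadates before any experiments were run.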
While high-throughput techniques are commonplace in pharmaceutical and biological labs, materials science is more complex: run separately, high-throughput theory and high-throughput experiments have yielded ineffective results. “By combining them and coming up with a screening pipeline that uses both experiment and theory together is how we’ve been able to really accelerate the discovery process,” Gregoire said. While the researchers acknowledge the 12 materials they identified and reported in their study are far from appearing in any type of commercially available solar fuel generator, they say one of the keys to accelerating the development of solar technology is identifying more potential photoanodes. And the integrated pipeline does just that—quickly. “This is a discovery that they have any activity at all,” Gregoire said. Additional calculations and experiments are next to figure out how the newly discovered materials can be optimized. Several questions about the materials need to be answered in the process, including: What is their maximum activity? What is the limit of their efficiency? As if identifying 12 new materials suitable as photoanodes isn’t accomplishment enough, Gregoire, Neaton and colleagues said they are not done yet. “The bigger picture is that we only used one of our design criteria to identify these 12. We have a number of other design criteria. So we are exploring very different metal oxides and are continuing to rapidly discover photoanodes in these other spaces, as well. Now that we have this high-throughput discovery pipeline working, we’re getting very good at finding all the needles in the haystack. We have dozens more discoveries.”

Artificial leaf

Other researchers at the Joint Center for Artificial Photosynthesis have developed an artificial leaf, a solar-driven system that splits water to create hydrogen fuels. The system is made up of two electrodes—one photoanode and one photocathode—and a membrane.
The photoanode uses sunlight to oxidize water molecules, generating protons and electrons as well as oxygen gas. The photocathode recombines the protons and electrons to form hydrogen gas. A key part of the JCAP design is a plastic membrane, which keeps the oxygen and hydrogen gases separate. If the two gases are allowed to mix and are accidentally ignited, an explosion can occur; the membrane lets the hydrogen fuel be separately collected under pressure and safely pushed into a pipeline. In studies, the system was shown to convert 10 percent of the energy in sunlight into stored energy. During natural photosynthesis, plants convert about 1 percent of the sunlight’s energy into stored energy. Additionally, the artificial leaf system was proven capable of continuously operating for more than 40 hours. “Our work shows that it is indeed possible to produce fuels from sunlight safely and efficiently in an integrated system with inexpensive components,” said Caltech’s Nate Lewis in August 2015, when the system was announced. “Of course, we still have work to do to extend the lifetime of the system and to develop methods for cost-effectively manufacturing full systems, both of which are in progress.” Part of what Neaton, Gregoire and their team are trying to do is find replacements for the more expensive materials in prototype devices, like the artificial leaf, to drive down costs and create efficient, affordable and scalable solar fuel generation technology. “This photoanode material is just one material in the overall device that involves having to integrate many materials and having them all work together,” Gregoire said.
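The two electrode reactions described above are the standard water-splitting half-reactions; the equations and standard potentials below come from textbook electrochemistry, not from the JCAP announcement itself:

```latex
% Photoanode (oxygen evolution): water is oxidized to oxygen, protons, electrons
2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \qquad E^\circ = +1.23\ \mathrm{V}
% Photocathode (hydrogen evolution): protons and electrons recombine as hydrogen gas
4\,\mathrm{H^+} + 4\,e^- \;\longrightarrow\; 2\,\mathrm{H_2} \qquad E^\circ = 0\ \mathrm{V}
```

The 1.23 V minimum separating the two half-reactions is why a photoanode needs a band gap comfortably above that energy while still absorbing visible light.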
“We don’t know yet whether any of these discoveries will be able to be integrated with the other known components and create an efficient technology that is deployable and cost competitive with existing methods for producing fuels, but it gives us a lot more options to explore.” The timeline for when solar fuels may be powering people’s cars and homes is not clear, but researchers say their new method is giving a boost to that effort. “It’s not going to be next week, tomorrow, even two years from now. It’s still down the road,” said Neaton. “We need to be patient, but I do think that the way to accelerate this process is going to be identifying new materials that could realistically be used in devices.”


News Article | October 26, 2016
Site: www.nature.com

Dalian National Laboratory will focus on reducing carbon emissions from coal. After five years of preparation, China has officially opened a clean-energy research centre that will spearhead the country's efforts to develop new ways to reduce its carbon emissions. "Our goal is to lead energy research in the country, and to rank among the world's top energy labs," says Can Li, head of the Dalian National Laboratory for Clean Energy (DNL), which was inaugurated in early October. Li says the facility will combine all major areas of energy research, including cleaner fossil fuels, solar power, and fuel cell technologies. The lab is based at the Dalian Institute of Chemical Physics (DICP), a subsidiary of the Chinese Academy of Sciences (CAS). The DNL's 600 scientists will be housed in a sprawling 40,000-square meter research complex on DICP's campus, where construction of the 204-million-renminbi (US$32 million) facility began in late 2006 after approval from the Chinese Ministry of Science and Technology. "Now is a major turning point for the DNL," says Tao Zhang, director of DICP. "We are transitioning from the planning and team-building stage to actual research." Mindful that China relies on coal for more than two-thirds of its electricity, Li expects the DNL to focus much of its resources on clean fossil-fuel technologies, at least initially. This plays to the strengths of the DICP, which has developed methanol-to-olefins conversion processes that help to reduce waste in the industrial processing of coal. In cooperation with the Shenhua Group, China's largest coal supplier, the DICP last year opened a factory using its technology. The DICP also has an ongoing energy research partnership with international oil giant BP. The DNL is expected to establish similar links with businesses and research institutions in China and abroad. 
"Much of the research scope is strategically defined by China's unique energy resources, and will be critical for the development of the country in the next few decades," says Peidong Yang, department head at the Joint Center for Artificial Photosynthesis at the Lawrence Berkeley National Laboratory in California. "Establishing the national lab is a great first step." The DNL's research into renewable energy sources will be more modest, however. "We are a latecomer in terms of solar-power research," says Li, who hopes that the lab will be able to leapfrog into more cutting-edge areas of renewable-energy research, such as artificial photosynthesis. The DNL sprang from the Chinese government's 2006 plan to set up ten national laboratories, each focusing on a broad topic, such as protein science or modern rail transportation. But the government has yet to set up a separate fund for those initiatives; the science ministry declined to comment on the situation. For now, the DICP is investing more than 289 million renminbi a year — over half of its annual research budget — in the DNL. More than half of that funding stream comes from DICP's business collaborations, with the remainder from government-funded research programs. "We are faced with some very fierce competition from labs all over the world, and money is one of the necessary ingredients to keep us going," says Li. "We hope for more funding from the government, but we are also prepared to generate revenues on our own."


News Article | October 12, 2016
Site: www.technologyreview.com

Despite healthy corporate earnings, an employment rate that has slowly rebounded since the financial crisis of 2008, and the outpouring of high-tech distractions from Silicon Valley, many people have an aching sense that there is something deeply wrong with the economy. Slow productivity growth is stunting their financial opportunities; high levels of income inequality in the United States and Europe are fueling public outrage and frustration in those left behind, leading to unprecedentedly angry politics; and yet despite the obvious symptoms, economists and other policy makers have been largely befuddled in explaining the causes and, even more important, the cures for these problems. That’s the starting point for Rethinking Capitalism. A series of essays by authors including Joseph Stiglitz, an economist at Columbia University who won a Nobel Prize in 2001, and Mariana Mazzucato, a professor of the economics of innovation at the University of Sussex and a rising voice in British politics, the book attempts to provide, as explained in its introduction, “a much better understanding of how modern capitalism works—and why in key ways it now doesn’t.” Together, the essays provide a compelling argument that we need more coherent and deliberate strategic planning in tackling our economic problems, especially in finding more effective ways to reduce greenhouse-gas emissions. In particular, Mazzucato, who also co-edited the book and co-wrote an introduction with Michael Jacobs, wants to counter the view that free markets inevitably lead to desirable outcomes and that freer markets are always better: the faith that “the ‘invisible hand’ of the market knows best.” In fact, she argues, we should admit that markets are created and shaped by government policies, including government support of innovation. There is nothing too contentious in that statement, but she extends the argument in a way that is controversial. 
Not only is it the responsibility of governments to facilitate innovation, which she calls “the driving force behind economic growth and development,” but the state should also set its direction; the trajectory of innovation needs to be guided by policies to solve specific problems, whether the aim is increasing productivity or creating a green-energy transition. Mazzucato writes that innovation needs both “well-funded public research and development institutions and strong industrial policies.” Industrial policies—or what Mazzucato sometimes calls mission-oriented public policies—have a long and divisive history. Economists define industrial policy in a very specific way: it’s when governments set out to play a deliberate role in directing innovation and growth to achieve a desired objective. Her call for the revival of such policies counters the idea that has held sway for decades among many politicians, particularly in the United States and the U.K., that government is better off not trying to assert a role in steering innovation. She writes that governments should not only try to “level the playing field, as the orthodox view would allow.” Rather, “they can help tilt the playing field towards the achievement of publicly chosen goals.” For them to do so, Mazzucato said in a recent interview, “the whole framework needs to change.” The belief that the government should only intervene to “fix” the market in extreme circumstances, rather than acting as a partner in creating and shaping markets, means we’re constantly putting “bandages” on problems and “nothing changes.” The intractability of today’s slow growth and widening inequality can be traced, she says, to the fact that governments in the U.S. and Europe have increasingly shied away from their responsibilities.
“We have to admit that policy steers innovation and growth, and so the question is where do we want to steer them?” One of Mazzucato’s more controversial claims is that the private sector gets too much credit—and too many riches—for some of today’s most popular technologies. The iPhone, she contends, relied on advances, including the touch screen, Siri, GPS, and the Internet, that were all developed by state-funded research. Maybe. At times, she clearly takes this argument too far. Take, for example, her assertion that nanotechnology was initially funded by government initiatives and that the private sector jumped in later. In fact, key early inventions were made by IBM at its Zurich lab; these allowed researchers to image and manipulate single atoms for the first time. Regardless, Mazzucato’s argument has resonated among many of today’s policy makers. After Theresa May took over as the U.K.’s prime minister this summer, Mazzucato was summoned to Downing Street. Change was clearly in the air. A few weeks earlier, May had announced a newly formed Department for Business, Energy and Industrial Strategy. More than 30 years after Margaret Thatcher effectively killed off industrial policy in the country, another conservative prime minister was hinting at its revival. While it’s too early to know the outcome, Mazzucato says, “It seems superficially encouraging.” The debate over industrial policies played out in the United States and the U.K. in the early 1980s as President Reagan and Prime Minister Thatcher preached the power of free markets and the dangers of government meddling. And for at least the next few decades, the free-market rhetoric clearly won out, as popular wisdom held that such interventions are tantamount to governments picking winners and losers. Even advocates of industrial policies acknowledge that they have had a checkered history. In “Green Industrial Policy,” Dani Rodrik, an economist at Harvard’s John F.
Kennedy School of Government, argues that such a strategy is needed to make the sweeping changes required to slow climate change. But he notes that executing industrial policies fairly has been a challenge. While such policies have “undoubtedly worked” in Japan, South Korea, China, and other countries, Rodrik writes, they have a reputation for being gamed in many countries by both businesses and political leaders. And industrial policies to support desirable sectors have given birth to such white elephants as the Concorde, a plane meant to bolster the aerospace industry in the U.K. and France. Because of this history, he writes, “economists traditionally exhibit scepticism—if not outright hostility—towards industrial policies.” But despite the challenge of making them work, he argues, industrial policies “have an indispensable role in putting the global economy on a green growth path,” because markets have failed to properly account for the social cost of carbon dioxide emissions and the true technological benefits of risky energy R&D. Rodrik said in an interview that while “unfortunately” we’re stuck with the label “industrial policy,” today’s versions are very different from ones conceived decades ago. Rather than singling out a specific sector—say, aerospace or steel manufacturing—for support with large investments and tax incentives, new thinking suggests working across sectors to achieve a desired goal such as addressing climate change, using tools such as carbon pricing. “It’s really just pushing markets in a direction they wouldn’t otherwise go,” he says. “The idea is to get government working closely with businesses to achieve more rapid and appropriate growth.” In that sense, says Rodrik, it is something that governments have been doing all along, even as industrial policy fell out of fashion in the 1980s. However, one consequence of attempting to “fly under the radar” is that governments are often not explicit about their objectives, he says. 
“If the goal is to spawn new technologies in clean energy, let’s say that.” And, he says, “being more self-conscious and open provides a big advantage in designing better policies.” Included in such designs should be well-defined rules and procedures, insulating decision making from political whims and interests. Take, for example, the failure of the solar company Solyndra. It is often held up as the kind of thing that occurs when government picks winners. But, writes Rodrik, Solyndra failed largely because competing technologies got much cheaper. Such outcomes are not necessarily an indictment of industrial policies. The real problem, Rodrik argues: the U.S. Department of Energy loan guarantee program that supported the solar company had a mixed set of goals, from creating jobs to competing with China to helping fund new energy technologies. What’s more, it did not properly define procedures for evaluating the progress of potential loan recipients and, importantly, terminating support to those companies when appropriate. Instead, according to Rodrik, in the absence of such rules, money was lent to Solyndra for political reasons—President Obama and his administration used the company as a high-profile way to highlight its green-energy initiatives. Having singled out the solar company for praise, the administration was then reluctant to end its commitment. President Obama’s eight years in office will be judged in part on the $787 billion stimulus bill that passed in 2009 and included some $60 billion for energy projects and research. In some ways its results, both positive and negative, present a valuable lesson on just how difficult it is to put economic theory about industrial policy into practice. The stimulus bill was well-intentioned, and the instinct to use government spending for a specific social goal, supporting the development of green energy, was laudable. The investment in energy was badly needed.
But from the start, the energy spending was headed for trouble because it tried to serve multiple purposes: provide a monetary boost, create jobs, and seed the beginning of a green-energy infrastructure. As a leading economist warned in these pages: “It’s very much like pork-barrel politics.” (See “Can Technology Save the Economy?”) The problem was that those objectives often conflicted. Stimulating the economy meant spending money as quickly as possible, while investing wisely in energy projects required deliberate decisions and rigorous due diligence, both of which take time. What’s more, investments were made to help economically stressed regions even if they weren’t the wisest choices for building an energy sector. Government investments were made in a number of large battery production facilities in Michigan, each one coming with a promise to boost the local economy, even though there was not yet nearly enough demand for the batteries. Among the outcomes of the stimulus investments, not surprisingly, were the bankruptcies of Solyndra and other solar and battery startups. The stimulus energy investments were “a bit of a disaster,” says Josh Lerner, a professor at Harvard Business School. “A lot of the problem was in the ways they were implemented. They violated all the rules of how these things should be done.” Not only did the government make large bets on a few companies, in effect picking winners, but it did so without clear rules and criteria for the choices. And, says Lerner, “the selection of the battery and solar companies was extremely opaque. A lot of it seemingly came down to if you had a former assistant secretary of energy doing the lobbying for you.” Still, Lerner is not dismissive of government interventions to support green-energy innovation. “You can make the case that the need is greater than ever. 
A well-designed program would potentially make a lot of sense at this point.” But, he says, “experience tells us there are more misses than hits” with such government interventions. And he suggests that such programs often fail because their creators are not familiar enough with any given technology and its business. “The decisions might seem plausible, but they turn out to be unproductive. The devil is in the details.” Even some of the stimulus’s greatest apparent successes now seem to be less effective than originally hoped. Steven Chu, a Nobel Prize–winning physicist, was named secretary of the Department of Energy in early 2009 and implemented many of the bill’s most ambitious efforts to boost energy R&D. It funded large increases in energy research, and Chu created a series of well-conceived centers and initiatives, including the Joint Center for Artificial Photosynthesis and ARPA-E, a program to support early-stage energy technologies. But in subsequent years, budget cutbacks and political pressure took their toll on these projects, which needed patience and consistent funding. As a result, ambitious research and technology initiatives are now ghosts of their once high-profile selves. The outcome makes one wonder just how such policy initiatives, which include investments in research and engineering projects that require years to bear fruit, will ever survive the constantly changing political moods and government leadership. Creating a rigorous industrial policy to encourage green technologies is no doubt a worthwhile objective. Economists and the lessons from efforts like the stimulus bill can teach us how to design such policies to be robust and effective.


News Article | November 10, 2016
Site: www.eurekalert.org

Scientists have found a way to engineer the atomic-scale chemical properties of a water-splitting catalyst for integration with a solar cell, and the result is a big boost to the stability and efficiency of artificial photosynthesis. Led by researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), the project is described in a paper published this week in the journal Nature Materials. The research comes out of the Joint Center for Artificial Photosynthesis (JCAP), a DOE Energy Innovation Hub established in 2010 to develop a cost-effective method of turning sunlight, water, and carbon dioxide into fuel. JCAP is led by the California Institute of Technology with Berkeley Lab as a major partner. The goal of this study was to strike a careful balance between the contradictory needs for efficient energy conversion and chemically sensitive electronic components to develop a viable system of artificial photosynthesis to generate clean fuel. "In order for an artificial photosystem to be viable, we need to be able to make it once, deploy it, and have it last for 20 or more years without repairing it," said study principal investigator Ian Sharp, head of materials integration and interface science research at JCAP. The problem is that the active chemical environments needed for artificial photosynthesis are damaging to the semiconductors used to capture solar energy and power the device. "Good protection layers are dense and chemically inactive. That is completely at odds with the characteristics of an efficient catalyst, which helps to split water to store the energy of light in chemical bonds," said Sharp, who is also a staff scientist at Berkeley Lab's Chemical Sciences Division. "The most efficient catalysts tend to be permeable and easily transform from one phase to another. These types of materials would usually be considered poor choices for protecting electronic components." 
By engineering an atomically precise film so that it can support chemical reactions without damaging sensitive semiconductors, the researchers managed to satisfy contradictory needs for artificial photosystems. "This gets into the key aspects of our work," said study lead author Jinhui Yang, who conducted the work as a postdoctoral researcher at JCAP. "We set out to turn the catalyst into a protective coating that balances these competing properties." The researchers knew they needed a catalyst that could not only support active and efficient chemical reactions, but one that could also provide a stable interface with the semiconductor, allow the charge generated by the absorption of light from the semiconductor to be efficiently transferred to the sites doing catalysis, and permit as much light as possible to pass through. They turned to a manufacturing technique called plasma-enhanced atomic layer deposition, performed at the Molecular Foundry at Berkeley Lab. This type of thin-film deposition is used in the semiconductor industry to manufacture integrated circuits. "This technique gave us the level of precision we needed to create the composite film," said Yang. "We were able to engineer a very thin layer to protect the sensitive semiconductor, then atomically join another active layer to carry out the catalytic reactions, all in a single process." The first layer of the film consisted of a nanocrystalline form of cobalt oxide that provided a stable, physically robust interface with the light-absorbing semiconductor. The other layer was a chemically reactive material made of cobalt dihydroxide. "The design of this composite coating was inspired by recent advances in the field that have revealed how water-splitting reactions occur, at the atomic scale, on materials. In this way, mechanistic insights guide how to make systems that have the functional properties we need," said Sharp. 
Using this configuration, the researchers could run photosystems continuously for three days—potentially longer—when such systems would normally fail in mere seconds. "A major impact of this work is to demonstrate the value of designing catalysts for integration with semiconductors," said Yang. "Using a combination of spectroscopic and electrochemical methods, we showed that these films can be made compact and continuous at the nanometer scale, thus minimizing parasitic light absorption when integrated on top of photoactive semiconductors." The study authors noted that while this is an important milestone, there are many more steps needed before a commercially viable artificial photosystem is ready for deployment. "In general, we need to know more about how these systems fail so we can identify areas to target for future improvement," said Sharp. "Understanding degradation is an important avenue to making something that is stable for decades." This work was supported by DOE's Office of Science. The researchers used the Advanced Light Source at Berkeley Lab to characterize the materials they created. The Molecular Foundry and the Advanced Light Source are both DOE Office of Science User Facilities. Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab has had its scientific expertise recognized with 13 Nobel Prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.


Led by researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), the project is described in a paper published this week in the journal Nature Materials. The research comes out of the Joint Center for Artificial Photosynthesis (JCAP), a DOE Energy Innovation Hub established in 2010 to develop a cost-effective method of turning sunlight, water, and carbon dioxide into fuel. JCAP is led by the California Institute of Technology with Berkeley Lab as a major partner. The goal of this study was to strike a careful balance between the contradictory needs for efficient energy conversion and chemically sensitive electronic components to develop a viable system of artificial photosynthesis to generate clean fuel. "In order for an artificial photosystem to be viable, we need to be able to make it once, deploy it, and have it last for 20 or more years without repairing it," said study principal investigator Ian Sharp, head of materials integration and interface science research at JCAP. The problem is that the active chemical environments needed for artificial photosynthesis are damaging to the semiconductors used to capture solar energy and power the device. "Good protection layers are dense and chemically inactive. That is completely at odds with the characteristics of an efficient catalyst, which helps to split water to store the energy of light in chemical bonds," said Sharp, who is also a staff scientist at Berkeley Lab's Chemical Sciences Division. "The most efficient catalysts tend to be permeable and easily transform from one phase to another. These types of materials would usually be considered poor choices for protecting electronic components." By engineering an atomically precise film so that it can support chemical reactions without damaging sensitive semiconductors, the researchers managed to satisfy contradictory needs for artificial photosystems. 
"This gets into the key aspects of our work," said study lead author Jinhui Yang, who conducted the work as a postdoctoral researcher at JCAP. "We set out to turn the catalyst into a protective coating that balances these competing properties." The researchers knew they needed a catalyst that could not only support active and efficient chemical reactions, but one that could also provide a stable interface with the semiconductor, allow the charge generated by the absorption of light from the semiconductor to be efficiently transferred to the sites doing catalysis, and permit as much light as possible to pass through. They turned to a manufacturing technique called plasma-enhanced atomic layer deposition, performed at the Molecular Foundry at Berkeley Lab. This type of thin-film deposition is used in the semiconductor industry to manufacture integrated circuits. "This technique gave us the level of precision we needed to create the composite film," said Yang. "We were able to engineer a very thin layer to protect the sensitive semiconductor, then atomically join another active layer to carry out the catalytic reactions, all in a single process." The first layer of the film consisted of a nanocrystalline form of cobalt oxide that provided a stable, physically robust interface with the light-absorbing semiconductor. The other layer was a chemically reactive material made of cobalt dihydroxide. "The design of this composite coating was inspired by recent advances in the field that have revealed how water-splitting reactions occur, at the atomic scale, on materials. In this way, mechanistic insights guide how to make systems that have the functional properties we need," said Sharp. Using this configuration, the researchers could run photosystems continuously for three days—potentially longer—when such systems would normally fail in mere seconds. 
"A major impact of this work is to demonstrate the value of designing catalysts for integration with semiconductors," said Yang. "Using a combination of spectroscopic and electrochemical methods, we showed that these films can be made compact and continuous at the nanometer scale, thus minimizing parasitic light absorption when integrated on top of photoactive semiconductors." The study authors noted that while this is an important milestone, there are many more steps needed before a commercially viable artificial photosystem is ready for deployment. "In general, we need to know more about how these systems fail so we can identify areas to target for future improvement," said Sharp. "Understanding degradation is an important avenue to making something that is stable for decades." More information: Jinhui Yang et al. A multifunctional biphasic water splitting catalyst tailored for integration with high-performance semiconductor photoanodes, Nature Materials (2016). DOI: 10.1038/nmat4794


News Article | January 29, 2016
Site: www.cemag.us

An international research team has simplified the steps to create highly efficient silicon solar cells by applying a new mix of materials to a standard design. Arrays of solar cells are used in solar panels to convert sunlight to electricity. The special blend of materials — which could also prove useful in semiconductor components — eliminates the need for a process known as doping that steers the device’s properties by introducing foreign atoms to its electrical contacts. This doping process adds complexity to the device and can degrade its performance. “The solar cell industry is driven by the need to reduce costs and increase performance,” says James Bullock, the lead author of the study, published this week in Nature Energy. Bullock participated in the study as a visiting researcher at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley. “If you look at the architecture of the solar cell we made, it is very simple,” says Bullock, of Australian National University (ANU). “That simplicity can translate to reduced cost.” Other scientists from Berkeley Lab, UC Berkeley, ANU, and The Swiss Federal Institute of Technology of Lausanne (EPFL) also participated in the study. Bullock adds, “Conventional silicon solar cells use a process called impurity doping, which does bring about a number of limitations that are making further progress increasingly difficult.” Most of today’s solar cells use crystalline silicon wafers. The wafer itself, and sometimes the layers deposited on the wafer, are doped with atoms that either have electrons to spare when they bond with silicon atoms, or alternatively generate electron deficiencies, or “holes.” In both cases, this doping enhances electrical conductivity. 
In these devices, two types of dopant atoms are required at the solar cell’s electrical contacts to regulate how the electrons and holes travel in a solar cell so that sunlight is efficiently converted to electrical current that flows out of the cell. Crystalline silicon-based solar cells with doped contacts can exceed 20 percent efficiency — meaning more than 20 percent of the sun’s energy is converted to electricity. A dopant-free silicon cell had not previously exceeded 14 percent efficiency. The new study, though, demonstrated a dopant-free silicon cell, referred to as a DASH cell (dopant-free asymmetric heterocontact), with an average efficiency above 19 percent. This increased efficiency is a product of the new materials and a simple coating process for layers on the top and bottom of the device. Researchers showed it’s possible to create their solar cell in just seven steps. In this study, the research team used a crystalline silicon core (or wafer) and applied layers of a dopant-free type of silicon called amorphous silicon. Then, they applied ultrathin coatings of a material called molybdenum oxide, also known as moly oxide, at the sun-facing side of the solar cell, and lithium fluoride at the bottom surface. The two layers, having thicknesses of tens of nanometers, act as dopant-free contacts for holes and electrons, respectively. “Moly oxide and lithium fluoride have properties that make them ideal for dopant-free electrical contacts,” says Ali Javey, program leader of Electronic Materials at Berkeley Lab and a professor of Electrical Engineering and Computer Sciences at UC Berkeley. Both materials are transparent, and they have complementary electronic structures that are well-suited for solar cells. “They were previously explored for other types of devices, but they were not carefully explored by the crystalline silicon solar cell community,” says Javey, the lead senior author of the study.
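To make the efficiency figures above concrete, a cell's conversion efficiency is simply its electrical output power divided by the solar power falling on it. The sketch below uses hypothetical numbers chosen to land near the DASH cell's reported average; it is an illustration, not a measurement from the study.

```python
# Solar cell efficiency = electrical output / incident solar power.
# Standard test conditions assume roughly 1000 W/m^2 of sunlight.
# The cell size and output below are hypothetical, for illustration only.

def cell_efficiency(output_power_w, cell_area_m2, irradiance_w_per_m2=1000.0):
    """Return the fraction of incident solar power converted to electricity."""
    incident_power_w = irradiance_w_per_m2 * cell_area_m2
    return output_power_w / incident_power_w

# A hypothetical 100 cm^2 (0.01 m^2) cell delivering 1.9 W under full sun:
eff = cell_efficiency(output_power_w=1.9, cell_area_m2=0.01)
print(f"{eff:.1%}")  # 19.0% -- the same ballpark as the DASH cell's average
```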
Javey notes that his group had discovered the utility of moly oxide as an efficient hole contact for crystalline silicon solar cells a couple of years ago. “It has a lot of defects, and these defects are critical and important for the arising properties. These are good defects,” he says. Stefaan de Wolf, another author who is team leader for crystalline silicon research at EPFL in Neuchâtel, Switzerland, says, “We have adapted the technology in our solar cell manufacturing platform at EPFL and found out that these moly oxide layers work extremely well when optimized and used in combination with a thin amorphous layer of silicon on crystalline wafers. They allow amazing variations of our standard approach.” In the study, the team identified lithium fluoride as a good candidate for electron contacts to crystalline silicon coated with a thin amorphous layer. That layer complements the moly oxide layer for hole contacts. The team used a room-temperature technique called thermal evaporation to deposit the layers of lithium fluoride and moly oxide for the new solar cell. There are many other materials that the research team hopes to test to see if they can improve the cell’s efficiency. Javey says there is also promise for adapting the material mix used in the solar cell study to improve the performance of semiconductor transistors. “There’s a critical need to reduce the contact resistance in transistors so we’re trying to see if this can help.” Some of the work in this study was performed at the Molecular Foundry, a DOE Office of Science User Facility at Berkeley Lab. This work was supported by the DOE Office of Science, Bay Area Photovoltaics Consortium (BAPVC); the Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub; Office fédéral de l’énergie (OFEN); the Australian Renewable Energy Agency (ARENA), and the CSEM PV-center.


News Article | February 15, 2017
Site: www.technologyreview.com

Clean energy made critical strides in 2016. The Paris climate accords went into effect, the price of solar installations continued to drop, investments in renewable energy soared, offshore wind finally got under way in the United States, and scientists made a series of technical advances that promise to make sustainable energy increasingly efficient and affordable. That last one is key, since invention is still the surest way to avoid the greatest impacts of climate change.  Today's commercially available renewable technologies can't meet all of the world's energy demands, even if they're scaled up aggressively. The United States comes up about 20 percent short by 2050, according to a thorough analysis by the National Renewable Energy Laboratory. Meanwhile, the U.N.'s Intergovernmental Panel on Climate Change concluded the world must cut greenhouse gas emissions by as much as 70 percent by midcentury, and to nearly zero by 2100, to have any chance of avoiding warming levels that could ensure sinking cities, mass extinctions, and widespread droughts. So we need more highly efficient renewable energy sources, cheaper storage, smarter grids, and effective systems for capturing greenhouse gases. Here are some of the most promising scientific advances of 2016. One of the crucial missing pieces in the portfolio of renewable energy sources is a clean liquid fuel that can replace gasoline and other transportation fuels.  One of the most promising possibilities is artificial photosynthesis, mimicking nature's own method for converting sunlight, carbon dioxide, and water into fuels. There have been slow but steady improvements in the field in recent years. But this summer, Harvard scientists Daniel Nocera and Pamela Silver, in partnership with their co-authors, developed a "bionic leaf" that could capture and convert 10 percent of the energy in sunlight, a big step forward for the field. It's also about 10 times better than the photosynthesis of your average plant. 
The researchers use catalysts made from a cobalt-phosphorous alloy to split the water into hydrogen and oxygen, and then set specially engineered bacteria to work gobbling up the carbon dioxide and hydrogen and converting them into liquid fuel. Other labs have also made notable strides in the efficiency and durability of solar fuel devices in recent months, including Lawrence Berkeley National Laboratory and the Joint Center for Artificial Photosynthesis. This year the latter lab created a solar-driven device that converted carbon dioxide to formate at 10 percent efficiency levels. Formate can be used as an energy source for specialized fuel cells. But the field still faces considerable technical challenges, as an earlier MIT Technology Review story explained, and any commercial products are still likely years away. This spring, a team of MIT researchers reported the development of a solar thermophotovoltaic device that could potentially push past the theoretical efficiency limits of the conventional photovoltaics used in solar panels.  Those standard solar cells can only absorb energy from a fraction of sunlight's color spectrum, mainly the visible light from violet to red. But the MIT scientists added an intermediate component made up of carbon nanotubes and nanophotonic crystals that together function sort of like a funnel, collecting energy from the sun and concentrating it into a narrow band of light. The nanotubes capture energy across the entire color spectrum, including in the invisible ultraviolet and infrared wavelengths, converting it all into heat energy. As the adjacent crystals heat up to high temperatures, around 1,000 °C, they reëmit the energy as light, but only in the band that photovoltaic cells can capture and convert. The researchers suggest that an optimized version of the technology could one day break through the theoretical cap of around 30 percent efficiency on conventional solar cells. 
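A back-of-the-envelope check on the reemission step described above, using the textbook Wien displacement law rather than anything from the MIT paper: an ideal blackbody at about 1,000 °C peaks well into the infrared, past the roughly 1.1-micrometer band edge of a silicon cell, which is why the nanophotonic crystals must reshape the emission into a usable band rather than simply glow.

```python
# Wien's displacement law: peak emission wavelength of an ideal blackbody.
# Illustrates why a selective emitter is needed: a plain ~1,000 C surface
# peaks in the mid-infrared, beyond what a silicon cell can convert.

WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvins

def peak_wavelength_um(temp_celsius):
    """Peak blackbody emission wavelength in micrometres."""
    temp_kelvin = temp_celsius + 273.15
    return (WIEN_B / temp_kelvin) * 1e6  # metres -> micrometres

print(f"{peak_wavelength_um(1000):.2f} um")  # about 2.28 um, mid-infrared
```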
In principle at least, solar thermophotovoltaics could achieve levels above 80 percent, though that's a long way off, according to the scientists. But there's another critical advantage to this approach. Because the process is ultimately driven by heat, it could continue to operate even when the sun ducks behind clouds, reducing the intermittency that remains one of the critical drawbacks of solar power. If the device were coupled with a thermal storage mechanism that could operate at these high temperatures, it could offer continuous solar power through the day and night. Perovskite solar cells are cheap, easy to produce, and very efficient at absorbing light. A thin film of the material, a class of hybrid organic and inorganic compounds with a particular type of crystal structure, can capture as much light as a relatively thick layer of the silicon used in standard photovoltaics. One of the critical challenges, however, has been durability. The compounds that actually absorb solar energy tend to quickly degrade, particularly in wet and hot conditions. But research groups at Stanford, Los Alamos National Laboratory, and the Swiss Federal Institute of Technology, among other institutions, made considerable strides in improving the stability of perovskite solar cells this year, publishing notable papers in Nature,  Nature Energy, and Science. "At the start of the year, they just weren't stable for long periods of time," says Ian Sharp, a staff scientist at Lawrence Berkeley National Lab. "But there have been some really impressive advances in that respect. This year things have really gotten serious." Meanwhile, other researchers have succeeded at boosting the efficiency of perovskite solar cells and identifying promising new paths for further advances. Electricity generation is responsible for producing 30 percent of the nation's carbon dioxide, so capturing those emissions at the source is crucial to any reduction plan. 
This year saw advances for several emerging approaches to capturing carbon in power plants, including carbonate fuel cells, as well as at least some promising implementations of existing technology in the real world. (Though, to be sure, there have been some starkly negative examples as well.) But most of these approaches leave open the question of what to do with the stuff after it's successfully captured. And it's not a small problem. The world produces nearly 40 billion tons of carbon dioxide annually. One method, however, appears more promising than initially believed: burying carbon dioxide and turning it into stone. Since 2012, Reykjavik Energy’s CarbFix Project in Iceland has been injecting carbon dioxide and water deep underground, where they react with the volcanic basalt rocks that are abundant in the region. An analysis published in Science in June found that 95 percent of the carbon dioxide had mineralized in less than two years, much faster than the hundreds of thousands of years many had expected. So far, the site also doesn't appear to be leaking greenhouse gases, which suggests it could be both cheaper and more secure than existing burial approaches. But further research will be required to see how well it works in other areas, notably including under the ocean floors, outside observers say. Another promising option for captured carbon dioxide is, essentially, recycling it back into usable fuels. Earlier this year, researchers at the U.S. Department of Energy's Oak Ridge National Laboratory stumbled onto a method for converting it into ethanol, the liquid fuel already used as an additive in gasoline. The team developed a catalyst made from carbon, copper, and nitrogen with a textured surface, which concentrated the electrochemical reactions at the tips of nanospikes, according to a study published in ChemistrySelect in October. When voltage was applied, the device converted a solution of carbon dioxide into ethanol at a high level of efficiency. 
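To put the CarbFix figure in perspective, a crude first-order decay model (an assumption made purely for illustration; the Science paper did not report this rate law) converts "95 percent mineralized in under two years" into an effective rate constant and half-life:

```python
import math

# Crude first-order kinetics sketch (illustrative assumption only; the
# published study did not fit this model): if 95% of the injected CO2
# mineralized within two years, what rate constant would that imply?

fraction_remaining = 0.05  # 5% of the CO2 still unmineralized...
elapsed_years = 2.0        # ...after two years underground

rate_per_year = -math.log(fraction_remaining) / elapsed_years
half_life_years = math.log(2) / rate_per_year

print(f"k = {rate_per_year:.2f}/yr, half-life = {half_life_years:.2f} yr")
# k = 1.50/yr, half-life = 0.46 yr -- months, not the hundreds of
# thousands of years many had expected for mineralization
```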
The materials were also relatively cheap and the process worked at room temperature, both critical advantages for any future commercialization. “We’re taking carbon dioxide, a waste product of combustion, and we’re pushing that combustion reaction backwards,” said lead author Adam Rondinone in a news release. In addition to converting captured carbon dioxide, the process could be used to store excess energy from wind and solar electricity generation. Some outside researchers, however, are skeptical about the initial results and are waiting to see whether other labs can verify the findings.


News Article | September 9, 2016
Site: www.cemag.us

While metal oxide semiconductors have been widely considered to exhibit outstanding durability, performance degradation in these solar energy harvesting components is frequently observed. Understanding the degradation is essential for developing stable, efficient photosystems. To address these failures, a team at the Joint Center for Artificial Photosynthesis investigated the photochemical instability of a widely used semiconductor. Their results reveal previously unpredicted degradation pathways and provide insights that can guide the design of more stable materials. Production of fuels from sunlight, carbon dioxide, and water relies on semiconductors that can resist corrosion in harsh operating conditions. Predicting and understanding the origin and pathways associated with the degradation of semiconductors is crucial to designing a next generation of robust and efficient materials. Artificial photosynthesis, which is the process of conversion of sunlight, carbon dioxide, and water into fuels, relies on chemically stable materials that can efficiently harvest solar energy under harsh operating conditions. Artificial systems must be constructed from robust components that can sustain years of operation without the need for energy-intensive and costly repairs. Currently, the lack of durable and efficient semiconductors and the complexity of fabricating stable assemblies are major roadblocks to the realization of viable artificial photosystems. In recent years, significant effort has been directed at developing novel protection schemes that can prolong the lifetimes of otherwise unstable materials. While these approaches have met with success, understanding – and then predicting – corrosion processes of semiconductors will greatly aid the discovery and development of materials that are inherently stable. 
To promote such understanding, scientists from the Joint Center for Artificial Photosynthesis used experimental and theoretical tools to assess the mechanisms underlying the degradation of bismuth vanadate in the working conditions present in a solar fuels device. Bismuth vanadate is currently one of the best materials available for fabricating semiconductor photoanodes to split water into hydrogen fuel and oxygen.  The study reveals that kinetic factors play a critical role in defining corrosion pathways. Indeed, accumulation of light-generated charge at the surface of the bismuth vanadate destabilizes the material. These and other insights will guide approaches to stabilization and aid the search for durable, visible-light-absorbing materials for the next generation of solar-to-fuel conversion systems.


News Article | August 22, 2016
Site: www.treehugger.com

Crystalline silicon solar cells make up the majority of solar panels out in the world today, but scientists believe that other types have the potential to be more efficient and carry more benefits. One of those types is perovskite solar cells, called that because they are made from compounds that have the crystal structure of the mineral perovskite. These solar cells are inexpensive and easy to fabricate, making them a great alternative to traditional solar cells. Scientists have been working with this technology for about seven years, and in just that amount of time the efficiency of those cells has increased from just three percent in 2009 to 22 percent today -- similar to silicon solar cells. That's the fastest efficiency increase of any solar cell material so far. Scientists at the Berkeley Lab's Molecular Foundry and the Joint Center for Artificial Photosynthesis have made a discovery that could push that efficiency up even higher -- up to 31 percent. Using photoconductive atomic force microscopy to study the structures of the cells at the nanoscale level, the researchers were able to map photocurrent generation and open circuit voltage in the active layer of the solar cell -- two properties that affect the conversion efficiency. The maps revealed a surface composed of bumpy, gemstone-like grains measuring about 200 nanometers each. Each grain had multiple facets that, it turns out, had varying conversion efficiencies. Some facets of the grains were highly efficient, reaching the 31 percent mark, while others were much lower. The researchers believe that if they can study the high efficiency facets and understand what makes them better at converting sunlight to electricity, they can produce a much higher efficiency solar cell overall. 
“If the material can be synthesized so that only very efficient facets develop, then we could see a big jump in the efficiency of perovskite solar cells, possibly approaching 31 percent,” said Sibel Leblebici, a postdoctoral researcher at the Molecular Foundry. The researchers found that each of the facets behaved like tiny solar cells all connected in parallel -- some performing really well and others not so much. The current flows towards the poorly-performing facets, which lowers the performance of the entire solar material. They believe that if the material could be constructed so that only the high efficiency facets connect with the electrode, then the efficiency of the solar cell would jump to as high as 31 percent, leading to a higher-performing and less expensive solar material than we use today.
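The parallel-connection picture above can be made concrete with a toy area-weighted model. All the numbers here are hypothetical, and this simple blend actually understates the loss, since in a real parallel circuit current can drain into the weak facets:

```python
# Toy model of facets as tiny solar cells wired in parallel. Numbers are
# hypothetical, not measurements from the Berkeley Lab study. A simple
# area-weighted blend already shows how weak facets drag the cell down;
# real parallel coupling (current flowing into poor facets) is worse.

facets = [
    # (area fraction, conversion efficiency)
    (0.4, 0.31),  # high-efficiency facets at the reported 31% mark
    (0.6, 0.17),  # hypothetical weaker facets
]

blended = sum(area * eff for area, eff in facets)
best = max(eff for _, eff in facets)

print(f"blended cell: {blended:.1%}, best facets alone: {best:.0%}")
# blended cell: 22.6%, best facets alone: 31%
```

Connecting only the high-efficiency facets to the electrode, as the researchers propose, corresponds to collecting the `best` figure instead of the blend.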


News Article | September 8, 2016
Site: phys.org

A solar simulator illuminates a photoelectrochemical cell that contains a bismuth vanadate thin-film electrode to harvest light. Credit: Joint Center for Artificial Photosynthesis and Paul Mueller (Lawrence Berkeley National Laboratory)
More information: Francesca M. Toma et al. Mechanistic insights into chemical and photochemical transformations of bismuth vanadate photoanodes, Nature Communications (2016). DOI: 10.1038/ncomms12012
