Hydrogen production through steam reforming of biomass-derived compounds is an economically feasible and environmentally benign way to use renewable energy resources. A study by scientists at Pacific Northwest National Laboratory combined experimental and theoretical methods to compare the hydrogen yield achieved by several metal catalysts used for steam reforming ethylene glycol, a compound found in the aqueous phases produced by direct liquefaction of plant-derived cellulose or cellulosic oxygenates. The findings show that a cobalt catalyst delivered a much higher hydrogen yield than rhodium or nickel catalysts, making it a promising candidate for hydrogen production. The work could lead to more efficient strategies for producing hydrogen from bio-derived aqueous phases to power diverse energy needs. The team studied steam reforming of ethylene glycol over MgAl2O4-supported rhodium, nickel, and cobalt catalysts. Computational work and advanced catalyst characterization were performed at EMSL, the Environmental Molecular Sciences Laboratory, a DOE national scientific user facility. Compared with the highly active rhodium and nickel catalysts, which achieved 100 percent conversion of ethylene glycol, the steam reforming activity of the cobalt catalyst was lower, with only 42 percent conversion under the same reaction conditions.
However, using the cobalt catalyst rather than the rhodium or nickel catalysts resulted in a three-fold drop in methane selectivity, the percentage of converted ethylene glycol that ends up as methane. Calculations revealed that the cobalt catalyst's lower methane selectivity, compared with the two other catalysts, is primarily due to a higher barrier for methane formation. The findings demonstrate that the cobalt catalyst yields more hydrogen, at the expense of methane, than the other two. The cobalt catalyst also offered enhanced stability compared with the more conventional nickel and rhodium catalysts. This information could be used to develop efficient methods for converting biomass-derived compounds into hydrogen for petroleum refining, the production of industrial commodities such as fertilizers, and electricity production via fuel cells. More information: Donghai Mei et al., "Steam Reforming of Ethylene Glycol over MgAl2O4-Supported Rh, Ni, and Co Catalysts," ACS Catalysis (2016). DOI: 10.1021/acscatal.5b01666
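The yield-versus-selectivity trade-off can be made concrete with a simplified stoichiometric model (my sketch, not the paper's analysis): full reforming produces 2.5 H2 per carbon atom, while each carbon diverted to methane consumes hydrogen instead. The selectivity values below are hypothetical, chosen only to illustrate the effect of a three-fold drop.

```python
# Simplified H2-yield model for ethylene glycol (EG) steam reforming.
# Full reforming:  HOCH2CH2OH + 2 H2O -> 2 CO2 + 5 H2   (+2.5 H2 per carbon)
# Methanation:     CO2 + 4 H2 -> CH4 + 2 H2O            (-1.5 H2 net per carbon as CH4)

def h2_yield_fraction(conversion, ch4_selectivity):
    """Fraction of the stoichiometric 5 H2 per EG molecule actually obtained.

    conversion:       fraction of EG converted (0..1)
    ch4_selectivity:  fraction of converted carbon ending up as CH4 (0..1)
    """
    per_carbon = 2.5 * (1 - ch4_selectivity) - 1.5 * ch4_selectivity
    return conversion * per_carbon / 2.5

# At identical conversion, a three-fold drop in methane selectivity
# (hypothetical 15% -> 5%) raises the hydrogen yield noticeably:
high_sel = h2_yield_fraction(1.0, 0.15)   # 0.76 of the stoichiometric maximum
low_sel = h2_yield_fraction(1.0, 0.05)    # 0.92 of the stoichiometric maximum
```

Both reaction equations balance, but real selectivity and conversion interact with temperature, steam ratio, and side products, so this is only a back-of-envelope illustration.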
News Article | April 17, 2016
Few things go together as poorly as science and politicians. Whether it’s Senator Ted Stevens describing the internet as a “series of tubes” during a net neutrality debate or Republican representatives reveling in their own ignorance about climate change, it’s clear that scientific illiteracy is a rampant problem in our nation’s hallowed halls of government. Yet this was precisely why it was so refreshing to see Canada’s recently elected Prime Minister Justin Trudeau explain the difference between a “normal” computer and a quantum computer completely off the cuff during a press briefing at the Perimeter Institute in Waterloo, Ontario, thereby proving that politics and science need not be mutually exclusive. Although Trudeau was at the Institute to announce $50 million in funding, which will allow those working at Perimeter to continue their work on fundamental physics, he took the time to break down the essence of quantum computing for a clueless journalist: “Normal computers work either with power going through a wire or not, a one or a zero,” Trudeau said. “They’re binary systems. What quantum states allow for is much more complex information to be encoded into a single bit. A regular computer bit is either a one or a zero, on or off. A quantum state can be much more complex than that because as we know things can be both a particle and a wave at the same time, and the uncertainty around quantum states allows us to encode more information into a much smaller computer. That’s what’s exciting about quantum computing.” While most applauded Trudeau’s remarkably “clear and concise” explanation of quantum computing, others deemed his description totally off the mark.
I decided to ask some experts on quantum computing what they thought of the Prime Minister’s explanation to settle the debate once and for all: Romain Alléaume—Associate Professor at Telecom ParisTech and Paris Center for Quantum Computing “The beginning of Justin Trudeau’s explanation, about the difference between a classical bit and a quantum bit is absolutely correct. To be frank, the argumentation of Justin gets gradually more ‘uncertain’ when he says that the uncertainty principle implies that we can encode more information into ‘smaller computers’. Maybe he wanted to say that quantum computers can process information ‘in superposition,’ which allows to speed up some computations (i.e., solve some problems on smaller computers), but I am not certain about that. It is great to see a high level politician show enthusiasm for one of the biggest challenges in modern science.” Amr Helmy—Director, University of Toronto’s Center of Quantum Information and Quantum Control “His account of the distinction between a classical and quantum state is accurate. This is impressive that Canada’s PM has given this some thought. His comment on how superposition aids in storing information is an argument that can be equally made to explain the power which quantum computing possesses to process information in a fashion that is distinctly different from the classical paradigms. These are insights that are rarely considered by a Prime Minister. The rest of the world should be jealous!” SCORE: Too complex an issue to rank Michele Mosca—University Research Chair and Co-founder, Institute for Quantum Computing, University of Waterloo. Founding Member, Perimeter Institute for Theoretical Physics "The task is to explain quantum computing to a lay audience in a 100 words or so. It’s extremely hard, for even the best scientists and communicators, to get something like this both correct and interesting, especially in 100 words. He doesn’t say anything wrong.
He conveys the essence of what quantum computing is, and why it might be more powerful. It’s understandable, and succinct. Also, keep in mind that this was something he said live, on the fly, in response to a joke from a reporter. Room for improvement? Hard to find. Can he next explain how encoding that more complex information in quantum bits leads to a more powerful computer? I’d love to hear his explanation." Aephraim Steinberg—Professor of Physics at the University of Toronto and member of Center of Quantum Information and Quantum Control “He zeroed in on the importance of how information is stored in a physical system, what a bit is, and the difference between classical bits and ‘quantum bits’ or ‘qubits’. This hinted he may have appreciated something very deep: the field of quantum computing is not just about trying to figure out how to speed up one task or another, but about understanding the fundamental role information has in the laws that govern the universe, how much information it takes to describe a physical system, and, on the flip side, what it means to store information in a physical system. “He faltered when trying to explain why a qubit is so much richer than a classical bit and threw in a few tangentially related buzzwords like ‘uncertainty’ and ‘particle and wave,’ in a way that made it clear that although he had the (accurate) sense that these concepts had something to do with quantum information he had to admit that he didn’t know what the connection was, but would throw caution to the wind and stir up some buzzword soup. “To put it bluntly, if you think about the level at which any scientist given a few minutes to try to explain quantum computing to him would have tried to pitch it, he probably got the gist and explained it back as well as you could imagine anyone doing. In any case, my joy is not because I believe our Prime Minister has become an expert at quantum physics. 
It is because he showed that he is ready to listen to scientists and try to understand what they are saying, what they believe is important, and why they demand support for basic research.”
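The distinction the experts are weighing, a classical bit that is exactly 0 or 1 versus a qubit described by two complex amplitudes, can be sketched in a few lines (a toy simulation of one qubit, not how a real quantum computer works):

```python
import math
import random

# A classical bit is exactly 0 or 1.
classical_bit = 1

# A single qubit is described by two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Here: an equal superposition of 0 and 1.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-9

def measure(a, b):
    """Collapse the superposition: 0 with probability |a|^2, else 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# A measurement yields only one classical bit; the richer quantum state
# shows up statistically, over many identically prepared qubits.
random.seed(0)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
# counts[0] and counts[1] each come out near 5,000.
```

This is where Trudeau's "more information in a single bit" needs the caveat the experts hint at: the amplitudes carry continuous information, but any one measurement still returns a single 0 or 1.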
A 3D hierarchical porous catalyst architecture based on the earth-abundant metals Ni, Fe, and Co has been fabricated through a facile hydrothermal and electrodeposition method for efficient oxygen evolution reaction (OER) and hydrogen evolution reaction (HER). The electrode comprises three levels of porous structure: the bottom supermacroporous Ni foam substrate (≈500 μm), an intermediate layer of vertically aligned macroporous NiCo2O4 nanoflakes (≈500 nm), and topmost NiFe(oxy)hydroxide mesoporous nanosheets (≈5 nm). This hierarchical architecture is binder-free and beneficial for exposing catalytic active sites, enhancing mass transport, and accelerating dissipation of gases generated during water electrolysis. Serving as an anode catalyst, the designed hierarchical electrode displays excellent OER catalytic activity with an overpotential of 340 mV to achieve a high current density of 1200 mA cm−2. Serving as a cathode catalyst, it exhibits excellent performance toward HER with a moderate overpotential of 105 mV to deliver a current density of 10 mA cm−2. Serving as both anode and cathode in a two-electrode water electrolysis system, the designed electrode requires a potential of only 1.67 V to deliver a current density of 10 mA cm−2 and exhibits excellent durability in prolonged bulk alkaline water electrolysis.
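A quick sanity check on those numbers (my back-of-envelope, not from the paper): the cell voltage is roughly the thermodynamic water-splitting potential plus the two electrode overpotentials. Note the quoted OER overpotential is at 1200 mA cm−2 while the cell figure is at 10 mA cm−2, and ohmic losses are neglected, so this is only illustrative.

```python
# Minimum cell voltage for water electrolysis: the thermodynamic
# potential (1.23 V) plus the kinetic overpotential at each electrode,
# plus ohmic (iR) losses, which are neglected here.
E_THERMO = 1.23          # V, standard potential for water splitting

eta_oer = 0.340          # V, OER overpotential (quoted at 1200 mA cm^-2)
eta_her = 0.105          # V, HER overpotential (quoted at 10 mA cm^-2)

def cell_voltage(eta_anode, eta_cathode, ir_drop=0.0):
    """Estimated two-electrode cell voltage in volts."""
    return E_THERMO + eta_anode + eta_cathode + ir_drop

v = cell_voltage(eta_oer, eta_her)
print(f"{v:.3f} V")   # 1.675 V, close to the reported 1.67 V at 10 mA cm^-2
```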
News Article | February 6, 2016
I live at the base of Wind Mountain in Washington state, which is also a quarter mile from its companion Wind River, and would you even believe that it is windy as shit here nearly all of the time? It's true! Shredded trees are just a fact of life in my little valley. The mornings after a good blow, which are most of them in the wintertime, will find local roads and trails coated in a fresh carpet of pine scraps and even some full-sized trees. It's spooky just being out and about when the wind is really ripping because it seems like just about any tree might be the next to snap with a shotgun report and tumble. You can hear them up above, creaking and moaning. Maybe the next one will be destined for my skull. It turns out that there are some interesting physics at work in the process of wind-caused tree felling. As researchers from France's École Polytechnique write in the current Physical Review E, all trees, irrespective of size and species, will fall in the presence of wind blowing at the same critical speed. To reach this conclusion, the physicists had help from an even better laboratory than my own: Cyclone Klaus. Klaus, which killed 26 and left many millions without power, hit Western Europe in 2009 backed by hurricane-force winds. In a few days' time, the storm leveled millions of trees of all types. The group behind the current study was led by David Quéré and Christophe Clanet, the principals behind École Polytechnique's Interfaces and Co. lab. Their study revolves around the physics of liquids interacting with solids and gases—related topics are wide-ranging and include superhydrophobicity, the Leidenfrost effect, and sports physics. In their paper, Quéré and Clanet note that the study of wind interfacing with wood is as old as physics itself. Leonardo studied it, and came up with the relationship D^2/L, where D is the diameter of a wooden beam or cylinder and L is its length. This was the critical ratio at which wood would break, he concluded.
A couple of centuries later, Galileo determined that it was more like D^3/L, and, finally, in 1740 the French mathematician and naturalist Comte de Buffon refined it to D^2.6/L^1.1. Which brings us to now, 2016, where we're still trying to fully understand the falling of trees, or their "loss of verticality." The phenomenon is known more properly as lodging. "The storm Klaus in France (January 24th, 2009) gives precious data on the vulnerability of trees in a large territory hosting many types of forest," Quéré and Clanet write. "The map of maximal wind speed and the map of trees broken after the storm suggest that strong winds fit with high percentages of damage. This result seems independent of the tree characteristics, as shown in areas A and B, where trees are respectively pines (softwood) and oaks (hardwood)." The group arrived at their conclusions via three broad physical concepts: Hooke's Law (how forces interact with elastic materials), Griffith's criterion (how cracks propagate through a material), and tree allometry (quantitative measurements of trees, basically). "A closer look at the shape of tree trunks, foliages, and wind unsteadiness leads to a more precise estimation of the absolute value of critical wind speed, found to be on the order of the maximal wind speeds expected on the Earth (≃50 m/s)," the paper concludes. "Hence our results might contribute to understanding why trees are such old living systems." As the authors note, the conclusion is hardly trivial in a changing climate. More storms means more wind, and this has obvious implications for, well, not getting crushed by trees, but also for how we design and build structures. Your world may soon enough look a lot more like mine.
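The three historical scaling laws are easy to compare numerically. A minimal sketch (prefactors omitted, so the values are in arbitrary units; only the ratios between geometries are meaningful):

```python
# Historical scaling laws for the critical load of a wooden beam of
# diameter D and length L (dimensional prefactors omitted).
def leonardo(D, L):
    return D ** 2 / L            # Leonardo da Vinci: D^2/L

def galileo(D, L):
    return D ** 3 / L            # Galileo: D^3/L

def buffon(D, L):
    return D ** 2.6 / L ** 1.1   # Comte de Buffon, 1740: D^2.6/L^1.1

# How much stronger does a 10 m beam get when its diameter doubles
# from 0.5 m to 1.0 m? The three laws disagree:
for law in (leonardo, galileo, buffon):
    ratio = law(1.0, 10.0) / law(0.5, 10.0)
    print(f"{law.__name__}: x{ratio:.2f}")
# leonardo: x4.00, galileo: x8.00, buffon: x6.06
```

Buffon's empirical exponent sits between Leonardo's and Galileo's, which is why it survived as the better fit to real timber tests.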
Uber has acquired Otto, a 90-plus-person technology startup whose stated mission is to rethink transportation, starting with self-driving trucks. Anthony Levandowski, Otto’s co-founder, will now lead Uber’s self-driving efforts—across personal transportation, delivery and trucking—reporting directly to Travis Kalanick, Uber CEO and Co-Founder, with teams in San Francisco, Palo Alto and Pittsburgh. Otto's hardware and software are tuned for the consistent patterns and easy-to-predict road conditions of highway driving. Sensors are installed high atop existing trucks, offering vehicles an unobstructed view of the road ahead. Highways represent only 5% of US roads, allowing Otto to focus its testing on this specific set of critical trucking routes. Together, we now have one of the strongest autonomous engineering groups in the world; self-driving trucks and cars that are already on the road thanks to Otto and Uber’s Advanced Technologies Center in Pittsburgh; the practical experience that comes from running ridesharing and delivery services in hundreds of cities; with the data and intelligence that comes from doing 1.2 billion miles on the road every month. … Of course, this is just the start, especially when it comes to safety. Over one million people die on the world’s roads every year and 90 percent of these accidents are due to human error. In the US, traffic accidents are a leading cause of death for people under 25. This is a tragedy that self-driving technology can help solve. That’s why our partnership with Swedish car maker Volvo, which we’re also announcing today, is so important. Volvo has consistently been a leader when it comes to safety. And partnership is crucial to our self-driving strategy because Uber has no experience making cars.
To do it well is incredibly hard, as I realized on my first visit to a car manufacturing plant several years ago. By combining Uber’s self-driving technology with Volvo’s state-of-the art vehicles and safety technology, we’ll get to the future faster than going it alone.