Potter C.,NASA |
Klooster S.,California State University, Monterey Bay |
Hiatt C.,California State University, Monterey Bay |
Genovese V.,California State University, Monterey Bay |
Castilla-Rubio J.C.,Silicon Valley 411
Environmental Research Letters | Year: 2011
Satellite remote sensing was combined with the NASA-CASA (Carnegie Ames Stanford Approach) carbon cycle simulation model to evaluate the impact of the 2010 drought (July through September) throughout tropical South America. Results indicated that net primary production in Amazon forest areas declined by an average of 7% in 2010 compared to 2008. This represented a loss of vegetation CO2 uptake and potential Amazon rainforest growth of nearly 0.5 Pg C in 2010. The largest overall decline in ecosystem carbon gains by land cover type was predicted for closed broadleaf forest areas of the Amazon river basin, including a large fraction of regularly flooded forest areas. Model results support the hypothesis that soil and dead wood carbon decomposition fluxes of CO2 to the atmosphere were elevated during the drought period of 2010 in periodically flooded forest areas, compared to those for forests outside the main river floodplains. © 2011 IOP Publishing Ltd.
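The two headline figures in this abstract are mutually consistent: a 7% decline amounting to roughly 0.5 Pg C implies a baseline forest NPP on the order of 7 Pg C per year. The baseline value below is inferred from these two numbers, not stated in the abstract; this is only a back-of-envelope check.

```python
# Back-of-envelope consistency check: if a 7% decline in net primary
# production (NPP) corresponds to an absolute loss of ~0.5 Pg C, the
# implied baseline NPP of the affected forest area is 0.5 / 0.07,
# i.e. roughly 7.1 Pg C per year.
def implied_baseline_npp(loss_pg_c, fractional_decline):
    """Baseline NPP (Pg C/yr) consistent with a given loss and fractional decline."""
    return loss_pg_c / fractional_decline

baseline = implied_baseline_npp(0.5, 0.07)
print(f"Implied baseline NPP: {baseline:.1f} Pg C/yr")  # → about 7.1 Pg C/yr
```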
Aguiar A.P.D.,National Institute for Space Research |
Ometto J.P.,National Institute for Space Research |
Nobre C.,National Institute for Space Research |
Lapola D.M.,Claro |
And 7 more authors.
Global Change Biology | Year: 2012
We present a generic spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics of the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four maps of biomass, and a set of alternative parameters based on the recent literature. The most important results are as follows: (a) Using different biomass maps leads to large differences in emission estimates; for the entire Brazilian Amazon in the last decade, emission estimates from primary forest deforestation range from 0.21 to 0.26 Pg C yr⁻¹. (b) Secondary vegetation growth has only a small impact on the emission balance because secondary vegetation is short-lived; on average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 × 10³ km² in 2004 to 7 × 10³ km² in 2010. INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving to areas of higher biomass. The decrease is slower than a non-process instantaneous model would estimate, because INPE-EM accounts for residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr⁻¹ in 2004 to 0.15 Pg C yr⁻¹ in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr⁻¹. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates remain the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+.
The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests. © 2012 Blackwell Publishing Ltd.
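The abstract's key modeling point, that a process-based model declines more slowly than an instantaneous one because of residual emissions, can be sketched in a few lines. All parameter values below (the fraction emitted at clearing, the residual decay horizon, the clearing series) are hypothetical illustrations, not the INPE-EM parameters.

```python
# Sketch: instantaneous vs. process-based deforestation emissions.
# An instantaneous model books all committed carbon in the clearing year;
# a process model emits a fraction immediately and spreads the residual
# (slash decay, wood products) over later years, so annual emissions
# decline more slowly when clearing rates fall.

def instantaneous_emissions(cleared_carbon):
    """All carbon from each year's clearing emitted that same year."""
    return list(cleared_carbon)

def process_emissions(cleared_carbon, instant_frac=0.4, residual_years=3):
    """A fraction is emitted at clearing; the rest is spread over later years."""
    n = len(cleared_carbon)
    out = [0.0] * (n + residual_years)
    for t, c in enumerate(cleared_carbon):
        out[t] += instant_frac * c            # emitted at clearing (e.g., burning)
        residual = (1 - instant_frac) * c     # slash decay and wood products
        for k in range(1, residual_years + 1):
            out[t + k] += residual / residual_years
    return out[:n]

# Falling clearing rates (Pg C committed per year, hypothetical):
cleared = [0.30, 0.25, 0.18, 0.12, 0.08]
print(instantaneous_emissions(cleared))
print([round(e, 3) for e in process_emissions(cleared)])
```

In the final year the process model still emits more than the instantaneous one, reflecting the residual carry-over that makes the modeled decline slower.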
Chang K.,Silicon Valley 411 |
Low R.M.,San Jose State University |
Stamp M.,National Security Agency
Cryptologia | Year: 2014
Rotor cipher machines played a large role in World War II: Germany used Enigma; America created Sigaba; Britain developed Typex. The breaking of Enigma by Polish and (later) British cryptanalysts had an enormous impact on the war. However, despite being based on the commercial version of the Enigma, there is no documented successful attack on Typex during its time in service. This article covers the Typex machine. The authors trace the development of Typex, discuss how it works, and present and analyze two cryptanalytic attacks on the cipher. The first attack assumes the rotor wirings are known and uses Turing's crib attack, originally developed for Enigma, to recover the settings of the stepping rotors. The second attack assumes that the rotor wirings are unknown. This ciphertext-only attack uses a hill climb to determine the wirings of the stepping rotors. Finally, the authors briefly consider an attack developed by Polish cryptanalysts to recover the Enigma rotor wirings, and they argue that Typex was significantly more resistant to this particular attack. © 2014 Copyright Taylor & Francis Group, LLC.
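The ciphertext-only attack mentioned above is a hill climb over candidate rotor wirings. The skeleton of that technique is simple: perturb a candidate key, keep the change when a fitness score improves, revert otherwise. The sketch below is a toy illustration of the generic technique with a contrived fitness function, not the actual Typex attack or its scoring statistic.

```python
import random

# Generic hill-climb skeleton of the kind used in ciphertext-only attacks:
# perturb the candidate key (here, swap two entries of a 26-entry wiring
# permutation) and keep the swap only when the fitness score improves.

def hill_climb(score, key, steps=10000, seed=0):
    """Greedily maximize score(key) by random pairwise swaps; key is a permutation."""
    rng = random.Random(seed)
    best = score(key)
    for _ in range(steps):
        i, j = rng.randrange(len(key)), rng.randrange(len(key))
        key[i], key[j] = key[j], key[i]
        s = score(key)
        if s > best:
            best = s                          # keep the improving swap
        else:
            key[i], key[j] = key[j], key[i]   # revert the swap
    return key, best

# Toy fitness: closeness of the candidate to a known target wiring.
# A real attack would instead score statistics of the trial decryption.
target = list(range(26))
score = lambda k: sum(a == b for a, b in zip(k, target))
start = list(range(26))
random.Random(1).shuffle(start)
recovered, fitness = hill_climb(score, list(start))
print("fitness after hill climb:", fitness)
```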
Faggin F.,Silicon Valley 411
Mondo Digitale | Year: 2015
After elucidating the fundamental concepts of consciousness, computer, and living cell, this article considers the crucial difference between a cell and a computer. The conclusion is that a cell is a dynamic and holistic nanosystem based on the laws of quantum physics, whereas a computer is a "static" system using the reductive laws of classical physics. The essence of consciousness is its capacity to perceive and know through sensations and feelings. However, there is no known physical phenomenon allowing the conversion of electrical activity, either in a computer or in a brain, into feelings: The two phenomena are incommensurable. To explain the nature of consciousness, the author introduces a model of reality based on cognitive principles rather than materialistic ones. According to this model, consciousness is a holistic and irreducible property of the primordial energy out of which everything is made (space, time, and matter). As such, consciousness can only grow if the components of a system combine holistically, as happens in a cell. But since the computer is a reductionistic system, its consciousness cannot grow with the number of its elementary components (the transistors), and thus remains the same as that of a single transistor.
Sanders S.R.,University of California at Berkeley |
Alon E.,University of California at Berkeley |
Le H.-P.,University of California at Berkeley |
Seeman M.D.,Silicon Valley 411 |
And 2 more authors.
IEEE Transactions on Power Electronics | Year: 2013
This paper provides a perspective on progress toward realizing efficient, fully integrated dc-dc conversion and regulation functionality in CMOS platforms. In a comparative assessment of the inductor-based and switched-capacitor approaches, the paper reviews the salient features of each: how effectively switch technology is utilized, and how the passives are used and implemented. The analytical conclusions point toward the strong advantages of the switched-capacitor (SC) approach with respect to both switch utilization and the much higher energy density of capacitors versus inductors. The analysis is substantiated with a review of recently developed and published integrated dc-dc converters of both the inductor-based and SC types. © 2012 IEEE.
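A common first-order way to reason about SC converter efficiency is to model the stage as an ideal fixed-ratio transformer followed by an equivalent output resistance; in the slow-switching limit that resistance scales as 1/(f_sw · C_fly). The sketch below uses this standard textbook model with hypothetical component values chosen only to illustrate the trade-off; it is not drawn from the paper itself.

```python
# Minimal first-order model of a 2:1 switched-capacitor (SC) dc-dc stage
# in the slow-switching limit: an ideal 2:1 divider followed by an
# equivalent output resistance R_out ≈ 1 / (f_sw * C_fly). All conversion
# loss then appears as conduction drop, so efficiency = Vout / (Vin / 2).

def sc_2to1(vin, i_load, f_sw, c_fly):
    """Return (vout, efficiency) for an ideal 2:1 SC stage with SSL loss."""
    r_out = 1.0 / (f_sw * c_fly)      # slow-switching-limit output impedance
    vout = vin / 2 - i_load * r_out   # droop under load
    eff = vout / (vin / 2)            # loss appears entirely as voltage droop
    return vout, eff

# Hypothetical operating point: 3.6 V input, 100 mA load, 10 MHz, 100 nF.
vout, eff = sc_2to1(vin=3.6, i_load=0.1, f_sw=10e6, c_fly=100e-9)
print(f"Vout = {vout:.3f} V, efficiency = {eff:.1%}")  # → 1.700 V, 94.4%
```

Raising the switching frequency or the flying capacitance lowers R_out and recovers efficiency, which is why the high energy density of on-chip capacitors matters in this comparison.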
Keay A.,University of Sydney |
Keay A.,Silicon Valley 411
Proceedings of IEEE Workshop on Advanced Robotics and its Social Impacts, ARSO | Year: 2011
The naming of robots in competitions is an emergent phenomenon. Naming is invoked by the affordances of competition registration forms. Naming both reflects and creates robot identity, and is illustrative of the social relations between robot designer and robot. Naming is a fundamental classifier and naming practices have been shown to have far reaching implications for behavior in many domains. © 2011 IEEE.
Akbar I.,Silicon Valley 411
IEEE Consumer Electronics Magazine | Year: 2014
Stephen (Steve) Perlman is the founder of OnLive and WebTV. He is known for the invention of QuickTime, which is built into all Apple computers and phones. In February, he announced a wireless broadband technology called pCell, created by his latest start-up, Artemis Networks. The company has been working on this technology for ten years under the code name DIDO. This technology will enable full-speed wireless broadband to every mobile device, regardless of how many users are sharing the same wireless spectrum at once. The technology is compatible with existing fourth-generation (4G) standards, such as the LTE used by the most recent mobile phones. Before we take a careful look at the technology, let us step back and put this into context. © 2014 IEEE.
Rosenthal D.S.H.,Silicon Valley 411
Communications of the ACM | Year: 2010
Data bits can be protected by keeping more, and more independent, copies and by auditing those copies more frequently. Reliability models typically assume that disk and sector failures are the only contributors to system failure, dismissing all other threats to stored data as possible causes of data loss. The NetApp study identified the incidence of silent storage corruption in individual disks in RAID arrays and found more than 4 × 10⁵ silent corruption incidents. Identical systems are subject to common-mode failures, such as those caused by a software bug present in all the systems damaging the same data in each. The fast array of wimpy nodes (FAWN) couples low-power embedded CPUs to small amounts of local flash storage, and balances computation and I/O capabilities to enable efficient, massively parallel access to data.
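The replicate-and-audit argument in the first sentence has a simple quantitative core: if copies fail independently, data is lost in an audit interval only when every copy fails before the next audit can repair it. The per-copy failure probability below is hypothetical, and real systems violate the independence assumption (the common-mode failures the abstract warns about), so this is an optimistic sketch.

```python
# Sketch of the replication-and-audit argument: with n independent copies,
# each failing with probability p per audit interval, loss in an interval
# requires every copy to fail before repair, so P(loss) ≈ p ** n.
# Auditing more often shrinks p; adding copies raises n. Common-mode
# failures (same bug in every system) break the independence assumption
# and make the real loss probability much worse than this estimate.

def loss_probability(p_copy_fail, n_copies):
    """Probability that all copies fail within one audit interval, assuming independence."""
    return p_copy_fail ** n_copies

for n in (1, 2, 3, 4):
    print(f"{n} copies: P(loss per interval) = {loss_probability(0.01, n):.0e}")
```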
Walker A.J.,Philips |
Walker A.J.,Silicon Valley 411
IEEE Transactions on Semiconductor Manufacturing | Year: 2013
Scaling challenges with NAND flash have forced manufacturers to consider monolithic 3-D process and device architectures as potential successor technologies. Those that involve a vertical cylindrical channel are regarded as favorites. These include bit-cost scalable (BiCS) NAND, pipe-shaped bit-cost scalable (p-BiCS) NAND, and terabit cell array transistor (TCAT) NAND. It has been assumed that their manufacturing costs decrease monotonically with the number of additional device layers. This paper presents a rigorous analysis of this assumption based on recently reported challenges associated with the construction of these architectures. It is shown that there is a minimum in die cost, after which costs increase with increasing device layers. Moreover, achievable die sizes using these approaches may not even match those of existing production NAND flash. An important consequence is that monolithic 3-D approaches involving more lithography-intensive steps may actually result in lower total cost, provided that these scale appropriately. © 1988-2012 IEEE.
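The cost minimum described above arises from two competing effects: each added device layer contributes extra processing cost roughly linearly, while compound yield loss grows geometrically with layer count. A toy model makes the non-monotonic shape visible; all parameter values here are hypothetical, chosen only to illustrate the mechanism, and are not the paper's cost figures.

```python
# Toy model of the 3-D NAND die-cost minimum: wafer cost grows linearly
# with layer count while good bits grow linearly but are discounted by a
# compound per-layer yield, so cost per bit falls, bottoms out, then rises.
# All parameters are hypothetical (arbitrary cost units).

def cost_per_bit(n_layers, base_cost=100.0, cost_per_layer=20.0,
                 yield_per_layer=0.98, bits_per_layer=1.0):
    wafer_cost = base_cost + cost_per_layer * n_layers
    good_fraction = yield_per_layer ** n_layers      # compound yield loss
    good_bits = bits_per_layer * n_layers * good_fraction
    return wafer_cost / good_bits

costs = {n: cost_per_bit(n) for n in range(1, 101)}
best = min(costs, key=costs.get)
print(f"cost per bit reaches its minimum at {best} layers, then rises")
```

With these illustrative numbers the minimum falls in the low tens of layers; tightening per-layer yield or cheapening each added layer pushes the optimum higher, which is the lever the abstract's conclusion turns on.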