Aguiar A.P.D.,National Institute for Space Research | Ometto J.P.,National Institute for Space Research | Nobre C.,National Institute for Space Research | Lapola D.M.,Claro | And 7 more authors.
Global Change Biology | Year: 2012

We present a generic spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics of the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four biomass maps, and a set of alternative parameters based on the recent literature. The most important results are as follows: (a) Using different biomass maps leads to large differences in emission estimates; for the entire Brazilian Amazon in the last decade, emission estimates from primary forest deforestation range from 0.21 to 0.26 Pg C yr⁻¹. (b) Secondary vegetation growth has a small impact on the emission balance because of the short duration of secondary vegetation. On average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 × 10³ km² in 2004 to 7 × 10³ km² in 2010. INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving to areas of higher biomass. The decrease is slower than a non-process instantaneous model would estimate because INPE-EM accounts for residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr⁻¹ in 2004 to 0.15 Pg C yr⁻¹ in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr⁻¹. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates are still the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+.
The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests. © 2012 Blackwell Publishing Ltd.
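The difference the abstract draws between instantaneous and process-based accounting can be sketched as follows. This is a toy illustration, not the INPE-EM model: the carbon density, burn fraction, and decay rate are made-up round numbers, and only a single residual (slash) pool is tracked.

```python
# Toy contrast: instantaneous vs. process-based emission accounting.
# All parameters below are illustrative, not from the paper.

def instantaneous(cleared_areas, carbon_density):
    """All carbon from a year's clearing is counted as emitted that year."""
    return [area * carbon_density for area in cleared_areas]

def process_based(cleared_areas, carbon_density,
                  burn_fraction=0.5, decay_rate=0.3):
    """A fraction burns at clearing time; the remaining slash pool
    decays (and is emitted) gradually over the following years."""
    n = len(cleared_areas)
    emissions = [0.0] * n
    for year, area in enumerate(cleared_areas):
        carbon = area * carbon_density
        emissions[year] += carbon * burn_fraction   # immediate combustion
        pool = carbon * (1.0 - burn_fraction)       # residual slash pool
        for later in range(year + 1, n):
            released = pool * decay_rate            # this year's decay
            emissions[later] += released
            pool -= released
    return emissions

# Declining clearing rates (10^3 km^2/yr, loosely echoing 2004-2010)
areas = [27, 19, 14, 12, 12, 7, 7]
density = 0.01  # Pg C per 10^3 km^2, illustrative only
inst = instantaneous(areas, density)
proc = process_based(areas, density)
```

When clearing rates fall, the process-based series starts lower (part of each year's carbon is deferred) but declines more slowly than the instantaneous series, because residual pools from earlier high-deforestation years keep emitting; this is the behavior the abstract describes.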


Kaur S.,Silicon Valley 411
IETE Technical Review (Institution of Electronics and Telecommunication Engineers, India) | Year: 2013

Applications increasingly demand more computational power and performance. This need, combined with technological advances in electronics, is putting ever more cores and processors on ever smaller Systems-on-Chip (SoC). Despite this growth, limitations of current on-chip interconnection solutions leave much of the SoC's potential unutilized. In this column, "Pushing Frontiers with the First Lady of Emerging Technologies", Dr. Satwant Kaur explains how her vision of Networks-on-Chip (NoC), also known as on-chip networks, will fulfill the potential and promise of on-chip systems. She discusses key emerging technologies used for NoC in the areas of topologies, switching, congestion, deadlock recovery, 3D SoC, flow control, and interference control to bring about improvements in data quality, communication, performance, connectivity, power consumption, and scalability of today's SoC.


Stephen (Steve) Perlman is the founder of OnLive and WebTV. He is known for the invention of QuickTime, which is built into all Apple computers and phones [1]. In February, he announced a wireless broadband technology called pCell, created by his latest start-up, Artemis Networks. The company has been working on this technology for ten years under the code name DIDO. The technology will enable full-speed wireless broadband to every mobile device, regardless of how many users share the same wireless spectrum at once [2]. It is compatible with existing fourth-generation (4G) standards, such as LTE, which is used by the most recent mobile phones. Before we take a careful look at the technology, let us step back and put it into context. © 2014 IEEE.


Faggin F.,Silicon Valley 411
Mondo Digitale | Year: 2015

After elucidating the fundamental concepts of consciousness, computer, and living cell, this article considers the crucial difference between a cell and a computer. The conclusion is that a cell is a dynamic and holistic nanosystem based on the laws of quantum physics, whereas a computer is a "static" system using the reductive laws of classical physics. The essence of consciousness is its capacity to perceive and know through sensations and feelings. However, there is no known physical phenomenon allowing the conversion of electrical activity, either in a computer or in a brain, into feelings: the two phenomena are incommensurable. To explain the nature of consciousness, the author introduces a model of reality based on cognitive principles rather than materialistic ones. According to this model, consciousness is a holistic and irreducible property of the primordial energy out of which everything is made (space, time, and matter). As such, consciousness can only grow if the components of a system combine holistically, as happens in a cell. But since the computer is a reductionistic system, its consciousness cannot grow with the number of its elementary components (the transistors), thus remaining the same as that of a single transistor.


Chang K.,Silicon Valley 411 | Low R.M.,San Jose State University | Stamp M.,National Security Agency
Cryptologia | Year: 2014

Rotor cipher machines played a large role in World War II: Germany used Enigma; America created Sigaba; Britain developed Typex. The breaking of Enigma by Polish and (later) British cryptanalysts had an enormous impact on the war. However, despite Typex being based on the commercial version of the Enigma, there is no documented successful attack on it during its time in service. This article covers the Typex machine. The authors trace the development of Typex, discuss how it works, and present and analyze two cryptanalytic attacks on the cipher. The first attack assumes the rotor wirings are known and uses Turing's crib attack, originally developed for Enigma, to recover the settings of the stepping rotors. The second attack assumes that the rotor wirings are unknown. This ciphertext-only attack uses a hill climb to determine the wirings of the stepping rotors. Finally, the authors briefly consider an attack developed by Polish cryptanalysts to recover the Enigma rotor wirings, and they argue that Typex was significantly more resistant to this particular attack. © 2014 Copyright Taylor & Francis Group, LLC.
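The hill-climbing strategy mentioned in the abstract can be illustrated on a much simpler target. The sketch below climbs over candidate keys for a toy monoalphabetic substitution cipher; the actual attack in the paper searches over candidate Typex stepping-rotor wirings with a far stronger statistic than the crude unigram score used here, so treat everything below (the `fitness` weights, the swap move) as illustrative assumptions.

```python
# Minimal hill-climb sketch: swap two letters of a candidate key and
# keep the swap only when it improves a plaintext-likeness score.
# Toy analogue of the ciphertext-only attack described in the abstract.

import random
import string

ALPHA = string.ascii_uppercase

def apply_key(ciphertext, key):
    """Decrypt: substitute each ciphertext letter via the candidate key."""
    return ciphertext.translate(str.maketrans(ALPHA, key))

def fitness(text):
    """Crude English-ness score rewarding common letters (illustrative)."""
    weights = {'E': 12.7, 'T': 9.1, 'A': 8.2, 'O': 7.5, 'I': 7.0,
               'N': 6.7, 'S': 6.3, 'H': 6.1, 'R': 6.0, 'D': 4.3}
    return sum(weights.get(c, 1.0) for c in text)

def hill_climb(ciphertext, iterations=1000, seed=0):
    """Greedy local search over the space of key permutations."""
    rng = random.Random(seed)
    key = list(ALPHA)
    rng.shuffle(key)                         # random starting key
    best = fitness(apply_key(ciphertext, ''.join(key)))
    for _ in range(iterations):
        i, j = rng.randrange(26), rng.randrange(26)
        key[i], key[j] = key[j], key[i]      # propose a swap
        candidate = fitness(apply_key(ciphertext, ''.join(key)))
        if candidate > best:
            best = candidate                 # accept improving swap
        else:
            key[i], key[j] = key[j], key[i]  # revert
    return ''.join(key), best
```

Because only improving swaps are accepted, the score is monotonically non-decreasing; real attacks of this kind typically add restarts and better scoring statistics (e.g. n-gram models) to escape local optima.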
