Gainesville, FL, United States

System dynamics is an approach to understanding the behaviour of complex systems over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. What makes using system dynamics different from other approaches to studying complex systems is the use of feedback loops and stocks and flows. These elements help describe how even seemingly simple systems display baffling nonlinearity. (Wikipedia)
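For readers new to the notation, here is a minimal stock-and-flow sketch in Python (our illustration, not from the Wikipedia entry): a single stock with a reinforcing growth loop and a balancing crowding loop, integrated with the simple Euler stepping that system dynamics tools typically use. Even this two-loop model produces the nonlinear S-shaped trajectory the definition alludes to.

```python
# Minimal stock-and-flow sketch (illustrative): a single "population"
# stock with a reinforcing birth loop and a balancing crowding loop --
# the classic logistic model, integrated with Euler steps the way
# system dynamics tools do.

def simulate(stock=10.0, birth_rate=0.08, capacity=1000.0,
             dt=0.25, steps=400):
    history = []
    for _ in range(steps):
        # reinforcing loop: more stock -> more inflow
        inflow = birth_rate * stock
        # balancing loop: crowding throttles growth as stock nears capacity
        outflow = birth_rate * stock * (stock / capacity)
        stock += (inflow - outflow) * dt   # Euler integration of the stock
        history.append(stock)
    return history

trajectory = simulate()
print(f"final stock: {trajectory[-1]:.1f}")  # approaches the carrying capacity
```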



Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 3.44M | Year: 2013

Compared to many parts of the world, the UK has under-invested in its infrastructure in recent decades. It now faces many challenges in upgrading its infrastructure so that it is appropriate for the social, economic and environmental challenges it will face in the remainder of the 21st century. A key challenge involves taking into account the ways in which infrastructure systems in one sector increasingly rely on infrastructure systems in other sectors in order to operate. These interdependencies mean failures in one system can cause follow-on failures in other systems. For example, failures in the water system might knock out electricity supplies, which disrupt communications, and therefore transportation, which in turn prevents engineers from getting to the original problem in the water infrastructure. These problems now generate major economic and social costs. Unfortunately they are difficult to manage because the UK infrastructure system has historically been built, and is currently operated and managed, around individual infrastructure sectors. Because many privatised utilities have focused on operating infrastructure assets, they have limited experience in producing new ones or in understanding these interdependencies. Many of the old national R&D laboratories have been shut down, and there is a lack of capability in the UK to procure and deliver the modern infrastructure the UK requires. On the one hand, this makes innovation risky. On the other hand, it creates significant commercial opportunities for firms that can improve their understanding of infrastructure interdependencies and speed up how they develop and test their new business models. This learning is difficult because infrastructure innovation is undertaken in complex networks of firms, rather than in an individual firm, and typically has to address a wide range of stakeholders, regulators, customers, users and suppliers. Currently, the UK lacks a shared learning environment where these different actors can come together and explore the strengths and weaknesses of different options. This makes innovation more difficult and costly, as firms are forced to learn by doing and find it difficult to anticipate technical, economic, legal and societal constraints on their activity before they embark on costly development projects. The Centre will create a shared, facilitated learning environment in which social scientists, engineers, industrialists, policy makers and other stakeholders can research and learn together to understand how better to exploit the technical and market opportunities that emerge from the increased interdependence of infrastructure systems. The Centre will focus on the development and implementation of innovative business models and aims to support UK firms wishing to exploit them in international markets. The Centre will undertake a wide range of research activities on infrastructure interdependencies with users, which will allow problems to be discovered and addressed earlier and at lower cost. Because infrastructure innovations alter the social distribution of risks and rewards, the public needs to be involved in decision making to ensure business models and forms of regulation are socially robust. As a consequence, the Centre has a major focus on using its research to catalyse a broader national debate about the future of the UK's infrastructure, and how it might contribute towards a more sustainable, economically vibrant, and fair society.
Beneficiaries of the Centre's activities include existing utility businesses, entrepreneurs wishing to enter the infrastructure sector, regulators, government and, perhaps most importantly, our communities, who will benefit from more efficient and less vulnerable infrastructure-based services.
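To make the interdependency point concrete, here is a toy sketch (a hypothetical dependency graph, not the Centre's model) of how a single failure propagates through the kind of water, electricity, communications and transport chain described above.

```python
# Toy sketch of cascading infrastructure failure: each sector fails if
# any sector it depends on has failed, mirroring the water -> electricity
# -> communications -> transport chain described in the grant summary.

DEPENDS_ON = {
    "electricity": ["water"],          # e.g. cooling water for generation
    "communications": ["electricity"],
    "transport": ["communications"],   # signalling and dispatch
    "water": [],
}

def cascade(initial_failures):
    failed = set(initial_failures)
    changed = True
    while changed:                     # propagate until no new failures appear
        changed = False
        for sector, deps in DEPENDS_ON.items():
            if sector not in failed and any(d in failed for d in deps):
                failed.add(sector)
                changed = True
    return failed

print(cascade({"water"}))  # a single water failure takes down all four sectors
```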


News Article | April 22, 2016
Site: www.washingtonpost.com

As over 150 nations assemble to sign the Paris climate agreement in New York on Friday, reams of new analysis are pouring out from the planet’s vital number-crunchers, who look at the fundamental relationship between how much carbon we put in the air and how much the planet’s temperature increases as a result. And it’s adding up to a somber verdict: We seem closer to must-avoid climate thresholds than we thought — and crossing them may have bigger consequences than we recognize. The Paris climate agreement pledges countries to keep the planet’s warming “well below” 2 degrees Celsius (3.6 degrees F) above pre-industrial levels, and to strive to keep warming as low as 1.5 degrees C (2.7 degrees F) above those levels. But here are four things you need to know about these targets, based upon four separate new and insightful analyses:

1.5 degrees C isn’t looking so far off lately. An analysis by Climate Central shows that the planet has been right around 1.5 degrees C all year this year, if you take temperatures from 1881-1910 to be the pre-industrial baseline. “The average global temperature change for the first three months of 2016 was 1.48°C, essentially equaling the 1.5°C warming threshold agreed to by COP 21 negotiators in Paris last December,” the group wrote. February of 2016, Climate Central calculates, was actually slightly warmer than 1.5 degrees C over pre-industrial levels. The news isn’t as bad as it sounds: these have been some super-hot months, and El Niño is at least partly to blame. We’re likely to cool down some as El Niño ends — and we won’t truly have crossed the 1.5 C threshold until the globally averaged temperature exceeds it over multiple years, so that it becomes the average. That will take far more than a few hot months. Still, 1.5 C hardly sounds theoretical lately. We already know what it feels like on a temporary basis, and it has coincided with mass coral bleaching, early Greenland melting and much more.

2 degrees C is considerably worse than 1.5. Meanwhile, a new study just out in Earth System Dynamics, by researchers with Climate Analytics, the Potsdam Institute for Climate Impact Research, and several other institutions, has found that although 1.5 C and 2 C may not sound all that different, they actually are, in terms of their impacts. “Before many have argued that there can’t be much difference because temperatures are so close and there’s so much uncertainty,” says Climate Analytics’ William Hare, one of the study’s authors. “But we’ve done an end-to-end uncertainty analysis, using 5 climate models and a state of the art impact assessment … to pull out some of the statistically significant signals.” For instance, the study finds that “virtually all” tropical reefs the globe over are at risk of “severe degradation” at 2 degrees C starting in the year 2050, but that for a 1.5 C scenario, the figure is only 90 percent, and it actually lessens over the course of the century to 70 percent by its end. In other words, 1.5 C just might save some coral reefs. That’s not all the study found. In some regions of the globe, like the Mediterranean, water-availability risks are much worse at 2 C than at 1.5 C. In others, like parts of Africa, agricultural risks could be considerably higher, to list just a few of the findings. Extreme heat events also show a “substantial increase” in likelihood of occurrence at 2 C, according to the study.
“There’s a really substantial reduction of risk for areas that are already hot and dry and suffering food and water shortages,” says Hare, if we hold warming to 1.5 rather than 2 degrees above pre-industrial levels.

Fast policy moves are needed to achieve either target. Meanwhile, an analysis by Climate Interactive and the MIT Sloan School of Management finds that the current Paris agreement pledges — made by individual countries as part of the agreement, and supposed to be improved upon over time — would still let the world warm by as much as 3.5 degrees Celsius by 2100. They obviously need to be ratcheted up, then. How fast? The analysis finds that “with each year that countries wait to strengthen their current pledges, the rate at which emissions must decline gets steeper and steeper.” So if we wait for global emissions to peak in 2030, rather than in 2020, then after the peak they will have to decline by 4.6 percent per year, the analysis finds, a number that is “prohibitively fast.” If we peak in 2020, though, then reductions only have to happen at 3.2 percent per year to stay under 2 degrees C, “a rate that has been achieved by some nations in the past.” Thus, if possible, emissions should peak by 2020. The United States, in this scenario, would have to go from lowering its emissions 26 percent below 2005 levels by 2025 (its current goal) to cutting them by 45 percent by 2030. Other nations would have to make similarly large improvements on what they are currently promising to do. And even then, due to scientific uncertainty, the planet could still conceivably overshoot 2 degrees: this pathway offers only a 66 percent or greater chance of staying below the threshold. Of course, the actual embraced goal of the world is to stay “well below” 2 degrees, a target that suggests prudent avoidance, not walking right up to it and potentially going over. Accordingly, the analysis also examined what it would take to suppress emissions fast enough to hit 1.5 degrees C. Here we’d have to have global emissions peak in 2020 and then decline by 5.9 percent annually thereafter. The United States, here, would have to get its emissions 60 percent below their 2005 levels by 2030. This is extreme, but then, that’s what it would take.

If we want to buy time, we have to save forests. There’s some good news here. According to an analysis by the Woods Hole Research Center, if we stop deforesting the tropics and instead move rapidly to restore these forests, we can buy 10 to 15 more years to try to stay within 2 degrees C. The reason is that if deforestation abruptly stopped — and stopped contributing to greenhouse gases each year — then forests would start growing back and sequestering carbon: pulling it back out of the air again. A current addition to our emissions would become a subtraction from them. Now that’s smart math. “Proper forest management is the only climate change mitigation technology that is: 1) available immediately; 2) capable of providing negative emissions at the necessary scale; and 3) proven to have additional benefits for the local and global climate,” write the researchers. Yes, that’s right — the world should simply stop chopping down trees immediately. Granted, while it may be theoretically possible to put the brakes on deforestation faster than we can halt fossil fuel use, it seems unlikely that the underlying economic drivers of deforestation will suddenly end. So what’s the upshot of it all? This Earth Day, it’s hard to say the planet is in great shape.
It is also hard to say that it is beyond saving, or at least, beyond beginning to repair. Rather, what happens next all depends on us.
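The decline rates quoted above follow from simple carbon-budget arithmetic: emissions that fall exponentially at rate r from a peak level E contribute about E/r in total, so the later the peak, the smaller the remaining budget and the steeper the required cut. A rough sketch, with round placeholder numbers chosen only to land near the article's figures (they are not the study's actual inputs):

```python
# Back-of-envelope sketch of the "peak later, cut steeper" arithmetic.
# Emissions declining exponentially at rate r from a peak level E
# integrate to E / r, so the decline rate needed to stay inside a fixed
# remaining budget is r = E / remaining_budget. The budget figures below
# are illustrative placeholders, not the Climate Interactive inputs.

def required_decline_rate(peak_emissions_gt, remaining_budget_gt):
    """Annual fractional decline needed to keep cumulative CO2
    emissions after the peak within the remaining budget."""
    return peak_emissions_gt / remaining_budget_gt

# Illustrative: ~40 GtCO2/yr at the peak; waiting a decade eats budget.
for peak_year, budget in [(2020, 1250.0), (2030, 870.0)]:
    r = required_decline_rate(40.0, budget)
    print(f"peak in {peak_year}: cut {100 * r:.1f}% per year afterwards")
```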


News Article | April 21, 2016
Site: news.yahoo.com

Paris (AFP) - A jump in global temperature of two degrees Celsius would double the severity of crop failures, water shortages and heatwaves in many regions compared to a rise of 1.5 C, according to a study released Thursday. An extra 0.5 C (0.9 degrees Fahrenheit) would also add 10 centimetres (4 inches) to the average ocean waterline, further imperilling dozens of small island nations and densely-populated, low-lying deltas, a team of researchers reported. In a 2 C scenario, impacts are amplified in certain climate "hot spots," said the study in Earth System Dynamics, a journal of the European Geosciences Union. In the Mediterranean basin, for example, a 2 C world would see its supply of fresh water diminish by 20 percent compared to the late 20th century -- double the loss forecast for a 1.5 C increase. "We found significant differences" between 1.5 C and 2 C projections for 11 different impact areas, said the study's lead author Carl Schleussner, a scientist at Climate Analytics in Germany. The world's first global climate pact, hammered out by 195 nations in Paris last December, aims to hold average global warming to "well below 2 C" compared to pre-Industrial Era levels. The Paris Agreement also pledges to "pursue efforts" to cap warming at 1.5 C, a hard-fought concession to a coalition of more than 100 poor and climate-vulnerable nations. More than 160 countries are set to attend a formal signing ceremony Friday at the United Nations in New York, the penultimate step before ratification of the accord. The study also looks at coral reefs, and found that warming of 1.5 C would give these threatened ecosystems a fighting chance of adapting to warmer and more acidic seas. An extra half-a-degree by century's end, however, would expose all the world's reefs -- which harbour 25 percent of the ocean's wildlife -- to possible extinction. Last week, scientists in Australia reported that 93 percent of the Great Barrier Reef is already affected by bleaching. In tropical zones, another hot spot, the loss of maize and wheat yields would be twice as severe in a 2 C world. Extreme weather events would also be amplified. "The additional 0.5 C increase marks the difference between events at the upper limit of present-day natural variability" -- an intense heat-wave, for example -- "and a new climate regime," Schleussner said in a statement. Many climate scientists have cast serious doubt on the feasibility of holding temperatures below the 2 C threshold -- never mind 1.5 C. At current rates of fossil fuel consumption, Earth is on track for an increase of 4 C or higher. In almost any future climate scenario, humanity will be confronted with the challenge of cooling the planet by removing carbon dioxide from the atmosphere, something current technology does not allow on a global scale.


Understanding the key factors influencing the global oceanic redox system is crucial to fully explaining the variations in oceanic chemical dynamics that have occurred throughout the Earth's history. In order to elucidate the mechanisms behind these variations on geological timescales, numerical sensitivity experiments were conducted with respect to the partial pressure of atmospheric molecular oxygen (pO2), the continental shelf area (Acs), and the riverine input rate of reactive phosphorus to the oceans (RP). The sensitivity experiment for atmospheric pO2 indicates that pervasive oceanic anoxia and euxinia appear when pO2 < 0.145 atm and < 0.125 atm, respectively. These critical values of pO2 are higher than a previous estimate of ~50% PAL (present atmospheric level) due to redox-dependent phosphorus cycling. The sensitivity experiment regarding the shelf area demonstrates that changes in the shelf area during the Phanerozoic significantly affected oceanic oxygenation states by changing marine biogeochemical cycling; a large continental shelf acts as an efficient buffer against oceanic eutrophication and prevents the appearance of ocean anoxia/euxinia. We also found that an enhanced RP is an important mechanism for the generation of widespread anoxia/euxinia via expansion of both the oxygen minimum zone and coastal deoxygenation, although the critical RP value depends significantly on pO2, Acs, and the redox-dependent burial efficiency of phosphorus at the sediment–water interface. Our systematic examination of the oceanic redox state under Cretaceous greenhouse climatic conditions also supports the above results.
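As a loose illustration of what such a sensitivity experiment looks like in code, here is a deliberately crude box-model sweep. This is our toy construction, not the authors' biogeochemical model; the single coefficient is tuned so the toy threshold lands near the reported ~0.145 atm.

```python
# Toy parameter sweep in the spirit of the sensitivity experiments above:
# deep-ocean oxygen is set by atmospheric supply (pO2) minus respiration
# driven by phosphorus-fueled export production (RP), while a larger
# shelf area (Acs) buries phosphorus and damps eutrophication.

def deep_ocean_o2(pO2_atm, RP_rel=1.0, Acs_rel=1.0):
    supply = pO2_atm / 0.21                  # ventilation scales with atmospheric O2
    demand = RP_rel / (0.5 + 0.5 * Acs_rel)  # shelf burial buffers the P input
    return supply - 0.69 * demand            # <= 0 means anoxic deep water

for pO2 in [0.21, 0.18, 0.15, 0.14, 0.12]:
    o2 = deep_ocean_o2(pO2)
    state = "oxic" if o2 > 0 else "anoxic"
    print(f"pO2 = {pO2:.2f} atm -> relative deep O2 {o2:+.2f} ({state})")
```

With the chosen coefficient the toy model flips from oxic to anoxic between 0.15 and 0.14 atm, mimicking the style of threshold the abstract reports; raising RP_rel or shrinking Acs_rel shifts that threshold upward, which is the qualitative behaviour described above.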


Patent
System Dynamics | Date: 2012-10-01

A method of controlling a combination vehicle for road transport of heavy goods, said vehicle comprising a motor vehicle at the front and a trailer attached so as to be towed behind the motor vehicle, said trailer comprising:


Grant
Agency: Department of Defense | Branch: Air Force | Program: SBIR | Phase: Phase II | Award Amount: 749.93K | Year: 2010

The U.S. Air Force has identified a need to develop innovative technologies that will enable miniaturization of micro air vehicles (MAV), and future micro air weapons (MAW), to allow these platforms to be sufficiently compact to accommodate diverse deployment scenarios. Smaller airframes necessarily require that airframe components be miniaturized to provide adequate volume for mission payloads. In particular, the Air Force has earmarked flight control actuation devices as critical components requiring miniaturization. The conventional approach to control surface deflection in small air vehicles has involved the use of analog or digital servos. The Phase I program demonstrated that piezo-electric actuators could replace conventional servos and provide comparable control and maneuverability during radio-controlled flight. The objective of the Phase II effort is to transition from an RC prototype aircraft to a fully autonomous, bird-size, piezo-equipped tactical MAV that exhibits payload and weight benefits relative to conventional servos, while providing the optimum control authority to precisely maneuver the air vehicle. BENEFIT: There are numerous non-military applications of the piezo-actuated control surface technology that will be logical by-products of the military applications. Camera-equipped MAVs have great potential for surveillance and monitoring tasks in areas either too remote or too dangerous to send human scouts. MAVs will enable a number of important missions, including chemical/radiation spill monitoring, forest-fire reconnaissance, visual monitoring of volcanic activity, surveys of natural disaster areas, and even inexpensive traffic and accident monitoring. Additional on-board sensors can further augment MAV mission profiles to include, for example, airborne chemical analysis. RC models may also be the most intriguing commercial application for piezo-actuated control surfaces. RC enthusiasts are always looking for the next “gadget” for their gas-powered and electric planes. Today’s electric flight packs generally include a pair of servos, an ESC, and an RC receiver and crystal. It is quite conceivable that future flight packs could include a pair of piezo-electric actuators and a small DC boost circuit, along with the ESC and receiver.
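As a rough illustration of the servo-replacement idea, the sketch below maps a standard RC receiver pulse to a drive-voltage setpoint for a piezo bender through a DC boost stage. The function name and the 0-150 V range are assumptions for illustration only, not the Phase II design.

```python
# Hypothetical sketch: an RC receiver outputs a 1000-2000 microsecond
# pulse; instead of driving a servo, that command is mapped to a
# high-voltage setpoint for a piezo bender via a small DC boost stage.

PIEZO_V_MIN, PIEZO_V_MAX = 0.0, 150.0   # typical bender drive range (assumed)

def pulse_to_piezo_volts(pulse_us):
    """Map a standard RC pulse (1000-2000 us, 1500 us = neutral)
    to a boost-converter voltage setpoint for the piezo actuator."""
    pulse_us = max(1000, min(2000, pulse_us))   # clamp out-of-range pulses
    fraction = (pulse_us - 1000) / 1000.0       # 0.0 .. 1.0 across the stick range
    return PIEZO_V_MIN + fraction * (PIEZO_V_MAX - PIEZO_V_MIN)

for pulse in (1000, 1500, 2000):
    print(f"{pulse} us -> {pulse_to_piezo_volts(pulse):.0f} V")
```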


Grant
Agency: Department of Defense | Branch: Army | Program: STTR | Phase: Phase I | Award Amount: 99.96K | Year: 2010

This Phase I STTR effort develops and tests spectropolarimetric surface characterization algorithms for LADAR-based remote sensing. A systematic approach is used which first defines the operational scenarios for which the algorithms are to work. These scenarios are then used to define requirements for LADAR hardware and algorithms, focusing the algorithm development effort. A library of surface material Mueller matrix measurements is used as the basis for a fundamental surface characterization investigation that will establish the ultimate potential to discriminate between different materials and classes of materials. The library consists of existing measurement data from government and industry sources plus measurements made during Phase I to fill high-priority gaps in the data. A preliminary design of improved instrumentation (to be built in Phase II) for measuring full Mueller matrix BRDFs will be made in order to address weaknesses in previous measurement data. Finally, a suite of algorithms will be developed to address the high-priority scenarios identified at the beginning of the effort, and these algorithms will be tested on synthetic LADAR images created to represent these scenarios. Testing will be performed on SDI’s existing LEAP ATR application.
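For readers unfamiliar with the formalism, a brief sketch of why Mueller matrices enable material discrimination: a surface's Mueller matrix maps an incident Stokes vector to the scattered one, and different materials leave different polarization signatures on the return. The matrices below are invented placeholders, not entries from the measurement library, and the code is our illustration rather than SDI's algorithms.

```python
# Illustrative Mueller-matrix discrimination sketch: apply each surface's
# (made-up) Mueller matrix to a polarized LADAR beam and compare the
# degree of polarization of the returns.
import numpy as np

# Hypothetical, normalized Mueller matrices for two surface classes.
PAINT = np.array([[1.00, 0.05, 0.00, 0.00],
                  [0.05, 0.60, 0.00, 0.00],
                  [0.00, 0.00, 0.55, 0.00],
                  [0.00, 0.00, 0.00, 0.40]])
METAL = np.array([[1.00, 0.02, 0.00, 0.00],
                  [0.02, 0.95, 0.00, 0.00],
                  [0.00, 0.00, 0.93, 0.00],
                  [0.00, 0.00, 0.00, -0.90]])

def degree_of_polarization(stokes):
    s0, s1, s2, s3 = stokes
    return np.sqrt(s1**2 + s2**2 + s3**2) / s0

incident = np.array([1.0, 1.0, 0.0, 0.0])   # horizontally polarized beam
for name, mueller in (("paint", PAINT), ("metal", METAL)):
    scattered = mueller @ incident           # Stokes vector of the return
    print(f"{name}: DoP of return = {degree_of_polarization(scattered):.2f}")
```

The depolarizing "paint" returns a much lower degree of polarization than the "metal", which is the kind of statistically separable signal the algorithm suite would exploit.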


News Article | October 7, 2016
Site: www.techtimes.com

The human race has been responsible for global warming since the Industrial Revolution, and recent studies suggest the situation is more dire than ever. The levels of carbon emissions and greenhouse gases are endangering the future of our species and could drive climate change severe enough to bring the world as we know it to catastrophe. A new study published in the open-access journal Earth System Dynamics places this issue in the hands of young people, whose primary objective should be turning the Earth into an environmentally friendly place. "Global temperature has just reached a level similar to the mean level in the prior interglacial (Eemian) period, when sea level was several meters higher than today, and, if it long remains at this level, slow amplifying feedback will lead to greater climate change and consequences," explains the study. Human-caused greenhouse gases have increased more than 20 percent over the past decade, mostly because of CH4 (methane), which makes it almost impossible to limit the consequences. According to the study, we have come to a point where not only do we have to reduce the emissions we're currently releasing into the air, but we also have to focus on "negative emissions." The concept involves extracting CO2 (carbon dioxide) from the atmosphere. Improving industrial and agricultural practices would also be a necessary step toward the goal of reducing global warming, along with reforestation. The study brought together 12 authors led by James Hansen, a former chief of NASA's Goddard Institute for Space Studies. The paper underlines the seriousness of the status quo, as current annual temperatures exceed those of the 1880s-1920s by more than 1.25 degrees Celsius. "On the other hand, if large fossil fuel emissions are allowed to continue, the scale and cost of industrial CO2 extraction, occurring in conjunction with a deteriorating climate with growing economic effects, may become unmanageable," the authors also explained. There have been various proposals from worldwide organizations such as the United Nations aimed at reducing global warming. One of these is replacing current carbon taxes with a cap-and-trade system, in which a regulatory body sets a limit on pollutant emissions from polluters such as power plants. The total emissions deemed acceptable under the cap are split into individual permits, which carry financial value and can be traded. However, as the technology necessary to limit the levels of pollution is not affordable for all UN member states, the motion did not pass the General Assembly.
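For concreteness, here is a minimal sketch of the cap-and-trade mechanics described above, with made-up emitters, abatement costs and permit price:

```python
# Minimal cap-and-trade sketch (hypothetical numbers): a regulator fixes
# a total cap, splits it into permits, and emitters whose abatement is
# expensive buy permits from those who can cut cheaply.

CAP_TONNES = 100.0  # total emissions the regulator will allow

# (current emissions in tonnes, cost to abate one tonne) -- made up
emitters = {"coal_plant": (70.0, 50.0), "gas_plant": (50.0, 20.0)}

# Permits allocated in proportion to current emissions.
total_now = sum(e for e, _ in emitters.values())
permits = {k: CAP_TONNES * e / total_now for k, (e, _) in emitters.items()}

# A trade is worthwhile at any permit price between the two abatement
# costs: the cheap abater cuts extra tonnes and sells the freed permits.
price = 35.0  # between 20 and 50
for name, (emissions, abate_cost) in emitters.items():
    shortfall = emissions - permits[name]
    action = "buys permits" if abate_cost > price else "abates and sells"
    print(f"{name}: needs {shortfall:.1f} t of cuts -> {action} at ${price}/t")
```

The cap fixes the environmental outcome while trading lets the market find the cheapest way to meet it, which is the mechanism's core argument.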


News Article | February 15, 2017
Site: www.24-7pressrelease.com

CARY, NC, February 15, 2017-- Mindy Schrager has been included in the Platinum Anniversary Edition of Who's Who in America. While inclusion in Who's Who in America is an honor, only a select few in each professional field are chosen for this distinction. As in all Marquis Who's Who biographical volumes, individuals profiled are selected on the basis of current reference value. Factors such as position, noteworthy accomplishments, visibility, and prominence in a field are all taken into account during the selection process. Recognized for three and a half decades of invaluable contributions in the fields of quality, process, programs, change, operations management, teamwork and more, Ms. Schrager parlays her knowledge of change and transformation into Systems of Change, LLC, a business she founded in 2015. After earning a Bachelor of Arts degree from Dickinson College and an MBA from Babson College, including studies in France and Switzerland, she began her career at Nolan Norton and Company, first as an Analyst, then as a Consultant in Information Technology Management. Subsequently, she moved to Logos Corp, a machine translation start-up, where she managed a variety of roles ranging from sales support and customer satisfaction measurement to human resources. During her next eight-year adventure at Motorola ISG, she created the division's customer satisfaction and non-product quality programs as well as a call center for customers' non-technical issues. Next, Ms. Schrager moved into financial services at Fidelity Investments, first as Director of Boston Phone Site Quality, then Director of Bill Payment as it migrated to the web and a third-party provider. In early 2000, she took on senior roles in operations, process, and program management at Ardent/Ascential Software, acquired by IBM in 2005, working there until 2015. Complementing her degrees and corporate experience, Ms. Schrager studied transformational approaches and earned an Associate Certified Coach credential from the International Coach Federation and an Integrative Coach certification from JFK University and The Ford Institute of Transformational Studies. She is also a certified Cultural Transformation Tools consultant with Barrett Values Center, a Systemic Family and Organizational Dynamic Practitioner through The Institute of Integrative System Dynamics, and a Voice Dialogue Practitioner through Voice Dialogue Connection. She is a member of the National Association of Female Executives, International Coach Federation, Triangle OD Network, American Society for Quality, and the Association for Talent Development. During her years in the quality field, Ms. Schrager co-authored papers for the ASQ (American Society for Quality) National Conference, in addition to starting the Non-Product Quality Committee of ASQ's Boston Chapter and the Boston Chapter of the Association for Quality and Participation. She recently authored a book and created a blog series. Additional Who's Who honors include Who's Who in Finance and Industry, Who's Who of American Women, Who's Who of Emerging Leaders, Who's Who in the World since the 1980s, Who's Who in America, Who's Who in the East since the 1990s, and more recently Who's Who in the South and Southeast since 2010. About Marquis Who's Who: Since 1899, when A. N. Marquis printed the first edition of Who's Who in America, Marquis Who's Who has chronicled the lives of the most accomplished individuals and innovators from every significant field of endeavor, including politics, business, medicine, law, education, art, religion and entertainment.
Today, Who's Who in America remains an essential biographical source for thousands of researchers, journalists, librarians and executive search firms around the world. Marquis now publishes many Who's Who titles, including Who's Who in America, Who's Who in the World, Who's Who in American Law, Who's Who in Medicine and Healthcare, Who's Who in Science and Engineering, and Who's Who in Asia. Marquis publications may be visited at the official Marquis Who's Who website at www.marquiswhoswho.com


News Article | February 17, 2017
Site: www.24-7pressrelease.com

LEOPOLDSHOEHE, GERMANY, February 17, 2017 /24-7PressRelease/ -- This is the final version of Predicted Desire, a simple and easy product calculation client. From v2.0 onward, Predicted Desire will be called Startup Product Manager. We will keep this version and continue to host PD through 2017. How it works: Predicted Desire calculates your product's various costs, revenue and net profit over 48 months. As the first sample of the Perfect Desire platform, it can calculate essentially any formula system over time, and it includes a solver algorithm capable of handling simple differential equations. PD calculates all targets and predicts and graphically displays a company's development over the next 48 months in the reference model. Based on the set of input parameters and formulas, the user interface is built automatically by parsing and interpreting the simulation model's equation system. Social media followers leading the path of development: Accompanying the release, which went through a beta phase with a few hundred freelance testers, two social media competitions have been started. At twitter.com/dynamic_idea, a public voting competition allows everyone to suggest, favour and retweet new simulation ideas. Each month, Dynamic Applications works on the top-voted idea, which will then be available to the social media crowd for free for at least one month after completion. Roadmap: In parallel to @dynamic_idea, there is a roadmap, bug and feature competition at @dynamic_qs, so users can also prioritize our product roadmap. We call it Customer Driven Development. Since our roadmap is community-defined, we try to give every person in the world a fair chance to participate, provided they can get access to a free Twitter account. This way, we are Sharing Economy. It's an experiment in swarm intelligence, as we define online democracy. Is it possible? We say yes. About Dynamic Applications: Dynamic Applications was founded on January 1, 2016. The founder and first software developer of Dynamic Applications is Martin Bernhardt. He has not only researched the publicly available information about the System Dynamics approach, but has also worked with two companies in the strategic asset management market. Dynamic Applications has no flyers, sales consultants, traditional marketing, or budget. Our strategy is called growth hacking: we spend as much time as possible on research and development of the software. Following an agile development approach, we're working our way up from the very bottom. We try to make what we started from better, every day. We just hope there are people out there who'll find our products useful. Dynamic Applications is a community approach. We pay with a tweet, and define the next big thing to publish. We are Dynamic Applications. We empower people. We are Sharing Economy. Follow us to gain.
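As a plausible reconstruction of what such a "formula system over time" engine does internally (this is our sketch, not the actual Predicted Desire code; all parameter names are invented for illustration), the following evaluates a small set of interdependent formulas month by month for 48 months and reports the resulting cash position:

```python
# Sketch of a formula-system engine: evaluate interdependent formulas
# once per time step, then advance the state with a simple
# difference-equation (explicit Euler) update.

MONTHS = 48

# A toy startup model with invented parameters.
state = {"customers": 10.0, "cash": 5000.0}
params = {"growth_per_month": 0.10, "price": 29.0, "fixed_cost": 800.0}

for month in range(1, MONTHS + 1):
    # formula system, evaluated at this time step
    revenue = state["customers"] * params["price"]
    cost = params["fixed_cost"]
    profit = revenue - cost
    # state update: one Euler step per month
    state["customers"] *= 1.0 + params["growth_per_month"]
    state["cash"] += profit

print(f"after {MONTHS} months: {state['customers']:.0f} customers, "
      f"cash {state['cash']:.0f}")
```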
