News Article | December 6, 2016
Veterinary biologics are products derived from living organisms and biological processes that act through an immunological mechanism. They are used for the prevention, diagnosis or treatment of diseases in animals, including domestic livestock, poultry, pets, wildlife and fish. The category covers animal health products such as vaccines (including bacterins and antisera), antibody products and in vitro diagnostic test kits, all of which must be shown to be pure, safe, potent and effective. Veterinary vaccines play an important role in the veterinary biologics market by supporting animal and public health, decreasing animal suffering, improving the efficiency of food-animal production and reducing the need for antibiotics in companion animals. For instance, rabies vaccines administered to domestic animals and wildlife have effectively reduced human rabies in developed countries. These factors are expected to drive the veterinary biologics market over the forecast period. Moreover, manufacturers are entering co-marketing tie-ups with pharmaceutical companies to expand their geographical footprint, which is expected to further drive the global market. Demand is also fuelled by the rising need for primary and secondary food items from livestock animals, which creates sustainable demand for livestock vaccines. In addition, increasing demand for food security across the globe, coupled with the rising impact of host-pathogen interactions and limited vaccine stockpiles in major nations, is expected to support robust market growth during the forecast period.
According to the Food and Agriculture Organization (FAO) of the United Nations, food production will need to increase by around 70% between 2007 and 2050 to feed the growing global population. In 2014 the global population stood at 7.3 Bn and is projected to reach 9.1 Bn by 2050, an average annual growth rate of roughly 0.6%. Consumption of meat and associated products is estimated to rise from 218 Mn tonnes in 1997-1999 to 376 Mn tonnes by 2030. These factors are expected to drive growth of the global veterinary biologics market over the forecast period. Government initiatives for animal vaccination programmes to manage disease transfer from animals to humans are also expected to create demand for veterinary biologics. However, limited knowledge of certain virulent veterinary diseases is expected to hamper the discovery and production of preventive vaccines, thereby impeding revenue growth of the veterinary market over the forecast period. The global veterinary biologics market is segmented by product type, species, disease type, end user and region. The market is largely penetrated by a few global players and is projected to witness a robust CAGR over the forecast period, as major players focus on introducing effective vaccines against parasitic infestation, a prevalent type of disease affecting livestock. Companies are also actively seeking supportive government regulation for the development of effective veterinary vaccines, which is expected to intensify competition over the forecast period. Moreover, the increasing prevalence of viral diseases in animals is expected to fuel demand for live attenuated vaccines, which are the first choice in viral disease and offer a competitive advantage over substitute products available in the market.
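As a sanity check on projection figures like these, the implied compound annual growth rate can be computed directly. The short sketch below is illustrative only; the `cagr` helper is my own, not taken from the FAO report:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start value, an end value and a span in years."""
    return (end / start) ** (1 / years) - 1

# Population: 7.3 Bn in 2014 to a projected 9.1 Bn in 2050 (36 years)
pop_rate = cagr(7.3, 9.1, 2050 - 2014)
print(f"population: {pop_rate:.2%} per year")  # about 0.6% per year

# Meat consumption: 218 Mn tonnes (1997-1999, midpoint 1998) to 376 Mn tonnes by 2030
meat_rate = cagr(218, 376, 2030 - 1998)
print(f"meat consumption: {meat_rate:.2%} per year")
```

Running the numbers this way shows the population projection corresponds to well under 1% growth per year, while meat consumption grows at a faster implied rate of under 2% per year.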
Geographically, the global veterinary biologics market is classified into North America, Latin America, Western Europe, Eastern Europe, Asia-Pacific excluding Japan (APEJ), Japan, and the Middle East and Africa (MEA). North America dominates the global market, followed by Europe. North America and Western Europe are expected to witness robust growth owing to increased adoption of veterinary biologics, a widely dispersed livestock population and rapidly growing demand for vaccines. APEJ is expected to grow significantly faster than other regions owing to increased disease awareness and a growing focus on animal welfare. Latin America and MEA are expected to witness sluggish growth, as a limited understanding of epidemiological disease patterns in livestock hampers proper vaccine distribution in these regions. However, increasing foodborne and zoonotic disease may boost growth of the veterinary biologics market in these regions during the forecast period. Some of the players operating in the market are Zoetis, Elanco, Merial, Merck & Co., Inc., Bayer Pharma AG, Boehringer Ingelheim GmbH, Ceva, Virbac and others. These companies are actively entering strategic agreements and collaborations with other industry players, universities and farming institutes to improve their product lines and increase visibility through strategic product placement.
News Article | October 6, 2016
Yet this corner of the country is where the 75-year-old Huber hopes the South American grain quinoa will take root. Last month, Huber harvested quinoa commercially for the first time on about 30 acres, making him the latest addition to a small number of U.S. farmers trying to capitalize on American eaters' growing demand for the Andean grain. "It's a beautiful crop," Huber said as he surveyed his combine grinding the plants and spitting out the seeds. He chose a variety called Redhead, which turned his field lipstick red for a couple of weeks before harvest. "We're still learning. I kind of stepped off the end of the dock here with a bit of a bite this year." Americans consume more than half the global production of quinoa, which totaled 37,000 tons in 2012. Twenty years earlier, production was merely 600 tons, according to the United Nations' Food and Agriculture Organization. Yet quinoa fields are so rare in American farming that the total acreage doesn't show on an agricultural census, said Julianne Kellogg, a Washington State University graduate student monitoring quinoa test plots around the Olympic Mountains, including one next to Huber's field. A rough estimate puts the country's quinoa fields at 3,000 to 5,000 acres. Quinoa's nutritional punch has pushed the grain beyond health food stores and into general consumption, propped up by celebrities like Oprah Winfrey. It has all the amino acids humans need, making it a complete protein, Kellogg said. That's hard to find in grain crops, she said. It's also gluten-free. The grain's future is marked with possibilities, including milk, beer, cereals, hair products, snacks—products well beyond the salad bar. "I think we're witnessing the start of a staple," said Sergio Nuñez de Arco, a Bolivia native whose company, Andean Naturals, has been instrumental in bringing quinoa north, distributing to Costco, Trader Joe's and others. The spike in demand from the U.S.
and Europe led big farm operations in Peru to enter quinoa farming a few years ago. That resulted in an oversupply, and prices have been falling. According to a July report from the U.S. Department of Agriculture's Foreign Agricultural Service, quinoa prices plummeted about 40 percent between September 2014 and August 2015. "Farmers are rotating out of quinoa," Nuñez de Arco said. "They went back to the city to look for work. It was good while it lasted, so it's back to rural migration." Nuñez de Arco has opened a California processing plant for the bitter coating that covers the quinoa grains. It wasn't welcome news for his Bolivian farmers. "There needs to be some improvement to practices and they're gonna get that through some healthy competition," said Nuñez de Arco, now based in San Francisco. "My push has been to protect the smaller farmer in a top-shelf niche, where they will have the demand." In Washington state, Huber's quinoa will head to Lundberg Family Farms, a California-based company that has been a leader in domestic quinoa production. This year, Lundberg and its network of contracted farmers along the West Coast hope to harvest 2 million pounds of quinoa. "It's great to have product available where folks are consuming it," said Tim Schultz, vice president of research and development at Lundberg. "You have less food miles on it." For more quinoa to grow in the United States, farmers and researchers must find the right mix of varieties and environments. The Washington State University plots are testing varieties for heat resistance and late-summer sprouting, among other benchmarks. Next year, they'll test plots in Maryland and Minnesota. "From a farmer's perspective, it's more options for rotations," said Kevin Murphy, an assistant professor at the university. That's an option that attracted Huber. Quinoa represents his first commodity crop. 
On a harvest day, he eyeballed a lower yield than he wanted, in part because the elk that roam the nearby woods frolicked in the quinoa fields. "I hope I break even," he said with a laugh. "If we break even or make a little bit of money, that'll be good because I learned quite a few things here."
News Article | September 7, 2016
The effectiveness of antibiotics has been waning since they were introduced into modern medicine more than 70 years ago. Today, our inability to treat infections ranks alongside climate change as a global threat1, 2. New classes of antimicrobial drugs are unlikely to become widely available any time soon1; if and when they do, bacteria, viruses and other microbes will again evolve resistance3. In any case, waging war on microbes is not tenable3 — our bodies and planet depend on them4 (see Supplementary Information). Addressing resistance requires global collective action. Like the ozone layer, a stable climate or biodiversity, the global population of susceptible microbes is a common pool resource — one shared by all. But no individual or country has a strong enough incentive to conserve this 'commons'. It has been depleted by the massive use of antimicrobial compounds and the growing competitive advantage of resistant microbes. It is a classic 'tragedy of the commons'. Our intimate relationship with microorganisms predates modern humans. It is the result of many millions of years of co-evolution. Our bodies need particular kinds of microbes for digestion, immune function and general health. Equally, microbes support planetary health, for example, through nutrient cycles, including those that maintain soil and water quality4. In other words, microbes sustain human civilization. Yet our understanding of the complex interactions and uncertainties that govern the relationships between humans and microbes is limited. The 2015 Global Action Plan on Antimicrobial Resistance, drafted by the World Health Organization (WHO) with support from the United Nations Food and Agriculture Organization (FAO) and the World Organisation for Animal Health (OIE), recognizes the need for multisectoral cooperation to address resistance (see go.nature.com/2bbijap). But, in our view, it does not go far enough in recognizing the life support we receive from the global microbiome.
Tackling resistance urgently requires the scaling back of the massive overuse of antibiotics to secure the liveability of Earth in the long term. On 21 September, heads of state will meet to take further action at the United Nations high-level meeting on antimicrobial resistance in New York City. A UN declaration currently under discussion must set global targets, accelerate implementation of the global action plan, plug its gaps and ensure stronger accountability and interagency coordination. It must emphasize the many benefits of microbes. Parties should aim to build the resilience of society and the microbiome. In our opinion, this is the way to maintain low levels of resistance amid the many surprises of a rapidly changing planet. Advances from studying resilience in other common pool resources such as fisheries and forests5 suggest key steps for antimicrobial resistance, which we set out below. Achieving these will require changes to institutions, regulations, education, community norms and expectations, notably in medicine and agriculture. Until now, political and financial investments have focused largely on creating incentives to fuel drug innovation and new or faster diagnostics. Currently, such technological fixes appeal to and benefit mainly rich nations in the 'global north'. Incentives must be targeted not only to benefit large pharmaceutical companies in the north, but also to enlist research and development efforts globally. One of the most important outcomes of the UN meeting should be national commitments to the broadest and most creative participatory education campaigns about resistance2 and the importance of the microbial world. Why? Because the level of ignorance about the calamity that is antimicrobial resistance is staggering. A 2015 WHO survey across 12 countries found that 64% of the public think that antibiotics also work for viral infections such as influenza and colds (see go.nature.com/2c7zvfu).
Such basic knowledge gaps lead patients and physicians to reach for antibiotics without appreciating the costs. Instead, institutions and citizens must understand the central facts, context and risks in a way that allows them to learn more independently. This goal requires awareness campaigns to be revised and scaled up by orders of magnitude2, as well as investment in new communication tools. Initiated in 2007, Thailand's Antibiotics Smart Use project sets a direction for upscaling. It enables patients in pharmacies to self-diagnose on the basis of the appearance of their sore throat to verify whether they need antibiotic treatment6. For further learning, citizen-science programmes in which participants monitor their own microbiomes should be extended to cover, for example, self-testing for resistance in various parts of the body7. Such campaigns could engage communities and change norms about how and when to use antibiotics. Campaigns will need to be coordinated internationally for quality and impact, and adapted to suit regional perspectives. Engagement can be spread through schools, mass media and social media. Resistance affects animal and environmental health as well as human health, and so requires coordinated action across economic sectors. No single concern exemplifies this better than the high rate of antibiotic use in agriculture (largely as growth promoters or disease prevention). In the United States, 70–80% of all antimicrobials consumed are given to livestock; agricultural use in the BRICS emerging economies (Brazil, Russia, India, China and South Africa) is expected to double by 2030, as compared to 2010 levels8 (see 'Farm forecast'). As a result, antibiotics and resistance genes enter the food chain, soil and the water table, threatening human health. The European Union has phased out the use of medically important antibiotics for growth promotion in agriculture. Other countries, including Mexico and Taiwan9, have sought to reduce it. 
In the United States, a directive discourages the use of antibiotics for growth promotion through voluntary measures and stronger veterinary oversight of therapeutic use. However, the powerful industrial farming lobby and a lack of perceived urgency have so far stalled stronger mandates. Stronger political action to change how we use antibiotics, whether by humans or animals, requires citizens to be better informed. For instance, the public should have online access to surveillance that tracks how human resistance increases in settlements near farms. In the meantime, consumer groups play a crucial part by calling on retail chains to switch where their meat is sourced. For example, US food chains Chipotle, McDonald's and Chick-fil-A have responded (to varying degrees) to public demands with stricter limits on antibiotic use in the meat they sell. A particularly worrying issue that is not confined to the use of antimicrobials in food production is the international spread of resistance genes, especially those conferring resistance to many drugs of 'last resort'. Most recently, a mobile plasmid gene carrying resistance to the last-resort antibiotic colistin has been found in Asia, Europe and North America. Clearly, countries cannot act alone to deal with the problem without jeopardizing the benefits of globalization. Much better surveillance and containment is needed of the most dangerous multiresistant strains in people and food2. A global routine-surveillance initiative could help to prevent the spread of resistance. It could screen medical tourists or patients returning from hospitals abroad to identify carriers of multiple resistant strains. Hospitals that are centres of international travel for medical treatment must lead the way; funding and learning mechanisms must be increased for other hospitals to follow suit. 
The International Health Regulations, revised by WHO member states in 2005, are a legally binding instrument that aims to provide global surveillance and response. Properly financed, they could be effective10. Yet the resources needed to respond to emerging diseases do not flow commensurately to low- and middle-income countries as they do in the global north — a key lesson of the recent Ebola outbreak. All governments have a collective responsibility to improve capacities for rapid response to resistance. Greater support by donor countries to new and existing funding mechanisms such as the Global Fund to Fight AIDS, Tuberculosis and Malaria is needed in low- and middle-income countries. International and national coalitions must be broadened. The global action plan strengthens the established collaboration between the WHO, FAO and OIE. This should be extended to cover other relevant sectors, including trade, development and environment. The model set up by UNAIDS (the Joint United Nations Programme on HIV/AIDS) in 1996 serves as an example of how to intensify collaboration, leverage resources, involve more parties and reduce barriers. The UN meeting must commit to driving learning between institutions. Global platforms are needed for sharing best practices and the latest data about resistance levels and antibiotic consumption, for instance, among national agencies. Such exchange happens in Europe for resistant human bloodstream infections, and human and veterinary antimicrobial consumption. This must be scaled up to monitor resistance in communities, food industry and the environment. A relevant model for exchange at the global level is the WHO's Pandemic Influenza Preparedness Framework. To engage the public effectively, more-frequent updating, vivid visualizations and engaging communications are needed. As in the Paris climate agreement, countries should submit to the UN voluntary but monitored targets on limiting resistance. 
Parties may go further by making shortfalls subject to potential sanctions. A key priority is to establish measurable indicators at the country level, such as the median yearly consumption of antibiotics per person. As for the climate issue, non-state actors from business to civil society can be central to societal transformations. Such stakeholders were consulted during the development of the WHO global action plan. But their participation in the long run must become more integral to the global coalition responsible for tackling resistance. Available governance instruments range from binding treaties to guidelines, with each approach having pros and cons. A first step to holding companies accountable would be an international code on the promotion of antibiotics (promotional spending in the United States in 1998 amounted to US$1.6 billion), akin to that adopted by the WHO in 1981 on the marketing of breast-milk substitutes. The complexity and gravity of resistance call for the immediate mass mobilization of society. Maintaining the susceptibility of microbes to drugs for global health is a matter of sustainable development. Improving understanding about humankind's dependence on the global microbiome should lead to action on many other important issues involving microorganisms. These issues include infectious diseases, food security, natural resources and environmental conservation. Action here could, in turn, lead to more-equitable forms of national progress across the sustainable-development goals3. Building global resilience to resistance is a long game. But changes can be surprisingly fast when the time is ripe and a plan is ready. This month's UN high-level meeting is a rare opportunity for global collective action on human interactions with microbes. It must protect both the lifesaving power of antibiotics and the ability to use them when necessary.
News Article | October 27, 2015
"Why is potassium bromate, a possible human carcinogen, still available in U.S. baked goods, despite being banned around the world?" "If you buy bread in Toronto, Paris, or Rio de Janeiro, it cannot by law contain the chemical potassium bromate. Yet if you eat baked goods in the U.S., you may be eating this substance unknowingly. Potassium bromate is used to whiten and strengthen dough, to reduce mixing time and enhance rising—but it was also classified by the World Health Organization (WHO) as a possible human carcinogen in 1999 after it was found to cause kidney and thyroid tumors in lab rats. Declared unsuitable for use in flour by both the WHO and United Nation’s Food and Agricultural Organization in the early 1990s, potassium bromate—which can also cause non-carcinogenic adverse effects on the human kidney—is now banned in the European Union, China, and other countries around the world. It has also been listed as a carcinogen under California’s Proposition 65 since 1990. But the U.S. Food & Drug Administration (FDA) continues to allow its use in flour. And according to a recent report from the Environmental Working Group (EWG), baked goods containing potassium bromate in the U.S. are widely available despite many companies’ shift away from the additive." Elizabeth Grossman reports for Civil Eats October 26, 2015.
News Article | February 17, 2017
Bird flu has reared its ugly head in China, raising fears of a worldwide epidemic after the H7N9 strain of avian flu virus was identified as responsible for the outbreak. Infection rates on poultry farms in the country are believed to be higher than previously thought, as the H7N9 strain has killed more than 100 people this winter. Animal health experts say this strain is especially dangerous because it is difficult to detect in geese and chickens: infected poultry show negligible or no symptoms. This means the infection can be detected only if health officials or the farmers themselves conduct spot tests on their flocks. "There are very few, if any, clinical signs when this (H7N9) virus infects birds, and that's the main reason we're not seeing reporting coming from poultry farms in China," said Matthew Stone, Deputy Director General for International Standards and Science at the World Organisation for Animal Health. While poultry may not be visibly affected by the virus, it is a different story in humans, where experts say the H7N9 strain can be deadly. In December 2016, South Korean farms were hit by the H5N6 strain of avian flu, prompting the culling of nearly 26 million birds in the country, although that strain caused no human deaths. The H7N9 strain, by contrast, has killed more than 100 people in China, testimony to its detrimental effects. China saw a massive outbreak of the H7N9 virus in January; reported cases were around four times higher than in the 2015 outbreak, and around 79 people died this year. The main reason for the high death toll is the lack of early detection of the virus strain on poultry farms.
According to reports by the United Nations' Food and Agriculture Organization (FAO), China has the world's largest population of ducks, chickens and geese; around 11 billion birds were slaughtered there in 2014. Per the Chinese Center for Disease Control and Prevention, people infected with the H7N9 virus were found to have had direct contact with poultry, including at live markets. In light of the latest outbreak, on Thursday, Feb. 16, the Chinese government promised to exercise tighter control over poultry farms and the transport of chickens and geese to help battle the epidemic, which poses a grave threat to human life. © 2017 Tech Times, All rights reserved.
News Article | December 2, 2015
It is clear that soil biodiversity represents an underutilized resource for sustaining or improving human health through better soil management. As indicated above, some agroecological management options are known to maintain and increase soil biodiversity for human, animal and plant health. However, further development of viable practices and especially the promotion of their use as broadly as possible is urgently needed. How to best manage the world’s lands for improved human health? Some basic guidelines for management of soil biodiversity are offered here. We suggest that a new approach for land use and management is required that acknowledges that soil biota act in concert to provide multiple benefits, even if these benefits are not easily observed. Moreover, increased soil foodweb complexity promotes resistance and resilience to perturbation and may buffer the impacts of extreme events. Agroecological practices that enhance soil organic matter content and soil biodiversity can promote nutrient supply, water infiltration and well-structured soil. Effective management options for cropping systems include reduced tillage with residue retention and rotation, cover crop inclusion, integrated pest management, and integrated soil fertility management (such as the combination of chemical and organic fertilizer). Expanding plant species diversity in crop and/or land rotations and adding organic amendments to pastures can increase soil biodiversity and mimic better the natural soil foodweb65, 66, 86. Additionally, maintenance of soil biodiversity at the landscape level can be enhanced through buffer strips and riparian zones and land rotations. Drainage water management can reduce the movement of pollutants, agrochemicals and other contaminants to nearby landscapes13. Likewise, several forestry practices exist that promote soil biodiversity: re-established mixed deciduous forest stands in Europe were shown to have higher soil biodiversity than pure coniferous stands87. 
Management for conservation of land should include soil biodiversity as an important criterion in determining protected and wilderness areas, particularly in rapidly changing ecosystems, such as tropical forests, permafrost soils and alpine grasslands. Conservation of soil biodiversity should, in general terms, be based on existing knowledge of soil properties, the abundance, sizes and types of soil organisms, and vegetation. Nevertheless, conserving soil biodiversity could also be done through laboratory isolation of individual organisms or whole communities to maintain a reservoir of genetic and functional diversity appropriate for future disease prevention, biological technologies, and pharmaceuticals88. Soil archives that conserve live collections of interacting species of soil microbes and invertebrates in soil samples from different biomes are irreplaceable and essential; yet at present there are few such archives88. Given the growing global demands placed on limited productive land and the projected increases in infectious diseases, there is an urgent need to implement these and other conservation measures as a stockpile for the future. Ideally, the practices and conservation strategies outlined above that enhance soil biodiversity for the maintenance of human health should be incorporated directly into land-, air- and water-use policies at global and regional levels and integrated with public health organizations such as the United Nations (UN) World Health Organization. Global conventions such as the UN Framework Convention on Climate Change, the UN Convention on Biological Diversity (CBD) and the UN Convention to Combat Desertification are all central to soils and global land use but often neglect soil biodiversity and our dependence on soil for human health, with the exception of the CBD14 through the Food and Agriculture Organization (FAO).
Through the Global Soil Partnership, the UN FAO brings together global institutions and other interested parties to coordinate agreements and international challenges related to soil sustainability. The Global Soil Partnership is advised on global soil issues by a scientific Intergovernmental Technical Panel on Soils. Likewise, progress towards the UN Sustainable Development Goals can be achieved by incorporating knowledge of soil biodiversity into a broader spectrum of benefits that improve human health (see Box 2; ref. 89). Importantly, the Global Soil Biodiversity Initiative was established as an independent scientific effort to provide information on soil biodiversity to policymakers and is preparing to publish the first Global Soil Biodiversity Atlas in collaboration with the European Union Joint Research Centre. The Global Soil Biodiversity Initiative (https://globalsoilbiodiversity.org) is also working to have soil biodiversity considered in current international initiatives such as the Intergovernmental Platform on Biodiversity and Ecosystem Services and Future Earth. Fortunately, there is increased recognition that developing effective management tools for soil biodiversity requires active information transfer between scientists and policymakers with new policies formed on current evidence-based knowledge and local cultural knowledge3, 4. However, we need to identify implementation mechanisms to encourage easier updates on best management practices and related policies to ensure long-term sustainable use of global lands under a changing global environment. This is particularly crucial given the rapid accumulation of new insights on how soil biodiversity can be managed to promote human health. We are losing soils and soil biodiversity at a rapid pace, with substantial negative ramifications on human health worldwide. 
It is time to recognize and manage soil biodiversity as an underutilized resource for achieving long-term sustainability goals related to global human health, not only for improving soils, food security, disease control, water and air quality, but because biodiversity in soils is connected to all life and provides a broader, fundamental ecological foundation for working with other disciplines to improve human health.
Kathiresan R., Annamalai University |
Gualbert G., Food and Agriculture Organization
Weed Biology and Management | Year: 2016
Invasive weeds degrade ecosystems and are a threat to plant and animal biodiversity. The literature on biological invasions suggests that only 10% of introduced species become invasive in a new host range. Most introduced plants do not become invasive in a new environment. The invasive behavior of a weed depends on the weed's genetic variability, biotic factors, and climatic factors with which it interacts. The climatic factors that affect the invasive traits of weeds include the atmospheric temperature, soil temperature, precipitation, evaporation, and CO2 concentration. The biological traits that are influenced by a change in any one or more of these climatic factors include the pattern of assimilate partitioning, induction of dormancy or seed germination, herbivore tolerance, propagule production and distribution, variability of plant architecture, photosynthetic rate, and seedbank longevity. The impact of climate change on the invasive traits of certain weed species is reviewed. © 2016 Weed Science Society of Japan
News Article | February 28, 2017
South Sudan is suffering the world's first famine in six years, after Somalia in 2011, where an estimated 260,000 people died. Nairobi (AFP) - From ancient Rome to modern times, mankind has suffered devastating periods of hunger caused by drought, war or misguided politics. Last week South Sudan was declared the site of the world's first famine in six years, affecting about 100,000 people. Here is an exploration of a term that evokes the very worst of human suffering. "Famine is not a word that we use lightly," said Erminio Sacco, a food security expert with the Food and Agricultural Organization (FAO). Since 2007 the term has been applied according to a scientific system agreed upon by global agencies, known as the Integrated Food Security Phase Classification (IPC) scale. According to the IPC scale, famine exists when at least 20 percent of the population in a specific area has extremely limited access to basic food; acute malnutrition exceeds 30 percent; and the death rate exceeds two per 10,000 people per day for the entire population. "This scientific methodology helps to avoid famine becoming a term misused for political reasons," Sacco said. Over the last century, famines hit China, the Soviet Union, Iran and Cambodia, often as the result of human actions. Europe suffered several famines in the Middle Ages, but its most recent were during World War I and II, when parts of Germany, Poland and the Netherlands were left starving under military blockades. In Africa there have been several famines in recent decades, from Biafra in Nigeria in the 1970s to the 1983-1985 Ethiopian famine, which ushered in a new form of celebrity fundraising and unprecedented media attention on the suffering. The last famine in the world was in Somalia in 2011, which killed an estimated 260,000 people. - Why are there still famines today? 
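Under the IPC scale as described above, all three thresholds must hold at once before an area is classified as in famine. That combined check can be sketched in a few lines (the function and argument names here are hypothetical; a real IPC analysis weighs far more evidence than three numbers):

```python
def meets_ipc_famine_thresholds(food_gap_share, acute_malnutrition, deaths_per_10k_day):
    """Return True only if all three IPC famine thresholds hold.

    food_gap_share      -- fraction of the area's population with extremely
                           limited access to basic food (threshold: >= 0.20)
    acute_malnutrition  -- fraction of the population acutely malnourished
                           (threshold: > 0.30)
    deaths_per_10k_day  -- crude death rate per 10,000 people per day
                           (threshold: > 2)
    """
    return (food_gap_share >= 0.20
            and acute_malnutrition > 0.30
            and deaths_per_10k_day > 2.0)

# All three criteria met -> famine classification is warranted
print(meets_ipc_famine_thresholds(0.25, 0.35, 2.4))  # True
# Mortality below its threshold -> not classified as famine,
# even though the food-access and malnutrition criteria are met
print(meets_ipc_famine_thresholds(0.25, 0.35, 1.1))  # False
```

The conjunction is the point: exceeding one or two thresholds signals a severe crisis, but the word "famine" is reserved for the case where all three are crossed, which is what Sacco means by the term not being used lightly.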
- While South Sudan is officially experiencing famine, the UN has warned that Nigeria, Somalia and Yemen are all on the verge of the classification, which could affect more than 20 million people. "The common denominator is protracted armed conflict and its negative impact on access to food, farming and livestock production... livelihoods, trade and, not least, humanitarian delivery," Sacco said. Of the four famine alerts, only one -- Somalia -- is caused by drought, while the other three stem from conflicts. - What is life like under famine? - In South Sudan, people have gone through cycles of displacement over the past three years which have driven many of them to hide in swamps, having lost their homes, crops and livestock. With nothing else available, they spend days foraging for wild foods such as water lily roots, fruit or fish, Sacco said. They also spend days walking in search of food aid through areas controlled by armed groups. "They are extremely weak, hungry, and drink unsafe water from ponds and rivers," he said. Cholera is a constant threat. - What does it mean to die from hunger? - When lack of food has led to an 18 percent loss of weight, the body starts undergoing physiological disturbances, according to a 1997 study of hunger strikes published in the British Medical Journal. "The body metabolism gets increasingly dysfunctional, impacting the brain and other vital organs. At that point, therapeutic feeding treatment is necessary to save their lives, as the body has lost the ability to process normal foods," Sacco said. When people have insufficient food over several weeks, it leads to organ failure and eventually death. - What are the long-term impacts? - Even without reaching famine, parts of the Sahel, Somalia and Ethiopia go through regular cycles of hunger that have long-term social consequences. 
"The biological damage erodes the physical well-being of entire generations of children and their development potential, possibly resulting in a weak workforce and retarded students," Sacco said. Hunger leads to stunted growth and impacts cognitive development, and can lead to poor health throughout a person's life.
News Article | January 28, 2016
Despite calling to mind biblical curses, locust plagues are actually a very modern problem. Right now, Argentina is preparing for the worst locust plague it’s seen in 60 years, and there’s evidence that warmer temperatures and more intense weather events influenced by climate change are causing locust plagues to shift globally. They’re descending on new locations, and, in some cases, becoming more frequent and more severe. When you take a look at what a locust plague looks like, it’s a pretty disturbing thought. So what are we doing to stop it? For years, our main strategy has been to simply watch the weather. “We know locust plagues are affected by weather patterns, so understanding those patterns has helped us better predict where plagues might occur,” said Arianne Cease, a sustainability researcher at Arizona State University who investigates the spread—and mitigation—of locust plagues. “We’re looking to find any outbreak pockets of locusts when they’re young, before they start flying, and then targeting them with pesticides. That’s where we’re at right now, but we think that we can take a step back even before that.” That’s what they’re doing in Argentina: spraying pesticides in a frantic attempt to decrease the locust population before it matures into adults, grows wings, and becomes a cloud of destruction decimating the country’s cotton and sunflower crops. The jury is still out on what led to the locust population surge there, but after a series of warm, wet winters, some officials are pointing to climate change as a contributing factor. Climate change has also contributed to more severe locust outbreaks in China, and the United Nations Food and Agricultural Organization recently warned that climate change could lead to more locust outbreaks in parts of Africa. There are about 20 species of locusts in the world and each of them is impacted differently by the weather, said Cease. 
That means environmental factors are not one-size-fits-all, although it's clear changes in weather mean changes in locust populations. But generally speaking, hotter, longer summers hasten the locusts' development cycle, which can lead to more generations of locusts in a season, creating a high population density. Wetter, warmer winters can have a similar effect by promoting growth of grasslands where locusts like to hang out (heavy rains can also wipe them out early in their development cycle). Locusts are solitary insects most of the time, but when the population density reaches a certain point, their instinct to travel together and form a swarm kicks in, Cease said. That means the only window to strike is while the locusts are still young. But it gets more complicated: because locusts, in manageable populations, are an important part of their ecosystems, scientists like Cease don't want to just wipe them out. They're hunting for solutions to keep locusts at bay without eradicating them, which is a tricky balancing act. To solve this problem, Cease thinks scientists need to go beyond just weather monitoring if we hope to prevent and mitigate widespread locust outbreaks. Her lab takes a multi-pronged approach to locust studies, which is precisely why I wanted to talk to her. Monitoring the weather and killing off dense populations of locusts works some of the time, but it's not a great solution. Cease's lab is one of the few that's trying to find other ways to fight back. Rather than looking at it as a purely ecological problem, Cease and her colleagues consider the economic and cultural influences at play, too. Her work, and the work of others, has shown land use has as much of an impact on locust population numbers as climate and weather patterns, for example. Cutting down forests, which act as a barrier between locust populations, can increase population density. Overgrazing, too, creates a hotbed for locust growth. 
Understanding these influences allows policies and strategies to be put in place that limit or prevent locust outbreaks altogether, Cease said. There are political influences at play, too. In the past, civil conflicts have prevented effective locust management—like spraying pesticides when the population grows too dense—in places like Mali and Niger, leading to outbreaks. There are a lot of factors at play, and Cease says the only way to predict and prevent plagues is to consider all of those influences. The solutions are pretty dry on paper—ideas like creating incentive programs for farmers to help them make better crop choices and prevent the spread of locusts—but if it results in fewer terrifying swarms of giant, crop-dissolving grasshoppers, maybe some dry policy research is exactly what we need.
News Article | September 21, 2016
You may have missed it amid all the reports of Brad and Angelina's divorce, but the United Nations opened its General Assembly session this week to discuss major global issues such as climate change and antibiotic resistance. But they really should be talking about bacon. The link between our food systems and many of the high-level issues being discussed at the UN this week is significant. From greenhouse gas emissions to the overuse of medically important antibiotics, the way we raise our animals for meat is a major contributor. Most experts agree that some small changes to what we eat, such as cutting down on meat, could have a huge impact on solving these problems. In fact, a study in 2009 calculated that a global shift to a low-meat diet could cut the costs needed to achieve our greenhouse gas goals by 50 percent. But how do we do that on a large scale, in a way that people don't hate? China—where meat consumption has risen dramatically over the last decade—released a plan earlier this year to reduce the country's meat consumption by 50 percent. It's mostly based on changing the dietary guidelines issued by the government and launching an ad campaign (featuring, for some reason, Arnold Schwarzenegger and James Cameron) to encourage people to cut back on meat. It's way too early to tell if this strategy will be effective, but at least China is staying woke to the fact that we need to start rearranging our dinner plates. "We don't need to wait for technological breakthroughs." But at the UN, even as delegates acknowledged the role of agriculture in antibiotic resistance Wednesday, there was no mention of the simplest, most obvious solution: getting everyone to cool it on the hamburgers a couple of times a week. 
"The whole green energy movement and developing technologies and getting governments on board, sure, doing those sorts of things is fine," said Sonia Faruqi, a former investment banker and the author of Project Animal Farm, which explores sustainable solutions for farming. "But it's also relatively simple. We don't need to wait for technological breakthroughs or innovation. We just need to think about what's on our plate that doesn't need to be there." When it comes to antibiotic resistance, the overuse of antimicrobials in agriculture—used both to fatten up animals and to prevent disease—is a major contributor to the development of resistant bacteria. In the US, between 70 and 80 percent of all antibiotics sold each year are sold to farmers. With climate change, the effects are even more apparent. Some researchers have declared animal agriculture the leading cause of climate change, but even those who question that claim recognize that raising livestock contributes significantly to greenhouse gas emissions. The UN's own Food and Agricultural Organization estimates livestock produce 14.5 percent of all human-generated greenhouse gas emissions. Livestock farming also uses a lot more water than crop agriculture and contributes to deforestation. As global diets shift to a more westernized cuisine, expecting everyone on the planet to give up meat, eggs and dairy entirely isn't realistic (or, from a food security perspective, necessarily the best option). But reducing our meat consumption feels a little more doable. If there was widespread understanding that curbing meat consumption could help us fight climate change and prevent antibiotic-resistant superbugs, the public might just get on board. Studies have shown that, when it comes to climate change at least, people are more likely to shift behaviors if they have a good understanding of the adverse effects of not changing their current practices. 
And we've already seen that, in a small way, with the gradual reduction of red meat consumption in the US as we've gained a better understanding of nutrition. But a study published earlier this year showed that only six percent of Americans even knew there was a link between eating meat and climate change. The truth is that cutting back on bacon alone won't fix everything, but it's a simple, viable strategy to add to our toolbox that the UN is largely ignoring. Maybe Arnold Schwarzenegger should drop by the General Assembly.