Assessing the impact of integrated pest management farmer field schools (IPM-FFSs) on farmers' acquisition of knowledge regarding pesticide use and nutrient management, and on their confidence in the decision-making process
Siddiqui A.A.,Agriculture Extension Wing |
Siddiqui M.,University of Sindh |
Knox O.,Scottish Agricultural College SAC
Pakistan Journal of Life and Social Sciences | Year: 2012
To encourage environmentally friendly farming practices, the National Integrated Pest Management Programme (Nat-IPM) for cotton was launched in Sindh province, Pakistan, from 2001 to 2004. The programme sought to empower the farming community to make informed decisions in the field, and a new training methodology, the Farmer Field School (FFS), was introduced. Integrated Pest Management Farmer Field School (IPM-FFS) training emphasized growing healthier crops with minimal use of pesticides, which harm the environment, and encouraged reliance on natural pest-control mechanisms. The basic principle behind this new extension training method was to enable farmers to become self-sufficient by using efficient, eco-friendly cultivation techniques. To assess the impact of this new FFS training model on agro-ecologically sound IPM practices, with special reference to cotton, a study was conducted in four districts of Sindh province. The sample comprised 432 farmers in total: 144 farmers from each category (Trained, Exposed and Control) and 108 farmers from each district (Hyderabad, Tando Allahyar, Matiari and Mirpurkhas). The results indicated that IPM-FFSs increased farmers' knowledge of pesticide use and nutrient management, and their confidence in decision making regarding agro-eco-friendly farming.
Mechler R.,International Institute For Applied Systems Analysis |
Mechler R.,Vienna University of Economics and Business |
Hochrainer S.,International Institute For Applied Systems Analysis |
Aaheim A.,CICERO Center for International Climate and Environmental Research |
And 3 more authors.
Mitigation and Adaptation Strategies for Global Change | Year: 2010
Adaptation to climate change in Europe has only recently become a true policy concern, with the management of extreme events one priority item. Irrespective of future climatic changes increasing the need for systematic evaluation and management of extremes, weather-related disasters already pose substantial burdens for households, businesses and governments today. Research in the ADAM project identified substantial direct risks in terms of potential crop and asset losses due to combined drought and heatwave hazards, and flood hazards, in Southern and Eastern Europe, respectively. This paper focuses on the indirect, medium- to longer-term economic risks triggered by the direct risks and mediated by policy responses. We present a selection of three economic impact and adaptation assessments and modelling studies undertaken on extreme event adaptation in Europe. Responding to a need for more economically based adaptation assessments, we address some relatively unresearched issues such as the understanding of past adaptation, the role of market responses to impacts, and governments' ability to plan for and share out extreme event risks. The first analysis undertakes an empirical exploration of observed impacts and adaptation in the agricultural sector in the UK, comparing the impact of consecutive extreme events over time in order to determine whether adaptation has occurred in the past and whether this can be used to inform future estimates of adaptation rates. We find that farmers and the agricultural sector clearly have adapted to extreme events over time, but whether this rate can be maintained into the future is unclear, as some of the autonomous adaptation enacted appears to have been relatively easy to implement. Markets may mediate or amplify impacts, and in the second analysis we use an economic general equilibrium model to assess the economic effects of a reduction in agricultural production due to drought and heatwave risk in exposed regions in Spain.
The analysis suggests that modelled losses to the local economy are more serious in a large-scale scenario in which neighbouring provinces are also affected by drought and heatwave events. This is due to the supply-side induced price increase leading to some passing on of disaster costs to consumers. The simulation highlights the importance of paying particular attention to the spatial and distributional effects that weather extremes, and possible changes therein induced by climate change, may incur. Finally, we discuss how national governments may better plan for their disaster liabilities resulting from the need to manage relief and reconstruction activities post-event. We do so using a risk-based economic planning model assessing the fiscal consequences associated with coping with natural extremes. We identify large weather-related disaster contingent liabilities, particularly in the key flood hotspot countries of Austria, Romania, and Hungary. Such substantial disaster liabilities ("hidden disaster deficits"), when interacting with weak fiscal conditions, may lead to substantial additional stress on government budgets and reduced fiscal space for funding other relevant public investment projects. Overall, our paper underlines the importance of respecting the specific spatial and temporal characteristics of extreme event risk when generating information for adaptation decisions. As the adaptation decisions considered, such as the use of sovereign risk-financing instruments, are associated with a rather short time horizon, the analysis largely focuses on the management of today's extreme events and does not discuss in detail projections of risks into a future with climate change. Such projections raise important issues of uncertainty, which in some instances may actually render future projections non-robust, a constraint to be kept in mind when addressing longer-term decisions, which at the same time should account for both climate and socioeconomic change.
© 2010 Springer Science+Business Media B.V.
Guy J.H.,Northumbria University |
Cain P.J.,Northumbria University |
Seddon Y.M.,Northumbria University |
Baxter E.M.,Scottish Agricultural College SAC |
Edwards S.A.,Northumbria University
Animal Welfare | Year: 2012
New livestock housing systems designed to improve animal welfare will only see large-scale commercial adoption if they improve profitability, or are at least cost neutral to the farm business. Economic evaluation of new system developments is therefore essential to determine their effect on cost of production and hence the extent of any market premium necessary to stimulate adoption. This paper describes such an evaluation in relation to high welfare farrowing systems for sows, where any potential system needs to reconcile the behavioural needs of the sow with piglet survivability, acceptable capital and running costs, farm practicality and ease of management. In the Defra-sponsored PigSAFE project, a new farrowing system has been developed which comprises a loose, straw-bedded pen with embedded design features which promote piglet survival. Data on this and four other farrowing systems (new systems: 360° Farrower and a Danish pen; existing systems: crate and outdoor paddock) were used to populate a model of production cost taking account of both capital and running costs (feed, labour, bedding, etc). Assuming equitable pig performance across all indoor farrowing systems, the model estimated a higher production cost for non-crate systems by 1.6, 1.7 and 3.5%, respectively, for the 360° Farrower, Danish and PigSAFE systems on a per-sow basis. The outdoor production system had the lowest production cost. An online survey of pig producers confirmed that, whilst some producers would consider installing a non-crate system, the majority of producers remain cautious about considering alternatives to the farrowing crate. If pig performance in alternative indoor systems could be improved from the crate baseline (e.g. through reduced piglet mortality, improved weaning weight or sow re-breeding), then the differential cost of production could be reduced.
Indeed, with further innovation by pig producers, management of alternative farrowing systems may evolve to a point where there can be improvements in both welfare and pig production. However, larger data sets of alternative systems on commercial farms will be needed to explore fully the welfare/production interface before such a relationship can be confirmed for those pig producers who will be replacing their units in the next ten years. © 2012 Universities Federation for Animal Welfare.
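The production-cost model described above combines annualized capital with annual running costs. A minimal sketch of such a per-sow cost comparison is shown below; the capital outlays, discount rate, building life and running costs are entirely hypothetical placeholders, not the figures used in the PigSAFE evaluation.

```python
def annualized_capital_cost(capital, rate, years):
    """Annual equivalent cost of a capital outlay via the standard annuity formula."""
    return capital * rate / (1 - (1 + rate) ** -years)

def cost_per_sow(capital, rate, years, running):
    """Total annual production cost per sow place: amortized capital + running costs."""
    return annualized_capital_cost(capital, rate, years) + running

# Hypothetical inputs: capital cost per sow place, 7% discount rate,
# 15-year building life, annual running costs (feed, labour, bedding).
crate = cost_per_sow(capital=2500, rate=0.07, years=15, running=600)
pigsafe = cost_per_sow(capital=3200, rate=0.07, years=15, running=640)

# Percentage cost differential of the non-crate system over the crate baseline.
differential = 100 * (pigsafe - crate) / crate
```

The differential is the quantity that determines the market premium needed to make a higher-welfare system cost neutral.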
Mrode R.,Scottish Agricultural College SAC |
Pritchard T.,Scottish Agricultural College SAC |
Coffey M.,Scottish Agricultural College SAC |
Wall E.,Scottish Agricultural College SAC
Journal of Dairy Science | Year: 2012
Genetic parameters were estimated in a joint analysis of natural log-transformed somatic cell count (TSCC) with either mastitis as a binary trait (MAS) or the number of mastitis cases (NMAS) in Holstein-Friesian cows for the first 3 lactations using a random regression model. In addition, a multi-trait analysis of MAS and NMAS was also implemented. There were 67,175, 30,617, and 16,366 cows with records for TSCC, MAS, and NMAS in lactations 1, 2, and 3, respectively. The frequency of MAS was 14, 20, and 25% in lactations 1, 2, and 3, respectively. The model for TSCC included herd-test-day, age at calving and month of calving, fixed lactation curves nested with calving year groups, and random regressions with Legendre polynomials of order 2 for animal and permanent environmental effects. The model for MAS and NMAS included fixed herd-year-season, age at calving and month of calving, and random animal and permanent environmental effects. All analyses were carried out using Gibbs sampling. Estimates of mean daily heritability averaged over a 305-d lactation were 0.11, 0.14, and 0.15 for TSCC for lactations 1, 2, and 3, respectively. Corresponding heritability estimates for MAS were 0.05, 0.07, and 0.09. The heritabilities for NMAS were similar at 0.06, 0.07, and 0.12, respectively, for lactations 1, 2, and 3. The genetic correlations between lactations 1 and 2, 1 and 3, and 2 and 3 were 0.75, 0.64, and 0.92 for computed 305-d lactation TSCC; 0.55, 0.48, and 0.89 for MAS; and 0.62, 0.42, and 0.85 for NMAS, respectively. The genetic correlations between MAS and TSCC were positive and generally moderate to high. The genetic correlations between computed 305-d lactation TSCC and MAS were 0.53, 0.61, and 0.68 in lactations 1, 2, and 3, respectively. Similar corresponding genetic correlations were obtained between computed 305-d lactation TSCC and NMAS in the respective parities.
Mastitis as a binary trait and NMAS in the same lactation were very highly correlated and were genetically the same trait. It is intended that the new parameters will be used in setting up a national evaluation system for the joint analysis of TSCC and MAS. © 2012 American Dairy Science Association.
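In a random regression model of this kind, daily variances are obtained by projecting the (co)variance matrices of the regression coefficients through Legendre polynomial covariates, and mean daily heritability is the average of the daily ratios over the lactation. The sketch below illustrates that computation; the matrices `G` and `P` and the residual variance are hypothetical stand-ins, not the estimates reported in the paper.

```python
import numpy as np

def legendre_covariates(day, dmin=4, dmax=305):
    """Normalized Legendre polynomial covariates (order 2) at a given day in milk."""
    x = 2 * (day - dmin) / (dmax - dmin) - 1           # standardize day to [-1, 1]
    p0, p1, p2 = 1.0, x, 0.5 * (3 * x**2 - 1)          # Legendre P0, P1, P2
    # scale by the usual normalization constants sqrt((2k+1)/2)
    return np.array([p0 * np.sqrt(0.5), p1 * np.sqrt(1.5), p2 * np.sqrt(2.5)])

# Hypothetical (co)variance matrices of the animal (G) and permanent
# environmental (P) regression coefficients, plus a residual variance.
G = np.array([[0.30, 0.05, 0.01],
              [0.05, 0.10, 0.02],
              [0.01, 0.02, 0.05]])
P = np.array([[0.60, 0.10, 0.02],
              [0.10, 0.20, 0.03],
              [0.02, 0.03, 0.08]])
sigma_e2 = 1.0

days = np.arange(4, 306)
h2 = []
for d in days:
    z = legendre_covariates(d)
    va = z @ G @ z                    # additive genetic variance on day d
    vpe = z @ P @ z                   # permanent environmental variance on day d
    h2.append(va / (va + vpe + sigma_e2))

mean_daily_h2 = float(np.mean(h2))    # heritability averaged over the 305-d lactation
```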
Tiley G.E.D.,Scottish Agricultural College SAC
Journal of Ecology | Year: 2010
1. This account presents information on all aspects of the biology of Cirsium arvense that are relevant to understanding its ecological characteristics and behaviour. The main topics are presented within the standard framework of the Biological Flora of the British Isles: distribution, habitat, communities, responses to biotic factors, responses to environment, structure and physiology, phenology, floral and seed characters, herbivores and disease, history, conservation and management.
2. Cirsium arvense, creeping thistle (Californian thistle, Canada thistle), one of the world's most troublesome and persistent weeds, is native to Europe and the eastern northern hemisphere but introduced to North America and the southern hemisphere. Latitudinal distribution north or south is limited by low winter and summer maximum temperatures and by a long-day requirement for flowering.
3. Cirsium arvense is believed to have originated in the temperate Middle East, and its spread has closely followed human migration and agricultural activity. Colonization of new sites is by seed, which establishes best in bare or disturbed ground, mirroring its prehistoric ecology as an opportunist pioneer of bare ground and organic residues. It is now a widespread and scheduled agricultural weed in both arable crops and pastures, and also a constituent of over 70 British (National Vegetation Classification) plant communities, occurring mainly on waste or neglected land, roadsides, hedgerows and disturbed areas.
4. Its presence in crops leads to yield losses, and in pastures it seriously interferes with utilization due to the deterrent effect of the leaf spines on grazing animals. This has led to a long history of investigation into control measures (mechanical, chemical, biological and integrated), which are summarized. Combination treatments and integrated control have achieved some success, but effective control requires follow-up procedures over a number of seasons. Climate change studies suggest C. arvense could grow better and be more difficult to control in future.
5. Success and persistence derive from an extensive, far-creeping and deep rooting system which ensures survival and rapid vegetative spread under a wide range of soil and management conditions, and a means of escape from sub-aerial control treatments. New adventitious buds capable of shoot development can arise at any point along the horizontal roots, even when these are cut into pieces or damaged. Root buds remain dormant until released from dormancy through damage or decay of the aerial shoots. Carbohydrate root reserves, stored in swollen cortical tissue, fall to a minimum just before flowering and are then replenished for perennation during the subsequent winter. Strategies for control aim to treat the plant when root carbohydrate reserves are at a minimum, to exhaust these reserves and to prevent replenishment for further perennation.
6. Balanced against its difficulty as a weed, C. arvense has significant conservation value as a host to numerous insects, many attracted by copious and accessible nectar and strong flower fragrance. It is, however, a strong competitor to low-growing plants in natural communities.
7. Cirsium arvense is dioecious and has a 14-16 h day-length requirement for flowering. Seed set is successful if male and female plants are no more than 50-90 m apart, allowing insect pollination. In spite of the conspicuous wind-borne pappus, this rarely carries a seed, which normally falls near the parent plant. The flower heads and other plant parts are regularly attacked by numerous insects and less frequently by diseases.
8. Germination of seed occurs mainly during the high temperatures of early summer in the year following dispersal, and establishment is most successful in open areas. Development of the branching root system and vegetative spread follow rapidly.
9. A combination of dioecy and vegetative reproduction has resulted in the maintenance of genotypic and genetic diversity within populations, allowing efficient colonization and persistence and contributing greatly to the success of the species. © 2010 The Author. Journal compilation © 2010 British Ecological Society.
Prieto N.,Sustainable Livestock Systems Group |
Prieto N.,Scottish Agricultural College SAC |
Ross D.W.,Sustainable Livestock Systems Group |
Navajas E.A.,Sustainable Livestock Systems Group |
And 4 more authors.
Animal | Year: 2011
The objective of this study was to examine the online use of near infrared reflectance (NIR) spectroscopy to estimate the concentration of individual and groups of fatty acids (FA) as well as intramuscular fat (IMF) in crossbred Aberdeen Angus (AA×) and Limousin (LIM×) cattle. This was achieved by direct application of a fibre-optic probe to the muscle immediately after exposing the meat surface in the abattoir at 48 h post mortem. Samples of M. longissimus thoracis from 88 AA× and 106 LIM× were scanned over the NIR spectral range from 350 to 1800 nm, and samples of the M. longissimus lumborum were analysed for IMF content and FA composition. Statistically significant differences (P < 0.001) were observed in most FA between the two breeds studied, with FA concentrations being generally higher in AA× meat. NIR calibrations, tested by cross-validation, showed moderate to high predictability in LIM× meat samples for C16:0, C16:1, C18:0, trans11 C18:1, C18:1, C18:2 n-6, C20:1, cis9, trans11 C18:2, SFA (saturated FA), MUFA (monounsaturated FA), PUFA (polyunsaturated FA) and IMF content with R² (SECV, mg/100 g muscle) of 0.69 (146), 0.69 (28), 0.71 (62), 0.70 (8.1), 0.76 (192), 0.65 (13), 0.71 (0.9), 0.71 (2.9), 0.68 (235), 0.75 (240), 0.64 (17) and 0.75 (477), respectively. FA such as C14:0, C18:3 n-3, C20:4 n-6, C20:5 n-3, C22:6 n-3, n-6 and n-3 were more difficult to predict by NIR in these LIM× samples (R² = 0.12 to 0.62; SECV = 0.5 to 26 mg/100 g muscle). In contrast, NIR showed low predictability for FA in AA× beef samples. In particular for LIM×, the correlations of NIR measurements and several FA in the range from 0.81 to 0.87 indicated that NIR spectroscopy is a useful online technique for the early, fast and relatively inexpensive estimation of FA composition in the abattoir. Copyright © 2010 The Animal Consortium.
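Calibrations of this kind are judged by their cross-validated R² and standard error of cross-validation (SECV). The sketch below illustrates how those two statistics are computed, using leave-one-out cross-validation of an ordinary least-squares model on synthetic data; it is a simplified stand-in for, not a reproduction of, the multivariate NIR calibration used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 100 samples, 5 informative predictors, and a
# fatty-acid response (mg/100 g muscle); purely illustrative.
n, p = 100, 5
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)
y = X @ beta + rng.normal(scale=0.5, size=n)

def loo_predictions(X, y):
    """Leave-one-out cross-validated predictions from ordinary least squares."""
    Xd = np.column_stack([np.ones(len(y)), X])   # add intercept column
    preds = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i            # drop sample i from the fit
        coef, *_ = np.linalg.lstsq(Xd[mask], y[mask], rcond=None)
        preds[i] = Xd[i] @ coef                  # predict the held-out sample
    return preds

yhat = loo_predictions(X, y)
secv = float(np.sqrt(np.mean((y - yhat) ** 2)))  # standard error of cross-validation
r2 = float(1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2))
```

A calibration with R² near 0.7 and a small SECV relative to the trait mean, as reported for several fatty acids above, is what makes the online NIR approach practically useful.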
Aakre S.,CICERO Center for International Climate and Environmental Research |
Banaszak I.,Polish Academy of Sciences |
Banaszak I.,Slovak Academy of Sciences |
Mechler R.,International Institute For Applied Systems Analysis |
And 6 more authors.
Mitigation and Adaptation Strategies for Global Change | Year: 2010
Increasing losses from weather-related extreme events, coupled with limited coping capacity, suggest a need for strong adaptation commitments, of which public sector responses to adjustments to actual and expected climate stimuli are key. The European Commission has started to address this need in the emerging European Union (EU) climate adaptation strategy; yet a specific rationale for adaptation interventions has not been clearly identified, and the economic case for adaptation to extremes remains vague. Basing the diagnosis on economic welfare theory and an empirical analysis of the current EU and member states' roles in managing disaster risk, we discuss how and where the public sector may intervene to manage climate variability and change. We restrict our analysis to financial disaster management, a domain of adaptation intervention which is of key concern for the EU adaptation strategy. We analyse three areas of public sector intervention (supporting national insurance systems, providing compensation to those affected post-event, and intergovernmental loss sharing through the EU Solidarity Fund) according to the three government functions of allocation, distribution, and stabilization suggested by welfare theory, and suggest room for improvement. © 2010 The Author(s).
Bell M.J.,Scottish Agricultural College SAC |
Wall E.,Scottish Agricultural College SAC |
Russell G.,University of Edinburgh |
Simm G.,Scottish Agricultural College SAC |
Stott A.W.,Scottish Agricultural College SAC
Journal of Dairy Science | Year: 2011
This study compared the environmental impact of a range of dairy production systems in terms of their global warming potential (GWP, expressed as carbon dioxide equivalents, CO2-eq.) and associated land use, and explored the efficacy of reducing said impact. Models were developed using the unique data generated from a long-term genetic line × feeding system experiment. Holstein-Friesian cows were selected to represent the UK average for milk fat plus protein production (control line) or were selected for increased milk fat plus protein production (select line). In addition, cows received a low forage diet (50% forage) with no grazing or were on a high forage (75% forage) diet with summer grazing. A Markov chain approach was used to describe the herd structure and help estimate the GWP per year and land required per cow for the 4 alternative systems and the herd average using a partial life cycle assessment. The CO2-eq. emissions were expressed per kilogram of energy-corrected milk (ECM) and per hectare of land use, as well as land required per kilogram of ECM. The effects of a phenotypic and genetic standard deviation unit improvement on herd feed utilization efficiency, ECM yield, calving interval length, and incidence of involuntary culling were assessed. The low forage (nongrazing) feeding system with select cows produced the lowest CO2-eq. emissions of 1.1 kg/kg of ECM and land use of 0.65 m²/kg of ECM but the highest CO2-eq. emissions of 16.1 t/ha of the production systems studied. Within the herd, an improvement of 1 standard deviation in feed utilization efficiency was the only trait of those studied that would significantly reduce the reliance of the farming system on bought-in synthetic fertilizer and concentrate feed, as well as reduce the average CO2-eq. emissions and land use of the herd (both by about 6.5%, of which about 4% would be achievable through selective breeding). Within production systems, reductions in CO2-eq. emissions per kilogram of ECM and CO2-eq. emissions per hectare were also achievable by an improvement in feed utilization. This study allowed development of models that harness the biological trait variation in the animal to improve the environmental impact of the farming system. Genetic selection for efficient feed use for milk production according to feeding system can bring about reductions in system nutrient requirements, CO2-eq. emissions, and land use per unit product. © 2011 American Dairy Science Association.
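A Markov chain herd-structure model of the kind described above can be sketched as follows: parities are the states, culled cows are replaced by first-parity entrants, and the stationary distribution weights per-parity emissions and milk yield to give an emissions intensity per kilogram of ECM. The survival rates, emissions and yields below are hypothetical placeholders, not the study's values.

```python
import numpy as np

# Hypothetical annual probabilities of surviving from parity i to parity i+1;
# all parity-5 cows leave the herd.
survival = np.array([0.75, 0.70, 0.65, 0.55, 0.0])
n = len(survival)

# Column-stochastic transition matrix: column j gives where parity-j cows go.
T = np.zeros((n, n))
for j in range(n - 1):
    T[j + 1, j] = survival[j]      # survive into the next parity
T[0, :] = 1 - survival             # culled cows replaced by parity-1 heifers

# Steady-state parity distribution, found by iterating the chain.
state = np.full(n, 1.0 / n)
for _ in range(500):
    state = T @ state
state /= state.sum()

# Hypothetical per-parity emissions (t CO2-eq./cow/yr) and yield (kg ECM/yr).
emissions = np.array([8.0, 9.0, 9.5, 9.5, 9.5])
ecm = np.array([6500, 7500, 8000, 8000, 7800])

# Herd-average emissions intensity in kg CO2-eq. per kg ECM.
kg_co2_per_kg_ecm = float(1000 * (state @ emissions) / (state @ ecm))
```

Changing the survival vector is how a one-standard-deviation improvement in, say, involuntary culling would propagate through herd structure to the emissions intensity in a model of this type.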
Jones L.A.,Scottish Agricultural College SAC |
Sakkas P.,Scottish Agricultural College SAC |
Sakkas P.,Wageningen University |
Houdijk J.G.M.,Scottish Agricultural College SAC |
And 3 more authors.
International Journal for Parasitology | Year: 2012
The degree of periparturient relaxation of immunity to gastrointestinal parasites has a nutritional basis, as overcoming protein scarcity through increased protein supply improves lactational performance, enhances local immune responses and reduces worm burdens. Herein lactating rats, re-infected with Nippostrongylus brasiliensis, are used to test the hypothesis that a similar and rapid improvement of immunity can be achieved through reducing nutrient demand at times of dietary protein scarcity. Reducing litter size from 12 to three pups during lactation resulted, as expected, in cessation of maternal body weight loss and increased pup body weight gain compared with dams which continued to nurse 12 pups. This increase in performance concurred with a rapid decrease in parasitism; within 3 days post nutrient reduction, an 87% reduction in the number of worm eggs found in the colon and an 83% reduction in worm burdens was observed, which concurred with increased local immune responses, i.e. 70% more mast cells and 44% more eosinophils in the small intestinal mucosa, to levels similar to those in dams nursing three pups throughout. However, there were no concurrent changes in goblet cell hyperplasia, serum anti-N. brasiliensis-specific antibody levels or mRNA expression of IL-4, IL-10 or IL-13 in the mesenteric lymph nodes. To our knowledge the current study is the first to employ a litter reduction strategy to assess the rate of immune improvement upon overcoming nutrient scarcity in a non-ruminant host. These data support the hypothesis that periparturient relaxation of immunity to gastrointestinal nematodes can be reduced by restoring nutrient adequacy and, importantly, that this improvement can occur very rapidly. © 2012.