Munkholm L.J.,University of Aarhus |
Hansen E.M.,University of Aarhus |
Thomsen I.K.,University of Aarhus |
Wahlstrom E.M.,University of Aarhus |
Soil Use and Management | Year: 2017
Early seeding of winter wheat (Triticum aestivum L.) has been proposed as a means to reduce N leaching as an alternative to growing cover crops like fodder radish (Raphanus sativus L.). The objective of this study was to quantify the effect of winter wheat, seeded early and normally, and of fodder radish on N dynamics and root growth. Field experiments were carried out on a humid temperate sandy loam soil. Aboveground biomass and soil inorganic N were determined in late autumn; N uptake and grain yield of winter wheat were measured at harvest. Nitrate leaching was estimated from soil water samples taken at 1 m depth. Root growth was measured in late autumn using the core break and root washing methods. Winter wheat root growth dynamics were followed during the growing season using the minirhizotron method. The 2013-2014 results showed that early seeding of wheat improved autumn growth and N uptake and reduced N leaching during the winter compared with the normal seeding time. Early-seeded wheat (WWearly) was, however, not as efficient as fodder radish at reducing N leaching. Proper establishment of WWearly was a prerequisite for benefiting from early seeding, as indicated by the 2012-2013 results. Early seeding improved root growth throughout the 2013-2014 growing season compared with normal seeding time, but had no significant effect on crop grain yield. Our results indicate the potential of using early seeding as a tool to limit drought susceptibility and increase nutrient uptake from the subsoil. © 2017 British Society of Soil Science.
Shah A.,University of Aarhus |
Askegaard M.,SEGES |
Rasmussen I.A.,International Center for Research in Organic Food Systems |
Jimenez E.M.C.,Institute Investigacion y Formacion Agraria y Pesquera IFAPA |
Olesen J.E.,University of Aarhus
European Journal of Agronomy | Year: 2017
A field experiment comparing different arable crop rotations was conducted in Denmark during 1997–2008 on three sites varying in climatic conditions and soil types, i.e. coarse sand (Jyndevad), loamy sand (Foulum), and sandy loam (Flakkebjerg). The crop rotations followed organic farm management, and from 2005 conventional management was also included for comparison. Three experimental factors were included in the experiment in a factorial design: 1) crop rotation (organic crop rotations varying in use of whole-year green manure (O1 and O2 with a whole-year green manure, and O4 without), and a conventional system without green manure (C4)), 2) catch crop (with and without), and 3) manure (with and without). The experiment consisted of three consecutive cycles using four-course rotations with all crops present every year, i.e. 1997–2000 (1st cycle), 2001–2004 (2nd cycle), and 2005–2008 (3rd cycle). In the 3rd cycle at all locations C4 was compared with two organic rotations, i.e. O2 and O4. The O2 rotation in the third cycle included spring barley, grass-clover, potato, and winter wheat, whereas C4 and O4 included spring barley, faba bean, potato, and winter wheat. For the O2 rotation with green manure there was a tendency for increased DM yield over time at all sites, whereas little response was seen in N yield. In the O4 rotation DM and N yields tended to increase at Foulum over time, but there was little change at Flakkebjerg. The DM yield gap between organic and conventional systems in the 3rd cycle varied between sites with 34–66% at Jyndevad, 21–44% at Foulum, and 32–52% at Flakkebjerg. The inclusion of grass-clover resulted in lower cumulated yield over the rotation than the treatment without grass-clover.
The use of manure reduced the DM yield gap between conventional and organic systems on average by 15 and 21 percentage points in systems with and without grass-clover, respectively, and the use of catch crops reduced the yield gap by 3 and 5 percentage points in the respective systems. The agronomic efficiency of N in manure (yield benefit for each kg of mineral N applied) was greater in O4 than in O2 for all crops. © 2017 Elsevier B.V.
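The agronomic N efficiency mentioned above is a simple ratio; the sketch below illustrates the definition with invented numbers (the yields and N rate are not from the study):

```python
# Agronomic efficiency of N in manure, as defined in the abstract:
# the yield benefit for each kg of mineral N applied. Numbers are illustrative.
def agronomic_efficiency(yield_with_n, yield_without_n, n_applied_kg):
    """Return kg DM yield gain per kg mineral N applied."""
    return (yield_with_n - yield_without_n) / n_applied_kg

# hypothetical spring barley response to 100 kg mineral N supplied in manure
print(agronomic_efficiency(5200.0, 4000.0, 100.0))  # 12.0
```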
Pandey A.,University of Aarhus |
Li F.,Northwest University, China |
Askegaard M.,SEGES |
Olesen J.E.,University of Aarhus
European Journal of Agronomy | Year: 2017
Biological nitrogen (N) fixation (BNF) by legumes in organic cropping systems has been perceived as a strategy to substitute N import from conventional sources. However, the N contribution by legumes varies considerably depending on legume species, as well as local soil and climatic conditions. There is a lack of knowledge on whether the N contribution of legumes estimated using short-term experiments reflects the long-term effects in organic systems varying in fertility building measures. There is also limited information on how fertilizer management practices in organic crop rotations affect BNF of legumes. Therefore, this study aimed to estimate BNF in long-term experiments with a range of organic and conventional arable crop rotations at three sites in Denmark varying in climate and soils (coarse sand, loamy sand and sandy loam) and to identify possible causes of differences in the amount of BNF. The experiment included 4-year crop rotations with three treatment factors in a factorial design: (i) rotations, i.e. organic with a year of grass-clover (OGC), organic with a year of grain legumes (OGL), and conventional with a year of grain legumes (CGL), (ii) with (+CC) and without (−CC) cover crops, and (iii) with (+M) and without (−M) animal manure in OGC and OGL, and with (+F) mineral fertilizer in CGL. Cover crops consisted of a mixture of perennial ryegrass and clover (at the sites with coarse sand and sandy loam soils) or winter rye, fodder radish and vetch (at the site with loamy sand soil) in OGC and OGL, and only perennial ryegrass in CGL at all sites. The BNF was measured using the N difference method. The proportion of N derived from the atmosphere (%Ndfa) in aboveground biomass of clover grown for an entire year in a mixture with perennial ryegrass and harvested three times during the growing season in OGC was close to 100% at all three sites.
The Ndfa of grain legumes in both OGL and CGL rotations ranged between 61% and 95% depending on location with mostly no significant difference in Ndfa between treatments. Cover crops had more than 92% Ndfa at all sites. The total BNF per rotation cycle was higher in OGC than in OGL and CGL, mostly irrespective of manure/fertilizer or cover crop treatments. There was no significant difference in total BNF between OGL and CGL rotations, but large differences were observed between sites. The lowest cumulated BNF by all the legume species over the 4-year rotation cycle was obtained at the location with sandy loam soil, i.e. 224–244, 96–128, and 144–156 kg N ha−1 in OGC, OGL and CGL, respectively, whereas it was higher at the locations with coarse sand and loamy sand soil, i.e. 320–376, 168–264, and 200–220 kg N ha−1 in OGC, OGL and CGL, respectively. The study shows that legumes in organic crop rotations can maintain N2 fixation without being significantly affected by long-term fertilizer regimes or fertility building measures. © 2017 Elsevier B.V.
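The N difference method used in the study estimates BNF as the N uptake of the legume minus that of a non-fixing reference crop, and %Ndfa then follows as the fixed share of the legume's N. A minimal sketch with hypothetical N uptake values (not the study's data):

```python
# N difference method for estimating biological N fixation (BNF).
# All input values below are illustrative, not from the study.

def bnf_n_difference(n_legume, n_reference):
    """BNF (kg N/ha) = N uptake of legume - N uptake of non-fixing reference crop."""
    return n_legume - n_reference

def pct_ndfa(bnf, n_legume):
    """Proportion of legume N derived from the atmosphere (%Ndfa)."""
    return 100.0 * bnf / n_legume

n_clover = 250.0    # hypothetical aboveground N in grass-clover (kg N/ha)
n_ryegrass = 40.0   # hypothetical N in a pure ryegrass reference (kg N/ha)
bnf = bnf_n_difference(n_clover, n_ryegrass)
print(bnf, round(pct_ndfa(bnf, n_clover), 1))  # 210.0 84.0
```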
Schjonning P.,University of Aarhus |
Lamande M.,University of Aarhus |
Munkholm L.J.,University of Aarhus |
Lyngvig H.S.,SEGES |
Soil and Tillage Research | Year: 2016
Compaction of the subsoil due to heavy traffic in moist and wet soil is widespread in modern agriculture. The objective of this study was to quantify the effects from realistic field traffic on soil penetration resistance and barley crop yield for three Luvisols developed from glacial till. Undisturbed soil cores were used for quantifying the precompression stress (σpc) of non-compacted soil. Tractor-trailer combinations for slurry application with wheel loads of ∼3, ∼6 and ∼8 Mg (treatments M3, M6, M8) were used for the experimental traffic in the spring at field capacity. For one additional treatment (labelled M8-1), the soil was loaded only in the first year. A tricycle-like machine with a single pass of wide tyres each carrying ∼12 Mg (treatment S12) was included at one site. Traffic treatments were applied in a randomized block design with four replicates and with treatments repeated in four consecutive years (2010–2013). After two years of repeated experimental traffic, penetration resistance (PR) was measured to a depth of 1 m. The yield of a spring barley crop (Hordeum vulgare L.) was recorded in all four years of the experiment. The results did not support our hypothesis of σpc as a soil strength measure predicting resistance to subsoil compaction. The tyre inflation pressure and/or the mean ground pressure were the main predictors of PR in the upper soil layers. For deeper soil layers, PR correlated better to the wheel load. The number of wheel passes (M-treatments vs the S12 treatment) modified this general pattern, indicating a very strong impact of repeated wheel passes. Our data indicate that a single traffic event may mechanically weaken the soil without inducing major compaction but with influence on the effect of subsequent traffic even after as long an interval as a year (treatments M8 vs M8-1). Crop yields were strongly influenced by compaction of the plough layer.
Due to the repeated wheel passes for the M-treatments, significant yield penalties were observed, while the single-pass treatment with 12 Mg wheel load in S12 did not have significant effects on crop yield. Our hypothesis of 3 Mg wheel load as an upper threshold for not inducing subsoil compaction was confirmed for the tractor-trailer treatments with repeated wheel passes but not supported for the single-pass machinery. The results call for further studies of the potential for carrying high loads using wide, low-pressure tyres by crab steering/dog-walk machinery. © 2016 Elsevier B.V.
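Mean ground pressure, one of the predictors of penetration resistance in the upper soil layers, is simply the wheel load over the tyre-soil contact area. A rough sketch with assumed values (the contact area is hypothetical, not measured in the study):

```python
# Mean ground pressure in kPa from wheel load (Mg) and tyre-soil contact area (m^2).
# The 0.6 m^2 contact area is an assumption for illustration only.
def mean_ground_pressure_kpa(wheel_load_mg, contact_area_m2):
    g = 9.81                               # gravitational acceleration, m/s^2
    force_n = wheel_load_mg * 1000.0 * g   # load in newtons
    return force_n / contact_area_m2 / 1000.0  # Pa -> kPa

print(round(mean_ground_pressure_kpa(8.0, 0.6)))  # 131 (kPa, for the 8 Mg wheel)
```

Wide, low-pressure tyres reduce this value by enlarging the contact area, which is why the single-pass S12 machine could carry 12 Mg without a yield penalty.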
Kristensen T.,University of Aarhus |
Aaes O.,SEGES |
Weisbjerg M.R.,University of Aarhus
Livestock Science | Year: 2015
Cattle production during the last century has changed dramatically in Western Europe, including Denmark, with a steady increase in production per animal and in herd and farm size. The effect of these changes on total production, herd efficiency, surplus of nitrogen (N) at herd and farm level and emission of greenhouse gases (GHG) per kg product has been evaluated for the Danish dairy cattle sector based on historic information. Typical farms representing the average situation for Danish dairy cattle farms and land required for feed supply were modeled for the situation in: (A) 1920 - representing a local-based production, (B) 1950 - representing a period with emerging mechanization and introduction of new technologies and a more global market, (C) 1980 - representing a period with heavy use of external resources like fertilizer and feed protein and (D) 2010 - today with focus on balancing production and risk of environmental damage. In A, B and C, other livestock such as pigs and hens also played a role, while the dairy farm in 2010 only had cattle. In 1920 and 1950 the farm was based on 7-8 dairy cows producing typically 1800-3400kg energy-corrected milk (ECM) per cow annually and fed primarily on pasture and hay, only to a limited extent supplemented with imported protein. In 1980 the herd size had increased to 20 dairy cows producing 5000kg ECM each, and feeding was with silage instead of hay, but still included grazing and there was a larger proportion of imported feed. In 2010 the herd had increased to 134 dairy cows producing 9000kg ECM per cow and fed indoors all year. During this period net energy used for milk and meat in % of total intake and land use per 1000kg of milk have steadily decreased as a consequence of higher milk yield per cow and higher yields of forage per ha.
In contrast, N utilization in the herd increased from 1920 to 1950 and again to 2010, but dropped in the 1980 system, where the environmental N surplus per ha farmland was also highest (40; 65; 226; 148kg N per ha farmland in the respective periods). The lower N efficiency in 1980 also resulted in a higher GHG emission per kg milk than in the preceding and following periods (2.23; 1.38; 1.94; 1.20kg CO2-eq. per kg ECM in the respective periods). It is concluded that the biological and technical development has made it possible to reduce the environmental load of dairy production significantly, but that this requires a strong focus on nitrogen management at the farm level and production efficiency in the herd. © 2015 Elsevier B.V.
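N surplus figures of the kind quoted per period come from a farm-gate balance: N imported (feed, fertilizer, fixation, deposition) minus N exported (milk, meat), per hectare. A minimal sketch with invented totals, chosen only to illustrate the calculation:

```python
# Farm-gate N surplus: (N inputs - N outputs) / farm area, in kg N per ha.
# All values below are invented for illustration, not taken from the study.
def n_surplus_per_ha(n_inputs_kg, n_outputs_kg, area_ha):
    return (n_inputs_kg - n_outputs_kg) / area_ha

# hypothetical 1980-style farm with heavy fertilizer and feed import
print(round(n_surplus_per_ha(14500.0, 3200.0, 50.0)))  # 226
```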
PubMed | NAV Nordic Cattle Genetic Evaluation, University of Aarhus and SEGES
Type: Journal Article | Journal: Journal of dairy science | Year: 2015
A bias in the trend of genomic estimated breeding values (GEBV) was observed in the Danish Jersey population where the trend of GEBV was smaller than the deregressed proofs for individuals in the validation population. This study attempted to improve the prediction reliability and reduce the bias of predicted genetic trend in Danish Jersey. The data consisted of 1,238 Danish Jersey bulls and 611,695 cows. All bulls were genotyped with the 54K chip, and 1,744 cows were genotyped with either 7K chips (1,157 individuals) or 54K chips (587 individuals). The trait used in the analysis was protein yield. All cows with EBV were used in a single-step approach. Deregressed proofs were used as the response variable. Four alternative approaches were compared with genomic best linear unbiased prediction (GBLUP) model with bulls in the reference data (GBLUPBull): (1) GBLUP with both bulls and genotyped cows in the reference data; (2) GBLUP including a year of birth effect; (3) GEBV from a GBLUP model that accounted for the difference of EBV between dams and maternal grandsires; and (4) using a single-step approach. The results indicated that all 4 alternatives could reduce the bias of predicted genetic trend and that the single-step approach performed best. However, not all these approaches improved reliability or reduced inflation of GEBV. The reliability was 0.30 and the regression coefficient of deregressed proofs on GEBV was 0.69 in the GBLUPBull scenario. When genotyped cows were included in the reference population, the regression coefficient decreased to 0.59 but the reliability increased to 0.35. If a year effect was included in the model, the prediction reliability decreased to 0.29 and the regression coefficient improved to 0.75. The method in which GEBV were adjusted for the difference between dam EBV and maternal grandsire EBV led to much lower regression coefficients, though the reliability increased to 0.40.
The single-step approach improved both the reliability (to 0.38) and the regression coefficient (to 0.78). Therefore, the bias in genetic trend was reduced. The results suggest that implementing the single-step approach is an effective way to improve genomic prediction in Danish Jersey cattle.
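The bias and reliability statistics above come from regressing validation-set deregressed proofs on GEBV (a slope below 1 indicates inflated GEBV) and from the squared correlation as a reliability proxy. A sketch on synthetic data (not the Danish Jersey data; a full validation reliability would also divide by the reliability of the DRP, which is omitted here for simplicity):

```python
import numpy as np

# Synthetic example of the validation regression used to assess genomic
# prediction: slope of DRP on GEBV (inflation check) and squared correlation
# (simple reliability proxy). Data are simulated, not from the study.
rng = np.random.default_rng(1)
gebv = rng.normal(0.0, 1.0, 500)              # predicted breeding values
drp = 0.7 * gebv + rng.normal(0.0, 1.0, 500)  # simulated deregressed proofs

slope = np.polyfit(gebv, drp, 1)[0]           # slope < 1 suggests inflated GEBV
reliability = np.corrcoef(gebv, drp)[0, 1] ** 2

print(f"slope={slope:.2f}, reliability={reliability:.2f}")
```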
PubMed | University of Aarhus, SEGES and Nordic Cattle Genetic Evaluation
Type: Journal Article | Journal: Journal of dairy science | Year: 2015
Including genotyped females in a reference population (RP) is an obvious way to increase the RP in genomic selection, especially for dairy breeds of limited population size. However, the incorporation of these females must be conducted cautiously because of the potential preferential treatment of the genotyped cows and lower reliabilities of phenotypes compared with the proven pseudo-phenotypes of bulls. Breeding organizations in Denmark, Finland, and Sweden have implemented a female-genotyping project with the possibility of genotyping entire herds using the low-density (LD) chip. In the present study, 5 scenarios for building an RP were investigated in the Nordic Jersey population: (1) bulls only, (2) bulls with females from the LD project, (3) bulls with females from the LD project plus non-LD project females genotyped before their first calving, (4) bulls with females from the LD project plus non-LD project females genotyped after their first calving, and (5) bulls with all genotyped females. The genomically enhanced breeding value (GEBV) was predicted for 8 traits in the Nordic total merit index through a genomic BLUP model using deregressed proof (DRP) as the response variable in all scenarios. In addition, (daughter) yield deviation and raw phenotypic data were studied as response variables for comparison with the DRP, using stature as a model trait. The validation population was formed using a cut-off birth year of 2005 based on the genotyped Nordic Jersey bulls with DRP. The average increment in reliability of the GEBV across the 8 traits investigated was 1.9 to 4.5 percentage points compared with using only bulls in the RP (scenario 1). The addition of all the genotyped females to the RP resulted in the highest gain in reliability (scenario 5), followed by scenario 3, scenario 2, and scenario 4. All scenarios led to inflated GEBV because the regression coefficients were less than 1.
However, scenario 2 and scenario 3 led to less bias of genomic predictions than scenario 5, with regression coefficients showing less deviation from scenario 1. For the study on stature, the (daughter) yield deviation performed slightly better than the DRP as the response variable in the genomic BLUP (GBLUP) model. Therefore, adding unselected females to the RP could significantly improve the reliabilities and tended to reduce the prediction bias compared with adding selectively genotyped females. Although the DRP has performed robustly so far, the use of raw data with a single-step model is recommended as an optimal solution for future genomic evaluations.
News Article | September 22, 2016
An international consortium of researchers from INRA (France), University of Copenhagen and SEGES (Denmark), BGI-Shenzhen (China) and NIFES (Norway) has now established the first catalogue of bacterial genes in the gut of pigs. This achievement is published in the latest issue of Nature Microbiology.
Mathiasen H.,Copenhagen University |
Bligaard J.,SEGES |
Esbjerg P.,Copenhagen University
Entomologia Experimentalis et Applicata | Year: 2015
The cabbage stem flea beetle, Psylliodes chrysocephala (L.) (Coleoptera: Chrysomelidae), is a major pest of winter oilseed rape. The larvae live throughout winter in leaf petioles and stems. Winter temperatures might play an important role in survival during winter and hence population dynamics, yet to what degree is unknown. This study investigates the effect of exposure time, cold acclimation, and larval stage on survival at -5 and -10 °C. Exposure time at -5 °C was 1, 2, 4, 8, 12, 16, and 20 days and 6, 12, 24, 36, 48, 72, 96, 120, and 144 h at -10 °C. Mortality increased with increasing exposure time and was significantly lower for cold-acclimated larvae. Estimated time until an expected mortality of 50% (LT50) and 90% (LT90) of larvae exposed to -5 °C was 7.4 and 9.6 days (non-acclimated) and 11.0 and 15.1 days (acclimated), respectively. Estimated LT50 for non-acclimated and acclimated larvae exposed to -10 °C was 32.6 and 70.5 h, respectively, and estimated LT90 66.8 and 132.2 h. Significant differences in mortality between larval stages were observed only at -5 °C. When exposed to -5 °C for 8 days, mortality of first and second instars was 81.2 and 51.3%, respectively. When exposed to -10 °C for 2 days, mortality of first and second instars was 70.5 and 76.1%. Data on winter temperatures in Denmark from 1990 to 2013 showed that larvae were rarely exposed to a number of continuous days at -5 or -10 °C causing a potential larval mortality of 50-90%. © 2015 The Netherlands Entomological Society.
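The LT50 and LT90 values reported above follow directly from a fitted mortality-vs-exposure-time curve. The sketch below shows the inversion for a logistic mortality model; the intercept and slope are hypothetical, not the study's fitted parameters:

```python
import math

# LT50/LT90 from a logistic mortality model,
# mortality(t) = 1 / (1 + exp(-(a + b*t))).
# The parameters a and b below are hypothetical, for illustration only.
def lethal_time(p, a, b):
    """Exposure time (days) at which expected mortality equals proportion p."""
    return (math.log(p / (1.0 - p)) - a) / b

a, b = -3.7, 0.5  # hypothetical intercept and slope (per day) at -5 deg C
print(round(lethal_time(0.5, a, b), 1))  # 7.4 (LT50, days)
print(round(lethal_time(0.9, a, b), 1))  # 11.8 (LT90, days)
```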
PubMed | SEGES, Copenhagen University and Technical University of Denmark
Type: | Journal: Frontiers in veterinary science | Year: 2016
We describe a new mechanistic bioeconomic model for simulating the spread of