Hauge S.J., Animalia Norwegian Meat and Poultry Research Center | Nafstad O., Animalia Norwegian Meat and Poultry Research Center | Rotterud O.-J., Animalia Norwegian Meat and Poultry Research Center
International Journal of Food Microbiology | Year: 2011

The meat industry in Norway has developed national guidelines for Good Hygiene Practices for slaughtering and skinning, based on categorisation of animals. These include shearing sheep and lambs at the abattoir immediately before slaughter. The aim of this study was to investigate microbiological carcass contamination associated with: (i) different shearing regimes; (ii) fleece cleanliness; and (iii) the slaughter process. In addition, the efficacy of the national guidelines in reducing microbial contamination was evaluated. A total of 280 swab samples were collected from the brisket areas (100 cm2) of 140 naturally contaminated lamb carcasses in a commercial abattoir. Half the samples were collected at skinning of the brisket areas at the start of the slaughter-line, and half were collected at the end of the slaughter-line, just before chilling. The lambs were divided into four groups (n=35) according to the interval between shearing and slaughter: (i) 0 days (shorn at the abattoir immediately before slaughter); (ii) three days; (iii) seven days; and (iv) not shorn. Mean log colony forming units (CFU) per 100 cm2 at skinning were 5.78 and 6.95 for aerobic plate count (APC) (P<0.05), and 1.65 and 2.78 for Escherichia coli (P<0.05), for shorn and unshorn lambs, respectively. For shorn lambs, divided according to the interval between shearing and slaughter, the mean log CFU per 100 cm2 were 5.45, 5.75 and 6.12 (APC) and 1.77, 1.46 and 1.71 (E. coli) for the 0-day, 3-day and 7-day groups, respectively (P<0.05 for the difference in APC between the 0-day and 7-day groups). A four-category scale (0-3) was used to assess fleece cleanliness before skinning. Visually clean lambs (score '0') had lower levels of APC on the carcass surfaces than those categorised as dirty (scores '2-3') (P<0.05). The carcasses at the end of the slaughter-line had lower levels of APC than at skinning. However, the statistically significant reduction in E. coli on carcass surfaces seen at skinning for shorn lambs was diminished, and was no longer significantly different from the unshorn group at the end of the slaughter-line. The increased E. coli level at the end of the slaughter-line might be explained by weaknesses in slaughter hygiene, in particular suboptimal evisceration, at the abattoir used for this trial; thus the national guidelines concerning shearing did not fully achieve their intended effect of reducing microbial carcass contamination. © 2011 Elsevier B.V.

Hauge S.J., Animalia Norwegian Meat and Poultry Research Center | Wahlgren M., Nortura SA | Rotterud O.-J., Animalia Norwegian Meat and Poultry Research Center
International Journal of Food Microbiology | Year: 2011

Although hot water pasteurisation of carcasses is accepted as a general intervention in the USA, this is not the case in Europe. The aims of this study were: (i) to evaluate the microbiological effects of hot water pasteurisation of lamb carcasses, both after slaughtering and dressing and following subsequent chilling and storage; (ii) to discuss hot water pasteurisation from a public health and cost-benefit perspective; (iii) to discuss the benefits of hot water pasteurisation compared with the use of separate meat processing streams for high-risk carcasses; (iv) to evaluate the use of recycled hot water in a hygienic context and in relation to EU regulations; and (v) to consider the technological and sensory aspects of hot water pasteurisation of lamb carcasses. Samples were collected from 420 naturally contaminated lamb carcasses, with 50% of the carcasses (n=210) subjected to hot water pasteurisation at 82°C for 8 s immediately after slaughter. Surface swab samples from 4500 cm2 areas on carcasses were collected at slaughter, after chilling for 24 h, and after chilling for five days. The microbial analyses included Escherichia coli, Enterobacteriaceae, Bacillus cereus, Clostridium perfringens and aerobic plate count (APC). A resuscitation step using Tryptone Soya Agar was included in the microbiological analyses. Hot water pasteurisation significantly reduced the levels of E. coli, Enterobacteriaceae, B. cereus and APC (all P<0.001). The E. coli colony forming unit (CFU) reduction was 99.5%, corresponding to a reduction of 1.85 log CFU per carcass. E. coli was isolated from 66% of control carcasses and from 26% of pasteurised carcasses. After 24 h of storage, the reduction in E. coli had increased to 2.02 log, and after five days E. coli could not be isolated from the pasteurised carcasses. These results suggest that surface pasteurisation could be an important and efficient procedure (critical control point) for reducing generic E. coli, and thereby Shiga toxin-producing E. coli, on carcasses, and thus the risk of disease among consumers. The recycled water had acceptable physical and chemical parameters, and no spore-forming bacteria were detected. Although some carcass discolouration was observed, after 24 h the colour was acceptable. Our data fill some of the data gaps regarding hot water pasteurisation and indicate that replacing the expensive system of separate processing of high-risk carcasses with hot water surface pasteurisation should be considered as a serious option. © 2011 Elsevier B.V.
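Percent reductions and log10 reductions, as quoted in this abstract, are related by a simple conversion for any single before/after ratio; note that a mean log reduction averaged over carcasses need not convert exactly to a pooled percent reduction, since the two averages differ. A minimal sketch of the conversion (the helper names are illustrative, not from the paper):

```python
import math

def log_reduction(percent_reduction):
    """Convert a percent reduction in CFU to a log10 reduction."""
    surviving_fraction = 1.0 - percent_reduction / 100.0
    return -math.log10(surviving_fraction)

def percent_reduction(log_red):
    """Convert a log10 reduction back to a percent reduction."""
    return (1.0 - 10.0 ** -log_red) * 100.0

# For a single ratio, a 2 log10 reduction corresponds to a 99% drop
# in counts, and a 99% drop corresponds to a 2 log10 reduction.
print(round(percent_reduction(2.0), 1))  # 99.0
print(round(log_reduction(99.0), 1))     # 2.0
```

This is why regulatory targets such as "5 log10" are equivalent to a 99.999% kill for a single population.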

Hektoen L.,Animalia Norwegian Meat and Poultry Research Center
Preventive Veterinary Medicine | Year: 2012

A study was conducted in order to obtain information about sheep farms in Norway and to identify housing and management characteristics that were risk factors for neonatal mortality of lambs 0-5 days of age. A questionnaire was submitted to sheep farmers, who provided demographic data and information on sheep housing conditions and on feeding and management practices. Our description of farms is based on the questionnaire responses received from 2260 farmers. Data on lamb mortality during the preceding lambing season were available for the flocks enrolled in the Norwegian Sheep Recording System. Flocks with fewer than 20 or more than 400 lambing ewes were excluded. The total number of flocks included in the analysis of neonatal mortality was 1125. An increase in the mean number of live-born lambs per ewe per flock was associated with increasing neonatal mortality. Factors independently associated with increased neonatal survival were continuous monitoring of the ewes during the lambing season, active support to ensure sufficient colostrum intake by the lambs, feeding a combination of grass silage and hay compared with grass silage alone, and supplying roughage at least twice per day rather than only once. Increased survival was also observed in flocks where the farmer had at least 15 years of experience in sheep farming. Flocks in which the Spæl breed predominated had lower odds of neonatal deaths than flocks in which the Norwegian White breed predominated. In conclusion, measures in sheep flocks targeting feeding practices during the indoor feeding period and management practices during the lambing season would be expected to reduce neonatal lamb mortality. © 2012 Elsevier B.V.

Hauge S.J., Animalia Norwegian Meat and Poultry Research Center | Nafstad O., Animalia Norwegian Meat and Poultry Research Center | Rotterud O.-J., Animalia Norwegian Meat and Poultry Research Center
Food Control | Year: 2012

In order to reduce hide-to-carcass contamination during slaughtering and dressing of cattle, the meat industry in Norway has developed national guidelines for Good Hygiene Practices based on hide cleanliness. Three categories of hide cleanliness have been described in Norway: Category 0 is clean, Category 1 is moderately dirty, and Category 2 is very dirty. For hides classified as either Category 1 or Category 2, payments to farmers are reduced. The aim of our study was to evaluate microbiological carcass contamination associated with hide cleanliness and the slaughtering and dressing process. A total of 324 swab samples were taken from the abdomen and brisket areas (100 cm2 per sample) of 81 naturally contaminated beef carcasses at two commercial abattoirs. Samples were collected immediately after dehiding, at the start of the slaughter-line, and again at the end of the slaughter-line, just before chilling. Carcasses derived from dirty animals (Categories 1 and 2 combined, n = 34) were more contaminated just after dehiding than those from clean animals (Category 0, n = 47), as determined by aerobic plate counts (APC) (P < 0.001) and by Escherichia coli levels (P < 0.05). At the end of the slaughter-line as well, carcasses derived from dirty animals had higher APC levels than those from clean animals (P < 0.005). For all categories, carcasses had lower levels of APC and E. coli at the end of the slaughter-line than at the start (P < 0.05), and Category 1 carcasses had higher APC values than Category 0 carcasses both at the start (P < 0.05) and at the end of the slaughter-line (P = 0.05). Carcasses adjacent to those classified as Category 1 or 2, hanging just behind them on the conveyor-belt, were evaluated separately in order to investigate cross-contamination. At the end of the slaughter-line, carcasses adjacent to Category 2 carcasses had higher E. coli values than carcasses in all other categories. The national guidelines consider Category 2 carcasses to be high-risk carcasses, and they are processed separately, with heat treatment of the meat products. However, this study suggests that Category 2 carcasses are of a hygienic standard similar to those in other categories, perhaps because Category 2 carcasses are dehided and trimmed more carefully. Since trimming of visible spot contamination appears sufficient to reduce microbial contamination to adequate levels, similar to those of cleaner animals, directing these carcasses into a separate meat processing line, as demanded by the national guidelines, may be unnecessary. © 2012 Elsevier Ltd.

Holck A.L., Nofima Materials AS | Axelsson L., Nofima Materials AS | Rode T.M., Nofima Materials AS | Hoy M., Nofima Materials AS | And 4 more authors.
Meat Science | Year: 2011

After a number of foodborne outbreaks of verotoxigenic Escherichia coli involving fermented sausages, some countries have imposed regulations on sausage production. For example, the US Food Safety and Inspection Service requires a 5 log10 reduction of E. coli in fermented products. Such regulations have led to a number of studies on the inactivation of E. coli in fermented sausages by changing processing and post-processing conditions. Several factors influence the survival of E. coli, such as pre-treatment of the meat, the amounts of NaCl, nitrite and lactic acid, water activity, pH, choice of starter cultures and addition of antimicrobial compounds. Process variables such as fermentation temperature and storage time also play important roles. Although a large variety of sausage production processes exist, the reduction of E. coli achieved by production is generally in the range of 1-2 log10. In many cases this may not be enough to ensure microbial food safety. By optimising ingredients and process parameters it is possible to increase E. coli reduction to some extent, but in some cases additional post-process treatments may be required, such as storage at ambient temperature, specific heat treatments, high-pressure processing or irradiation. HACCP analyses have identified the quality of the raw materials, a low temperature in the batter when preparing the sausages, and a rapid pH drop during fermentation as critical control points in sausage production. This review summarises the literature on the reduction of verotoxigenic E. coli in the production of fermented sausages. © 2011 Elsevier Ltd.

Hauge S.J., Animalia Norwegian Meat and Poultry Research Center | Dommarsnes K., Eurofins
International Journal of Food Microbiology | Year: 2010

Abattoirs have to enumerate Escherichia coli on carcass surfaces as part of compulsory HACCP monitoring and therefore need rapid and reliable methods. The objective of this study was to compare a conventional plating method with a faster, simpler method for the detection and enumeration of E. coli in samples from naturally contaminated carcasses. The two methods were the conventional pour plate method of the Nordic Committee on Food Analysis (NMKL Method No. 125) and the enzymatic SimPlate Coliforms & E. coli method. The material comprised 588 cotton-cloth samples used for swabbing 100 cm2 areas at four sites on cattle and lamb carcasses in three commercial abattoirs in Norway. E. coli was detected by at least one of the methods in 270 (46%) of the samples. Forty-five samples (8%) were positive only by SimPlate, while 28 samples (5%) were positive only by NMKL 125. Cohen's kappa was 0.74 for the detection/non-detection results, indicating a high level of agreement between the two methods. E. coli counts determined by the conventional NMKL 125 method showed a high concordance correlation (ccc 0.80, slope 0.99) with the most probable number (MPN) values obtained with SimPlate. SimPlate is a rapid and reliable method for the detection and enumeration of E. coli and a suitable alternative for use with swab samples from cattle and lamb carcasses. © 2010 Elsevier B.V.
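The reported kappa can be recovered from the counts given in the abstract: of 588 samples, 45 were positive only by SimPlate and 28 only by NMKL 125, so 270 - 45 - 28 = 197 were positive by both methods and 588 - 270 = 318 by neither. A minimal sketch of Cohen's kappa for two binary detection methods (the helper names are illustrative):

```python
def cohens_kappa(both_pos, only_a, only_b, both_neg):
    """Cohen's kappa for agreement between two binary (detect/non-detect) methods."""
    n = both_pos + only_a + only_b + both_neg
    # Observed agreement: fraction of samples where both methods agree.
    p_o = (both_pos + both_neg) / n
    # Chance agreement expected from the marginal detection rates.
    a_pos = (both_pos + only_a) / n  # method A positive rate
    b_pos = (both_pos + only_b) / n  # method B positive rate
    p_e = a_pos * b_pos + (1 - a_pos) * (1 - b_pos)
    return (p_o - p_e) / (1 - p_e)

# Counts derived from the abstract's figures.
kappa = cohens_kappa(both_pos=197, only_a=45, only_b=28, both_neg=318)
print(round(kappa, 2))  # 0.74, matching the reported value
```

Kappa corrects the raw agreement (here about 88%) for the agreement expected by chance, which is why it is preferred over simple percent agreement for method comparisons.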

Groneng G.M., Norwegian Veterinary Institute | Green L.E., University of Warwick | Kaler J., University of Nottingham | Vatn S., Animalia Norwegian Meat and Poultry Research Center | Hopp P., Norwegian Veterinary Institute
Preventive Veterinary Medicine | Year: 2014

In 2008, ovine footrot was detected in Norway for the first time since 1948. By December 2012 it had spread to 99 flocks, all in the county of Rogaland in south-west Norway, 42% of which were located in the municipality of Rennesøy in Rogaland. The aim of this study was to investigate risk factors for contracting severe footrot in flocks of sheep. A flock was considered positive for severe footrot based on a positive virulence test, or on clinical signs in addition to a positive PCR test. A retrospective longitudinal study was performed with a questionnaire as the main data source. All 107 sheep farmers in the municipality of Rennesøy were selected for inclusion in the study. The questions focused on direct and indirect contacts between sheep in different flocks and on general information about the farm, and covered the years 2007-2011. Data were analysed using discrete time survival modelling. A total of 81 (76%) farmers responded to the questionnaire, including 29 of the 41 (71%) farmers with flocks positive for severe footrot. Factors that increased the risk of a flock becoming positive for severe footrot in the final multivariable survival model were sheep trespassing boundary fences and coming into contact with a flock positive for severe footrot (odds ratio 11.5, 95% confidence interval 4.1-32.2) and having at least one flock with severe footrot within a 0-1 km radius of the farm (odds ratio 8.6, 95% confidence interval 2.3-32.6). This study highlights the importance of upgrading and maintaining boundary fences and of encouraging farmers to avoid direct and indirect contact between nearby flocks. © 2013 Elsevier B.V.

Vatn S., Animalia Norwegian Meat and Poultry Research Center | Hektoen L., Animalia Norwegian Meat and Poultry Research Center | Hoyland B., Animalia Norwegian Meat and Poultry Research Center | Reiersen A., Norwegian Food Safety Authority | And 2 more authors.
Small Ruminant Research | Year: 2012

Norway was regarded as free from footrot until the detection of Dichelobacter nodosus in a flock suffering from severe lameness in 2008. D. nodosus was subsequently shown to be prevalent throughout the country; however, virulent strains were isolated from sheep in only one of 19 counties. Severe footrot has been diagnosed in a total of 97 sheep flocks. An elimination program was established, based on clinical examination, slaughter of selected animals, foot bathing with zinc sulphate, judicious use of clean pastures and ongoing clinical monitoring, with the aim of eliminating severe footrot. The elimination program has so far been carried out in 35 flocks with severe footrot, and preliminary results indicate a success rate of 65-70%. The continued success of the program is important to ensure economic productivity and high standards of animal welfare. © 2012 Elsevier B.V.

Rendueles E., University of León | Omer M.K., Animalia Norwegian Meat and Poultry Research Center | Alvseike O., Animalia Norwegian Meat and Poultry Research Center | Alonso-Calleja C., University of León | And 2 more authors.
LWT - Food Science and Technology | Year: 2011

High hydrostatic pressure (HHP) processing, a novel non-thermal method, has shown great potential for producing microbiologically safer products while maintaining the natural characteristics of food items. Scientific research on the process and its industrial applications has been widespread over the past two decades, with many publications describing its uses, advantages and limitations. This review describes the effect of HHP on foodborne pathogenic microorganisms, their structures and adaptive mechanisms, the intrinsic and extrinsic factors that affect its application, with a focus on microbiological safety, and research needs. In a risk assessment context, the tools and mechanisms in place to monitor, optimise and validate the process, and procedures for assessing and modelling the lethal effect of the treatment, are reviewed. © 2010 Elsevier Ltd.

Heir E., Nofima Materials AS | Holck A.L., Nofima Materials AS | Omer M.K., Animalia Norwegian Meat and Poultry Research Center | Alvseike O., Animalia Norwegian Meat and Poultry Research Center | And 3 more authors.
International Journal of Food Microbiology | Year: 2010

Outbreaks of verotoxigenic Escherichia coli (VTEC) linked to dry-fermented sausages (DFSs) have emphasised the need for DFS manufacturers to introduce measures that enhance safety while maintaining the sensory qualities of their products. To our knowledge, no data have yet been reported on non-O157:H7 VTEC survival in DFS. Here, the importance of recipe and process variables for VTEC (O157:H7 and O103:H25) reductions in two types of DFS, morr and salami, was determined through three statistically designed experiments. Linear regression and ANOVA analyses showed that no single variable had a dominant effect on VTEC reductions. High levels of NaCl, NaNO2, glucose (low pH) and fermentation temperature enhanced VTEC reduction, while high fat content and a large casing diameter (aw) had the opposite effect. Interaction effects were small, and the process and recipe variables showed similar effects in morr and salami. In general, recipes combining high batter levels of salt (NaCl and NaNO2) and glucose with a high fermentation temperature, giving DFS with low final pH and aw, provided approximately 3 log10 reductions, compared with approximately 1.5 log10 reductions for standard-recipe DFS. Storage at 4°C for 2 months provided additional VTEC reductions of 0.33-0.95 log10, which were only marginally affected by recipe type. Sensory tests revealed only small differences between the various recipes of morr and salami. By optimising recipe and process parameters, it is possible to increase the microbial safety of DFS while maintaining the sensory qualities of the sausages. © 2010 Elsevier B.V.
