
Yu L.-X.,WA | Rodriguez J.,Forage Genetics International Inc.
Molecular Plant Pathology | Year: 2016

Verticillium wilt (VW) is a fungal disease that causes severe yield losses in alfalfa. The most effective method to control the disease is the development and use of resistant varieties. The identification of marker loci linked to VW resistance can facilitate breeding for disease-resistant alfalfa. In the present investigation, we applied an integrated framework of genome-wide association with genotyping-by-sequencing (GBS) to identify VW resistance loci in a panel of elite alfalfa breeding lines. Phenotyping was performed by manual inoculation of the pathogen onto healthy seedlings, and scoring for disease resistance was carried out according to the standard test of the North American Alfalfa Improvement Conference (NAAIC). Marker-trait association by linkage disequilibrium identified 10 single nucleotide polymorphism (SNP) markers significantly associated with VW resistance. Alignment of the SNP marker sequences to the Medicago truncatula genome revealed multiple quantitative trait loci (QTLs). Three, two, one and five markers were located on chromosomes 5, 6, 7 and 8, respectively. Resistance loci found on chromosomes 7 and 8 in the present study co-localized with QTLs reported previously. A pairwise alignment (blastn) using the flanking sequences of the resistance loci against the M. truncatula genome identified potential candidate genes with putative disease resistance functions. With further investigation, these markers may be implemented in breeding programmes using marker-assisted selection, ultimately leading to improved VW resistance in alfalfa. © 2016 BSPP AND JOHN WILEY & SONS LTD.

Peace M.,University of Adelaide | McCaw L.,WA | Mills G.,University of Adelaide
Australian Meteorological and Oceanographic Journal | Year: 2012

From time to time, bushfires exhibit fire behaviour that was never anticipated in the prevailing environmental conditions. The Layman fuel-reduction burn, in scenic southwest Western Australia, was one such fire. The burn was ignited in mid-October 2010 in benign weather conditions. Late morning on the day following ignition, fire activity escalated rapidly; a convection column developed with a deep vertical circulation that extended from the surface to a height of 4 km. The ensuing intense fire with tall flames caused extensive crown scorch and defoliation, and resulted in concerns about the safety of rural communities adjoining the planned burn. The observations and meteorological model data indicate that the intense fire activity was driven by a combination of meteorological processes not routinely assessed in fire environments. Low-level sea breeze convergence in the wind field, combined with potential instability in the presence of FireCAPE, entrainment of dry air from aloft desiccating already climatologically dry fuels, and vertical circulation on a frontal change were all present. The dramatic development of the Layman burn shows how meteorological processes not currently embedded in fire science may produce an environment conducive to intense fire activity. The ways in which fire managers might incorporate the innovative meteorological products identified in this paper, in order to mitigate such events in the future, are discussed.

Allen B.L.,University of Queensland | Burrows N.D.,WA | Engeman R.M.,U.S. Department of Agriculture | Fleming P.J.,Australian Department of Primary Industries and Fisheries | Leung L.K.-P.,University of Queensland
Ecological Management and Restoration | Year: 2014

Top-predators can sometimes be important for structuring fauna assemblages in terrestrial ecosystems. Through a complex trophic cascade, the lethal control of top-predators has been predicted to elicit positive population responses from mesopredators that may in turn increase predation pressure on prey species of concern. In support of this hypothesis, many relevant research papers, opinion pieces and literature reviews identify three particular case studies as supporting evidence for top-predator control-induced release of mesopredators in Australia. However, many fundamental details essential for supporting this hypothesis are missing from these case studies, which were each designed to investigate alternative aims. Here, we re-evaluate the strength of evidence for top-predator control-induced mesopredator release from these three studies after comprehensive analyses of associated unpublished correlative and experimental data. Circumstantial evidence alluded to mesopredator releases of either the European Red Fox (Vulpes vulpes) or feral Cat (Felis catus) coinciding with Dingo (Canis lupus dingo) control in each case. Importantly, however, substantial limitations in predator population sampling techniques and/or experimental designs preclude strong assertions about the effect of lethal control on mesopredator populations from these studies. In all cases, multiple confounding factors and plausible alternative explanations for observed changes in predator populations exist. In accord with several critical reviews and a growing body of demonstrated experimental evidence on the subject, we conclude that there is an absence of reliable evidence for top-predator control-induced mesopredator release from these three case studies. Well-designed and executed studies are critical for investigating potential top-predator control-induced mesopredator release. © 2014 Ecological Society of Australia and Wiley Publishing Asia Pty Ltd.

PubMed | University of Western Ontario, Carleton University, WA and University of Waterloo
Journal: Social science & medicine (1982) | Year: 2016

This study examines perceptions and experiences of mothers, traditional birth attendants (TBA), and skilled birth attendants (SBA) regarding Ghana's recent policy that forbids TBAs from undertaking deliveries and restricts their role to referrals. In the larger context of Ghana's highly underdeveloped and geographically uneven health care system, this study draws on the political ecology of health framework to explore the ways global safe motherhood policy discourses intersect with local socio-cultural and political environments of Ghana's Upper West Region (UWR). This study reveals that futile improvements in maternal health and the continued reliance on TBAs illustrate the government's inability to understand local realities marked by poor access to SBAs or modern health care services. Using focus group discussions (FGDs) (n = 10) and in-depth interviews (IDIs) (n = 48) conducted in Ghana's UWR, the findings suggest that mothers generally perceive TBAs as better placed to conduct deliveries in rural isolated communities, where in most cases no SBAs are present or easily accessible. The results indicate that by adhering to the World Health Organization's guidelines, the local government may be imposing detrimental, unintended consequences on maternal and child health in remote rural locations. In addition, the findings suggest that the new policy has resulted in considerable confusion among TBAs, many of whom remain oblivious to, or have not been officially notified of, the new policy. Furthermore, participant accounts suggest that the new policy is seen as contributing to worsening relations and tensions between TBAs and SBAs, a situation that undermines the delivery of maternal health services in the region. The study concludes by suggesting relevant policy recommendations.

Moore S.J.,Murdoch University | O'Dea M.A.,WA | Perkins N.,AusVet Animal Health Services | Barnes A.,Murdoch University | O'Hara A.J.,Murdoch University
Journal of Veterinary Diagnostic Investigation | Year: 2014

The cause of death in 215 cattle on 20 long-haul live export voyages from Australia to the Middle East, Russia, and China was investigated between 2010 and 2012 using gross, histologic, and/or molecular pathology techniques. A quantitative reverse transcription polymerase chain reaction (qRT-PCR) assay was used to detect nucleic acids from viruses and bacteria known to be associated with respiratory disease in cattle: Bovine coronavirus (Betacoronavirus 1), Bovine herpesvirus 1, Bovine viral diarrhea virus 1 and 2, Bovine respiratory syncytial virus, Bovine parainfluenza virus 3, Histophilus somni, Mycoplasma bovis, Mannheimia haemolytica, and Pasteurella multocida. The most commonly diagnosed cause of death was respiratory disease (107/180, 59.4%), followed by lameness (n = 22, 12.2%), ketosis (n = 12, 6.7%), septicemia (n = 11, 6.1%), and enteric disease (n = 10, 5.6%). Two thirds (130/195) of animals from which lung samples were collected had histologic changes and/or positive qRT-PCR results indicative of infectious lung disease: 93 out of 130 (72%) had evidence of bacterial infection, 4 (3%) had viral infection, and 29 (22%) had mixed bacterial and viral infections, and for 4 (3%) the causative organism could not be identified. Bovine coronavirus was detected in up to 13% of cattle tested, and this finding is likely to have important implications for the management and treatment of respiratory disease in live export cattle. Results from the current study indicate that although overall mortality during live export voyages is low, further research into risk factors for developing respiratory disease is required. © 2014 The Author(s).

Benson-Davies S.,Bariatric Nutrition Consultant | Davies M.L.,WA | Kattelmann K.,South Dakota State University
Bariatric Surgical Patient Care | Year: 2013

Background: The popularity of weight loss surgery (WLS) has surged in the United States during the past decade. Although obesity-related comorbidities show improvement with WLS, there is limited research about long-term weight maintenance strategies. Methods: The purpose of this pilot study was to explore daily caloric intake and walking behaviors associated with weight maintenance in post-Roux-en-Y gastric bypass patients (n=24) two or more years postsurgery. Demographic, anthropometric, food record, and step count data were collected. Results: Weight maintenance was stable at a mean body mass index of 33.7±8 kg/m2. Weight regained from the lowest reported weight averaged 16.2±12.7 kg. Of this sample, 18 (75%) have sustained a weight loss ≥50% of their excess body weight. A mean total caloric intake of 1,429±411 calories was reported with a caloric breakdown of 43% from carbohydrate, 17% from protein, and 39% from fat. Those individuals who identified themselves as currently gaining weight (n=9) recorded a caloric intake of approximately 1,630±532 kcals and 3,217±1,155 steps per day. By comparison, those individuals who were sustaining a significant weight loss (n=15) consumed approximately 1,343±275 kcals per day and averaged 6,915±3,715 steps per day. A statistically significant difference was found between the two groups for step count (t(17)=3.81, p<0.001). The estimated caloric difference to sustain a lower weight was approximately 500 calories when calculating energy intake plus walking behaviors. Conclusion: A lower caloric intake and higher energy expenditure in walking behavior appears to have a positive association with weight stabilization following WLS. © 2013, Mary Ann Liebert, Inc.
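The abstract's ~500-calorie figure can be roughly reproduced from the reported group means. The sketch below is illustrative only: the kcal-per-step value is an assumed walking cost (on the order of 0.04 kcal/step for a heavier walker), not a figure given in the study.

```python
# Rough check of the daily energy gap between the weight-regaining and
# weight-sustaining groups, using the mean values reported in the abstract.

KCAL_PER_STEP = 0.04  # assumed energy cost of one step; NOT from the study


def daily_energy_gap(intake_gain, steps_gain, intake_sustain, steps_sustain,
                     kcal_per_step=KCAL_PER_STEP):
    """Extra intake of the regaining group plus the extra expenditure
    of the sustaining group's additional walking, in kcal/day."""
    intake_diff = intake_gain - intake_sustain            # calories eaten above the sustainers
    walking_diff = (steps_sustain - steps_gain) * kcal_per_step  # calories the sustainers walk off
    return intake_diff + walking_diff


# Regaining group (n=9): ~1,630 kcal, ~3,217 steps/day.
# Sustaining group (n=15): ~1,343 kcal, ~6,915 steps/day.
gap = daily_energy_gap(1630, 3217, 1343, 6915)
print(round(gap))  # ~435 kcal/day, broadly consistent with the ~500 kcal estimate
```

The intake difference alone is 287 kcal; the remainder of the gap comes from the sustaining group's roughly 3,700 additional daily steps, so the total depends on the per-step cost assumed.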

A key conservation issue in north-western Australia is recent declines in biodiversity, especially among the nationally threatened critical weight range (35 g-5 kg) mammals. Changed fire regimes are implicated as a cause of these declines, but it is unclear whether declines are related to fire, or to other key threatening processes. In this review, historical and scientific evidence for fire-driven declines is examined and critically evaluated. Data suggest we cannot confidently attribute biodiversity declines to fire based on available evidence. This is because historical evidence is circumstantial only, and because scientific evidence showing changes in abundance relating to fire regime may not relate to regional-scale declines and range contractions. A way forward in understanding factors driving declines is investigation of key mechanisms underlying fire effects. The importance of correct diagnosis of mechanisms is emphasised, as incorrect assumptions can lead to inappropriate management of declining species. Three hypotheses about key mechanisms are raised, based on general conservation biology approaches for threatened species, and also on evidence gained from northern Australian ecological studies. These are: 1) that declines are driven by increased predation mortality through repeated removal and simplification of vegetation cover by severe fire regimes; 2) that declines are driven by resource limitations caused by too-frequent fires; and 3) that declines are driven by failure to retain sufficient source breeding populations in optimal habitats (e.g. unburnt patches) within savanna landscapes for the continued persistence of fire-sensitive species. I suggest prescribed burning operations should aim to explicitly retain long-unburnt vegetation patches (>3 years, >1 ha) frequently within the landscape. Our lack of knowledge of key mechanisms driving declines, and evidence that threatened species are fire-sensitive, suggests that indiscriminate application of fire mosaics may be harmful to some threatened species. © The Government of Western Australia, 2010.

Stanton J.H.,WA | Speijers J.,WA | Naylor G.R.S.,CSIRO | Pieruzzini S.,WA | And 3 more authors.
Textile Research Journal | Year: 2014

An important role of garments is to provide adequate comfort. A study was undertaken of the sensory scores for perceived comfort of wool base layer long sleeve knitted T shirts. This paper, the first in a series, describes and evaluates the wearer trial protocol, in which untrained female wearers scored tactile, thermal, and moisture-based sensations during a controlled series of activities in a range of controlled climatic environments. Wearer scores were sufficiently consistent that significant differences in aggregate scores between garments were detected, reflecting changes in fiber type (wool, cashmere, and cotton) and fiber specifications. Prickle and discomfort scores responded to different factors. The importance of choosing appropriate test conditions when assessing garments for particular end uses was highlighted, as both the environment and activity affected wearers' perception of garment performance. A novel test feature was the use of a 'link' garment common to separate trials. This, combined with the observed absence of an effect due to garment washing, enabled the testing to be expanded so that 38 garments were successfully compared over 30 months in nine trials. Finally, while the first trial used 43 wearers to obtain good estimates of absolute comfort levels, it was demonstrated that a reduction to 25 wearers was adequate for later trials, with minimal loss in sensitivity. © The Author(s) 2014.

Smith-Lock K.,Macquarie University | Leitao S.,Curtin University Australia | Lambert L.,WA | Prior P.,WA | And 4 more authors.
International Journal of Speech-Language Pathology | Year: 2013

This study compared the effectiveness of a school-based treatment for expressive grammar in 5-year-olds with specific language impairment delivered at two different dose frequencies: eight sessions delivered daily over 8 consecutive school days, or eight sessions delivered weekly over 8 consecutive weeks. Eighteen children received treatment daily and 13 children received treatment weekly. In both groups, treatment consisted of eight 1-hour sessions of small-group activities in a classroom setting. Techniques included explicit instruction, focused stimulation, recasting, and imitation. Results were analysed at the group level and as a case series, with each child as their own control in a single-subject design. The 8-week group showed significantly greater gain in test scores over the treatment period than in an equal time period prior to treatment, whereas the 8-day group did not (Cohen's d = 1.64 for the 8-week group). Single-subject analyses indicated that 46% of children in the 8-week group and 17% of children in the 8-day group showed a significant treatment effect. It is concluded that expressive grammar treatment was most effective when dose frequency was weekly over 8 weeks, rather than daily over 8 days, for 5-year-old children with specific language impairment. © 2013 The Speech Pathology Association of Australia Limited.

Misich I.,WA
Rock Engineering in Difficult Ground Conditions - Soft Rocks and Karst - Proceedings of the Regional Symposium of the International Society for Rock Mechanics, EUROCK 2009 | Year: 2010

An investigation of a subsidence event in the Collie Basin of Western Australia, nearly 50 years after mine closure, has determined that the likely mechanism of subsidence development was the collapse of weak sandstone in the mine roof. The implication of this recent subsidence event is significant for the local region; there is far more potential for mining subsidence above other abandoned underground mines with similar mining/geological environments than was previously expected. Minesites in other geographical locations that are susceptible to roof collapse over time are also potentially susceptible to this mode of subsidence development, depending on the height of the collapse and the rock mass strength of the materials within the roof horizon. © 2010 Taylor & Francis Group, London.
