Sacre Coeur Hospital of Montreal

Montréal, Canada


Troyanov S., Sacre Coeur Hospital of Montreal | Bellur S., University of Oxford | Verhave J.C., Sacre Coeur Hospital of Montreal | Cook H.T., Imperial College London | And 3 more authors.
Journal of the American Society of Nephrology | Year: 2015

Current guidelines suggest treatment with corticosteroids (CS) in IgA nephropathy (IgAN) when proteinuria is persistently ≥1 g/d despite 3-6 months of supportive care and when eGFR is >50 ml/min per 1.73 m2. Whether the benefits of this treatment extend to patients with an eGFR ≤50 ml/min per 1.73 m2, other levels of proteinuria, or different renal pathologic lesions remains unknown. We retrospectively studied 1147 patients with IgAN from the European Validation Study of the Oxford Classification of IgAN (VALIGA) cohort, classified according to the Oxford-MEST classification and the medication used, with details of duration but not dosing. Overall, 46% of patients received immunosuppression, of whom 98% received CS. Treated individuals presented with greater clinical and pathologic risk factors of progression. They also received more antihypertensive medication, and a greater proportion received renin-angiotensin system blockade (RASB), compared with individuals without immunosuppressive therapy. Immunosuppression was associated with a significant reduction in proteinuria, a slower rate of renal function decline, and greater renal survival. Using a propensity score, we matched 184 subjects who received CS and RASB to 184 patients with a similar risk profile of progression who received only RASB. Within this group, CS reduced proteinuria and the rate of renal function decline and increased renal survival. These benefits extended to those with an eGFR ≤50 ml/min per 1.73 m2, and they increased proportionally with the level of proteinuria. Thus, CS reduced the risk of progression regardless of initial eGFR and in direct proportion to the extent of proteinuria in this cohort. Copyright © 2015 by the American Society of Nephrology.
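
The propensity-score matching step can be illustrated with a minimal sketch (not the VALIGA authors' code): a logistic regression estimates each patient's probability of receiving corticosteroids from a few hypothetical baseline covariates, and each treated patient is then greedily paired with the nearest untreated patient on the logit of that score.

```python
# Minimal, illustrative 1:1 propensity-score matching sketch.
# Column names (e.g. egfr, proteinuria, treated) are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_one_to_one(df: pd.DataFrame, covariates, treat_col="treated", caliper=0.1):
    """Greedy nearest-neighbour matching on the logit of the propensity score."""
    X, y = df[covariates].to_numpy(), df[treat_col].to_numpy()
    ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    ps = np.clip(ps, 1e-6, 1 - 1e-6)                 # avoid infinite logits
    logit = pd.Series(np.log(ps / (1 - ps)), index=df.index)
    treated = list(df.index[y == 1])
    controls = set(df.index[y == 0])
    pairs = []
    for t in treated:
        if not controls:
            break
        c = min(controls, key=lambda i: abs(logit[i] - logit[t]))
        if abs(logit[c] - logit[t]) <= caliper:      # caliper value is arbitrary here
            pairs.append((t, c))
            controls.remove(c)
    return pairs  # list of (treated_index, matched_control_index)
```

Renal survival and slope of decline can then be compared within the matched pairs, which is the logic of the comparison reported above.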


Camilla R., Sacre Coeur Hospital of Montreal | Brachemi S., CHUM | Pichette V., Maisonneuve Rosemont Hospital | Cartier P., University of Montréal | And 4 more authors.
Journal of Nephrology | Year: 2011

Background: Reliable biomarkers are needed to identify patients with glomerular disease at risk of progression. Transforming growth factor beta 1 (TGF-β1) and monocyte chemotactic protein 1 (MCP-1) play key roles in promoting renal tissue injury. Whether their urinary measurement adds value to current predictors of progression is uncertain. Methods: We enrolled patients with diabetic (n=53) and nondiabetic (n=47) proteinuric renal disease and retrospectively studied their rate of renal function decline over a defined period of 2 years. We simultaneously measured urinary protein, MCP-1 and TGF-β1, standardized to urinary creatinine. Results: The initial estimated glomerular filtration rate, proteinuria and rate of renal function decline (slope) were 36 ml/min per 1.73 m2, 1.1 g/day and -4.0 ± 7.2 ml/min per 1.73 m2 per year, respectively. Median urinary TGF-β1 and MCP-1 levels were 0.3 (range 0.0-28.1) and 18 (range 3-370) ng/mmol of creatinine, respectively. Urinary protein and MCP-1 to creatinine ratios were associated with the slope, both in diabetic and in nondiabetic patients separately. Urinary TGF-β1 showed no relation to the slope; however, the majority of its measurements were below the suggested reproducibility threshold. In linear regression, both normalized urinary protein and MCP-1 were independently associated with the slope. Adding urinary MCP-1 to the model significantly raised the adjusted R2 from 0.35 to 0.40, refining patient risk stratification. Using cutoffs for urinary protein and MCP-1 obtained from receiver operating characteristic curves, the risk of progression was confidently determined in 80% of patients. Conclusion: Urinary MCP-1 is a marker of renal function decline in diabetic and nondiabetic proteinuric renal disease, independent of and additive to proteinuria. © 2011 Società Italiana di Nefrologia.
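
As a rough sketch of the kind of analysis described above (hypothetical column names, not the study's code): the urinary markers are divided by urinary creatinine, and the adjusted R2 of a linear model for the eGFR slope is compared with and without the MCP-1 term.

```python
# Sketch: normalise urinary markers to creatinine, then test whether urinary
# MCP-1 adds explanatory power for the eGFR slope beyond proteinuria alone.
# Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

def adjusted_r2_gain(df: pd.DataFrame):
    df = df.assign(
        upcr=df["urine_protein"] / df["urine_creatinine"],    # protein/creatinine ratio
        mcp1_cr=df["urine_mcp1"] / df["urine_creatinine"],    # MCP-1/creatinine ratio
    )
    y = df["egfr_slope"]
    base = sm.OLS(y, sm.add_constant(df[["upcr"]])).fit()
    full = sm.OLS(y, sm.add_constant(df[["upcr", "mcp1_cr"]])).fit()
    return base.rsquared_adj, full.rsquared_adj   # the study reports ~0.35 vs ~0.40
```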


Chauny J.-M., Sacre Coeur Hospital of Montreal | Chauny J.-M., University of Montréal | Paquet J., Sacre Coeur Hospital of Montreal | Lavigne G., Sacre Coeur Hospital of Montreal | And 4 more authors.
Pain | Year: 2016

Percentage of pain intensity difference (PercentPID) is a recognized way of evaluating pain relief on an 11-point numerical rating scale (NRS) but is not without flaws. A new metric, the slope of relative pain intensity difference (SlopePID), which consists of dividing PercentPID by the time between 2 pain measurements, is proposed. This study aims to validate SlopePID against 3 measures of subjective pain relief: a 5-category relief scale (not, a little, moderate, very, complete), a 2-category relief question ("I'm relieved," "I'm not relieved"), and a single-item question, "Wanting other medication to treat pain?" (Yes/No). This prospective cohort study included 361 patients in the emergency department who had an initial acute pain NRS > 3 and a pain intensity assessment within 90 minutes after analgesic administration. Mean age was 50.2 years (SD 19.3) and 59% were women. Areas under the curve from receiver operating characteristic analyses revealed similar discriminative power for PercentPID (0.83; 95% confidence interval [CI], 0.79-0.88) and SlopePID (0.82; 95% CI, 0.77-0.86). Considering the "very" category from the 5-category relief scale as substantial relief, the average cutoff for substantial relief was a decrease of 64% (95% CI, 59-69) for PercentPID and of 49% per hour (95% CI, 44-54) for SlopePID. However, when a cutoff criterion of 50% was used as a measure of pain relief for an individual patient, PercentPID underestimated the proportion of pain-relieved patients by 12.1% (P < 0.05) compared with SlopePID when baseline pain intensity was an odd rather than an even number (32.9% vs 45.0%, respectively). SlopePID should be used instead of PercentPID as a metric to evaluate acute pain relief on a 0 to 10 NRS. © 2015 International Association for the Study of Pain.
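
The two metrics are simple enough to write down directly; the short sketch below (function names are mine) also illustrates why a fixed 50% cutoff behaves differently for odd and even integer baselines on the NRS.

```python
# Sketch of the two relief metrics described above (function names are mine).
def percent_pid(nrs_before: int, nrs_after: int) -> float:
    """Percentage of pain intensity difference on the 0-10 NRS."""
    return 100.0 * (nrs_before - nrs_after) / nrs_before

def slope_pid(nrs_before: int, nrs_after: int, hours_between: float) -> float:
    """PercentPID divided by the time (in hours) between the two measurements."""
    return percent_pid(nrs_before, nrs_after) / hours_between

# A 4-point drop from an even baseline of 8 lands exactly on a 50% cutoff,
# while from an odd baseline of 7 a 3-point drop gives only 43% and a 4-point
# drop overshoots to 57%: one way an integer scale biases a fixed-percentage rule.
print(percent_pid(8, 4), percent_pid(7, 4), percent_pid(7, 3))  # 50.0, 42.9, 57.1
print(slope_pid(8, 4, hours_between=1.5))                        # 33.3 % per hour
```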


Chapdelaine S., Sacre Coeur Hospital of Montreal | Chapdelaine S., University of Montréal | Paquet J., Sacre Coeur Hospital of Montreal | Dumont M., Sacre Coeur Hospital of Montreal | Dumont M., University of Montréal
Journal of Sleep Research | Year: 2012

In most situations, complete circadian adjustment is not recommended for night workers. With complete adjustment, workers experience circadian misalignment when returning to a day-active schedule, causing repeated circadian phase shifts and internal desynchrony. For this reason, partial circadian realignment was proposed as a good compromise to stabilize internal circadian rhythms in night shift workers. However, the extent of partial circadian adjustment necessary to improve sleep and vigilance quality is still a matter of debate. In this study, the effects of small but statistically significant partial circadian adjustments on sleep and vigilance quality were assessed in a laboratory simulation of night work to determine whether they were also of clinical significance. Partial adjustments obtained by phase delay or by phase advance were quantified not only by the phase shift of dim light salivary melatonin onset, but also by the overlap of the episode of melatonin production with the sleep-wake cycle adopted during simulated night work. The effects on daytime sleep and night-time vigilance quality were modest. However, they suggest that even small adjustments by phase delay may decrease the accumulation of sleep debt, whereas the advance strategy improves subjective alertness and mood during night work. Furthermore, absolute phase shifts, by advance or by delay, were associated with improved subjective alertness and mood during the night shift. These strategies need to be tested in the field, to determine whether they can be adapted to real-life situations and provide effective support to night workers. © 2012 European Sleep Research Society.


Dumont M., Sacre Coeur Hospital of Montreal | Dumont M., University of Montréal | Lanctôt V., Sacre Coeur Hospital of Montreal | Cadieux-Viau R., Sacre Coeur Hospital of Montreal | Paquet J., Sacre Coeur Hospital of Montreal
Chronobiology International | Year: 2012

Decreased melatonin production, due to acute suppression of pineal melatonin secretion by light exposure during night work, has been suggested to underlie higher cancer risks associated with prolonged experience of night work. However, the association between light exposure and melatonin production has never been measured in the field. In this study, 24-h melatonin production and ambulatory light exposure were assessed during both night-shift and day/evening-shift periods in 13 full-time rotating shiftworkers. Melatonin production was estimated with the excretion of urinary 6-sulfatoxymelatonin (aMT6s), and light exposure was measured with an ambulatory photometer. There was no difference in total 24-h aMT6s excretion between the two work periods. The night-shift period was characterized by a desynchrony between melatonin and sleep-wake rhythms, as shown by higher melatonin production during work and lower melatonin production during sleep when working night shifts than when working day/evening shifts. Light exposure during night work showed no correlation with aMT6s excreted during the night of work (p > .5), or with the difference in 24-h aMT6s excretion between the two work periods (p > .1). However, light exposure during night work was negatively correlated with total 24-h aMT6s excretion over the entire night-shift period (p < .01). In conclusion, there was no evidence of direct melatonin suppression during night work in this population. However, higher levels of light exposure during night work may have decreased total melatonin production, possibly by initiating re-entrainment and causing internal desynchrony. This interpretation is consistent with the proposition that circadian disruption, of which decreased melatonin production is only one of the adverse consequences, could be the mediator between night shiftwork and cancer risks. © 2012 Informa Healthcare USA, Inc.


Daoust R., Sacre Coeur Hospital of Montreal | Daoust R., University of Montréal | Paquet J., Sacre Coeur Hospital of Montreal | Lavigne G., University of Montréal | And 4 more authors.
American Journal of Emergency Medicine | Year: 2014

Study objective: Delayed pain treatment is a common problem in emergency departments (EDs). The objective of this study was to examine the effect of age on the time to receiving the first analgesic dose for moderate to severe pain in the ED. Methods: Real-time, archived data from a tertiary urban hospital and a secondary regional hospital were analyzed post hoc. We included all consecutive adult ED patients (> 18 years) on stretchers whose pain intensity was at least 4 (0-10, verbal numerical scale) at triage between March 2008 and December 2012. The primary outcome was the time from the beginning of triage to analgesic medication in seniors (> 65 years) compared with younger patients. Results: A total of 34,213 patients (56% women) were triaged to an ED bed with a mean pain intensity of 7.6 (SD ± 1.8). Analgesics were administered to 20,486 patients (59.9%) in a median time of 2.3 hours (interquartile range [IQR] = 3.6). The median time for seniors to receive analgesics was 3.2 hours (IQR = 5.1) compared with 2.1 hours (IQR = 3.1, effect size = 0.19) for younger patients. This represents a 55.2% increase in time to analgesic for seniors. Seniors waited 12 minutes longer to be evaluated by a physician, 20 minutes longer for analgesic prescription, and 35 minutes longer for medication administration. After controlling for confounding factors, they still waited longer to receive pain medication (hazard ratio = 1.37; 95% confidence interval, 1.32-1.42) than younger patients. Conclusion: Seniors with moderate to severe pain wait 1.1 hours (55.2%) longer than younger patients to receive analgesics. Physicians and nurses (32 and 35 minutes, respectively) contributed to this disparity. © 2014 Elsevier Inc.
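
An adjusted comparison of this kind is often done with a Cox model on the time to first analgesic; the sketch below uses the lifelines package with hypothetical column names and covariates, and is not the study's actual model specification.

```python
# Illustrative Cox model for time to first analgesic (lifelines package).
# Column names and covariates are hypothetical, not the study's specification.
import pandas as pd
from lifelines import CoxPHFitter

def senior_effect(df: pd.DataFrame) -> float:
    df = df.assign(senior=(df["age"] > 65).astype(int))
    cols = ["time_to_analgesic_h", "received_analgesic", "senior",
            "pain_intensity", "triage_level"]
    cph = CoxPHFitter()
    cph.fit(df[cols], duration_col="time_to_analgesic_h",
            event_col="received_analgesic")
    # Adjusted hazard ratio for the senior indicator; interpretation depends
    # on how the event (receipt of analgesic) is coded.
    return float(cph.hazard_ratios_["senior"])
```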


Dumont M., Sacre Coeur Hospital of Montreal | Dumont M., University of Montréal | Paquet J., Sacre Coeur Hospital of Montreal
Chronobiology International | Year: 2014

Decreased melatonin production, due to nighttime exposure to light, has been proposed as one of the physiological mechanisms increasing cancer risk in night workers. However, few studies measured melatonin production in night workers, and most of these studies did not measure melatonin over 24 h. One study compared total melatonin production between day and night shifts in rotating night workers and did not find significant differences. However, without baseline measures, it was not possible to exclude that melatonin production was reduced during both day and night work. Here, we used data collected in a simulation study of night work to determine the effect of night work on both nighttime and 24-h melatonin production, during three consecutive days of simulated night work. Thirty-eight healthy subjects (15 men, 23 women; 26.6 ± 4.2 years) participated in a 6-d laboratory study. Circadian phase assessments were made with salivary dim light melatonin onset (DLMO) on the first and last days. Simulated day work (09:00-17:00 h) occurred on the second day, followed by three consecutive days of simulated night work (00:00-08:00 h). Light intensity at eye level was set at 50 lux during both simulated day and night work. The subjects were divided into three matched groups exposed to specific daytime light profiles that produced various degrees of circadian phase delays and phase advances. Melatonin production was estimated with the excretion of urinary 6-sulfatoxymelatonin (aMT6s). For the entire protocol, urine was collected every 2 h, except for the sleep episodes when the interval was 8 h. The aMT6s concentration in each sample was multiplied by the urine volume and then added to obtain total aMT6s excretion during nighttime (00:00-08:00 h) and during each 24-h day (00:00-00:00 h). The results showed that melatonin production progressively decreased over consecutive days of simulated night work, both during nighttime and over the 24 h. This decrease was larger in women using oral contraceptives. There was no difference between the three groups, and the magnitude of the decrease in melatonin production for nighttime and for the 24 h was not associated with the magnitude of the absolute circadian phase shift. As light intensity was relatively low and because the decrease in melatonin production was progressive, direct suppression by nighttime light exposure was probably not a significant factor. However, according to previous experimental observations, the decrease in melatonin production most likely reflects the circadian disruption associated with the process of re-entrainment. It remains to be determined whether reduced melatonin production can be harmful by itself, but long-term and repeated circadian disruption most probably is. © 2014 Informa Healthcare USA, Inc.
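
The excretion arithmetic described above (each sample's concentration multiplied by its urine volume, then summed over a clock-time window) can be sketched as follows; the field names are mine.

```python
# Sketch of the aMT6s excretion calculation described above: each sample's
# concentration is multiplied by its volume, and samples collected within the
# clock-time window are summed. Field names are hypothetical.
from datetime import datetime, time

def total_amt6s(samples, start: time, end: time) -> float:
    """samples: iterable of dicts with 'collected' (datetime), 'conc_ng_ml', 'volume_ml'.
    Returns total aMT6s (ng) excreted between the start and end clock times."""
    total = 0.0
    for s in samples:
        t = s["collected"].time()
        # start == end is treated as a full 24-h window (00:00-00:00 h)
        in_window = (start <= t < end) if start < end else (t >= start or t < end)
        if in_window:
            total += s["conc_ng_ml"] * s["volume_ml"]
    return total

# nighttime = total_amt6s(samples, time(0, 0), time(8, 0))   # 00:00-08:00 h
# whole_day = total_amt6s(samples, time(0, 0), time(0, 0))   # 24-h total
```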


PubMed | Sacre Coeur Hospital of Montreal
Type: Journal Article | Journal: Nephron | Year: 2015

Recent acute kidney injury (AKI) guidelines, based on studies performed a decade ago, recommend avoiding aminoglycosides (AGs) in patients at risk of AKI. Whether present patient characteristics and management have changed this risk is uncertain. We determined the current incidence, risk factors and outcomes of AG-AKI. We retrospectively identified adult patients who received gentamicin or tobramycin for ≥5 days in 2 large university-affiliated centers, excluding critically ill and dialysis patients. We assessed the incidence of AKI according to the Risk, Injury, Failure, Loss and End-stage kidney disease (RIFLE) criteria and then matched each AKI case to 2 controls of the same age and gender to determine the factors associated with AG-AKI and with its recovery, defined by a creatinine within 150% of baseline by 21 days. Since 2001, the frequency of AG administration and dosing declined, but the incidence of AG-AKI remained constant. Of the 562 patients who received AG for ≥5 days, 65 (12%) developed AG-AKI after 11 (IQR 8-15) days, with 56, 29 and 15% having stages 1, 2 and 3 AKI, respectively. We matched these to 130 controls. In this nested case-control study, independent AKI risk factors were vancomycin coadministration, high AG trough levels and heart failure. AG-AKI, compared with AG exposure without AKI, was associated with greater mortality. Renal recovery occurred in 51% of the AKI patients and was less likely with heart failure and higher AKI severity. AG administration has recently decreased, but the risk of AKI remained unchanged and half of the patients did not recover. Vancomycin coadministration, high AG trough levels and heart failure independently predicted AKI.
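
A minimal sketch of the staging and recovery rules mentioned above: the recovery criterion (creatinine within 150% of baseline by day 21) is quoted from the abstract, while the stage cutoffs (1.5x, 2x, 3x baseline creatinine) follow the commonly cited RIFLE creatinine multiples and are an assumption here.

```python
# Sketch of creatinine-based AKI staging and the recovery rule used above.
# The 1.5x/2x/3x cutoffs follow the usual RIFLE creatinine criteria (assumption);
# the recovery rule (creatinine within 150% of baseline by day 21) is from the abstract.
def aki_stage(baseline_cr: float, peak_cr: float) -> int:
    ratio = peak_cr / baseline_cr
    if ratio >= 3.0:
        return 3   # Failure
    if ratio >= 2.0:
        return 2   # Injury
    if ratio >= 1.5:
        return 1   # Risk
    return 0       # no AKI by the creatinine criterion

def recovered(baseline_cr: float, day21_cr: float) -> bool:
    """Recovery: serum creatinine back within 150% of baseline by day 21."""
    return day21_cr <= 1.5 * baseline_cr
```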


PubMed | Sacre Coeur Hospital of Montreal and Montreal Heart Institute
Type: Journal Article | Journal: Trauma monthly | Year: 2015

Multiple classifications can be used to define the magnitude of aortic injury. The Vancouver Classification (VC) is a new and simplified computed tomography-based Blunt Aortic Injury (BAI) grading system that correlates with clinical outcomes. The objectives of this study were: 1) to describe the severity of aortic injury in a center with a predominantly surgical approach to BAI; 2) to correlate the severity of aortic trauma with hospital survival rate and the rate of adverse events according to the type of interventions performed during the hospital stay; and 3) to evaluate the VC. All patients referred to the Sacre-Coeur Hospital of Montreal between August 1998 and April 2011 for management of BAI were studied. Two radiologists reviewed all CT scan images individually and classified the aortic injuries using the VC. Among the 112 patients presenting with BAI, 39 cases had local CT scans available for reconstruction. Seven patients were identified as suffering from grade I injuries (flap or thrombus of less than 1 cm), 6 from grade II injuries (flap or thrombus of more than 1 cm), and 26 from grade III injuries (pseudoaneurysm). Among the patients with grade I injuries, 57% were treated surgically and 43% medically, with a survival rate of 100%. Among the patients with grade II injuries (67% treated surgically and 33% medically), survival was also 100%. Among patients with grade III injuries (85% treated surgically, 7% with Thoracic Endovascular Aortic Repair (TEVAR) and 8% medically), survival was 95%, 95% and 50%, respectively. There were no significant differences between groups as to clinical outcome. Inter-rater reliability was 0.81. The VC is easy to use and has low inter-observer variability. Low grades of injury were associated with low mortality with medical treatment.
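
The grading rules quoted in the abstract translate directly into a small helper; Cohen's kappa is used below for the two-reader agreement, which is an assumption, since the abstract reports inter-rater reliability without naming the statistic.

```python
# Sketch of the Vancouver Classification grades as described in the abstract.
# Cohen's kappa for the two-reader agreement is an assumption; the abstract
# reports inter-rater reliability (0.81) without naming the statistic.
from sklearn.metrics import cohen_kappa_score

def vc_grade(finding: str, flap_or_thrombus_cm: float = 0.0) -> int:
    if finding == "pseudoaneurysm":
        return 3                                        # grade III
    if finding == "flap_or_thrombus":
        return 2 if flap_or_thrombus_cm > 1.0 else 1    # grade II vs grade I
    raise ValueError(f"unrecognised finding: {finding!r}")

# reader_a, reader_b = lists of grades assigned by each radiologist to the same scans
# agreement = cohen_kappa_score(reader_a, reader_b)
```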
