Regoli F.,Fondazione Cardiocentro Ticino | Scopigni F.,Fondazione Cardiocentro Ticino | Leyva F.,The Good | Landolina M.,Fondazione IRCCS Policlinico San Matteo | And 5 more authors.
European Journal of Heart Failure | Year: 2013

Aims: Survival prediction by the Seattle Heart Failure Model (SHFM) in patients treated with cardiac resynchronization therapy (CRT) remains ill defined. The performance of the SHFM in this clinical setting was therefore evaluated. Methods and results: Data from 1309 consecutive CRT patients (five centres) were collected retrospectively; 1139 of these patients were considered for analysis. Three hundred and seven deaths occurred over 40.1 months (interquartile range 25.2-60.0 months; mean event rate 9.7%/year; survival of 89, 81, and 64% at 1, 2, and 5 years). Kaplan-Meier event-free survival analysis stratified according to tertile of SHFM score was significant (log-rank test P < 0.001). High-risk tertile (T1) survival was 82, 67, and 46% at 1, 2, and 5 years, respectively. Observed compared with SHFM-predicted mortality was 0.11 vs. 0.08, 0.19 vs. 0.16, and 0.36 vs. 0.36 at 1, 2, and 5 years. Model discrimination by c-statistic was 0.64; the logistic models' area under the receiver operating characteristic curve (AUC-ROC) across risk tertiles was 0.66, 0.68, and 0.67 at 1, 2, and 5 years. Compared with the other two groups, T1 was globally more compromised. Within the T1 group, independent predictors of death were male gender, ischaemic heart failure aetiology, lower body weight, and CRT pacemaker. Conclusions: SHFM performance was modest, tending to overestimate survival. However, the SHFM identified a high-risk, globally more compromised patient subgroup, supporting a comprehensive approach that should include nutritional, metabolic, and immunological aspects, as well as defibrillator back-up. © 2012 The Author.

Rondanelli M.,University of Sfax | Rondanelli M.,University of Pavia | Opizzi A.,University of Sfax | Opizzi A.,University of Pavia | And 5 more authors.
Journal of the American Geriatrics Society | Year: 2011

OBJECTIVES: To determine whether nightly administration of melatonin, magnesium, and zinc improves primary insomnia in long-term care facility residents. DESIGN: Double-blind, placebo-controlled clinical trial. SETTING: One long-term care facility in Pavia, Italy. PARTICIPANTS: Forty-three participants with primary insomnia (22 in the supplemented group, 21 in the placebo group) aged 78.3±3.9. INTERVENTION: Participants took a food supplement (5 mg melatonin, 225 mg magnesium, and 11.25 mg zinc, mixed with 100 g of pear pulp) or placebo (100 g pear pulp) every day for 8 weeks, 1 hour before bedtime. MEASUREMENTS: The primary outcome was sleep quality, evaluated using the Pittsburgh Sleep Quality Index (PSQI). The Epworth Sleepiness Scale, the Leeds Sleep Evaluation Questionnaire (LSEQ), the Short Insomnia Questionnaire (SDQ), and a validated quality-of-life instrument (Medical Outcomes Study 36-item Short Form Survey (SF-36)) were administered as secondary end points. Total sleep time was evaluated using a wearable armband-shaped sensor. All measures were performed at baseline and after 60 days. RESULTS: The food supplement resulted in considerably better overall PSQI scores than placebo (difference between groups in change from baseline PSQI score=6.8; 95% confidence interval=5.4-8.3, P<.001). Moreover, the significant improvements in all four domains of the LSEQ (ease of getting to sleep, P<.001; quality of sleep, P<.001; hangover on awakening from sleep, P=.005; alertness and behavioral integrity the following morning, P=.001), in SDQ score (P<.001), in total sleep time (P<.001), and in SF-36 physical score (P=.006) suggest that treatment had a beneficial effect on the restorative value of sleep. CONCLUSION: The nightly administration of melatonin, magnesium, and zinc appears to improve the quality of sleep and the quality of life in long-term care facility residents with primary insomnia. © 2011, The American Geriatrics Society.

Rindi G.,Catholic University of the Sacred Heart | Falconi M.,University of Verona | Klersy C.,Service of Biometry and Clinical Epidemiology | Albarello L.,San Raffaele Scientific Institute | And 21 more authors.
Journal of the National Cancer Institute | Year: 2012

Background: Both the European Neuroendocrine Tumor Society (ENETS) and the International Union for Cancer Control/American Joint Cancer Committee/World Health Organization (UICC/AJCC/WHO) have proposed TNM staging systems for pancreatic neuroendocrine neoplasms. This study aims to identify the most accurate and useful TNM system for pancreatic neuroendocrine neoplasms. Methods: The study included 1072 patients who had undergone surgery for their cancer and for whom at least 2 years of follow-up from 1990 to 2007 was available. Data on 28 variables were collected, and the performance of the two TNM staging systems was compared by Cox regression analysis and multivariable analyses. All statistical tests were two-sided. Results: Differences in the distribution of sex and age were observed for the ENETS TNM staging system. At Cox regression analysis, only the ENETS TNM staging system perfectly allocated patients into four statistically significantly different and equally populated risk groups (with stage I as the reference: stage II hazard ratio [HR] of death = 16.23, 95% confidence interval [CI] = 2.14 to 123, P = .007; stage III HR of death = 51.81, 95% CI = 7.11 to 377, P < .001; and stage IV HR of death = 160, 95% CI = 22.30 to 1143, P < .001). In contrast, the UICC/AJCC/WHO 2010 TNM staging system compressed the disease into three differently populated classes, with most patients in stage I and the remaining patients equally distributed into stages II-III (statistically similar) and IV (with stage I as the reference: stage II HR of death = 9.57, 95% CI = 4.62 to 19.88, P < .001; stage III HR of death = 9.32, 95% CI = 3.69 to 23.53, P = .94; and stage IV HR of death = 30.84, 95% CI = 15.62 to 60.87, P < .001). Multivariable modeling indicated that curative surgery, TNM staging, and grading were effective predictors of death, and grading was the second most effective independent predictor of survival in the absence of staging information. Although both TNM staging systems were independent predictors of survival, the UICC/AJCC/WHO 2010 TNM stages showed very large 95% confidence intervals for each stage, indicating inaccurate predictive ability. Conclusion: Our data suggest that the ENETS TNM staging system is superior to the UICC/AJCC/WHO 2010 TNM staging system and support its use in clinical practice. © The Author(s) 2012.

Rindi G.,Catholic University of the Sacred Heart | Klersy C.,Service of Biometry and Clinical Epidemiology | Inzani F.,Catholic University of the Sacred Heart | Fellegara G.,Centro Diagnostico Italiano | And 17 more authors.
Endocrine-Related Cancer | Year: 2014

Lung neuroendocrine tumors are catalogued into four categories by the World Health Organization (WHO 2004) classification, whose reproducibility and prognostic efficacy have been disputed. The WHO 2010 classification of digestive neuroendocrine neoplasms is based on Ki67 proliferation assessment and has proved prognostically effective. This study aims at comparing these two classifications and at defining a prognostic grading system for lung neuroendocrine tumors. The study included 399 patients who underwent surgery between 1989 and 2011 and had at least 1 year of follow-up. Data on 21 variables were collected, and the performance of the grading systems and their components was compared by Cox regression and multivariable analyses. All statistical tests were two-sided. At Cox analysis, WHO 2004 stratified patients into three major groups with statistically significant survival differences (typical carcinoid vs atypical carcinoid (AC), P=0.021; AC vs large-cell/small-cell lung neuroendocrine carcinomas, P<0.001). Optimal discrimination into three groups was observed for the Ki67 index (Ki67% cutoffs: G1 <4, G2 4-<25, G3 ≥25; G1 vs G2, P=0.021; G2 vs G3, P≤0.001), mitotic count (G1 ≤2, G2 >2-47, G3 >47; G1 vs G2, P≤0.001; G2 vs G3, P≤0.001), and presence of necrosis (G1 absent, G2 <10% of sample, G3 >10% of sample; G1 vs G2, P≤0.001; G2 vs G3, P≤0.001) at uni- and multivariable analyses. The combination of these three variables resulted in a simple and effective grading system. A three-tier grading system based on Ki67 index, mitotic count, and necrosis, with cutoffs specifically generated for lung neuroendocrine tumors, is prognostically effective and accurate. © 2014 Society for Endocrinology.
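The cutoffs reported above amount to a simple three-tier classification rule. The sketch below is purely illustrative and is not the authors' code: how the three component grades combine into a final grade is assumed here (worst of the three), as is the handling of borderline values such as necrosis of exactly 10%.

```python
# Illustrative encoding of the three-tier cutoffs reported in the abstract.
# Combination rule (final grade = highest component grade) is an assumption.

def grade_ki67(ki67_pct: float) -> int:
    """Ki67 index (%): G1 <4, G2 4-<25, G3 >=25."""
    if ki67_pct < 4:
        return 1
    if ki67_pct < 25:
        return 2
    return 3

def grade_mitoses(mitotic_count: float) -> int:
    """Mitotic count: G1 <=2, G2 >2-47, G3 >47."""
    if mitotic_count <= 2:
        return 1
    if mitotic_count <= 47:
        return 2
    return 3

def grade_necrosis(necrosis_pct: float) -> int:
    """Necrosis: G1 absent, G2 <10% of sample, G3 >10% of sample."""
    if necrosis_pct == 0:
        return 1
    if necrosis_pct < 10:
        return 2
    return 3

def overall_grade(ki67_pct: float, mitotic_count: float, necrosis_pct: float) -> int:
    # Assumption: the final grade is the worst (highest) of the three components.
    return max(grade_ki67(ki67_pct),
               grade_mitoses(mitotic_count),
               grade_necrosis(necrosis_pct))
```

For example, a tumor with Ki67 of 2%, 1 mitosis, but 12% necrosis would be graded G3 under this worst-component convention.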

Rondanelli M.,University of Pavia | Opizzi A.,University of Pavia | Monteferrario F.,University of Pavia | Klersy C.,Service of Biometry and Clinical Epidemiology | And 2 more authors.
European Journal of Clinical Nutrition | Year: 2011

Background/Objectives: There has been growing interest in using dietary intervention to improve the lipid profile. This work analyzes and compares the effects of enriching the diet with beta-glucans or with rice bran in mildly hypercholesterolemic men. Subjects/Methods: The subjects initially consumed a 3-week Step 1 American Heart Association diet with rice bran-enriched foods. After this adaptation period, volunteers were randomly assigned within a controlled crossover trial that consisted of two treatment periods with beta-glucan- or rice bran-enriched foods, each of 4 weeks, separated by a 3-week wash-out similar to the adaptation period. Fasted blood samples were collected on days 0, 21, 49, 70, and 98 in both study arms for measuring low-density lipoprotein (LDL)-cholesterol (primary outcome), total cholesterol, high-density lipoprotein (HDL)-cholesterol, triglycerides, apolipoprotein (apo) A-I, apo B, and glucose levels. Results: Twenty-four men (mean age: 50.3±5.3, mean body mass index: 24.9±1.9) completed the 14-week trial. Subjects in the 3-week adaptation period experienced significant reductions in the mean change of LDL cholesterol, total cholesterol, total cholesterol/HDL cholesterol, LDL cholesterol/HDL cholesterol, apo A-I, apo A-I/apo B, and glucose. During the intervention diet periods, a difference was found between treatment groups for the mean change in LDL cholesterol (0.21 (95% confidence interval (CI): 0.02-0.40), P=0.033) and total cholesterol (0.34 (95% CI: 0.20-0.47), P<0.001). The other parameters evaluated were not significantly affected by the diet consumed. Conclusions: The results of the present crossover clinical trial showed that beta-glucan-enriched foods are more effective in lowering serum LDL cholesterol levels than rice bran-enriched foods. European Journal of Clinical Nutrition advance online publication, 20 April 2011; doi:10.1038/ejcn.2011.48.

Boriani G.,University of Bologna | Auricchio A.,Fondazione Cardiocentro Ticino | Klersy C.,Service of Biometry and Clinical Epidemiology | Kirchhof P.,University of Munster | And 3 more authors.
Europace | Year: 2011

Aims: A pilot European survey was conducted to assess the cumulative time spent by healthcare personnel on in-office follow-up of cardiac implantable electrical devices (CIEDs), including cardiac pacemakers, implantable cardioverter-defibrillators, and cardiac resynchronization therapy (CRT) devices. Methods and results: Resource use data were collected during a session of in-clinic follow-up. Among 407 visits, 93% were scheduled and 7% unscheduled. Visit duration (total cumulative time) lasted a mean of 27 min for scheduled visits and was ∼30% longer for unscheduled visits. Independent determinants of visit duration were: unscheduled visit (7.6 min, P = 0.01), the need for device reprogramming (7.5 min, P < 0.001), and the type of device checked, with CRT devices needing 9.1 and 6.6 more minutes than single-chamber (P < 0.001) and dual-chamber devices (P = 0.002), respectively. Most visits (239 of 407, 59%) involved two different types of healthcare personnel simultaneously. The most frequent combination was a cardiologist and a nurse (216 of 407 visits with both of them only, and 65 additional visits also involving an internal technician, an external technician, or both). Overall, an external technician was involved in 18% of visits. Conclusions: In 'real-world' practice, the follow-up of CIEDs requires substantial resources in terms of time dedicated by specialized personnel: cardiologists, nurses, internal technicians, and external, industry-employed technicians. These observations should form the basis for clinical, organizational, financial, and policy initiatives targeted at optimizing follow-up procedures in order to face the increase in the number of patients treated with CIEDs expected in the coming years. © 2011 The Author.

Klersy C.,Service of Biometry and Clinical Epidemiology | De Silvestri A.,Service of Biometry and Clinical Epidemiology | Gabutti G.,Scientific Documentation Center | Raisaro A.,Fondazione IRCCS Policlinico San Matteo | And 3 more authors.
European Journal of Heart Failure | Year: 2011

Aims: To assess the cost-effectiveness and cost-utility of remote patient monitoring (RPM) compared with the usual care approach, based upon differences in the number of hospitalizations estimated from a meta-analysis of randomized clinical trials (RCTs). Methods and results: We reviewed the literature published between January 2000 and September 2009 on multidisciplinary heart failure (HF) management, either by usual care or RPM, to retrieve the number of hospitalizations and length of stay (LOS) for HF and for any cause. We performed a meta-analysis of 21 RCTs (5715 patients). Remote patient monitoring was associated with a significantly lower number of hospitalizations for HF [incidence rate ratio (IRR): 0.77, 95% CI 0.65-0.91, P < 0.001] and for any cause (IRR: 0.87, 95% CI 0.79-0.96, P = 0.003), while LOS was not different. Direct costs of hospitalization for HF were approximated by diagnosis-related group (DRG) tariffs in Europe and North America and were used to populate an economic model. The difference in costs between RPM and usual care ranged from €300 to €1000, favouring RPM. These cost savings, combined with a quality-adjusted life year (QALY) gain of 0.06, suggest that RPM is a 'dominant' technology over existing standard care. In a budget impact analysis, the adoption of an RPM strategy entailed a progressive and linear increase in costs saved. Conclusions: The novel cost-effectiveness data, coupled with the demonstrated clinical efficacy of RPM, should encourage its acceptance amongst clinicians and its consideration by third-party payers. At the same time, the scientific community should acknowledge the lack of prospectively and uniformly collected economic data and should request that future studies incorporate economic analyses. © 2010 The Author.
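The economic model described above scales the usual-care hospitalization rate by the pooled IRR and converts the avoided admissions into savings via DRG tariffs. A back-of-the-envelope sketch of that logic follows; every input figure below (baseline admission rate, tariff, monitoring cost) is invented for illustration and is not data from the study.

```python
# Sketch of the IRR-based cost model: the RPM admission rate is the
# usual-care rate scaled by the pooled incidence rate ratio, and the net
# saving is the cost of avoided admissions minus the cost of monitoring.
# All numeric inputs except the IRR structure are illustrative assumptions.

IRR_HF = 0.77            # pooled incidence rate ratio for HF admissions (from the meta-analysis)
USUAL_CARE_RATE = 0.50   # assumed HF admissions per patient-year under usual care
DRG_TARIFF = 4000.0      # assumed DRG tariff (EUR) per HF admission
RPM_YEARLY_COST = 300.0  # assumed yearly cost (EUR) of remote monitoring

rpm_rate = USUAL_CARE_RATE * IRR_HF                  # admissions per patient-year with RPM
avoided = USUAL_CARE_RATE - rpm_rate                 # admissions avoided per patient-year
net_saving = avoided * DRG_TARIFF - RPM_YEARLY_COST  # EUR saved per patient-year

print(f"Avoided admissions per patient-year: {avoided:.3f}")
print(f"Net saving per patient-year: EUR {net_saving:.0f}")
```

With these illustrative inputs the model yields a positive net saving, which together with a QALY gain is what makes a strategy "dominant" (cheaper and more effective) in cost-utility terms.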

Previtali M.,University of Pavia | Chieffo E.,University of Pavia | Ferrario M.,University of Pavia | Klersy C.,Service of Biometry and Clinical Epidemiology
European Heart Journal Cardiovascular Imaging | Year: 2012

Aim: Conflicting evidence exists as to whether the mitral E/E′ ratio is a reliable predictor of the left ventricular end-diastolic pressure (LVEDP). Our aim was to assess the value of the mitral E/E′ ratio for the estimation of left ventricular diastolic pressures (LVDP) in patients without heart failure (HF). Methods and results: Echo-Doppler examination and left heart catheterization were carried out in 100 consecutive patients to assess the correlation between echo-Doppler parameters and the LVDP. The E/A ratio showed the best correlation with the pre-a LVDP and the LVEDP, whereas septal and mean E/E′ ratios were significantly correlated with the pre-a LVDP but not with the LVEDP. No difference in the echo-Doppler parameters was found between patients with normal and elevated LVEDP. The mitral E/E′ ratio was significantly higher in patients with an ejection fraction (EF) <50% than in those with an EF ≥50%, and in patients with a dilated left ventricle (LV) than in those with a normal LV. No significant difference in mean LVEDP was found among the three groups with E/E′ ratios of <8, 8-15, and >15. The best cut-off values identified by receiver operating characteristic curve analysis for the septal, lateral, and mean E/E′ had sensitivities of 53, 68, and 54% and specificities of 66, 51, and 69% for identifying an LVEDP >15 mmHg. Conclusion: In patients without HF, the mitral E/E′ ratio is influenced by EF and LV volumes and correlates better with the pre-a LVDP than with the LVEDP. The suboptimal sensitivity and specificity of E/E′ for predicting increased LVDP suggest that the mitral E/E′ ratio is of limited clinical value in patients without HF. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2011.

Regoli F.,Fondazione Cardiocentro Ticino | Faletra F.F.,Fondazione Cardiocentro Ticino | Nucifora G.,Fondazione Cardiocentro Ticino | Pasotti E.,Fondazione Cardiocentro Ticino | And 3 more authors.
JACC: Cardiovascular Imaging | Year: 2011

Objectives: The aim of this study was to evaluate the feasibility and acute efficacy of real-time 3-dimensional transesophageal echocardiography (RT3DTEE)-guided ablation of the cavotricuspid isthmus (CVTI). Background: The use of RT3DTEE to guide a transcatheter radiofrequency ablation procedure has never been systematically investigated. Methods: Seventy consecutive patients with CVTI-dependent atrial flutter underwent CVTI ablation. Procedural monitoring using RT3DTEE was assigned to patients who requested general anesthesia for the procedure (n = 21 [30%]). In the other 49 patients (the control group), the procedures were monitored using the standard fluoroscopic approach. Procedural time was defined as skin-to-skin electrophysiological procedure duration, not including anesthesia preparation; adequate radiofrequency ablation applications (with fixed temperature and power settings) were considered as lesions lasting < 60 s. Results: RT3DTEE allowed visualization of the CVTI and identified related structures in most patients (20 of 21); anatomic features such as a long CVTI (n = 11), prominent Eustachian ridge (n = 9), prominent Eustachian valve (n = 6), septal recess (n = 8), and pectinate muscles (n = 10) were frequent. RT3DTEE also allowed continuous visualization of ablation catheter movement and contact. Compared with the control group, RT3DTEE was equally effective in achieving CVTI bidirectional block (100% in both groups), and no complications occurred. RT3DTEE shortened procedural time (median 73.0 min, interquartile range [IQR] 60.0 to 90.0 min, vs. median 115.0 min, IQR 85.0 to 133.0 min, p < 0.001), reduced radiation exposure (median fluoroscopy time 4.2 min, IQR 3.1 to 8.4 min, vs. median 19.3 min, IQR 12.9 to 36.4 min, p < 0.001; median fluoroscopy dose 575.4 cGy·cm², IQR 428.5 to 1,299.4 cGy·cm², vs. median 3,520.7 cGy·cm², IQR 1,700.0 to 6,709.0 cGy·cm², p < 0.001), and reduced the number of radiofrequency applications needed to achieve bidirectional block (median 7, IQR 6 to 10, vs. median 12, IQR 10 to 22, p = 0.007). A strong learning curve was detected by comparing procedural data between the first and last patients treated using RT3DTEE. Conclusions: RT3DTEE-guided ablation of the CVTI was feasible, allowing real-time detailed morphological characterization of the CVTI as well as continuous visualization of the ablation catheter during radiofrequency ablation. This approach entailed marked reductions in procedural time, radiation exposure, and the number of radiofrequency applications. © 2011 American College of Cardiology Foundation.

PubMed | Cardiocentro Ticino and Service of Biometry and Clinical Epidemiology
Journal: International Journal of Cardiology | Year: 2016

The aim of this study was to determine the overall and aetiology-related incidence of secondary prevention implantable cardioverter-defibrillator (ICD) implantation over the last 15 years in Canton Ticino and to assess clinical outcome according to time period of implantation. Consecutive patients treated by implantation of an ICD for secondary prevention from 2000 to 2015 were included in the current study and compared between 5-year cohorts (2000-2004, 2005-2009, and 2010-2015). Yearly implantation rate, changes in clinical presentation over the years, and events during follow-up were evaluated. One hundred fifty-six patients were included. The ICD implantation rate increased from 2.1 in 2000-2005 to 5.1 in 2010-2015 (p = 0.001). There was an increase in the proportion of non-ischaemic patients and of ventricular tachycardia (VT) as the presenting rhythm. No differences in appropriate ICD interventions were observed according to aetiology, presenting arrhythmia, or type of device. Reverse remodelling was observed more often in non-ischaemic patients, without any influence on the occurrence of appropriate interventions. Previous myocardial infarction (MI), atrial fibrillation (AF), NYHA class 2-3, and left ventricular ejection fraction (LVEF) < 35% were predictors of appropriate therapies during follow-up. The rate of implants for a secondary prevention indication has almost doubled during the last 15 years. Importantly, there has been a progressive increase of non-ischaemic patients receiving an ICD, and of VT as the presenting rhythm. Patients had overall good survival and a relatively low incidence of appropriate therapies. Improvement of ejection fraction did not correlate with risk reduction of ventricular arrhythmias.
