Clinical Hospital Center Zagreb
Sagud M.,University of Zagreb |
Mihaljevic-Peles A.,University of Zagreb |
Uzun S.,Clinic for Psychiatry Vrapce |
Cusa B.V.,Clinical Hospital Center Zagreb |
And 6 more authors.
Psychopharmacology | Year: 2013
Rationale: Although a number of studies have investigated the link between major depressive disorder (MDD) and metabolic syndrome (MetS), the association between MetS and treatment-resistant depression (TRD) is still not clear. Objectives: The aim of the study was to investigate the relationship between TRD and MetS and/or components of MetS and cardiovascular risk factors. Given the high prevalence of both conditions, the hypothesis was that TRD would be significantly associated with MetS. Methods: This cross-sectional study included 203 inpatients with MDD, assessed for treatment resistance, MetS and its components, and severity of MDD. Diagnoses and evaluations were made with the SCID based on DSM-IV, the National Cholesterol Education Program Adult Treatment Panel III criteria, and the Hamilton Depression Rating Scale. Results: TRD prior to study entry was found in 26.1% of patients, while MetS was observed in 33.5% of patients. The prevalence of MetS did not differ significantly between TRD and non-TRD patients. In addition, the frequency of altered values of particular components of MetS or cardiovascular risk factors was not associated with treatment resistance in depressed patients. Patients with TRD were older, had a higher number of lifetime episodes of depression and suicide attempts, and had a longer duration of MDD compared to non-TRD patients. Conclusions: The occurrence of either MetS or particular components of MetS and other cardiovascular risk factors was similar between TRD and non-TRD patients. Although there is a bidirectional relationship between depression and MetS, neither MetS nor its components appear to influence treatment resistance to antidepressants. © 2013 Springer-Verlag Berlin Heidelberg.
Katic J.,Institute for Medical Research and Occupational Health |
Fucic A.,Institute for Medical Research and Occupational Health |
Gamulin M.,Clinical Hospital Center Zagreb
Arhiv za Higijenu Rada i Toksikologiju | Year: 2010
Health disorders and diseases related to environmental exposure in children, such as cancer and immunologic disturbances (asthma, allergies), are on the rise. However, complex transplacental and prepubertal genotoxicology is given very limited consideration, even though intrauterine development and early childhood may be critical for elucidating cancer aetiology. The foetus is transplacentally exposed to contaminants in food and the environment, such as various chemicals, drugs, and radiochemically contaminated water and air. Target organs of xenobiotic action may differ between the mother and the foetus due to the specific stage of developmental physiology and enzyme distribution. This in turn may lead to different levels of clastogenic and aneugenic metabolites of the same xenobiotic in the mother and the foetus. Adults' protective behaviour is not sufficient to isolate children from radioisotopes, pesticides, toxic metals and metalloids, environmental tobacco smoke, endocrine-disrupting chemicals, and various food contaminants, which are just a part of the stressors present in a polluted environment. In order to improve legislation related to foetal and child exposure to genotoxic and possibly carcinogenic agents, oncologists, paediatricians, environmental health specialists, and genotoxicologists should work together much more closely to make more effective use of accumulated scientific data, with the final aim of lowering cancer incidence and mortality.
Sabat A.J.,University of Groningen |
Budimir A.,Clinical Hospital Center Zagreb |
Sa-Leao R.,Institute Tecnologia Quimica e Biologica |
van Dijl J.M.,University of Groningen |
And 3 more authors.
Eurosurveillance | Year: 2013
Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However, more recent methods that examine the relatedness of isolates at a molecular level have revolutionised our ability to differentiate among bacterial types and subtypes. Importantly, the development of molecular methods has provided new tools for enhanced surveillance and outbreak detection. This has resulted in better implementation of rational infection control programmes and efficient allocation of resources across Europe. The emergence of benchtop sequencers using next generation sequencing technology makes bacterial whole genome sequencing (WGS) feasible even in small research and clinical laboratories. WGS has already been used for the characterisation of bacterial isolates in several large outbreaks in Europe and, in the near future, is likely to replace currently used typing methodologies due to its ultimate resolution. However, WGS is still too laborious and time-consuming to obtain useful data in routine surveillance. Also, a largely unresolved question is how genome sequences must be examined for epidemiological characterisation. In the coming years, the lessons learnt from currently used molecular methods will allow us to condense the WGS data into epidemiologically useful information. On this basis, we have reviewed current and new molecular typing methods for outbreak detection and epidemiological surveillance of bacterial pathogens in clinical practice, aiming to give an overview of their specific advantages and disadvantages.
Delimar D.,University of Zagreb |
Aljinovic A.,Clinical Hospital Center Zagreb |
Bicanic G.,University of Zagreb
Archives of Orthopaedic and Trauma Surgery | Year: 2014
Introduction: Bulk bone grafts are used in total hip arthroplasty (THA) when adequate acetabular cup coverage cannot be achieved. Data from the literature show mainly good short-term and mid-term results, with contradictory long-term results. The aim of this study was to investigate acetabular cup stability and graft integrity after reconstruction of the dysplastic adult hip with a total hip endoprosthesis and a bulk bone graft for acetabular deficiency. Methods: Seventy-two hips in 64 patients that underwent THA with bone autograft or allograft were assessed immediately after the operation and 6 months and 1, 2, 3 and 10 years after the operation. Acetabular angle, acetabular cup coverage, bone graft width, and bone graft height were measured, and a questionnaire was designed to determine acetabular cup stability and grade graft integrity. Four investigators graded the grafts, and the inter-rater and intra-rater reliability of the questionnaire was tested. Results: All measured parameters, in all patients and in patients with autografts and allografts considered separately, showed significant changes consistent with graft failure and acetabular cup instability at a significance level of p < 0.05. Conclusions: The results of this study show a significant decrease in acetabular cup stability when either autograft or allograft is used for cemented acetabular reconstruction of the dysplastic hip. Furthermore, allografts failed twice as rapidly as autografts. Although these results contradict both the good short-term and long-term results in the published literature, they present a warning for the future use of free bulk bone grafts in reconstructive hip surgery. © Springer-Verlag Berlin Heidelberg 2014.
Sovilj S.,University of Zagreb |
Van Oosterom A.,University of Lausanne |
Rajsman G.,Clinical Hospital Center Zagreb |
Magjarevic R.,University of Zagreb
Physiological Measurement | Year: 2010
In patients undergoing coronary artery bypass grafting (CABG) surgery, post-operative atrial fibrillation (AF) occurs with a prevalence of up to 40%. The highest incidence is seen between the second and third day after the operation. Following cardiac surgery, AF may cause various complications such as hemodynamic instability, heart attack and cerebral or other thromboembolisms. AF increases morbidity and the duration and expense of medical treatment. This study aims at identifying patients at high risk of post-operative AF. Early prediction of AF would allow timely prophylactic treatment and reduce the incidence of arrhythmia. Patients at low risk of post-operative AF could be excluded from prophylaxis on the basis of the contraindications of anti-arrhythmic drugs. The study included 50 patients in whom lead II electrocardiograms were continuously recorded for 48 h following CABG. Univariate statistical analysis was used in the search for signal features that could predict AF. The most promising ones identified were P wave duration, RR interval duration and PQ segment level. On the basis of these, a nonlinear multivariate prediction model was built by deploying a classification tree. The prediction accuracy was found to increase over time. At 48 h following CABG, the best smoothed sensitivity measured was 84.8% and the specificity 85.4%. The positive and negative predictive values were 72.7% and 92.8%, respectively, and the overall accuracy was 85.3%. With regard to prediction accuracy, the risk assessment and prediction of post-operative AF is optimal in the period between 24 and 48 h following CABG. © 2010 Institute of Physics and Engineering in Medicine.
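The relationship between the reported sensitivity/specificity and the predictive values follows from Bayes' rule and the AF prevalence in the cohort. A minimal sketch (the ~31% prevalence used below is back-calculated for illustration, not stated in the abstract):

```python
def predictive_values(sens, spec, prev):
    """Compute (PPV, NPV) from sensitivity, specificity and disease prevalence."""
    tp = sens * prev              # true-positive fraction of the cohort
    fp = (1 - spec) * (1 - prev)  # false-positive fraction
    tn = spec * (1 - prev)        # true-negative fraction
    fn = (1 - sens) * prev        # false-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

# Sensitivity 84.8% and specificity 85.4% as reported at 48 h post-CABG;
# an assumed AF prevalence of ~31% reproduces the reported PPV/NPV closely.
ppv, npv = predictive_values(0.848, 0.854, 0.314)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # close to the reported 72.7% / 92.8%
```

This also illustrates why predictive values, unlike sensitivity and specificity, shift with the prevalence of post-operative AF in a given population.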
Jakovina Blazekovic S.,Clinical Hospital Center Zagreb |
Bicanic G.,University of Zagreb |
Hrabac P.,University of Zagreb |
Tripkovic B.,Clinical Hospital Center Zagreb |
Delimar D.,University of Zagreb
International Orthopaedics | Year: 2014
Purpose: During total knee arthroplasty (TKA) blood loss can be significant and, in spite of all techniques for reducing blood loss, there is still a significant likelihood that blood transfusion will be required. For blood loss management during TKA, pre-operative autologous blood donation (PABD) is still a standard of care. In this prospective randomised study we evaluated the efficacy of PABD in patients undergoing TKA, to answer the question of whether there is any need for autologous blood donation during TKA and, if so, for which group of patients. Methods: Patients were randomised to three groups. In group 1, patients did not donate autologous blood; in group 2, patients donated one dose 72 hours prior to TKA; and in group 3, patients donated autologous blood 14 days prior to TKA. In all patients, haemoglobin, haematocrit, thrombocyte and reticulocyte values, iron concentrations (Fe, unsaturated iron-binding capacity, total iron-binding capacity), activated partial thromboplastin time, prothrombin time, and intra-operative and post-operative blood loss were measured and compared. Results: With PABD there was no reduction in allogeneic blood transfusions, and a large number of the collected autologous blood doses were discarded, which significantly increased the cost of treatment for these patients. For patients undergoing TKA, PABD can provoke iatrogenic anaemia and thereby increase the likelihood of the need for allogeneic blood transfusion. Conclusions: The results of our study showed that PABD in non-anaemic patients is not justified and is not economically feasible. © 2013 Springer-Verlag Berlin Heidelberg.
Bicanic G.,University of Zagreb |
Crnogaca K.,Clinical Hospital Center Zagreb |
Barbaric K.,Clinical Hospital Center Zagreb |
Delimar D.,University of Zagreb
Medical Hypotheses | Year: 2014
Periprosthetic infection is regarded as one of the most feared complications following total knee arthroplasty, developing in 0.4-2% of patients. Staphylococcus aureus and Staphylococcus epidermidis account for more than half of all infections. Cefazolin is the most commonly used antibiotic in arthroplasty antibiotic prophylaxis worldwide. Guidelines and studies recommend that prophylactic antibiotics should be completely infused within 60 min before the surgical incision. Cefazolin achieves its peak bone concentration 40 min after parenteral application, with a serum half-life of 108 min and a bone half-life of 42 min. Respecting the given pharmacokinetics of cefazolin and a theoretical mathematical model, we hypothesise that parenteral application of cefazolin, if given as a bolus, should occur no more than 30 min before incision (tourniquet inflation) and no less than 10 min before tourniquet inflation. This new regimen would provide the maximal blood concentration and an almost maximal bone concentration of cefazolin at the beginning of the operation and at the beginning of tourniquet inflation. © 2014 Elsevier Ltd.
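The timing window in the hypothesis can be illustrated with simple first-order (exponential half-life) decay; this sketch is my own illustration of the abstract's pharmacokinetic figures, not the authors' actual model:

```python
def fraction_remaining(t_min, half_life_min):
    """Fraction of an initial concentration remaining after t_min of
    first-order decay with the given half-life (minutes)."""
    return 0.5 ** (t_min / half_life_min)

SERUM_T_HALF = 108  # cefazolin serum half-life in minutes (from the abstract)
BONE_T_HALF = 42    # cefazolin bone half-life in minutes (from the abstract)

# For a bolus given t minutes before incision, serum concentration at
# incision is still a large fraction of its peak; with a 30 min lead,
# the bone concentration peak (~40 min post-dose) falls only ~10 min
# after incision / tourniquet inflation.
for t in (10, 30, 60):
    print(f"{t:3d} min pre-incision: serum at ~"
          f"{fraction_remaining(t, SERUM_T_HALF):.0%} of peak")
```

Under this simple model, a 30 min lead leaves serum levels above 80% of peak at incision, while a 60 min lead lets both serum and (after its peak) bone concentrations decay further before the tourniquet isolates the limb.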
Oberhofer D.,University of Zagreb |
Juras J.,Clinical Hospital Center Zagreb |
Pavicic A.M.,University of Zagreb |
Zuric I.R.,University of Zagreb |
Rumenjak V.,University of Zagreb
Croatian Medical Journal | Year: 2012
Aim: To assess the diagnostic value of perioperative procalcitonin (PCT) levels compared to C-reactive protein (CRP) levels in the early detection of infectious complications following colorectal surgery. Methods: This prospective observational study included 79 patients undergoing elective colorectal surgery. White blood cell count, CRP, and PCT were measured preoperatively and on postoperative days (POD) 1, 2, 3, and 5, and patients were followed for postoperative complications. The diagnostic accuracy of CRP and PCT values on each day was analyzed by the receiver operating characteristic (ROC) curve, with infectious complications as the outcome measure. The ROC curves with the largest area under the curve for each inflammatory marker were compared in order to define the marker with the higher diagnostic accuracy. Results: Twenty-nine patients (36.7%) developed infectious complications. CRP and PCT concentrations increased in the early postoperative period, with a significant difference between patients with and without complications at all measured postoperative times. ROC curve analysis showed that CRP concentrations on POD 3 and PCT concentrations on POD 2 had similar predictive values for the development of infectious complications (area under the curve, 0.746 and 0.750, respectively), with best cut-off values of 99.0 mg/L for CRP and 1.34 μg/L for PCT. The diagnostic accuracy of CRP and PCT was highest on POD 5; however, the cut-off values were not considered clinically useful. Conclusion: Serial postoperative PCT measurements do not offer an advantage over CRP measurements for the prediction of infectious complications following colorectal surgery.
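"Best cut-off" values like the 99.0 mg/L (CRP) and 1.34 μg/L (PCT) reported here are commonly chosen by maximising Youden's index (sensitivity + specificity − 1) along the ROC curve; the abstract does not state the criterion used, so this is a generic sketch on invented marker values, not the study's data:

```python
def youden_best_cutoff(values, labels):
    """Return the threshold (classify positive if value >= threshold)
    maximising Youden's index: sensitivity + specificity - 1.
    labels: 1 = infectious complication, 0 = no complication."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut

# Hypothetical POD 3 CRP values (mg/L) and outcomes (1 = complication):
crp = [45, 60, 80, 99, 120, 150, 170, 55, 70, 210]
outcome = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1]
print(youden_best_cutoff(crp, outcome))  # → 99
```

In practice the candidate thresholds are the observed marker values, exactly as iterated above.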
Kranjcec B.,Zabok General Hospital |
Papes D.,Clinical Hospital Center Zagreb |
Altarac S.,Zabok General Hospital
World Journal of Urology | Year: 2014
Purpose: To test whether d-mannose powder is effective for the prevention of recurrent urinary tract infection (UTI). Materials and methods: After initial antibiotic treatment of acute cystitis, 308 women with a history of recurrent UTI and no other significant comorbidities were randomly allocated to three groups. The first group (n = 103) received prophylaxis with 2 g of d-mannose powder in 200 ml of water daily for 6 months, the second (n = 103) received 50 mg of Nitrofurantoin daily, and the third (n = 102) did not receive prophylaxis. Results: Overall, 98 patients (31.8%) had recurrent UTI: 15 (14.6%) in the d-mannose group, 21 (20.4%) in the Nitrofurantoin group, and 62 (60.8%) in the no-prophylaxis group, with the rate significantly higher in the no-prophylaxis group compared to the active groups (P < 0.001). Patients in the d-mannose and Nitrofurantoin groups had a significantly lower risk of a recurrent UTI episode during prophylactic therapy compared to patients in the no-prophylaxis group (RR 0.239 and 0.335, respectively; P < 0.0001). In the active groups, 17.9% of patients reported side effects, but they were mild and did not require stopping the prophylaxis. Patients in the d-mannose group had a significantly lower risk of side effects compared to patients in the Nitrofurantoin group (RR 0.276, P < 0.0001), but the clinical importance of this finding is low because Nitrofurantoin was well tolerated. Conclusions: In our study, d-mannose powder significantly reduced the risk of recurrent UTI, to a rate not different from that in the Nitrofurantoin group. More studies will be needed to validate these results, but initial findings show that d-mannose may be useful for UTI prevention. © 2013 Springer-Verlag Berlin Heidelberg.
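The relative risks reported above can be reproduced directly from the recurrence counts given in the abstract:

```python
def relative_risk(events_a, n_a, events_b, n_b):
    """Risk ratio of group A relative to group B (incidence_A / incidence_B)."""
    return (events_a / n_a) / (events_b / n_b)

# Recurrence counts from the abstract: 15/103 (d-mannose),
# 21/103 (Nitrofurantoin), 62/102 (no prophylaxis).
rr_mannose = relative_risk(15, 103, 62, 102)  # ~0.240, matching the reported 0.239
rr_nitro = relative_risk(21, 103, 62, 102)    # ~0.335, as reported
print(f"RR d-mannose: {rr_mannose:.3f}, RR Nitrofurantoin: {rr_nitro:.3f}")
```

An RR below 1 means the prophylaxis group had proportionally fewer recurrences than the no-prophylaxis group; here both active regimens cut the recurrence risk to roughly a quarter and a third, respectively.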
Aleric I.,Clinical Hospital Center Zagreb
Medicinski pregled | Year: 2012
Non-small cell lung cancers are among the leading causes of cancer morbidity and mortality worldwide. The prognosis is usually based on traditional pathohistological parameters and clinical stage, but additional prognostic survival factors have also been sought. The aim of this retrospective study was to explore the membranous expression of HER-2/neu and estrogen receptors in non-small cell lung cancers and their relation to the survival of patients with non-small cell lung cancers and to traditional prognostic factors. The sample consisted of 132 consecutive, surgically resected non-small cell lung cancer tissue specimens, and the following parameters were examined: HER-2/neu and estrogen receptor expression, as well as the related clinical and pathological features: tumor, node, and metastasis (TNM) stage, level of tumor necrosis, histological and nuclear grade, lymphocytic infiltrate, and number of mitoses. HER-2/neu was positive in 28.8% of tumor samples, and estrogen receptor expression was positive in 29.5% of tumors, but neither was significantly associated with the outcome of non-small cell lung cancers. There was a significant association between HER-2/neu and nuclear grade (P=0.01). In addition, an association between estrogen receptor expression and histological type of tumor (P=0.04) and mitotic rate (P=0.008) was found. Kaplan-Meier analysis showed a significant association of patients' overall survival with the TNM stage (P<0.001) and the degree of tumor necrosis (P=0.02). Cox proportional hazard regression analysis showed that male gender (P=0.01), histological type (P=0.03), a high degree of necrosis (P=0.006), and a higher histological grade (P=0.037) were associated with patients' survival. Our findings indicate that the expression of HER-2/neu and estrogen receptors is less reliable than traditional histological parameters in predicting the survival of patients with non-small cell lung cancers.