Institute for Research and Education

Minneapolis, MN, United States


Yamamoto J.,University of Hawaii at Manoa | Bergstrom J.,University of California at San Diego | Davis A.,University of California at San Diego | Wing D.,University of California at San Diego | And 3 more authors.
PLoS ONE | Year: 2017

Background The causes of age-related hyperkyphosis (HK) include osteoporosis, but only one-third of those most severely affected have vertebral fractures, suggesting that there are other important, and potentially modifiable, causes. We hypothesized that muscle mass and quality may be important determinants of kyphosis in older persons. Methods We recruited 72 persons >65 years to participate in a prospective study designed to evaluate kyphosis and fall risk. At the baseline visit, participants had their body composition measured using Dual Energy X-ray Absorptiometry (DXA). Kyphosis was measured in either the standing [S] or lying [L] position: 1) Cobb angle from DXA [L]; 2) Debrunner kyphometer [S]; 3) architect's flexicurve ruler [S]; and 4) blocks method [L]. Multivariable linear/logistic regression analyses were done to assess the association between each body composition measure and the 4 kyphosis measures. Results Women (n = 52) had an average age of 76.8 (SD 6.7) years and men 80.5 (SD 7.8) years. Participants reported overall good/excellent health (93%), the average body mass index was 25.3 (SD 4.6), and 35% reported a fall in the past year. Using published cut-offs, about 20-30% were determined to have HK. For the standing assessments of kyphosis only, after adjusting for age, sex, weight, and hip BMD, persons with lower TLM were more likely to be hyperkyphotic. Conclusions Lower TLM is associated with HK in older persons. The results were stronger when standing measures of kyphosis were used, suggesting that the effects of muscle on thoracic kyphosis are best appreciated under spinal loading conditions.
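
The abstract applies published cut-offs without enumerating them. As an illustration only, a minimal sketch of flagging hyperkyphosis from a kyphosis angle, assuming the commonly cited 40° Cobb-angle threshold (an assumption; the study's actual cut-offs per instrument are not stated here):

```python
def classify_hyperkyphosis(angle_deg, cutoff=40.0):
    """Flag hyperkyphosis (HK) given a kyphosis angle in degrees.

    The 40-degree cutoff is a commonly cited Cobb-angle threshold,
    used here purely for illustration; each of the study's four
    instruments has its own published cut-off.
    """
    return angle_deg >= cutoff

# Proportion of a (hypothetical) cohort classified as hyperkyphotic
angles = [28.0, 35.5, 41.0, 52.3, 38.9, 44.1]
hk_rate = sum(classify_hyperkyphosis(a) for a in angles) / len(angles)
```

With these invented angles, half the cohort exceeds the threshold, in the same ballpark as the 20-30% the study reports with its published cut-offs.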

Lewis J.R.,University of Western Australia | Lewis J.R.,University of Sydney | Schousboe J.T.,Institute for Research and Education | Schousboe J.T.,University of Minnesota | And 7 more authors.
Arteriosclerosis, Thrombosis, and Vascular Biology | Year: 2016

Objective-Dual-energy X-ray absorptiometry (DXA) is a low-cost, minimal-radiation technique used to improve fracture prediction. DXA machines can also capture single-energy lateral spine images, and abdominal aortic calcification (AAC) is commonly seen on these images. Approach and Results-We investigated whether DXA-derived measures of AAC were related to an established test of generalized atherosclerosis in 892 elderly white women aged >70 years, with images captured during bone density testing in 1998/1999 and B-mode carotid ultrasound in 2001. AAC scores were calculated using a validated 24-point scale and categorized as low (AAC24 score, 0 or 1), moderate (AAC24 scores, 2-5), or severe (AAC24 scores, >5), seen in 45%, 36%, and 19% of women, respectively. AAC24 scores were correlated with mean and maximum common carotid artery intimal medial thickness (rs=0.12, P<0.001 and rs=0.14, P<0.001). Compared with individuals with low AAC, those with moderate or severe calcification were more likely to have carotid atherosclerotic plaque (adjusted prevalence ratio, 1.35; 95% confidence interval, 1.14-1.61; P<0.001 and adjusted prevalence ratio, 1.94; 95% confidence interval, 1.65-2.32; P<0.001, respectively) and moderate carotid stenosis (adjusted prevalence ratio, 2.22; 95% confidence interval, 1.39-3.54; P=0.001 and adjusted prevalence ratio, 4.82; 95% confidence interval, 3.09-7.050; P<0.001, respectively). The addition of AAC24 scores to traditional risk factors improved identification of women with carotid atherosclerosis as quantified by C-statistic (+0.075, P<0.001), net reclassification (0.249, P<0.001), and integrated discrimination (0.065, P<0.001). Conclusions-AAC identified on images from a dual-energy X-ray absorptiometry machine was strongly related to carotid ultrasound measures of atherosclerosis. This low-cost, minimal-radiation technique, used widely for osteoporosis screening, is a promising marker of generalized extracoronary atherosclerosis. © 2015 American Heart Association, Inc.
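
The three-category AAC grouping maps directly from the 24-point scale; a small sketch of that binning (bin edges are taken from the abstract; the function name is illustrative):

```python
def categorize_aac24(score):
    """Map a 24-point abdominal aortic calcification (AAC24) score
    into the low / moderate / severe categories used in the study."""
    if not 0 <= score <= 24:
        raise ValueError("AAC24 scores range from 0 to 24")
    if score <= 1:
        return "low"       # AAC24 score 0 or 1
    if score <= 5:
        return "moderate"  # AAC24 scores 2-5
    return "severe"        # AAC24 scores >5
```

For example, a score of 5 still falls in the moderate band, while 6 is the first severe score.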

News Article | December 7, 2016

Ayahuasca is a beverage that has been used for centuries by native South Americans. Studies suggest that it has anxiolytic and antidepressant effects in humans. One of the main substances present in the beverage is harmine, a beta-carboline whose potential therapeutic effects for depression have recently been described in mice.

"It has been shown in rodents that antidepressant medication acts by inducing neurogenesis. So we decided to test whether harmine, an alkaloid with the highest concentration in the psychotropic plant decoction ayahuasca, would trigger neurogenesis in human neural cells," said Vanja Dakic, PhD student and one of the authors of the study.

To elucidate these effects, researchers from the D'Or Institute for Research and Education (IDOR) and the Institute of Biomedical Sciences at the Federal University of Rio de Janeiro (ICB-UFRJ) exposed human neural progenitors to this beta-carboline. After four days, harmine led to a 70% increase in proliferation of human neural progenitor cells. The researchers were also able to identify how human neural cells respond to harmine: the effect involves inhibition of DYRK1A, a kinase encoded on chromosome 21 that is overactivated in patients with Down syndrome and Alzheimer's disease.

"Our results demonstrate that harmine is able to generate new human neural cells, similarly to the effects of classical antidepressant drugs, which are frequently accompanied by diverse side effects. Moreover, the observation that harmine inhibits DYRK1A in neural cells allows us to speculate about future studies to test its potential therapeutic role in the cognitive deficits observed in Down syndrome and neurodegenerative diseases," suggests Stevens Rehen, researcher at IDOR and ICB-UFRJ.

The study, published Dec. 6 in PeerJ, was funded by the Brazilian funding agencies FAPERJ, CNPq, CAPES, FINEP, BNDES, and FAPESP.

Christian K.,Good Samaritan Hospital | Engel A.M.,Institute for Research and Education | Smith J.M.,Good Samaritan Hospital | Smith J.M.,Vascular and Thoracic Surgery Inc.
American Surgeon | Year: 2011

This study investigated and compared the risk factors and outcomes of patients undergoing coronary artery bypass graft surgery with and without prolonged mechanical ventilation. Data in a cardiac surgery database were examined retrospectively. Data selected included any isolated coronary artery bypass graft surgery performed by the surgical group from August 2005 to June 2009. The resulting cohort included a total of 2933 patients: 116 with a ventilation time of greater than 72 hours (prolonged ventilation) and 2817 with a ventilation time of 72 hours or less (no prolonged ventilation). Patients with a prolonged ventilation time were matched (1:3 ratio) by year of surgery to patients not requiring prolonged ventilation, resulting in a study cohort of 464 patients. To generate the unadjusted risks of each factor, χ2 and t test analyses were performed. Logistic regression analysis was then used to investigate the adjusted risk between cases and controls for each of the significant variables. χ2 and t tests were conducted comparing cases and controls on the outcome variables. Patients undergoing coronary artery bypass grafting who experienced a prolonged ventilation time (cases) were more likely to be female, to have a New York Heart Association functional class of III or IV, and to have a longer perfusion time. There was no significant difference between cases and controls in diabetes, chronic obstructive pulmonary disease, left ventricular ejection fraction, or body mass index while controlling for all significant risk factors. Careful patient selection and preparation during preoperative evaluation may help identify patients at risk for prolonged mechanical ventilation and thus help prevent the added morbidity and mortality associated with it.
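
The 1:3 matching by year of surgery can be sketched roughly as below; the data layout, field names, and random sampling without replacement are assumptions for illustration, not the authors' actual procedure:

```python
import random

def match_controls(cases, controls, ratio=3, key=lambda p: p["year"], seed=0):
    """Match each case to `ratio` controls sharing the same key
    (here, year of surgery), sampling without replacement."""
    rng = random.Random(seed)
    pool = {}  # group the available controls by matching key
    for c in controls:
        pool.setdefault(key(c), []).append(c)
    matched = []
    for case in cases:
        candidates = pool.get(key(case), [])
        picks = rng.sample(candidates, ratio)
        for p in picks:              # remove chosen controls so they
            candidates.remove(p)     # cannot be matched twice
        matched.append((case, picks))
    return matched

# Hypothetical records: 2 cases matched against 20 candidate controls
cases = [{"id": 1, "year": 2007}, {"id": 2, "year": 2008}]
controls = [{"id": i, "year": 2007 + (i % 2)} for i in range(10, 30)]
pairs = match_controls(cases, controls)
```

Each case plus its three controls yields the 4-patients-per-case structure that turns 116 cases into a 464-patient study cohort.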

Schilling J.,Good Samaritan Hospital | Engel A.M.,Institute for Research and Education | Hassan M.,Good Samaritan Hospital | Smith J.M.,Good Samaritan Hospital | Smith J.M.,Vascular and Thoracic Surgery Inc.
Journal of Cardiac Surgery | Year: 2012

Background: Advances in optics and instrumentation with the da Vinci S Surgical System have facilitated minimally invasive and robotic cardiac procedures, including mitral valve repair and atrial myxoma excision. We report our retrospective data comparing robotically assisted myxoma excision with standard median sternotomy excision. Methods: Data were collected for cardiac myxoma resections performed between January 2000 and December 2009. The resulting cohort included a total of 57 patients, grouped into two categories: robotic-assisted (n = 17) and traditional (nonrobotic; n = 40) surgical procedures. Presurgical and surgical risk factors were examined. Results: Univariate analysis comparing the surgical procedure groups on surgical risk factors found a significant difference in 3 of the 14 variables. All patients undergoing robotic-assisted cardiac myxoma excision were cannulated through the common femoral artery and vein, while cannulation for the traditional procedures used the aorta and atrium in all but two patients. For aortic occlusion, 14 of the robotic-assisted patients had balloon occlusion and 34 of the traditional patients had aortic cross-clamp occlusion. Operating time was significantly shorter for robotic cases (2.7 hours) than for traditional cases (3.5 hours). Conclusion: Robotic excision of atrial myxomas is safe and may be an alternative to traditional open surgery in selected patients. © 2012 Wiley Periodicals, Inc.

Grannan K.,Good Samaritan Hospital | Snyder J.,Good Samaritan Hospital | Mcdonough S.,Institute for Research and Education | Engel A.,Institute for Research and Education | Farnum J.,Good Samaritan Hospital
American Surgeon | Year: 2011

Follicular neoplasms of the thyroid are a frequent indication for surgery of the thyroid gland. We evaluated the effect of frozen sections on intraoperative decision-making and the possible avoidance of reoperative surgery, along with the histologic findings, in a retrospective cohort. A database was created of all thyroid operations from 2001 to 2007. Data collected included age, gender, preoperative cytology, indication for surgery, surgeon, intraoperative decision-making, and histologic findings. Of the 723 thyroidectomies, 203 were performed for follicular neoplasms diagnosed by fine needle aspiration. Of these, 135 had cytology reports available within our electronic medical record; 44 per cent (59 of 135) of these patients had an intraoperative frozen section. Only two of the 59 (3.4%) were positive for carcinoma, both papillary carcinomas. One more was interpreted as "suspicious" for carcinoma by the pathologist. In these three cases, the surgeon proceeded with total thyroidectomy at the time of the initial surgery. The results of frozen section thus altered the operation in only three of 59 cases (5.1%). Intraoperative frozen section rarely impacts the conduct of thyroidectomy for follicular neoplasms.

Dunki-Jacobs E.,Good Samaritan Hospital | Grannan K.,Good Samaritan Hospital | McDonough S.,Institute for Research and Education | Engel A.M.,Institute for Research and Education
American Journal of Surgery | Year: 2012

Background: The purpose of this study was to describe the incidence and clinical/pathologic characteristics of papillary thyroid microcarcinoma (PMC) in a community hospital setting and to evaluate the frequency and characteristics of these lesions when unsuspected preoperatively. Methods: A total of 723 patients underwent a partial or total thyroidectomy. A retrospective review was performed. Results: A total of 194 of the 723 patients had a final diagnosis of papillary carcinoma. Ninety-six (49%) of these tumors were PMCs, defined as being 1.0 cm or less in diameter. One-third (32 of 96) of these lesions were multifocal, and 16.7% (16 of 96) were found to have regional lymph node metastases. The majority (58%) of PMCs were found on final pathology and were clinically unsuspected (occult). Multifocality was found in 32.1% (18 of 56) of patients with clinically unsuspected PMC, with nodal metastases in 3.6% (2 of 56). The other 40 patients with PMC had surgery performed for a clinical reason related to the pathologic lesion. This clinically suspected group was comparably multifocal (35%) but more likely to have cervical lymph node metastasis (35%). Sixty-six percent (37 of 56) of patients diagnosed with a clinically unsuspected PMC underwent a partial thyroidectomy at the initial surgery. Conclusions: The prevalence of clinically unsuspected PMC in our population undergoing thyroidectomy was 7.7% (56 of 723). In our institution, this represents more than half of all PMCs. The incidence of cervical lymph node metastasis in clinically unsuspected PMC was only 3.6% compared with 35% in clinically suspected disease, suggesting that the biological behavior (and possibly treatment) may be different. Long-term follow-up evaluation is needed to better evaluate the significance of these differences. © 2012 Elsevier Inc. All rights reserved.
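
The reported prevalences follow directly from the counts in the abstract; a quick arithmetic check (the helper function is illustrative, not from the paper):

```python
def prevalence(numerator, denominator, digits=1):
    """Percentage prevalence, rounded as reported in the abstract."""
    return round(100 * numerator / denominator, digits)

# Counts taken from the abstract
unsuspected_pmc = prevalence(56, 723)        # unsuspected PMC among all thyroidectomies
nodal_unsuspected = prevalence(2, 56)        # nodal metastases, unsuspected group
multifocal_unsuspected = prevalence(18, 56)  # multifocality, unsuspected group
```

These reproduce the 7.7%, 3.6%, and 32.1% figures quoted in the text.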

Thors A.,Good Samaritan Hospital | Dunki-Jacobs E.,Good Samaritan Hospital | Engel A.M.,Institute for Research and Education | McDonough S.,Institute for Research and Education | Welling R.E.,Good Samaritan Hospital
Journal of Surgical Education | Year: 2010

Background: Patient quality outcomes are a major focus of the health care industry. It is unknown what effect involvement in graduate medical education (GME) has on patient outcomes. The purpose of this study was to begin to examine whether GME involvement in postoperative care affects patient quality outcomes. Methods: The retrospective cohort included all patients who underwent a nonemergent colectomy from January 1, 2007 to January 1, 2008 at a 2-hospital system. Data collected included patient demographics, patient quality outcomes, complications, and GME involvement. Patient quality outcomes were based on compliance with the Surgical Care Improvement Project (SCIP) guidelines. Results: A total of 159 nonemergent colectomies were analyzed. The GME group accounted for 116 (73%) patients. A significant difference was found in several SCIP process-based measures of quality when comparing the GME group with the non-GME group. Postoperative antibiotics were more likely to be stopped within 24 hours (p = 0.010), and preoperative heparin and postoperative deep vein thrombosis (DVT) prophylaxis were more likely to be administered (p < 0.001). Additionally, patients in the GME group showed improved quality outcomes, with significantly fewer postoperative complications (p < 0.001) and a shorter duration of stay (p = 0.008). The use of gastrointestinal prophylaxis was more common in the non-GME group (p = 0.002). No significant differences were observed between the 2 groups with respect to age, sex, diabetes, preoperative antibiotics, antibiotics 1 hour before surgery, postoperative antibiotics, and continuation of home β blockade. Conclusions: GME at teaching institutions has a positive impact on patient quality outcomes. At our institution, many of the SCIP measurable outcomes had improved compliance when an attending physician participated in the GME program. © 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

Gibler J.,Good Samaritan Hospital | Nyswonger G.,Good Samaritan Hospital | Engel A.M.,Institute for Research and Education | Grannan K.,Good Samaritan Hospital | Welling R.,Good Samaritan Hospital
Journal of Surgical Education | Year: 2011

Objective: The objective of this study was to evaluate patient satisfaction in an outpatient community-based surgical clinic and to seek opportunities for improvement. Methods: A paper survey was distributed to patients at the Faculty Medical Center Clinic over a 12-week period. The survey allowed patients to rate their experience on a 5-point scale from "very dissatisfied" to "very satisfied" and addressed referral to the clinic, appointment scheduling, visit experience, wait times, laboratory testing, and satisfaction with surgery. Separate from the surveys, data were collected regarding wait time in the clinic prior to being placed in an examining room, time spent waiting for the physician, time spent with the physician, overall time spent in the clinic, and time from appointment to surgery. Results: During the 12-week period, 87 surveys were returned from patients in the surgery clinic, for a 69% response rate. Most patients were referred to the surgery clinic from the emergency department or their primary care physicians, at 44% and 43%, respectively. Just over half of the patients responded that they were "very satisfied" with their overall experience. Of those surveyed, 40% were "very satisfied" with the wait time for their first visit to the clinic, 52% with time in the waiting room, 43% with time in the examining room, and 47% with time spent with the physician. Only 16.4% of patients were "very dissatisfied" or "mostly dissatisfied" with time waiting for an appointment, 17.9% with time available for an appointment, 14.3% with time in the waiting room, 18.2% with time waiting in the examination room for the physician, and 20.9% with time waiting to schedule surgery. Data were also collected on 203 surgical clinic patients during this time: 55% were new patients, 31% were postoperative patients, and 14% were in the clinic for another type of visit. Conclusions: Overall patient satisfaction with the clinic was good, yet there were areas to improve. The efficiency of scheduling patients, wait times in the waiting room and examining room, and the time prior to scheduling surgery all need improvement. Modification of current practice at the surgery clinic could improve patient satisfaction in future evaluations. © 2011 Association of Program Directors in Surgery.

Shafiro V.,Rush University Medical Center | Gygi B.,Institute for Research and Education | Cheng M.-Y.,Rush University Medical Center | Vachhani J.,Rush University Medical Center | Mulvey M.,Rush University Medical Center
Ear and Hearing | Year: 2011

Objectives: Environmental sound perception serves an important ecological function by providing listeners with information about objects and events in their immediate environment. Environmental sounds such as car horns, baby cries, or chirping birds can alert listeners to imminent dangers as well as contribute to one's sense of awareness and well-being. Perception of environmental sounds as acoustically and semantically complex stimuli may also involve some factors common to the processing of speech. However, very limited research has investigated the abilities of cochlear implant (CI) patients to identify common environmental sounds, despite patients' general enthusiasm about them. This project (1) investigated the ability of patients with modern-day CIs to perceive environmental sounds, (2) explored associations among speech, environmental sounds, and basic auditory abilities, and (3) examined acoustic factors that might be involved in environmental sound perception. Design: Seventeen experienced postlingually deafened CI patients participated in the study. Environmental sound perception was assessed with a large-item test composed of 40 sound sources, each represented by four different tokens. The relationship between speech and environmental sound perception and the role of working memory and some basic auditory abilities were examined based on patient performance on a battery of speech tests (HINT, CNC, and individual consonant and vowel tests), tests of basic auditory abilities (audiometric thresholds, gap detection, temporal pattern, and temporal order for tones tests), and a backward digit recall test. Results: The results indicated substantially reduced ability to identify common environmental sounds in CI patients (45.3%). Except for vowels, all speech test scores significantly correlated with the environmental sound test scores: r = 0.73 for HINT in quiet, r = 0.69 for HINT in noise, r = 0.70 for CNC, and r = 0.64 for consonants; for vowels, r = 0.48.
HINT and CNC scores in quiet correlated moderately with temporal order for tones. However, the correlation between speech and environmental sounds changed little after partialling out the variance due to other variables. Conclusions: Present findings indicate that environmental sound identification is difficult for CI patients. They further suggest that speech and environmental sounds may overlap considerably in their perceptual processing. Certain spectrotemporal processing abilities are separately associated with speech and environmental sound performance. However, they do not appear to mediate the relationship between speech and environmental sounds in CI patients. Environmental sound rehabilitation may be beneficial to some patients. Environmental sound testing may have potential diagnostic applications, especially with difficult-to-test populations, and might also be predictive of speech performance for prelingually deafened patients with cochlear implants. © Copyright 2011 by Lippincott Williams & Wilkins.
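
The "partialling out" step mentioned above can be expressed with the standard first-order partial correlation formula. A sketch with hypothetical correlation values (the 0.20 and 0.25 inputs are invented for illustration; only the 0.73 raw correlation comes from the abstract):

```python
from math import sqrt

def partial_corr(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y controlling for z:
    r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))"""
    return (r_xy - r_xz * r_yz) / sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))

# If a third variable correlates only weakly with both scores,
# partialling it out barely changes r -- the pattern the study reports.
r = partial_corr(0.73, 0.20, 0.25)  # stays close to the raw r of 0.73
```

When r_xz and r_yz are small, the numerator and denominator both stay near their raw values, which is why controlling for weakly related abilities leaves the speech-environmental sound correlation largely intact.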
