Sharma P., University of Michigan |
Goodrich N.P., Arbor Research Collaborative for Health |
Guidinger M.K., Arbor Research Collaborative for Health |
Merion R.M., University of Michigan
Journal of the American Society of Nephrology | Year: 2013
Incident ESRD after liver transplantation (LT) is associated with high post-transplant mortality. We constructed and validated a continuous renal risk index (RRI) to predict post-LT ESRD. Data for 43,514 adult recipients of deceased donor LT alone (February 28, 2002 to December 31, 2010) were linked from the Scientific Registry of Transplant Recipients and the Centers for Medicare and Medicaid Services ESRD Program. An adjusted Cox regression model of time to post-LT ESRD was fitted, and the resulting equation was used to calculate an RRI for each LT recipient. The RRI included 14 recipient factors: age, African-American race, hepatitis C, cholestatic disease, body mass index ≥35, pre-LT diabetes, ln creatinine for recipients not on dialysis, ln albumin, ln bilirubin, serum sodium <134 mEq/L, status-1, previous LT, transjugular intrahepatic portosystemic shunt, and acute dialysis at LT. This RRI was validated and had a C statistic of 0.76 (95% confidence interval, 0.75 to 0.78). Higher RRI was significantly associated with higher 5-year cumulative incidence of ESRD and post-transplant mortality. In conclusion, the RRI constructed in this study quantifies the risk of post-LT ESRD and is applicable to all LT alone recipients. This new validated measure may serve as an important prognostic tool for ameliorating post-LT ESRD risk and improving survival by informing post-LT patient management strategies. Copyright © 2013 by the American Society of Nephrology.
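A risk index fitted by Cox regression, such as the RRI described above, is applied as a linear predictor: each recipient factor is multiplied by its regression coefficient and the terms are summed. A minimal sketch of that mechanic, using only three illustrative factors and entirely hypothetical coefficients (the published RRI weights are in the paper itself):

```python
import math

# HYPOTHETICAL coefficients for illustration only -- not the published RRI weights.
COEFS = {
    "age_per_decade": 0.15,   # risk contribution per decade of age
    "hcv": 0.30,              # hepatitis C (0/1 indicator)
    "ln_creatinine": 0.50,    # natural log of serum creatinine (mg/dL)
}

def risk_index(covariates, coefs=COEFS):
    """Linear predictor (log relative hazard) for one recipient."""
    return sum(coefs[k] * covariates[k] for k in coefs)

# Example recipient: 55 years old, HCV-positive, creatinine 1.4 mg/dL
patient = {"age_per_decade": 5.5, "hcv": 1, "ln_creatinine": math.log(1.4)}
rri = risk_index(patient)
```

A higher linear predictor corresponds to a higher relative hazard of post-LT ESRD; the abstract's point is that this single continuous number stratifies both ESRD incidence and mortality.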
News Article | December 2, 2016
ANN ARBOR, Mich. - According to an annual data report from the United States Renal Data System (USRDS), hospitalization and mortality rates for patients with chronic kidney disease continue to decline in the U.S. Along with those rates, the report highlights several current trends in kidney disease in the U.S., including Medicare spending in the patient population and number of kidney transplants. This year's report provides data from 2014 and is released by the USRDS coordinating center based at the University of Michigan Kidney Epidemiology and Cost Center, in partnership with Arbor Research Collaborative for Health. The report states that hospitalization rates among end-stage kidney disease patients decreased to 1.7 admissions per patient per year, as compared to 2.1 in 2005, or a reduction of 19 percent. End-stage kidney disease is the last stage of chronic kidney disease when the kidneys can no longer remove waste and excess water from the body, and dialysis or kidney transplantation is necessary for survival. In addition, mortality rates continue to decrease for dialysis and transplant patients, falling by 32 percent and 44 percent, respectively, since 1996. "Most recent estimates indicate 14.8 percent of U.S. adults have chronic kidney disease," says Rajiv Saran, M.D., professor of internal medicine at the University of Michigan and director of the USRDS coordinating center. "Fortunately, we've seen steeper declines in mortality rates in more recent years in this patient population, which is promising." "An interesting note on kidney transplants is a relatively recent initiative called kidney paired donation," Saran says. "The initiative is aimed at increasing the availability of living donor transplants, and in its simplest form is essentially when two living donors do not match with the respective recipients and decide to perform an exchange whereby the donation goes to each other's compatible recipient. 
Kidney paired donation transplants have risen sharply in recent years with 552 performed in 2014, representing 10 percent of living donor transplants that year." According to Saran, earlier diagnosis and treatment of chronic kidney disease can improve patient outcomes. "As newly reported cases of end-stage kidney disease continue to happen each year, physicians and patients need to have continued dialogue about the disease and how best to manage it," Saran says. "We hope this report provides fellow clinicians and researchers with valuable information they can use when discussing the disease with their patients and colleagues." Authors: In addition to Saran, the report's U-M authors include Bruce Robinson, M.D., Vahakn Shahinian, M.D., John Ayanian, M.D., Jennifer Bragg-Gresham, Ph.D., Debbie Gipson, M.D., Yun Han, M.S., Kevin He, Ph.D., William Herman, M.D., Michael Heung, M.D., Richard A. Hirth, Ph.D., David Hutton, Ph.D., Yi Li, Ph.D., Yee Lu, M.D., Hal Morgenstern, Ph.D., Brahmajee Nallamothu, M.D., Brett Plattner, M.D., Ronald Pisoni, Ph.D., Panduranga Rao, M.D., Douglas E. Schaubel, Ph.D., David T. Selewski, M.D., Peter Song, Ph.D., and Kenneth J. Woodside, M.D. Funding: Funding for the project came from the National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, and the U.S. Department of Health and Human Services, under contract HHSN276201400001C, and the USRDS Coordinating Center Team, which consists of investigators and staff from the University of Michigan Health System, the Kidney Epidemiology and Cost Center, in partnership with Arbor Research Collaborative for Health. Disclosures: Dr. Hal Morgenstern is a consultant at Arbor Research Collaborative for Health. Dr. Jennifer Bragg-Gresham is a consultant with IMPAQ, and with Medical Education Institute (MEI) involving quality of life performance measures. Reference: United States Renal Data System. 
2016 USRDS annual data report: Epidemiology of kidney disease in the United States. National Institutes of Health, National Institute of Diabetes and Digestive and Kidney Diseases, Bethesda, MD, 2016.
Pisoni R.L., Arbor Research Collaborative for Health |
Zepel L., Arbor Research Collaborative for Health |
Port F.K., Arbor Research Collaborative for Health |
Robinson B.M., Arbor Research Collaborative for Health |
Robinson B.M., University of Michigan
American Journal of Kidney Diseases | Year: 2015
Background: Since the bundled end-stage renal disease prospective payment system began in 2011 in the United States, some hemodialysis practices have changed substantially, raising the question of whether vascular access practice has also changed. We describe monthly US vascular access use from August 2010 to August 2013, with international comparisons, and other aspects of US vascular access practice. Study Design: Prospective observational cohort study of vascular access. Setting & Participants: Maintenance hemodialysis patients in the Dialysis Outcomes and Practice Patterns Study (DOPPS) Practice Monitor (DPM) in the United States (n = 3,442) and 19 other nations (n = 8,478). Predictors: Country, patient demographics, time period. Outcomes: Vascular access use; timing of first nephrologist care and arteriovenous access placement before end-stage renal disease; patient self-reported vascular access preferences (United States only); treatment practices as stated by medical directors. Results: In the United States from August 2010 to August 2013, arteriovenous fistula (AVF) use increased from 63% to 68%, while catheter use declined from 19% to 15%. Although AVF use did not differ greatly across age groups, arteriovenous graft use was 2-fold higher among black (26%) versus nonblack US patients (13%) in 2013. Across 20 countries in 2013, AVF use ranged from 49% to 92%, whereas catheter use ranged from 1% to 45%. Patient-reported vascular access preferences differed by sex and race, with 16% to 20% of patients feeling uninformed regarding the benefits and risks of different vascular access types. Among new (incident) US hemodialysis patients, AVF use remains low, with approximately 70% initiating hemodialysis therapy with a catheter (60% starting with a catheter when having ≥4 months of predialysis nephrology care). In the United States, longer typical times to first AVF cannulation were reported.
Limitations: Noncompletion of surveys may affect the generalizability of findings to the wider hemodialysis population. Conclusions: AVF use has increased and catheter use has decreased among prevalent US hemodialysis patients since the introduction of the prospective payment system. However, AVF use at dialysis therapy initiation remains low, suggesting that reforms affecting predialysis care may be necessary to achieve the same improvements in fistula rates at dialysis therapy initiation as have been achieved for prevalent hemodialysis patients. © 2015 National Kidney Foundation, Inc.
Sharma P., University of Michigan |
Schaubel D.E., University of Michigan |
Gong Q., University of Michigan |
Guidinger M., Arbor Research Collaborative for Health |
And 2 more authors.
Hepatology | Year: 2012
Candidates with fulminant hepatic failure (Status-1A) receive the highest priority for liver transplantation (LT) in the United States. However, no studies have compared wait-list mortality risk among end-stage liver disease (ESLD) candidates with high Model for End-Stage Liver Disease (MELD) scores to those listed as Status-1A. We aimed to determine if there are MELD scores for ESLD candidates at which their wait-list mortality risk is higher than that of Status-1A, and to identify the factors predicting wait-list mortality among those who are Status-1A. Data were obtained from the Scientific Registry of Transplant Recipients for adult LT candidates (n = 52,459) listed between September 1, 2001, and December 31, 2007. Candidates listed for repeat LT as Status-1A were excluded. Starting from the date of wait listing, candidates were followed for 14 days or until the earliest occurrence of death, transplant, or granting of an exception MELD score. ESLD candidates were categorized by MELD score, with a separate category for those with calculated MELD > 40. We compared wait-list mortality between each MELD category and Status-1A (reference) using time-dependent Cox regression. ESLD candidates with MELD > 40 had almost twice the wait-list mortality risk of Status-1A candidates, with a covariate-adjusted hazard ratio of 1.96 (P = 0.004). There was no difference in wait-list mortality risk for candidates with MELD 36-40 and Status-1A, whereas candidates with MELD < 36 had significantly lower mortality risk than Status-1A candidates. MELD score did not significantly predict wait-list mortality among Status-1A candidates (P = 0.18). Among Status-1A candidates with acetaminophen toxicity, MELD was a significant predictor of wait-list mortality (P < 0.0009). Posttransplant survival was similar for Status-1A and ESLD candidates with MELD > 20 (P = 0.6).
Conclusion: Candidates with MELD > 40 have significantly higher wait-list mortality than, and posttransplant survival similar to, candidates who are Status-1A, and therefore should be assigned higher priority than Status-1A for allocation. Because ESLD candidates with MELD 36-40 and Status-1A candidates have similar wait-list mortality risk and posttransplant survival, these candidates should be assigned similar rather than sequential priority for deceased donor LT. © 2011 American Association for the Study of Liver Diseases.
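For readers unfamiliar with the score discussed above, the laboratory MELD combines three routine laboratory values on a log scale. A sketch following the conventional UNOS implementation (value floors at 1.0, creatinine capped at 4.0 mg/dL, score capped at 40); confirm against current OPTN policy before any clinical use:

```python
import math

def meld(creatinine_mg_dl, bilirubin_mg_dl, inr):
    """Laboratory MELD score, conventional UNOS implementation:
    inputs below 1.0 are floored at 1.0, creatinine is capped at
    4.0 mg/dL, and the rounded score is capped at 40."""
    cr = min(max(creatinine_mg_dl, 1.0), 4.0)
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    score = 10 * (0.957 * math.log(cr)
                  + 0.378 * math.log(bili)
                  + 1.120 * math.log(inr)
                  + 0.643)
    return min(round(score), 40)

# Example: creatinine 2.0 mg/dL, bilirubin 3.0 mg/dL, INR 1.5
print(meld(2.0, 3.0, 1.5))
```

With all inputs at their floors the score bottoms out at 6; the MELD > 40 group discussed in the abstract refers to the calculated (uncapped) score.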
Mathur A.K., University of Michigan |
Ashby V.B., University of Michigan |
Sands R.L., University of Michigan |
Wolfe R.A., Arbor Research Collaborative for Health
American Journal of Transplantation | Year: 2010
The effect of demand for kidney transplantation, measured by end-stage renal disease (ESRD) incidence, on access to transplantation is unknown. Using data from the U.S. Census Bureau, Centers for Medicare & Medicaid Services (CMS) and the Organ Procurement and Transplantation Network/Scientific Registry of Transplant Recipients (OPTN/SRTR) from 2000 to 2008, we performed donation service area (DSA) and patient-level regression analyses to assess the effect of ESRD incidence on access to the kidney waiting list and deceased donor kidney transplantation. In DSAs, ESRD incidence increased with greater density of high ESRD incidence racial groups (African Americans and Native Americans). Wait-list and transplant rates were relatively lower in high ESRD incidence DSAs, but wait-list rates were not drastically affected by ESRD incidence at the patient level. Compared to low ESRD areas, high ESRD areas were associated with lower adjusted transplant rates among all ESRD patients (RR 0.68, 95% CI 0.66-0.70). Patients living in medium and high ESRD areas had lower transplant rates from the waiting list compared to those in low ESRD areas (medium: RR 0.68, 95% CI 0.66-0.69; high: RR 0.63, 95% CI 0.61-0.65). Geographic variation in access to kidney transplant is in part mediated by local ESRD incidence, which has implications for allocation policy development. © 2010 The American Society of Transplantation and the American Society of Transplant Surgeons.
Luan F.L., University of Michigan |
Steffick D.E., Arbor Research Collaborative for Health |
Ojo A.O., University of Michigan |
Ojo A.O., Arbor Research Collaborative for Health
Transplantation | Year: 2011
Background: New-onset diabetes after transplant (NODAT) is a serious complication after kidney transplantation. We studied the relationship between steroid-free maintenance regimens and NODAT in a national cohort of adult kidney transplant patients. Methods: A total of 25,837 previously nondiabetic kidney transplant patients, engrafted between January 1, 2004, and December 31, 2006, were included in the study. Logistic regression analysis was used to compare the risk of developing NODAT within 3 years after transplant for patients discharged with and without steroid-containing maintenance immunosuppression regimens. The effect of transplant program-level practice regarding steroid-free regimens on the risk of NODAT was studied as well. Results: The cumulative incidence of NODAT within 3 years of transplant was 16.2% overall; 17.7% with maintenance steroids and 12.3% without (P<0.001). Patients discharged with steroids had 42% greater odds of developing NODAT compared with those without steroids (adjusted odds ratio [AOR]=1.42, 95% confidence interval [CI]=1.27-1.58, P<0.001). The maintenance regimen of tacrolimus and mycophenolate mofetil or mycophenolate sodium was associated with 25% greater odds of developing NODAT (AOR=1.25, 95% CI=1.08-1.45, P=0.003) than the regimen of cyclosporine and mycophenolate mofetil or mycophenolate sodium. Several induction therapies also were associated with lower odds of NODAT compared with no induction. Patients from programs that used steroid-free regimens for a majority of their patients had reduced odds of NODAT compared with patients from programs discharging almost all of their patients on steroid-containing regimens. Conclusion: The adoption of steroid-free maintenance immunosuppression at discharge from kidney transplantation in selected patients was associated with reduced odds of developing NODAT within 3 years. © 2011 by Lippincott Williams & Wilkins.
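As a back-of-the-envelope illustration of what an odds ratio of this size means on the probability scale (note that the reported 1.42 is covariate-adjusted, so applying it to the unadjusted 12.3% baseline is illustrative arithmetic, not a reproduction of the study's model):

```python
def apply_odds_ratio(baseline_prob, odds_ratio):
    """Convert a baseline probability to the probability implied by an
    odds ratio (the OR acts on the odds scale, not the risk-ratio scale)."""
    odds = baseline_prob / (1 - baseline_prob) * odds_ratio
    return odds / (1 + odds)

# 12.3% NODAT incidence without steroids, OR = 1.42 with steroids:
p = apply_odds_ratio(0.123, 1.42)  # roughly 0.166, i.e. about 16.6%
```

This lands near (but not exactly on) the 17.7% observed with steroids, as expected given that 1.42 is the adjusted rather than crude odds ratio.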
Merion R.M., University of Michigan |
Merion R.M., Arbor Research Collaborative for Health
Seminars in Liver Disease | Year: 2010
Liver transplantation has rapidly advanced from an experimental therapy to a mainstream treatment option for a wide range of acute and chronic liver diseases. Indications for liver transplant have evolved to include previously contraindicated conditions such as hepatocellular carcinoma and alcohol-related liver disease. Cirrhosis from chronic hepatitis C infection remains the most common indication today. Multidisciplinary evaluation for liver transplantation is intended to confirm the patient's suitability and identify the appropriate timing of transplant, although the latter is problematic as a result of the ongoing donor organ shortage. Deceased liver donors have been increasing in number, but increasing donor age has been associated with less satisfactory posttransplant results. Living donor liver transplant is a dramatic but very infrequent procedure; risk to the living donor is of paramount concern. The main focus of deceased donor allocation has transitioned from waiting time to estimation of the likelihood of death without transplant (medical urgency), and now relies upon a laboratory-based Model for End-Stage Liver Disease (MELD) score for candidates with chronic liver disease. Those with acute liver failure are prioritized ahead of those with chronic conditions. Although not used as a direct criterion for allocation, development of the concept of transplant survival benefit, i.e., the extra years of life attributable to transplant, has facilitated better ordering of those candidates likely to have the most benefit, while restricting access to those whose lives will be extended minimally or not at all. Overall posttransplant outcomes have steadily improved, with unadjusted 5-year patient survival rates of 77% among patients transplanted with MELD score between 15 and 20, and 72% for those with MELD scores between 21 and 30. Copyright © 2010 by Thieme Medical Publishers, Inc.
Fuller D.S., Arbor Research Collaborative for Health |
Pisoni R.L., Arbor Research Collaborative for Health |
Bieber B.A., Arbor Research Collaborative for Health |
Gillespie B.W., Arbor Research Collaborative for Health |
Robinson B.M., Arbor Research Collaborative for Health
American Journal of Kidney Diseases | Year: 2013
From August 2010 to December 2011, anemia management underwent substantial changes. Significant reductions in intravenous epoetin doses and declines in hemoglobin (Hb) levels in the latter half of 2011 suggest that changes in regulatory guidance (such as the June erythropoiesis-stimulating agent [ESA] label change) exerted a larger influence on reducing ESA doses than the changes in payment introduced at the beginning of 2011. There has been a substantial shift toward lower Hb levels from 2010 to 2011. Although the largest change in distribution was from ≥12 to 10-10.9 g/dL, the percentage of patients with Hb <10 g/dL increased to 16% by December 2011. Notably, patients with Hb <10 g/dL had the largest decline in prescribed epoetin doses and intravenous iron prescription patterns similar to those of other patients. This finding is consistent with lower ceiling epoetin doses, lower Hb targets, and tolerance of lower Hb levels in general; however, in the context of reports of rising transfusion rates (often occurring in hospitals and potentially outside the nephrologist's control), we believe this represents an opportunity to improve care. Accordingly, we suggest nephrologists give consideration to raising epoetin doses and intravenous iron use in many of these patients to limit avoidable transfusions. We have also observed increases in intravenous iron utilization along with a large rise in serum ferritin levels in 2011, to a median value of nearly 700 ng/mL. The consequences of greater iron dosing and high cumulative iron doses over time merit urgent study. Until then, clinical caution is warranted. © 2013 by the National Kidney Foundation, Inc.
Port F.K., Arbor Research Collaborative for Health
American Journal of Kidney Diseases | Year: 2014
When randomized controlled trials are unavailable, clinicians have to rely on observational studies. However, analyses using observational data to evaluate specific treatments and their associations with outcomes often are biased through confounding by clinical indication for the treatment of interest. Given the rich observational data and limited clinical trial data available in the dialysis population, successfully accounting for this bias can lead to substantial knowledge generation. In recent decades, much has been learned about statistical methods for observational data, including the fact that even extensive adjustments may not always overcome this bias, particularly when unmeasured confounders exist. In this article, examples based on the international DOPPS (Dialysis Outcomes and Practice Patterns Study) are used to demonstrate the value of practice-based instrumental variable analyses. This methodology leverages the marked differences in practice patterns among dialysis facilities and uses the reasonable assumption that patients are assigned to a dialysis facility without consideration of its specific treatment pattern in order to minimize bias in analyses relying on observational data. Examples using the dialysis facility as an instrument, reviewed in depth in this article, include studies of dialysate sodium concentration, systolic blood pressure targets, and treatment time; these demonstrate the value of this methodology to produce advanced knowledge. However, practice-based analyses have potential limitations. Specifically, observation of sufficiently large differences in practice patterns is required, and these analyses should consider that the treatment of interest may be associated with other facility treatment practices.
These examples from the DOPPS hopefully will stimulate advances in methodologies and critical clinical work toward improving patient care by identifying beneficial treatment practices applicable to dialysis, chronic kidney disease, and beyond. © 2014 National Kidney Foundation, Inc.
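The logic of a practice-based instrumental variable analysis can be sketched with simulated data: an unmeasured confounder (e.g., illness severity) biases the naive regression of outcome on treatment, while two-stage least squares using facility practice as the instrument recovers an estimate near the true effect. All quantities below (facility counts, effect sizes, noise levels) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 facilities x 40 patients; facilities differ in prescribing tendency,
# and patients are assumed to be assigned to facilities independently of
# their unmeasured severity -- the core IV assumption.
n_fac, per_fac = 50, 40
fac = np.repeat(np.arange(n_fac), per_fac)
fac_practice = rng.normal(0, 1, n_fac)            # facility prescribing tendency
confounder = rng.normal(0, 1, n_fac * per_fac)    # unmeasured patient severity
treatment = fac_practice[fac] + confounder + rng.normal(0, 1, fac.size)
true_effect = 0.5
outcome = true_effect * treatment + 2.0 * confounder + rng.normal(0, 1, fac.size)

def two_stage_least_squares(z, x, y):
    # Stage 1: predict treatment from the instrument (facility practice).
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    # Stage 2: regress outcome on the predicted (confounder-free) treatment.
    X = np.column_stack([np.ones_like(x_hat), x_hat])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

naive = np.polyfit(treatment, outcome, 1)[0]  # inflated by confounding
iv = two_stage_least_squares(fac_practice[fac], treatment, outcome)
```

Here the naive slope overstates the true effect of 0.5 because the confounder drives both treatment and outcome, whereas the facility-level instrument isolates the variation in treatment that is unrelated to patient severity.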
Ramirez S.P., Arbor Research Collaborative for Health
Clinical journal of the American Society of Nephrology : CJASN | Year: 2012
When hemodialysis dose is scaled to body water (V), women typically receive a greater dose than men, but their survival is not better given a similar dose. This study sought to determine whether rescaling dose to body surface area (SA) might reveal different associations among dose, sex, and mortality. Single-pool Kt/V (spKt/V), equilibrated Kt/V, and standard Kt/V (stdKt/V) were computed using urea kinetic modeling on a prevalent cohort of 7229 patients undergoing thrice-weekly hemodialysis. Data were obtained from the Centers for Medicare & Medicaid Services 2008 ESRD Clinical Performance Measures Project. SA-normalized stdKt/V (SAN-stdKt/V) was calculated as stdKt/V × (anthropometric volume ÷ SA) ÷ 17.5. Patients were grouped into sex-specific dose quintiles (reference: quintile 1 for men). Adjusted hazard ratios (HRs) for 1-year mortality were calculated using Cox regression. spKt/V was higher in women (1.7 ± 0.3) than in men (1.5 ± 0.2; P<0.001), but SAN-stdKt/V was lower (women: 2.3 ± 0.2; men: 2.5 ± 0.3; P<0.001). For both sexes, mortality decreased as spKt/V increased, until spKt/V was 1.6-1.7 (quintile 4 for men: HR, 0.62; quintile 3 for women: HR, 0.64); no benefit was observed with higher spKt/V. HR for mortality decreased further at higher SAN-stdKt/V in both sexes (quintile 5 for men: HR, 0.69; quintile 5 for women: HR, 0.60). SA-based dialysis dose results in dose-mortality relationships substantially different from those with volume-based dosing. SAN-stdKt/V analyses suggest women may be relatively underdosed when treated by V-based dosing. SAN-stdKt/V as a measure for dialysis dose may warrant further study.
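The rescaling described above reduces to a one-line computation. A minimal sketch, assuming the abstract's formula SAN-stdKt/V = stdKt/V × (V ÷ SA) ÷ 17.5, with hypothetical example values for a single patient:

```python
def san_std_ktv(std_ktv, volume_l, surface_area_m2):
    """Surface-area-normalized standard Kt/V, per the scaling described
    in the abstract: stdKt/V x (V / SA) / 17.5, where 17.5 L/m^2 is the
    normalizing constant for the volume-to-surface-area ratio."""
    return std_ktv * (volume_l / surface_area_m2) / 17.5

# Hypothetical patient: stdKt/V 2.2, anthropometric V = 32 L, SA = 1.7 m^2
san = san_std_ktv(2.2, 32.0, 1.7)
```

Because women tend to have a lower V/SA ratio than men, the same stdKt/V maps to a lower SAN-stdKt/V for women, which is the mechanism behind the abstract's relative-underdosing finding.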