Monterey, CA, United States

Chandra Katari A.,GENPACT India Pvt. Ltd | Umar Shaik N.,CSC Company | Rao Pasupuleti V.,P.A. College
International Journal of Applied Environmental Sciences | Year: 2013

In this study, we use different techniques to predict the probability of customer payment behavior for accounts receivable (in the form of invoices) in the finance business. Our goal in this paper is to develop a statistical method that yields predictive distributions for the occurrence of customer delinquency. For this purpose, two approaches, logistic regression and discriminant analysis, are used, and both methods are applied in unified and site-specific scenarios. Results are presented and discussed to choose the best model. Using simple logistic regression and discriminant analysis, we illustrate the importance of comparing models with different numbers of parameters. Goodness of fit of the logistic regression model is examined using the likelihood ratio test and Wald's test. For the discriminant function, eigenvalues (λ), the canonical correlation eta (η), Wilks' lambda (Λ), and the chi-square test (χ2) are used as goodness-of-fit tests. © Research India Publications.
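
The abstract does not include the underlying model code. As a rough illustration only, the sketch below fits a logistic regression (with likelihood-ratio and Wald statistics) and a linear discriminant analysis to synthetic invoice data; the feature names, the simulated data, and the library choices (statsmodels, scikit-learn) are assumptions, not the authors' implementation.

```python
# Hypothetical sketch: delinquency scoring with logistic regression vs. LDA.
# Synthetic data and feature names are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000
invoice_amount = rng.lognormal(8, 1, n)          # assumed predictor
days_past_due_hist = rng.poisson(12, n)          # assumed predictor
logit = -4 + 0.00004 * invoice_amount + 0.15 * days_past_due_hist
delinquent = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([invoice_amount, days_past_due_hist])

# Logistic regression: likelihood-ratio test of the full model and
# Wald tests (z statistics) for the individual coefficients.
logit_res = sm.Logit(delinquent, sm.add_constant(X)).fit(disp=0)
print("LR chi2:", logit_res.llr, "p =", logit_res.llr_pvalue)
print("Wald p-values:", logit_res.pvalues)

# Linear discriminant analysis as the competing classifier.
lda = LinearDiscriminantAnalysis().fit(X, delinquent)

# Compare the two models on predictive discrimination (AUC).
print("LR  AUC:", roc_auc_score(delinquent, logit_res.predict(sm.add_constant(X))))
print("LDA AUC:", roc_auc_score(delinquent, lda.predict_proba(X)[:, 1]))
```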


Jassal S.K.,University of California at San Diego | Chonchol M.,University of Colorado at Denver | Von Mhlen D.,University of California at San Diego | Smits G.,CSC Inc | Barrett-Connor E.,University of California at San Diego
American Journal of Medicine | Year: 2010

Background: Recent systematic reviews have cast doubt on the association between vitamin D and cardiovascular disease. No prior studies have investigated the association between 25-hydroxyvitamin D (25[OH]D), 1,25-dihydroxyvitamin D (1,25[OH]2D), or intact parathyroid hormone and cardiovascular mortality in a temperate climate. Methods: A total of 1073 community-dwelling older adults were evaluated in 1997-1999; serum levels of 25(OH)D (mean 42 ng/mL), 1,25(OH)2D (median 29 pg/mL), and intact parathyroid hormone (median 46 pg/mL) were measured; mean estimated glomerular filtration rate was 74 mL/min/1.73 m2. Participants were followed for up to 10.4 (mean 6.4) years, during which there were 111 cardiovascular deaths. Results: In unadjusted Cox proportional hazards models, higher levels of 1,25(OH)2D were protective against cardiovascular mortality, whereas higher levels of intact parathyroid hormone predicted increased risk of cardiovascular death. After adjusting for age alone or multiple covariates, there was no significant association between 25(OH)D, 1,25(OH)2D, or intact parathyroid hormone and cardiovascular mortality; results did not differ by estimated glomerular filtration rate above or below 60 mL/min/1.73 m2. Conclusion: In this prospective study of Caucasian, middle-income, community-dwelling older adults living in sunny southern California, serum levels of 25(OH)D, 1,25(OH)2D, and intact parathyroid hormone were not independently associated with cardiovascular mortality.
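
As a hedged sketch of the kind of Cox proportional hazards analysis described, the code below fits unadjusted and age-adjusted models with the lifelines library on a synthetic cohort; the variable names, the simulated values, and the library choice are assumptions, not the study data or code.

```python
# Hypothetical sketch of an unadjusted and age-adjusted Cox proportional hazards model.
# The synthetic cohort and variable names are illustrative, not the study data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1073
df = pd.DataFrame({
    "age": rng.normal(74, 8, n),
    "pth": rng.normal(46, 15, n),              # intact PTH, pg/mL
    "followup_years": rng.uniform(0.5, 10.4, n),
    "cv_death": rng.binomial(1, 0.10, n),      # ~10% cardiovascular deaths
})

# Unadjusted model for one exposure, then the same exposure adjusted for age.
cph_unadj = CoxPHFitter().fit(df[["pth", "followup_years", "cv_death"]],
                              duration_col="followup_years", event_col="cv_death")
cph_adj = CoxPHFitter().fit(df[["pth", "age", "followup_years", "cv_death"]],
                            duration_col="followup_years", event_col="cv_death")
print(cph_unadj.hazard_ratios_)   # hazard ratio for PTH alone
print(cph_adj.hazard_ratios_)     # hazard ratio for PTH after age adjustment
```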


Sessions W.R.,CSC Inc. | Reid J.S.,U.S. Navy | Benedetti A.,European Center for Medium Range Weather Forecasts Reading | Colarco P.R.,NASA | And 22 more authors.
Atmospheric Chemistry and Physics | Year: 2015

Here we present the first steps in developing a global multi-model aerosol forecasting ensemble intended for eventual operational and basic research use. Drawing from the International Cooperative for Aerosol Prediction (ICAP) members' latest generation of quasi-operational aerosol models, 5-day aerosol optical thickness (AOT) forecasts are analyzed for December 2011 through November 2012 from four institutions: the European Centre for Medium-Range Weather Forecasts (ECMWF), the Japan Meteorological Agency (JMA), NASA Goddard Space Flight Center (GSFC), and the Naval Research Laboratory/Fleet Numerical Meteorology and Oceanography Center (NRL/FNMOC). For dust, we also include the NOAA NGAC product in our analysis. The Barcelona Supercomputing Centre and UK Met Office dust products have also recently become members of ICAP but have insufficient data to be included in this analysis period. A simple consensus ensemble of the member AOT fields and their mean for modal species (e.g., fine and coarse mode, plus a separate dust ensemble) is used to create the ICAP Multi-Model Ensemble (ICAP-MME). The ICAP-MME is run daily at 00:00 UTC for 6-hourly forecasts out to 120 h. Basing metrics on comparisons to 21 regionally representative Aerosol Robotic Network (AERONET) sites, all models generally capture the basic aerosol features of the globe. However, there is an overall low bias in AOT among the models, particularly for high-AOT events. Biomass burning regions show the most diversity in seasonal average AOT. The Southern Ocean, though low in AOT, nevertheless also has high diversity. With regard to root mean square error (RMSE), as expected, the ICAP-MME placed first over all models worldwide and was typically first or second in ranking against all models at individual sites. These results are encouraging; furthermore, as more global operational aerosol models come online, we expect their inclusion in a robust operational multi-model ensemble will provide valuable aerosol forecasting guidance. © 2015 Atmos. Chem. Phys.
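
The core consensus idea, an unweighted mean of member AOT forecasts verified by RMSE against AERONET, can be sketched in a few lines. The arrays below are random placeholders and the member list is taken from the abstract; the real system operates on gridded 6-hourly forecast fields, not site-by-time matrices.

```python
# Minimal sketch of a consensus multi-model AOT ensemble and an RMSE ranking.
# Inputs are random placeholders standing in for member forecasts and AERONET obs.
import numpy as np

rng = np.random.default_rng(7)
members = ["ECMWF", "JMA", "GSFC", "NRL/FNMOC"]
n_sites, n_times = 21, 120                     # 21 AERONET sites, forecasts out to 120 h

forecasts = {m: np.abs(rng.normal(0.25, 0.1, (n_sites, n_times))) for m in members}
aeronet_obs = np.abs(rng.normal(0.28, 0.1, (n_sites, n_times)))

# Consensus ensemble: the unweighted mean of the member AOT fields.
icap_mme = np.mean([forecasts[m] for m in members], axis=0)

def rmse(fcst, obs):
    return float(np.sqrt(np.nanmean((fcst - obs) ** 2)))

scores = {m: rmse(f, aeronet_obs) for m, f in forecasts.items()}
scores["ICAP-MME"] = rmse(icap_mme, aeronet_obs)
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name:10s} RMSE = {score:.3f}")
```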


Jassal S.K.,University of California at San Diego | Jassal S.K.,Veterans Administration San Diego Healthcare System | Chonchol M.,University of Colorado at Denver | Laughlin G.A.,University of California at San Diego | And 6 more authors.
American Journal of Cardiology | Year: 2012

Longitudinal studies of the association of estimated glomerular filtration rate (eGFR) and albuminuria with coronary artery calcium (CAC), a measure of cardiovascular disease burden, are few and contradictory. In this study, 421 community-dwelling men and women (mean age 67 years) without known heart disease had eGFR assessed using the Modification of Diet in Renal Disease (MDRD) equation and albuminuria assessed by the urine albumin/creatinine ratio (ACR) from 1997 to 1999. The mean eGFR was 78 ml/min/1.73 m2, and the median ACR was 10 mg/g. CAC was measured using electron-beam computed tomography from 2000 to 2001, when the median total Agatston CAC score was 77; 4.5 years later, 338 participants still without heart disease underwent repeat scans (median CAC score 112); 46% of participants showed CAC progression, defined as an increase of at least 2.5 in the square root-transformed CAC volume score. Cross-sectional and longitudinal logistic regression analyses showed no separate or joint association between eGFR or ACR and CAC severity or progression. In conclusion, this study does not support the use of eGFR or ACR to identify asymptomatic older adults who should be screened for subclinical cardiovascular disease with initial or sequential scanning for CAC. In the elderly, kidney function and CAC may not progress together. © 2012 Elsevier Inc. All rights reserved.
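
As a hedged illustration of the analysis pipeline described, the sketch below computes MDRD eGFR, takes log-transformed ACR, and fits a logistic model of CAC progression. The 4-variable IDMS-traceable MDRD coefficients shown are one common form and may differ from the calibration the study used; the data are synthetic placeholders.

```python
# Hypothetical sketch: MDRD eGFR, urine ACR, and a logistic model of CAC progression.
# MDRD coefficients and all data below are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

def egfr_mdrd(scr_mg_dl, age, female, black):
    """Estimated GFR (mL/min/1.73 m2) from the 4-variable MDRD equation."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    return egfr * (0.742 if female else 1.0) * (1.212 if black else 1.0)

rng = np.random.default_rng(3)
n = 421
age = rng.normal(67, 9, n)
scr = rng.normal(1.0, 0.2, n).clip(0.5, 2.5)              # serum creatinine, mg/dL
female = rng.binomial(1, 0.5, n)
egfr = np.array([egfr_mdrd(s, a, f, False) for s, a, f in zip(scr, age, female)])
log_acr = np.log(rng.lognormal(np.log(10), 0.9, n))       # ACR with median ~10 mg/g
progression = rng.binomial(1, 0.46, n)                    # ~46% progressed

# Logistic regression of CAC progression on eGFR and log(ACR), adjusted for age.
X = sm.add_constant(np.column_stack([egfr, log_acr, age]))
res = sm.Logit(progression, X).fit(disp=0)
print(res.summary(xname=["const", "eGFR", "logACR", "age"]))
```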


Rubin J.I.,National Research Council Italy | Rubin J.I.,U.S. Navy | Reid J.S.,U.S. Navy | Hansen J.A.,U.S. Navy | And 10 more authors.
Atmospheric Chemistry and Physics | Year: 2016

An ensemble-based forecast and data assimilation system has been developed for use in Navy aerosol forecasting. The system makes use of an ensemble of the Navy Aerosol Analysis Prediction System (ENAAPS) at 1 × 1°, combined with an ensemble adjustment Kalman filter from NCAR's Data Assimilation Research Testbed (DART). The base ENAAPS-DART system discussed in this work utilizes the Navy Operational Global Analysis Prediction System (NOGAPS) meteorological ensemble to drive offline NAAPS simulations coupled with the DART ensemble Kalman filter architecture to assimilate bias-corrected MODIS aerosol optical thickness (AOT) retrievals. This work outlines the optimization of the 20-member ensemble system, including consideration of meteorology and source-perturbed ensemble members as well as covariance inflation. Additional tests with 80 meteorological and source members were also performed. An important finding of this work is that an adaptive covariance inflation method, which has not been previously tested for aerosol applications, was found to perform better than a temporally and spatially constant covariance inflation. Problems were identified with the constant inflation in regions with limited observational coverage. The second major finding of this work is that combined meteorology and aerosol source ensembles are superior to either in isolation and that both are necessary to produce a robust system with sufficient spread in the ensemble members as well as realistic correlation fields for spreading observational information. The inclusion of aerosol source ensembles improves correlation fields for large aerosol source regions, such as smoke and dust in Africa, by statistically separating freshly emitted from transported aerosol species. However, the source ensembles have limited efficacy during long-range transport. Conversely, the meteorological ensemble generates sufficient spread at the synoptic scale to enable observational impact through the ensemble data assimilation. The optimized ensemble system was compared to the Navy's current operational aerosol forecasting system, which makes use of NAVDAS-AOD (NRL Atmospheric Variational Data Assimilation System for aerosol optical depth), a 2-D variational data assimilation system. Overall, the two systems had statistically insignificant differences in root-mean-squared error (RMSE), bias, and correlation relative to AERONET-observed AOT. However, the ensemble system is able to better capture sharp gradients in aerosol features compared to the 2DVar system, which has a tendency to smooth out aerosol events. Such skill is not easily observable in bulk metrics. Further, the ENAAPS-DART system will allow for new avenues of model development, such as more efficient lidar and surface station assimilation as well as adaptive source functions. At this early stage of development, the parity with the current variational system is encouraging. © Author(s) 2016.
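
The essential update step, an ensemble Kalman filter with multiplicative covariance inflation, can be illustrated compactly. The sketch below is a simplified stochastic (perturbed-observation) EnKF on a toy three-point AOT state with a constant inflation factor; it is not the DART ensemble adjustment filter or its adaptive inflation scheme, and all numbers are assumptions.

```python
# Simplified illustration of an ensemble Kalman update with multiplicative
# covariance inflation (toy example, not ENAAPS-DART).
import numpy as np

rng = np.random.default_rng(11)
n_ens, n_state = 20, 3
prior = rng.normal(0.30, 0.05, size=(n_ens, n_state))      # prior AOT ensemble

# Multiplicative inflation of the prior perturbations about the ensemble mean.
inflation = 1.10
mean = prior.mean(axis=0)
prior = mean + np.sqrt(inflation) * (prior - mean)

# A single bias-corrected AOT observation of state element 0.
H = np.array([1.0, 0.0, 0.0])
obs, obs_err = 0.42, 0.03

P = np.cov(prior, rowvar=False)                             # sample covariance
K = P @ H / (H @ P @ H + obs_err ** 2)                      # Kalman gain

# Update each member against a perturbed observation.
perturbed_obs = obs + rng.normal(0.0, obs_err, size=n_ens)
posterior = prior + np.outer(perturbed_obs - prior @ H, K)

print("prior mean:    ", prior.mean(axis=0).round(3))
print("posterior mean:", posterior.mean(axis=0).round(3))
```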


Goldberg R.J.,Aurora University | Smits G.,CSC Inc. | Wiseman A.C.,Aurora University
Transplantation | Year: 2010

Background: The degree to which recipient/donor (R/D) size mismatching leads to nephron underdosing and worse kidney allograft survival remains poorly defined, particularly in the setting of preexisting nephron loss such as the expanded criteria donor (ECD). Methods: We performed a retrospective analysis of 69,737 deceased donor transplants, followed by a subset analysis of ECD transplants, using data from the Scientific Registry of Transplant Recipients from 1992 to 2005. Ratios of R/D body surface area (BSA) were used to estimate nephron disparity and segregate pairs. Results: In the entire cohort, severe BSA disparity (R/D BSA ratio > 1.38) was associated with slightly worse 10-year unadjusted graft survival (35% for severe BSA disparity vs. 39% in pairs of comparable size, P<0.0001). In multivariate analysis, BSA disparity was associated with a 15% increased risk of graft loss (hazard ratio 1.15, P<0.0001). Within ECD cohorts, severe BSA disparity was associated with a decrease in 10-year unadjusted graft survival of greater magnitude than in the overall cohort (10% for severe BSA disparity vs. 22% in pairs of comparable size, P<0.0004). On multivariate analysis, severe R/D BSA disparity was associated with worse allograft survival, similar to the entire cohort (hazard ratio 1.18, P=0.04). Conclusions: Recipients receiving kidneys from substantially smaller donors have a statistically higher rate of graft loss that is more pronounced in ECD kidneys. Although severe R/D size disparity is an independent risk factor for graft loss, the magnitude of this risk requires consideration in the context of other risk factors for graft loss and the hazards of dialysis. © 2010 by Lippincott Williams & Wilkins.
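
A small sketch of the R/D BSA ratio screening step is given below. The DuBois & DuBois BSA formula and the example recipient/donor pair are assumptions (the abstract does not name the BSA formula), and the registry Cox models themselves are not reproduced here.

```python
# Illustrative sketch of recipient/donor BSA ratio screening.
# The DuBois formula and the example pair are assumptions for demonstration only.
def bsa_dubois(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m2) by the DuBois & DuBois formula."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def rd_bsa_ratio(recipient: tuple[float, float], donor: tuple[float, float]) -> float:
    """Recipient/donor BSA ratio; larger values mean a relatively smaller donor."""
    return bsa_dubois(*recipient) / bsa_dubois(*donor)

ratio = rd_bsa_ratio(recipient=(183, 95), donor=(158, 52))
print(f"R/D BSA ratio = {ratio:.2f}")
print("severe size disparity" if ratio > 1.38 else "comparable size")
```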


Chandra Katari A.,GENPACT India Pvt. Ltd | Umar Shaik N.,CSC Company | Rao Pasupuleti V.,P.A. College
International Journal of Applied Environmental Sciences | Year: 2013

In this paper, we present a brief empirical application of ARIMA to US inflation data over various time intervals and provide recommendations on the time limit for ARIMA forecasting. Traditional time series models such as ARIMA have proven inadequate for modeling long- or short-range dependence. In this context, we introduce the MCARIMA methodology and its application to fine-tuning ARIMA residuals with a Markov chain, both to improve prediction accuracy and to decide the time interval for better forecasting. © Research India Publications.
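
One plausible reading of the MCARIMA idea is sketched below: fit an ARIMA model, discretize its residuals into states, estimate a Markov transition matrix, and adjust the next forecast by the expected residual of the next state. The ARIMA order, the three-state binning, and the simulated series are assumptions, not the paper's specification.

```python
# Rough sketch of an ARIMA forecast with a Markov-chain residual adjustment.
# Model order, binning, and data are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
series = np.cumsum(rng.normal(0.2, 1.0, 240))        # stand-in for an inflation series

fit = ARIMA(series, order=(1, 1, 1)).fit()
resid = fit.resid[1:]                                 # drop the initial differencing artifact

# Discretize residuals into low / middle / high states by terciles.
edges = np.quantile(resid, [1 / 3, 2 / 3])
states = np.digitize(resid, edges)                    # values in {0, 1, 2}

# Estimate the Markov transition matrix between residual states.
trans = np.zeros((3, 3))
for a, b in zip(states[:-1], states[1:]):
    trans[a, b] += 1
trans /= trans.sum(axis=1, keepdims=True)

# Expected residual one step ahead, given the current state.
state_means = np.array([resid[states == s].mean() for s in range(3)])
expected_resid = trans[states[-1]] @ state_means

base_forecast = fit.forecast(steps=1)[0]
print("ARIMA forecast:   ", round(base_forecast, 3))
print("MC-adjusted value:", round(base_forecast + expected_resid, 3))
```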


Miller D.H.,US Ecology | Kreis Jr. R.G.,US Ecology | Huang W.-C.,U.S. Navy | Xia X.,CSC Corporation
Biological Invasions | Year: 2010

A Lake Michigan Ecosystem Model (LM-Eco) that includes a detailed description of trophic levels and their interactions was developed for Lake Michigan. The LM-Eco model constitutes a first step toward a comprehensive Lake Michigan ecosystem productivity model for investigating ecosystem-level responses and effects within the lower food web of the lake. The effect of the invasive species Bythotrephes longimanus on individual zooplankton species was investigated using extensive field data collected at multiple locations in Lake Michigan during the 1994-1995 Lake Michigan Mass Balance Study. Field data collected at 15 sampling stations over 8 sampling cruises during a 2-year period showed that over 65% of zooplankton species declined when Bythotrephes occurred in the sample. The LM-Eco model was successfully applied to simulate the trends in Bythotrephes and zooplankton abundance observed in the field data. Model simulations allowed examination of interactions between the invader Bythotrephes and native zooplankton groups at a 5 km by 5 km resolution throughout Lake Michigan. The analysis was carried out both as a time series specific to individual field sampling locations within the lake and on a lake-wide scale. © 2010 US Government.
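
The field-data comparison behind the "over 65% of species declined" statement can be illustrated with a simple presence/absence contrast. The sketch below is not the LM-Eco model; the column names and mock abundance data are assumptions.

```python
# Illustrative sketch (not LM-Eco): share of zooplankton taxa whose mean abundance
# is lower in samples containing Bythotrephes. Mock data and names are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
taxa = [f"taxon_{i}" for i in range(12)]
samples = pd.DataFrame(
    {t: rng.poisson(20, 120) for t in taxa}
    | {"bythotrephes_present": rng.binomial(1, 0.4, 120).astype(bool)}
)

with_byt = samples[samples["bythotrephes_present"]]
without_byt = samples[~samples["bythotrephes_present"]]

declining = [t for t in taxa if with_byt[t].mean() < without_byt[t].mean()]
print(f"{100 * len(declining) / len(taxa):.0f}% of taxa lower when Bythotrephes present")
```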


Rohret D.,CSC Inc.
Proceedings of the 10th International Conference on Cyber Warfare and Security, ICCWS 2015 | Year: 2015

The International Council on Systems Engineering's Resilient Systems Working Group defines resiliency as 'the capability of a system with specific characteristics before, during, and after a disruption to absorb the disruption, recover to an acceptable level of performance, and sustain that level for an acceptable period of time' [INCOSE, 2013]. An operational resiliency model describes a measurement process while demonstrating the scope of achieving resiliency through a dynamic process that includes anticipating negative effects, withstanding those effects, recovering, and evolving the network. In order to maintain the command and control (C2) advantage during military operations and throughout cyberspace, the functional resiliency of integrated and operational systems must be quantified to give network defenders and military decision makers a measure of the capability to recover following a significant cyber incident or a catastrophic natural event. To achieve functional resonance, and to vet potential and future threats, technologies competing for network resources must be identified and stressed to determine their role in resiliency and the potential effect they will have on operational systems during an aggressive cyber attack. Through network analysis based on actual adversarial research and case studies, adaptive analysis teams collect the data needed to determine a system's resonance characteristics, specifically the interdependent technologies and processes that can negatively affect a single system or an enterprise network. The traditional role of a vulnerability analysis team is to identify and exploit every vulnerable system or process in order to expose and mitigate weaknesses and thereby create a more viable network. This scope is narrow and confined to a limited range of requirements or technologies based on a similarly narrow set of objectives and goals. Compounding the problem of obtaining an acceptable resilient posture for a specific system or an enterprise network is the IT industry's misconception that resiliency is tantamount to bandwidth rather than a measurement of capability. Network managers attempt to solve poor resiliency by installing more network appliances (redundancy) and adding bandwidth; both are costly and often ineffective. It is paramount that network managers first identify their current resiliency and associated functional resonance issues prior to initiating corrective actions. The intent of this research is to identify current methods of measuring or achieving acceptable resilience for an enterprise network, to identify shortfalls in acquiring accurate and actionable data, and to identify the incorrect application of mitigations that yields little or no resiliency enhancement. The author outlines a process to accurately measure a network's resiliency posture, leading to effective mitigations and enhancements that allow a rapid and cost-effective recovery of functionality.
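
One common way to turn the absorb/recover/sustain cycle into a number is a "resilience triangle" style ratio: delivered performance integrated over a disruption-and-recovery window divided by the nominal performance over the same window. The sketch below illustrates that idea only; it is not the author's measurement process, and the performance curve is an assumed example.

```python
# Illustrative resiliency metric: area under an observed performance curve relative
# to nominal performance over a disruption window. Example curve is an assumption.
import numpy as np

def resilience_ratio(t, performance, nominal=1.0):
    """Area under the observed performance curve divided by the nominal area."""
    return np.trapz(performance, t) / (nominal * (t[-1] - t[0]))

# Hours 0-48: performance drops at the incident (hour 8) and recovers by hour 30.
t = np.arange(0, 49, 1.0)
perf = np.piecewise(
    t,
    [t < 8, (t >= 8) & (t < 30), t >= 30],
    [1.0, lambda x: 0.4 + 0.6 * (x - 8) / 22, 1.0],
)
print(f"resilience ratio over the window: {resilience_ratio(t, perf):.2f}")
```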


Rohret D.,CSC Inc
5th International Conference on Information Warfare and Security, ICIW 2010 | Year: 2010

The effectiveness of Department of Defense (DoD) computer network Red and Blue teaming depends upon accurate trend analysis data, which allows assessment teams to tailor their vulnerability analysis and penetration testing methods and techniques to the specific networks and architectures they will be assessing. Most trend and analysis data are derived primarily from commercial and open-source research that is nonspecific, even generic in nature. Organizations and researchers attempt to track information from very large data sets derived from high-level observations of Internet and intranet traffic, generalizing attack and vulnerability analysis for a wide and varied audience and clientele. This approach to trends research provides high-level statistical data that does not help Red team analysts determine the most effective method of assessing specific systems and networks. Poor delineation of the terms "event" and "incident" compounds the problem in many open-source and commercial trend reports, as the two terms represent different aspects of an attack and are often incorrectly derived, skewing or invalidating the results. Additionally, data may be emphasized or suppressed depending on the reporting organization's goals, products, and/or specialization in network security. Unlike malicious hackers or adversaries, Red and Blue teams providing assessments and vulnerability studies for government and commercial clientele are usually required to accomplish their assessments within a specific time frame and within a fixed budget. Commercial and custom vulnerability trend analysis may take months and requires extensive resources, which may not be available to the assessment teams in a timely manner. These obstacles are further compounded by interdepartmental segregation of duties within government agencies that limits or prohibits uncharted actions based on funding guidelines and/or mission statements. This paper endeavours to identify the issues preventing the identification of timely and specific vulnerability and network attack trend data, and provides an alternative to current trends research methodology for Computer Network Security (CNS) Red and Blue teams assessing specific, non-generic networks and network-centric data systems.
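
The event-versus-incident distinction can be made concrete with a small grouping example: many raw sensor events may belong to a single incident, so counting events inflates apparent attack trends. The grouping rule below (same source and technique within a one-hour window) and the sample records are assumptions for demonstration only.

```python
# Illustration of collapsing raw sensor events into incidents.
# Grouping rule and sample records are illustrative assumptions.
from datetime import datetime, timedelta

events = [
    ("10.0.0.5", "port_scan", datetime(2010, 3, 1, 9, 0)),
    ("10.0.0.5", "port_scan", datetime(2010, 3, 1, 9, 10)),
    ("10.0.0.5", "port_scan", datetime(2010, 3, 1, 9, 40)),
    ("172.16.4.2", "sqli", datetime(2010, 3, 1, 11, 5)),
]

incidents = []
for src, technique, ts in sorted(events, key=lambda e: e[2]):
    last = incidents[-1] if incidents else None
    if last and last[0] == src and last[1] == technique and ts - last[2] <= timedelta(hours=1):
        incidents[-1] = (src, technique, ts)          # extend the existing incident
    else:
        incidents.append((src, technique, ts))        # start a new incident

print(f"{len(events)} events collapse to {len(incidents)} incidents")
```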
