Applied Research Assoc. Inc.

Raleigh, NC, United States

Kazmee H., Applied Research Assoc. Inc. | Tutumluer E., University of Illinois at Urbana-Champaign | Beshears S., University of Illinois at Springfield
Transportation Research Record | Year: 2017

Quality assurance is extremely important for satisfactory end performance of a constructed pavement. Traditional quality control and quality assurance (QC/QA) procedures based on volumetric and surface property checks are becoming outdated for constructing pavement foundation layers and ensuring pavement longevity. Recent emphasis in QC/QA procedures has shifted from a density-based approach to stiffness- and strength-based approaches using newly adopted advanced technologies. However, the need for QC/QA is often overlooked in the construction of low-volume roads with unbound aggregate layers, which may be built with recycled or out-of-specification materials of marginal quality; both are currently common sustainable practices. This paper summarizes key findings from QC/QA tests performed on full-scale pavement test sections in a recent research study conducted by the Illinois Center for Transportation. The tests focused on validating, through accelerated pavement testing, material specifications that the Illinois Department of Transportation has recently adopted for large-size unconventional aggregates known as aggregate subgrade. Seven representative aggregate types were used to construct test sections with aggregate subgrade and with virgin and recycled capping and subbase layers. Density measurements from a nuclear gauge were collected from the constructed layers and routinely compared with modulus results from the lightweight deflectometer and soil stiffness gauge (GeoGauge). Forensic strength assessment was then carried out with a dynamic cone penetrometer and a variable-energy PANDA penetration device. Geoendoscopic imaging, coring, and trenching were also conducted to identify the depth of the water table and the as-constructed layer thicknesses. The PANDA penetrometer results, in conjunction with geoendoscopy, proved effective in correlating rutting performance to QC/QA test results.


Bhattacharya B., Applied Research Assoc. Inc. | Gotlif A., Applied Research Assoc. Inc. | Darter M., Applied Research Assoc. Inc.
Transportation Research Record | Year: 2017

This paper describes the implementation of the bonded concrete overlay of asphalt pavement mechanistic-empirical (BCOA-ME) design procedure, developed by the University of Pittsburgh, into the AASHTOWare Pavement ME Design software. The BCOA-ME procedure was generally compatible with the mechanistic-empirical framework of Pavement ME and thus adaptable into the design framework. A thin bonded concrete overlay of existing asphalt pavement uses short to medium joint spacings (typically 6 × 6 ft) and relies on a strong bond, or high contact friction, between the portland cement concrete slab and the existing asphalt concrete surface. As much of the theory, concepts, assumptions, and inputs of the BCOA-ME design procedure as possible was implemented in the Pavement ME software. Differences included those required to match the computational procedures of Pavement ME (e.g., axle load spectra versus equivalent single-axle loads, monthly asphalt concrete damaged dynamic modulus, monthly portland cement concrete strength and modulus, and monthly unbound material resilient modulus). Longitudinal joint spacings ranging from 5 to 8 ft (not less than 5 ft) were included. Longitudinal fatigue cracking, which initiates at the bottom of the slab, was considered directly, as in the BCOA-ME. The calibration of the longitudinal cracking transfer function produced excellent goodness-of-fit statistics with no significant bias. The new procedure was incorporated into Pavement ME Version 2.3, released in July 2016.


Lee H.S., Applied Research Assoc. Inc. | Ayyala D., Applied Research Assoc. Inc. | Von Quintus H., Applied Research Assoc. Inc.
Transportation Research Record | Year: 2017

In this study, dynamic backcalculation was conducted on two selected Long-Term Pavement Performance (LTPP) sections, one from a cold climate zone and the other from a warmer climate zone. It was demonstrated not only that the viscoelastic dynamic modulus (|E*|) of the asphalt can be backcalculated from an extensive set of field data but also that the master curve can be constructed successfully from the backcalculated data. It was also found that the backcalculated master curves differed significantly from those constructed with the LTPP |E*| coefficients. Furthermore, the backcalculated dynamic modulus was obtained from the same pavement sections after 12 to 14 years of service and compared with the initial baseline modulus.
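The abstract does not reproduce the master-curve form the authors used; as an illustration only, the sigmoidal function common in mechanistic-empirical practice can be fit to backcalculated moduli. The modulus and reduced-time values below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(log_tr, delta, alpha, beta, gamma):
    """Sigmoidal master curve commonly used in M-E practice:
    log10|E*| = delta + alpha / (1 + exp(beta + gamma * log10(t_r)))."""
    return delta + alpha / (1.0 + np.exp(beta + gamma * log_tr))

# Hypothetical backcalculated moduli (ksi) at reduced times t_r (s)
log_tr = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
log_E = np.log10([3000.0, 2000.0, 800.0, 200.0, 60.0])

# Fit the four sigmoid coefficients to the (log t_r, log |E*|) pairs
params, _ = curve_fit(sigmoid, log_tr, log_E, p0=[1.8, 1.7, 0.0, 0.5])
delta, alpha, beta, gamma = params
# delta is the lower asymptote, delta + alpha the upper (log10 ksi)
print(delta, delta + alpha)
```

A backcalculated curve fit this way can then be compared term-by-term with one built from published |E*| coefficients, which is the comparison the study reports.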


Hanes J., Applied Research Assoc. Inc. | Wiegand R.P., University of Central Florida
Proceedings - Annual Reliability and Maintainability Symposium | Year: 2017

Fault Tree Analysis (FTA) is used extensively to evaluate the logical dependency of a system on its constituent components. Fault trees (FTs) can be used to identify and correct weaknesses in a design before a system goes to production. Effective methods have been developed over the course of several decades for finding minimal cut sets (MCS). Cut sets identify combinations of component failures that cause the system to fail. Other methods focus on probability risk assessment, in which component failure probabilities are evaluated to determine which failure events are most probable under normal operating conditions. However, traditional FTs do not contain information about the physical location of the components that make up the system. Thus, they cannot identify vulnerabilities induced by the proximity relationships of those components. Components that are sufficiently close to each other could be defeated by a single event with a large enough radius of effect. Events such as the Deepwater Horizon explosion and subsequent oil spill demonstrate the potentially devastating risk posed by such vulnerabilities. Adding positional information to the logical information contained in the FT can capture proximity relationships that constitute vulnerabilities in the overall system but are not contained in the logical structure alone. Thus, existing FTA methods cannot address these concerns. Making use of the positional information would require extensions to existing solution methods or possibly new methods altogether. In practice, fault trees can grow very large, exceeding one thousand components for a large system, which causes a combinatorial explosion in the number of possible solutions. Traditional methods cope with this problem by limiting the number of solutions; generally this is an acceptable limitation since those methods will find the most likely events capable of defeating the fault tree. 
However, adding more information to the tree and searching for different criteria (such as conditional probabilities) can render that trade-off invalid, which motivates the search for alternate means to find vulnerabilities in the system. Candidate methods for this type of problem should be able to find 'hot spots' in the physical space of very large real-world systems where a destructive event would damage multiple components and cause the overall system to fail. In the present research, a test set of medium-to-large fault tree systems was generated using Lindenmayer systems. These systems vary in size from tens of components to over a thousand and vary in complexity as measured by the proportion of operator types and the size of minimal cut sets. Two solution approaches that use graph clustering to integrate positional information with FT solutions were explored in this research as an initial attempt to solve spatially constrained fault trees. These methods were applied to the set of test fault trees to evaluate their performance in finding solutions to this type of problem. The first method uses xfta, a freely available FT solver from OpenPSA, to find minimal cut sets, then performs k-means clustering on the resulting cut sets to determine whether a spatial vulnerability exists. This method works well for smaller fault trees for which all minimal cut sets can be determined. However, for large, complex fault trees, there remains the possibility that crucial vulnerabilities are not identified, since the overall proportion of MCS that can be evaluated in practical time can be less than one in a million. The second method performs modified k-means clustering on the entire set of components to find groups of spatially related components, then feeds the groups into a fault tree evaluator. This method also works, though not very effectively, for smaller fault trees or when the radius of effect is large relative to the physical space.
Neither method provides a deterministic means to solve large, complex fault trees, leaving open the question of whether better methods exist for this type of problem. The combinatorial explosion, combined with the addition of positional information, increases the difficulty of finding solutions in the search space. This research is presented in the hope of stimulating interest in the research community in finding better methods of locating and correcting vulnerabilities using fault trees with location information. © 2017 IEEE.
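As an illustration of the spatial-vulnerability idea only (not the authors' k-means methods), a minimal sketch can flag a minimal cut set whose components could all be reached by one event of a given radius of effect. The component names, positions, and cut sets below are hypothetical:

```python
from itertools import combinations
from math import dist

# Hypothetical component layout: id -> (x, y) position in meters
positions = {
    "pump_a": (0.0, 0.0), "pump_b": (1.0, 0.5),
    "valve_1": (0.8, 0.2), "gen_1": (40.0, 35.0),
}

# Hypothetical minimal cut sets, as an FT solver such as xfta might return
minimal_cut_sets = [
    {"pump_a", "pump_b", "valve_1"},   # spatially tight group
    {"pump_a", "gen_1"},               # widely separated components
]

def spatially_vulnerable(cut_set, radius):
    """A cut set is a spatial vulnerability if one event with the given
    radius of effect can reach every component in it, i.e. all pairwise
    distances are at most 2 * radius (a simple diameter check)."""
    pts = [positions[c] for c in cut_set]
    return all(dist(p, q) <= 2 * radius for p, q in combinations(pts, 2))

hot_spots = [cs for cs in minimal_cut_sets if spatially_vulnerable(cs, radius=2.0)]
print(hot_spots)  # only the tight pump/valve group survives the check
```

This brute-force check shares the limitation the paper notes for its first method: it only finds hot spots among cut sets the solver actually enumerated.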


Salzar R.S., University of Virginia | Treichler D., Advanced Technology and Research Corporation | Wardlaw A., Advanced Technology and Research Corporation | Weiss G., Applied Research Assoc. Inc. | Goeller J., Advanced Technology and Research Corporation
Journal of Neurotrauma | Year: 2017

The potential for blast-induced traumatic brain injury through localized cavitation of the cerebrospinal fluid (CSF) is investigated. While the mechanism and criteria for non-impact blast-induced traumatic brain injury are still unknown, this study demonstrates that local cavitation in the CSF layer of the cranial volume could contribute to these injuries. The cranial contents of three post-mortem human subject (PMHS) heads were replaced with a normal saline solution and with a ballistic gel mixture incorporating a simulated CSF layer. Each was instrumented with multiple pressure transducers and placed inside identical shock tubes at two different research facilities. The sensor data indicate that cavitation may have occurred in the PMHS models at pressure levels below those for a 50% risk of blast lung injury. This study points to skull flexion, the result of the shock wave acting on the front of the skull and leading to a negative pressure in the contrecoup, as a possible mechanism contributing to the onset of cavitation. Based on observation of intracranial pressure transducer data from the PMHS model, cavitation onset is thought to occur from approximately a 140 kPa head-on incident blast. © 2017, Mary Ann Liebert, Inc.


Young L.A., Physical Sciences, Inc. | Rule G.T., Physical Sciences, Inc. | Bocchieri R.T., Applied Research Assoc. Inc. | Burns J.M., Physical Sciences, Inc.
Seminars in Neurology | Year: 2015

Despite years of effort to prevent traumatic brain injuries (TBIs), the occurrence of TBI in the United States alone has reached epidemic proportions. When an external force is applied to the head, it is converted into stresses that must be absorbed into the brain or redirected by a helmet or other protective equipment. Complex interactions of head, neck, and jaw kinematics result in strains in the brain. Even relatively mild mechanical trauma to these tissues can initiate a neurochemical cascade that leads to TBI. Civilians and warfighters can experience head injuries in both combat and noncombat situations from a variety of threats, including ballistic and blunt impact, acceleration, and blast. It is critical to understand the physics created by these threats to develop meaningful improvements to clinical care, injury prevention, and mitigation. Here the authors review the current state of understanding of the complex loading conditions that lead to TBI and characterize how these loads are transmitted through the soft tissue and skull and into the brain, resulting in TBI. In addition, gaps in knowledge and injury thresholds are reviewed, as these must be addressed to better design strategies that reduce TBI incidence and severity. © Georg Thieme Verlag KG Stuttgart New York.


Nemeth C.P., Applied Research Assoc. Inc. | Herrera I., SINTEF
Reliability Engineering and System Safety | Year: 2015

Resilience Engineering (RE) has developed theories, methods, and tools to deliberately manage the adaptive ability of organizations so that they function effectively and safely. As the first peer-reviewed journal publication in the field, this special issue has three purposes: to provide the scientific and industrial communities with the opportunity to present current work in RE, to critically view RE's progress and contributions to research and practice, and to pose questions to stimulate thinking about RE's future. We propose three values for the RE field of practice: observation, analysis, and design and development. The special issue's content and viewpoints are not intended to provide conclusive answers, but rather to stimulate further inquiry and growth. © 2015 Elsevier Ltd. All rights reserved.


Vickery P.J., Applied Research Assoc. Inc.
Proceedings of the Annual Offshore Technology Conference | Year: 2014

This effort critically examines the suitability of the atmospheric turbulence models in the draft API RP 2MET for describing the characteristics of hurricane winds offshore, using data collected from recent (post-2000) hurricanes. The preliminary results indicate that the current provisions are not suitable for describing hurricane winds, and hence the wind loads associated with hurricanes. Using wind speed measurements obtained from Gulf of Mexico hurricanes, the models for mean velocity profiles, gust factors, and turbulence intensities given in the draft API RP 2MET were tested against data from the following sources:
1. Vertical variation of wind speed with height from NOAA dropwindsonde data
2. Gust factor data from NOAA data buoys and C-MAN stations
3. Time series of wind speeds from a number of production facilities located in the Gulf of Mexico
The investigation found that the provisions of API RP 2MET do not provide a good representation of the structure of the hurricane boundary layer. A major reason is that the models were developed using North Sea data, coupled with an assumption that the surface drag coefficient increases with wind speed. The net result is that at high wind speeds the API RP 2MET relations yield a mean velocity profile that increases too rapidly with height, overestimating wind speeds at typical deck height. For typical design wind speeds for structures located in the Gulf of Mexico, the error associated with the API RP 2MET representation of the gust factor varies with height, overestimating the gust factor near the surface (10 m elevation or less) and underestimating it at higher elevations. Copyright 2014, Offshore Technology Conference.
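The API RP 2MET relations themselves are not reproduced in the abstract. As an illustration only, a generic logarithmic mean-wind profile shows how the assumed surface drag coefficient steepens the profile and raises deck-height wind estimates, which is the effect the paper describes; the drag coefficients and wind speed below are illustrative, not values from the standard:

```python
import math

KAPPA = 0.4  # von Karman constant

def mean_wind_profile(u10, z, cd):
    """Logarithmic mean-wind profile U(z) = (u*/kappa) * ln(z/z0),
    with friction velocity u* = sqrt(cd) * u10 and roughness length z0
    back-solved so the profile passes through U(10 m) = u10 exactly."""
    u_star = math.sqrt(cd) * u10
    z0 = 10.0 * math.exp(-KAPPA * u10 / u_star)
    return (u_star / KAPPA) * math.log(z / z0)

# A drag coefficient that grows with wind speed (the assumption the
# paper questions) steepens the profile above the reference height.
u10 = 40.0  # m/s mean wind at 10 m
for cd in (1.0e-3, 3.0e-3):  # low vs. elevated drag coefficient
    print(cd, round(mean_wind_profile(u10, 70.0, cd), 1))
```

With the higher drag coefficient, the 70 m wind estimate is several m/s larger for the same 10 m wind, illustrating how the drag assumption drives overestimates at typical deck heights.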


Hein D.K., Applied Research Assoc. Inc.
T and DI Congress 2014: Planes, Trains, and Automobiles - Proceedings of the 2nd Transportation and Development Institute Congress | Year: 2014

While many agencies responsible for the management of airport infrastructure are working towards holistic infrastructure management, there is currently no standard in place in Canada and the United States. The International Organization for Standardization (ISO) 55000 series of standards is currently under development; it uses the British Standards Institution's Publicly Available Specification PAS 55 as a foundation and is due for completion in late 2013. In 2011, the Transportation Research Board (TRB) Airport Cooperative Research Program (ACRP) sponsored a study to develop a guidebook and primer for airport asset managers in Canada and the United States. This paper outlines the results of asset management best practices gleaned from surveys and interviews of over 50 airports of various sizes across North America. The paper outlines a 10-step process for successful asset management implementation and provides details on policy, objectives, strategies, and plans for implementing an asset management framework. Specific best practices are described and highlighted along with the keys to their successful implementation. © 2014 American Society of Civil Engineers.


Wilke P.W., Applied Research Assoc. Inc.
T and DI Congress 2014: Planes, Trains, and Automobiles - Proceedings of the 2nd Transportation and Development Institute Congress | Year: 2014

The Rolling Wheel Deflectometer (RWD) is an innovative device developed for the efficient, high-speed measurement of pavement structural response over a broad network of roads. It uses a series of lasers mounted beneath the bed of a custom-built 16-meter semi-trailer to measure a continuous profile of pavement deflections produced by the trailer's 8,164 kilogram (kg) single-axle load. The RWD has recently moved from a research prototype to a production tool used for network-level pavement structural evaluation. This paper presents the results of a study that evaluated the structural capacity of 463 kilometers (km) of the Pennsylvania Department of Transportation's (PennDOT's) local and arterial roads using the RWD and a Falling Weight Deflectometer (FWD), and compared the results to the estimated structural capacity stored in PennDOT's Roadway Management System (RMS). The structural capacity determinations from RWD data were based on a methodology developed by the Asphalt Institute (AI) that correlated deflections from a Benkelman Beam testing device to remaining pavement life. The structural capacity estimates from the FWD were based on conventional techniques described in the 1993 AASHTO Pavement Design Guide. The RMS estimates are based on pavement composition and age data. The results of the study indicated a good correlation among the estimates of remaining pavement life from the three methods, although significant scatter was observed in the data. All three methods clearly distinguished among the structural capacities of three groups of roads with known differences in pavement strength. The study verifies that the RWD is a useful tool for network-level pavement evaluation for planning purposes. © 2014 American Society of Civil Engineers.
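For readers unfamiliar with the 1993 AASHTO approach mentioned above, the Guide's effective structural number is computed from the total pavement thickness D (inches) and the backcalculated effective pavement modulus Ep (psi) as SN_eff = 0.0045 · D · Ep^(1/3). A minimal sketch, with hypothetical input values:

```python
def effective_structural_number(thickness_in, ep_psi):
    """Effective structural number per the 1993 AASHTO Guide:
    SN_eff = 0.0045 * D * Ep^(1/3), with D in inches and Ep in psi,
    where Ep is backcalculated from FWD center deflections."""
    return 0.0045 * thickness_in * ep_psi ** (1.0 / 3.0)

# Hypothetical FWD-derived values for a 12-inch pavement structure
print(round(effective_structural_number(12.0, 75_000.0), 2))
```

Comparing SN_eff against the structural number required for anticipated traffic gives the remaining-capacity estimate that the study compared with the RWD- and RMS-based values.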
