Long Beach, CA, United States

Welcher J.B.,University of Southern California | Welcher J.B.,Cedars Sinai Medical Center | Welcher J.B.,Biomechanical Research and Testing LLC | Popovich J.M.,University of Southern California | And 2 more authors.
Medical Engineering and Physics | Year: 2011

A new sensor array intended to accurately and directly measure spatial and time-dependent pressures within a highly curved biological intra-articular joint was developed and tested. To evaluate performance of the new sensor array for application within intra-articular joints generally, and specifically to fit within the relatively restrictive space of the lumbar spine facet joint, geometric constraints of length, width, thickness, and sensor spatial resolution were evaluated. Additionally, the effects of sensor array curvature, frequency response, linearity, drift, hysteresis, repeatability, and total system cost were assessed. The new sensor array was approximately 0.6 mm in thickness, was scalable to below the nominal 12 mm wide by 15 mm high lumbar spine facet joint size, and imposed no inherent limitations on the number or spacing of the sensors, with less than 1.7% cross talk between immediately adjacent sensors. No difference was observed in sensor performance down to a radius of curvature of 7 mm, and a 0.66 ± 0.97% change in sensor sensitivity was observed at a radius of 5.5 mm. The sensor array had less than 0.07 dB signal loss up to 5.5 Hz; linearity was 0.58 ± 0.13% full scale (FS); drift was less than 0.2% FS at 250 s and less than 0.6% FS at 700 s; and hysteresis was 0.78 ± 0.18%. Repeatability was excellent, with a coefficient of variation less than 2% at pressures between 0 and 1.000 MPa. Total system cost was relatively small, as standard commercially available data acquisition systems could be utilized with no specialized software, and individual sensors within an array could be replaced as needed. The new sensor array had small and scalable geometry and very acceptable intrinsic performance, including minimal to no alteration in performance at physiologically relevant ranges of joint curvature. © 2011 IPEM.
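The bench metrics reported above (linearity as a percentage of full scale, hysteresis, and repeatability as a coefficient of variation) can be computed from calibration data in a few lines. The sketch below uses simulated readings; the function names and data values are illustrative and not taken from the paper.

```python
import numpy as np

def linearity_pct_fs(pressure, output, full_scale):
    """Max deviation of sensor output from a least-squares line, as % of full scale."""
    slope, intercept = np.polyfit(pressure, output, 1)
    residuals = output - (slope * pressure + intercept)
    return 100.0 * np.max(np.abs(residuals)) / full_scale

def hysteresis_pct(loading, unloading, full_scale):
    """Max separation between loading and unloading curves at matched pressures, as % of full scale."""
    return 100.0 * np.max(np.abs(loading - unloading)) / full_scale

def repeatability_cv(trials):
    """Coefficient of variation (%) across repeated readings at one pressure."""
    trials = np.asarray(trials, dtype=float)
    return 100.0 * trials.std(ddof=1) / trials.mean()

# Hypothetical calibration data for one sensor element
p = np.linspace(0.0, 1.0, 11)            # applied pressure, MPa
out = 2.0 * p + 0.005 * np.sin(6 * p)    # simulated, slightly nonlinear output
fs = 2.0                                 # full-scale output

lin = linearity_pct_fs(p, out, fs)
hyst = hysteresis_pct(2.0 * p, 2.0 * p + 0.01, fs)   # 0.5% FS offset between curves
cv = repeatability_cv([1.98, 2.01, 2.00, 1.99, 2.02])
```

The same functions apply per sensor element of an array, so a cross-talk check reduces to computing these statistics on the neighbors of a loaded element.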

Welcher J.B.,University of Southern California | Welcher J.B.,Cedars Sinai Medical Center | Welcher J.B.,Biomechanical Research and Testing LLC | Popovich Jr. J.M.,University of Southern California | And 4 more authors.
ASME 2010 Summer Bioengineering Conference, SBC 2010 | Year: 2010

A new sensor for spatial and temporal intra-facet pressure measurement was developed and tested. Results show that the new sensor meets or exceeds the design criteria and does not exhibit the curvature effects previously reported as problematic. Additional work is in progress to develop more robust and spatially relevant sensors for use in human lumbar cadaveric specimens. Copyright © 2010 by ASME.

Vandiver W.,01 Civic Center Drive West | Ikram I.,Biomechanical Research and Testing LLC | Randles B.,Biomechanical Research and Testing LLC
SAE Technical Papers | Year: 2013

The accuracy of pre-crash data recorded in an Airbag Control Module (ACM) with Event Data Recorder (EDR) functionality has been studied and quantified for vehicles from several vehicle manufacturers. Most published research has involved vehicles with accessible data that can be downloaded via commercially available crash data retrieval equipment. Some Mitsubishi vehicles, including the 2009 Mitsubishi Lancer GTS, are capable of recording crash data that can be accessed only by the manufacturer. The accuracy of such data becomes important when it is intended to be used as part of a collision analysis. The pre-crash speed data recorded by a 2009 Mitsubishi Lancer were evaluated by generating artificial deployment events while running the vehicle on a 4-wheel dynamometer and simultaneously capturing data through the OBDII port. The tests were run at speeds up to approximately 145 kilometers per hour (90 miles per hour). The data from these tests illustrated the specific characteristics of the recording time and sample rate for the Mitsubishi Lancer ACM (i.e., 2.3 seconds of pre-crash data recorded at 100 millisecond intervals) and showed consistent results. Copyright © 2013 SAE International.
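A comparison like the one described, checking recorded EDR pre-crash speed samples against a reference source such as the OBDII capture, can be sketched as follows. The buffer format follows the abstract (2.3 s of pre-crash data at 100 ms intervals), but the speed values and function names are made up for illustration.

```python
# EDR pre-crash buffer format reported for the Lancer ACM
PRECRASH_WINDOW_S = 2.3
SAMPLE_INTERVAL_S = 0.1
N_SAMPLES = round(PRECRASH_WINDOW_S / SAMPLE_INTERVAL_S)  # 23 samples

def kph_to_mph(kph):
    """Convert kilometers per hour to miles per hour."""
    return kph * 0.621371

def speed_errors_kph(edr_kph, reference_kph):
    """Per-sample difference between recorded EDR speed and reference speed."""
    return [e - r for e, r in zip(edr_kph, reference_kph)]

# Hypothetical dynamometer run at a nominal 145 km/h (about 90 mph)
reference = [145.0] * N_SAMPLES
edr = [144.5] + [145.0] * (N_SAMPLES - 1)  # simulated EDR readings
errors = speed_errors_kph(edr, reference)
```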

Vandiver W.,01 Civic Center Drive West | Anderson R.,Biomechanics Analysis | Ikram I.,Biomechanical Research and Testing LLC | Randles B.,Biomechanical Research and Testing LLC | Furbish C.,Biomechanical Research and Testing LLC
SAE Technical Papers | Year: 2015

The 2012 Kia Soul was manufactured with an Airbag Control Module (ACM) with an Event Data Recorder (EDR) function to record crash-related data. However, 2013 is the first model year supported by the download tool and software manufactured for Kia vehicles and distributed by GIT America, Inc. Even for the same make and model, using the Kia EDR tool to image data from an unsupported model year calls into question whether some or any of the data has been properly translated. By way of example, a method for evaluating the usability of the crash-related data obtained from a 2012 Kia Soul imaged with the 2013 translation settings is presented. Eight vehicle-to-barrier crash tests were conducted with a 2012 Kia Soul. The Kia EDR tool was utilized to retrieve crash data from the vehicle's EDR following each test by choosing the software translation settings for a 2013 Kia Soul. The recorded and translated crash data for those tests were analyzed and compared to on-board instrumentation. The results showed that some recorded data, including vehicle speed, steering input, service brake (on/off), and seat belt status, were reliable, but that engine throttle (%) was not. Copyright © 2015 SAE International.
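A channel-by-channel reliability check of this kind, comparing translated EDR values against on-board instrumentation, can be sketched as below. The channel names, sample values, and tolerances here are illustrative and not taken from the paper.

```python
def channel_reliable(edr_values, instrumented_values, tolerance):
    """Treat a channel as reliable if every EDR sample agrees with the
    instrumented value to within the stated tolerance."""
    return all(abs(e - i) <= tolerance
               for e, i in zip(edr_values, instrumented_values))

# Hypothetical paired samples: (EDR values, instrumented values, tolerance)
checks = {
    "vehicle_speed_kph": ([40.0, 39.5, 38.0], [40.2, 39.6, 38.1], 1.0),
    "throttle_pct":      ([0.0, 0.0, 0.0],    [15.0, 22.0, 30.0], 2.0),
}
results = {name: channel_reliable(e, i, tol)
           for name, (e, i, tol) in checks.items()}
# With these made-up numbers, speed passes and throttle fails, mirroring
# the pattern the study reports for the mistranslated throttle channel.
```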

Suway J.A.,Biomechanical Research and Testing LLC | Welcher J.,Biomechanical Research and Testing LLC
SAE Technical Papers | Year: 2016

When attempting to show how a scene appeared at night, it is extremely important that photographs or video of the scene be depicted accurately. It is widely understood that digital image sensors cannot capture the large dynamic range that can be seen by the human eye. Furthermore, today's commercially available printers, computer monitors, TVs, and other displays cannot reproduce the dynamic range that is captured by digital cameras. Therefore, care must be taken when presenting a photograph or video intended to accurately depict a subject scene. Many parameters can be altered while taking a photograph or video that make a subject scene appear either too bright or too dark, and similar adjustments can be made to a printer or display. Several published papers and studies have dealt with how to properly capture and calibrate photographs and video of a subject scene at night, and most have been qualitative. Some methods use contrast boards or gradients: the individual taking the photograph or video records his or her observations of what can and cannot be seen on the gradient, and the photograph or video is then calibrated so that the image matches what the initial observer could see. One prior method calibrates a CRT monitor, DLP projector, and printer to produce images with similar contrast detection; again, this approach is qualitative. This study presents a quantitative method for calibrating photographs and video for use and display on printers, computer monitors, TVs, and other displays. The method removes potential interpretation bias and provides a scientific approach for determining whether a photograph or video accurately depicts the contrast of the subject scene. This is accomplished by applying a similar approach to two different methods: the first allows for a calibrated image of an object that cannot be seen, and the second allows for a calibrated image of an object that can be seen. In both methods, the contrast in the scene is measured and the image is adjusted to the appropriate contrast. Since there are no previously published methodologies for quantitatively determining if an image accurately represents a scene, this methodology is compared to previous qualitative methods as well as Adrian's visibility model. © Copyright 2016 SAE International.
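One way to make such a comparison quantitative is to measure the contrast of a target against its background in the scene with a luminance meter, measure it again off the displayed or printed image, and adjust until the two agree. The sketch below uses Weber contrast with made-up luminance readings; the abstract does not specify which contrast metric or instrumentation the authors used, so treat this as an assumption.

```python
def weber_contrast(lum_target, lum_background):
    """Weber contrast of a target against its background: (Lt - Lb) / Lb."""
    return (lum_target - lum_background) / lum_background

def contrast_match_error(scene_target, scene_bg, image_target, image_bg):
    """Difference between the contrast measured in the scene (luminance meter)
    and the contrast rendered by the displayed or printed image."""
    return (weber_contrast(image_target, image_bg)
            - weber_contrast(scene_target, scene_bg))

# Hypothetical readings in cd/m^2: scene measured on site at night,
# image measured off the display in the viewing environment.
err = contrast_match_error(scene_target=1.2, scene_bg=1.0,
                           image_target=60.0, image_bg=50.0)
# An error near zero indicates the displayed image reproduces the scene
# contrast even though its absolute luminance is far higher.
```

Because Weber contrast is a ratio, the match can hold across very different absolute luminance levels, which is what lets a bright display stand in for a dark scene.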
