Garcia-Moreno A.-I., Research Center for Applied Science and Advanced Technology | Gonzalez-Barbosa J.-J., Research Center for Applied Science and Advanced Technology | Hurtado-Ramos J.B., Research Center for Applied Science and Advanced Technology | Ornelas-Rodriguez F.-J., Research Center for Applied Science and Advanced Technology
2015 38th International Conference on Telecommunications and Signal Processing, TSP 2015 | Year: 2015

In this paper, we compute the precision with which 3D points, camera orientation, position, and calibration are estimated for a laser rangefinder and a multi-camera system. Both sensors are used to digitize urban environments. Errors in the localization of image features introduce errors in the reconstruction. Some algorithms are numerically unstable, either intrinsically or for particular configurations of points and/or cameras. A practical methodology is presented to predict the error propagation inside the calibration process between both sensors. Performance charts of the error propagation in the intrinsic camera parameters, and of the relationship between the laser and the noise of both sensors, were computed using simulations and an analytical analysis. Results for the calibration, the camera-laser error projection, and the uncertainty analysis are presented for data collected by the mobile terrestrial platform. © 2015 IEEE.
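The abstract mentions both simulation and analytical analysis of how image-feature noise propagates into the intrinsic camera parameters. As a rough illustration of the simulation side only (not the paper's actual procedure), the following Python sketch perturbs synthetic checkerboard detections with Gaussian pixel noise and measures the resulting spread of the estimated focal length; the board geometry, camera poses, noise level, and true intrinsics are all assumed values.

```python
# Monte Carlo sketch of feature-noise propagation into intrinsics.
# All scene parameters below are illustrative assumptions.
import numpy as np
import cv2

# Synthetic planar calibration target (9x6 corners on z = 0, 3 cm squares).
pattern = np.array([[x, y, 0.0] for y in range(6) for x in range(9)],
                   dtype=np.float32) * 0.03

K_true = np.array([[800.0, 0.0, 320.0],
                   [0.0, 800.0, 240.0],
                   [0.0, 0.0, 1.0]])
dist_true = np.zeros(5)

# A few assumed camera poses looking at the board.
rvecs = [np.array([0.1 * i, -0.05 * i, 0.0]) for i in range(1, 6)]
tvecs = [np.array([0.0, 0.0, 0.5 + 0.05 * i]) for i in range(1, 6)]

focals = []
rng = np.random.default_rng(0)
for _ in range(100):  # Monte Carlo trials
    img_pts, obj_pts = [], []
    for r, t in zip(rvecs, tvecs):
        proj, _ = cv2.projectPoints(pattern, r, t, K_true, dist_true)
        proj = proj + rng.normal(0.0, 0.5, proj.shape)  # 0.5 px noise
        img_pts.append(proj.astype(np.float32))
        obj_pts.append(pattern)
    _, K, _, _, _ = cv2.calibrateCamera(obj_pts, img_pts, (640, 480),
                                        None, None)
    focals.append(K[0, 0])

print("focal length: mean %.2f px, std %.2f px"
      % (np.mean(focals), np.std(focals)))
```

The standard deviation of the recovered focal length over the trials is one empirical measure of how pixel-level feature noise propagates into an intrinsic parameter.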


Garcia-Moreno A.-I., Research Center for Applied Science and Advanced Technology | Hernandez-Garcia D.-E., Research Center for Applied Science and Advanced Technology | Hernandez-Garcia D.-E., National Polytechnic Institute of Mexico | Gonzalez-Barbosa J.-J., Research Center for Applied Science and Advanced Technology | And 7 more authors.
Robotics and Autonomous Systems | Year: 2014

In this work we present an in-situ method to compute the calibration of two sensors, a LIDAR (Light Detection and Ranging) and a spherical camera. Both sensors are used in urban environment reconstruction tasks. In this scenario, the speed at which the various sensors acquire and merge information is very important; however, reconstruction accuracy, which depends on the sensors' calibration, is equally relevant. Here, a new calibration pattern, visible to both sensors, is proposed. By this means, the correspondence between each laser point and its position in the camera image is obtained, so that the texture and color of each LIDAR point can be known. Experimental results for the calibration and uncertainty analysis are presented for data collected by the platform, which integrates a LIDAR and a spherical camera. © 2014 Elsevier B.V. All rights reserved.
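Given the laser-to-camera rigid transform produced by such a calibration, attaching color to each LIDAR point reduces to projecting the point into the spherical image and sampling the pixel. The sketch below assumes an equirectangular panorama and illustrative values for R, t, and the point cloud; it is a generic projection, not the authors' implementation.

```python
# Hedged sketch: color LIDAR points from an equirectangular panorama
# using an assumed LIDAR-to-camera rigid transform (R, t).
import numpy as np

def colorize(points_lidar, panorama, R, t):
    """Attach an RGB color to every LIDAR point.

    points_lidar : (N, 3) points in the LIDAR frame
    panorama     : (H, W, 3) equirectangular image
    R, t         : rotation (3, 3) and translation (3,), LIDAR -> camera
    """
    H, W, _ = panorama.shape
    p = points_lidar @ R.T + t                    # camera-frame points
    r = np.maximum(np.linalg.norm(p, axis=1), 1e-9)
    lon = np.arctan2(p[:, 0], p[:, 2])            # azimuth in (-pi, pi]
    lat = np.arcsin(np.clip(p[:, 1] / r, -1.0, 1.0))
    u = ((lon / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = ((lat / np.pi + 0.5) * H).astype(int).clip(0, H - 1)
    return panorama[v, u]                         # (N, 3) colors

# Toy usage with random data (stand-ins for a real scan and panorama).
rng = np.random.default_rng(1)
cloud = rng.normal(size=(1000, 3)) * 5.0
image = rng.integers(0, 255, size=(512, 1024, 3), dtype=np.uint8)
colors = colorize(cloud, image, np.eye(3), np.zeros(3))
print(colors.shape)  # (1000, 3)
```

With the calibration in hand, this per-point lookup is what lets each LIDAR return carry the texture and color of the corresponding image pixel, as the abstract describes.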
