Mondragon I.F., Computer Vision Group UPM | Olivares-Mendez M.A., Computer Vision Group UPM | Campoy P., Computer Vision Group UPM | Martinez C., Computer Vision Group UPM | Mejias L., Queensland University of Technology
Autonomous Robots | Year: 2010

This paper presents an implementation of an aircraft pose and motion estimator that uses visual systems as the principal sensor for controlling an Unmanned Aerial Vehicle (UAV), or as a redundant system for an Inertial Measurement Unit (IMU) and gyroscopic sensors. First, we explore the application of the unified theory for central catadioptric cameras to attitude and heading estimation, explaining how the skyline is projected onto the catadioptric image and how it is segmented and used to calculate the UAV's attitude. We then use appearance images to obtain a visual compass, from which we calculate the relative rotation and heading of the aerial vehicle. Additionally, we show the use of a stereo system to calculate the aircraft's height and to measure the UAV's motion. Finally, we present a visual tracking system based on fuzzy controllers acting on both the UAV and a camera pan-and-tilt platform. Each part is tested using the UAV COLIBRI platform to validate the different approaches, including comparison of the estimated data with the inertial values measured onboard the helicopter platform and validation of the tracking schemes on real flights. © 2010 Springer Science+Business Media, LLC.
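The visual-compass step described above (recovering relative heading from appearance images) can be sketched as a brute-force search for the horizontal column shift that best aligns two panoramic images, since a pure yaw rotation of an omnidirectional camera corresponds to a circular column shift. This is a minimal illustrative sketch, not the authors' implementation; the function name, image sizes, and sum-of-squared-differences criterion are assumptions.

```python
import numpy as np

def visual_compass(ref, cur):
    """Estimate relative yaw (degrees) between two panoramic appearance
    images of equal shape (H x W), whose W columns span 360 degrees.

    Hypothetical sketch: exhaustively tries every circular column shift
    of `cur` and returns the one minimizing the sum of squared
    differences against `ref`.
    """
    h, w = ref.shape
    best_shift, best_err = 0, np.inf
    for s in range(w):
        # Circularly shift the current image by s columns and compare.
        err = np.sum((np.roll(cur, s, axis=1) - ref) ** 2)
        if err < best_err:
            best_err, best_shift = err, s
    # Convert the best column shift to an angle.
    return best_shift * 360.0 / w
```

For example, an image rotated by nine columns out of 36 should yield a 90-degree estimate. In practice the search can be accelerated with FFT-based correlation, and robustness improved by comparing gradient or intensity histograms per column rather than raw pixels.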
