Zarrouati N.,Directorate General of Armaments | Aldea E.,Sysnav | Rouchon P.,MINES ParisTech
Proceedings of the American Control Conference

In this paper, we use known camera motion associated with a video sequence of a static scene to estimate and incrementally refine the surrounding depth field. We exploit the SO(3)-invariance of the brightness and depth field dynamics to adapt standard image processing techniques. Inspired by the Horn-Schunck method, we propose an SO(3)-invariant cost to estimate the depth field. At each time step, this yields a diffusion equation on the unit Riemannian sphere of R^3 that is numerically solved to obtain a real-time depth field estimation of the entire field of view. Two asymptotic observers are derived from the governing equations of the dynamics, based respectively on optical flow and depth estimations; implemented on noisy sequences of synthetic images as well as on real data, they provide a more robust and accurate depth estimation. This approach complements most methods employing state observers for range estimation, which concern only single or isolated feature points. © 2012 AACC (American Automatic Control Council).
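As a rough illustration of the Horn-Schunck-style trade-off between data fidelity and smoothness, a depth-refinement iteration can be sketched as follows. This is a simplifying assumption in several ways: it runs on a flat pixel grid rather than the unit sphere used in the paper, uses periodic boundaries, and takes a per-pixel depth observation as given rather than deriving it from brightness dynamics.

```python
import numpy as np

def refine_depth(z_obs, lam=10.0, iters=200):
    """Horn-Schunck-style refinement: minimize a quadratic data term plus a
    smoothness term via Jacobi iterations (periodic boundaries, for brevity)."""
    z = z_obs.copy()
    for _ in range(iters):
        # 4-neighbour average, the discrete smoothing operator
        avg = 0.25 * (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                      + np.roll(z, 1, 1) + np.roll(z, -1, 1))
        # closed-form Jacobi update for (z - z_obs) + 4*lam*(z - avg) = 0
        z = (z_obs + 4.0 * lam * avg) / (1.0 + 4.0 * lam)
    return z
```

Larger `lam` weights smoothness more heavily, mirroring the regularization weight in variational optical-flow methods.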

Zarrouati N.,Directorate General of Armaments | Hillion M.,Sysnav | Petit N.,MINES ParisTech
Proceedings of the American Control Conference

In this paper we propose a method to estimate the velocity of a rigid body using a novel stereo-vision principle. It is presented and applied in a laboratory test case representative of low-cost navigation for ground vehicles. The method exploits the dynamics of a scalar field obtained by weighting and averaging the brightness perceived by two embedded neighboring cameras. More specifically, the cameras are complemented with a gyrometer to retrieve the curvilinear velocity of the moving rigid body. The proposed method is tested first on synthetic data, then on real data, and shows robustness to poor image quality. Significant levels of noise and blur are tested; in addition, the method does not require high-resolution images, unlike existing methods based on triangulation and tracking of keypoints. © 2012 AACC (American Automatic Control Council).
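The paper's weighted-brightness scalar-field principle is not reproduced here, but the underlying idea of extracting motion from brightness dynamics can be illustrated with a textbook least-squares brightness-constancy estimate of a 1-D translation (an illustrative simplification, not the authors' method):

```python
import numpy as np

def velocity_from_brightness(I0, I1, dt, dx):
    """Estimate a 1-D translation speed from two frames via brightness
    constancy: I_t + v * I_x = 0, solved in least squares over all pixels."""
    It = (I1 - I0) / dt                  # temporal brightness derivative
    Ix = np.gradient(I0, dx, axis=1)     # spatial brightness derivative
    # least-squares solution of Ix * v = -It
    return -np.sum(Ix * It) / np.sum(Ix * Ix)
```

Averaging the constraint over all pixels is what gives such schemes their robustness to per-pixel noise, the same effect the abstract reports for its scalar-field measurements.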

Zarrouati N.,MINES ParisTech | Zarrouati N.,Directorate General of Armaments | Aldea E.,Sysnav | Rouchon P.,MINES ParisTech
Proceedings - International Conference on Pattern Recognition

The objective of our work is to reconstruct the dense structure of a static scene observed by a monocular camera system following a known trajectory. Our main contribution is a TV-L1 energy functional that estimates the unknown depth field directly from the camera motion, thus avoiding the intermediate estimation of an optical flow field with additional geometric constraints. Our method has two main interests: we highlight a practical minimal parametrization for the given assumptions (static scene, known camera motion), and we solve the resulting variational problem using an efficient, discontinuity-preserving formulation. © 2012 ICPR Org Committee.
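For readers unfamiliar with the term, a TV-L1 energy combines a total-variation regularizer (which preserves depth discontinuities) with a robust L1 data term. A minimal sketch of evaluating such an energy, assuming the per-pixel photometric residual is precomputed and using a generic isotropic discretization rather than the paper's exact formulation:

```python
import numpy as np

def tv_l1_energy(z, residual, lam=1.0):
    """TV-L1 energy of a depth field z: total variation plus a weighted L1
    data term. `residual` holds the per-pixel photometric residual for z."""
    gy, gx = np.gradient(z)
    tv = np.sum(np.hypot(gx, gy))          # isotropic total variation
    data = lam * np.sum(np.abs(residual))  # robust L1 data fidelity
    return tv + data
```

Because both terms are non-smooth, such energies are typically minimized with primal-dual schemes rather than plain gradient descent; that is what "efficient, discontinuity-preserving formulation" alludes to.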

Dorveaux E.,MINES ParisTech | Vissiere D.,Sysnav | Petit N.,MINES ParisTech
Proceedings of the 2010 American Control Conference, ACC 2010

We address the problem of calibrating an array of sensors by investigating, theoretically and experimentally, the case of two three-axis sensors. Our focus is on magnetometers that can be used in a low-cost inertial navigation system. Usual errors (misalignments, non-orthogonality, scale factors, biases) are accounted for. The proposed calibration method requires no specific calibration hardware. Instead, we solely use the fact that, if the sensor is properly calibrated, the norm of the sensed field must remain constant irrespective of the sensor's orientation. Several calibration strategies for an array of sensors are described, along with the impact of (unavoidable) field disturbances. Experiments conducted with a pair of magneto-resistive magnetometers and data fusion results illustrate the relevance of the approach. © 2010 AACC.
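The constant-norm idea can be shown in its simplest form: bias-only calibration, where measurements of a constant field must lie on a sphere, fitted by linear least squares. The paper's full error model also handles misalignments, non-orthogonality, and scale factors, which this sketch omits.

```python
import numpy as np

def calibrate_bias(m):
    """Bias-only magnetometer calibration: fit a sphere ||m - b|| = R to the
    measurements m (N x 3). Expanding the sphere equation gives the linear
    model ||m||^2 = 2 m.b + (R^2 - ||b||^2), solved by least squares."""
    A = np.hstack([2.0 * m, np.ones((len(m), 1))])
    y = np.sum(m * m, axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    b, c = sol[:3], sol[3]
    R = np.sqrt(c + b @ b)
    return b, R
```

Note that the fit needs measurements spanning many orientations; rotating the sensor freely in a disturbance-free area plays the role of the "no specific calibration hardware" procedure described in the abstract.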

Bristeau P.-J.,MINES ParisTech | Callou F.,Parrot | Vissiere D.,Sysnav | Petit N.,MINES ParisTech
IFAC Proceedings Volumes (IFAC-PapersOnline)

This paper exposes the navigation and control technology embedded in a recently commercialized micro Unmanned Aerial Vehicle (UAV), the AR.Drone, whose cost and performance are unprecedented among commercial products for mass markets. The system relies on a state-of-the-art indoor navigation system combining low-cost inertial sensors, computer-vision techniques, sonar, and aerodynamic models. © 2011 IFAC.
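The abstract does not disclose the drone's actual filter, but the general principle of fusing low-cost sensors, combining a drifting-but-fast source with a noisy-but-unbiased one, can be illustrated by a single-axis complementary filter (a textbook construction, not the AR.Drone's algorithm):

```python
def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98, angle0=0.0):
    """Fuse a gyro rate (fast but drifting when integrated) with an
    accelerometer-derived angle (noisy but unbiased) for one axis:
    high-pass the integrated gyro, low-pass the accelerometer estimate."""
    angle = angle0
    out = []
    for w, a in zip(gyro_rate, accel_angle):
        angle = alpha * (angle + w * dt) + (1.0 - alpha) * a
        out.append(angle)
    return out
```

With `alpha` near 1 the short-term response follows the gyro, while the accelerometer slowly corrects accumulated drift.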