Deusdado P.,IntRoSys S.A. | Pinto E.,New University of Lisbon | Guedes M.,IntRoSys S.A. | Marques F.,New University of Lisbon | And 12 more authors.
Advances in Intelligent Systems and Computing | Year: 2016

This paper presents an aerial-ground field robotic team, designed to collect and transport soil and biota samples in estuarine mudflats. The robotic system has been devised so that its sampling and storage capabilities are suited for radionuclide and heavy metal environmental monitoring. Automating these time-consuming and physically demanding tasks is expected to positively impact both their scope and frequency. The success of an environmental monitoring study heavily depends on the statistical significance and accuracy of the sampling procedures, which most often require frequent human intervention. The bird's-eye view provided by the aerial vehicle aims at supporting remote mission specification and execution monitoring. This paper also proposes a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Preliminary field trials in real estuarine mudflats show the ability of the robotic system to successfully extract and transport soil samples for offline analysis. © Springer International Publishing Switzerland 2016.
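
Purely as an illustration of what remote mission specification could look like for such a system (the abstract does not prescribe any representation), the sketch below models a sampling mission as a list of geo-referenced sample points with a hypothetical status field that the ground robot could update as samples are collected; all names and fields are assumptions.

# Illustrative only: a possible data structure for a mudflat sampling mission.
# Nothing in the abstract prescribes this representation; field names,
# statuses and coordinates are assumptions.
from dataclasses import dataclass, field

@dataclass
class SamplePoint:
    lat: float
    lon: float
    depth_cm: float           # target coring depth for the soil sample
    status: str = "pending"   # pending -> collected -> stored

@dataclass
class SamplingMission:
    points: list = field(default_factory=list)

    def next_pending(self):
        # Next sample point the ground robot should visit, if any.
        return next((p for p in self.points if p.status == "pending"), None)

    def mark_collected(self, point):
        point.status = "collected"

mission = SamplingMission([SamplePoint(38.57, -9.03, depth_cm=10.0),
                           SamplePoint(38.58, -9.02, depth_cm=10.0)])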


Lourenco A.,New University of Lisbon | Marques F.,New University of Lisbon | Santana P.,ISCTE - University Institute of Lisbon (ISCTE-IUL) | Santana P.,Telecommunications Institute of Portugal | Barata J.,New University of Lisbon
2014 IEEE International Conference on Robotics and Biomimetics, IEEE ROBIO 2014 | Year: 2014

This paper presents a method for 3-D obstacle detection on autonomous vehicles navigating in vegetated environments. At its core, three different methods that process the surrounding occupancy, applied at separate stages and volumetric resolutions, are combined into a reliable and broad solution. Geometric relationships are evaluated on a coarse, yet robust, volumetric representation to form an initial assessment of obstacles. Then, a more careful evaluation takes place, at finer resolutions, to determine which obstacles are part of the scene's vegetation and thus not real obstacles. Field experiments validate the method's applicability on two different autonomous vehicles: a water surface robot and a terrestrial four-wheeled one. © 2014 IEEE.
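
As a rough illustration of this coarse-to-fine idea (not the paper's actual implementation), the sketch below flags coarse cells with a large height span as candidate obstacles and then uses point density at a finer voxel resolution as a stand-in test for penetrable vegetation; all grid sizes and thresholds are assumptions.

# Minimal coarse-to-fine obstacle test over a 3-D point cloud (numpy only).
# Illustrative sketch, not the paper's pipeline; grid sizes, thresholds and
# the density-based vegetation test are assumptions.
import numpy as np

def detect_obstacles(points, coarse=0.5, fine=0.1,
                     height_thresh=0.3, density_thresh=20):
    """points: (N, 3) array of x, y, z in metres, z pointing up."""
    obstacles = []
    # Stage 1: coarse 2-D cells; a large height span marks a candidate obstacle.
    cells = np.floor(points[:, :2] / coarse).astype(int)
    for cell in np.unique(cells, axis=0):
        in_cell = points[np.all(cells == cell, axis=1)]
        if np.ptp(in_cell[:, 2]) < height_thresh:
            continue  # ground-like cell, ignore
        # Stage 2: finer voxels inside the candidate; sparse, scattered returns
        # are treated as penetrable vegetation, dense clusters as solid obstacles.
        voxels = np.unique(np.floor(in_cell / fine).astype(int), axis=0)
        density = len(in_cell) / len(voxels)   # points per occupied fine voxel
        if density > density_thresh:
            obstacles.append(cell * coarse)    # keep the cell origin as obstacle
    return np.array(obstacles)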


Silva J.,New University of Lisbon | Mendonca R.,New University of Lisbon | Marques F.,New University of Lisbon | Rodrigues P.,New University of Lisbon | And 3 more authors.
2014 IEEE International Conference on Robotics and Biomimetics, IEEE ROBIO 2014 | Year: 2014

This paper presents a method for vision-based landing of a multirotor unmanned aerial vehicle (UAV) on an autonomous surface vehicle (ASV) equipped with a helipad. The method includes a behavioural search mechanism for when the helipad is outside the UAV's field of view, a learning saliency-based mechanism for visually tracking the helipad, and a cooperative strategy for the final vision-based landing phase. Learning how to track the helipad from above occurs during takeoff, and cooperation results from having the ASV track the UAV to assist its landing. A set of experimental results with both simulated and physical robots shows the feasibility of the presented method. © 2014 IEEE.
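
The control flow described above can be pictured as a small state machine; the sketch below is only illustrative, and the uav/asv objects and their methods (fly_search_pattern, track_helipad_saliency, and so on) are hypothetical interfaces, not the authors' API.

# Illustrative state machine for the UAV side of the landing behaviour:
# search -> track -> cooperative descent. All robot interfaces are placeholders.
from enum import Enum, auto

class Phase(Enum):
    SEARCH = auto()   # helipad outside the field of view
    TRACK = auto()    # saliency-based visual tracking of the helipad
    DESCEND = auto()  # final vision-based landing, assisted by the ASV

def landing_step(phase, uav, asv):
    if phase is Phase.SEARCH:
        uav.fly_search_pattern()               # behavioural search
        return Phase.TRACK if uav.helipad_visible() else Phase.SEARCH
    if phase is Phase.TRACK:
        offset = uav.track_helipad_saliency()  # model learnt during takeoff
        uav.center_over(offset)
        return Phase.DESCEND if uav.centered() else Phase.TRACK
    # DESCEND: the ASV also tracks the UAV to assist the final approach.
    asv.track_uav(uav.pose())
    uav.descend()
    return Phase.DESCEND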


Pinto E.,New University of Lisbon | Deusdado P.,IntRoSys S.A. | Marques F.,New University of Lisbon | Lourenco A.,New University of Lisbon | And 4 more authors.
ISMA 2015 - 10th International Symposium on Mechatronics and its Applications | Year: 2015

This paper presents a multi-core processing solution for ROS-based service robots. Power management, together with the control and availability of the processing resources, is supervised by a custom-made Power Management Board (PMB) based on a Digital Signal Processor (DSP) microcontroller, implementing a Health and Usage Monitoring System (HUMS). The proposed architecture also allows the PMB to control the most critical robot functions under low-battery conditions or when energy harvesting is not possible, thus extending the lifespan of the robot. All PMB data is recorded on an SD card so as to allow offline analysis of the robotic mission and, thus, support subsequent maintenance activities. Two different implementations of the proposed system have been fielded in two Multi-Robot Systems (MRS) for environmental monitoring, covering aerial, water surface, and wheeled ground vehicles. An additional implementation of the architecture is currently being deployed on an industrial autonomous logistics robot. These three implementations are presented and discussed. © 2015 IEEE.
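
As a loose illustration of such a HUMS-style logging loop (sketched in Python rather than on the DSP-based PMB), the code below samples a stubbed battery voltage, appends readings to a log file standing in for the SD card, and drops into a placeholder safe mode on low battery; the threshold, file name and sensor call are assumptions.

# Sketch of a HUMS-style monitoring loop: periodic sampling, persistent
# logging for offline analysis, and a low-battery fallback. Illustrative only.
import csv, time, random

LOW_BATTERY_V = 11.0          # assumed cutoff for the battery pack
LOG_PATH = "hums_log.csv"     # stands in for the SD-card log file

def read_battery_voltage():
    return 11.5 + random.uniform(-0.6, 0.6)   # stub sensor reading

def hums_loop(cycles=5, period_s=1.0):
    with open(LOG_PATH, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(cycles):
            v = read_battery_voltage()
            writer.writerow([time.time(), round(v, 2)])
            if v < LOW_BATTERY_V:
                # The PMB would take over the most critical functions here;
                # a print stands in for that placeholder action.
                print("low battery: entering safe mode")
                break
            time.sleep(period_s)

if __name__ == "__main__":
    hums_loop()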


Gomes P.,New University of Lisbon | Santana P.,ISCTE - University Institute of Lisbon (ISCTE-IUL) | Barata J.,New University of Lisbon
International Journal of Advanced Robotic Systems | Year: 2014

This paper presents a vision-based method for fire detection from fixed surveillance smart cameras. The method integrates several well-known techniques, properly adapted to cope with the challenges related to the actual deployment of the vision system. Concretely, background subtraction is performed with a context-based learning mechanism so as to attain higher accuracy and robustness. The computational cost of a frequency analysis of potential fire regions is reduced by focusing its operation with an attentive mechanism. For fast discrimination between fire regions and fire-coloured moving objects, a new colour-based model of fire's appearance and a new wavelet-based model of fire's frequency signature are proposed. To reduce the false alarm rate due to the presence of fire-coloured moving objects, the category and behaviour of each moving object are taken into account in the decision-making. To estimate an object's expected size in the image plane and to generate geo-referenced alarms, the camera-world mapping is approximated with a GPS-based calibration process. Experimental results demonstrate the ability of the proposed method to detect fires with an average success rate of 93.1% at a processing rate of 10 Hz, which is often sufficient for real-life applications. © 2014 The Author(s). Licensee InTech.
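
A minimal per-frame sketch of this kind of pipeline is shown below, assuming OpenCV; the HSV colour bounds, flicker band and thresholds are illustrative guesses, and a simple FFT over the candidate-mask area stands in for the paper's learned colour model and wavelet-based frequency signature.

# Motion via background subtraction, a colour gate for fire-like pixels, and
# a temporal flicker test on candidate regions. Illustrative sketch only.
import cv2
import numpy as np
from collections import deque

bg_sub = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
area_hist = deque(maxlen=64)   # recent area of fire-candidate pixels
FPS = 10.0                     # processing rate reported in the paper

def fire_candidates(frame_bgr):
    motion = bg_sub.apply(frame_bgr)                            # moving pixels
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    colour = cv2.inRange(hsv, (0, 120, 180), (35, 255, 255))    # fire-like hues (assumed)
    return cv2.bitwise_and(motion, colour)                      # moving AND fire-coloured

def flickers(candidate_mask):
    # Track how the candidate area fluctuates over time and check whether
    # most of its spectral energy sits in a typical flame-flicker band.
    area_hist.append(float(np.count_nonzero(candidate_mask)))
    if len(area_hist) < area_hist.maxlen:
        return False
    signal = np.array(area_hist) - np.mean(area_hist)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(area_hist), d=1.0 / FPS)
    band = (freqs > 1.0) & (freqs < 8.0)
    return spectrum[band].sum() > 0.5 * spectrum[1:].sum()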
