Krailling, Germany

Agency: Cordis | Branch: FP7 | Program: CP | Phase: FoF-ICT-2013.7.1 | Award Amount: 22.24M | Year: 2014

The European manufacturing industry needs competitive solutions to keep global leadership in products and services. Exploiting synergies across application experts, technology suppliers, system integrators and service providers will speed up the process of bringing innovative technologies from research labs to industrial end-users. As an enabler in this context, the EuRoC initiative proposes to launch three industry-relevant challenges: 1) Reconfigurable Interactive Manufacturing Cell, 2) Shop Floor Logistics and Manipulation, 3) Plant Servicing and Inspection. It aims at sharpening the focus of European manufacturing through a number of application experiments, while adopting an innovative approach which ensures comparative performance evaluation.

Each challenge is launched via an open call and is structured in 3 stages. 45 Contestants are selected using a challenge in a simulation environment: the low barrier of entry allows new players to compete with established robotics teams. Matching up the best Contestants with industrial end users, 15 Challenger teams are admitted to the second stage, where the typical team is formed by research experts, technology suppliers and system integrators, plus end users. Teams are required to benchmark use cases on standard robotic platforms provided by this consortium. After a mid-term evaluation with a public competition, the teams advance to showcasing the use case in a realistic environment. After an open judging process, 6 Challenge Finalists are admitted to run pilot experiments in a real environment at end-user sites to determine the final EuRoC Winner. A number of challenge advisors and independent experts decide about access to the subsequent stages.
A challenge-based approach with multiple stages of increasing complexity and financial support for competing teams will level the playing field for new contestants, attract new developers and new end users toward customisable robot applications, and provide sustainable solutions to carry out future challenges.

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2013.2.1 | Award Amount: 11.09M | Year: 2013

Using a proven-in-practice user-centric design methodology, TRADR develops novel S&T for human-robot teams assisting in disaster response efforts over multiple missions: the novel S&T makes experience persistent. Various kinds of robots collaborate with human team members to explore the environment and gather physical samples. Throughout this collaborative effort, TRADR enables the team to gradually develop its understanding of the disaster area over multiple, possibly asynchronous, missions (persistent environment models), to improve team members' understanding of how to work in the area (persistent multi-robot action models), and to improve teamwork (persistent human-robot teaming).

Weiss S.,ETH Zurich | Achtelik M.W.,ETH Zurich | Lynen S.,ETH Zurich | Achtelik M.C.,Ascending Technologies | And 3 more authors.
Journal of Field Robotics | Year: 2013

The recent technological advances in Micro Aerial Vehicles (MAVs) have triggered great interest in the robotics community, as their deployability in missions of surveillance and reconnaissance has now become a realistic prospect. The state of the art, however, still lacks solutions that can work for a long duration in large, unknown, and GPS-denied environments. Here, we present our visual pipeline and MAV state-estimation framework, which uses feeds from a monocular camera and an Inertial Measurement Unit (IMU) to achieve real-time and onboard autonomous flight in general and realistic scenarios. The challenge lies in dealing with the power and weight restrictions onboard a MAV while providing the robustness necessary in real and long-term missions. This article provides a concise overview of our work on achieving the first onboard vision-based power-on-and-go system for autonomous MAV flights. We discuss our insights on the lessons learned throughout the different stages of this research, from the conception of the idea to the thorough theoretical analysis of the proposed framework and, finally, the real-world implementation and deployment. Looking into the onboard estimation of monocular visual odometry, the sensor fusion strategy, the state estimation and self-calibration of the system, and finally some implementation issues, the reader is guided through the different modules comprising our framework. The validity and power of this framework are illustrated via a comprehensive set of experiments in a large outdoor mission, demonstrating successful operation over flights of more than 360 m trajectory and 70 m altitude change. © 2013 Wiley Periodicals, Inc.
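The core idea behind camera/IMU fusion of this kind can be illustrated with a Kalman filter that uses high-rate IMU accelerations for prediction and slower visual position fixes for correction. The following is a minimal 1-D didactic sketch under that generic scheme, not the framework described in the article; all rates and noise parameters are illustrative values chosen for this example.

```python
import random

random.seed(0)

dt = 0.01          # IMU prediction rate: 100 Hz
vis_every = 10     # visual-odometry correction every 10 steps (10 Hz)
q_acc = 0.5        # std. dev. of IMU acceleration noise
r_vis = 0.05       # std. dev. of visual position noise

# Filter state [position, velocity] and its 2x2 covariance P.
x, v = 0.0, 0.0
P = [[1.0, 0.0], [0.0, 1.0]]

true_x, true_v = 0.0, 0.0

for k in range(500):
    a_true = 1.0 if k < 250 else -1.0      # accelerate, then brake
    true_v += a_true * dt
    true_x += true_v * dt

    # Prediction: integrate a noisy IMU acceleration reading.
    a_meas = a_true + random.gauss(0.0, q_acc)
    x += v * dt + 0.5 * a_meas * dt * dt
    v += a_meas * dt
    # Covariance propagation for F = [[1, dt], [0, 1]] plus process noise.
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1]
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q_acc * q_acc * dt
    P = [[p00, p01], [p10, p11]]

    # Correction: fuse a slower, noisy visual position fix (H = [1, 0]).
    if k % vis_every == 0:
        z = true_x + random.gauss(0.0, r_vis)
        s = P[0][0] + r_vis * r_vis          # innovation variance
        k0, k1 = P[0][0] / s, P[1][0] / s    # Kalman gain
        innov = z - x
        x += k0 * innov
        v += k1 * innov
        # Covariance update P = (I - K H) P.
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
```

Even in this toy setting the division of labor is visible: the IMU keeps the estimate smooth between visual fixes, while the (slower) position measurements bound the drift that pure inertial integration would accumulate. The real system described in the article additionally estimates the unknown metric scale of monocular vision and the sensor biases, which a 1-D position filter cannot capture.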

Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2007.2.2 | Award Amount: 3.36M | Year: 2009

Autonomous micro helicopters are about to play major roles in tasks like reconnaissance for search and rescue, environment monitoring, security surveillance, inspection, law enforcement, etc. The ability to fly allows them to easily avoid obstacles on the ground and to have an excellent bird's-eye view. Flying robots are therefore the logical heirs of ground-based mobile robots: their navigation and hovering advantages make them the ideal platform for exploration, mapping and monitoring tasks. If they are further realized at small scale, they can also be used in narrow outdoor and indoor environments, and they represent only a limited risk for the environment and the people living in it. However, for such operations, today's systems navigating on GPS information alone are no longer sufficient. Fully autonomous operation in cities or other dense environments requires the micro helicopter to fly at low altitude or indoors, where GPS signals are often shadowed, and to actively explore unknown environments while avoiding collisions and creating maps. This involves a number of challenges on all levels of helicopter design, perception, actuation, control, navigation and power supply that have yet to be solved.

The S&T endeavor proposed in this project will therefore focus on micro helicopter design, visual 3D mapping and navigation, low-power communication including range estimation, and multi-robot control under environmental constraints. It shall lead to novel micro flying robots that are:
- inherently safe due to very low weight (around 500 g) and appropriate propeller design;
- capable of vision-based fully autonomous navigation and mapping;
- able to fly in coordinated small swarms in constrained and dense environments.
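One of the topics named above, range estimation from low-power communication, is commonly approached by inverting a radio path-loss model: received signal strength falls off predictably with distance, so an RSSI reading can be mapped back to an approximate range. The sketch below shows the standard log-distance path-loss inversion; the reference RSSI and path-loss exponent are illustrative placeholder values, not figures from this project.

```python
def rssi_to_range(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.0):
    """Invert the log-distance model RSSI(d) = RSSI(1 m) - 10*n*log10(d).

    rssi_at_1m and path_loss_exponent are environment-dependent and must
    be calibrated in practice; n = 2 corresponds to free-space propagation.
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

# A reading 20 dB below the 1 m reference corresponds to 10 m when n = 2.
d = rssi_to_range(-60.0)
```

In cluttered indoor environments the exponent is typically larger than 2 and RSSI is noisy, so such range estimates are usually filtered or fused with other sensors rather than used raw.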

News Article | January 17, 2016
Site: http://www.techtimes.com/rss/sections/futuretech.xml

Intel joined forces with Ars Electronica Futurelab to set the Guinness World Record for the most unmanned aerial vehicles airborne simultaneously, but that's just skimming the surface. Intel is very enthusiastic about drones. In fact, Intel CEO Brian Krzanich said at the 2016 CES in Las Vegas that drones will light up the skies to replace fireworks down the road. "I see a future where fireworks and all their risks of smoke and dirt are a thing of the past, and they're replaced by shows that have unlimited creativity and potential – and powered by drones," he said. Back in August last year, the company invested $60 million in Chinese drone maker Yuneec Holding. German drone maker Ascending Technologies also moved under Intel's umbrella on Jan. 4, 2016. The company likewise made an undisclosed investment in Airware of San Francisco. "Intel gains expertise and technology to accelerate the deployment of Intel RealSense technology into the fast growing drone market segment," Intel says in a blog post about having Ascending Technologies on board. The company will continue to work with the Ascending Technologies team to keep providing support for its present customers, while also working hand in hand with Intel's Perceptual Computing team on UAV technology that will soon "help drones fly with more awareness of their environments." On Nov. 4, 2015, Intel and Futurelab pre-programmed 100 drones and launched them into the sky for a spectacular light show synchronized with Beethoven's Fifth Symphony played by a live orchestra. Although the show was filmed last year, the video was first shown during Krzanich's keynote speech at the 2016 CES on Jan. 5. The drones, fitted with LEDs, simultaneously lit up the sky over Ahrenlohe Airfield near Hamburg, Germany for seven minutes. They climbed as high as 328 feet to show off their choreographed routines.
The light show ended with the drones forming the 250-meter-wide (820 feet) Intel logo. A Guinness World Record judge was present during the show to verify and award the new record to the two companies. Horst Hörtner, Ars Electronica Futurelab's director, said the new record is the result of years of hard work by the two companies. "Drone 100 was a crazy idea that came out of a hallway conversation inside Intel, and now it has become a reality," said Anil Nanduri, general manager of New Markets in Intel's Perceptual Computing Group. "Working with Ars Electronica Futurelab, we were able to create a formation of 100 UAVs in the sky, creating amazing images and ending with the Intel logo." Weighing 700 grams (1.5 pounds) each, the quadcopters were built by Ascending Technologies. Futurelab member Andreas Jalsovec said Intel developed the ground control software, which required a powerful computer to make the show possible. Chief pilot Martin Morth said that drones do not always have to look at people; "sometimes, it's the drones that you should be looking at."
