Agency: GTR | Branch: EPSRC | Program: | Phase: Research Grant | Award Amount: 858.32K | Year: 2015
Autonomous robots, capable of independent and intelligent navigation through unknown environments, have the potential to significantly increase human safety and security. They could replace people in potentially hazardous tasks, for instance search and rescue operations in disaster zones, or surveys of nuclear/chemical installations. Vision is one of the primary senses that can enable this capability; however, visual information processing is notoriously difficult, especially at the speeds required for fast-moving robots, and in particular where low weight, power dissipation and cost of the system are of concern. Conventional hardware and algorithms are not up to the task. The proposal here is to tightly integrate novel sensing and processing hardware, together with vision, navigation and control algorithms, to enable the next generation of autonomous robots. At the heart of the system will be a device known as a vision chip. This bespoke integrated circuit differs from a conventional image sensor in that it includes a processor within each pixel, offering unprecedented performance. The massively parallel processor array will be programmed to pre-process images, passing higher-level feature information upstream to vision tracking algorithms and the control system. Feature extraction at pixel level results in an extremely efficient, high-speed throughput of information. Another feature of the new vision chip will be the measurement of time-of-flight data in each pixel. This will allow the distance to a feature to be extracted and combined with the image-plane data for vision tracking, simplifying and speeding up the real-time state estimation and mapping capabilities. Vision algorithms will be developed to make optimal use of this novel hardware technology. This project will not only develop a unique vision processing system, but will also tightly integrate the control system design.
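The fusion described above (per-pixel time-of-flight depth combined with image-plane coordinates) can be illustrated by a standard pinhole back-projection. This is a minimal sketch under assumed camera intrinsics (fx, fy, cx, cy are invented example values, not parameters of the project hardware):

```python
def back_project(u, v, depth, fx=500.0, fy=500.0, cx=160.0, cy=120.0):
    """Return the 3D camera-frame point for pixel (u, v) at the given
    time-of-flight depth, using the pinhole camera model.

    fx, fy: focal lengths in pixels; cx, cy: principal point (assumed values).
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A feature detected at the image centre, 2 m away, lies on the optical axis:
print(back_project(160.0, 120.0, 2.0))  # -> (0.0, 0.0, 2.0)
```

With depth available directly at the pixel, each tracked feature yields a 3D point in one step, which is what would simplify the downstream state estimation and mapping.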
Vision and control systems have traditionally been developed independently, with a downstream flow of information from sensor through to motor control. In our system, information flow will be bidirectional. Control system parameters will be passed to the image sensor itself, guiding computational effort and reducing processing overheads. For example, a rotational demand passed into the control system will not only result in control actuation for vehicle movement, but will also result in optic tracking along the same path. A key component of the project will therefore be the management and control of information across all three layers: sensing, visual perception and control. Information sharing will occur at multiple rates and may either be scheduled or requested. Shared information and distributed computation will provide a breakthrough in control capabilities for highly agile robotic systems. Whilst applicable to a very wide range of disciplines, our system will be tested in the demanding field of autonomous aerial robotics. We will integrate the new vision sensors onboard an unmanned air vehicle (UAV), developing a control system that will fully exploit the new tracking capabilities. This will serve as a demonstration platform for the complete vision system, incorporating nonlinear algorithms to control the vehicle through agile manoeuvres and rapidly changing trajectories. Although specific vision tracking and control algorithms will be used for the project, the hardware itself and the system architecture will be applicable to a very wide range of tasks. Any application that is currently limited by tracking capabilities, in particular when combined with a rapid, demanding control challenge, would benefit from this work. We will demonstrate a step change in agile, vision-based control of UAVs for exploration, and in doing so develop an architecture which will have benefits in fields as diverse as medical robotics and industrial production.
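The feed-forward path from control to sensing described above can be sketched with the standard rotational optic-flow relation: given a commanded yaw rate, the expected image-plane motion of a feature can be predicted and handed to the tracker before the motion occurs. This is an illustrative sketch only; the function name and camera parameters are assumptions, not the project's interface:

```python
def predicted_shift(u, v, yaw_rate, dt, fx=500.0, cx=160.0):
    """Predict a feature's horizontal image-plane shift (pixels) over dt
    seconds, caused by a commanded pure yaw rotation (rad/s).

    Uses the small-angle rotational flow model: du = fx * w * dt * (1 + xn^2),
    where xn is the normalised horizontal coordinate. fx and cx are assumed
    example intrinsics.
    """
    xn = (u - cx) / fx
    return fx * yaw_rate * dt * (1.0 + xn * xn)

# A feature at the image centre, under a 0.1 rad/s yaw demand over 20 ms,
# is predicted to shift by fx * 0.1 * 0.02 = 1.0 pixel.
print(predicted_shift(160.0, 120.0, 0.1, 0.02))
```

Priming the tracker with such predictions is one way the rotational demand could "result in optic tracking along the same path" while reducing the search effort on the sensor.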
Agency: GTR | Branch: Innovate UK | Program: | Phase: Collaborative Research & Development | Award Amount: 2.56M | Year: 2014
Hybrid Air Vehicles Ltd has formed a collaborative industrial research team with Blue Bear Systems Research, Forward Composites, Liverpool University, Sheffield University and Cranfield University. This project team will advance the fundamental and interrelated enabling technologies required to maintain the UK's lead in the field of hybrid air vehicles – a novel aircraft form with substantial worldwide sales potential (against competitors such as Lockheed Martin and EADS). The project will focus on lowering the developmental risks in key technology areas such as novel aircraft aerodynamics, carbon composite structures and avionics monitoring systems, and on improving rate production to enable the launch of production design and manufacture. The project results will be exploited by HAV and the UK aerospace supply chain, generating UK jobs and maintaining HAV's lead in the field of hybrid air vehicles and LTA technology.
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 4.93M | Year: 2014
The global Robotics and Autonomous Systems (RAS) market was $25.5bn in 2001 and is growing. The market potential for future robotics and autonomous systems is of huge value to the UK. The need for expansion in this important sector is well recognised, as evidenced by the Chancellor of the Exchequer's announcement of £35m investment in the sector in 2012, the highlighting of this sector in the 2012 BIS Foresight report "Technology and Innovation Futures", and the identification of robotics and autonomous systems by the Minister for Universities and Science in 2013 as one of the eight great technologies that will drive future growth. This expansion will be fuelled by a step change in RAS capability, the key to which is increased adaptability. For example, a home care robot must adapt safely to its owner's unpredictable behaviour; micro air vehicles will be sent into damaged buildings without knowing the layout or obstructions; a high-value manufacturing robot will need to manufacture small batches of different components. Achieving this increased adaptability will in turn require that the innovators who develop such systems are themselves highly adaptable people. FARSCOPE, the Future Autonomous and Robotic Systems Centre for PhD Education, aims to meet the need for a new generation of innovators who will drive the robotics and autonomous systems sector in the coming decade and beyond. The Centre will train over 50 students in the essential RAS technical underpinning skills, the ability to integrate RAS knowledge and technologies to address real-world problems, an understanding of the wider implications and applications of RAS, and the ability to innovate within, and beyond, this sector. FARSCOPE will be delivered by a partnership between the University of Bristol (UoB) and the University of the West of England (UWE).
It will bring together the dedicated 3000 square metre Bristol Robotics Laboratory (BRL), one of the largest robotics laboratories in Europe, with a training and supervising team drawn from UoB and UWE offering a wide breadth of experience and depth of expertise in autonomous systems and related topics. The FARSCOPE centre will exploit the strengths of BRL, including medical and healthcare robotics, energy-autonomous robotics, safe human-robot interactions, soft robotics, unconventional computing, experimental psychology, biomimicry, machine vision including vision-based navigation and medical imaging, and an extensive aerial robotics portfolio including unmanned air vehicles and autonomous flight control. Throughout the four-year training programme, industry and stakeholder partners will actively engage with the CDT, helping to deliver the programme and sharing both their domain expertise and their commercial experience with FARSCOPE students. This includes a regular seminar series, industrial placements, a group grand challenge project, enterprise training and the three-year individual research project. Engaged partners include BAE Systems, DSTL, Blue Bear Systems, SciSys, National Composites Centre, Rolls Royce, Toshiba, NHS SouthWest and OC Robotics. FARSCOPE also has commitment from a range of international partners from across Europe, the Americas and Asia, who are offering student exchange placements and will enhance the global perspective of the programme.
Agency: GTR | Branch: Innovate UK | Program: | Phase: Feasibility Study | Award Amount: 109.42K | Year: 2013
This project develops an autonomous path following capability (in the form of a sensor and algorithm kit) for aerial inspection robots used to remotely survey structures in sectors such as oil & gas, mining, energy, chemical processing, water and transport. Aerial robots have enormous potential to slash costs relative to manual inspections, which are equipment- and manpower-intensive and typically represent a large proportion of the recurring cost of a structure over its lifetime. Current-generation robots are typically operated manually within line of sight of a remote operator; this project will develop a sensor and algorithm kit enabling such robots to automatically retrace their steps around a known structure using vision and learning, greatly speeding up repetitive surveys. A 3D visual feature map is generated and refined over initial flights; on subsequent missions the robot uses this map of the structure for autonomous visual navigation via a relocalisation approach, allowing it to reach, and return from, the areas to be inspected autonomously. The proposed robot combines the real-time full 3D visual mapping and relocalisation methods developed at the University of Bristol with flight control technology developed by Blue Bear.
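The relocalisation step described above, in broad outline, matches features observed in the current camera frame against the stored 3D feature map. A minimal sketch of such matching is shown below, assuming invented data structures (descriptor vectors paired with 3D landmarks) and a Lowe-style ratio test; the actual University of Bristol method is not specified in this abstract:

```python
def match_to_map(frame_descs, map_entries, ratio=0.8):
    """Match current-frame feature descriptors to a stored 3D map.

    frame_descs: list of descriptor vectors from the current frame.
    map_entries: list of (descriptor, landmark_3d) pairs from the map.
    Returns a list of (frame_index, landmark_3d) for confident matches,
    keeping only matches whose best distance beats the second-best by
    the given ratio (rejects ambiguous features).
    """
    matches = []
    for i, d in enumerate(frame_descs):
        # Euclidean distance to every map descriptor, sorted best-first.
        dists = sorted(
            (sum((a - b) ** 2 for a, b in zip(d, md)) ** 0.5, pt)
            for md, pt in map_entries
        )
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((i, best[1]))
    return matches
```

The resulting 2D-3D correspondences would then feed a pose solver (e.g. a perspective-n-point method) to recover the robot's position relative to the structure.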