The University of Bremen is one of eleven institutions classed as an "Elite university" in Germany, and a university of approximately 23,500 people from 126 countries who study, teach, research, and work in Bremen. It has become the science center of northwest Germany. The university has a particularly strong reputation in political science, industrial engineering, digital media, physics, mathematics, microbiology, geoscience, and European law. Its commitment was rewarded with the title "Stadt der Wissenschaft 2005" ("City of Science 2005"), which science, politics, business, and culture won jointly for Bremen and Bremerhaven, awarded by the Foundation for German Science. Some of the paths taken back then, also referred to as the "Bremen model", have since become characteristic of modern universities: interdisciplinarity, explorative learning, and socially relevant, practice-oriented project studies, which enjoy a high reputation in the academic world as well as in business and industry. Other reform approaches of the former "new university", such as waiving a mid-level faculty, tripartite representation, and overly "student-friendly" examination regulations, have proven to be errors and were given up in Bremen a few years later. (Source: Wikipedia)
BAUER Maschinen GmbH and University of Bremen | Date: 2014-05-13
The invention relates to underwater drilling for procuring and analyzing ground samples of a bed of a body of water. An underwater drilling device is placed onto the bed of the body of water. In a first drilling step, a drill drive drills a drill rod, composed of at least one tubular drill rod element, into the bed of the body of water, and a drill core is received in a receiving part in the tubular drill rod element. The receiving part with the drill core is deposited in a storage place of a storage area on the base frame. Subsequently, a further drilling step is carried out with a further drill rod element. By means of sensors, at least one physical and/or chemical property of the drill core is determined and recorded together with data on the storage place of the drill core in the storage area.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: FOF-11-2016 | Award Amount: 7.99M | Year: 2016
NIMBLE: collaboration Network for Industry, Manufacturing, Business and Logistics in Europe will develop the infrastructure for a cloud-based, Industrie 4.0, Internet-of-Things-enabled B2B platform on which European manufacturing firms can register, publish machine-readable catalogs for products and services, search for suitable supply chain partners, negotiate contracts and supply logistics, and develop private and secure B2B and M2M information exchange channels to optimise business workflows. The infrastructure will be developed as open source software under an Apache-type, permissive license. The governance model is a federation of platforms for multi-sided trade, with mandatory interoperation functions and optional added-value business functions that can be provided by third parties. This will foster the growth of a net-centric business ecosystem for sustainable innovation and fair competition as envisaged by the Digital Agenda 2020. Prospective NIMBLE providers can take the open source infrastructure, bundle it with sectoral, regional or functional added-value services, and launch a new platform in the federation. Internet platforms need fast adoption rates and the work plan reflects this: we start attracting early adopters from day one and develop the initial, working platform in year one. Added-value business functions follow in year two, and final validation at large scale, involving hundreds of external firms, will happen in year three. Our adoption plan is designed to enable two or more platform providers at the end of the project, and to have 1000 to 2000 enterprises connected to the overall ecosystem at that point. NIMBLE has 17 partners grouped around 3 main activities: developing the infrastructure, running a platform adoption programme, and validating the platform with 4 supply chains (white goods, wooden houses, fashion fabrics, and child care furniture). NIMBLE will give manufacturing SMEs in Europe a stable and sustainable digital ecosystem.
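The catalogue-and-search core of such a platform can be illustrated with a small sketch. This is a hypothetical toy, not NIMBLE's actual data model or API: the firm names, field names, and the capability-matching rule are all assumptions made for illustration.

```python
# Illustrative sketch only: NIMBLE's real catalogue schema and search API
# are not specified in the abstract; every name below is an assumption.

# A "machine-readable catalogue": each supplier publishes an entry with
# capability tags that prospective supply-chain partners can query.
catalogue = [
    {"firm": "AlphaWood", "sector": "wooden houses",  "offers": {"cnc milling", "timber framing"}},
    {"firm": "BetaFab",   "sector": "fashion fabrics", "offers": {"dyeing", "weaving"}},
    {"firm": "GammaTec",  "sector": "white goods",     "offers": {"sheet metal", "assembly"}},
]

def find_partners(required, entries):
    """Return firms whose published capabilities cover the requirement."""
    return [e["firm"] for e in entries if required <= e["offers"]]

print(find_partners({"dyeing"}, catalogue))            # ['BetaFab']
print(find_partners({"assembly", "sheet metal"}, catalogue))  # ['GammaTec']
```

In a federated deployment, each platform would expose such a search over its own registry while interoperation functions forward queries across the federation.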
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: BG-09-2016 | Award Amount: 15.49M | Year: 2016
The overall objective of INTAROS is to develop an integrated Arctic Observation System (iAOS) by extending, improving and unifying existing systems in the different regions of the Arctic. INTAROS will have a strong multidisciplinary focus, with tools for integration of data from atmosphere, ocean, cryosphere and terrestrial sciences, provided by institutions in Europe, North America and Asia. Satellite earth observation data plays an increasingly important role in such observing systems, because the amount of EO data for observing the global climate and environment grows year by year. In situ observing systems are much more limited due to logistical constraints and cost limitations. The sparseness of in situ data is therefore the largest gap in the overall observing system. INTAROS will assess strengths and weaknesses of existing observing systems and contribute with innovative solutions to fill some of the critical gaps in the in situ observing network. INTAROS will develop a platform, iAOS, to search for and access data from distributed databases. The evolution into a sustainable Arctic observing system requires coordination, mobilization and cooperation between the existing European and international infrastructures (in situ and remote, including space-based), the modeling communities and relevant stakeholder groups. INTAROS will include development of community-based observing systems, where local knowledge is merged with scientific data. An integrated Arctic Observation System will enable better-informed decisions and better-documented processes within key sectors (e.g. local communities, shipping, tourism, fisheries), in order to strengthen the societal and economic role of the Arctic region and support the EU strategy for the Arctic and related maritime and environmental policies.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: BG-01-2015 | Award Amount: 9.21M | Year: 2016
ATLAS creates a dynamic new partnership between multinational industries, SMEs, governments and academia to assess the Atlantic's deep-sea ecosystems and Marine Genetic Resources and to create the integrated and adaptive planning products needed for sustainable Blue Growth. ATLAS will gather diverse new information on sensitive Atlantic ecosystems (incl. VMEs and EBSAs) to produce a step-change in our understanding of their connectivity, functioning and responses to future changes in human use and ocean climate. This is possible because ATLAS takes innovative approaches to its work and interweaves its objectives by placing business, policy and socioeconomic development at the forefront with science. ATLAS not only uses trans-Atlantic oceanographic arrays to understand and predict future change in living marine resources, but enhances their capacity with new sensors to make measurements directly relevant to ecosystem function. The ATLAS team has the track record needed to meet the project's ambitions and has already developed a programme of 25 deep-sea cruises, with more pending final decision. These cruises will study a network of 12 Case Studies spanning the Atlantic, including sponge, cold-water coral, seamount and mid-ocean ridge ecosystems. The team has an unprecedented track record in policy development at national, European and international levels. An annual ATLAS Science-Policy Panel in Brussels will take the latest results and Blue Growth opportunities identified from the project directly to policy makers. Finally, ATLAS has a strong trans-Atlantic partnership in Canada and the USA, where both government and academic partners will interact closely with ATLAS through shared cruises, staff secondments, scientific collaboration and work to inform Atlantic policy development. ATLAS has been created and designed with our North American partners to foster trans-Atlantic collaboration and the wider objectives of the Galway Statement on Atlantic Ocean Cooperation.
Agency: European Commission | Branch: H2020 | Program: SGA-RIA | Phase: FETFLAGSHIP | Award Amount: 89.00M | Year: 2016
This project is the second in the series of EC-financed parts of the Graphene Flagship. The Graphene Flagship is a 10-year research and innovation endeavour with a total project cost of 1,000,000,000 euros, funded jointly by the European Commission and member states and associated countries. The first part of the Flagship was a 30-month Collaborative Project, Coordination and Support Action (CP-CSA) under the 7th Framework Programme (2013-2016), while this and the following parts are implemented as Core Projects under the Horizon 2020 framework. The mission of the Graphene Flagship is to take graphene and related layered materials from a state of raw potential to a point where they can revolutionise multiple industries. This will bring a new dimension to future technology: a faster, thinner, stronger, flexible, and broadband revolution. Our program will put Europe firmly at the heart of the process, with a manifold return on the EU investment, both in terms of technological innovation and economic growth. To realise this vision, we have brought together a large European consortium with about 150 partners in 23 countries. The partners represent academia, research institutes and industries, which work closely together in 15 technical work packages and five supporting work packages covering the entire value chain from materials to components and systems. As time progresses, the centre of gravity of the Flagship moves towards applications, which is reflected in the increasing importance of the higher (system) levels of the value chain. In this first core project the main focus is on components and initial system-level tasks. The first core project is divided into 4 divisions, which in turn comprise 3 to 5 work packages on related topics. A fifth, external division acts as a link to the parts of the Flagship that are funded by the member states and associated countries, or by other funding sources. This creates a collaborative framework for the entire Flagship.
Peixoto T.P.,University of Bremen
Physical Review Letters | Year: 2013
We investigate the detectability of modules in large networks when the number of modules is not known in advance. We employ the minimum description length principle, which seeks to minimize the total amount of information required to describe the network, and avoid overfitting. According to this criterion, we obtain general bounds on the detectability of any prescribed block structure, given the number of nodes and edges in the sampled network. We also obtain that the maximum number of detectable blocks scales as √N, where N is the number of nodes in the network, for a fixed average degree ⟨k⟩. We also show that the simplicity of the minimum description length approach yields an efficient multilevel Monte Carlo inference algorithm with a complexity of O(τN log N) if the number of blocks is unknown, and O(τN) if it is known, where τ is the mixing time of the Markov chain. We illustrate the application of the method on a large network of actors and films with over 10^6 edges and a dissortative, bipartite block structure. © 2013 American Physical Society.
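The selection criterion in the abstract can be sketched in miniature. The sparse-graph entropy and model-cost expressions below follow the description-length formulas used in this line of work, but the tiny example graph, the candidate partitions, and the code itself are our own illustration, not the paper's implementation (the paper's efficient algorithm is a multilevel Monte Carlo scheme; here we merely score three fixed partitions):

```python
# Toy sketch of minimum-description-length (MDL) selection of the number
# of blocks B in a stochastic block model. Assumptions: sparse-limit
# entropy S ≈ E - (1/2) Σ_rs e_rs ln(e_rs / n_r n_s) and model cost
# L ≈ E h(B(B+1)/2E) + N ln B with h(x) = (1+x)ln(1+x) - x ln x.
import math
from collections import Counter
from itertools import combinations

def h(x):
    return (1 + x) * math.log(1 + x) - x * math.log(x) if x > 0 else 0.0

def description_length(edges, labels):
    """Sigma = S (network given the model) + L (the model), in nats."""
    N, E = len(labels), len(edges)
    n = Counter(labels)                      # block sizes n_r
    e = Counter()                            # e_rs; e_rr counted twice
    for u, v in edges:
        r, s = labels[u], labels[v]
        e[(r, s)] += 1
        e[(s, r)] += 1
    S = E - 0.5 * sum(m * math.log(m / (n[r] * n[s]))
                      for (r, s), m in e.items())
    B = len(n)
    L = E * h(B * (B + 1) / (2 * E)) + N * math.log(B)
    return S + L

# Example network: two 8-node cliques joined by a single bridge edge.
edges = (list(combinations(range(8), 2))
         + list(combinations(range(8, 16), 2)) + [(0, 8)])

candidates = {
    1: [0] * 16,                          # everything in one block
    2: [0] * 8 + [1] * 8,                 # the planted two-block split
    4: [0]*4 + [1]*4 + [2]*4 + [3]*4,     # over-partitioned
}
sigma = {B: description_length(edges, lab) for B, lab in candidates.items()}
best = min(sigma, key=sigma.get)
print(best)  # → 2: MDL penalises both under- and over-partitioning
```

The under-fitted (B=1) and over-fitted (B=4) partitions both receive a longer total description, so the criterion recovers the planted split without the number of blocks being given in advance.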
Peixoto T.P.,University of Bremen
Physical Review X | Year: 2014
Discovering and characterizing the large-scale topological features in empirical networks are crucial steps in understanding how complex systems function. However, most existing methods used to obtain the modular structure of networks suffer from serious problems, such as being oblivious to the statistical evidence supporting the discovered patterns, which results in the inability to separate actual structure from noise. In addition to this, one also observes a resolution limit on the size of communities, where smaller but well-defined clusters are not detectable when the network becomes large. This phenomenon occurs for the very popular approach of modularity optimization, which lacks built-in statistical validation, but also for more principled methods based on statistical inference and model selection, which do incorporate statistical validation in a formally correct way. Here, we construct a nested generative model that, through a complete description of the entire network hierarchy at multiple scales, is capable of avoiding this limitation and enables the detection of modular structure at levels far beyond those possible with current approaches. Even with this increased resolution, the method is based on the principle of parsimony, and is capable of separating signal from noise, and thus will not lead to the identification of spurious modules even on sparse networks. Furthermore, it fully generalizes other approaches in that it is not restricted to purely assortative mixing patterns, directed or undirected graphs, and ad hoc hierarchical structures such as binary trees. Despite its general character, the approach is tractable and can be combined with advanced techniques of community detection to yield an efficient algorithm that scales well for very large networks.
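The hierarchical construction rests on a simple observation: once a network is partitioned into blocks, the blocks together with the edge counts between them form a smaller multigraph, which can itself be partitioned, and so on up the hierarchy. Below is a minimal, purely illustrative sketch of that coarse-graining step; the example graph and partition are assumptions for illustration, not data from the paper.

```python
# Sketch of one level of the nested construction: collapse a partitioned
# graph into its block-level multigraph (edge counts between blocks).
# The nested model describes this smaller graph with its own block model,
# recursively, each level contributing to the total description length.
from collections import Counter

def coarse_grain(edges, labels):
    """Collapse a partitioned graph into its block-level multigraph."""
    counts = Counter()
    for u, v in edges:
        r, s = sorted((labels[u], labels[v]))
        counts[(r, s)] += 1
    return dict(counts)

# Six nodes in three blocks of two; dense within blocks, sparse between.
edges = [(0, 1), (2, 3), (4, 5), (1, 2), (3, 4)]
labels = [0, 0, 1, 1, 2, 2]
level1 = coarse_grain(edges, labels)
print(level1)  # {(0, 0): 1, (1, 1): 1, (2, 2): 1, (0, 1): 1, (1, 2): 1}
```

Because each level is exponentially smaller than the one below it, describing the hierarchy adds little cost while letting small, well-defined groups remain detectable in very large networks.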
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-25-2016-2017 | Award Amount: 4.39M | Year: 2017
While online grocery stores are expanding, supermarkets continue to provide customers with the sensory experience of choosing goods while walking between display shelves. Therefore, retail and logistics companies are concerned with making the shopping experience more comfortable and exciting while, at the same time, using technology to reduce costs and improve efficiency. The REFILLS project aims at developing robotic systems able to address the in-store logistics needs of the retail market. Three scenarios building on top of each other are considered. In the 1st scenario, mobile robots inspect shelves and generate semantic environment maps for layout identification and store monitoring. The 2nd scenario employs robot arms for autonomous sorting of articles in the backroom and for assisting human clerks with shelf refilling in the shop. In the 3rd scenario, the autonomy of the robot is strengthened, resulting in a robotic clerk capable of manipulating articles varying in shape, surface, fragility, stiffness and weight, and of refilling shelves without human intervention. These scenarios trigger a number of research and technology challenges that are tackled within REFILLS. Information on the supermarket articles is exploited to create powerful knowledge bases, which are used by the robots to identify shelves, recognize missing or misplaced articles, handle them, and navigate the shop. Reasoning allows robots to cope with changing task requirements and contexts, and perception-guided reactive control makes them robust to execution errors and uncertainty. A modular approach is adopted for the design of cost-efficient robotic units. The work plan will generate exploitable results through three integration and evaluation phases. A final demonstration will take place at a real retail store. In sum, REFILLS is committed to generating wide impact in the retail market domain and beyond through the development of efficient logistics solutions for professional use in supermarkets.
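The "recognize missing or misplaced articles" step can be pictured as a comparison between a target shelf layout (a planogram) and what the robot's perception reports. Everything below is a hypothetical sketch: REFILLS' actual knowledge bases, article model, and perception stack are not specified in the abstract, and the slot and article names are invented for illustration.

```python
# Illustrative sketch only: slot keys, article names, and the comparison
# rule are assumptions, not the REFILLS knowledge-base schema.

# Target layout ("planogram"): which article belongs in which shelf slot.
planogram = {("shelf1", 0): "pasta",
             ("shelf1", 1): "rice",
             ("shelf2", 0): "flour"}

def audit(detected):
    """Compare detected articles against the planogram: report slots that
    are empty (need refilling) and slots holding the wrong article."""
    missing = [slot for slot in planogram
               if detected.get(slot) is None]
    misplaced = [slot for slot, art in planogram.items()
                 if detected.get(slot) not in (None, art)]
    return missing, misplaced

missing, misplaced = audit({("shelf1", 0): "pasta",
                            ("shelf1", 1): None,       # empty slot
                            ("shelf2", 0): "sugar"})   # wrong article
print(missing)     # [('shelf1', 1)]
print(misplaced)   # [('shelf2', 0)]
```

In the project's scenarios, the output of such an audit would feed the refilling tasks carried out by clerks (scenario 2) or by the autonomous robotic clerk (scenario 3).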
Agency: European Commission | Branch: H2020 | Program: ERC-STG | Phase: ERC-2016-STG | Award Amount: 1.50M | Year: 2017
Despite many advances in earthquake science, the tendency for faults to host earthquake slip, aseismic slip or slow slip events is far from well understood. Earthquakes are not yet predictable in a meaningful way, and laboratory observations do not satisfactorily explain many general observations of fault slip. Existing data have been gathered at slip velocities orders of magnitude faster than plate convergence rates; therefore the fundamental question addressed by the PREDATORS project is how faults slip when driven at tectonic rates, as they are in nature. I suggest that laboratory friction experiments conducted at these rates may reveal widespread frictional instability that explains the occurrence of (both fast and slow) earthquakes on plate-boundary faults, and that long-term shear loading driven by slow plate convergence rates is more representative of real interseismic faults and captures processes which intermediate- to high-velocity experiments cannot. The experimental research proposed here utilizes an increasing-complexity approach, from existing successful techniques to more innovative measurements using equipment modified to reliably shear at appropriately slow rates and under a wide range of interior Earth conditions. Rock and mineral standards will be used to establish a basic and widely applicable framework for frictional behaviour, while natural fault samples will be used for site-specific problems. This project will provide a comprehensive set of measurements and observations of fault behaviour at realistically slow plate tectonic deformation rates. Combined with existing measurements, this will provide a complete description of rock/sediment friction over the entire possible range of slip velocities. By comparison with geophysical observations on real faults, these results will help explain current seismicity patterns and other slip phenomena, and predict fault behaviour at locations where sampling and geologic characterization are limited.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: EINFRA-22-2016 | Award Amount: 2.00M | Year: 2017
Open Science is around the corner. Scientists and organizations see it as a way to speed up research, improve quality and reward, while policy makers see it as a means to optimize the cost of science and leverage innovation. Open Science is an emerging vision, a way of thinking, whose challenges always gaze beyond its actual achievements. De facto, today's scientific communication ecosystem lacks the tools and practices to allow researchers to fully embrace Open Science. OpenAIRE-Connect aims to provide technological and social bridges, and deliver services enabling uniform exchange of research artefacts (literature, data, and methods), with semantic links between them, across research communities and content providers in scientific communication. It will introduce and implement the concept of Open Science as a Service (OSaaS) on top of the existing OpenAIRE infrastructure, delivering out-of-the-box, on-demand deployable tools. OpenAIRE-Connect will adopt an end-user-driven approach (via the involvement of 5 prominent research communities), and enrich the portfolio of OpenAIRE infrastructure production services with a Research Community Dashboard Service and a Catch-All Notification Broker Service. The first will offer publishing, interlinking, and packaging functionalities to enable communities to share and re-use their research artefacts (introducing methods, e.g. data, software, protocols). This effort, supported by the harvesting and mining intelligence of the OpenAIRE infrastructure, will provide communities with the content and tools they need to effectively evaluate and reproduce science. OpenAIRE-Connect will combine dissemination and training with OpenAIRE's powerful NOAD network, engaging research communities and content providers in adopting such services. These combined actions will bring immediate and long-term benefits to scholarly communication stakeholders by affecting the way research results are disseminated, exchanged, evaluated, and re-used.
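The Catch-All Notification Broker idea can be pictured as a minimal publish/subscribe sketch. This is a hypothetical illustration only: the topic names, event fields, and matching rule are assumptions, not OpenAIRE's actual service API.

```python
# Hypothetical sketch of a catch-all notification broker: content
# providers publish events about research artefacts, and subscribed
# communities receive the events matching their declared interests.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subs = defaultdict(list)    # topic -> list of inboxes

    def subscribe(self, topic, inbox):
        """Register a community inbox for events on a given topic."""
        self.subs[topic].append(inbox)

    def publish(self, topic, event):
        """Deliver an event to every inbox subscribed to its topic."""
        for inbox in self.subs[topic]:
            inbox.append(event)

broker = Broker()
community_inbox = []
broker.subscribe("dataset-linked", community_inbox)
broker.publish("dataset-linked",
               {"artefact": "dataset-001", "event": "linked-to-software"})
broker.publish("other-topic", {"artefact": "dataset-002"})  # not delivered
print(community_inbox)  # only the subscribed topic's event arrives
```

A production service would of course add persistence, authentication, and semantic matching of artefact metadata, but the routing pattern is the same.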