The University of Essex is a British public research university whose original and largest campus is near the town of Colchester, England. It was established in 1963 and received its Royal Charter in 1965. The university's main campus is located within Wivenhoe Park in the English county of Essex, less than a mile from the town of Wivenhoe and two miles from the town of Colchester. Apart from the Wivenhoe Park campus, there is a rapidly developing campus in Southend-on-Sea, and the East 15 Acting School is based in Loughton. The university's motto, "Thought the harder, heart the keener", is adapted from the Anglo-Saxon poem The Battle of Maldon. The university maintains collaborative partnerships with a number of institutions across the eastern region: University Campus Suffolk, Colchester Institute, Kaplan Open Learning, South Essex College and Writtle College. The university has an international character, with 132 countries represented in its student body. The latest Research Assessment Exercise, in 2008, ranked Essex ninth in the UK for the quality of its research, with more than 90% of research recognised internationally for its quality and 22% rated as 'world leading'. The university is referenced by QS World University Rankings as a world leader in social science, with internationally recognised strengths in the humanities. (Wikipedia)
McGenity T.J., University of Essex
Current Opinion in Biotechnology | Year: 2014
Intertidal wetlands, primarily salt marsh, mangrove and mudflats, which provide many essential ecosystem services, are under threat on numerous fronts; a situation that is made worse by crude-oil pollution. Microbes are the main vehicle for remediation of such sediments, and new discoveries, such as novel biodegradation pathways, means of accessing oil, multi-species interactions, and community-level responses to oil addition, are helping us to understand, predict and monitor the fate of oil. Despite this, there are many challenges, not least because of the heterogeneity of these ecosystems and the complexity of crude oil. For example, there is growing awareness of the toxicity of the oxygenated products that result from crude-oil weathering, which are difficult to degrade. This review highlights how developments in areas as diverse as systems biology, microbiology, ecology, biogeochemistry and analytical chemistry are enhancing our understanding of hydrocarbon biodegradation, and thus the bioremediation of oil-polluted intertidal wetlands. © 2013.
Agency: GTR | Branch: ESRC | Program: | Phase: Research Grant | Award Amount: 27.43M | Year: 2016
Understanding Society: the UK Household Longitudinal Study is the largest household panel study in the world, addressing the key scientific and policy questions of the 21st century. It collects high-quality annual longitudinal data on individuals of all ages in households representative of the UK population. Such data enable researchers to explore the experience, causes and consequences of changes in people's lives - their family structure, health, income, expenditure, employment and housing - key issues for policy makers today. The Study also has additional samples that allow the detailed exploration of the circumstances of key immigrant and ethnic minority groups, and collects data on cognition, objective measures of health and genetics to understand how people's health and wider circumstances interact. Additionally, the Study invests in innovative ways of collecting data to continually improve the content and quality of the data available. Overall, therefore, the Study enables the production of research to inform policy and practice. The Study was inaugurated in 2008 with an Innovation Panel to test methods; the first main wave of fieldwork followed in 2009. To date, four waves of the main study and six waves of the Innovation Panel, as well as data collected from a nurse visit and derived from blood samples, have been deposited in the UK Data Service. Data collection and planning are ongoing for Waves 5-8; this bid covers the costs of data collection for Waves 9-11 and associated activities. Based on careful experimental research and evaluation, in Waves 9-11 the Study will move to mixed-mode data collection, meaning people will be able to complete the questionnaire face-to-face or online. This maximises flexibility for respondents, but because people may answer questions differently depending on whether an interviewer is asking them, it creates complexities for data users.
Crucial to our work will be supporting data users to ensure they are able to use such complex data effectively. Policy and research agendas are constantly evolving, and it is important in a longitudinal study to balance creating long series of the same questions with including questions that address emerging topics and make effective use of new approaches to data collection. In this funding period, we will undertake a programme of innovation to bring in new technologies, enabling us to collect better data to address critical social science and policy issues. We will also work with Topic Champions to improve the content of the survey and the way we present the data to users. Supporting researchers in universities, government, the third sector and businesses to use the data effectively is fundamental to the success of the Study. We will therefore invest in improving our user support and in sharing the data in different ways to make them easier for different kinds of users to analyse. We also have a Policy Unit that works directly with government departments and third sector organisations to ensure that they are fully aware of how Understanding Society data can be used to address their policy concerns, and to help them do so where appropriate, and an Impact Fellow who supports policy users and researchers in working together effectively to generate impact. During this funding period, as more waves of the Study are released to data users, the value of the Study will increase significantly, as it can be used to answer more questions about the effect of different kinds of changes on people's lives. We will create a wide range of opportunities for users to share their findings - for example at conferences and workshops, through Insights, by promoting publications and case studies on the website and through social media - and by creating an online community of users so that they can engage with each other.
Agency: GTR | Branch: ESRC | Program: | Phase: Research Grant | Award Amount: 4.74M | Year: 2015
Edward Snowden's leaks about the extent of US and UK intelligence services' electronic surveillance dramatically demonstrated how, in an increasingly digitised world, technological developments and the collection, storage and use of big data pose unprecedented challenges for the protection of human rights. The aim of this programme of research is to ensure that the use of technological developments and big data is compatible with the ideals of human rights protection and can even have a positive impact. Snowden's revelations are part of a much bigger picture in which electronic monitoring data are collected and shared by companies and states on a routine, daily basis through social media, consumer activity and smartphones. The same technologies that threaten our privacy also provide opportunities for enhanced protection of human rights, through better documentation of human rights violations and by demonstrating the effectiveness of rights-shaped policies in order to influence resource allocation and budgets. Existing work either fails to consider the rights implications of the use of Information and Communication Technology (ICT) and big data or focuses on a particular right. What is missing is a wider investigation into the diverse and complex rights implications (positive and negative) of the use of ICT and big data including, but not limited to, privacy and the many social, ethical and legal issues lurking beneath the surface of human-machine interaction and the use of big data. Moreover, regulation of the use of ICT and big data is currently fragmented between states, the United Nations and the internet governance sector. This project will provide added value by offering a fuller picture of the totality of human rights issues raised by ICT and big data, to advance new thinking and regulatory solutions. The research questions focus on issues that cut across the threats and opportunities: (1) How is the use of ICT and big data shaping the content and scope of rights?
(2) How does the use of ICT and big data shape operational practices across state and non-state activities? What new theoretical questions and implications for human rights are generated? (3) What methodologies are needed to identify and document the misuse of modern technologies and the failure to comply with rights-based obligations? (4) How can the use of ICT and big data best support evidence-based approaches to human rights protection and advocacy? (5) What possibilities and limitations exist for regulating the collection, storage and use of ICT and big data by states and non-state actors? The project will be organised into four work streams. The first (WS1) will focus on the overarching and synthesising themes. This will be complemented and informed by three in-depth studies: state and non-state surveillance (WS2); health, as an example of using new technologies and big data for accountability purposes and evidence-based approaches to rights (WS3); and human rights advocacy and humanitarian work (WS4). The project will use multiple methods (desk research, interviews, econometrics, comparative case studies and computational techniques). Communication and impact streams will be developed across the different stakeholder communities to establish agreement and a shared vocabulary, by forming an expert working group of practitioners (human rights and technology), international internet governance, UN actors and academics that will meet twice a year to develop international standards on information technology, big data and human rights. Dissemination and engagement with academics, practitioners and the public will be achieved via an interactive website; social media channels; conferences; public events; academic publications; shorter practice-orientated articles, policy briefings and blogs; media pieces; and time-sensitive interventions targeting key policymakers and influencers and international, regional and national courts.
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 762.71K | Year: 2016
The impacts of climate change, and warming in particular, on natural ecosystems remain poorly understood, and research to date has focused on individual species (e.g. range shifts of polar bears). Multispecies systems (food webs, ecosystems), however, can possess emergent properties that can only be understood from a system-level perspective. Within a given food web, the microbial world is the engine that drives key ecosystem processes, biogeochemical cycles (e.g. the carbon cycle) and network properties, but it has been hidden from view due to the difficulty of identifying which microbes are present and what they are doing. The recent revolution in Next Generation Sequencing has removed this bottleneck, and we can now open the microbial black box to characterise the metagenome (who is there?) and metatranscriptome (what are they doing?) of the community for the first time. These advances will allow us to address a key overarching question: should we expect a global response to global warming? There are bodies of theory that suggest this might be the case, including the Metabolic Theory of Ecology and the Everything is Everywhere hypothesis of global microbial biogeography, yet these ideas have yet to be tested rigorously at appropriate scales and in appropriate experimental contexts that allow us to identify patterns and causal relationships in real multispecies systems. We will assess the impacts of warming across multiple levels of biological organisation, from genes to food webs and whole ecosystems, using geothermally warmed freshwaters in five high-latitude regions (Svalbard, Iceland, Greenland, Alaska, Kamchatka), where warming is predicted to be especially rapid. Our study will be the first to characterise the impacts of climate change on multispecies systems at this unprecedented scale.
Surveys of these sentinel systems will be complemented with modelling and experiments conducted in the field sites, as well as in hundreds of large-scale mesocosms (artificial streams and ponds) in the field and thousands of microcosms of robotically assembled microbial communities in the laboratory. Our novel genes-to-ecosystems approach will allow us to integrate measures of biodiversity and ecosystem functioning. For instance, we will quantify key functional genes as well as which genes are switched on (the metatranscriptome), in addition to measuring ecosystem functioning (e.g. processes related to the carbon cycle). We will also measure the impacts of climate change on the complex networks of interacting species we find in nature - what Darwin called the 'entangled bank' - because food webs and other types of networks can produce counterintuitive responses that cannot be predicted from studying species in isolation. One general objective is to assess the scope for biodiversity insurance and the resilience of natural systems in the face of climate change. To do this, we will combine our intercontinental surveys with natural experiments, bioassays, manipulations and mathematical models. For instance, we will characterise how temperature-mediated losses of biodiversity can compromise key functional attributes of the gene pool and of the ecosystem as a whole. There is an assumption in the academic literature and in policy that freshwater ecosystems are relatively resilient because their apparently huge scope for functional redundancy could compensate for species loss in the face of climate change. However, this has not been quantified empirically in natural systems, and errors in estimating the magnitude of functional redundancy could have substantial environmental and economic repercussions.
The research will address a set of key specific questions and hypotheses within our 5 themed Workpackages, of broad significance to both pure and applied ecology, and which also combine to provide a more holistic perspective than has ever been attempted previously.
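The Metabolic Theory of Ecology invoked above predicts that metabolic rates scale with temperature following a Boltzmann-Arrhenius factor. A minimal sketch of that scaling, using the commonly cited activation energy of about 0.65 eV (an illustrative textbook value, not a parameter from this project):

```python
import math

BOLTZMANN_EV = 8.617e-5   # Boltzmann constant, eV per kelvin
ACTIVATION_E = 0.65       # typical activation energy for metabolism, eV

def metabolic_scaling(t_celsius, t_ref_celsius=10.0, e=ACTIVATION_E):
    """Boltzmann-Arrhenius factor: how much faster a metabolic rate
    runs at t_celsius than at the reference temperature, all else equal."""
    t = t_celsius + 273.15
    t_ref = t_ref_celsius + 273.15
    return math.exp(e / BOLTZMANN_EV * (1.0 / t_ref - 1.0 / t))

# Warming a 10 C stream by 4 C speeds metabolism by roughly 40-50%,
# which is why geothermally warmed streams are such useful sentinels.
factor = metabolic_scaling(14.0)
```

The exponential form means equal temperature increments have disproportionately large effects, one reason system-level responses can differ from naive extrapolations from single species.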
Agency: European Commission | Branch: H2020 | Program: ERC-ADG | Phase: ERC-ADG-2015 | Award Amount: 2.50M | Year: 2016
Natural language expressions are supposed to be unambiguous in context. Yet more and more examples are emerging of expressions that are ambiguous in context, yet felicitous and rhetorically unmarked. In my own work, I demonstrated that ambiguity in anaphoric reference is ubiquitous through the study of disagreements in annotation, which I pioneered in CL. Since then, additional cases of ambiguous anaphoric reference have been found, and similar findings have been made for other aspects of language interpretation, including word-sense disambiguation and even part-of-speech tagging. Using the Phrase Detectives game-with-a-purpose to collect massive amounts of judgments online, we found that up to 30% of anaphoric expressions in our data are ambiguous. These findings raise a serious challenge for computational linguistics (CL), as assumptions about the existence of a single interpretation in context are built into the dominant methodology, which depends on a reliably annotated gold standard. The goal of the proposed project is to tackle this fundamental issue of disagreements in interpretation by using computational methods for collecting and analysing such disagreements, some of which already exist but have never before been applied in linguistics on a large scale, and some of which we will develop from scratch. First, I propose to develop more advanced games-with-a-purpose to collect massive amounts of data about anaphora from people playing a game. Second, I propose to use Bayesian models of annotation, widely used in epidemiology but not in linguistics, to analyse such data and identify genuine ambiguities; doing this for anaphora will require novel methods. Third, I propose to use these data to revisit current theories about anaphoric expressions that do not seem to cause infelicity when ambiguous. Finally, I propose to develop the first supervised approach to anaphora resolution that does not require a gold standard, as a blueprint for other areas.
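The Bayesian annotation models mentioned above can be illustrated with a toy sketch: a one-parameter, Dawid-Skene-style model that pools players' judgments on a single anaphor and flags items whose posterior has no dominant interpretation. The function names, the fixed annotator accuracy and the ambiguity threshold are all illustrative assumptions, not the project's actual model.

```python
import math
from collections import Counter

def label_posterior(judgments, accuracy=0.8, labels=("ref_A", "ref_B")):
    """Posterior over candidate antecedents for one anaphor, assuming each
    annotator picks the true label with probability `accuracy` and errs
    uniformly otherwise (a one-parameter Dawid-Skene-style model)."""
    counts = Counter(judgments)
    n_other = len(labels) - 1
    scores = {}
    for lab in labels:
        k = counts.get(lab, 0)          # votes for this label
        m = len(judgments) - k          # votes against it
        scores[lab] = (accuracy ** k) * ((1 - accuracy) / n_other) ** m
    z = sum(scores.values())
    return {lab: s / z for lab, s in scores.items()}

def is_ambiguous(posterior, threshold=0.9):
    """Flag an item as genuinely ambiguous when no single
    interpretation dominates the posterior."""
    return max(posterior.values()) < threshold

# An 8/2 split of player judgments yields a confident posterior;
# a 5/5 split is flagged as a genuine ambiguity.
confident = label_posterior(["ref_A"] * 8 + ["ref_B"] * 2)
split = label_posterior(["ref_A"] * 5 + ["ref_B"] * 5)
```

The point of the probabilistic treatment is exactly what the abstract argues: disagreement is treated as signal (a property of the item) rather than as annotator noise to be adjudicated away.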
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-14-2014 | Award Amount: 5.89M | Year: 2015
CHARISMA proposes an intelligent hierarchical routing and paravirtualised architecture that unites two important concepts: devolved offload, with the shortest path nearest to end-users, and an end-to-end security service chain via virtualised open access physical layer security (PLS). The CHARISMA architecture meets the goals of low latency (<1 ms) and security required for future converged wireless/wireline advanced 5G networking. It provides a cloud infrastructure platform with increased spectral and energy efficiency and enhanced performance, targeting the identified needs for 1000-fold increased mobile data volume, 10-100 times higher data rates, 10-100 times more connected devices and 5x reduced latency. Fully aligned with and committed to the 5G-PPP principles and KPIs, the CHARISMA proposal brings together 10G wireless (via mm-wave/60-GHz and free-space optics, FSO) access and 100G fixed optical (OFDM-PON) solutions through an intelligent cloud radio access network (C-RAN) and intelligent remote radio head (RRH) platform with IPv6 Trust Node routing, featuring very low latency for traffic management. Low-cost Ethernet is used across the fronthaul, backhaul and end-user equipment (vCPE), and intelligence is distributed across the backhaul, fronthaul and perimetric data transports. Ad-hoc mobile device interconnectivities (D2D, D2I, C2C, etc.), a content delivery network (CDN) and mobile distributed caching (MDC) offer an energy-efficient (better than 20x improvement possible) information-centric networking (ICN) architecture. Furthermore, caching will provide efficient utilization of scarce resources by aggregating data early and/or by executing communication locally. The CHARISMA approach will benefit user experiences with ground-breaking low-latency services, high bandwidth, and resilient mobile cloud network security.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-28-2015 | Award Amount: 5.00M | Year: 2016
Personal health systems for the management of chronic diseases have seen giant leaps in development over recent years. These systems offer vital sign monitoring and therapy delivery at home, focusing on the primary physical disease conditions. However, they do not provide support for early mood assessment or psychological treatment, and they lack a real-time, comprehensive assessment of the patient's mental status. Depression is the third leading contributor to the global disease burden, and a depressive mood state is also considered to be closely related to the onset or worsening of a severe primary somatic disease. Indeed, effective preventive medicine for the onset of depressive symptoms as a comorbidity and worsening factor of psychosomatic diseases such as myocardial infarction, leg amputation, cancer and kidney failure is lacking. NEVERMIND sets out to empower people who suffer from symptoms of depression related to a serious somatic disease by placing them at the center of their mental healthcare. Equipped with just a smartphone and a lightweight sensitized shirt, patients seeking care and treatment for their mental illnesses interact with these devices, which collect data about their mental and physical health, to then receive effective feedback. Lifestyle factors, i.e. diet, physical activity and sleep hygiene, play a significant mediating role in the development, progression and treatment of depression, and in NEVERMIND they will be monitored by a real-time Decision Support System running locally on the patient's smartphone, predicting the severity and onset of depressive symptoms by processing physiological data, body movement, speech, and the recurrence of social interactions. These data will trigger a response encouraging the patient to conduct or alter activities or lifestyle to reduce the occurrence and severity of depressive symptoms. The final aim is to bring this system to market, giving people the tools to control their depression and unburden their minds.
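To make the idea of a locally running Decision Support System concrete, here is a deliberately simplified sketch of a weighted trigger over lifestyle features. Every feature name, weight and threshold here is hypothetical and stands in only for the kind of predictive model the project would actually develop from physiological, movement, speech and social-interaction data.

```python
def depression_risk_score(features, weights=None):
    """Toy linear risk score over lifestyle/physiological features.
    All feature names and weights are hypothetical illustrations,
    not the NEVERMIND model. Features are normalised to 0..1;
    higher score = higher estimated risk."""
    weights = weights or {
        "sleep_disruption": 0.4,    # e.g. from actigraphy
        "activity_deficit": 0.3,    # vs the patient's own baseline
        "speech_flatness": 0.2,     # prosody-derived feature
        "social_withdrawal": 0.1,   # recurrence of interactions
    }
    return sum(weights[k] * features.get(k, 0.0) for k in weights)

def should_intervene(features, threshold=0.5):
    """Decide whether the system nudges the patient toward an
    activity or lifestyle change (threshold is illustrative)."""
    return depression_risk_score(features) >= threshold
```

Running locally on the smartphone, as the abstract describes, such a scorer can react in real time without shipping raw physiological data to a server; the real system would replace this linear rule with learned models.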
Agency: GTR | Branch: NERC | Program: | Phase: Research Grant | Award Amount: 622.13K | Year: 2016
Sea surface temperature has increased by about 0.8 degrees Celsius since 1880 and is projected to increase by another 2 degrees by the year 2100. This will expose the plants and animals that live in tropical waters to temperatures warmer than their ancestors have experienced over the past million years. Included among these organisms are the photosynthetic microorganisms that provide the organic matter that supports marine food webs and facilitates the transfer of carbon dioxide from the atmosphere to the ocean. In tropical waters where temperatures are above about 25 degrees Celsius, phytoplankton are likely to experience direct negative effects of increased temperature on their physiology, as they are often exposed to temperatures higher than the optimal temperature for their growth. This situation contrasts with that of temperate and polar waters, where increased temperature may stimulate growth of the indigenous phytoplankton species or allow more thermally tolerant species to immigrate. Our research addresses the questions: How do cyanobacteria acclimate to temperatures that are supra-optimal for growth? What are the implications of this acclimation for their productivity in a warming ocean? And how can we account for acclimation to supra-optimal temperatures in models of cyanobacterial growth? Unlike previous research on short-term (minutes to hours) responses of cyanobacteria, algae and vascular plants to heat shock, we propose to investigate the mechanisms of long-term (days to weeks) acclimation to heat stress and the implications of this acclimation for growth and physiology.
As far as we are aware, this will be the first such investigation of long-term acclimation to supra-optimal (heat) temperatures for an alga or a cyanobacterium, and as such it will complement the more extensive literature on acclimation to sub-optimal (cold) temperatures in plants, algae and cyanobacteria by providing information that is particularly relevant in the face of global warming. We will employ a holistic approach using state-of-the-art methods to obtain this understanding. Transcriptomics will be used to generate the data to construct gene regulatory networks involved in sensing and responding to high temperature. Comparison of these networks amongst species with different tolerances to high temperature will be used to identify commonalities and differences that may explain the observed thermal sensitivities. Proteomics and metabolomics will be used to assess the remodeling of cell metabolism that occurs as a consequence of acclimation to high temperature. Measurements of physiological rates, elemental composition (C, N, P) and biochemical composition will be used to assess the system-level outcomes of this acclimation in terms of biomass and productivity. The proposed comprehensive assessment of thermal acclimation is both timely and novel, and will contribute to continued excellence in a field where UK researchers make major impacts on a topic of global significance. Our research will help scientists to understand how global warming due to human activities is changing a fundamental component of Earth's life support system. Marine phytoplankton produce about 50% of the oxygen that we breathe, and play a role over millennial timescales in regulating atmospheric carbon dioxide levels. The information that we obtain will be used in the further development of the increasingly sophisticated models of marine ecology that are used to project how the ocean is responding to climate change.
In addition, cyanobacteria are being investigated for their potential use in biotechnology for the production of low-value products such as protein for animal feed or lipids for the production of biodiesel, as well as high-value products, including nutritional supplements (carotenoids, fatty acids, polysaccharides, vitamins, sterols) for human consumption and other products (dyes, pharmaceuticals, adhesives, surfactants).
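The supra-optimal regime described in this abstract can be pictured with a toy thermal performance curve: growth rises gradually toward an optimum, then collapses steeply between the optimum and the lethal maximum, which is why tropical phytoplankton sitting near their optimum are so vulnerable to further warming. All parameters below are illustrative, not measured values from the project.

```python
import math

def growth_rate(temp_c, t_opt=25.0, sigma=8.0, t_max=32.0, mu_max=1.0):
    """Asymmetric thermal performance curve (illustrative parameters):
    a Gaussian rise below the optimum t_opt, and a steeper quadratic
    decline between t_opt and the lethal maximum t_max - the
    supra-optimal regime this project investigates."""
    if temp_c <= t_opt:
        return mu_max * math.exp(-((temp_c - t_opt) / sigma) ** 2)
    if temp_c >= t_max:
        return 0.0
    return mu_max * (1.0 - ((temp_c - t_opt) / (t_max - t_opt)) ** 2)

# 3 C above the optimum costs more growth than 3 C below it.
below = growth_rate(22.0)
above = growth_rate(28.0)
```

Acclimation would, in effect, reshape this curve (shifting t_opt or flattening the supra-optimal decline); representing that plasticity is exactly what the proposed growth models need to capture.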
Agency: GTR | Branch: BBSRC | Program: | Phase: Research Grant | Award Amount: 695.93K | Year: 2016
Numerous studies of the effects of CO2 enrichment in the field, including on wheat, show that increased crop yields can be obtained through increased photosynthetic carbon assimilation. Furthermore, experiments conducted in the applicants' laboratories on transgenic plants, in which the activities of individual enzymes were altered, provided evidence that manipulation of the Calvin cycle has the potential to improve photosynthesis and increase plant productivity. These studies, together with integrated systems modelling, identified photosynthetic carbon assimilation as an untapped target for increasing photosynthetic efficiency and yield by as much as 60%. The overall aim of this project is to exploit the extensive knowledge of photosynthesis, and the experience gained from its manipulation in model species, to produce wheat plants with enhanced photosynthetic performance and increased yield. We will undertake growth, yield, physiological and molecular analyses of transgenic plants in high-light controlled environments, and test the most promising events in replicated field trials in the UK and in Illinois. Given that our aim is to further increase yields, we will use two modern top-yielding cultivars, one adapted to western European temperate conditions and the other to the US Midwest, with its high humidity and high temperature. Each will be tested in replicated trials in its native environment. Use of these two very different growing environments and genetic backgrounds will test the broader efficacy of the transgenic modifications, providing a sound basis for the production of higher-yielding varieties for the developing world.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-24-2015 | Award Amount: 5.16M | Year: 2016
This project addresses the scientific, technological and clinical problem of the recovery of hand function after amputation. Despite decades of research and development on artificial limbs and neural interfaces, amputees continue to use powered prosthesis technology developed over 40 years ago, namely myoelectric prostheses controlled via superficial electrodes. These devices do not purposely provide sensory feedback and are known for their poor functionality, controllability and sensory feedback, mainly due to the use of surface electrodes. The consortium has pioneered the use of osseointegration as a long-term stable solution for the direct skeletal attachment of limb prostheses. This technology, aside from providing an efficient mechanical coupling, which on its own has been shown to improve prosthesis functionality and the patient's quality of life, can also be used as a bidirectional communication interface between implanted electrodes and the prosthetic arm. This is today the most advanced technique for bidirectional neuromuscular interfacing suited to upper limb amputees, and it has been proven functional in the long term. The goal of the DeTOP project is to push the boundaries of this technology, made in Europe, to the next TRL and to make it clinically available to the largest population of upper limb amputees, namely transradial amputees. This objective will be targeted by developing a novel prosthetic hand with improved functionality and smart mechatronic devices for safe implantable technology, and by studying and assessing paradigms for natural control (action) and sensory feedback (perception) of the prosthesis through the implant. The novel technologies and findings will be assessed by three selected patients, implanted in a clinical centre. DeTOP bridges several currently disjointed scientific fields and is therefore critically dependent on the collaboration of engineers, neuroscientists and clinicians.