Cambridge, United Kingdom

Nagaraja S.,Computer Laboratory
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2010

There are a number of scenarios where users wishing to communicate share a weak secret. Often, they are also part of a common social network. Connections (edges) from the social network are represented as shared link keys between participants (vertices). We propose mechanisms that utilise the graph topology of such a network to increase the entropy of weak pre-shared secrets. Our proposal is based on using random walks to identify a chain of common acquaintances between Alice and Bob, each of whom contributes entropy to the final key. Our mechanisms exploit the one-wayness and convergence properties of Markovian random walks to, first, maximise the set of potential entropy contributors and, second, resist contributions from dubious sources such as Sybil sub-networks. © 2010 Springer-Verlag.
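The core idea in the abstract can be sketched in a few lines. This is a toy illustration only, not the paper's protocol (the graph, key names, and use of SHA-256 are all assumptions): a Markovian random walk steps through the social graph, and the shared link key of each traversed edge is mixed into a running hash, so the final key accumulates entropy from every acquaintance on the path.

```python
import hashlib
import random

def random_walk_key(graph, link_keys, start, steps, seed=None):
    """Walk `steps` edges from `start`; mix each traversed link key
    into a running hash to strengthen a weak pre-shared secret.
    (Toy sketch: graph, link_keys and the hash choice are assumptions.)"""
    rng = random.Random(seed)
    h = hashlib.sha256()
    node = start
    for _ in range(steps):
        nxt = rng.choice(graph[node])        # uniform Markovian step
        edge = tuple(sorted((node, nxt)))    # undirected edge identifier
        h.update(link_keys[edge])            # this acquaintance's entropy
        node = nxt
    return h.hexdigest()

# Hypothetical three-person network: Alice and Bob share Carol.
graph = {"alice": ["carol"], "carol": ["alice", "bob"], "bob": ["carol"]}
link_keys = {("alice", "carol"): b"k1", ("bob", "carol"): b"k2"}
print(random_walk_key(graph, link_keys, "alice", 2, seed=0))
```

In the real scheme the walk's convergence properties matter; here the seed only makes the toy deterministic for demonstration.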


Gordon M.J.C.,Computer Laboratory | Kaufmann M.,University of Texas at Austin | Ray S.,University of Texas at Austin
Journal of Automated Reasoning | Year: 2011

We present a case study illustrating how to exploit the expressive power of higher-order logic to complete a proof whose main lemma is already proved in a first-order theorem prover. Our proof exploits a link between the HOL4 and ACL2 proof systems to show correctness of a cone of influence reduction algorithm, implemented in ACL2, with respect to the classical semantics of linear temporal logic, formalized in HOL4. © 2010 Springer Science+Business Media B.V.
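For readers unfamiliar with the algorithm being verified, cone of influence reduction prunes state variables that cannot affect the property under check. The following is a minimal sketch of the idea only (the dependency-map representation is an assumption for illustration, not the verified ACL2 implementation): starting from the property's variables, transitively collect every variable their next-state functions read.

```python
def cone_of_influence(deps, prop_vars):
    """deps maps each variable to the variables its next-state function
    reads; return the set of variables that can influence prop_vars.
    (Toy sketch of the standard reduction, not the ACL2 code.)"""
    cone, frontier = set(), set(prop_vars)
    while frontier:
        v = frontier.pop()
        if v not in cone:
            cone.add(v)
            frontier |= set(deps.get(v, ()))
    return cone

# Hypothetical circuit: 'c' and 'd' cannot influence 'a', so a property
# over 'a' lets the model checker discard them.
deps = {"a": ["b"], "b": ["b"], "c": ["a", "d"], "d": []}
print(cone_of_influence(deps, ["a"]))  # {'a', 'b'}
```

The correctness claim proved in the paper is that model checking the reduced system agrees with checking the original on all LTL properties over the retained variables.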


News Article
Site: http://www.rdmag.com/rss-feeds/all/rss.xml/all

Here’s a crossword puzzle clue for you: a tall, long-necked spotted ruminant of Africa. If you’re someone who knows African wildlife like the back of your hand, it may take you only a moment to come up with the answer: giraffe. Researchers from the Univ. of Cambridge, New York Univ. and Université de Montréal have developed a web-based platform that can assist you with your morning crossword puzzle and act as a reverse dictionary. But as fun as solving crossword puzzles might be, the tool may have a deeper role to play when it comes to teaching artificial intelligence (AI) systems human language. The research behind the tool was published in Transactions of the Association for Computational Linguistics. “To compile a bank of dictionary definitions for training the model, we started with all words in the target embedding space. For each of these words, we extracted dictionary-style definitions from five electronic sources: Wordnet, The American Heritage Dictionary, The Collaborative International Dictionary of English, Wiktionary and Webster’s,” the researchers wrote in their study. “To allow models access to more factual knowledge than might be present in a dictionary (for instance, information about specific entities, places or people) we supplemented this training data with information extracted from Simple Wikipedia,” they added. The researchers released the code for their system online for future research use. The research represents an early step towards endowing machines with the ability to understand human language. Deep learning, in which scientists feed an artificial neural network massive amounts of data, is integral to this process. “Despite recent progress in AI, problems involving language understanding are particularly difficult, and our work suggests many possible applications of deep neural network to language technology,” said co-author Felix Hill, of Cambridge’s Computer Laboratory, in a statement.
“One of the biggest challenges in training computers to understand language is recreating the many rich and diverse information sources available to humans when they learn to speak and read.” Currently, the way computers learn language, according to Hill, is similar to the psychological framework known as cognitivism. To understand the larger context of human communication, however, researchers need to combine cognitivism with behaviorism so that a machine can infer meaning. According to the Univ. of Cambridge, the researchers are looking into integrating behaviorist-style models of language learning and linguistic interaction into their system.
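The reverse-dictionary mechanism described above can be illustrated with a toy sketch. Everything here is invented for illustration (the vocabulary, the three-dimensional vectors, and averaging as the definition encoder are assumptions; the actual system trains a neural network to map a definition into a word-embedding space): embed the definition, then return the vocabulary word whose embedding is closest by cosine similarity.

```python
import math

# Made-up embedding table for three vocabulary words.
word_vecs = {
    "giraffe": [0.9, 0.8, 0.1],
    "zebra":   [0.8, 0.3, 0.2],
    "sofa":    [0.1, 0.1, 0.9],
}

def embed_definition(tokens, token_vecs):
    """Average the vectors of known tokens: a crude stand-in for the
    trained definition encoder (unknown tokens are skipped)."""
    vecs = [token_vecs[t] for t in tokens if t in token_vecs]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(a * a for a in w))
    return dot / (norm(u) * norm(v))

def reverse_lookup(defn_vec, word_vecs, k=1):
    """Rank vocabulary words by similarity to the definition vector."""
    ranked = sorted(word_vecs,
                    key=lambda w: cosine(defn_vec, word_vecs[w]),
                    reverse=True)
    return ranked[:k]

# Hypothetical token vectors for two words of the clue.
token_vecs = {"tall": [0.9, 0.9, 0.0], "spotted": [0.9, 0.7, 0.2]}
query = embed_definition(["tall", "spotted", "ruminant"], token_vecs)
print(reverse_lookup(query, word_vecs))  # ['giraffe']
```

The published model replaces the averaging step with a learned recurrent or bag-of-words encoder, but the lookup-by-similarity step works as sketched.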


News Article
Site: http://phys.org/technology-news/

The researchers behind the study, led by the University of Cambridge, will present their results today (13 April) at the 25th International World Wide Web Conference in Montréal. The Cambridge researchers, working with colleagues from the University of Birmingham, Queen Mary University of London, and University College London, used data from approximately 37,000 users and 42,000 venues in London to build a network of Foursquare places and the parallel Twitter social network of visitors, adding up to more than half a million check-ins over a ten-month period. From this data, they were able to quantify the 'social diversity' of various neighbourhoods and venues by distinguishing between places that bring together strangers versus those that tend to bring together friends, as well as places that attract diverse individuals as opposed to those which attract regulars. When these social diversity metrics were correlated with wellbeing indicators for various London neighbourhoods, the researchers discovered that signs of gentrification, such as rising housing prices and lower crime rates, were the strongest in deprived areas with high social diversity. These areas had an influx of more affluent and diverse visitors, represented by social media users, and pointed to an overall improvement of their rank, according to the UK Index of Multiple Deprivation. The UK Index of Multiple Deprivation (IMD) is a statistical exercise conducted by the Department of Communities and Local Government, which measures the relative prosperity of neighbourhoods across England. The researchers compared IMD data for 2010, the year their social and place network data was gathered, with the IMD data for 2015, the most recent report. "We're looking at the social roles and properties of places," said Desislava Hristova from the University's Computer Laboratory, and the study's lead author. 
"We found that the most socially cohesive and homogenous areas tend to be either very wealthy or very poor, but neighbourhoods with both high social diversity and high deprivation are the ones which are currently undergoing processes of gentrification." This aligns with previous research, which has found that tightly-knit communities are more resistant to change, and that resources remain within the community. This suggests that affluent communities remain affluent and poor communities remain poor because they are relatively isolated. Hristova and her co-authors found that of the 32 London boroughs, the borough of Hackney had the highest social diversity, and in 2010, had the second-highest deprivation. By 2015, it had also seen the most improvement on the IMD index, and is now an area undergoing intense gentrification, with house prices rising far above the London average, a fast-decreasing crime rate and a highly diverse population. In addition to Hackney, Tower Hamlets, Greenwich, Hammersmith and Lambeth are also boroughs with high social diversity and high deprivation in 2010, and are now undergoing the process of gentrification, with all of the positive and negative effects that come along with it. The ability to predict the gentrification of neighbourhoods could help local governments and policy-makers improve urban development plans and alleviate the negative effects of gentrification while benefitting from economic growth. In order to measure the social diversity of a given place or neighbourhood, the researchers defined four distinct measures: brokerage, serendipity, entropy and homogeneity. Brokerage is the ability of a place to connect people who are otherwise disconnected; serendipity is the extent to which a place can induce chance encounters between its visitors; entropy is the extent to which a place is diverse with respect to visits; and homogeneity is the extent to which the visitors to a place are homogenous in their characteristics.
Within categories of places, the researchers found that some were more likely meeting places for friends, while others hosted more fleeting encounters. For example, in the food category, strangers were more likely to meet at a dumpling restaurant while friends were more likely to meet at a fried chicken restaurant. Similarly, friends were more likely to meet at a B&B, football match or strip club, while strangers were more likely to meet at a motel, art museum or gay bar. "We understand that people who diversify their contacts not only socially but also geographically have high social capital but what about places?" said Hristova. "We all have a general notion of the social diversity of places and the people that visit them, but we've attempted to formalise this - it could even be used as a specialised local search engine." For instance, while there are a number of ways a tourist can find a highly-recommended restaurant in a new city, the social role that a place plays in a city is normally only known by locals through experience. "Whether a place is touristy or quiet, artsy or mainstream could be integrated into mobile system design to help newcomers or tourists feel like locals," said Hristova.
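Of the four measures defined above, the entropy of visits is the most straightforward to sketch. The check-in data below is invented for illustration, and the paper's exact formulation may differ, but the intuition is standard Shannon entropy: a venue whose check-ins are spread evenly over many distinct visitors scores higher than one dominated by regulars.

```python
import math
from collections import Counter

def visit_entropy(checkins):
    """Shannon entropy (in bits) of the distribution of check-ins over
    visitors; higher means a more diverse crowd.
    (Toy sketch of the 'entropy' diversity measure.)"""
    counts = Counter(checkins)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# Hypothetical check-in logs for two venues.
regulars_bar = ["ann", "ann", "ann", "bob"]   # dominated by one regular
tourist_cafe = ["ann", "bob", "cat", "dan"]   # evenly spread visitors
print(visit_entropy(regulars_bar) < visit_entropy(tourist_cafe))  # True
```

The other three measures (brokerage, serendipity, homogeneity) would additionally need the social network between visitors, which a single check-in list does not capture.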


News Article
Site: http://phys.org/technology-news/

Specialised computer software components to improve the security, speed and scale of data processing in cloud computing are being developed by a University of Cambridge spin-out company. The company, Unikernel Systems, which was formed by staff and postdoctoral researchers at the University Computer Laboratory, has recently been acquired by San Francisco-based software company Docker Inc. Unikernels are small, potentially transient computer modules specialised to undertake a single task at the point in time when it is needed. Because of their reduced size, they are far more secure than traditional operating systems, and can be started up and shut down quickly and cheaply, providing flexibility and further security. They are likely to become increasingly used in applications where security and efficiency are vital, such as systems storing personal data and applications for the so-called Internet of Things (IoT) – internet-connected appliances and consumer products. "Unikernels provide the means to run the same application code on radically different environments from the public cloud to IoT devices," said Dr Richard Mortier of the Computer Laboratory, one of the company's advisors. "This allows decisions about where to run things to be revisited in the light of experience - providing greater flexibility and resilience. It also means software on those IoT devices is going to be a lot more reliable." Recent years have seen a huge increase in the amount of data that is collected, stored and processed, a trend that will only continue as increasing numbers of devices are connected to the internet. Most commercial data storage and processing now takes place within huge datacentres run by specialist providers, rather than on individual machines and company servers; the individual elements of this system are obscured to end users within the 'cloud'. One of the technologies that has been instrumental in making this happen is virtual machines.
Normally, a virtual machine (VM) runs just like a real computer, with its own virtual operating system – just as your desktop computer might run Windows. However, a single real machine can run many VMs concurrently. VMs are general purpose, able to handle a wide range of jobs from different types of user, and capable of being moved across real machines within datacentres in response to overall user demand. The University's Computer Laboratory started research on virtualisation in 1999, and the Xen virtual machine monitor that resulted now provides the basis for much of the present-day cloud. Although VMs have driven the development of the cloud (and greatly reduced energy consumption), their inherent flexibility can come at a cost if their virtual operating systems are the generic Linux or Windows systems. These operating systems are large and complex, they have significant memory footprints, and they take time to start up each time they are required. Security is also an issue, because of their relatively large 'attack surface'. Given that many VMs are actually used to undertake a single function (e.g. acting as a company database), recent research has shifted to minimising complexity and improving security by taking advantage of the narrow functionality. And this is where unikernels come in. Researchers at the Computer Laboratory started restructuring VMs into flexible modular components in 2009, as part of the RCUK-funded MirageOS project. These specialised modules – or unikernels – are in effect the opposite of generic VMs. Each one is designed to undertake a single task; they are small, simple and quick, using just enough code to enable the relevant application or process to run (about 4% of a traditional operating system according to one estimate).
The small size of unikernels also lends considerable security advantages, as they present a much smaller 'surface' to malicious attack, and also enable companies to separate out different data processing tasks in order to limit the effects of any security breach that does occur. Given that resource use within the cloud is metered and charged, they also provide considerable cost savings to end users. By the end of last year, the unikernel technology arising from MirageOS was sufficiently advanced that the team, led by Dr. Anil Madhavapeddy, decided to found a start-up company. The company, Unikernel Systems, was recently acquired by San Francisco-based Docker Inc. to accelerate the development and broad adoption of the technology, now envisaged as a critical element in the future of the Internet of Things. "This brings together one of the most significant developments in operating systems technology of recent years, with one of the most dynamic startups that has already revolutionised the way we use cloud computing. This link-up will truly allow us all to 'rethink cloud infrastructure'," said Balraj Singh, co-founder and CEO of Unikernel Systems. "This acquisition shows that the Computer Laboratory continues to produce innovations that find their way into mainstream developments. It also shows the power of open source development to have impact and to be commercially successful," said Professor Andy Hopper, Head of the University of Cambridge Computer Laboratory.
