Palo Alto, CA, United States


Grant
Agency: Cordis | Branch: FP7 | Program: NOE | Phase: ICT-2007.1.1 | Award Amount: 20.70M | Year: 2008

Future networks have become a central topic, with a broad debate over whether the move towards the new networked society will be evolutionary or disruptive. In the future networked society the physical and the digital worlds will merge, based on the massive usage of wireless sensor networks. Objects will be able to identify and locate themselves and to communicate through radio interfaces. Self-organized edge networks will become more and more common. Virtualization and programmability will allow different networking environments to be provided over the same infrastructure. Autonomic networking will deal with the increasing complexity of information and communication systems. End-user empowerment will increase, with users providing services and content as well as connectivity support.

This new environment forces the scientific community to develop new principles and methods to design, dimension, control and manage future multi-technology architectures. The new paradigms raise challenging scientific and technological problems embedded in complex policy, governance, and worldwide standards issues. Dealing with the diversity of these scientific and socio-economic challenges requires the integration of a wide range of research capacities; a role that Euro-NF will fulfil.

Indeed, Euro-NF extends, in scope and duration, the successful Euro-NGI/FGI NoE, which has integrated the required critical mass on the networks of the future and is now a major worldwide player in this area. The consortium has evolved in order to achieve an optimal coverage of the new scope. Euro-NF will therefore integrate a wide range of European research capacities, spanning researchers as well as research and dissemination activities. As such, Euro-NF will continue to develop a prominent European center of excellence in future network design and engineering, acting as a Collective Intelligence Think Tank and providing major support to European society on the path towards European leadership in this area.


Grant
Agency: Cordis | Branch: FP7 | Program: CP | Phase: FI.ICT-2011.1.8 | Award Amount: 20.32M | Year: 2013

The FI-CONTENT 2 project aims at establishing the foundation of a European infrastructure for promoting and testing novel uses of audio-visual content on connected devices. The partners will develop and deploy advanced platforms for Social Connected TV, Mobile Smart City services, and Gaming/Virtual Worlds. To assess the approach and improve these platforms, user communities in six European locations will be activated for living lab and field trials. The project is strongly supported by local stakeholders (regional authorities, associations, educational organizations, user groups) who will participate in the project via User Advisory Boards. The technical capabilities of the platforms will be validated and improved by integrating new, content-usage-driven partners recruited via the open call planned early in the project.

In FI-CONTENT (FI-PPP Phase 1), we demonstrated that challenging and bold assertions about next-generation Internet content and technology needs are best assessed with radical yet practical demonstrators, use cases, APIs and field research. FI-CONTENT 2 builds on our work in Phase 1, refining the findings where appropriate.

The project has good relationships with the other projects of the FI-PPP program. Contacts have been made for coordination and potentially joint experiments with other FI-PPP projects. The proposal shows how to work with FI-WARE and existing EU infrastructure projects where suitable, and demonstrates how best to create and define new domain-specific technologies, mostly cloud based.

The FI-CONTENT 2 partnership is a balanced group of large industrial content and media companies, technology suppliers, telecommunications/Internet access operators, living labs and academic institutions. FI-CONTENT 2 harnesses the power and excitement of content on the new Internet to drive European innovation, content creation and distribution to enrich the lives of all Europeans.


Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 3.35M | Year: 2014

Our 21st century lives will be increasingly connected to our digital identities, representations of ourselves that are defined from trails of personal data and that connect us to commercial and public services, employers, schools, families and friends. The future health of our Digital Economy rests on training a new generation of leaders who can harness the emerging technologies of digital identity for both economic and societal value, but in a fair and transparent manner that accommodates growing public concern over the use of personal data. We will therefore train a community of 80 PhD students with the interdisciplinary skills needed to address the profound challenges of digital identity in the 21st century.

Our training programme will equip students with a unique blend of interdisciplinary skills and knowledge across three thematic aspects of digital identity - enabling technologies, global impacts, and people and society - while also providing them with the wider research and professional skills to deliver a research project across the intersection of at least two of these. Our students will be situated within Horizon, a leading centre for Digital Economy research and a vibrant environment that draws together a national research Hub, CDT and a network of over 100 industry, academic and international partners. Horizon currently provides access to a large network of over 75 potential supervisors, ranging from leading Professors to talented early career researchers.

Each student will work with an industry, public, third sector or international partner to ensure that their research is grounded in real user needs, to maximise its impact, and also to enhance their employability. These external partners will be involved in co-sponsorship, supervision, providing resources and hosting internships. Our external partners have already committed to co-sponsor 30 students, and we expect this number to grow. Our centre also has a strong international perspective, working with international partners to explore the global marketplace for digital identity services as well as the cross-cultural issues that this raises. This will build on our success in exporting the CDT model to China, where we have recently established a £17M International Doctoral Innovation Centre to train 50 international students in digital economy research with funding from Chinese partners.

We run an integrated four-year training programme that features a bespoke core covering key topics in digital identity, optional advanced specialist modules, practice-led team and individual projects, training in research methods and professional skills, public and external engagement, and cohort-building activities including an annual writing retreat and summer school. The first year features a nine-month structured process of PhD co-creation in which students, supervisors and external partners iteratively refine an initial PhD topic into a focused research proposal.

Building on our experience of running the current Horizon CDT over the past five years, our management structure responds to external, university and student input and manages students through seven key stages of an extended PhD process: recruitment, induction, taught programme, PhD co-creation, PhD research, thesis, and alumni. Students will be recruited onto and managed through three distinct pathways - industry, international and institutional - that reflect the funding, supervision and visiting constraints of working with varied external partners.


News Article | March 2, 2017
Site: www.csmonitor.com

Computer engineers have created some amazingly small devices, capable of storing entire libraries of music and movies in the palm of your hand. But geneticists say Mother Nature can do even better. DNA, where all of biology's information is stored, is incredibly dense. The whole genome of an organism fits into a cell that is invisible to the naked eye. That's why computer scientists are turning to molecular biology to design the next best way to store humanity's ever-increasing collection of digital data.

With every new app, selfie, blog post, or cat video, the hardware to store the world's vast archive of digital information is filling up. But, theoretically, DNA could store up to 455 exabytes per gram. In other words, you could have 44 billion copies of the extended versions of all three of The Lord of the Rings movies on the tip of your finger. (For reference, watching all those movies would take more than 164 million years.)

George Church, a geneticist at Harvard University and the Massachusetts Institute of Technology, first used DNA as storage for digital information in 2012, which he reported in a paper published in the journal Science. At the time, he revealed his success during an interview on the Colbert Report by showing Stephen Colbert a tiny piece of paper on which there was a small spot that contained millions of copies of Dr. Church's book, "Regenesis," in the form of DNA.

Church and his colleagues were focused on proving that digital information could indeed be encoded in DNA at the time. But since then, teams of engineers and biologists have expanded on this proof-of-concept and worked to squeeze more and more data into DNA, eyeing the vast storage Church had predicted possible. A team at the European Bioinformatics Institute (EBI) in Hinxton, Britain, reported in 2013 that they had made the largest DNA archive ever, putting 739 kilobytes worth of computer files into DNA strands. (Church's book had required about 650 kilobytes.) In July 2016, a team of Microsoft and University of Washington researchers announced that they had broken that record, storing 200 megabytes of data in DNA.

Now, researchers at the New York Genome Center and Columbia University have ramped up the density of data stored in DNA molecules. They were able to reach a density of 214 petabytes per gram of DNA, according to a paper published Thursday in the journal Science – which is over eight times as dense as previous work.

"This is a huge leap forward," says Church, who was not involved in the new research. Although he had calculated that this high data density was possible in his own work, Church and his team hadn't actually made it work. "They've proven a hypothetical," he says in a phone interview with The Christian Science Monitor.

From DVDs to DNA: How does it work?

Digital data in its simplest form is just 0s and 1s, Yaniv Erlich, lead author of the new study, explains in a phone interview with the Monitor. Any file, be it a computer program or a movie, is made up of a series of 0s and 1s. Similarly, DNA has its own series of letters, A, C, G, and T. Those letters represent the nucleotides – adenine, cytosine, guanine, and thymine – that are the basic structural units of DNA. So to convert digital data to DNA, Dr. Erlich's team and others have essentially translated 0s and 1s into As, Cs, Gs, and Ts. Then, the resulting DNA sequence is sent to a company that prints synthetic DNA, in this case San Francisco-based Twist Bioscience.
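To make that translation concrete, here is a minimal sketch of a naive two-bits-per-nucleotide mapping. It is purely illustrative: it is not the team's actual DNA Fountain encoding (which also screens out fragile sequences and adds redundancy), and the function names are invented for this example.

```python
# Illustrative only: a naive 2-bits-per-nucleotide mapping between bytes and DNA
# letters. The real DNA Fountain scheme additionally avoids fragile sequences
# (e.g. long runs of one base) and adds redundancy against lost molecules.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
TO_BITS = {base: bits for bits, base in TO_BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Translate raw bytes into a string of A/C/G/T, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Reverse the mapping: read the bases back into bits, then bytes."""
    bits = "".join(TO_BITS[base] for base in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hi"
strand = bytes_to_dna(message)          # "CGGACGGC": four bases per byte
assert dna_to_bytes(strand) == message  # round-trips back to the original bits
print(strand)
```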
What they receive back is a vial about half the size of a thumb that looks like it just has a little liquid in it. But there's actually DNA in there. To access the data stored in it, the team sequences the DNA and translates it back into 0s and 1s. In this case, the researchers encoded and then retrieved a full computer operating system, an 1895 French film, "Arrival of a train at La Ciotat," a $50 Amazon gift card, a computer virus, a Pioneer plaque, and a 1948 study by information theorist Claude Shannon. As one of the tests of the data, Erlich used the computer operating system to play the game Minesweeper.

The genetic material is not extracted from any animal or plant. "DNA is just a hardware here," Erlich writes in a follow-up email to the Monitor. "It is not related to anything that is living and is not even derived from anything that was alive before. The synthesis, copying, and sequencing process are purely chemical."

Turning digital data into DNA may seem as simple as coming up with a code for 0s and 1s, and As, Cs, Gs, and Ts. But it's a bit more complicated than that. First of all, Erlich says, not all DNA sequences are robust. For example, a string of all the same nucleotides, say, AAAAAAAAAAAA, is particularly fragile and difficult to read correctly. But the same isn't true for computer code. In addition, not all DNA molecules will survive the sequencing and retrieval process. And the scientists can't risk losing key pieces of the code.

To resolve these problems, Erlich used what is known in computing as a fountain code to act as a sort of gatekeeper that provides clues to the code rather than the code itself. Because DNA Fountain, as he calls the algorithm, can provide an unlimited number of clues, if a few get lost in the process the researchers will still be able to decode the DNA sequence in the end (a minimal illustrative sketch of this idea appears after this article).

In addition to this method to make the translation more robust, Erlich wanted to see if the data-filled DNA could be replicated without error. The process of sequencing the DNA includes removing some molecules from the sample. So to preserve the data and be able to access it, scientists have to be able to make copies, Erlich explains. So he made 25 copies, and copies of the copies, and copies of the copies of the copies, and so on nine times. And even in the most copied copies, he says, "we were able to perfectly retrieve this information. It's very robust."

Are we entering the age of DNA computers?

Despite these strides to move digital data from hard drives to DNA and back, don't expect your next computer or smartphone to contain DNA. "This is still the early stages of DNA storage. It's basic science," Erlich says. "It's not that tomorrow you're going to go to Best Buy and get your DNA hard drive. And we don't envision that this will be in some hard drive that people will buy."

"I think the more immediate use is for archiving," Church says. The method lends itself to archiving vast amounts of data that doesn't need to be accessed regularly, like video surveillance, for example, he says. Besides density, one reason DNA data storage would be advantageous over, say, a massive warehouse full of hard drives, Erlich says, is that it doesn't need to be kept cool. Furthermore, DNA doesn't degrade like other data storage tools. Paleoanthropologists have sequenced DNA from Neanderthals and other ancient humans, so Erlich isn't concerned about the longevity of this sort of data storage.

The Microsoft researchers see the applications of DNA data storage more broadly.
"Any organization or individual who needs long-term archival storage of large amounts of data would benefit from a DNA storage option," write Karin Strauss of Microsoft and Luis Ceze of the University of Washington in an email to the Monitor. "For example, hospitals need to store clinical information for all their patients for a long time, research institutions have massive amounts of data from research projects that need to be preserved, and the emerging virtual reality industry needs high-capacity storage solutions for very large video files. In addition, consumers could benefit from DNA storage via the cloud, especially following the advent of highly portable video cameras and the demand to store personal video online." Currently, the cost and time required for this process is somewhat prohibitive for consumer applications. It cost $7,000 to synthesize the DNA Erlich developed and another $2,000 to read it. The synthesis process took two weeks and the sequencing took about a day. That's not to say that DNA data storage won't touch consumers' everyday life. Church's team has worked with Technicolor to use the new data storage method to preserve the company's many old films. During a media tour in 2016, Jean Bolot, vice-president for research and innovation at Technicolor, showed off a vial containing a million copies of the 1902 French silent film "A Trip to the Moon." He said, "This, we believe, is what the future of movie archiving will look like." [Editor's note: An earlier version of this article erroneously suggested that the Columbia University researchers broke the Microsoft/University of Washington 200-megabyte milestone. An earlier version of the headline of this mistakenly conflated molecular biology with microbiology.]


Le Scouarnec N., Technicolor
IEEE International Symposium on Information Theory - Proceedings | Year: 2012

We study the exact and optimal repair of multiple failures in codes for distributed storage. More particularly, we examine the use of interference alignment to build exact scalar minimum storage coordinated regenerating codes (MSCR). We show that it is possible to build codes for the case of k = 2 and d ≥ k by aligning interferences independently but that this technique cannot be applied as soon as k ≥ 3 and d > k. Our results also apply to adaptive regenerating codes. © 2012 IEEE.


Grant
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2013.1.6 | Award Amount: 4.95M | Year: 2013

The explosion of information available online and the ubiquity of connected media devices are rendering existing content recommendation and content delivery systems inadequate. The recent data deluge has made finding relevant content a daunting task. Users are presented with seemingly infinite choices for consumption, yet recommendation systems are typically service- or application-specific and based on little or narrow data, which results in overly coarse-grained recommendations. Furthermore, existing content delivery systems focus their media adaptation on matching device and network characteristics, rather than the user's context and profile, which could help increase the relevance of content search and improve viewing conditions.

We introduce User Centric Networking (UCN), a new communication paradigm that leverages user information at large to store, discover and deliver content in optimal conditions at any time, for a given user in a specific context. UCN relies on a distributed Personal Information Hub (PIH) that contains information such as the user's context, mood, historical data about their tastes and expectations, social acquaintances, and network/device resources. UCN will use these data to decide, at any point in time, where to search for content, where to deliver it from, and how to configure the delivery for a user in their context. In addition, UCN creates opportunities for a new range of personalized services, based for example on geo-location or the fusion of very different sensor data.

UCN will deliver prototypes for a new generation of Internet-based applications and services in the digital media sector and beyond. These prototypes will be deployed in Technicolor's Home Networking product line, designed with real data and tested in real conditions at Portugal Telecom and NICTA, who both use Technicolor's most recent gateway technology.
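The abstract stays at the architectural level; as a purely hypothetical illustration (the field names, sources, and scoring heuristic below are invented for this sketch and do not reflect the project's actual design), a PIH record and a context-aware delivery decision might look something like this:

```python
# Hypothetical sketch of a Personal Information Hub (PIH) record and a
# context-aware source-selection step; everything here is invented for
# illustration and is not the UCN project's actual design.
from dataclasses import dataclass, field

@dataclass
class PIH:
    user_id: str
    context: dict = field(default_factory=dict)             # e.g. {"location": "home", "device": "tablet"}
    taste_history: list = field(default_factory=list)       # past content identifiers
    network_resources: dict = field(default_factory=dict)   # e.g. {"bandwidth_mbps": 20}

def choose_delivery_source(pih: PIH, sources: dict) -> str:
    """Toy heuristic: prefer the fastest source that does not exceed the user's link rate."""
    link = pih.network_resources.get("bandwidth_mbps", 1)
    feasible = {name: rate for name, rate in sources.items() if rate <= link}
    candidates = feasible if feasible else sources
    return max(candidates, key=candidates.get)

pih = PIH("alice", context={"location": "home"}, network_resources={"bandwidth_mbps": 20})
print(choose_delivery_source(pih, {"cdn_edge": 15, "home_gateway": 8, "origin": 50}))  # -> "cdn_edge"
```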


VFX studio MPC, a Technicolor company, has received an Oscar nomination for Best Visual Effects for its work on the Disney movie The Jungle Book.


News Article | February 15, 2017
Site: globenewswire.com

Technicolor provides technologies, products and services to leading companies in the media and entertainment industry and holds an intellectual property portfolio of more than 30,000 patents and applications in the fields of video compression, image processing, telecommunications, user experience, security and displays. This portfolio is the result of years of internal Research and Development at Technicolor dedicated to advancing technologies and equipment for media and entertainment. It is the product of the cumulative work of thousands of Technicolor researchers and engineers over the years and of Research and Development investments of more than 100 million euros per year. After lengthy negotiations to reach a balanced agreement, the Group must now act to protect itself and preserve the value of its intellectual property portfolio. The infringement actions have been filed with the Regional Courts of Düsseldorf and Mannheim in Germany and the Tribunal de Grande Instance de Paris in France. They concern ten patents relating to video compression, telecommunications and other associated technologies. Technicolor is represented by the international firm Bird & Bird in the European litigation and is advised by the firm Irell & Manella LLP on matters of US intellectual property law.

Technicolor, a worldwide technology leader in the Media & Entertainment sector, is at the forefront of digital innovation. Through our leading research and innovation laboratories, we hold key market positions by supplying advanced video services to content creators and distributors. We also benefit from a rich intellectual property portfolio centered on imaging and sound technologies. Our commitment: to support the development of exciting new consumer experiences at the cinema, at home, or on the move.


News Article | February 20, 2017
Site: www.PR.com

Tech startup cuts costs and simplifies global freight shipping for importers and exporters.

Los Angeles, CA, February 20, 2017 --( PR.com )-- Shippabo, a leading cloud-based international supply chain platform for importers, announced today it has secured $1.8 million in private funding from Wonder Ventures, TenOneTen Ventures, Double M Partners, Slow Ventures, angel investor Joanne Wilson and others. Wilson and Dustin Rosen of Wonder Ventures joined Shippabo’s board of directors.

“The ports of Los Angeles and Long Beach are the country’s busiest and serve as the gateways to the U.S. from Asia,” said Shippabo CEO Nina Luu. “Headquartered in Los Angeles, Shippabo is at the center of the action for shippers using these shipping lanes. It is the most promising company providing a total international supply chain management system.”

As the co-founder of IGH, Luu was an importer herself, supplying major retailers like Costco, Saks Fifth Avenue and several others. She started Shippabo out of her own need for a user-friendly, automated supply chain management system that offered better visibility to avoid costly penalties for missing deadlines.

“Small and midsize shippers are in most need of a system like ours. As their businesses grow beyond 100 containers, they struggle to keep track of shipment bookings and statuses,” Luu said. “Shippabo lets them book shipments as soon as they place their purchase orders. In solving IGH’s logistics and supply chain problems, Shippabo has helped many other businesses such as Loot Crate and Technicolor overcome the same obstacles.”

About Shippabo
Launched in 2015, Shippabo is a leading cloud-based, international supply chain platform for importers and exporters. We provide visibility and control over your purchase orders, freight shipping, customs release and final delivery. Our data-driven technology provides intelligence and optimization in your logistics and inventory planning.


Grant
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2009.1.1 | Award Amount: 8.07M | Year: 2010

The Internet has evolved from a technology-centric core network to a user- and content-centric network that must support millions of users creating and consuming content. It must accommodate new services with new requirements and cope with heterogeneous network technologies. The momentum is moving toward the end user, who is now capable of creating, storing, and delivering content and services. FIGARO proposes a Future Internet architecture that is structured around residential networks. In this architecture, home gateways have a key role as integrators of different networks and services, and as coordinators of Internet-wide distributed content management. FIGARO will: i) design a novel content management architecture that enables distributed content backup, search and access, and that also supports mobile users and wireless ad-hoc content sharing; ii) develop a network optimization framework leveraging community networks and heterogeneous networks; iii) deliver a network management architecture that includes new network monitoring and real-time troubleshooting techniques; iv) explore novel Internet-based communication and service solutions for emerging sectors, such as energy management and e-health care.

We will deliver the components of the FIGARO architecture through an experimental approach incorporating testbed prototyping of solutions. In summary, FIGARO is intended to evolve the current Internet to meet the future demands of applications, services and end-users, while preserving its current robustness and increasing its scalability and efficiency. Furthermore, the integration of new sectors into the future Internet will spur trans-sector innovation and create new businesses. The project is expected to result in technologies that will strengthen Europe's position and give a competitive advantage to European industry in the areas of Future Internet technologies and services, residential gateways and home automation.
