
In the USA, a cGMP production facility worth JPY 10 billion (USD 93 million) has been completed for FUJIFILM Diosynth Biotechnologies Texas, LLC (FDBT). The facility was funded in part by BARDA (Biomedical Advanced Research and Development Authority), a division of the U.S. Department of Health and Human Services. Fujifilm plans to invest a further JPY 3 billion (USD 28 million) to equip the facility with bioreactors for mammalian cell culture. The site will begin operations in early 2018. FUJIFILM Diosynth Biotechnologies Texas, LLC (FDBT) was acquired in 2014 by FUJIFILM Diosynth Biotechnologies U.S.A., Inc. (FDBU). The transaction was designed from the outset as a full acquisition, and in March of this year FDBT became a wholly owned subsidiary of FDBU. A further investment of JPY 1 billion (USD 9 million) will increase the capacity of the process development facilities of FUJIFILM Diosynth Biotechnologies UK Limited at Billingham, UK. This site is planned to be operational by summer 2017.

The FDBT site will become the new center of excellence for Fujifilm's Saturn Monoclonal Antibody Platform; initial cell culture capacity is 6,000 L (three 2,000 L bioreactors). The facility is designed to allow future expansion to up to 24,000 L of upstream capacity to meet customers' growing clinical and commercial demand. In the United Kingdom, the investment provides for a center of excellence for mammalian cell culture. Covering roughly 1,000 square meters, these laboratories are equipped with the latest high-throughput technology, such as fully automated bioreactors and chromatography systems, preparing the facility for the fast and efficient production of monoclonal antibodies. Fujifilm thus commands manufacturing sites, technology and expertise that lead the industry, positioning it to work more intensively in the future with its long-standing partners, the CIADM (Center for Innovation in Advanced Development and Manufacturing), BARDA and other third parties, on the development of medical countermeasures and drugs against avian influenza epidemics.

FUJIFILM Diosynth Biotechnologies is a market-leading contract development and manufacturing organization (CDMO) for biologics, with sites in Billingham, UK, RTP in North Carolina, and College Station, Texas. FUJIFILM Diosynth has 25 years of experience in the development of recombinant proteins, vaccines and monoclonal antibodies as well as other large molecules, viral products and medical countermeasures, and offers a broad range of microbial, mammalian and host/virus systems. The company provides comprehensive services, from cell line development using its proprietary pAVEway™ microbial and Apollo™ cell line systems through process development and analytical development to clinically monitored and FDA-approved production. FUJIFILM Diosynth Biotechnologies is a partnership between FUJIFILM Corporation and Mitsubishi Corporation. For more information, visit www.fujifilmdiosynth.com.


GAITHERSBURG, Md., Feb. 23, 2017 (GLOBE NEWSWIRE) -- Emergent BioSolutions Inc. (NYSE:EBS) reported financial results for the quarter and twelve months ended December 31, 2016.

(1) The presentation of Emergent’s financial performance using the “Combined Basis” method includes the impact of the operations associated with the Company’s former biosciences business, which was spun off into a separate publicly traded company, Aptevo Therapeutics Inc., on August 1, 2016. The presentation of Emergent’s financial performance using the “Continuing Operations Basis” method excludes the impact of the operations of Aptevo. (2) See “Reconciliation of Statement of Operations” for a reconciliation of the Company’s Statement of Operations for the Three and Twelve Months Ended December 31, 2016 on a continuing operations basis to that on a combined basis. (3) See “Reconciliation of Net Income to Adjusted Net Income and EBITDA” for a definition of terms and a reconciliation table.

Note: The following discussion of Emergent’s unaudited financial performance for the quarter and twelve months ended December 31, 2016 is on a Continuing Operations Basis.

Product Sales: For Q4 2016, product sales were $87.5 million, a decrease of 30% as compared to Q4 2015. The decrease is principally attributable to lower BioThrax deliveries under the Company’s new contract with the CDC, signed in December 2016.

Contract Manufacturing: For Q4 2016, revenue from the Company’s contract manufacturing operations was $16.7 million, an increase of 59% as compared to Q4 2015. The increase primarily reflects an increase in fill/finish services at the Company’s Camden facility in Baltimore.

Contracts and Grants: For Q4 2016, contracts and grants revenue was $47.5 million, an increase of 91% as compared to Q4 2015. The increase primarily reflects an increase in development funding for the Company’s Bayview facility in Baltimore, designated as a Center for Innovation in Advanced Development and Manufacturing (CIADM), and plasma collection for the Company’s VIGIV® [Vaccinia Immune Globulin Intravenous (Human)] program.

Cost of Product Sales and Contract Manufacturing: For Q4 2016, cost of product sales and contract manufacturing was $38.3 million, an increase of 11% as compared to Q4 2015. The increase reflects an increase in the BioThrax cost per dose sold associated with lower production yield in the period in which the doses sold were produced, along with increased costs associated with the increase in Other product sales volume, partially offset by a decrease in BioThrax sales to the CDC.

Research and Development: For Q4 2016, gross research and development (R&D) expenses were $27.1 million, an increase of 7% as compared to Q4 2015. For Q4 2016, net R&D was fully funded, resulting in a net contribution from funded development programs of $20.4 million, as compared to a net expense of $0.5 million in Q4 2015. Net R&D, which is more representative of the Company’s actual out-of-pocket investment in product development, is calculated as gross research and development expenses less contracts and grants revenue.

Selling, General and Administrative: For Q4 2016, selling, general and administrative expenses were $35.4 million, an increase of 1% as compared to Q4 2015.

Net Income: For Q4 2016, net income was $32.3 million, or $0.67 per diluted share, versus $42.5 million, or $0.90 per diluted share, in Q4 2015. For Q4 2016 and 2015, net income per diluted share is computed using the “if-converted” method.
This method requires net income to be adjusted to add back interest expense and amortization of debt issuance costs, both net of tax, associated with the Company’s 2.875% Convertible Senior Notes due 2021. As a result, net income for Q4 2016 is adjusted in the amount of $1.1 million, from $32.3 million to $33.4 million, and diluted shares outstanding were 49.6 million. Net income for Q4 2015 is adjusted in the amount of $0.9 million, from $42.5 million to $43.4 million, and diluted shares outstanding were 48.1 million.

Product Sales: For the twelve months of 2016, product sales were $296.3 million, a decrease of 10% as compared to 2015. The decrease is principally attributable to a 19% reduction in BioThrax sales, including reduced deliveries in Q4 2016 related to the timing of signing the Company’s follow-on contract with the CDC in December 2016.

Contract Manufacturing: For the twelve months of 2016, revenue from contract manufacturing operations was $49.1 million, an increase of 14% as compared to 2015. The increase reflects an increase in fill/finish services at the Company’s Camden facility and an increase in bulk manufacturing services at the Company’s facility in Winnipeg, partially offset by a decrease in contract manufacturing revenue related to the production of an MVA Ebola vaccine candidate in 2015.

Contracts and Grants: For the twelve months of 2016, contracts and grants revenue was $143.4 million, an increase of 22% as compared to 2015. The increase reflects an increase in development funding for the Company’s CIADM program, the VIGIV program related to plasma collection, and the NuThrax program related to preparations for a Phase III clinical trial. These increases were offset by lower development funding for the Company’s Anthrasil™ [Anthrax Immune Globulin Intravenous (Human)] program related to the timing of plasma collection, the PreviThrax™ (recombinant protective antigen anthrax vaccine, purified) candidate related to reduced U.S. government interest in funding such a program, and Building 55 related to FDA licensure of the facility in August 2016.

Cost of Product Sales and Contract Manufacturing: For the twelve months of 2016, cost of product sales and contract manufacturing was $131.3 million, an increase of 22% as compared to 2015. The increase primarily reflects an increase in the BioThrax cost per dose sold associated with lower production yield, along with increased costs associated with the increase in Other product sales volume, partially offset by a decrease in BioThrax sales to the SNS.

Research and Development: For the twelve months of 2016, gross R&D expenses were $108.3 million, a decrease of 9% as compared to 2015. The decrease primarily reflects lower contract service costs. For the twelve months of 2016, net R&D was fully funded, resulting in a net contribution from funded development programs of $35.1 million, as compared to a net expense of $1.8 million in 2015.

Selling, General and Administrative: For the twelve months of 2016, selling, general and administrative expenses were $143.7 million, an increase of 19% as compared to 2015. This increase includes costs associated with restructuring activities at the Company’s Lansing, Michigan site, along with increased professional services to support the Company’s strategic growth initiatives and increased information technology investments.
Net Income: For the twelve months of 2016, net income was $62.5 million, or $1.35 per diluted share, versus $107.6 million, or $2.36 per diluted share, in 2015. Pursuant to the “if-converted” method, net income for the twelve months of 2016 is adjusted in the amount of $4.0 million, from $62.5 million to $66.5 million, and diluted shares outstanding were 49.1 million. Net income from continuing operations for the twelve months of 2015 is adjusted in the amount of $3.9 million, from $107.6 million to $111.5 million, and diluted shares outstanding were 47.3 million. (3) See “Reconciliation of Net Income to Adjusted Net Income and EBITDA” for a definition of terms and a reconciliation table.

The Company is targeting the following 2020 financial and operational goals:

RECONCILIATION OF NET INCOME TO ADJUSTED NET INCOME AND EBITDA

This press release contains two financial measures, Adjusted Net Income and EBITDA (Earnings Before Interest, Taxes, Depreciation and Amortization), that are considered “non-GAAP” financial measures under applicable Securities and Exchange Commission rules and regulations. These non-GAAP financial measures should be considered supplemental to, and not a substitute for, financial information prepared in accordance with generally accepted accounting principles. The Company’s definition of these non-GAAP measures may differ from similarly titled measures used by others. Adjusted Net Income adjusts for specified items that can be highly variable or difficult to predict, or that reflect the non-cash impact of charges resulting from purchase accounting. EBITDA reflects net income excluding the impact of depreciation, amortization, interest expense and provision for income taxes. The Company views these non-GAAP financial measures as a means to facilitate management’s financial and operational decision-making, including evaluation of the Company’s historical operating results and comparison to competitors’ operating results. These non-GAAP financial measures reflect an additional way of viewing aspects of the Company’s operations that, when viewed with GAAP results and the reconciliations to the corresponding GAAP financial measures, may provide a more complete understanding of factors and trends affecting the Company’s business. The determination of the amounts that are excluded from these non-GAAP financial measures is a matter of management judgment and depends upon, among other factors, the nature of the underlying expense or income amounts. Because non-GAAP financial measures exclude the effect of items that will increase or decrease the Company’s reported results of operations, management strongly encourages investors to review the Company’s consolidated financial statements and publicly filed reports in their entirety.

The following table provides a reconciliation of the Company’s Statement of Operations for the Twelve Months Ended December 31, 2016 on a continuing operations basis to that on a combined basis, which takes into account the impact of the Aptevo-related discontinued operations.

Company management will host a conference call at 5:00 pm (Eastern Time) today, February 23, 2017, to discuss these financial results. This conference call can be accessed live by telephone or through Emergent’s website. A replay of the call can be accessed on Emergent’s website www.emergentbiosolutions.com under “Investors.”

Emergent BioSolutions Inc.
is a global life sciences company seeking to protect and enhance life by focusing on providing specialty products for civilian and military populations that address accidental, intentional and naturally emerging public health threats. Through our work, we envision protecting and enhancing 50 million lives with our products by 2025. Additional information about the company may be found at emergentbiosolutions.com. Follow us @emergentbiosolu. This press release includes forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Any statements, other than statements of historical fact, including, without limitation, our financial guidance, and any other statements containing the words "believes," "expects," "anticipates," "intends," "plans," "targets," "forecasts," "estimates" and similar expressions in conjunction with, among other things, obtaining a BioThrax procurement contract from BARDA under the Sole Source Notification, discussions of the Company's outlook, financial performance or financial condition, growth strategy, product sales, government development or procurement contracts or awards, government appropriations, manufacturing capabilities, product development, Emergency Use Authorization or other regulatory approvals or expenditures and plans to increase our operational efficiencies and cost structure are forward-looking statements. These forward-looking statements are based on our current intentions, beliefs and expectations regarding future events. We cannot guarantee that any forward-looking statement will be accurate. Investors should realize that if underlying assumptions prove inaccurate or unknown risks or uncertainties materialize, actual results could differ materially from our expectations. Investors are, therefore, cautioned not to place undue reliance on any forward-looking statement. Any forward-looking statement speaks only as of the date of this press release, and, except as required by law, we do not undertake to update any forward-looking statement to reflect new information, events or circumstances. There are a number of important factors that could cause the Company's actual results to differ materially from those indicated by such forward-looking statements, including our ability to obtain a BioThrax procurement contract from BARDA under the Sole Source Notification; the availability of funding and the exercise of options under our BioThrax and NuThrax contracts; appropriations for the procurement of BioThrax and NuThrax; our ability to secure EUA pre-authorization approval and licensure of NuThrax from the U.S. Food and Drug Administration within the anticipated timeframe, if at all; our ability to achieve our planned operational efficiencies and targeted levels of cost savings; availability of funding for our U.S. 
government grants and contracts; whether the operational, marketing and strategic benefits of the spin-off of our biosciences business can be achieved and the timing of any such benefits; our ability to identify and acquire or in-license products or late-stage product candidates that satisfy our selection criteria; whether anticipated synergies and benefits from an acquisition or in-license are realized within expected time periods, if at all; our ability to utilize our manufacturing facilities and expand our capabilities; our ability and the ability of our contractors and suppliers to maintain compliance with current good manufacturing practices and other regulatory obligations; the results of regulatory inspections; the outcome of the class action lawsuit filed against us and possible other future material legal proceedings; our ability to meet operating and financial restrictions placed on us and our subsidiaries that are contained in our senior credit facility; the rate and degree of market acceptance and clinical utility of our products; the success of our ongoing and planned development programs; the timing of and our ability to obtain and maintain regulatory approvals for our product candidates; and our commercialization, marketing and manufacturing capabilities and strategy. The foregoing sets forth many, but not all, of the factors that could cause actual results to differ from our expectations in any forward-looking statement. Investors should consider this cautionary statement, as well as the risk factors identified in our periodic reports filed with the Securities and Exchange Commission, when evaluating our forward-looking statements. (1) See “Net Income from Continuing Operations” for explanation of adjustments to denominator for per diluted share calculation.
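For readers who want to trace the share-count math, the figures in the release can be reproduced directly. Below is a minimal Python sketch (the helper functions are ours, purely illustrative, not Emergent’s model; the inputs are the numbers quoted above) that sanity-checks the “if-converted” per-share results and the net R&D calculation:

```python
# Sanity-check of the "if-converted" diluted EPS and net R&D figures
# quoted in the release above. All inputs are the reported numbers
# (in millions); the functions are illustrative only.

def if_converted_eps(net_income_m, add_back_m, diluted_shares_m):
    """Diluted EPS under the if-converted method: add back interest
    expense and debt-issuance amortization (net of tax) tied to the
    convertible notes, then divide by diluted shares outstanding."""
    return (net_income_m + add_back_m) / diluted_shares_m

def net_rd_m(gross_rd_m, contracts_grants_m):
    """Net R&D = gross R&D expense less contracts and grants revenue.
    A negative value is a net *contribution* from funded programs."""
    return gross_rd_m - contracts_grants_m

# Q4 2016: ($32.3M + $1.1M) / 49.6M shares
print(f"{if_converted_eps(32.3, 1.1, 49.6):.2f}")  # 0.67
# Q4 2015: ($42.5M + $0.9M) / 48.1M shares
print(f"{if_converted_eps(42.5, 0.9, 48.1):.2f}")  # 0.90
# FY 2016: ($62.5M + $4.0M) / 49.1M shares
print(f"{if_converted_eps(62.5, 4.0, 49.1):.2f}")  # 1.35
# Q4 2016 net R&D: $27.1M gross less $47.5M contracts/grants revenue
print(f"{net_rd_m(27.1, 47.5):.1f}")  # -20.4, i.e. $20.4M contribution
# FY 2016 net R&D: $108.3M gross less $143.4M contracts/grants revenue
print(f"{net_rd_m(108.3, 143.4):.1f}")  # -35.1, i.e. $35.1M contribution
```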


COMPEX Pyrolysis System Named Best Innovation in Power Industry in Moscow, Russia

Moscow, Russia, December 22, 2016 --( PR.com )-- The COMPEX high-temperature pyrolysis system for power generation received the "Golden Lightning" prize in the International Award "Distributed Power Generation – Great Achievements." The Award, devoted to advanced developments in the sphere of distributed generation, took place at the beginning of December in Moscow. Leading experts from Russia's business community, representatives of federal and regional authorities, and scientists assessed the developments submitted for the competition. The COMPEX system was recognized as the best development in the category "Advanced Development in the Sphere of Power Industry," one of four categories. In total, more than 50 developments from companies in Russia, Kazakhstan, Lithuania, Belarus and Germany participated in the competition.

The board of the Award praised the innovation of the COMPEX pyrolysis system as well as its economic efficiency. It generates electrical and thermal energy by recovering synthesis fuel from virtually any hydrocarbon feedstock. The solution is a single complex for feedstock conditioning, pyrolysis gas generation, combustion in a gas turbine, and electricity and heat generation. The system is based on a patented fast high-temperature pyrolysis technology that ensures efficient processing of a wide range of feedstocks, including wood, peat, shale, coal, manure, agricultural waste and other biomass. The generated fuel can be used in gas turbine units without any additional treatment.

Besides pyrolysis gas, COMPEX systems provide the following energy resources: synthetic crude oil and high-carbon material. By using what is essentially waste material, an end user can obtain electricity two to three times cheaper than utility rates while simultaneously eliminating the problem of organic waste disposal. In some cases it is possible to establish a waste-free cycle. For instance, a timber processing facility in Serbia converts wood scrap into pyrolysis gas, which fuels a microturbine unit; high-carbon material, a by-product of pyrolysis, is sold to other companies as fuel, and the synthetic crude oil is used as fuel in the facility's own processes.

The COMPEX solution has aroused great interest among many Russian and foreign companies. Its implementation improves the environmental sustainability and energy efficiency of industrial and agricultural enterprises and reduces their energy costs.


News Article | December 6, 2016
Site: www.prweb.com

Uptime Institute today announced new career advancement opportunities specifically for data center professionals, aligned to the globally recognized Tier Classification System. The new curricula, led by senior faculty, result in professional designations respected throughout the industry. The additions include the Accredited Operations Specialist (AOS), an educational course curriculum designed to impart a complete understanding of the concepts and criteria needed to develop a comprehensive, world-class Management and Operations program for a critical facility. The course expansion extends to the well-established Accredited Tier Specialist (ATS) and Accredited Tier Designer (ATD) tracks, adding new Professional and Expert designations to current course offerings.

“Working in critical facilities requires constant vigilance and attention to detail. Our customers have told us that they want to continue to learn and grow; they understand that this is a field of constant innovation and development, and we are excited to provide new paths to career advancement based on our Tier Classification System,” said Lee Kirby, President of Uptime Institute. “Adding the Accredited Operations Specialist (AOS) designation is the next step in the growth of our fast-growing Management & Operations Stamp of Approval program, and a further extension of the Operational Sustainability offering, which teaches the processes and knowledge required to implement industry-recognized behaviors that drive critical facility excellence.”

The Uptime Institute curriculum provides courses for a variety of job functions. The Accredited Tier Designer (ATD) series of courses is a technically focused curriculum for individuals responsible for data center design. The Accredited Tier Specialist (ATS) provides instruction for individuals responsible for data center uptime. Along with the new AOS course offerings, accreditation has been extended beyond the specialist designations to include Professional and Expert, providing a vehicle for longer-term career advancement. Information on how to attain the Professional and Expert designations may be found on the Uptime Institute website.

New Advanced Development Track: Uptime Institute is also adding new Advanced Seminars to further enhance the abilities of data center professionals to achieve further credentials and explore new educational content. These Advanced Seminars can be applied toward the Professional and Expert requirements, with the first offerings scheduled for early 2017. All Uptime Institute one-day, in-person seminars provide focused, in-depth reviews of topics that are key to growth and development, taught by senior Uptime Institute staff.

About Uptime Institute: Uptime Institute is an unbiased advisory organization focused on improving the performance, efficiency, and reliability of business critical infrastructure through innovation, collaboration, and independent certifications. Uptime Institute serves all stakeholders responsible for IT service availability through industry-leading standards, education, peer-to-peer networking, consulting, and award programs delivered to enterprise organizations and third-party operators, manufacturers, and providers. Uptime Institute is recognized globally for the creation and administration of the Tier Standards & Certifications for Data Center Design, Construction, and Operational Sustainability, along with its Management & Operations reviews, FORCSS® methodology, and Efficient IT Stamp of Approval.
Uptime Institute – The Global Data Center Authority®, a division of The 451 Group, has offices in the U.S., Mexico, Costa Rica, Brazil, U.K., Spain, U.A.E., Russia, Taiwan, Singapore, and Malaysia.


SAN JOSE, Calif.--(BUSINESS WIRE)--Spirent Communications plc (LSE: SPT), an industry leader in test and measurement, today announced the world’s first 200G Ethernet test system. This additional Ethernet speed is enabled by new 50G electrical interfaces, which allow the deployment of cost-effective 50G links from high-performance servers to a new generation of switches and routers. Within these devices, 50G interfaces can easily be combined to build higher-speed connections to other systems within the data center or out to the cloud. According to the IEEE, 200G will be the next area for explosive growth in the Ethernet ecosystem due to the insatiable demand for more bandwidth in mobile devices, high-speed data center servers, internet-enabled entertainment (especially video), cloud computing and social media.

“The 200G Ethernet speed heavily leverages 100G 4x25G NRZ technology now shipping in high volume. Its 4x50G PAM4 architecture re-uses 100G optical packaging and lasers. This enables near-term cost-effective volume deployment,” stated Chris Cole, Vice President of Advanced Development at Finisar.

“Traditionally, combining four electrical interface lanes has been the sweet spot, both technically and economically, for Ethernet technologies,” stated Neil Holmquist, VP, Marketing Cloud and IP at Spirent. “So just as the use of four 25G lanes has been embraced for 100GbE, it is our belief that the use of four 50G lanes will be widely adopted for 200GbE.”

This new generation of switches and routers, based on 50G electrical interfaces, will present additional challenges for equipment vendors and those who deploy their equipment. Validation will start with ensuring the integrity and stability of the physical link as traffic scales up to 200Gbps, and will then require the validation of a new set of features, functions, performance and scale.

“Spirent’s 200G test capability proves that Spirent is the partner for companies racing to bring their leading-edge network technologies to market first,” said Abhitesh Kastuar, General Manager of Spirent’s Cloud and IP Business Unit. “We have a long history of helping our customers get to market faster with the development of generation after generation of Spirent Ethernet testing technologies, and we continue to do so with this first-to-market 200G release, as we did with our award-winning 400G solution released back in 2015.”

This new 200G release is compatible with Spirent’s existing chassis and its other Ethernet products. To learn more about Spirent’s 200G test system visit: https://www.spirent.com/Products/TestCenter/Boost/200G

Spirent Communications plc (LSE: SPT) is the leading provider of verification, assessment, analytics, and device intelligence solutions. We enable those who deliver networks, connected devices, and communication services to provide a superior user experience. From service provider networks and enterprise data centers to mobile communications and connected vehicles, Spirent works with leading innovators to help the world communicate and collaborate faster, better, and more securely. For more information visit: http://www.spirent.com/About-Us/News_Room
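As a quick aside on the lane arithmetic quoted above: aggregate Ethernet rates come from multiplying identical electrical lanes, with PAM4 signaling carrying two bits per symbol where NRZ carries one (which is roughly how a 25 GBd-class lane is pushed from 25G to 50G). A small illustrative Python sketch; the helper function is hypothetical, not part of any Spirent product or API:

```python
# Illustrative Ethernet lane arithmetic for the generations discussed
# above. NRZ = 1 bit/symbol, PAM4 = 2 bits/symbol at a similar baud rate.

def aggregate_rate_gbps(lanes: int, rate_per_lane_gbps: float) -> float:
    """Total interface rate from identical electrical lanes."""
    return lanes * rate_per_lane_gbps

print(aggregate_rate_gbps(4, 25))  # 100 -> 100GbE from 4x25G NRZ lanes
print(aggregate_rate_gbps(4, 50))  # 200 -> 200GbE from 4x50G PAM4 lanes
```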




News Article | February 1, 2016
Site: www.scientificcomputing.com

In 2015, starfish, octopus, crabs and other Pacific Ocean life stumbled upon a temporary addition to the seafloor, more than half a mile from the shoreline: a 38,000-pound container. But in the ocean, 10 feet by seven feet is quite small. The shrimp exploring the seafloor made more noise than the datacenter inside the container, which packed computing power equivalent to 300 desktop PCs.

But the knowledge gained from the three months this vessel was underwater could help make future datacenters more sustainable, while at the same time speeding data transmission and cloud deployment. And yes, maybe even someday, datacenters could become commonplace in seas around the world.

The technology to put sealed vessels underwater with computers inside isn’t new. In fact, it was one Microsoft employee’s experience serving on submarines that carry sophisticated equipment that got the ball rolling on this project. But Microsoft researchers do believe this is the first time a datacenter has been deployed below the ocean’s surface. Going under water could solve several problems by introducing a new power source, greatly reducing cooling costs, closing the distance to connected populations and making it easier and faster to set up datacenters.

A little background gives context for what led to the creation of the vessel. Datacenters are the backbone of cloud computing, and contain groups of networked computers that require a lot of power for all kinds of tasks: storing, processing and/or distributing massive amounts of information. The electricity that powers datacenters can be generated from renewable power sources such as wind and solar, or, in this case, perhaps wave or tidal power. When datacenters are closer to where people live and work, there is less “latency,” which means that downloads, Web browsing and games are all faster. With more and more organizations relying on the cloud, the demand for datacenters is higher than ever, as is the cost to build and maintain them. All this combines to form the type of challenge that appeals to Microsoft Research teams who are experts at exploring out-of-the-box solutions.

Ben Cutler, the project manager who led the team behind this experiment, dubbed Project Natick, is part of a group within Microsoft Research that focuses on special projects. “We take a big whack at big problems, on a short-term basis. We take a look at something from a new angle, a different perspective, with a willingness to challenge conventional wisdom.”

So, when a paper about putting datacenters in the water landed in front of Norm Whitaker, who heads special projects for Microsoft Research NExT, it caught his eye. “We’re a small group, and we look at moonshot projects,” Whitaker says. The paper came out of ThinkWeek, an event that encourages employees to share ideas that could be transformative to the company. “As we started exploring the space, it started to make more and more sense. We had a mind-bending challenge, but also a chance to push boundaries.”

One of the paper’s authors, Sean James, had served in the Navy for three years on submarines. “I had no idea how receptive people would be to the idea. It’s blown me away,” says James, who has worked on Microsoft datacenters for the past 15 years, from cabling and racking servers to his current role as senior research program manager for the Datacenter Advanced Development team within Microsoft Cloud Infrastructure & Operations.
“What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water. It goes through a very rigorous testing and design process. So, I knew there was a way to do that.” James recalled the century-old history of cables in oceans, evolving to today’s fiber optics found all over the world. “When I see all of that, I see a real opportunity that this could work,” James says. “In my experience, the trick to innovating is not coming up with something brand new, but connecting things we’ve never connected before, pairing different technology together.”

Building on James’s original idea, Whitaker and Cutler went about connecting the dots. Cutler’s small team applied science and engineering to the concept. A big challenge involved people. People keep datacenters running. But people take up space. They need oxygen, a comfortable environment and light. They need to go home at the end of the day. When they’re involved you have to think about things like landscaping and security. So, the team moved to the idea of a “lights out” situation: a very simple place to house the datacenter, very compact and completely self-sustaining. And again, drawing from the submarine example, they chose a round container. “Nature attacks edges and sharp angles, and it’s the best shape for resisting pressure,” Cutler says.

That set the team down the path of trying to figure out how to make a datacenter that didn’t need constant, hands-on supervision. This initial test vessel wouldn’t be too far offshore, so they could hook into an existing electrical grid, but being in the water raised an entirely new possibility: using the hydrokinetic energy from waves or tides for computing power. This could make datacenters work independently of existing energy sources, located closer to coastal cities, powered by renewable ocean energy.

That’s one of the big advantages of the underwater datacenter scheme: reducing latency by closing the distance to populations and thereby speeding data transmission. Half of the world’s population, Cutler says, lives within 120 miles of the sea, which makes it an appealing option.
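To put that latency argument into rough numbers: light in optical fiber propagates at about two-thirds of its vacuum speed, so round-trip delay falls linearly with distance. A back-of-the-envelope Python sketch, using textbook approximations rather than Project Natick measurements:

```python
# Back-of-the-envelope fiber latency: light in fiber propagates at
# roughly 2/3 c, i.e. ~200,000 km/s. Real paths add routing, queuing
# and serialization delay, so these figures are lower bounds.

FIBER_KM_PER_MS = 200.0  # ~200,000 km/s = 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Idealized round-trip propagation delay over a fiber path."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(193))   # ~1.9 ms  (120 miles ~ 193 km, per the article)
print(round_trip_ms(1930))  # ~19 ms   (ten times farther inland)
```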
This project also shows it’s possible to deploy datacenters faster, turning it from a construction project, which requires permits and other time-consuming steps, into a manufacturing one. Building the vessel that housed the experimental datacenter took only 90 days. While every datacenter on land is different and needs to be tailored to varying environments and terrains, these underwater containers could be mass-produced for very similar conditions underwater, where it is consistently colder the deeper you go.

Cooling is an important aspect of datacenters, which normally run up substantial costs operating chiller plants and the like to keep the computers inside from overheating. The cold environment of the deep seas automatically makes datacenters less costly and more energy efficient.

Once the vessel was submerged last August, the researchers monitored the container from their offices in Building 99 on Microsoft’s Redmond campus. Using cameras and other sensors, they recorded data like temperature, humidity, the amount of power being used for the system, even the speed of the current. “The bottom line is that in one day this thing was deployed, hooked up and running. Then everyone is back here, controlling it remotely,” Whitaker says. “A wild ocean adventure turned out to be a regular day at the office.”

A diver would go down once a month to check on the vessel, but otherwise the team was able to stay constantly connected to it remotely, even after they observed a small tsunami wave pass. The team is still analyzing data from the experiment, but so far, the results are promising. “This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,” says Whitaker. “There are lots of moving parts, lots of planning that goes into this. This is more a tool that we can make available to datacenter partners. In a difficult situation, they could turn to this and use it.”

Christian Belady, general manager for datacenter strategy, planning and development at Microsoft, shares the notion that this kind of project is valuable for the research gained during the experiment. It will yield results, even if underwater datacenters don’t start rolling off assembly lines anytime soon. “While at first I was skeptical, with a lot of questions. What was the cost? How do we power? How do we connect? However, at the end of the day, I enjoy seeing people push limits,” Belady says. “The reality is that we always need to be pushing limits and trying things out. The learnings we get from this are invaluable and will in some way manifest into future designs.”

Belady, who came to Microsoft from HP in 2007, is always focused on driving efficiency in datacenters; it’s a deep passion for him. It takes a couple of years to develop a datacenter, but it’s a business that changes hourly, he says, with demands that change daily. “You have to predict two years in advance what’s going to happen in the business,” he says. Belady’s team has succeeded in making datacenters more efficient than they’ve ever been. He founded an industry metric, power usage effectiveness (PUE), and in that regard, Microsoft is leading the industry. Datacenters are also using next-generation fuel cells (something James helped develop) and wind power projects like Keechi in Texas to improve sustainability through alternative power sources. Datacenters have also evolved to save energy by using outside air instead of refrigeration systems to control temperatures inside. Water consumption has also gone down over the years.
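For context on the metric Belady founded: power usage effectiveness is conventionally defined as total facility energy divided by the energy that actually reaches the IT equipment, with 1.0 as the theoretical ideal. A minimal Python sketch, with sample numbers invented purely for illustration:

```python
# Power usage effectiveness (PUE): total facility energy / IT energy.
# A PUE of 1.0 would mean every watt goes to computing; cooling and
# power-distribution overhead push real facilities above that.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Ratio of total facility power draw to IT equipment power draw."""
    return total_facility_kw / it_equipment_kw

# Invented example: 1,200 kW facility draw, 1,000 kW reaching IT gear.
print(round(pue(1200, 1000), 2))  # 1.2
```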
Belady, who says he “loved” this project, says he can see its potential as a solution for latency and quick deployments. “But what was really interesting to me, what really surprised me, was to see how animal life was starting to inhabit the system,” Belady says. “No one really thought about that.” Whitaker found it “really edifying” to see the sea life crawling on the vessel, and how quickly it became part of the environment. “You think it might disrupt the ecosystem, but really, it’s just a tiny drop in an ocean of activity,” he says.

The team is currently planning the project’s next phase, which could include a vessel four times the size of the current container with as much as 20 times the compute power. The team is also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source. Meanwhile, the initial vessel is now back on land, sitting in the lot of one of Microsoft’s buildings. But it’s the gift that keeps giving.

“We’re learning how to reconfigure firmware and drivers for disk drives, to get longer life out of them. We’re managing power, learning more about using less. These lessons will translate to better ways to operate our datacenters. Even if we never do this on a bigger scale, we’re learning so many lessons,” says Peter Lee, corporate vice president of Microsoft Research NExT. “One of the things that’s so fun about a CEO like Satya Nadella is that he’s hard-nosed business savvy, customer obsessed, but the other half of his brain is a dreamer who loves moonshots. When I see something like Natick, you could say it’s a moonshot, but not one completely divorced from Microsoft’s core business. I’m really tickled by it. It really perfectly fits the left brain/right brain combination we have right now in the company.”


News Article | November 10, 2016
Site: www.prweb.com

One in a thousand children suffers deafness or hearing loss, and hearing is the most common sense to be affected by congenital disease. Deafness at birth is often caused by mutations in a specific gene known as Gap Junction Beta 2 (GJB2), which codes for the protein connexin 26. In some populations, mutations of this gene are responsible for as many as half the instances of congenital hearing loss. Now, Kazusaku Kamiya and the co-authors of his recent report demonstrate a means of producing supplies of cochlear gap junction cells on demand for use in therapeutic studies.

“Human cochlear cells are not readily accessible for biopsy or direct drug administration because of anatomical limitations,” state the researchers in their report. “Therefore, ES/iPS [embryonic stem/induced pluripotent stem] cells are an important tool for studying the molecular mechanisms underlying inner-ear pathology as well as for generating cells for replacement therapies.”

To culture the cells, the researchers followed a standard protocol for the first seven days, at which point specific proteins were added to increase mRNA expression of connexins. On days 7-11, the cells were transferred to a flat 2D culture with inner-ear cells that are especially resistant to enzymes that break down proteins. The researchers successfully cultured induced pluripotent stem cells that differentiated into gap junction plaque cells expressing connexin 26, and they were able to demonstrate that these stem-cell-derived gap junction cells were functionally and structurally characteristic of developing cochlear cells. Importantly, the cells differentiated from mice that were deficient in connexin 26 reproduced the cellular characteristics of congenital hearing loss. The researchers conclude, “It is expected, then, that these iPS derived cells, which can be obtained from patients, will be particularly useful for drug screening and inner-ear cell therapies targeting GJB2-related hearing loss.”

Stem cells are a type of cell that can change into another, more specialised type of cell through a process described as differentiation. They occur in embryos (embryonic stem cells) and in adults as repair cells. Embryonic stem cells can differentiate into several different types of specialised cells to form the range of cells needed in the human body. The ability to differentiate into several different types of cell is described as pluripotency; it can also be induced in adult cells by reprogramming non-reproductive (somatic) cells to produce “induced pluripotent stem cells.”

The ear comprises three main parts: outer, middle and inner. The ear canal in the outer ear channels sound vibrations to the ear drum in the middle ear. The middle ear contains three bones, or ossicles, that transfer the vibrations of the ear drum to the cochlea, a fluid-filled spiral cavity in the inner ear. The movement of the fluid in the cochlea in response to these vibrations is detected by thousands of hair cells that convert this motion into electrical signals, which are then communicated by nerve cells to the brain, which senses them as sound. Connexins 26 and 30 form gap junctions that facilitate the movement of ions needed to maintain a balance in conditions (homeostasis) as well as developmental organization in the cochlea. The researchers were able to demonstrate that their stem-cell-derived gap junction cells were functional for forming the gap junction intercellular communication networks typical of the developing cochlea.
Cells differentiated from connexin-26-deficient mice demonstrated a disruption in the formation of gap junction plaques.

Reference: Ichiro Fukunaga, Ayumi Fujimoto, Kaori Hatakeyama, Toru Aoki, Atena Nishikawa, Tetsuo Noda, Osamu Minowa, Nagomi Kurebayashi, Katsuhisa Ikeda, Kazusaku Kamiya, “In vitro models of GJB2-related hearing loss recapitulate Ca2+ transients via a gap junction characteristic of developing cochlea,” Stem Cell Reports, published online 11 Nov. 2016.

Author affiliations:
1. Department of Otorhinolaryngology, Juntendo University Faculty of Medicine, Hongo 2-1-1, Bunkyo-ku, Tokyo 113-8421, Japan
2. Research Institute for Diseases of Old Age, Juntendo University Graduate School of Medicine, Hongo 2-1-1, Bunkyo-ku, Tokyo 113-8421, Japan
3. Department of Cell Biology, Japanese Foundation for Cancer Research, Cancer Institute, Tokyo 135-8550, Japan
4. Team for Advanced Development and Evaluation of Human Disease Models, RIKEN BioResource Center, Tsukuba 305-0074, Japan
5. Department of Cellular and Molecular Pharmacology, Juntendo University Graduate School of Medicine, Hongo 2-1-1, Bunkyo-ku, Tokyo 113-8421, Japan

The mission of Juntendo University is to strive for advances in society through education, research, and healthcare, guided by the motto “Jin – I exist as you exist” and the principle of “Fudan Zenshin – Continuously Moving Forward”. The spirit of “Jin”, the ideal of all those who gather at Juntendo University, entails being kind and considerate of others. The principle of “Fudan Zenshin” conveys the belief of the founders that education and research activities will only flourish in an environment of free competition. Our academic environment enables us to educate outstanding students to become healthcare professionals patients can believe in, scientists capable of innovative discoveries and inventions, and global citizens ready to serve society.

About Juntendo: Juntendo was originally founded in 1838 as a Dutch school of medicine, at a time when Western medical education was not yet embedded as a normal part of Japanese society. With the creation of Juntendo, the founders hoped to create a place where people could come together with the shared goal of helping society through the power of medical education and practice. Their aspirations led to the establishment of Juntendo Hospital, the first private hospital in Japan. Through the years, the institution’s experience and perspective as an institution of higher education and a place of clinical practice have enabled Juntendo University to play an integral role in the shaping of Japanese medical education and practice. Along the way the focus of the institution has also expanded: now consisting of four undergraduate programs and three graduate programs, the university specializes in the fields of health and sports science and nursing health care and sciences, as well as medicine. Today, Juntendo University continues to pursue innovative approaches to international-level education and research, with the goal of applying the results to society.


GAITHERSBURG, Md., Feb. 13, 2017 (GLOBE NEWSWIRE) -- Emergent BioSolutions Inc. (NYSE:EBS) today announced that it has received a task order from the Biomedical Advanced Research and Development Authority (BARDA) valued at up to $30.5 million to develop monoclonal antibody therapeutics for viral hemorrhagic fever. This task order will utilize the company's Center for Innovation in Advanced Development and Manufacturing (CIADM) facility located in Baltimore, Maryland. Using monoclonal antibodies from Mapp Biopharmaceutical Inc., the company will conduct technology transfer of process materials and information, perform process and analytical method development, execute small-scale production runs, and perform cGMP cell banking leading to cGMP manufacture of bulk drug substance. The task order consists of a 36-month period of performance with a base task order valued at $7.4 million and options that, if executed, will bring the total task order value over three years to up to $30.5 million.

"Protecting and enhancing life is at the core of Emergent's mission, and one of the ways by which we are able to fulfill this mission is in partnership with BARDA as it addresses emerging public health threats," said Adam Havey, executive vice president and president, biodefense division of Emergent BioSolutions. "Emergent's CIADM is a great example of a public-private partnership that works towards a shared goal. We look forward to successfully executing this task order in collaboration with all parties involved."

This is the fourth BARDA task order awarded to Emergent under the CIADM program. Since its inception in 2012, Emergent's CIADM facility has been utilized to respond to public health emergencies including Ebola and Zika. This task order (number HHSO10033004T, under contract HHSO100201200004I) is funded by BARDA, within the Office of the Assistant Secretary for Preparedness and Response at HHS.

About Emergent BioSolutions
Emergent BioSolutions Inc. is a global life sciences company seeking to protect and enhance life by focusing on providing specialty products for civilian and military populations that address accidental, intentional, and naturally emerging public health threats. Through our work, we envision protecting and enhancing 50 million lives with our products by 2025. Additional information about the company may be found at emergentbiosolutions.com. Follow us @emergentbiosolu.

Safe Harbor Statement
This press release includes forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Any statements, other than statements of historical fact, including statements regarding the exercise of options under the task order, are forward-looking statements. These forward-looking statements are based on our current intentions, beliefs and expectations regarding future events. We cannot guarantee that any forward-looking statement will be accurate. Investors should realize that if underlying assumptions prove inaccurate or unknown risks or uncertainties materialize, actual results could differ materially from our expectations. Investors are, therefore, cautioned not to place undue reliance on any forward-looking statement. Any forward-looking statement speaks only as of the date of this press release, and, except as required by law, we do not undertake to update any forward-looking statement to reflect new information, events or circumstances.
There are a number of important factors that could cause the company's actual results to differ materially from those indicated by such forward-looking statements, including the availability of funding and BARDA’s exercise of options under the task order; the success of the planned development programs; the timing of and ability to obtain and maintain regulatory approvals for the product candidates; and commercialization, marketing and manufacturing capabilities. The foregoing sets forth many, but not all, of the factors that could cause actual results to differ from our expectations in any forward-looking statement. Investors should consider this cautionary statement, as well as the risk factors identified in our periodic reports filed with the SEC, when evaluating our forward-looking statements.


News Article | November 30, 2016
Site: www.prweb.com

Screenplay Unlimited has published Screenwriting Unchained, Emmanuel Oberg's disruptive book on screenplay development and story structure. This practical, no-nonsense guide leaves behind one-size-fits-all story theories and adapts the development process to each individual project, making it a valuable resource for anyone involved creatively in the film and TV industry: writers, directors, producers, development execs, showrunners and, more generally, storytellers keen to reach a wide audience without compromising their creative integrity.

Having identified three main story-types – plot-led, character-led and theme-led – Oberg reveals in Screenwriting Unchained how each of these shapes the structure of any screenplay, and how a single set of tools can be used to develop any movie, from an independent crossover to a studio blockbuster. This unique approach leads to a powerful yet flexible way to handle the script development process: the Story-Type Method®, a new framework that doesn't tell filmmakers what to write and when, but focuses instead on why some storytelling tools and principles have stood the test of time, and how to use them in the 21st century.

Emmanuel Oberg says: "I'm passionate about story structure and have spent much of the last twenty years exploring it as a script consultant, development exec and screenwriter, working for European producers as well as Hollywood studios. Prescriptive, dogmatic theories often limit creative freedom and lead to predictable stories. Why not offer more flexible and empowering tools for all involved? Screenwriting Unchained is an attempt to start again and go back to the roots of story structure, in a resolutely modern way."

According to early reviewers, Oberg's new approach is a game changer:

"Without question the most useful book in the marketplace on the writing and development process. […] This is a highly refreshing approach to the subject and one highly recommended to everyone from the aspiring screenwriter to the overburdened development exec." - Dan MacRae, Head of UK Development, StudioCanal

"I have worked through more than a hundred screenwriting books over the last six years, rarely completing most as they tend to repeat the same content absorbed in previous books. Screenwriting Unchained is the exception. Brilliant content, great structure and a refreshing perspective […] A must read for new and experienced screenwriters." - Piet Marais, filmmaker

About the Book:
Screenwriting Unchained is published by Screenplay Unlimited Publishing. It is available for sale at Amazon and other online retailers, as well as to order at most bookstores. The hardcover (ISBN: 978-0-9954981-2-9) retails for £29.95. The paperback (ISBN: 978-0-9954981-1-2) retails for £17.95. The eBook (ISBN: 978-0-9954981-0-5) retails for £9.99.
Book website: http://www.screenwritingunchained.com
Publisher website: http://www.screenplayunlimited.com
Where to buy: http://viewbook.at/ScreenwritingUnchained

About the Author:
Emmanuel Oberg is a screenwriter, author and script consultant with more than twenty years of experience in the film and TV industry. After selling a first project to Warner Bros as a co-writer, he went on to be commissioned by StudioCanal and Gold Circle before writing solo for Working Title / Universal and Film4. He has also designed an internationally acclaimed 3-day Advanced Development Workshop – based on the Story-Type Method – which he delivers with passion to filmmakers all over the world.
Emmanuel lives in the UK with his wife and his two daughters. His film agent is Rachel Holroyd at Casarotto in London. For more information, please contact Naomi or download the press kit directly at http://www.screenplayunlimited.com/su-press-kit
