SGI
Fremont, CA, United States

News Article | October 29, 2015
Site: www.marketwired.com

New Hybrid System of Shared-Memory and Cluster Servers Promotes Leading Research and Development in Bioinformatics

MILPITAS, CA and TOKYO, JAPAN--(Marketwired - Oct 29, 2015) - SGI Japan, Ltd., part of SGI (NASDAQ: SGI), a global leader in high-performance solutions for compute, data analytics and data management, announced the acceptance of an order from the Institute for Chemical Research (ICR) at Kyoto University in Kyoto, Japan for its next supercomputer system, an SGI® UV™ 2000. The new system will be used for ICR's advanced and interdisciplinary research in chemistry and biology. It will also be used by researchers throughout Japan as a shared computational platform for national universities and research centers. In addition, the system will serve as the platform for GenomeNet, one of the world's largest biological information services, developed by ICR's Bioinformatics Center and used by about 30,000 researchers per day from Japan and other countries. The supercomputer will promote new life sciences research based on genome information and its application to drug discovery, medical care and environmental conservation.

The large-scale hybrid supercomputer system will combine two SGI UV 2000 16-terabyte (TB) shared-memory servers based on the Intel® Xeon® Processor E5-4600 v2 Product Family with SGI Rackable Standard Depth cluster servers providing a total of 3,000 cores of the Intel® Xeon® Processor E5-2600 v3 Product Family, giving compatibility with the existing system while allowing for scalability, versatility and cost efficiency. Nearly 9.4 petabytes (PB) of storage will be combined with a fast Lustre file system and a large-capacity Network File System (NFS). SGI Japan will install the products, build the system, and perform operations and maintenance. The system will start operation in January 2016.

The system will provide ICR with a substantial improvement in performance and capacity, allowing it to support prevalent next-generation sequencers, the increase in genome sequence data due to innovative sequencing technology, and the growing number of users of the GenomeNet database. ICR also plans to promote shared use and its activities as a joint research center.

"As experimental technologies progress dramatically, supercomputers that provide rapid computation and high-accuracy analysis of large data are a requirement in the fields of chemistry, physics, and biology," said Hiroyuki Ogata, professor at the Institute for Chemical Research, Kyoto University. "We expect that the ICR supercomputer system will meet such demands to serve researchers inside and outside the university more effectively. Our laboratory specifically is looking forward to using this supercomputer system to find the correlation between the earth's environment and the life system based on large-scale environmental genome data."

The system's added physical capacity will provide 10 or more times the capacity of the current 840 TB system. The vast memory space, increased computation capability and large-scale storage are essential for calculating gene and genome information and storing analysis data, which are growing at an accelerating rate. The new system will enable the comprehensive analysis of many parameter spaces and larger, more reliable simulations for fields requiring high computation capability, such as quantum chemical calculation and biomacromolecular modeling.
"SGI's production supercomputers with large shared memory capability provide a unique tool for our customers in the life sciences," said Gabriel Broner, vice president and general manager of high-performance computing, SGI. "We are happy to be partnering with Kyoto University Institute for Chemical Research and supporting them in pushing the boundaries of knowledge in their field." This system will consist of five server groups: GenomeNet calculation servers, chemical calculation servers, GenomeNet servers, chemistry database servers and file servers. These server groups will be combined with a fast Lustre file system and a large-capacity NFS. The center of the system will contain the GenomeNet servers and chemical calculation servers. About SGI SGI is a global leader in high-performance solutions for compute, data analytics and data management that enable customers to accelerate time to discovery, innovation, and profitability. Visit sgi.com (sgi.com/) for more information. Connect with SGI on Twitter (@sgi_corp), YouTube (youtube.com/sgicorp), Facebook (facebook.com/sgiglobal) and LinkedIn (linkedin.com/company/sgi). © 2015 Silicon Graphics International Corp. All rights reserved. SGI, the SGI logo, and UV, are trademarks or registered trademarks of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries. Intel and Xeon are trademarks or registered trademarks of Intel Corporation. All other product and service names mentioned are the trademarks of their respective companies.


News Article | August 15, 2016
Site: http://www.theenergycollective.com/rss/all

Summary: Exxon's scientists conducted extensive research in the 1970s and 1980s showing that the use of fossil fuels produces greenhouse gases that warm the planet. These results were accepted by the company's management. By the 1990s, however, the fossil fuel industry changed its message. It began a campaign to plant doubts in the minds of the public about the reality of manmade global warming and its effects.

Although many large American companies have prospered because they embodied an optimistic culture, a "can-do" spirit, by the end of the 20th century the fossil fuel companies had adopted a "won't change" attitude. They resisted the scientific reality of manmade global warming, seeking to continue their business model of producing fossil fuels to supply the world's energy. Yet objective climate science shows that continued atmospheric accumulation of greenhouse gases such as carbon dioxide will lead to large increases in the global average temperature. In the face of this threat the world needs to move rapidly toward a carbon-free energy economy. The large fossil fuel companies could play a major role in this shift, were they to adopt a "can-do" spirit. The technologies already exist, the labor market is waiting, and they have the resources to carry out this transformation while remaining profitable.

Exxon's Own Research on Global Warming. As the issue of manmade global warming gained prominence in recent decades, scientists at Exxon carried out research on the phenomenon in the company's own laboratories. This program was extensive enough that it led to about 50 publications in peer-reviewed scientific journals. An Exxon scientist involved in this work told his corporate superiors that the company should embark on measures to reduce emissions of greenhouse gases (GHGs).

By the end of the 1980s, however, the corporate view on warming had reversed. The company sought to raise doubts about the reality of global warming. As the United Nations-sponsored negotiations leading to the Kyoto Protocol, which limited emissions from developed countries, came to fruition, Exxon and other fossil fuel companies opposed the treaty, undertaking extensive lobbying efforts. When the time came to consider the Protocol in the U.S. Senate in 1997, the vote to take up the treaty for debate failed unanimously.

Promoting Doubt about Global Warming. From that period to the present, fossil fuel companies have engaged in a campaign that seeks to raise questions about global warming in the minds of citizens, leading them to doubt its reality and especially its manmade cause. This tactic had previously been used to advantage by the tobacco industry in its struggle to dissociate smoking from its harmful effects on smokers' health (Naomi Oreskes and Erik M. Conway, "Merchants of Doubt", Bloomsbury Press, 2010). Indeed, some of the same personalities, having seemingly authoritative backgrounds in science, appeared in both battles. Such doubt reduces the pressure for change, such as abandoning the extraction of fossil fuels whose combustion emits carbon dioxide, an important GHG.

"Can-Do" Optimism. This writer pointed out in the preceding blog post that the optimistic, "can-do" spirit that drove the growth of the U.S. during the nineteenth and early twentieth centuries has faded somewhat. Now many powerful industries, including the fossil fuel industry, when faced with indisputable scientific evidence indicating that they should change their operations, resist stubbornly.
They do this by promoting doubts about the scientific findings in question, and by adopting "won't change" attitudes.

Simplified historical sketches of three large American corporations are presented below. The first two are considered "can-do" companies here: they have confronted technological changes in their industries and adopted new business plans to adapt to them. The third is ExxonMobil.

International Business Machines (IBM) was founded in 1911 when three companies combined to form the Computing-Tabulating-Recording Company, IBM's precursor. The company wrote "[f]rom the beginning, IBM defines itself not by strategies or products—which range from commercial scales to punch card tabulators—but by forward-thinking culture and management practices…". This culture informed the company's development of new machines that, for example, provided the data processing that enabled the Social Security Act of 1935, a major piece of anti-Great Depression policy, to function. In the post-war years IBM expanded by developing electronic computers, which permitted it to become an important international company. In the following decade IBM responded to new challenges by expanding its mainframe computer systems, leading to five-fold growth in income. From 1971 to 1992 it overhauled its business model to adapt to the growing trend toward personal computing, but even so confronted difficulties created largely by its own successes. As use of the internet has expanded, IBM has moved away from hardware to provide software and computing services.

General Electric (GE) grew out of Thomas Edison's electric lighting enterprises, organized in 1889 as Edison General Electric and built on his electric lamps, electric generators and motors, and power distribution. Over the years it developed or acquired businesses such as radio broadcasting, railroad and aircraft locomotion, electronic computing, finance, and medical diagnostic technology. The list of its acquisitions and divestments is extensive, and was responsive to the changing business environments the company's operations encountered. Finance, for example, was intended to provide funds for purchasing its products, but portions have been sold off in recent years. Additionally, as an outgrowth of its power generation technology, GE manufactures industrial-scale wind turbines.

ExxonMobil traces its origins to shortly after oil was discovered in Pennsylvania in 1859. John D. Rockefeller formed Standard Oil in 1870 to harvest the oil; at that time the principal refined product was kerosene. The company grew rapidly, operating in many states and abroad, but was forced to break up its units by two antitrust decisions, first by the Ohio Supreme Court in 1892 and then by the U.S. Supreme Court in 1911. Nevertheless, the separate companies prospered and many recombined to form a smaller number of Standard companies. In this period, as the number of automobiles powered by internal combustion engines grew, gasoline production surpassed that of kerosene.

Over the following decades, the various Standard Oil companies focused on extracting petroleum, refining it to yield gasoline, and distributing it at retail. They also produced lubricating oils, petrochemicals and other products as the years passed. A principal Standard Oil company adopted the brand "Esso" (pronouncing the letters "S" and "O") in 1926, and the name was changed to Exxon in 1972. Exxon merged with Mobil, itself another former Standard Oil company, in 1999 to become ExxonMobil. ExxonMobil is now the largest non-state producer of petroleum and petroleum products in the world.
Exxon set up its Solar Power Corporation in 1973 to make solar photovoltaic cells. After determining that solar would not become profitable until 2012, Exxon sold Solar Power off in 1984. (Mobil also had a solar venture from 1974 to 1994.) From 1970 to 1986 Exxon ran a nuclear fuel preparation company. Wikipedia (as of August 2016) lists no other non-fossil-fuel ventures for Exxon.

In 2002 ExxonMobil, General Electric and others formed the Global Climate and Energy Project at Stanford University to develop new energy technologies with greatly reduced greenhouse gas emissions. It intended to invest more than US$200 million over 10 years, and as of 2015, 46 research institutions worldwide were participating. In 2009 ExxonMobil and Synthetic Genomics Inc. (SGI) began research in their algae biofuels program. Unfortunately, the project did not produce positive results and ExxonMobil cut back its support severely in 2013. The company also established an energy research program with the University of Texas at Austin Energy Institute.

Two "Can-Do" Companies. IBM and GE are categorized here as "can-do" companies. Their histories embody a spirit of optimism, confronting the challenges of changing times and overcoming them to continue as successful ventures with new products. IBM especially had troubles, and is currently dealing with yet more rapid change in information technology.

Energy demand is high and increasing, especially in countries of the developing world. As their economies grow and their populations become more affluent, entering the middle classes, the need for energy supplies grows in step. The most populous developing countries are China and India, and their demand for energy has grown dramatically in recent decades. In addition, whereas China's population is relatively stable, that of India is growing rapidly. In many developing countries, including India, energy demand is fed both by growth in economic activity and by increasing populations. The world's population, now more than 7 billion people, is expected to grow to 9 billion by about 2040.

The fossil fuel industry has not had to overcome industry-wide challenges in order to grow. The long-term worldwide demand for fossil fuels has never been in doubt (barring a few short-term hiccups and the current decline in demand for coal in the U.S. and Europe). Commercial pressures in this industry have come primarily from within, to improve its core technologies, rather than from a need to reinvent the companies by formulating new business models. The large size of this industry, and the confidence that demand for its products would stretch indefinitely into the future, has led to its seeming complacency.

ExxonMobil is actively resisting change. In order to continue defending the role of the industry in the global economy, ExxonMobil and other fossil fuel companies are digging in their heels, resisting pressures for change arising from the need to minimize further warming of our planet. As outlined above, their attitude is one of resisting change, and of mobilizing their considerable financial resources and political influence to create doubt and oppose the need for change. This may be changing, however: six non-U.S. oil companies and ExxonMobil have recently endorsed a price on carbon.

Total accumulated GHGs dictate how much warming (on average) Earth will undergo.
As manmade carbon dioxide and other GHGs continue accumulating in the atmosphere, the projected increase in global average temperature rises accordingly. This is shown in a chart from the Fifth Assessment Report (2013) of the Intergovernmental Panel on Climate Change, captioned below:

Dependence of the projected change in global average temperature from about 1870 to 2100 (vertical axis) on the total amount of manmade carbon dioxide emitted (as the main GHG; horizontal axis), foreseen under four emission "scenarios" with cumulative emissions counted from about 1870. Circles represent each decade. BLACK: historical data to 2010. DARK BLUE, LIGHT BLUE, ORANGE, and RED represent projections for successively less stringent limits on emissions, from almost complete elimination of emissions by 2050 to no meaningful limitation at all. Source: Intergovernmental Panel on Climate Change, Fifth Assessment Report, http://www.climatechange2013.org/images/report/WG1AR5_SPM_FINAL.pdf

The graphic makes clear that the increase in the total accumulated manmade carbon dioxide in the air governs the change in global average temperature. Only by minimizing future emissions using the most stringent constraints possible (DARK BLUE line and dots in the chart) can humanity keep further increases in global average temperature as small as possible.

The fossil fuel industry needs to change its business model rapidly. In the 1970s-1980s Exxon understood the threat from global warming. But by the end of the 20th century it and the rest of the fossil fuel industry were sowing doubts about manmade global warming and resisting the need to change. Even so, the reality shown in the chart above is that continued extraction and burning of fossil fuels moves the world further along the trajectory of higher atmospheric carbon dioxide and higher global average temperature, to the detriment of all humanity.

There is no avoiding the imperative to abandon further extraction and to begin now to decarbonize the world's energy economy. Many of the needed technologies already exist, the labor force awaits the coming job opportunities, and the large international fossil fuel companies have the resources to undertake these changes. What is missing is the will to change their business models. The world looks to them to adopt the "can-do" spirit that drives capitalism, leading to new sources of revenue and profit, by creating a decarbonized economy.
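One rough way to express the near-linear relationship the chart depicts is through the AR5 report's own summary measure, the transient climate response to cumulative carbon emissions (TCRE), which AR5 assesses as likely between 0.8 °C and 2.5 °C of warming per 1000 GtC of carbon emitted:

```latex
% Approximate linear relation between warming and cumulative carbon emissions (IPCC AR5):
% \Delta T is the change in global average temperature and C_cum is the total
% manmade CO2 emitted, expressed as carbon.
\[
  \Delta T \;\approx\; \mathrm{TCRE}\times C_{\mathrm{cum}},
  \qquad
  \mathrm{TCRE}\ \text{likely in}\ [0.8,\;2.5]\ ^{\circ}\mathrm{C}\ \text{per}\ 1000\ \mathrm{GtC}.
\]
```

On this scaling, holding warming to any given level implies a finite cumulative carbon budget, which is why the most stringent (DARK BLUE) scenario in the chart requires emissions to fall toward zero by mid-century.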


News Article | April 16, 2016
Site: http://www.sciencedaily.com/news/

In a small phase I clinical trial, researchers report for the first time that the experimental drug guadecitabine (SGI-110) is safe in combination with the chemotherapy drug irinotecan and may overcome resistance to irinotecan in patients with metastatic colorectal cancer.


News Article | April 26, 2013
Site: yourstory.com

It has been five years since Seclore Technology, a security software company incubated at IIT Bombay, was founded. Seclore helps corporations secure their internal data from being compromised, providing security solutions in the areas of information usage control, information rights management (IRM) and secure outsourcing. When we last spoke to co-founder Vishal Gupta in 2009, it was to understand Seclore, its business model, and the challenges and road ahead. Now, four years later, with multiple customers and a fresh round of Series A funding worth $6 million from Helion Venture Partners and Ventureast Proactive Fund, Vishal is raring to take Seclore to the next frontier. YourStory caught up with Vishal and Rahul Chandra, MD, Helion, about the journey so far and the road ahead. Excerpts.

YS: How has Seclore grown over the last four years?

Vishal: Seclore has evolved very rapidly in that time. From a campus startup doing R&D, we are now a young and growing company with more than 3 million users and customers in 15 countries. In terms of revenue, the company has grown nearly 20 times in the last four years. Due to this rapid growth, there have been constant process re-engineering exercises within the company across processes and business functions.

YS: What have the key lessons been over these years?

Vishal: There have been many lessons. We've seen the challenges of acquiring and retaining customers spread across multiple time zones and cultures. Besides that, creating a product mindset in people who have largely been exposed to the services industry has been the other key lesson. These two put together almost feels like doing two MBAs simultaneously!

YS: What keeps you going? And how has the initial motivation to start fared in comparison?

Vishal: The motivation to start was simple: control misuse of information. There are a lot of scenarios where we share information with other people but don't want them to "misuse" it. This fine line between use and misuse is impossible to define and control with traditional security and control systems. These systems always focus on preventing "unauthorized users" from gaining access to information, but don't worry about unauthorized usage of the information by authorized users. A new paradigm for establishing information "ownership" needed to be defined. There were a lot of unanswered questions in this context, like "what happens to all the information shared with people and business associates when relationships change", or "how can one monitor what people are doing with the information that was shared in good faith", and we needed to provide those answers. What kept us going beyond that idea was the constant validation from customers, partners and investors about our technology, the business and, of course, the people.

YS: You've just got in a fresh round of investment. How was the process?

Vishal: Compared to the seed stage, this round was a very smooth ride and we were able to manage the whole process without taking our eyes off the business. There were multiple factors that helped us. The technology has gone through significant customer validation and we have more than 200 existing enterprises using it. Consequently, the business is profitable and cash flows are positive. Also, the investors had been monitoring our progress well before we formally started discussions. Our investment bank also managed the whole process really well.

Rahul Chandra, MD of Helion, took over for the next couple of questions.

YS: What kind of growth has the company seen?
Rahul: Seclore has grown about 20 times in revenue in the last three years and has been consistently named in the "Deloitte Technology Fast 50" as one of the fastest growing technology companies. They have grown from 7 people to 80 in that period, and have set up sales locations in three cities in India and in four countries outside India.

YS: Any last words for entrepreneurs starting out and hoping to make it big?

Vishal: Nothing new… Just stay hungry and stay foolish.

Vishal has managed to build Seclore and take it to the heights it has reached today by following that advice, and we wish Seclore continued success in the years to come.
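The distinction Vishal draws above between blocking unauthorized users and controlling what authorized users do with information can be sketched in a few lines. The sketch below is purely illustrative: the policy model, class and method names are hypothetical and do not describe Seclore's actual product or APIs.

```python
# Illustrative sketch of information usage control (IRM), as contrasted above with
# traditional access control: even an authorized user's individual actions (view,
# edit, print, forward) are checked against rights granted by the information's
# owner, and those rights can be revoked after sharing. Hypothetical model only.

from dataclasses import dataclass, field
from typing import Dict, List, Set, Tuple

@dataclass
class ProtectedDocument:
    owner: str
    rights: Dict[str, Set[str]] = field(default_factory=dict)   # per-user granted actions
    audit_log: List[Tuple[str, str, str]] = field(default_factory=list)

    def grant(self, user: str, actions: Set[str]) -> None:
        self.rights[user] = set(actions)

    def revoke(self, user: str) -> None:
        # Ownership persists after sharing: rights can be withdrawn when a relationship changes.
        self.rights.pop(user, None)

    def request(self, user: str, action: str) -> bool:
        allowed = action in self.rights.get(user, set())
        self.audit_log.append((user, action, "allowed" if allowed else "denied"))
        return allowed

if __name__ == "__main__":
    doc = ProtectedDocument(owner="acme-corp")
    doc.grant("contractor", {"view"})            # authorized user, limited usage
    print(doc.request("contractor", "view"))      # True: access is authorized
    print(doc.request("contractor", "forward"))   # False: this usage is not
    doc.revoke("contractor")                      # relationship ends
    print(doc.request("contractor", "view"))      # False: rights withdrawn
    print(doc.audit_log)                          # owner can monitor usage
```

The point of the sketch is that every action is checked against rights the owner can change or revoke after sharing, and the audit trail addresses the question of monitoring what recipients do with information shared in good faith.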


Researchers from the J. Craig Venter Institute (JCVI) and Synthetic Genomics, Inc. (SGI) have designed and constructed the first minimal synthetic bacterial cell, JCVI-syn3.0. Starting from the first synthetic cell, Mycoplasma mycoides JCVI-syn1.0 (created by this same team in 2010, earlier post), JCVI-syn3.0 was developed through a design, build, and test process using genes from JCVI-syn1.0.

The new minimal synthetic cell contains 531,560 base pairs and just 473 genes, making it the smallest genome of any organism that can be grown in laboratory media. Of these genes, 149 are of unknown biological function. By comparison, the first synthetic cell, M. mycoides JCVI-syn1.0, has 1.08 million base pairs and 901 genes.

A paper describing this research is being published in the journal Science by lead authors Clyde A. Hutchison, III, Ph.D. and Ray-Yuan Chuang, Ph.D., senior author J. Craig Venter, Ph.D., and a senior team of Hamilton O. Smith, MD, Daniel G. Gibson, Ph.D., and John I. Glass, Ph.D.

Our attempt to design and create a new species, while ultimately successful, revealed that 32% of the genes essential for life in this cell are of unknown function, and showed that many are highly conserved in numerous species. All the bioinformatics studies over the past 20 years have underestimated the number of essential genes by focusing only on the known world. This is an important observation that we are carrying forward into the study of the human genome.

The research to construct the first minimal synthetic cell at JCVI was the culmination of 20 years of research that began in 1995 after the genome sequencing of the first free-living organism, Haemophilus influenzae, followed by the sequencing of Mycoplasma genitalium. A comparison of these two genomes revealed a common set of 256 genes, which the team thought could be a minimal set of genes needed for viability. In 1999 Dr. Hutchison led a team that published a paper describing the use of global transposon mutagenesis techniques to identify the nonessential genes in M. genitalium. Over the last 50 years more than 2,000 publications have contemplated minimal cells and their use in elucidating first principles of biology. From the start, the goal of the JCVI team was similar: build a minimal operating system of a cell in order to understand biology, but also to have a desirable chassis for use in industrial applications.

The creation of the first synthetic cell in 2010 did not inform new genome design principles, since the M. mycoides genome was mostly recapitulated as it is in nature. Rather, it established a workflow for building and testing whole genome designs, including a minimal cell, from the bottom up, starting from a genome sequence.

To create JCVI-syn3.0, the team used an approach of whole genome design and chemical synthesis followed by genome transplantation to test whether the cell was viable. Their first attempt to minimize the genome began with a simple approach using information in the biochemical literature and some limited transposon mutagenesis work, but this did not result in a viable genome. After improving the transposon methods, they discovered a set of quasi-essential genes that are necessary for robust growth, which explained the failure of their first attempt.
To facilitate the debugging of non-functional reduced genome segments, the team built the genome in eight segments so that each could be tested separately before combining them to generate a minimal genome. The team also explored gene order and how it affects cell growth and viability, noting that gene content was more critical to cell viability than gene order. They went through three cycles of designing, building, and testing, ensuring that the quasi-essential genes remained, which in the end resulted in a viable, self-replicating minimal synthetic cell that contained just 473 genes, 35 of which are RNA-coding. In addition, the cell contains a unique 16S gene sequence.

The team was able to assign biological function to the majority of the genes, with 41% of them responsible for genome expression information, 18% related to cell membrane structure and function, 17% related to cytosolic metabolism, and 7% related to preservation of genome information. However, a surprising 149 genes could not be assigned a specific biological function despite intensive study. This remains an area of continued work for the researchers.

The team concludes that a major outcome of this minimal cell program is a set of new tools and semi-automated processes for whole genome synthesis. Many of these synthetic biology tools and services are commercially available through SGI and SGI-DNA, including a synthetic DNA construction service specializing in building large and complex DNA fragments (including combinatorial gene libraries), Archetype genomics software, Gibson Assembly kits, and the BioXp, a benchtop instrument for producing accurate synthetic DNA fragments.

Other authors on the paper are: Thomas J. Deerinck and Mark H. Ellisman, Ph.D., University of California, San Diego National Center for Microscopy and Imaging Research; James F. Pelletier, Center for Bits and Atoms and Department of Physics, Massachusetts Institute of Technology; and Elizabeth A. Strychalski, National Institute of Standards and Technology.

This work was funded by SGI, the JCVI endowment and the Defense Advanced Research Projects Agency's Living Foundries program, HR0011-12-C-0063.
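The segment-wise design, build, and test loop described above can be summarized in a short sketch. It is purely illustrative: the function names, toy gene sets and the `assay` stand-in are hypothetical and are not JCVI's software; the only idea taken from the article is that each reduced segment is tested in an otherwise unreduced background, and a less aggressive design retaining quasi-essential genes is kept whenever a reduction proves non-viable.

```python
# Purely illustrative sketch of the segment-wise design-build-test strategy
# described above. All names here are hypothetical; the `assay` callable stands
# in for the real build step (chemical synthesis plus genome transplantation).

from typing import Callable, List, Set

Segment = Set[str]  # a genome segment modeled as a set of gene names

def minimize_genome(
    reduced: List[Segment],     # candidate reduced designs, one per segment
    fallback: List[Segment],    # less-reduced designs retaining quasi-essential genes
    assay: Callable[[List[Segment]], bool],
) -> List[Segment]:
    """Test each reduced segment in an otherwise unreduced background; keep the
    reduction only if the resulting genome is viable, otherwise keep the fallback
    design for that segment. Finally verify the combined genome."""
    accepted = list(fallback)
    for i, candidate in enumerate(reduced):
        trial = accepted[:i] + [candidate] + accepted[i + 1:]
        if assay(trial):
            accepted[i] = candidate   # this reduction is tolerated
    assert assay(accepted), "combined minimal genome should still be viable"
    return accepted

if __name__ == "__main__":
    # Toy example with three segments (the real design used eight): a genome is
    # "viable" only if every essential and quasi-essential gene is present somewhere.
    essential = {"dnaA", "rpoB", "ftsZ"}
    quasi_essential = {"growth1"}            # needed for robust growth
    fallback = [{"dnaA", "junk1"}, {"rpoB", "growth1", "junk2"}, {"ftsZ", "junk3"}]
    reduced = [{"dnaA"}, {"rpoB"}, {"ftsZ"}]
    viable = lambda genome: (essential | quasi_essential) <= set().union(*genome)
    print(minimize_genome(reduced, fallback, viable))
```

In the team's actual work the assay step was chemical synthesis followed by genome transplantation, repeated over three design-build-test cycles.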
