Clifton Park, NY, United States

Kitware, Inc. is a technology company headquartered in Clifton Park, New York. The company specializes in the research and development of open-source software in the fields of computer vision, medical imaging, visualization, 3D data publishing, and technical software development. In addition to software development, the company offers other products and services such as books, technical support, consulting, and customized training courses.



Grant
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 3.94M | Year: 2014

The achievements of modern research and their rapid progress from theory to application are increasingly underpinned by computation. Computational approaches are often hailed as a new third pillar of science, alongside empirical and theoretical work. While its breadth makes computation almost as ubiquitous a tool in science and engineering as mathematics, it is a much younger discipline and stands to benefit enormously from increased capacity and from greater integration, standardization, and professionalism. The development of new ideas and techniques in computing is extremely rapid, the progress enabled by these breakthroughs is enormous, and their impact on society is substantial: modern technologies such as the Airbus A380, MRI scanners and smartphone CPUs could not have been developed without computer simulation; progress on major scientific questions from climate change to astronomy is driven by the results of computational models; and major investment decisions are underwritten by computational modelling. Furthermore, simulation modelling is emerging as a key tool within domains experiencing a data revolution, such as biomedicine and finance. This progress has been enabled by the rapid increase in computational power, which in the past came from ever faster processor clock rates. Clock rates cannot be increased much further, however, and recent computational architectures (such as GPUs and the Intel Phi) instead provide additional computational power through on the order of hundreds of computational cores in the same unit. This opens up the potential for further order-of-magnitude performance improvements, but it requires additional specialist training in parallel programming and computational methods to exploit. Computational advances are enabled by new hardware; by innovations in algorithms, numerical methods and simulation techniques; and by the application of best practice in scientific computational modelling. The most effective progress and highest impact come from combining, linking and simultaneously exploiting step changes in hardware, software, methods and skills. However, good computational science training is scarce, especially at the postgraduate level. The Centre for Doctoral Training in Next Generation Computational Modelling will train 55+ graduate students to address this skills gap. Trained as future leaders in Computational Modelling, they will form the core of a community of computational modellers crossing disciplinary boundaries, constantly working to transfer the latest computational advances to related fields. By tackling cutting-edge research from fields such as Computational Engineering, Advanced Materials, Autonomous Systems and Health, whilst communicating their advances and working together with a world-leading group of academic and industrial computational modellers, the students will be perfectly equipped to drive advanced computing over the coming decades.
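
To illustrate the kind of many-core parallelism this abstract refers to, the following is a minimal sketch (not part of the grant) that spreads an embarrassingly parallel computation across available cores using Python's standard multiprocessing module; the integrand and grid sizes are purely illustrative.

```python
# Minimal illustration of exploiting many cores with Python's standard library.
# The integrand and chunk sizes are purely illustrative; any independent
# per-chunk computation could be substituted.
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """Midpoint-rule contribution to the integral of x**2 over [a, b]."""
    a, b, n = bounds
    h = (b - a) / n
    return sum(((a + (i + 0.5) * h) ** 2) * h for i in range(n))

if __name__ == "__main__":
    cores = cpu_count()
    # Split [0, 1] into one chunk per core and integrate the chunks in parallel.
    edges = [i / cores for i in range(cores + 1)]
    chunks = [(edges[i], edges[i + 1], 100_000) for i in range(cores)]
    with Pool(processes=cores) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(f"{cores} cores, integral of x^2 over [0,1] ~= {total:.6f}")  # ~0.333333
```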


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2012

Observations unequivocally show that the global climate is changing and the results are dramatic. With the accelerating pace of climate change, the impact of these changes will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. These stakeholders need access to, and resources for using, climate data that are designed with the non-researcher in mind. Unfortunately, this is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. The aim of this proposal is to develop ClimatePipes, a platform that will make this valuable data available to non-researchers including policy makers, health officials, agriculturalists, and industry leaders. The ClimatePipes cyberinfrastructure will provide state-of-the-art, user-friendly access, analysis, and visualization of data generated from high-resolution, long-term climate change projections performed as part of the U.S. Global Change Research Program, the DOE-funded ESGF, the NSF-funded DataONE, and the NASA-funded EOSDIS. ClimatePipes is not a replacement for high-end tools for scientists; instead it provides simple, intuitive, and effective workflows that can be used by non-researchers and non-programmers. It builds on top of existing high-end tools, and is designed to easily leverage functionality added to those tools, thus ensuring state-of-the-art functionality in the years to come. ClimatePipes was successfully developed as a web-based tool that provides workflow and form-based interfaces for accessing, querying, and visualizing interesting datasets from one or more sources. This was integrated with ESGF and tools from UV-CDAT for data manipulation and visualization, and ClimatePipes demonstrated production of relevant data and visualizations in response to natural language queries. Phase II will focus on implementing mechanisms to support more elaborate and relevant queries, and on improving the usability, robustness, and scalability of the system. To support these types of queries, the team will develop a semantic search tool using natural language processing techniques. The team will also add the capability to perform appropriate transformations to bring relevant data into a common reference system. Upon completion of Phase II, ClimatePipes will provide an interface for running computations in the cloud or on a user-provided cluster, to accommodate the compute power required. Further, the team will develop custom visualizations for climate applications, an API, and an appropriate user interface for uploading data to the server for analysis and integration.
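
The form-based query-to-visualization workflow described above can be pictured with a short, hedged sketch. It uses the general-purpose xarray and matplotlib libraries rather than the actual ClimatePipes or UV-CDAT code, and the dataset URL, variable name, and region below are placeholders, not real ClimatePipes inputs.

```python
# Illustrative only: a simplified "query -> subset -> visualize" workflow of the
# sort ClimatePipes exposes through its web forms. The OPeNDAP URL, variable
# name, and bounding box are placeholders.
import xarray as xr
import matplotlib.pyplot as plt

DATASET_URL = "https://example.org/thredds/dodsC/placeholder_cmip_tas.nc"  # placeholder

ds = xr.open_dataset(DATASET_URL)    # lazily open a remote dataset
tas = ds["tas"]                      # near-surface air temperature (assumed variable name)

# "Query": restrict to a region and time span a non-researcher might ask about.
subset = tas.sel(lat=slice(35, 45), lon=slice(280, 295),
                 time=slice("2050-01-01", "2059-12-31"))

# "Analysis": decadal mean over time, then a simple map-style plot.
decadal_mean = subset.mean(dim="time")
decadal_mean.plot()                  # quick-look visualization
plt.title("Placeholder: projected decadal mean near-surface temperature")
plt.show()
```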


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 149.81K | Year: 2012

The integration of Computer Aided Design (CAD) and Computer Aided Engineering (CAE) into the product design and manufacturing process has been shown to be a major benefit in terms of reducing both time and cost as well as increasing reliability. However, under current practices the full potential of these concepts has yet to be achieved, due to two major obstacles. The first is that the cost of commercial end-to-end systems such as Fluent, Unigraphics, and ANSYS tends to be quite high. The second is that many of these systems tend to be general purpose and non-intuitive for non-simulation experts to use. In addition, due to the increasing complexity and scale of the problem domains being addressed, the simulations that need to be run must be coupled with high performance computational techniques. In response to this need there are several open-source products developed under various DOE projects, such as SciDAC, that address part of the simulation lifecycle, but none offer a complete solution. Moreover, most of these solutions come in the form of toolkits, which lack intuitive user interfaces and have a relatively high learning curve, thereby limiting their adoption in commercial product design and manufacturing. We propose to develop a suite of open source applications that will address the complete simulation lifecycle, from geometric modeling to visualization of simulation results. The suite will be based on a flexible framework built upon existing open source HPC toolkits including CGM, OpenCASCADE, MOAB, MeshKit, and ParaView. The applications will be customizable and can therefore be targeted towards specific vertical markets. The initial problems we have chosen are computational fluid dynamics, which will use the Nek5000 solver developed by Argonne National Laboratory, and MCNP, a general-purpose Monte Carlo N-Particle code for neutron, photon, electron, or coupled neutron, photon, and electron transport. The system will be designed so that different toolkits can be interchanged in order to address the needs of a particular application area, and the applications in the suite will share a consistent graphical user interface. The suite will include a Geometric Model Builder for defining and modifying the geometric representation of the problem domain; a Simulation Builder for adding the additional information required to define an analysis, as well as an interface to the mesher and solver; and a Results Visualizer for exporting the results from the simulation. All the applications will support a client/server model so that a user can make use of HPC facilities from his or her computer. Finally, the suite will allow analysis experts to predefine conceptual simulation models that a non-expert can then apply to the specific geometric models they have defined. The technology developed here will also benefit the larger scientific computing community, since the code will be released under open source licenses (non-reciprocal Apache or BSD) within open source communities. Researchers, educators, and commercial enterprises will thus be able to tune the suite to their particular workflow, easily research specialized areas of interest, and leverage international communities to develop leading edge technology.
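
A hedged sketch of the "expert defines a simulation template, non-expert applies it to their geometry" idea follows. All class and attribute names here are hypothetical; the real suite is driven through graphical applications, not this API.

```python
# Hypothetical illustration of expert-defined simulation templates being applied
# by a non-expert to a specific geometry. Not the actual suite's interface.
from dataclasses import dataclass, field

@dataclass
class SimulationTemplate:
    """Conceptual simulation model predefined by an analysis expert."""
    name: str
    solver: str                      # e.g. "Nek5000" or "MCNP"
    required_attributes: list = field(default_factory=list)

@dataclass
class SimulationJob:
    template: SimulationTemplate
    geometry_file: str               # geometry produced in the Model Builder
    attribute_values: dict

    def validate(self):
        missing = [a for a in self.template.required_attributes
                   if a not in self.attribute_values]
        if missing:
            raise ValueError(f"missing attributes: {missing}")
        return True

# Expert side: capture the knowledge needed to set up a CFD run once.
cfd_template = SimulationTemplate(
    name="internal-flow-cfd",
    solver="Nek5000",
    required_attributes=["inlet_velocity", "outlet_pressure", "viscosity"],
)

# Non-expert side: attach their geometry and fill in the blanks.
job = SimulationJob(cfd_template, "pump_housing.step",
                    {"inlet_velocity": 2.5, "outlet_pressure": 0.0, "viscosity": 1e-3})
job.validate()   # would then be handed to the meshing/solver interface
```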


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 149.93K | Year: 2012

Cosmological simulations play a very important role in the DOE High Energy Physics Cosmic Frontier program. They are used to connect fundamental physics with observation, and are especially critical in the effort to understand Dark Energy. Current and upcoming cosmological observations that survey large areas of the sky produce very large datasets. For example, the Large Synoptic Survey Telescope (LSST) will produce up to 30 terabytes of data per night. Simulations carried out to interpret results from surveys of this type already produce terabytes of data per simulation, and this will rise to many petabytes within a decade. To have any hope of realizing, encapsulating, and interpreting the enormous wealth of information contained in such datasets, it is necessary to find very efficient ways to explore and analyze them, with the goal of eventually automating many such tasks. Critical challenges facing such simulations include workflow I/O and the lack of domain-specific data analysis algorithms. Overcoming these challenges requires a revolutionary shift in the way cosmological predictions are obtained. Instead of the traditional workflow, where simulation codes are run for days and the analysis is conducted as a post-processing step, on-line methods are required that enable scientists to analyze the data in tandem with the evolving simulation. To address this, the software infrastructure that we propose to create will provide the following functionality:

1. Data reduction to minimize I/O
2. Robust and efficient halo-extraction methods
3. On-line/forward tracking of halos to capture halo formation dynamics
4. In-situ and co-visualization capabilities

The development of such an infrastructure will pave the way toward the analysis of the exascale datasets that are expected within the coming decade. The basic strategy for achieving this consists of (i) extending and optimizing existing halo-extraction techniques to improve robustness; (ii) developing an on-line halo-tracking method to be used in tandem with the simulation code at time-step resolution, enabling insights unavailable with present approaches; (iii) minimizing the I/O bottleneck by reducing output to only halos, halo-formation history, and a sub-sample of the remaining particles; and (iv) leveraging ParaView's in-situ and co-visualization capabilities. In addition to cosmological applications, the framework to be developed has broader applicability to industries that use particle-based simulation techniques, including astrophysics, ballistics and defense, volcanology, oceanology, solid mechanics modeling, and various maritime applications. By addressing the wider issues associated with large-data simulations, this work will drive innovation in the computational sciences, be adaptable to many industries, and facilitate the transition from terascale work to peta- and eventually exascale computing.
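
As a hedged illustration of the per-timestep data reduction described above, the sketch below implements a very simple friends-of-friends halo finder plus particle subsampling on synthetic data. It stands in for the proposal's optimized halo-extraction code; the function names, linking length, and thresholds are illustrative only.

```python
# Hedged sketch of per-timestep in situ data reduction: a simple friends-of-friends
# (FOF) halo finder plus particle subsampling. Illustrative, not the proposed code.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def fof_halos(positions, linking_length, min_members=10):
    """Label particles connected through neighbors within the linking length."""
    n = len(positions)
    pairs = np.array(list(cKDTree(positions).query_pairs(r=linking_length)))
    if len(pairs) == 0:
        return np.full(n, -1)
    graph = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    _, labels = connected_components(graph, directed=False)
    counts = np.bincount(labels)
    labels[counts[labels] < min_members] = -1   # discard tiny groups
    return labels

def reduce_timestep(positions, linking_length=0.03, sample_fraction=0.01,
                    rng=np.random.default_rng(0)):
    """Keep halo particles plus a small random sub-sample of the rest."""
    labels = fof_halos(positions, linking_length)
    keep = labels >= 0
    background = np.where(~keep)[0]
    if len(background) > 0:
        n_sample = min(len(background), max(1, int(sample_fraction * len(background))))
        keep[rng.choice(background, size=n_sample, replace=False)] = True
    return positions[keep], labels[keep]

# In an in situ setting this would be called from the simulation's timestep loop:
pos = np.random.default_rng(1).random((5000, 3))
reduced_pos, halo_ids = reduce_timestep(pos)
print(f"kept {len(reduced_pos)} of {len(pos)} particles")
```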


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 1.72M | Year: 2013

In 2011, US electricity generation was 4,344 billion kWh gross, with 821 TWh (19%) from nuclear power reactors. The US has 104 nuclear power reactors, 69 pressurized water reactors and 35 boiling water reactors, in 31 states, operated by 30 different power companies. Almost all of the US nuclear generating capacity comes from reactors built between 1967 and 1990. There have been no new construction starts since 1977, largely because for a number of years gas generation was considered more economically attractive and because construction schedules were frequently extended by opposition, compounded by heightened safety fears following the Three Mile Island accident in 1979. A future pressurized water reactor, Watts Bar 2, is expected to start up in 2013 following the Tennessee Valley Authority's decision in 2007 to complete construction of the unit. Despite a near halt in new construction of more than 30 years, US reliance on nuclear power has continued to grow due to remarkable gains in power plant utilization through improved refueling, maintenance, and safety systems at existing plants. Advanced modeling and simulation of nuclear power reactors is critical to the design of future systems and the continued operation of the existing US plants. This proposal describes an approach to simplify and democratize advanced modeling and simulation in the nuclear energy industry by developing an open-source integrated design-analysis environment (IDAE) that works across a range of nuclear engineering applications. It will leverage millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and from the Office of Nuclear Energy's research and development. The proposed design-analysis environment will leverage existing open-source toolkits, creating a graphical end-to-end umbrella guiding end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. The proposed framework will deliver strategic advancements in meshing, data exchange, and visualization for ensembles, uncertainty quantification and analysis. The proposed environment will address all three of the identified challenges to the nuclear energy industry's use of advanced modeling and simulation. Through the use of this unique design-analysis environment, an ecosystem of synergistic activities will develop, fueling a nuclear energy renaissance. It also lays the groundwork for commercialization through customization, technology integration, and advanced research and development services. These services will be tailored to nuclear energy companies, large-scale manufacturing and engineering firms, software companies, and high-performance computing infrastructure providers. By advancing the state of the art in advanced modeling and simulation, nuclear energy companies' innovation will be accelerated, and ever more realistic advanced modeling and simulation will be used to virtually design, analyze, and test tomorrow's nuclear power systems.


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1.00M | Year: 2013

The manufacturing and engineering industry needs more cost-effective and easy-to-use high-performance computing (HPC) modeling and simulation tools to design, test, and analyze innovative components and products. For the 92% of small- to medium-sized manufacturers with fewer than 100 employees, an HPC system is most likely not part of their development environment. This class of manufacturers typically relies on two-dimensional computer-aided design software and a limited amount of desktop modeling and simulation software. This under-utilization of HPC tools by this class of manufacturers is of chief concern to the Department of Energy. The proposed simulation framework and suite of tools will be developed with one eye on eliminating the barriers to adoption of HPC modeling and simulation tools for manufacturing and engineering, and the other on increasing the efficiency and effectiveness of manufacturing and engineering practices in solving real-world problems. By using these tools, manufacturers will be able to improve their design cycles and reduce their need to develop expensive physical prototypes. In Phase I, a framework referred to as the Simulation Model Based Architecture was developed, capable of supporting the entire lifecycle of a simulation. The approach was to leverage existing open-source toolkits wherever possible so that the resulting system could also be distributed as open source. The system provided several mechanisms by which HPC systems can be leveraged. In Phase II, the major focus will be improving the ease of use of the system and extending its core functionality. To address ease of use, the customization capabilities will be enhanced and workflow wizards will be developed to guide users through their specific simulation workflows. In terms of core functionality, the modeling and meshing capabilities will be enhanced and the mechanisms for data exchange with various simulation environments will be improved. In addition, more complex workflows such as design optimization will be supported. Modular frameworks such as the one proposed here lend themselves to a service business model; vendors with the skills to provide technology integration services can partner with their customers and collaborators to build valuable, competitive products and services. Such an open, modular framework provides significant business opportunities. Beyond the manufacturing community, the technology developed here will also benefit the larger scientific computing community, since the code will be released under open source licenses (non-reciprocal Apache or BSD) within open source communities. Researchers, educators, and commercial enterprises will thus be able to tune the suite to their particular workflow, easily research specialized areas of interest, and leverage international communities to develop leading edge technology.
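
A hedged sketch of the kind of design-optimization workflow such a wizard might drive is shown below: vary a design parameter, run a simulation, and minimize an objective. The run_simulation function is a placeholder for the real mesher/solver round trip, not part of the actual framework.

```python
# Illustrative design-optimization loop; the "simulation" is a fake analytic model.
from scipy.optimize import minimize_scalar

def run_simulation(fin_thickness_mm):
    """Placeholder for meshing + solving; returns a mock design objective."""
    # Pretend thicker fins cool better but carry a mass penalty.
    return (80.0 / fin_thickness_mm) + 5.0 * fin_thickness_mm

result = minimize_scalar(run_simulation, bounds=(1.0, 10.0), method="bounded")
print(f"best fin thickness ~ {result.x:.2f} mm, objective {result.fun:.2f}")
```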


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2013

Numerical simulations play a crucial role in the DOE High Energy Physics (HEP) Cosmic Frontier program. They provide a theoretical framework against which to compare observational data from surveys, and through such comparisons significant breakthroughs are made. These breakthroughs guide scientists in calibrating the underlying cosmological model to produce simulations that match the observed sky, which enhances our overall understanding of the universe. The science requirements for such surveys demand simulations of extreme scale. The ability to perform these simulations, however, is limited by the scale at which we can store, manage, and process the resulting data. Chief among the pressing challenges are: (i) the explosive growth of data, (ii) the lack of domain-specific analysis algorithms, and (iii) the disconnect between simulation (theory) and observation (experiment). Large-scale simulations are also used in manufacturing to reduce design costs; these challenges are therefore reflected across multiple scientific domains, not just cosmology. The goal of the work proposed herein is the development of an open-source, in situ analysis framework targeting large-scale simulations. The library will address the explosive growth of data by employing in situ analysis. Science-aware algorithms will be implemented to address the lack of domain-specific solutions. Lastly, to bridge the gap between simulation and experiment, the library will be integrated with a data management system that leverages web-based mining and visualization technologies to simplify scientific workflows. In Phase I, CosmologyTools was successfully developed as a lightweight in situ analysis framework and was coupled with HACC (Hardware/Hybrid Accelerated Cosmology Code), one of the core DOE gravity-only N-body codes. The Phase I effort demonstrated the feasibility of our technical approach: by employing in situ analysis, the data is reduced to the features of interest, I/O overheads and associated costs are minimized, and the scientific workflow is simplified. The Phase I effort also clearly demonstrated the need for data management to enable direct comparisons of simulations and observations. The Phase II project will focus on the development of a full-featured, production-level in situ analysis library with a number of advanced features, including novel feature identification, extraction, and tracking. Further, the library will be integrated with data management to enable direct comparison of simulation results and data from experiments. The team will also develop web-based technologies for data mining and visualization to simplify scientific workflows. Commercial Applications and Other Benefits: Upon completion of Phase II, the resulting infrastructure will have a number of key benefits and broader impacts on science and manufacturing in general. Specifically, it will (i) enable analysis of massive scientific datasets, (ii) enable scientists to glean insights not available before by embedding analysis within the simulation, (iii) enable direct comparison of simulation data and experiments, and (iv) simplify data management and scientific workflows.
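
One simple way to picture the feature tracking mentioned above is to link halos between timesteps by the particle IDs they share. The sketch below is illustrative only and is not the CosmologyTools implementation.

```python
# Hedged sketch of forward tracking of halos between timesteps by matching
# shared particle IDs, a simplified stand-in for the proposed tracking features.
def track_halos(halos_prev, halos_curr, min_overlap=0.5):
    """halos_* map halo id -> set of particle ids; return curr->prev links."""
    links = {}
    for cid, curr_particles in halos_curr.items():
        best_pid, best_frac = None, 0.0
        for pid, prev_particles in halos_prev.items():
            frac = len(curr_particles & prev_particles) / len(curr_particles)
            if frac > best_frac:
                best_pid, best_frac = pid, frac
        if best_frac >= min_overlap:
            links[cid] = best_pid         # halo cid descends from halo best_pid
        else:
            links[cid] = None             # newly formed halo
    return links

# Toy example: halo 0 persists, halo 1 is newly formed at the current step.
prev = {0: {1, 2, 3, 4, 5}}
curr = {0: {2, 3, 4, 5, 6}, 1: {10, 11, 12}}
print(track_halos(prev, curr))  # {0: 0, 1: None}
```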


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2015

Three-dimensional characterization of materials at the nano- and meso-scale has become possible with transmission and scanning transmission electron microscopes. Its importance extends to a wide class of nanomaterials, such as hydrogen fuel cells, solar cells, industrial catalysts, new battery materials and semiconductor devices, and spans high-tech industry, universities, and national labs. While capable instrumentation is abundant, this rapidly expanding demand for high-resolution tomography is bottlenecked by software that is instead tailored for lower-dose biological applications and not optimized for higher-resolution materials applications. To address this problem, this proposal will deliver a fully functional, freely distributable, open-source scanning transmission electron microscopy tomography package with a modern user interface that enables automated acquisition, alignment, and reconstruction of raw tomography data, and provides advanced segmentation, three-dimensional visualization and analysis optimized for materials applications. It will establish an extendable framework capable of full automation for high-throughput electron tomography of materials, from data acquisition to visualization. Phase I combined empirically tested tomography strategies with professional software interfaces to develop a clean and integrated application for 3D visualization and reconstruction. The cross-platform application can read transmission electron microscope image data, provides graphical tools for user-assisted alignment of data, performs basic tomographic reconstruction, and visualizes the resulting 3D volume. Within the application, an interface enables the integration of advanced reconstruction and data processing routines into the workflow, providing the foundation for Phase II capabilities. Phase II will implement advanced automated alignment and reconstruction routines, along with enhanced graphical tools to edit, align, segment, visualize and analyze the reconstructed data. Support for automated data acquisition, with manual override capability, will be added to offer a turn-key, fully automated workflow. Support for saving the full application workflow will offer reproducible, auditable, advanced data collection, reconstruction and analysis capabilities, supporting the call for reproducible science with data publication. With around 600 transmission electron microscopes worldwide and approximately 50 coming online each year, the demand for and impact of an open-source tomography tool is large. Significant opportunities exist in high-tech industry, universities, and national labs to enable or enhance three-dimensional imaging at the nanoscale and to bring automated high-throughput approaches that will accelerate progress in materials characterization and metrology. The project will support a service-based business model by enabling lab-specific acquisition and processing customization and integration, support and development that will be provided into Phase III.
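
The alignment-then-reconstruction pipeline described above can be sketched with generic scikit-image routines on a synthetic tilt series, as below. This is not the proposed package's code; the jitter model and the simple cross-correlation alignment are simplified stand-ins for the real automated routines.

```python
# Hedged sketch: simulate a misaligned tilt series, align projections by
# cross-correlation, and reconstruct one slice with filtered back-projection.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Synthetic "acquisition": project a phantom over a tilt range, then jitter each
# projection to mimic stage drift between tilts.
phantom = rescale(shepp_logan_phantom(), 0.25)
theta = np.linspace(-70, 70, 71)               # degrees, a typical tomography tilt range
sino = radon(phantom, theta=theta)             # columns are projections at each tilt
rng = np.random.default_rng(0)
jitter = rng.integers(-3, 4, size=len(theta))
jitter[0] = 0                                  # keep the first tilt as the reference
misaligned = np.column_stack([np.roll(sino[:, i], jitter[i]) for i in range(len(theta))])

# "Alignment": shift each projection to best correlate with its neighbor.
aligned = misaligned.copy()
for i in range(1, len(theta)):
    corr = np.correlate(aligned[:, i - 1], aligned[:, i], mode="full")
    offset = np.argmax(corr) - (len(aligned[:, i]) - 1)
    aligned[:, i] = np.roll(aligned[:, i], offset)

# "Reconstruction": filtered back-projection of the aligned tilt series.
recon = iradon(aligned, theta=theta, filter_name="ramp")
print("reconstructed slice shape:", recon.shape)
```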


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2015

According to the World Nuclear Association, the U.S. is the largest producer of nuclear power worldwide; in fact, the U.S. provides over 30 percent of the world's nuclear generation of electricity. Although few new reactors have been built in the past 30 years, the association anticipates that four to six reactors may be built by 2020. The first of these reactors will be built in response to 16 license applications, completed since 2007, to build 24 additional nuclear reactors. Currently, both government and industry are working to accelerate approval for the construction and design of new nuclear energy plants. The U.S. nuclear industry has already achieved significant improvements in the operation of nuclear power plants through advancements in refueling, maintenance, and safety systems at current plants. Meanwhile, changes in government policy since the late 1990s have provided for noteworthy expansion in nuclear capacity. For example, the Energy Policy Act of 2005 encouraged investment in electricity infrastructure such as nuclear power. Today, the importance of nuclear power in the U.S. is a geopolitical matter as much as an economic one, since increasing the use of nuclear power reduces the U.S.'s reliance on imported oil and gas. In order to design future systems and continue operational improvements at existing U.S. plants, advanced modeling and simulation of nuclear power reactors is crucial. In this proposal, we will develop a full-featured, open-source, Cloud/Web-based simulation environment for nuclear energy advanced modeling and simulation. Our approach simplifies the workflow, alleviates the need for in-house computational science and engineering experts, and lowers the capital investments required for advanced modeling and simulation. The Cloud/Web-based simulation environment simply guides end-users and developers through the advanced modeling and simulation lifecycle. In addition to providing significantly improved, intuitive software, the environment offers reproducible workflows in which the full path of data, from input to final analyzed results, can be saved, shared, and even published as supporting information. The work proposed here addresses current deficiencies in the use of advanced modeling and simulation by building a flexible, open-source Cloud/Web-based simulation environment, or framework, that stresses interoperability and ease of use. In particular, we are targeting nuclear energy, engineering, and manufacturing firms that can clearly benefit from this technology, given that we will make these tools easier to use, reduce the need for in-house expertise, and reduce the overall capital costs of using advanced modeling and simulation.
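
As a hedged sketch of the reproducible-workflow idea described above, the snippet below records the path from inputs to results as a small provenance document that could be saved alongside the outputs. The field names and file names are illustrative, not the proposed environment's actual schema.

```python
# Illustrative provenance record for a simulation run; all names are placeholders.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def record_run(input_files, parameters, output_files, record_path="provenance.json"):
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "parameters": parameters,
        "inputs": {f: sha256_of(f) for f in input_files},
        "outputs": {f: sha256_of(f) for f in output_files},
    }
    Path(record_path).write_text(json.dumps(record, indent=2))
    return record

# Example usage (file names are placeholders and would need to exist):
# record_run(["core_model.inp"], {"solver": "placeholder", "mesh_size": 0.5},
#            ["results.vtk"])
```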


Grant
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2016

A researcher dealing with big data today is met with a maze of languages, programming environments, data storage and query systems, and compute engines. Pursuing a new path in this space may take years and millions of dollars of investment, only to discover that a new and more applicable big data paradigm has emerged. Costs include learning programming languages, storage systems, and computing paradigms, as well as significant hardware and administrative costs of setting up and maintaining the needed environments for data storage, transfer, and computation. How this problem is being addressed: GoBig unifies and simplifies big data tools in two important areas: a unified user interface to big data software and hardware stacks, and streamlined deployment and modularity across various types of cloud and HPC systems. Data is managed through the extensible Girder data framework, an open-source project started at Kitware which provides a unified interface to many distributed storage systems along with access control and extensible plugins. Romanesco manages analyses and workflows that span programming language boundaries. The results are then persisted in Girder to be made available for further analysis or visualization. Instead of managing and supporting multiple user endpoints to various big data toolchains, user management and authorization for multiple systems may be managed by GoBig's account credentials. What is to be done in Phase I: To demonstrate the feasibility of the GoBig system in Phase I, we will show system modularity by extending computation support in GoBig to Hadoop, HPC clusters running MPI, a queueing system, and a distributed data system. We will also add Julia, Java, and Scala to the analytic programming languages supported in GoBig, and demonstrate the applicability of GoBig to a computational science domain. Our Phase I work will also demonstrate ease of deployment, including provisioning of arbitrary systems and easy installation on cloud services such as OpenStack and Amazon Web Services (AWS). This will all be performed utilizing Kitware's proven practices for agile, durable, and sustainable software. Commercial applications and other benefits: Because GoBig is open-source and extensible, the community that will grow around the aforementioned tools will foster agility and innovation while reducing maintenance cost over time. The development model used for open-source projects has also been proven to scale to thousands of developers while maintaining a high standard for quality. We will encourage the participation of developers who can add abstractions for more data storage and processing systems. GoBig's flexibility and ease of use will ultimately impact a broad range of data analysts who require a low barrier of entry to distributed compute services, including government, academia, and the business community.
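
A hedged sketch of the unified data interface Girder provides is shown below, using the girder_client Python package. The server URL, credentials, and folder layout are placeholders, and method availability may vary slightly between girder_client versions.

```python
# Illustrative use of the girder_client package against a placeholder server.
from girder_client import GirderClient

gc = GirderClient(apiUrl="https://data.example.org/api/v1")  # placeholder server
gc.authenticate("alice", "not-a-real-password")              # placeholder credentials

# Generic REST passthrough: who am I?
me = gc.get("user/me")
print("Logged in as:", me["login"])

# Walk the user's top-level folders regardless of which storage backend
# (filesystem, S3, etc.) Girder has mounted behind them.
for folder in gc.listFolder(me["_id"], parentFolderType="user"):
    print("folder:", folder["name"])
```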
