Kitware, Inc. is a technology company headquartered in Clifton Park, New York. The company specializes in the research and development of open-source software in the fields of computer vision, medical imaging, visualization, 3D data publishing and technical software development. In addition to software development, the company offers other products and services such as books, technical support, consulting and customized training courses.
News Article | May 10, 2017
Kitware continued its transitions in team management and organizational structure with four promotions. “This year, our offices in New York, North Carolina, New Mexico and France have undergone significant growth, particularly in data and analytics,” said Lisa Avila, the president and CEO of Kitware. “We are happy to recognize the leadership of several team members as well as the contributions of our entire company.”

Kitware recognized the leadership of Jeffrey Baumes, whom the company promoted to director of data and analytics. Baumes joined Kitware in 2006, after he completed a doctorate in computer science. He has steered efforts such as XDATA and the Resonant software platform to fit industries that include defense, healthcare and energy. As director, Baumes will expand the software platforms and the technical strategy of the data and analytics team.

Stephen Aylward also started a new role as senior director of strategic initiatives. Aylward previously served as senior director of medical research and senior director of operations in North Carolina. In 2006, he coordinated the startup of the Kitware office in this location. He has helped it grow to over 40 team members and has guided several medical research efforts. In his new role, Aylward will plan and promote the trajectory of Kitware, fostering nascent technical developments and enriching synergies among Kitware software platforms and teams.

To further technical developments and synergies, Kitware named Andinet Enquobahrie director of medical computing. Enquobahrie has a doctorate in electrical and computer engineering as well as an MBA with a focus in technology evaluation and innovation. Since he joined Kitware in 2005, he has built and maintained relationships with collaborators, explored funding opportunities and led a team of research and development engineers to execute projects in image-guided intervention that influence fields from optometry to orthodontics.
As director, Enquobahrie will guide the medical computing team as they continue to create algorithms and design software for academic researchers and commercial customers with the Insight Segmentation and Registration Toolkit (ITK) and 3D Slicer.

Kitware also made Matt Turek a director. Turek graduated with his doctorate in computer science and began at Kitware in 2007. He has worked with Anthony Hoogs, senior director of computer vision, to manage the computer vision team; increase its membership to more than 30; and maintain relationships with technical institutes, government agencies and leaders in satellite imagery. As a result of his ability to grow important customer bases, Kitware named Turek assistant director of computer vision in 2013. He currently serves as a corporate relations chair for the 2017 Conference on Computer Vision and Pattern Recognition (CVPR). As director of computer vision, he will assume broader responsibility for the operation of the computer vision team.

Kitware will publish additional company news on its blog. For inquiries, please contact kitware(at)kitware(dot)com.

About Kitware
Kitware is an advanced technology, research and open-source solutions provider for research facilities, government institutions and corporations worldwide. Founded in 1998, Kitware specializes in research and development in the areas of HPC and visualization, medical imaging, computer vision, data and analytics and quality software process. Among its services, Kitware offers consulting and support for high-quality software solutions. Kitware is headquartered in Clifton Park, NY, with offices in Carrboro, NC; Santa Fe, NM; and Lyon, France. More information can be found on kitware.com.
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 3.94M | Year: 2014
The achievements of modern research and their rapid progress from theory to application are increasingly underpinned by computation. Computational approaches are often hailed as a new third pillar of science, in addition to empirical and theoretical work. While its breadth makes computation almost as ubiquitous as mathematics as a key tool in science and engineering, it is a much younger discipline and stands to benefit enormously from building increased capacity and increased efforts towards integration, standardization, and professionalism. The development of new ideas and techniques in computing is extremely rapid, the progress enabled by these breakthroughs is enormous, and their impact on society is substantial: modern technologies such as the Airbus A380, MRI scanners and smartphone CPUs could not have been developed without computer simulation; progress on major scientific questions from climate change to astronomy is driven by the results from computational models; and major investment decisions are underwritten by computational modelling. Furthermore, simulation modelling is emerging as a key tool within domains experiencing a data revolution, such as biomedicine and finance. This progress has been enabled through the rapid increase of computational power, which was based in the past on an increased rate at which computing instructions in the processor can be carried out. However, this clock rate cannot be increased much further, and in recent computational architectures (such as GPUs and the Intel Phi) additional computational power is now provided through having on the order of hundreds of computational cores in the same unit. This opens up the potential for new order-of-magnitude performance improvements but requires additional specialist training in parallel programming and computational methods to tap into and exploit this opportunity.
Computational advances are enabled by new hardware, and innovations in algorithms, numerical methods and simulation techniques, and application of best practice in scientific computational modelling. The most effective progress and highest impact can be obtained by combining, linking and simultaneously exploiting step changes in hardware, software, methods and skills. However, good computational science training is scarce, especially at post-graduate level. The Centre for Doctoral Training in Next Generation Computational Modelling will develop 55+ graduate students to address this skills gap. Trained as future leaders in Computational Modelling, they will form the core of a community of computational modellers crossing disciplinary boundaries, constantly working to transfer the latest computational advances to related fields. By tackling cutting-edge research from fields such as Computational Engineering, Advanced Materials, Autonomous Systems and Health, whilst communicating their advances and working together with a world-leading group of academic and industrial computational modellers, the students will be perfectly equipped to drive advanced computing over the coming decades.
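The many-core shift described above changes how even the simplest computation is written: the work must be decomposed across workers and the partial results combined at the end. The following Python sketch illustrates that generic data-parallel pattern; it is purely illustrative and not tied to any specific curriculum or hardware mentioned above.

```python
# Toy data-parallel reduction: decompose the data across workers,
# reduce each chunk independently, then combine the partial results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    return sum(chunk)

data = list(range(10_000))
chunks = [data[i::4] for i in range(4)]  # strided decomposition across 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, chunks))

total = sum(partials)  # cheap final combine step
```

The same decompose/reduce/combine structure underlies MPI and GPU codes; only the mechanics of distributing the chunks differ.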
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2012
Observations unequivocally show that the global climate is changing and the results are dramatic. With the accelerating pace of climate change, the impact of these changes will resonate through a broad range of fields including public health, infrastructure, water resources, and many others. These stakeholders need access to and resources for using climate data that are designed with the non-researcher in mind. Unfortunately, this is impeded by factors such as large data size, lack of adequate metadata, poor documentation, and lack of sufficient computational and visualization resources. The aim of this proposal is to develop ClimatePipes, a platform that will provide mechanisms to make this valuable data available to non-researchers including policy makers, health officials, agriculturalists, and industry leaders. The ClimatePipes cyberinfrastructure will provide state-of-the-art, user-friendly access, analysis, and visualization of data generated from high-resolution, long-term, climate change projections performed as part of the U.S. Global Change Research Program, the DOE-funded ESGF, NSF-funded DataONE, and NASA-funded EOSDIS. ClimatePipes is not a replacement for high-end tools for scientists, but instead provides simple, intuitive, and effective workflows that can be used by non-researchers and non-programmers. It builds on top of existing high-end tools, and is designed to easily leverage functionality added to those tools, thus ensuring state-of-the-art functionality in the years to come. ClimatePipes was successfully developed as a web-based tool that provides workflow and form-based interfaces for accessing, querying, and visualizing interesting datasets from one or more sources. This was integrated with ESGF and tools from UV-CDAT for the purpose of data manipulation and visualization. ClimatePipes demonstrated production of relevant data and visualizations in response to natural language queries. 
Phase II will focus on implementing mechanisms to support more elaborate and relevant queries, and on improving the usability, robustness and scalability of the system. In order to support these types of queries, the team will develop a semantic search tool using natural language processing techniques. The team will also add the capability to perform appropriate transformations to bring relevant data into a common reference system. Upon completion of Phase II, ClimatePipes will provide an interface for running computations in the cloud or on a user-provided cluster, to accommodate the compute power required. Further, the team will develop custom visualizations for climate applications, an API, and an appropriate user interface for uploading data to the server for analyses and integration.
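The kind of workflow described above can be pictured as a small pipeline: a query selects datasets from one or more sources, and a transformation brings them into a common reference system before visualization. The sketch below is hypothetical; the dataset registry and the `query` and `to_common_units` functions are invented names for illustration, not the actual ClimatePipes API.

```python
# Hypothetical ClimatePipes-style pipeline: keyword query over a dataset
# registry, then a unit transformation into a common reference system.
DATASETS = [
    {"id": "esgf/tas_2100", "variable": "temperature", "units": "K",
     "values": [293.15, 295.65]},
    {"id": "dataone/pr_2100", "variable": "precipitation", "units": "mm/day",
     "values": [2.4, 3.1]},
]

def query(keyword):
    """Simple keyword lookup standing in for semantic search."""
    return [d for d in DATASETS if keyword in d["variable"]]

def to_common_units(dataset):
    """Bring temperature data into a common reference (Kelvin -> Celsius)."""
    if dataset["units"] == "K":
        return {**dataset, "units": "degC",
                "values": [v - 273.15 for v in dataset["values"]]}
    return dataset

hits = [to_common_units(d) for d in query("temperature")]
```

A real deployment would replace the in-memory registry with federated queries against ESGF, DataONE or EOSDIS endpoints, but the query-transform-deliver shape stays the same.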
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 149.93K | Year: 2012
Cosmological simulations play a very important role in the DOE High Energy Physics Cosmic Frontier program. They are used to connect fundamental physics with observation, and are especially critical in the effort to understand Dark Energy. Current and upcoming cosmological observations that survey large areas of the sky produce very large datasets. For example, the Large Synoptic Survey Telescope (LSST) will produce up to 30 terabytes of data per night. Simulations carried out to interpret results from surveys of this type already produce terabytes of data per simulation, and this will rise to many petabytes within a decade. To have any hope of realizing, encapsulating, and interpreting the enormous wealth of information contained in such datasets, it is necessary to find very efficient ways to explore and analyze them, with the goal of eventually automating many such tasks. Critical challenges facing such simulations include workflow I/O and lack of domain-specific data analysis algorithms. Overcoming these challenges requires a revolutionary shift in the way cosmological predictions are obtained. Instead of the traditional workflow where simulation codes are run for days and the analysis is conducted as a post-processing step, on-line methods that enable scientists to analyze the data in tandem with the evolving simulation are required. To address this, the software infrastructure that we propose to create will provide the following functionality:
1. Data-reduction to minimize I/O
2. Robust and efficient halo-extraction methods
3. On-line/forward tracking of halos to capture halo formation dynamics
4. In-situ and co-visualization capabilities
The development of such an infrastructure will pave the way toward the analysis of exascale datasets that are expected within the coming decade.
The basic strategy to achieve this consists of (i) extending and optimizing existing halo-extraction techniques to improve robustness; (ii) developing an on-line halo-tracking method to be used in tandem with the simulation code at the time-step resolution, enabling insights unavailable using present approaches; (iii) minimizing the I/O bottleneck by reducing output to only halos, halo-formation history and a sub-sample of the remaining particles; and (iv) leveraging ParaView's in-situ and co-visualization capabilities. In addition to cosmological applications, the framework to be developed has a broader applicability to industries that use particle-based simulation techniques, including astrophysics, ballistics and defense, volcanology, oceanology, solid mechanics modeling, and various maritime applications. By addressing the wider issues associated with large-data simulations, this work will drive innovation in the computational sciences, be adaptable to many industries, and facilitate the transition from terascale work to peta- and eventually exascale computing.
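Halo extraction is commonly based on friends-of-friends grouping: particles closer than a linking length are assigned to the same halo. The brute-force Python sketch below illustrates that idea only; it is not the proposal's implementation, and production codes replace the O(n²) pairwise loop with spatial trees or grids.

```python
# Minimal friends-of-friends grouping via union-find: particles within the
# linking length of each other end up in the same halo.
def fof_halos(points, linking_length):
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            dist = sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5
            if dist <= linking_length:
                parent[find(i)] = find(j)  # merge the two groups

    halos = {}
    for i in range(n):
        halos.setdefault(find(i), []).append(i)
    return list(halos.values())

# Two clumps of 3D particles separated by a large gap:
pts = [(0, 0, 0), (0.1, 0, 0), (0.2, 0.1, 0), (5, 5, 5), (5.1, 5, 5)]
halos = fof_halos(pts, linking_length=0.3)  # two halos: sizes 3 and 2
```

Running this finder inside the time-step loop, rather than on dumped snapshots, is what turns it into the on-line tracking the proposal describes.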
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 1.72M | Year: 2013
In 2011, the US electricity generation was 4,344 billion kWh gross, with 821 TWh (19%) from nuclear power reactors. The US has 104 nuclear power reactors, 69 pressurized water reactors and 35 boiling water reactors, in 31 states, operated by 30 different power companies. Almost all of the US nuclear generating capacity comes from reactors built between 1967 and 1990. There have been no new construction starts since 1977, largely because for a number of years gas generation was considered more economically attractive and because construction schedules were frequently extended by opposition, compounded by heightened safety fears following the Three Mile Island accident in 1979. A future pressurized water reactor, Watts Bar 2, is expected to start up in 2013 following the Tennessee Valley Authority's decision in 2007 to complete the construction of the unit. Despite a near halt in new construction of more than 30 years, US reliance on nuclear power has continued to grow due to remarkable gains in power plant utilization through improved refueling, maintenance, and safety systems at existing plants. Advanced modeling and simulation of nuclear power reactors is critical to the design of future systems and the continued operation of the existing US plants. The proposal describes an approach to simplify and democratize advanced modeling and simulation in the nuclear energy industry by developing an open-source integrated design-analysis environment (IDAE) to work on a range of nuclear engineering applications. It will leverage millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The proposed design-analysis environment will leverage existing open-source toolkits, creating a graphical end-to-end umbrella guiding end-users and developers through the nuclear energy advanced modeling and simulation lifecycle.
The proposed framework will deliver strategic advancements in meshing, data exchange, and visualization for ensembles, uncertainty quantification and analysis. The proposed environment will address all three of the identified challenges to the nuclear energy industry's use of advanced modeling and simulation. Through the use of this unique design-analysis environment, an ecosystem of synergistic activities will develop, fueling a nuclear energy renaissance. It also lays the groundwork for commercialization through customization, technology integration, and advanced research and development services. These services will be tailored to nuclear energy companies, large-scale manufacturing and engineering firms, software companies and high-performance computing infrastructure providers. By advancing the state of the art in advanced modeling and simulation, nuclear energy companies' innovation will be accelerated, and ever more realistic advanced modeling and simulation will be used to virtually design, analyze and test tomorrow's nuclear power systems.
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1.00M | Year: 2013
The manufacturing and engineering industry needs more cost-effective and easy-to-use high-performance computing (HPC) modeling and simulation tools to design, test, and analyze innovative components and products. For 92% of the small- to medium-sized manufacturers with fewer than 100 employees, an HPC system is most likely not part of their development environment. This class of manufacturers typically relies on two-dimensional computer-aided design software and a limited amount of desktop modeling and simulation software. This under-utilization of HPC tools by this class of manufacturers is of chief concern to the Department of Energy. The proposed simulation framework and suite of tools will be developed with one eye on eliminating the barriers to adoption of HPC modeling and simulation tools for manufacturing and engineering, and the other eye on increasing the efficiency and effectiveness of manufacturing and engineering practices in solving real-world problems. By using these tools, manufacturers will be able to improve their design cycles and reduce their need to develop expensive physical prototypes. In Phase I, a framework referred to as the Simulation Model Based Architecture was developed, capable of supporting the entire lifecycle of a simulation. The approach used was to leverage existing open-source toolkits wherever possible so that the resulting system could also be distributed as open-source. The system provided several mechanisms by which HPC systems can be leveraged. In Phase II, the major focus will be improving the ease of use of the system and extending its core functionality. To address ease of use, the customization capabilities will be enhanced and workflow wizards will be developed to guide the user through their specific simulation workflow. In terms of core functionality, the modeling and meshing capabilities will be enhanced and the mechanisms for data exchange with various simulation environments will be improved.
In addition, more complex workflows such as design optimization will be supported. Modular frameworks such as the one proposed here lend themselves to a service business model; vendors with the skills to provide technology integration services can partner with their customers and collaborators to build valuable, competitive products and services. Such an open, modular framework provides significant business opportunities. Beyond the manufacturing community, the technology developed here will also benefit the larger scientific computing community, since the code will be released under open-source licenses (non-reciprocal Apache or BSD) within open-source communities. Thus researchers, educators, and commercial enterprises will be able to tune the suite to their particular workflow, easily research specialized areas of interest, and leverage international communities to develop leading-edge technology.
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2013
Numerical simulations play a crucial role in the DOE High Energy Physics (HEP) Cosmic Frontier program. They provide a theoretical framework with which to compare observational data from surveys. Through such comparisons, significant breakthroughs are made. These breakthroughs can guide scientists in calibrating the underlying cosmological model, to produce simulations that match the observed sky, which enhances our overall understanding of the universe. Science requirements for such surveys demand simulations of extreme scales. The ability to perform these simulations, however, is limited by the scale at which we can store, manage, and process the resulting data. Chief among the pressing challenges are: (i) the explosive growth of data, (ii) the lack of domain-specific analysis algorithms, and (iii) the disconnect between simulation (theory) and observation (experiment). Large-scale simulations are also used in manufacturing to reduce design costs. Therefore, these challenges are reflected across multiple scientific domains and not just in cosmology. The goal of the work proposed herein is the development of an open-source, in situ analysis framework targeting large-scale simulations. The library will address the explosive growth of data by employing in situ analysis. Science-aware algorithms will be implemented to address the lack of domain-specific solutions. Lastly, to bridge the gap between simulation and experiments, the library will also be integrated with a data management system that leverages web-based mining and visualization technologies to simplify scientific workflows. In Phase I, CosmologyTools was successfully developed as a lightweight in situ analysis framework and it was coupled with HACC (Hybrid/Hardware Accelerated Cosmology Code), one of the core DOE gravity-only N-Body codes. The Phase I effort has demonstrated the feasibility of our technical approach.
By employing in situ analysis, the data is reduced to the features of interest, I/O overheads and associated costs are minimized, and the scientific workflow is simplified. Lastly, the Phase I effort clearly demonstrated the need for data management to enable direct comparisons of simulation and observations. The Phase II project will focus on the development of a full-featured, production-level, in situ analysis library with a number of advanced features including novel feature identification, extraction, and tracking. Further, the library will be integrated with data management to enable direct comparison of simulation results and data from experiments. The team will also develop web-based technologies for data mining and visualization to simplify scientific workflows.

Commercial Applications and Other Benefits: Upon completion of Phase II, the resulting infrastructure will have a number of key benefits and broader impacts in science and manufacturing in general. Specifically, it will (i) enable analysis of massive scientific datasets, (ii) enable scientists to glean insights not available before by embedding analysis within the simulation, (iii) enable direct comparison of simulation data and experiments, and (iv) simplify data management and scientific workflows.
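The in situ pattern described above can be pictured as analysis callbacks registered with the simulation's time-step loop, so that only reduced quantities, rather than full snapshots, are ever stored. The `InSituPipeline` class below is a hypothetical illustration of that structure, not the CosmologyTools API.

```python
# Sketch of the in situ pattern: analyses run inside the time-step loop,
# reducing the live state so full snapshots never need to be written out.
class InSituPipeline:
    def __init__(self):
        self.analyses = []
        self.results = []

    def register(self, fn, every=1):
        """Attach an analysis to be run every `every` time steps."""
        self.analyses.append((fn, every))

    def step(self, t, state):
        for fn, every in self.analyses:
            if t % every == 0:
                self.results.append((t, fn.__name__, fn(state)))

def max_density(state):
    return max(state)  # a cheap per-step reduction

pipeline = InSituPipeline()
pipeline.register(max_density, every=2)

state = [1.0, 2.0, 3.0]
for t in range(4):                    # stand-in for the simulation's time loop
    state = [v * 1.1 for v in state]  # toy "evolution" of the state
    pipeline.step(t, state)
```

Only `pipeline.results`, a handful of numbers, survives the run; the full `state` arrays are discarded each step, which is exactly the I/O reduction the abstract argues for.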
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2015
Three-dimensional characterization of materials at the nano- and meso-scale has become possible with transmission and scanning transmission electron microscopes. Its importance has extended to a wide class of nanomaterials, such as hydrogen fuel cells, solar cells, industrial catalysts, new battery materials and semiconductor devices, across high-tech industry, universities, and national labs. While capable instrumentation is abundant, this rapidly expanding demand for high-resolution tomography is bottlenecked by software that is instead tailored for lower-dose, biological applications and not optimized for higher-resolution materials applications. To address this problem, this proposal will deliver a fully functional, freely-distributable, open-source scanning transmission electron microscopy tomography package with a modern user interface that enables automated acquisition, alignment, and reconstruction of raw tomography data, and provides advanced segmentation, three-dimensional visualization and analysis optimized for materials applications. It will establish an extendable framework capable of full automation for high-throughput electron tomography of materials from data acquisition to visualization. Phase I combined empirically-tested tomography strategies with professional software interfaces to develop a clean and integrated application for 3D visualization and reconstruction. The cross-platform application can read transmission electron microscope image data, provides graphical tools for user-assisted alignment of data, performs basic tomographic reconstruction, and visualizes the resulting 3D volume. Within the application, an interface enables integration of advanced reconstruction and data processing routines into the workflow, providing the foundation for Phase II capabilities.
Phase II will implement advanced automated alignment and reconstruction routines, along with enhanced graphical tools to edit, align, segment, visualize and analyze the reconstructed data. Support for automated data acquisition, with manual override capability, will be added to offer a turn-key fully automated workflow. Support for saving the full application workflow will offer reproducible, auditable, advanced data collection, reconstruction and analysis capabilities, supporting the call for reproducible science with data publication. With around 600 transmission electron microscopes worldwide and approximately 50 coming online each year, the demand for and impact of an open-source tomography tool are large. Significant opportunities exist in high-tech industry, universities, and national labs to enable or enhance three-dimensional imaging at the nanoscale and bring automated high-throughput approaches that will accelerate progress in materials characterization and metrology. The project will support a service-based business model by enabling lab-specific acquisition and processing customization and integration; this support and development will be provided into Phase III.
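One standard family of reconstruction routines of the kind discussed above is algebraic reconstruction (ART, also known as the Kaczmarz method), which treats each projection measurement as a linear equation over the unknown voxel values and iterates over the equations row by row. The toy Python sketch below reconstructs a 2x2 "volume" from row, column and diagonal sums; it is illustrative only, and real electron tomography reconstructions operate on far larger, noisier systems with regularization.

```python
# Toy ART/Kaczmarz reconstruction: project the current estimate onto the
# hyperplane of each measurement equation in turn, and repeat.
def art_reconstruct(rays, measurements, n_voxels, n_sweeps=50, relax=1.0):
    x = [0.0] * n_voxels
    for _ in range(n_sweeps):
        for ray, b in zip(rays, measurements):
            dot = sum(r * xi for r, xi in zip(ray, x))
            norm = sum(r * r for r in ray)
            if norm == 0:
                continue
            c = relax * (b - dot) / norm
            x = [xi + c * r for xi, r in zip(x, ray)]
    return x

# 2x2 "volume" [v0, v1, v2, v3]; each ray is a 0/1 row of the system matrix:
rays = [
    [1, 1, 0, 0],  # top row sum
    [0, 0, 1, 1],  # bottom row sum
    [1, 0, 1, 0],  # left column sum
    [0, 1, 0, 1],  # right column sum
    [1, 0, 0, 1],  # main diagonal sum
]
true = [1.0, 2.0, 3.0, 4.0]
meas = [sum(r * t for r, t in zip(ray, true)) for ray in rays]
recon = art_reconstruct(rays, meas, n_voxels=4)
```

The diagonal ray is what makes this tiny system uniquely solvable; with only row and column sums the problem is underdetermined, mirroring the missing-wedge issues familiar from electron tomography.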
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2015
According to the World Nuclear Association, the U.S. is the largest producer of nuclear power worldwide. In fact, the U.S. provides over 30 percent of the world's nuclear generation of electricity. Although few new reactors have been built in the past 30 years, the association anticipates that four to six reactors may be built by 2020. The first of the reactors will be built in response to 16 license applications, which have been completed since 2007, to build 24 additional nuclear reactors. Currently, both government and industry are working to accelerate approval for the construction and design of new nuclear energy plants. The U.S. nuclear industry has already achieved significant developments in the implementation of nuclear power plants due to advancements in refueling, maintenance, and safety systems at current power plants. Meanwhile, changes in government policy, which have occurred since the late 1990s, have provided for noteworthy expansion in nuclear capacity. For example, the Energy Policy Act of 2005 spurred investment in electricity infrastructure such as nuclear power. Today, the importance of nuclear power in the U.S. is a geopolitical matter as much as it is an economic one. This is due to the notion that increasing the use of nuclear power reduces the country's reliance on imported oil and gas. In order to design future systems and continue operational improvements at existing U.S. plants, advanced modeling and simulation of nuclear power reactors is crucial. In this proposal, we will develop a full-featured open-source Cloud/Web-based simulation environment for nuclear energy advanced modeling and simulation. Our approach simplifies the workflow, reduces the need for in-house computational science and engineering experts, and lowers the capital investments required for advanced modeling and simulation. The Cloud/Web-based simulation environment guides end-users and developers through the advanced modeling and simulation lifecycle.
In addition to providing significantly improved, intuitive software, the environment offers reproducible workflows where the full path of data from input to final analyzed results can be saved, shared, and even published as supporting information. The work proposed here addresses current deficiencies in utilizing advanced modeling and simulation by building a flexible, open-source Cloud/Web-based simulation environment, or framework, that stresses interoperability and ease of use. In particular, we are targeting nuclear energy, engineering, and manufacturing firms that can clearly benefit from this technology, given that we will make these tools easier to use, reduce the need for in-house expertise, and reduce the overall capital costs of using advanced modeling and simulation.
Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2016
A researcher dealing with big data today is met with a maze of languages, programming environments, data storage and query systems, and compute engines. Pursuing a new path in this space may take years and millions of dollars of investment, only to discover that a new and more applicable big data paradigm has emerged. Costs include learning programming languages, storage systems, and computing paradigms, as well as significant hardware and administrative costs of setting up and maintaining the needed environments for data storage, transfer, and computation.

How this problem is being addressed: GoBig unifies and simplifies big data tools in two important areas: a unified user interface to big data software and hardware stacks, and streamlined deployment and modularity across various types of cloud and HPC systems. Data is managed through the extensible Girder data framework, an open-source project started at Kitware which provides a unified interface to many distributed storage systems along with access control and extensible plugins. Romanesco manages analyses and workflows that span programming language boundaries. The results are then persisted in Girder to be made available for further analysis or visualization. Instead of managing and supporting multiple user endpoints to various big data toolchains, user management and authorization for multiple systems may be managed by GoBig’s account credentials.

What is to be done in Phase I: To demonstrate the feasibility of the GoBig system in Phase I, we will show system modularity by extending computation support in GoBig to Hadoop, HPC clusters running MPI, a queueing system, and a distributed data system. We will also add Julia, Java, and Scala to the analytic programming languages supported in GoBig, and demonstrate the applicability of GoBig to a computational science domain.
Our Phase I work will also demonstrate ease of deployment, including provisioning of arbitrary systems and easy installation on cloud services such as OpenStack and Amazon Web Services (AWS). This will all be performed utilizing Kitware’s proven practices for agile, durable, and sustainable software.

Commercial applications and other benefits: Because GoBig is open-source and extensible, the community that will grow around the aforementioned tools will foster agility and innovation while reducing maintenance cost over time. The development model used for open-source projects has also been proven to scale to thousands of developers while maintaining a high standard for quality. We will encourage the participation of developers who can add abstractions for more data storage and processing systems. GoBig’s flexibility and ease of use will ultimately impact a broad range of data analysts who require a low barrier of entry to distributed compute services, including government, academia, and the business community.
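The cross-language analysis dispatch described above can be pictured as tasks expressed as data: a language tag, a script, and typed input/output ports that a generic runner interprets. The spec format and the `run_task` function below are hypothetical illustrations in the spirit of Romanesco, not its actual API.

```python
# Hypothetical task spec: the analysis is data (language + script + ports),
# so a single runner can dispatch work written in many languages.
task = {
    "mode": "python",
    "script": "out = sum(values) / len(values)",
    "inputs": {"values": {"type": "list"}},
    "outputs": {"out": {"type": "number"}},
}

def run_task(task, bindings):
    """Execute a task spec with the given input bindings (python mode only)."""
    if task["mode"] != "python":
        raise NotImplementedError("only the python mode is sketched here")
    scope = dict(bindings)                # inputs visible to the user script
    exec(task["script"], {}, scope)       # run the script in that scope
    return {name: scope[name] for name in task["outputs"]}

result = run_task(task, {"values": [2, 4, 6]})
```

Adding a language then means adding a `mode` handler rather than rewriting workflows, which is the modularity argument the abstract makes for GoBig.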