Clifton Park, NY, United States

Kitware, Inc. is a technology company headquartered in Clifton Park, New York. The company specializes in the research and development of open-source software in the fields of computer vision, medical imaging, visualization, 3D data publishing, and technical software development. In addition to software development, the company offers products and services such as books, technical support, consulting, and customized training courses. (Wikipedia)

Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 3.94M | Year: 2014

The achievements of modern research and their rapid progress from theory to application are increasingly underpinned by computation. Computational approaches are often hailed as a third pillar of science, alongside empirical and theoretical work. While its breadth makes computation almost as ubiquitous as mathematics as a key tool in science and engineering, it is a much younger discipline and stands to benefit enormously from increased capacity and from efforts towards integration, standardization, and professionalism. The development of new ideas and techniques in computing is extremely rapid, the progress these breakthroughs enable is enormous, and their impact on society is substantial: technologies such as the Airbus A380, MRI scanners, and smartphone CPUs could not have been developed without computer simulation; progress on major scientific questions from climate change to astronomy is driven by the results of computational models; and major investment decisions are underwritten by computational modelling. Furthermore, simulation modelling is emerging as a key tool in domains experiencing a data revolution, such as biomedicine and finance. This progress has been enabled by the rapid increase of computational power, which in the past came from raising the rate at which a processor carries out instructions. However, this clock rate cannot be increased much further, and recent computational architectures (such as GPUs and the Intel Phi) instead provide additional power through on the order of hundreds of computational cores in the same unit. This opens up the potential for further order-of-magnitude performance improvements, but tapping into it requires specialist training in parallel programming and computational methods.
Computational advances are enabled by new hardware; by innovations in algorithms, numerical methods, and simulation techniques; and by the application of best practice in scientific computational modelling. The most effective progress and highest impact come from combining, linking, and simultaneously exploiting step changes in hardware, software, methods, and skills. However, good computational science training is scarce, especially at the postgraduate level. The Centre for Doctoral Training in Next Generation Computational Modelling will train more than 55 graduate students to address this skills gap. Trained as future leaders in computational modelling, they will form the core of a community of computational modellers crossing disciplinary boundaries, constantly working to transfer the latest computational advances to related fields. By tackling cutting-edge research in fields such as Computational Engineering, Advanced Materials, Autonomous Systems, and Health, while communicating their advances and working together with a world-leading group of academic and industrial computational modellers, the students will be well equipped to drive advanced computing over the coming decades.
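The many-core shift the abstract describes can be illustrated with the basic pattern such training teaches: decompose a computation into independent chunks, run them on separate workers, and combine the partial results. The sketch below uses Python threads purely to show the decomposition; real many-core codes on GPUs or the Intel Phi would use CUDA, OpenMP, or MPI, and none of the names here come from the grant itself.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum_of_squares(chunk):
    """The independent unit of work one core would execute."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    """Split `data` into roughly equal chunks, process them concurrently,
    and reduce the partial results. (CPython threads only illustrate the
    decomposition; process- or GPU-based workers give real parallelism.)"""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))
```

The map-then-reduce structure is what scales: each chunk carries no shared state, so the same decomposition runs unchanged whether the workers are threads, processes, or accelerator cores.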

Krishnan K.,Kitware Inc.
Optics Express | Year: 2010

An open source lesion sizing toolkit has been developed with a general architecture for implementing lesion segmentation algorithms and a reference algorithm for segmenting solid and part-solid lesions from lung CT scans. The CT lung lesion segmentation algorithm detects four three-dimensional features corresponding to the lung wall, vasculature, lesion boundary edges, and low density background lung parenchyma. These features form boundaries and propagation zones that guide the evolution of a subsequent level set algorithm. User input is used to determine an initial seed point for the level set and users may also define a region of interest around the lesion. The methods are validated against 18 nodules using CT scans of an anthropomorphic thorax phantom simulating lung anatomy. The scans were acquired under differing scanner parameters to characterize algorithm behavior under varying acquisition protocols. We also validated repeatability using six clinical cases in which the patient was rescanned on the same day (zero volume change). The source code, data sets, and a running application are all provided under an unrestrictive license to encourage reproducibility and foster scientific exchange.
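The pipeline described above — detected features acting as hard boundaries and propagation zones that steer a front growing out of a user-placed seed — can be caricatured without any level-set machinery. The sketch below substitutes simple 4-connected region growing for the toolkit's actual level-set evolution; the grid, thresholds, and function names are illustrative, not part of the toolkit.

```python
from collections import deque

def grow_from_seed(image, seed, low, high, boundary):
    """Toy stand-in for the level-set step: flood-fill outward from `seed`
    into 4-connected pixels whose intensity lies in [low, high], refusing
    to cross pixels marked True in `boundary` (standing in for detected
    features such as the lung wall or vasculature)."""
    rows, cols = len(image), len(image[0])
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and not boundary[nr][nc]
                    and low <= image[nr][nc] <= high):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region
```

The key idea carried over from the paper is that the propagation is seeded by the user but bounded by image-derived features, so the front stops at the lung wall rather than leaking into attached structures.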

Sun Z.,General Electric | Hoogs A.,Kitware Inc.
International Journal of Computer Vision | Year: 2010

In this paper, we study (normalized) disjoint information as a metric for image comparison and its applications to perceptual image quality assessment, image registration, and video tracking. Disjoint information is the joint entropy of random variables excluding their mutual information. This measure of statistical dependence and information redundancy satisfies more rigorous metric conditions than mutual information, including self-similarity, minimality, symmetry, and the triangle inequality. It is applicable to two or more random variables and can be computed by vector histogramming, vector Parzen-window density approximation, or an upper-bound approximation involving fewer variables. We show that this theoretical advantage has implications in practice. In the domain of digital images and video, multiple visual features are extracted and (normalized) compound disjoint information is derived from a set of marginal densities of the image distributions, thus enriching the vocabulary of content representation. The proposed metric matching functions are applied to several applications to demonstrate their efficacy. © 2010 Springer Science+Business Media, LLC.
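Disjoint information as defined in the abstract is directly computable from a joint histogram: D(X, Y) = H(X, Y) − I(X; Y) = 2·H(X, Y) − H(X) − H(Y). A minimal sketch of that computation for discrete paired samples (function names are ours, not the paper's):

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy in bits from a histogram of counts."""
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

def disjoint_information(pairs):
    """D(X, Y) = H(X, Y) - I(X; Y) = 2*H(X, Y) - H(X) - H(Y),
    estimated from a list of (x, y) samples by histogramming."""
    hx = entropy(Counter(x for x, _ in pairs).values())
    hy = entropy(Counter(y for _, y in pairs).values())
    hxy = entropy(Counter(pairs).values())
    return 2 * hxy - hx - hy

def normalized_disjoint_information(pairs):
    """D / H(X, Y), which lies in [0, 1]."""
    hxy = entropy(Counter(pairs).values())
    return disjoint_information(pairs) / hxy if hxy else 0.0
```

Two of the metric properties the abstract lists fall out immediately: when X = Y the joint entropy equals H(X) and D collapses to zero (self-similarity), while for independent variables I = 0 and D reaches its maximum H(X) + H(Y), so the normalized value reaches 1.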

Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase I | Award Amount: 150.00K | Year: 2014

Three-dimensional characterization of materials at the nano- and meso-scale has become possible with transmission and scanning transmission electron microscopes. Its importance extends to a wide class of nanomaterials such as hydrogen fuel cells, solar cells, industrial catalysts, new battery materials, and semiconductor devices, spanning high-tech industry, universities, and national labs. While capable instrumentation is abundant, this rapidly expanding demand for high-resolution tomography is bottlenecked by software that is tailored for lower-dose biological applications and not optimized for higher-resolution materials applications. To address this problem, Phase I of this proposal will deliver a fully functional, freely distributable, open-source scanning transmission electron microscopy tomography package with a modern user interface that enables alignment and reconstruction of raw tomography data and provides advanced three-dimensional visualization and analysis optimized for materials applications. It will establish an extendable framework that positions the Phase II project to focus on full automation of high-throughput electron tomography of materials, from data acquisition to visualization. The Phase I approach will combine empirically tested tomography strategies with professional software interfaces to develop a clean, fully integrated application for 3D visualization and reconstruction. This user-friendly, cross-platform, graphical desktop application will read transmission electron microscope image data, provide graphical tools for user-assisted alignment of images collected on the instrument, and support sophisticated analysis of the resulting volumetric reconstructed data. Within the application, an interface will allow user-developed reconstruction algorithms to be fully integrated into the workflow.
In addition to providing significantly enhanced, intuitive software, the application will enable reproducible workflows in which the full path of data, from collection to final analyzed results, can be saved and shared. With around 600 transmission electron microscopes worldwide and approximately 50 more coming online each year, the demand for and impact of an open-source tomography tool are large. Significant opportunities exist in high-tech industry, universities, and national labs to enable or enhance three-dimensional imaging at the nanoscale and to bring automated, high-throughput approaches that will accelerate progress in materials characterization and metrology. The project will support a service-based business model through consulting, technology integration, customization, support, and development services that will carry into Phase III.
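The extension point described above — user-developed reconstruction algorithms integrated into the workflow — commonly takes the shape of a named registry that the application dispatches through. A hypothetical sketch of that pattern (none of these names or signatures come from the proposal):

```python
# Registry mapping algorithm names to user-supplied reconstruction
# routines; the workflow looks algorithms up here by name.
RECONSTRUCTORS = {}

def reconstructor(name):
    """Decorator that registers a user-developed reconstruction routine."""
    def register(func):
        RECONSTRUCTORS[name] = func
        return func
    return register

@reconstructor("mean_backprojection")
def mean_backprojection(tilt_series):
    """Toy placeholder algorithm: average the aligned projections
    pixel-wise (a real implementation would backproject along the
    tilt geometry)."""
    n = len(tilt_series)
    rows, cols = len(tilt_series[0]), len(tilt_series[0][0])
    return [[sum(img[r][c] for img in tilt_series) / n
             for c in range(cols)] for r in range(rows)]

def reconstruct(name, tilt_series):
    """Workflow entry point: dispatch to whichever algorithm was chosen."""
    return RECONSTRUCTORS[name](tilt_series)
```

Because the workflow only ever calls `reconstruct(name, ...)`, a user can drop in a new algorithm without touching alignment, visualization, or any other stage.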

Agency: Department of Energy | Branch: | Program: SBIR | Phase: Phase II | Award Amount: 1000.00K | Year: 2015

According to the World Nuclear Association, the U.S. is the largest producer of nuclear power worldwide, providing over 30 percent of the world's nuclear generation of electricity. Although few new reactors have been built in the past 30 years, the association anticipates that four to six reactors may be built by 2020. The first of these will be built in response to 16 license applications, completed since 2007, to build 24 additional nuclear reactors. Currently, both government and industry are working to accelerate approval for the construction and design of new nuclear energy plants. The U.S. nuclear industry has already achieved significant developments in the implementation of nuclear power plants thanks to advancements in refueling, maintenance, and safety systems at current power plants. Meanwhile, changes in government policy since the late 1990s have provided for noteworthy expansion in nuclear capacity. For example, the Energy Policy Act of 2005 spurred investment in electricity infrastructure such as nuclear power. Today, the importance of nuclear power in the U.S. is as much a geopolitical matter as an economic one, since increasing the use of nuclear power reduces the U.S.'s reliance on imported oil and gas. To design future systems and continue operational improvements at existing U.S. plants, advanced modeling and simulation of nuclear power reactors is crucial. In this proposal, we will develop a full-featured, open-source, Cloud/Web-based simulation environment for nuclear energy advanced modeling and simulation. Our approach simplifies the workflow, alleviates the need for in-house computational science and engineering experts, and lowers the capital investment required for advanced modeling and simulation. The Cloud/Web-based simulation environment guides end-users and developers through the advanced modeling and simulation lifecycle.
In addition to providing significantly improved, intuitive software, the environment offers reproducible workflows in which the full path of data, from input to final analyzed results, can be saved, shared, and even published as supporting information. The work proposed here addresses current deficiencies in utilizing advanced modeling and simulation by building a flexible, open-source, Cloud/Web-based simulation environment, or framework, that stresses interoperability and ease of use. In particular, we are targeting nuclear energy, engineering, and manufacturing firms that can clearly benefit from this technology, given that we will make these tools easier to use, reduce the need for in-house expertise, and reduce the overall capital costs of using advanced modeling and simulation.
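One plausible shape for the reproducible workflow records described above is a self-verifying bundle of inputs, parameters, and results, fingerprinted so that collaborators can confirm they are looking at the same run. The field names and layout below are hypothetical, not the proposed environment's actual format.

```python
import hashlib
import json

def workflow_record(inputs, parameters, results):
    """Serialize a run to canonical JSON and attach a content hash."""
    record = {"inputs": inputs, "parameters": parameters, "results": results}
    payload = json.dumps(record, sort_keys=True)
    record["sha256"] = hashlib.sha256(payload.encode()).hexdigest()
    return record

def verify(record):
    """Recompute the hash over everything except the stored fingerprint;
    any change to inputs, parameters, or results breaks the match."""
    body = {k: v for k, v in record.items() if k != "sha256"}
    payload = json.dumps(body, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest() == record["sha256"]
```

Canonical JSON (`sort_keys=True`) makes the fingerprint independent of dictionary ordering, so the same run always produces the same hash regardless of how the record was assembled.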
