Fort Meade, MD, United States


News Article | April 19, 2017
Site: www.fastcompany.com

It seems like everything is tracked and measured these days. Order a pizza, and you get a notification when it's put in the oven. Curious about how many times an NBA player (any player) dribbles per possession? That information is neatly presented on the league's official site. Even dreams are being produced, collected, and analyzed as part of our "quantified self" data. One startup is working on tracking something more ambitious: the planet.

Instead of measuring basic heart rate or blood pressure, Descartes Labs is applying machine learning to both public and private satellite imagery to determine rates of deforestation, forecast food supplies, identify where new wind farms are being constructed, and more. The company, which spun out of Los Alamos National Lab, has access to a massive archive of satellite imagery sourced from NASA, the European Space Agency (ESA), and other "commercial constellations." The archive goes back decades and grows larger every day; it currently houses five petabytes of data (that's 5 million gigabytes).

"It's really an inherently good data set. It's hard to imagine that the data we're generating, like deforestation data, can be used for some sort of nefarious purpose, right? At some point, everybody ought to know this data, because it's just data about the world we live in," CEO and cofounder Mark Johnson tells Fast Company.

Every day, Descartes Labs' AI reads and processes nearly five terabytes of new data, including weather readings and the latest imagery from satellites orbiting the planet. Analyzing quadrillions of pixels at a time and comparing them to past data, its fully automated algorithms can determine, for example, whether a field is growing corn, soy, or something else like turnips, as well as how much of it has already sprouted. Infrared readings allow the AI to determine the health of a given crop, too.

Johnson says this allows his team to accurately peer into the future of the planet. For example, Descartes Labs says its AI can predict the yield of America's 3 million square kilometers of cornfields with 99% accuracy. "Investors always ask, 'What's the secret sauce on your corn model?'" says Johnson. "And I always tell them, 'You won't like this, but there's no secret algorithm.' It's really that we've taken more data than anyone else, cleaned it up better than anyone else, and run more iterations on it than anyone else."

That tool has obvious applications outside U.S. borders, too, for both governments and private companies. With the aid of a just-awarded $1.5 million grant from the U.S. Defense Advanced Research Projects Agency (DARPA), Descartes Labs is now using its technology to anticipate food shortages and predict hot zones of sociopolitical conflict in the Middle East and North Africa.

"This is the kind of work we wanted to do as a company as we founded it," says Johnson. "We have 40-plus years of imagery on the planet. We can start to see, even without weather effects, how the climate has changed based on what's growing there and what's not. Drought and famine precede and oftentimes are big drivers in political instability. Better understanding those patterns is key."

Better understanding those patterns right now is critical. As many as 20 million people around the world are already on the brink of famine, and we'll have to feed as many as 2 billion more people over the next three decades. "It's much cheaper to send in humanitarian resources than troops. And nipping [causes of conflict] in the bud is not only good for the people on the ground, because they're happier and healthier, and child mortality rates go down, but also you avoid future problems."

At the same time, Descartes Labs is trying to democratize its data, putting its tools in the hands of both humanitarian organizations that can intervene early enough to save lives and leaders at every level of government who can make better decisions about how to allocate resources. But Johnson made it clear that he wants this information to be accessible to everyone, not just people with PhDs in machine learning or elected representatives, so his 30-person team is investing in artificial intelligence that can better classify and categorize new satellite imagery as it comes in and make it easier to read.

"What we're focusing on is making it easy for people from the 'outside world' to use the infrastructure we've built," he says. "ESA and NASA are both putting up lots of really interesting Earth observation satellites; tons of data is being generated. And that's not to mention all the potential sensor data we'll be getting from combines, tractors, cars, boats, barges, trains, ships, grain silos. Everything is going to have sensors on it, so making sense of all that data is the sort of challenge we're aiming toward."

Descartes Labs isn't trying to tackle this challenge alone. As part of its push to "open up the platform," the team joined a hackathon with the National Geospatial-Intelligence Agency (NGA), where it made the platform available to participants to explore how geospatial analysis can be used to study food security.

There are 150 million square kilometers of dry land on Earth; more than twice that area is covered by water. Greater awareness of how both land and sea are managed, Johnson hopes, will foster a more symbiotic relationship, even intimacy, between humans and the planet. In practice, the work of Descartes Labs, and that of other companies like marine data analytics company Windward, might encourage businesses to restructure their value chains and even guide our global village toward new approaches to climate action.

"Why the hell don't we know exactly how many trees have been cut down over the past 40 years? This is something where we have the data to answer that question," says Johnson. "To me, this is critical for our future on the planet, right? Decisions we make now could have massive repercussions for generations to come. And I want to be armed with a massive amount of data. I want to know where we should marshal our resources to be most effective in protecting the resiliency of humanity."
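The article does not say how Descartes Labs turns infrared readings into a crop-health signal, but a standard illustration of the idea is a vegetation index such as NDVI, computed per pixel from the red and near-infrared bands. The sketch below is a minimal, hypothetical Python example (the band values are invented and the function is not Descartes Labs code):

import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    # Values near +1 suggest dense, healthy vegetation; values near zero
    # or below suggest bare soil, water, or stressed crops.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / np.maximum(nir + red, 1e-9)  # guard against division by zero

# Hypothetical 2x2 reflectance tiles for a single field (made-up values)
nir_band = [[0.55, 0.60], [0.20, 0.58]]
red_band = [[0.08, 0.07], [0.18, 0.09]]
print(ndvi(nir_band, red_band))  # higher values indicate a healthier canopy

Per-pixel indices like this are one common ingredient in crop-health and yield models; the article does not specify which features Descartes Labs actually uses.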


Pavlis N.K., National Geospatial-Intelligence Agency (NGA) | Holmes S.A., SGT Inc. | Kenyon S.C., National Geospatial-Intelligence Agency (NGA) | Factor J.K., National Geospatial-Intelligence Agency (NGA)
Journal of Geophysical Research: Solid Earth | Year: 2012

EGM2008 is a spherical harmonic model of the Earth's gravitational potential, developed by a least squares combination of the ITG-GRACE03S gravitational model and its associated error covariance matrix, with the gravitational information obtained from a global set of area-mean free-air gravity anomalies defined on a 5 arc-minute equiangular grid. This grid was formed by merging terrestrial, altimetry-derived, and airborne gravity data. Over areas where only lower resolution gravity data were available, their spectral content was supplemented with gravitational information implied by the topography. EGM2008 is complete to degree and order 2159 and contains additional coefficients up to degree 2190 and order 2159. Over areas covered with high-quality gravity data, the discrepancies between EGM2008 geoid undulations and independent GPS/leveling values are on the order of 5 to 10 cm. EGM2008 vertical deflections over the USA and Australia are within 1.1 to 1.3 arc-seconds of independent astrogeodetic values. These results indicate that EGM2008 performs comparably with contemporary detailed regional geoid models. In orbit computations, EGM2008 performs as well as other GRACE-based gravitational models. Compared with EGM96, EGM2008 represents a sixfold improvement in resolution and a three- to sixfold improvement in accuracy, depending on gravitational quantity and geographic area. EGM2008 represents a milestone and a new paradigm in global gravity field modeling by demonstrating, for the first time, that given accurate and detailed gravimetric data, a single global model may satisfy the requirements of a very wide range of applications. Copyright 2012 by the American Geophysical Union.
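For readers less familiar with the terminology, "complete to degree and order 2159" refers to the truncation of the standard spherical-harmonic expansion of the Earth's external gravitational potential. In common geodetic notation (which may differ in detail from the paper's symbols),

V(r, \theta, \lambda) = \frac{GM}{r} \left[ 1 + \sum_{n=2}^{N} \left( \frac{a}{r} \right)^{n} \sum_{m=0}^{n} \bar{P}_{nm}(\cos\theta) \left( \bar{C}_{nm} \cos m\lambda + \bar{S}_{nm} \sin m\lambda \right) \right],

where GM is the geocentric gravitational constant, a is a reference radius, \bar{P}_{nm} are the fully normalized associated Legendre functions, and \bar{C}_{nm}, \bar{S}_{nm} are the model coefficients. For EGM2008 the model is complete through degree and order N = 2159, with additional coefficients extending to degree 2190.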


News Article | October 27, 2016
Site: www.businesswire.com

ARLINGTON, Va.--(BUSINESS WIRE)--CACI International Inc (NYSE: CACI) announced today that it has been awarded a prime position on the Full-Motion Video (FMV) functional area of the Multi-Intelligence Analytical and Collection Support Services (MACSS) multiple-award, indefinite delivery/indefinite quantity (IDIQ) contract vehicle to support the National Geospatial-Intelligence Agency (NGA). The company has also been awarded a five-year, $29 million task order, the first to be awarded under the MACSS contract vehicle.


Geodata IT Named Minority Owned Small Business of the Year for Eastern Missouri

Saint Louis, MO, February 22, 2017 --(PR.com)-- Geodata IT, LLC was named the Minority Owned Small Business of the Year for Eastern Missouri for 2017 by the U.S. Small Business Administration (SBA) – St. Louis District Office.

An innovative and advanced technology company located in downtown St. Louis, Geodata IT is an 8(a) Certified, Minority Business Enterprise (MBE) and Veteran-Owned Small Business (VOSB). Founded in 2012, Geodata IT provides niche solutions for national security and geospatial mission requirements, developing advanced technology and innovative solutions for the Department of Defense (DoD) and Intelligence Community (IC). A description of services is available at www.geodatait.com.

Dedicated to hiring veterans, Geodata IT's vision is to foster an agile culture and an innovative environment that inspires exceptional solutions, integrity, and teamwork. Geodata IT's President and CEO, Justin Bennett, is a 2016 graduate of the SBA's Emerging Leaders Program.

"Receiving this award is a fantastic opportunity to demonstrate to minority students of technology the possibilities of a career in geospatial intelligence (GEOINT)," said Bennett.

Bennett has 18 years of experience in the DoD and Intelligence Community information technology industry. Recognized by the National Geospatial-Intelligence Agency (NGA) for achievement in enterprise management, service delivery, and enterprise operations, Bennett is an innovator in intelligent solutions architecture. As a specialist in designing niche IT strategies, he has helped enterprise companies such as NJVC, Raytheon, SAIC, Pragmatics, and CSC streamline complex systems.

Bennett holds a Master of Business Administration with an emphasis in Management Information Systems from Southern Illinois University, an AACSB-accredited institution; a Bachelor of Science in Management Information Systems from Park University; and an Associate of Applied Science from the Community College of the Air Force (CCAF).


News Article | November 19, 2015
Site: phys.org

High-performance computing (or HPC) enables discoveries in practically every field of science—not just those typically associated with supercomputers, like chemistry and physics, but also in the social sciences, life sciences and humanities. By combining superfast and secure networks, cutting-edge parallel computing and analytics software, advanced scientific instruments and critical datasets across the U.S., NSF's cyber-ecosystem lets researchers investigate questions that can't otherwise be explored. NSF has supported advanced computing since its beginning and is constantly expanding access to these resources. This access helps tens of thousands of researchers each year—from high-school students to Nobel Prize winners—expand the frontiers of science and engineering, regardless of whether their institutions are large or small, or where they are located geographically. Below are 10 examples of research enabled by NSF-supported advanced computing resources from across all of science.

Pineapples don't just taste good—they have a juicy evolutionary history. Recent analyses using computing resources that are part of the iPlant Collaborative revealed an important relationship between pineapples and crops like sorghum and rice, allowing scientists to home in on the genes and genetic pathways that allow plants to thrive in water-limited environments. Led by the University of Arizona, Texas Advanced Computing Center, Cold Spring Harbor Laboratory and University of North Carolina at Wilmington, iPlant was established in 2008 with NSF funding to develop cyberinfrastructure for life sciences research, provide powerful platforms for data storage and bioinformatics and democratize access to U.S. supercomputing capabilities. This week, iPlant announced it will host a new platform, Digital Imaging of Root Traits (DIRT), that lets scientists in the field measure up to 76 root traits merely by uploading a photograph of a plant's roots.

Software that simulates the effect of an electric charge passing through a transistor—only a few atoms wide—is helping researchers to explore alternative materials that may replace silicon in future nanodevices. The software simulations designed by Purdue researcher Gerhard Klimeck and his group, available on the nanoHUB portal, provide new information about the limits of current semiconductor technologies and are helping design future generations of nanoelectronic devices. NanoHUB, supported by NSF, is the first broadly successful, scientific end-to-end cloud computing environment. It provides a library of 3,000 learning resources to 195,000 users worldwide. Its 232 simulation tools are used in the cloud by over 10,800 researchers and students annually.

Earthquakes originate through complex interactions deep below the surface of the Earth, making them notoriously difficult to predict. The Southern California Earthquake Center (SCEC) and its lead scientist Thomas Jordan use massive computing power to simulate the dynamics of earthquakes. In doing so, SCEC helps to provide long-term earthquake forecasts and more accurate hazard assessments. In 2014, the SCEC team investigated the earthquake potential of the Los Angeles Basin, where the Pacific and North American Plates run into each other at the San Andreas Fault. Their simulations showed that the basin essentially acts like a big bowl of jelly that shakes during earthquakes, producing more high-shaking ground motions than the team expected.
Using the NSF-funded Blue Waters supercomputer at the National Center for Supercomputing Applications and the Department of Energy-funded Titan supercomputer at the Oak Ridge Leadership Computing Facility, the researchers turned their simulations into seismic hazard models. These models describe the probability of an earthquake occurring in a given geographic area, within a given window of time and with ground motion intensity exceeding a given threshold.

Nearly 33,000 people die in the U.S. each year due to motor vehicle crashes, according to the National Highway Traffic Safety Administration. Modern restraint systems save lives, but some deaths and injuries remain—and restraints themselves can cause injuries. Researchers from the Center for Injury Biomechanics at Wake Forest University used the Blacklight supercomputer at the Pittsburgh Supercomputing Center to simulate the impacts of car crashes with much greater fidelity than crash-test dummies can. By studying a variety of potential occupant positions, they're uncovering important factors that lead to more severe injuries, as well as ways to potentially mitigate these injuries, using advanced safety systems.

Since Albert Einstein, scientists have believed that when major galactic events like black hole mergers occur, they leave a trace in the form of gravitational waves—ripples in the curvature of space-time that travel outward from the source. Advanced LIGO is a project designed to capture signs of these events. Since gravitational waves are expected to travel at the speed of light, detecting them requires two gravitational wave observatories, located 1,865 miles apart and working in unison, that can triangulate the gravitational wave signals and determine the source of the wave in the sky. In addition to being an astronomical challenge, Advanced LIGO is also a "big data" problem. The observatories take in huge volumes of data that must be analyzed to determine their meaning. Researchers estimate that Advanced LIGO will generate more than 1 petabyte of data a year, the equivalent of 13.3 years' worth of high-definition video (a back-of-envelope check of that equivalence appears below). To achieve accurate and rapid gravitational-wave detection, researchers use the Extreme Science and Engineering Discovery Environment (XSEDE)—a powerful collection of advanced digital resources and services—to develop and test new methods for transmitting and analyzing these massive quantities of astronomical data. Advanced LIGO came online in September 2015, and advanced computing will play an integral part in its future discoveries.

What happens when a supercomputer reaches retirement age? In many cases, it continues to make an impact in the world. The NSF-funded Ranger supercomputer is one such example. In 2013, after five years as one of NSF's flagship computer systems, the Texas Advanced Computing Center (TACC) disassembled Ranger and shipped it from Austin, Texas to South Africa, Tanzania and Botswana to give root to a young and growing supercomputing community. With funding from NSF, TACC experts led training sessions in South Africa in December 2014. In November 2015, 19 delegates from Africa came to the U.S. to attend a two-day workshop at TACC as well as the Supercomputing 2015 International Conference for High Performance Computing. The effort is intended, in part, to help provide the technical expertise needed to successfully staff and operate the Square Kilometer Array, a new radio telescope being built in Australia and Africa which will offer the highest resolution images in all of astronomy.
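As a back-of-envelope check of the data-volume comparison above (assuming a high-definition video stream of roughly 20 megabits per second, a typical 1080p figure that the article itself does not state):

\frac{10^{15}\ \text{bytes}}{13.3\ \text{yr} \times 3.16 \times 10^{7}\ \text{s/yr}} \approx 2.4 \times 10^{6}\ \text{bytes/s} \approx 19\ \text{Mbit/s},

which is indeed in the range of a typical HD bitrate, so the two figures are consistent under that assumption.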
In September 2015, President Obama announced plans to improve maps and elevation models of the Arctic, including Alaska. To that end, NSF and the National Geospatial-Intelligence Agency (NGA) are supporting the development of high-resolution Digital Elevation Models in order to provide consistent coverage of the globally significant region. The models will allow researchers to see in detail how warming in the region affects the landscape in remote areas, and allow them to compare changes over time. The project relies, in part, on the computing and data analysis powers of Blue Waters, which will let researchers store, access and analyze large numbers of images and models.

To solve some of society's most pressing long-term problems, the U.S. needs to educate and train the next generation of scientists and engineers to use advanced computing effectively. This pipeline of training begins as early as high school and continues throughout the careers of scientists. Last summer, TACC hosted 50 rising high school juniors and seniors to participate in an innovative new STEM program, CODE@TACC. The program introduced students to high-performance computing, life sciences and robotics. On the continuing education front, XSEDE offers hundreds of training classes each year to help researchers update their skills and learn new ones.

High-performance computing has another use in education: to assess how students learn and ultimately to provide personalized educational paths. A recent report from the Computing Research Association, "Data-Intensive Research in Education: Current Work and Next Steps," highlights insights from two workshops on data-intensive education initiatives. The LearnSphere project at Carnegie Mellon University, an NSF Data Infrastructure Building Blocks project, is putting these ideas into practice.

Experimenting with cloud computing on new platforms: In 2014, NSF invested $20 million to create two cloud computing testbeds that let the academic research community develop and experiment with cloud architectures and pursue new, architecturally-enabled applications of cloud computing. CloudLab (with sites in Utah, Wisconsin and South Carolina) came online in May 2015 and provides researchers with the ability to create custom clouds and test adjustments at all levels of the infrastructure, from the bare metal on up. Chameleon, a large-scale, reconfigurable experimental environment for cloud research, co-located at the University of Chicago and The University of Texas at Austin, went into production in July 2015. Both serve hundreds of researchers at universities across the U.S. and let computer scientists experiment with unique cloud architectures in ways that weren't available before.

The NSF-supported "Comet" system at the San Diego Supercomputer Center (SDSC) was dedicated in October and is already aiding scientists in a number of fields, including domains relatively new for supercomputer integration, such as neuroscience. SDSC recently received a major grant to expand the Neuroscience Gateway, which provides easy access to advanced cyberinfrastructure tools and resources through a web-based portal, and can significantly improve the productivity of researchers. The gateway will contribute to the national BRAIN Initiative and deepen our understanding of the human brain.


Greer J.B., National Geospatial-Intelligence Agency (NGA)
Proceedings of SPIE - The International Society for Optical Engineering | Year: 2010

The linear mixture model for hyperspectral images assumes that all the image spectra lie on a high-dimensional simplex with corners called endmembers. Given the set of endmembers, one typically calculates fractional abundances for each pixel using constrained least squares. This method likely reconstructs the spectra as combinations of most, if not all, of the endmembers. We instead assume that pixels are combinations of only a few of the endmembers, yielding sparse abundance vectors. We introduce a new method, similar to Matching Pursuit (MP) from the signal processing literature, to calculate these sparse abundances. We combine this sparse demixing algorithm with dictionary learning methods to automatically calculate endmembers for a provided set of spectra. We apply our method to an AVIRIS image of Cuprite, NV, for which we compare our endmembers with spectral signatures from the USGS spectral library. © 2010 SPIE.
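The abstract does not spell out the algorithm, so the following is only a minimal sketch of the general flavor of greedy, matching-pursuit-style demixing, with made-up endmembers rather than the paper's method; the least-squares refit on the selected subset actually makes it closer to orthogonal matching pursuit, and a real demixer would also enforce the nonnegativity and sum-to-one constraints mentioned above.

import numpy as np

def greedy_sparse_abundances(pixel, endmembers, k=3):
    # Greedy, matching-pursuit-style selection of at most k endmembers for one
    # pixel spectrum, followed by a least-squares refit on the selected subset.
    # pixel:      length-B spectrum of one pixel (B spectral bands)
    # endmembers: B x E matrix whose columns are endmember spectra
    pixel = np.asarray(pixel, dtype=float)
    endmembers = np.asarray(endmembers, dtype=float)
    residual = pixel.copy()
    selected = []
    for _ in range(k):
        # pick the endmember most correlated with the current residual
        j = int(np.argmax(np.abs(endmembers.T @ residual)))
        if j not in selected:
            selected.append(j)
        # refit on the selected subset (unconstrained least squares here; a real
        # demixer would enforce nonnegativity and sum-to-one constraints)
        subset = endmembers[:, selected]
        coeffs, *_ = np.linalg.lstsq(subset, pixel, rcond=None)
        residual = pixel - subset @ coeffs
    abundances = np.zeros(endmembers.shape[1])
    abundances[selected] = coeffs
    return abundances

# Toy example: 4 bands, 3 endmembers; the pixel mixes endmembers 0 and 2
E = np.array([[1.0, 0.2, 0.0],
              [0.8, 0.9, 0.1],
              [0.3, 0.7, 0.9],
              [0.1, 0.4, 1.0]])
x = 0.6 * E[:, 0] + 0.4 * E[:, 2]
print(greedy_sparse_abundances(x, E, k=2))  # approximately [0.6, 0.0, 0.4]

On this toy input the two endmembers used to build the pixel are recovered with abundances of about 0.6 and 0.4, and the third stays at zero.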


Crispell D., National Geospatial-Intelligence Agency (NGA) | Mundy J., Brown University | Taubin G., Brown University
IEEE Transactions on Geoscience and Remote Sensing | Year: 2012

Given a set of high-resolution images of a scene, it is often desirable to predict the scene's appearance from viewpoints not present in the original data for purposes of change detection. When significant 3-D relief is present, a model of the scene geometry is necessary for accurate prediction to determine surface visibility relationships. In the absence of an a priori high-resolution model (such as those provided by LIDAR), scene geometry can be estimated from the imagery itself. These estimates, however, cannot, in general, be exact due to uncertainties and ambiguities present in image data. For this reason, probabilistic scene models and reconstruction algorithms are ideal due to their inherent ability to predict scene appearance while taking into account such uncertainties and ambiguities. Unfortunately, existing data structures used for probabilistic reconstruction do not scale well to large and complex scenes, primarily due to their dependence on large 3-D voxel arrays. The work presented in this paper generalizes previous probabilistic 3-D models in such a way that multiple orders of magnitude savings in storage are possible, making high-resolution change detection of large-scale scenes from high-resolution aerial and satellite imagery possible. Specifically, the inherent dependence on a discrete array of uniformly sized voxels is removed through the derivation of a probabilistic model which represents uncertain geometry as a density field, allowing implementations to efficiently sample the volume in a nonuniform fashion. © 2006 IEEE.
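The abstract's "density field" can be read, in the spirit of volumetric visibility models, as an occlusion density \alpha(s) defined along each viewing ray (generic notation, not necessarily the paper's). Under that reading, the probability that a ray remains unoccluded out to depth t is

\Pr[\text{unoccluded to depth } t] = \exp\left( -\int_{0}^{t} \alpha(s)\, ds \right),

and appearance is predicted by weighting candidate surface colors by the probability that the surface is first encountered at each depth. Because \alpha is a continuous density rather than a per-voxel occupancy, the volume can be sampled nonuniformly, which is what the abstract identifies as the source of the storage savings.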


Greer J.B., National Geospatial-Intelligence Agency (NGA)
IEEE Transactions on Image Processing | Year: 2012

In the linear mixture model (LMM) for hyperspectral images, all the image spectra lie on a high-dimensional simplex with corners called endmembers. Given a set of endmembers, the standard calculation of fractional abundances with constrained least squares typically identifies the spectra as combinations of most, if not all, endmembers. We assume instead that pixels are combinations of only a few endmembers, yielding abundance vectors that are sparse. We introduce sparse demixing (SD), which is a method that is similar to orthogonal matching pursuit, for calculating these sparse abundances. We demonstrate that SD outperforms an existing L1 demixing algorithm, which we prove to depend adversely on the angles between endmembers. We combine SD with dictionary learning methods to automatically calculate endmembers for a provided set of spectra. Applying it to an Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) image of Cuprite, NV, yields endmembers that compare favorably with signatures from the USGS spectral library. © 2011 IEEE.
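In generic notation (not necessarily the paper's exact formulation), the LMM models each pixel spectrum x \in \mathbb{R}^{B} as x \approx E a, where the columns of the B \times p matrix E are the endmember spectra and a is the abundance vector. The standard fully constrained estimate and a sparse variant differ only in the constraint set:

\hat{a}_{\mathrm{FCLS}} = \arg\min_{a} \| x - E a \|_2^2 \quad \text{s.t.}\ a_i \ge 0,\ \sum_i a_i = 1,

\hat{a}_{\mathrm{sparse}} = \arg\min_{a} \| x - E a \|_2^2 \quad \text{s.t.}\ a_i \ge 0,\ \sum_i a_i = 1,\ \| a \|_0 \le k,

where \| a \|_0 counts nonzero entries and k is small. L1 demixing algorithms of the kind the abstract compares against replace the \| a \|_0 constraint with an \ell_1 term, the usual convex relaxation.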


News Article | February 16, 2017
Site: www.businesswire.com

REDLANDS, Calif.--(BUSINESS WIRE)--Esri, the global leader in spatial analytics, together with the ArcticDEM project, a public-private initiative to produce high-resolution, high-quality digital elevation models (DEMs) of the Arctic, has released the largest addition of new elevation models to the project thus far.

The ArcticDEM project is an ongoing collaboration between the National Geospatial-Intelligence Agency (NGA), the Polar Geospatial Center, and Esri to produce high-resolution elevation models that support scientific research and national security needs in the Arctic. Coupled with the accessibility of Esri's online platform, ArcticDEM can meet the need for high-quality elevation data in remote locations and provide accurate measurement of topographic changes.

"The latest release of this elevation data is the largest collection to date for this project," said Don Kerr, Chief of News and Information, National Geospatial-Intelligence Agency. "The release includes a much broader area of the Arctic geography than ever before, comprising more populated areas that will see a substantive benefit from access to this kind of detailed location data."

New elevation models on Esri's public online portal show stunning surface detail from mainland Canada and Russia. In many locations, the models are created from images collected on multiple dates, allowing anyone to see how the landscape changes over time, for example, showing the rate at which glaciers are receding. Since the Arctic region is uniquely challenged by the effects of climate change, including melting ice, this elevation data provides a valuable resource for better planning and adaptation. For instance, elevation models can help local communities monitor coastal erosion and identify important structures that are at high risk of storm damage.

"The growing collection of detailed data and imagery from this collaborative project is a useful tool for federal agencies that depend on location data," said Peter Becker, ArcGIS product manager, Esri. "For instance, the ArcticDEM information is a valuable source of data for maritime navigational charts that must have accurate and up-to-date coastal points of reference."

The National Geospatial-Intelligence Agency presented the new ArcticDEM elevation data at the twentieth annual Esri Federal GIS (FedGIS) Conference, held February 13 and 14 at the Walter E. Washington Convention Center in Washington, DC. Explore visualized data from the ArcticDEM project at the ArcticDEM Explorer website.

Esri, the global market leader in geographic information system software, offers the most powerful mapping and spatial analytics technology available. Since 1969, Esri has helped customers unlock the full potential of data to improve operational and business results. Today, Esri software is deployed in more than 350,000 organizations including the world's largest cities, most national governments, 75 percent of Fortune 500 companies, and more than 7,000 colleges and universities. Esri engineers the most advanced solutions for digital transformation, the Internet of Things (IoT), and location analytics to create the maps that run the world. Visit us at esri.com/news.

Copyright © 2017 Esri. All rights reserved. Esri, the Esri globe logo, GIS by Esri, ArcGIS, esri.com, and @esri.com are trademarks, service marks, or registered marks of Esri in the United States, the European Community, or certain other jurisdictions. Other companies and products or services mentioned herein may be trademarks, service marks, or registered marks of their respective mark owners.
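As a rough illustration of the multi-date comparison described above, two co-registered elevation grids of the same area can be differenced to produce a per-cell change map. The sketch below is a minimal Python example with invented values; working with actual ArcticDEM releases would involve loading large GeoTIFF rasters, aligning them, and masking nodata cells:

import numpy as np

# Hypothetical co-registered DEM tiles (elevations in meters) from two dates
dem_2012 = np.array([[1502.1, 1498.7],
                     [1510.3, 1505.9]])
dem_2016 = np.array([[1499.0, 1496.2],
                     [1508.8, 1503.1]])

change = dem_2016 - dem_2012  # negative values indicate surface lowering
print(change)
print(f"mean elevation change: {np.nanmean(change):+.2f} m")  # e.g. glacier thinning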
