News Article | September 11, 2017
Ash is raining down on the city of Portland. Thousand-degree fireballs are forcing evacuations near Salt Lake City, Seattle, and parts of Northern California. In Missoula, Montana, the sun burns a bloody red even at high noon. These days, the American West looks less like an Ansel Adams postcard and more like the kingdom of Mordor. Across nine states, nearly 1.5 million acres are on fire. And according to President Trump’s natural resources czar, you can blame it all on the hippies. Allow me to explain: Last month, Interior Secretary Ryan Zinke traveled to his home state of Montana for an on-site briefing on the Lolo Peak fire, burning just south of Missoula and turning the valley’s air into toxic nebulae. Joined by Agriculture Secretary Sonny Perdue and members of Montana’s Republican delegation, Zinke pointed the finger not at drought or climate change, but at mismanagement resulting from lawsuits by “environmental extremists.” It’s not a new argument from the pro-industry camp, but Zinke’s comments are reigniting some old debates about the best way for people to manage forests that have been turned into tinderboxes by decades of overgrazing, fire suppression, and extended droughts. Climate change only makes the cycle more vicious: Less water means more fuel, more fuel means more fire, more fire means more money spent fighting fires and less money to manage forests, and now we’re back to too much fuel. Since 1985, global warming has nearly doubled the annual number of acres burning in the western US. So what is a scrappy, resource-strapped agency like the Forest Service to do? That’s where science comes in. An emerging consensus suggests that officials should spend less time thinning out forests where a fire might hit, and more time figuring out the specific conditions on the ground when a fire actually does. But to do that, they’ll need some help from outer space.
Lincoln, Montana, located 75 miles east of Missoula on Highway 200, is surrounded on all sides by the thick stands of spruce, fir, and pine that make up the western sections of the Helena National Forest. More than a decade ago, the area was dealt a double whammy—a series of summer droughts and a plague of pine bark beetles—that left the forest littered with big, dead, beetle-hole-ridden trees that were barely still standing. Called snags, this brittle timber is far more likely to fall in a fire, making for much more dangerous firefighting conditions. And overlapping downed trees can make it virtually impossible to cut fire lines. So a few years ago the Forest Service began developing a plan to thin trees, selectively log, and do controlled burns on about 4,800 acres four miles north of Lincoln. The idea was to build a more resilient forest to mitigate the intensity of any fire that might come across the landscape. The project was to start this spring. But in February, two conservation groups filed a lawsuit against the Forest Service, saying the agency failed to comply with federal laws requiring proper environmental impact assessments. And they alleged the project would disrupt essential habitat for endangered species like grizzly bears and lynx that live in the area. In May, a US district judge issued an injunction to stop all management actions from proceeding until the case could be settled. Then came summer, with its dry heat and powerful thunderstorms. In July, lightning sparked a fire on a steep, isolated slope, and a month later another storm started a second blaze. For firefighting purposes, they are now considered to be a single incident, which to date has burned 17,722 acres, mostly within the project area. Here’s the straightforward logic of Zinke’s scapegoating: Environmentalists block the Forest Service from lowering the fuel load on the land, land catches on fire, and now it’s harder to put out. Thanks, tree-huggers.
But fire scientists say it’s more complicated than that. Many question the ecological (and economic) value of thinning forests out, for three big reasons. One, the evidence for its efficacy is both scant and at times contradictory. Two, probabilistic risk assessments show that thinning doesn’t really help much, because the likelihood of a fire starting close enough to interact with thinned areas is negligibly small. And three, in the worst weather conditions—dry, hot, and most importantly, windy—no amount of thinning or selective logging is going to make much difference. A case in point: the Park Creek fire burning outside of Lincoln. It started on a remote slope that wasn’t slated for any prescribed burns or dead-tree removals. But such treatments wouldn’t have made much difference anyway, according to Carl Seielstad, a fire ecologist at the National Center for Landscape Fire Analysis at the University of Montana, because the closest road is more than a mile away, at the bottom of a slope. If you know anything about fire behavior, you know it moves much faster uphill. And in this case there wasn’t much in that direction, except more trees. “Without any roads in this area there was nothing for firefighters to anchor to,” says Seielstad, pointing at a 3D rendering of the fire’s path he’s pulled up on his computer. “It’s fair to say that regardless of treatment, this area would probably have been impossible to contain.”
Seielstad has been fighting, researching, and teaching about wildfires for 17 seasons. He says that on the first day of his classes every semester he tells his forestry department freshmen the founding principle of fire management: weather, topography, and fuels determine a fire’s behavior. But fuels are the only one you can do anything about. Even so, he says, the impact of thinning forests is mostly speculation. Its “effectiveness is hard to study because you can’t control any other variables out in the wild,” Seielstad says. “Sometimes it reduces the speed and intensity of a fire, sometimes it does the opposite.” Further complicating matters is the fact that current models used to predict fire behavior aren’t particularly useful for forests that have been attacked by bark beetles, because scientists don’t yet have a lot of good data on how bug-butchered timber actually behaves. And, as Steven Running, a climate scientist who studies forest carbon (and shares a Nobel Peace Prize with former Vice President Al Gore for his work on the IPCC’s first global warming report), points out, the vast majority of forests around the world don’t come with detailed plot records. That means scientists are always making assumptions about how old and how dense a forest is, what kinds of species make up a given hillside, and how fuel loads are distributed on the landscape.
“When it comes to fire danger we don’t have much of an idea about how much dead material is lying around on the ground,” he says. “So when lightning strikes we don’t really know what’s out there, not in any detailed way.” Until about a month ago, Running directed the intimidatingly named Numerical Terradynamic Simulation Group at the University of Montana. (He was one of a number of faculty who took a buyout package offered by the university to stanch recent budget woes.) He used massive, high-powered NASA satellites to measure the Earth’s daily rate of photosynthesis. He thinks forest managers should be using something similar (though much, much smaller) to better track, model, and plan for the West’s future fires. For more than a decade, satellites have beamed down thermal data captured from high above the earth’s atmosphere to inform fire management systems of the location and severity of hot spots. But it’s only in the last few years—thanks to miniaturization and the democratization of remote sensing technologies—that such tools have started to become affordable. “When I started, satellites had the resolving power of 1 square kilometer,” Running says. “Now they’re good enough to get down to the level of a single tree. And they’re cheap.” He envisions a world where flocks of nanosatellites constantly circle the planet, snapping pictures, waiting for a job to come in from a forest manager who has spotted a lightning strike or a rogue campfire ember. With the coordinates uploaded, a small sat could fly over the area of interest, collect a bunch of images before the smoke gets too dense, and then send them to a computer for processing and analysis. With the right algorithms, in just a few hours forest service officials on the ground could have a real-time map of where the fuels were and thus where the fire was most likely to go. Combine that with already well-used weather and water stress models and you could have a much better plan much earlier, Running says.
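The hot-spot detection step these systems share can be sketched in a few lines of Python. This is a deliberately minimal illustration, not an operational algorithm: real fire-detection products combine absolute brightness-temperature thresholds with contextual tests against neighboring pixels, and every value below is invented for the example.

```python
# Minimal sketch of thermal hot-spot detection: flag any pixel whose
# brightness temperature exceeds a fixed threshold. Operational
# detectors are far more sophisticated; this only shows the idea.
def detect_hotspots(brightness_temp_k, threshold_k=360.0):
    """Return (row, col) coordinates of pixels hotter than threshold_k."""
    return [(i, j)
            for i, row in enumerate(brightness_temp_k)
            for j, temp in enumerate(row)
            if temp > threshold_k]

# Toy 3x3 thermal scene in kelvin, with one fire-like pixel.
scene = [[300.0, 301.0, 299.0],
         [302.0, 450.0, 303.0],
         [298.0, 300.0, 301.0]]

hotspots = detect_hotspots(scene)  # [(1, 1)]
```

In a real pipeline those pixel coordinates, georeferenced to the satellite’s footprint, are what would be handed to fire managers alongside the fuel and weather layers described above.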
“That capability is pretty much ready to go right now,” he says. “It’s just a matter of using it.” You know who is using it? Canada. This year, Natural Resources Canada—the ministry responsible for managing the country’s minerals, metals, and forests—enlisted the help of 178 tiny satellites, each one weighing barely 11 pounds, to give managers a head start on any fires burning in the nation’s 857 million acres of forest. The satellites, nicknamed Doves, are owned by Planet, a San Francisco-based company started by former NASA engineers. Together with five RapidEye sats, the company is able to snap photos of 70 million square miles every day—the entirety of Earth’s landmass, and then some. And every day they upload about 7 terabytes of data to a fire detection company in Vancouver called Tanka. Using machine learning to parse smoke and flame from fire-free forests across Canada’s vast wilderness areas, Tanka then sends NRC the coordinates of any fires that have started in the last 24 hours, along with information about how big they are, how fast they’re growing, and what kind of fuel is in the area. Tanka’s CEO, Nikola Obrknezev, says once the company has a full fire season under its belt he plans to begin talks with government agencies in the US, Australia, and Chile about offering a similar image-based surveillance service. The US, however, already has plans for its own early warning network devoted entirely to wildfires. First conceptualized in 2011 by engineers at NASA’s Jet Propulsion Laboratory, FireSat aspires to be a constellation of 200 orbiting thermal sensors, capable of detecting fires as small as 35 feet across within 15 minutes of the time they begin. The project is now being developed by San Francisco-based startup Quadra PI R2E, which plans to launch its first batch of 20 sensors sometime next year.
The goals of FireSat and Tanka are the same—detect fires as they happen, so that forest managers can deploy resources to contain them while they’re small, saving money, timber, property, and yes, clean air to breathe. These systems will also help the Forest Service and the NRC make decisions about whether or not to contain a fire in the first place. In a natural cycle, the lodgepole and ponderosa pine forests of the Northern Rockies would burn every 8 to 30 years. And while all-out fire suppression has been the dominant policy in the Forest Service for the last few decades, that’s starting to shift. Looking at Seielstad’s computer screen, which now shows all the fires burning in Montana, a pattern starts to jump out. What at first look like random splotches and cutouts on the landscape all have something in common. The places they’ve stopped growing all butt up against the borders of old fire scars—land that burned in the recent past, either from wildfires or intentional burns. “In my opinion, the best fuel treatment is fire itself,” Seielstad says. The fires burning right now are a form of future insurance for nearby communities. “Right now if you went out in the streets of Missoula and said these fires and this smoke were actually a good thing, you’d get tarred and feathered,” he says. “But five years from now, I think people will be really grateful.” No one in Montana is excited about rebranding as the Big Smoke State. But fires, like floods and hurricanes, are inevitable. And attempting to fireproof the nation’s forests when no one knows where the next lightning strike will land is a Sisyphean task, with or without environmentalists nipping at your heels. It’s all going to burn at some point. The question is, will forest managers be able to make informed decisions about how and when and whether to let them go when that time comes? Having a few hundred heat-seeking satellites at their disposal certainly won’t hurt.
News Article | November 23, 2016
Commercial Satellite Imaging Market by Application (Geospatial Technology, Defense & Intelligence, Construction & Development, Energy, Natural Resources Management, and Others) and by End-User (Military, Forest, Government, Commercial Enterprises, Agriculture, Energy, and Others): Global Industry Perspective, Comprehensive Analysis, Size, Share, Growth, Segment, Trends and Forecast, 2014 – 2020 The report covers forecast and analysis for the commercial satellite imaging market on a global and regional level. The study provides historic data from 2014 along with a forecast from 2015 to 2020 based on revenue (USD million). The study includes drivers and restraints for the commercial satellite imaging market along with the impact they have on demand over the forecast period. Additionally, the report includes a study of opportunities available in the commercial satellite imaging market on a global level. In order to give the users of this report a comprehensive view of the commercial satellite imaging market, we have included a detailed competitive scenario and the product portfolios of key vendors. To understand the competitive landscape in the market, an analysis of Porter’s five forces model for the commercial satellite imaging market has also been included. The study encompasses a market attractiveness analysis, wherein application segments are benchmarked based on their market size, growth rate, and general attractiveness. The study provides a decisive view of the commercial satellite imaging market by segmenting the market based on applications and end-users. All the application segments have been analyzed based on present and future trends, and the market is estimated from 2014 to 2020. Key application markets covered under this study include geospatial technology, defense and intelligence, construction and development, energy, natural resources management, and others.
Military, forest, government, commercial enterprises, agriculture, energy, and others are the end-user segments of this market. The regional segmentation includes the current and forecast demand for North America, Europe, Asia Pacific, Latin America, and the Middle East and Africa, with further bifurcation into major countries including the U.S., Germany, France, the UK, China, Japan, India, and Brazil. The report covers a detailed competitive outlook, including company profiles of the key participants operating in the global market. Key players profiled in the report include GeoEye Inc., BlackBridge (RapidEye), Planet Labs, Inc., Spaceknow, Inc., Skybox Imaging, Inc., Trimble Navigation Limited, Digital Globe, Inc., Image Sat International N.V., Astrium Geo, SkyLab Analytics, Telespazio, Google Inc., and Galileo Group. The report segments the global commercial satellite imaging market into:
News Article | November 20, 2017
There's a battle going on in outer space, for control of the Earth imaging market. Yesterday morning, we learned that DigitalGlobe (NYSE:DGI) has sold itself to Canadian space-tech specialist MacDonald, Dettwiler and Associates for a purchase price of $2.4 billion -- and not a moment too soon. DigitalGlobe, if you recall, is the American satellite-imaging company that itself bought one-time Motley Fool recommendation GeoEye a few years ago. Just before announcing its own sale, DigitalGlobe reported a loss for its fiscal fourth quarter. And if you ask me, this suggests the time is ripe for DigitalGlobe to cash out -- the more so because its primary area of business is about to get even tougher to compete in. DigitalGlobe is one of the biggest names in satellite-based, highly detailed Earth imaging. The company operates a constellation of five satellites capable of snapping photographs from orbit at resolutions as minute as 30 centimeters per pixel. DigitalGlobe calls its constellation "the best in the world" and "the largest constellation in the industry," and that's at least half right. Problem is, it's also half wrong. DigitalGlobe may dominate the market for highly detailed Earth imaging. But over at upstart Planet Labs (also known simply as "Planet"), one of the leaders of the "new space" industry is rapidly overtaking DigitalGlobe in breadth of coverage. As we discussed last month, Planet had for some time been discussing acquiring Alphabet's (NASDAQ:GOOG) (NASDAQ:GOOGL) satellite imaging business, Terra Bella. The purchase price remains a mystery -- but the deal itself is apparently now a "go." On Feb. 3, Planet confirmed that it will indeed acquire Terra Bella from Alphabet, along with its fleet of seven high-resolution (capable of sub-meter resolutions) satellites. 
Then, last week, Planet announced the launch of 88 new "Dove" medium-resolution (three to five meters per pixel) satellites, more than doubling the size of its medium-res constellation to 144 birds in orbit. Planet made history here in a couple of ways. First, its deployment was part of an Indian government PSLV (Polar Satellite Launch Vehicle) launch that sent 16 other satellites, in addition to Planet's, into space. At 104 satellites total, the mission that carried Planet's Doves was the largest-ever satellite deployment made from a single rocket. More important, Planet says that its 144 Doves in orbit, plus five RapidEye satellites acquired from BlackBridge in 2015, give it a constellation of 149 satellites in orbit today. Feel free to mentally add seven more to that number once the Terra Bella acquisition is finalized, if you like. But already, Planet's mixed bag of 149 satellites gives it "the largest private satellite constellation in history." This is significant not only because it gives Planet bragging rights. By more than doubling its satellite constellation, Planet says it now has the ability to "image all of Earth's landmass every day." Alone among satellite operators, Planet can now take a snapshot of every square foot of Earth's solid surface once per day, every day of the year. With every passing day, Planet Labs is getting bigger, treading more and more on DigitalGlobe's astro-turf, and making its rival's products nonunique. Indeed, even if the photos taken by Planet's Doves don't offer quite the same resolution as DigitalGlobe offers, Planet's ability to take so many snapshots every day gives the company a more all-encompassing view of what's happening around the globe. It's difficult to guess what advances will flow from access to such data. Better weather forecasting, certainly. A clearer view of the progression of global warming, its effects and its causes, very likely. 
But we can also anticipate Planet generating better data on traffic patterns on highways, at ports, and along shipping routes; on the spread of residential development that could guide brick-and-mortar retailers on where to place their next big-box stores; and in countless other ways. The future is looking awfully bright for Planet Labs. And it's getting brighter every day, one Dove at a time.
News Article | February 15, 2017
A rocket loaded down with a record number of satellites just launched on its way to orbit. The Indian Space Research Organization's (ISRO) Polar Satellite Launch Vehicle (PSLV) blasted off on Tuesday at 10:58 p.m. ET to bring 104 satellites to space, the largest clutch of spacecraft ever launched by one rocket. This launch beats the previous record, set by a Russian rocket that brought 37 satellites to orbit in 2014. The PSLV's main payload is an Earth-mapping satellite for India, but its largest haul is the 88 small Dove satellites for the Earth-observing company Planet. Those satellites, once functioning in orbit, will allow Planet to image the entire Earth every day, when combined with data beamed back to engineers from 12 other Doves and RapidEye satellites operated by the U.S. company. Imaging the whole Earth every day has been the company's goal (nicknamed "Mission One") since it was founded in 2010. "We've had a lot of launches under our belts but this is the one that we feel really defines Mission One," Mike Safyan, Planet's director of launch and regulatory affairs, said in an interview before launch. "It's a pretty special feeling to think back [to] all those years ago when we were a scrappy team inside a garage dreaming about this day, and now this day has finally come." Being able to photograph the entire Earth every day will allow customers using Planet's data to keep close track of a number of things. One possible use of the data is in tracking deforestation, Safyan said. Instead of just seeing one area every couple of months, tracking changes to a specific part of the world on a daily basis will allow people on the ground to actually do something about any illegal deforestation occurring.
"If every day you're getting an alert that trees are going down where they aren't allowed to be harvested or cut down, then you can actually go and send someone and do something about it," Safyan added. This marks the 15th Dove launch for Planet and will give the company a total of 100 of these satellites in orbit.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: SPA.2010.1.1-01 | Award Amount: 3.39M | Year: 2011
It is the overall aim of EUFODOS to develop specific Forest Downstream Services (FDS) that are urgently required by regional European users in an economically viable manner, utilising the GMES Land Forest Core products as a basis for the development of these services. The specific FDS that will be examined and developed to a pre-operational level relate to the assessment of forest damage and the mapping of forest functional parameters. As the GMES Land Core products are not yet operational, the FDS will thus also provide a pre-operational validation of the GMES services and products. The FDS programme is based on three foundations: technical/methodological developments based on an approach that combines Earth Observation (EO) and in-situ data as well as the GMES Forest Core products; the formation of a functional Service Network (SN) that includes effective representation and involvement of the user community, the service providers, and the research community; and the assessment of the economic feasibility of developing the service cases such that they are sustainable. These foundations form the following programme objectives: 1. Development of an FDS Service Network comprised of service providers, users, and the research community for effective involvement of all stakeholders and service delivery. 2. The investigation and implementation of methodologies for forest degradation assessment and forest function parameter mapping for the provision of pre-operational systems. 3. The users' commitment to participate in the validation of the Core products, as well as the utility assessment of the downstream services within their own work practices. 4. The assessment of the economic cases for these different regional downstream services to ensure sustainability. All the objectives will be fulfilled within the lifespan of the EUFODOS programme.
Eitel J.U.H.,University of Idaho |
Vierling L.A.,University of Idaho |
Litvak M.E.,University of New Mexico |
Long D.S.,Columbia Plateau Conservation Research Center |
And 4 more authors.
Remote Sensing of Environment | Year: 2011
Multiple plant stresses can affect the health, esthetic condition, and timber harvest value of conifer forests. To monitor spatially and temporally dynamic forest stress conditions, timely, accurate, and cost-effective information is needed that could be provided by remote sensing. Recently, satellite imagery has become available via the RapidEye satellite constellation to provide spectral information in five broad bands, including the red-edge region (690-730 nm) of the electromagnetic spectrum. We tested the hypothesis that broadband red-edge satellite information improves early detection of stress (as manifest by shifts in foliar chlorophyll a + b) in a woodland ecosystem relative to other more commonly utilized band combinations of red, green, blue, and near-infrared reflectance spectra. We analyzed a temporally dense time series of 22 RapidEye scenes of a piñon-juniper woodland in central New Mexico acquired before and after stress was induced by girdling. We found that the Normalized Difference Red-Edge index (NDRE) allowed stress to be detected 13 days after girdling, up to 16 days earlier than broadband spectral indices such as the Normalized Difference Vegetation Index (NDVI) and Green NDVI traditionally used for satellite-based forest health monitoring. We conclude that red-edge information has the potential to considerably improve forest stress monitoring from satellites and warrants further investigation in other forested ecosystems. © 2011 Elsevier Inc.
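The two indices compared in this abstract are simple normalized band ratios with standard definitions. The reflectance values below are invented purely to illustrate the mechanism the authors exploit: as chlorophyll declines, reflectance in the red-edge region rises, pulling NDRE down before NDVI responds.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Normalized Difference Red-Edge index: (NIR - RE) / (NIR + RE)."""
    return (nir - red_edge) / (nir + red_edge)

# Illustrative (invented) canopy reflectances before and after stress:
# red-edge reflectance creeps up as chlorophyll declines, so NDRE drops.
healthy = ndre(nir=0.45, red_edge=0.20)   # about 0.38
stressed = ndre(nir=0.45, red_edge=0.28)  # about 0.23
```

Because the red band saturates later in the stress progression than the red edge, NDVI computed from the same scenes would barely move over the same interval, which is the effect the study quantifies.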
Naughton D.,RapidEye AG |
Brunn A.,RapidEye AG |
Czapla-Myers J.,University of Arizona |
Douglass S.,RapidEye AG |
And 3 more authors.
Journal of Applied Remote Sensing | Year: 2011
RapidEye AG is a commercial provider of geospatial information products and customized solutions derived from Earth observation image data. The source of the data is the RapidEye constellation, consisting of five low-earth-orbit imaging satellites. We describe the rationale, methods, and results of a reflectance-based vicarious calibration campaign that was conducted between April 2009 and May 2010 at Railroad Valley Playa and Ivanpah Playa to determine the on-orbit radiometric accuracy of the RapidEye sensor. In situ surface spectral reflectance measurements of known ground targets and an assessment of the atmospheric conditions above the sites were taken during spacecraft overpasses. The ground data are used as input to a radiative transfer code to compute a band-specific top-of-atmosphere spectral radiance. A comparison of these predicted values, based on absolute physical data, to the measured at-sensor spectral radiance provides the absolute calibration of the sensor. Initial assessments show that the RapidEye sensor response is within 8% of the predicted values. Outcomes from this campaign are then used to update the calibration parameters in the ground segment processing system. Subsequent verification events confirmed that the measured RapidEye response improved to within 4% of the predictions based on the vicarious calibration method. © 2011 Society of Photo-Optical Instrumentation Engineers (SPIE).
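The accuracy figures in this abstract boil down to a band-by-band percent difference between the measured at-sensor radiance and the radiance predicted by the radiative transfer code. A minimal sketch, with made-up radiance values standing in for the campaign's actual measurements:

```python
def percent_difference(measured, predicted):
    """Radiometric error as a percentage of the predicted TOA radiance."""
    return 100.0 * abs(measured - predicted) / predicted

# Hypothetical per-band top-of-atmosphere radiances for five bands
# (illustrative numbers only, not campaign data).
predicted = [55.0, 48.0, 40.0, 35.0, 30.0]
measured  = [56.5, 47.0, 41.2, 34.1, 29.3]

errors = [percent_difference(m, p) for m, p in zip(measured, predicted)]
within_4_percent = all(e < 4.0 for e in errors)  # True for these values
```

A verification event like the ones the abstract mentions amounts to re-running this comparison after updating the ground-segment calibration parameters and checking that the per-band errors have shrunk.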
Thiele M.,RapidEye AG |
Anderson C.,RapidEye AG |
Brunn A.,RapidEye AG
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives | Year: 2012
Radiometric calibration of the RapidEye Multispectral Imager (MSI), as with all other remote sensing instruments, is an essential task in the quantitative assessment of sensor image quality and the production of accurate data products for a wide range of geo-spatial applications. Spatially and temporally pseudo-invariant terrestrial targets have long been used to quantify and provide a consistent record of the radiometric performance of Earth observation systems. The RapidEye cross-calibration approach combines temporal and relative calibration to ensure temporal stability in spectral response among its five identical MSIs over time, using a large number of repetitive collects of many pseudo-invariant calibration sites. The approach is characterized by its reliability, which rests on purely statistical analysis of many collects; no ground infrastructure or measurement systems are necessary. The results show that the in-band percent difference in measured response among all RapidEye sensors is less than two percent. Although the results show some offsets between the different sensors, the response of the RapidEye constellation over a three-year period is very stable.
Zillmann E.,RapidEye AG |
Weichelt H.,RapidEye AG
MultiTemp 2013 - 7th International Workshop on the Analysis of Multi-Temporal Remote Sensing Images: "Our Dynamic Environment", Proceedings | Year: 2013
Grasslands cover large areas of the earth's surface and have been extensively converted to other uses such as cultivation and urbanization. The monitoring of grasslands is needed for any land use planning and environmental management, and remote sensing techniques are suitable to provide the detailed spatial information on grassland needed to support this process. The RapidEye satellite constellation offers a unique potential for multi-temporal acquisition of high resolution image data, and therefore a reliable data source for detailed multi-temporal analysis. In the study presented, a semi-automatic land-cover classification approach with emphasis on the identification of grassland was developed. The methodology is based on the analysis of multi-temporal RapidEye images using the supervised decision tree (DT) classifier C5 combined with an initial image segmentation. The results presented correspond to an area of 2,500 km2 in the state of Brandenburg, Germany. The classification accuracy was assessed using randomly distributed independent reference points and a confusion matrix to derive users' and producers' accuracies. The grassland classification of the test area reached an overall accuracy of about 90%. © 2013 IEEE.
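The core idea, multi-temporal spectral features fed to a decision-tree classifier, can be caricatured with a hand-written rule stack. The two-date NDVI features and thresholds below are invented for illustration; they are not the C5 rules the study actually learned:

```python
# Toy stand-in for a learned decision tree: grassland stays green
# across acquisition dates, cropland drops sharply after harvest,
# and water is dark on both dates. Thresholds are illustrative only.
def classify_segment(ndvi_spring, ndvi_summer):
    if ndvi_spring < 0.2 and ndvi_summer < 0.2:
        return "water"
    if ndvi_spring > 0.5 and ndvi_summer > 0.5:
        return "grassland"
    return "cropland"

# One (spring, summer) NDVI pair per image segment.
segments = [(0.60, 0.70), (0.65, 0.25), (0.05, 0.10)]
labels = [classify_segment(s, m) for s, m in segments]
# labels -> ["grassland", "cropland", "water"]
```

The point of the multi-temporal design is visible even in this caricature: the cropland segment is indistinguishable from grassland on the spring date alone, and only the second acquisition separates them.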
Marx A.,RapidEye AG
Photogrammetrie, Fernerkundung, Geoinformation | Year: 2010
The article at hand presents the methodology and results of a research and development project in the field of applied remote sensing in forest protection and bark beetle monitoring. It was found that multi-temporal RapidEye imagery, ground truth data on bark beetle infestation, and the application of data-mining techniques allow for the recognition and separation of different infestation stages. The analysis suggests a weak trend for the identification of infested groups of trees that are still widely green. In contrast, the classification of reddish-coloured deteriorating or dead tree groups shows high accuracy (97% user's accuracy, 82% producer's accuracy, kappa: 0.89). © 2010 E. Schweizerbart'sche Verlagsbuchhandlung, Stuttgart, Germany.
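The kappa statistic reported above measures agreement between the classification and the reference data after discounting the agreement expected by chance. Its computation from a confusion matrix is standard; the matrix below is a made-up two-class example, not the study's data:

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: reference classes, columns: predicted classes)."""
    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    # Observed agreement: fraction of samples on the diagonal.
    observed = sum(confusion[i][i] for i in range(n)) / total
    # Chance agreement: product of marginal proportions, summed over classes.
    row_sums = [sum(row) for row in confusion]
    col_sums = [sum(row[j] for row in confusion) for j in range(n)]
    expected = sum(r * c for r, c in zip(row_sums, col_sums)) / (total * total)
    return (observed - expected) / (1.0 - expected)

# Hypothetical infested-vs-healthy confusion matrix (illustrative only).
cm = [[45, 5],
      [3, 47]]
kappa = cohens_kappa(cm)  # 0.84 for this matrix
```

A kappa near 0 would mean the classifier does no better than chance, while values approaching 1, such as the 0.89 reported for the reddish tree groups, indicate agreement well beyond what the class proportions alone would produce.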