Liao L.,Wuhan University | Li P.,Wuhan University | Yang J.,Wuhan University | Chang H.,Yunnan Provincial Geomatics Center
Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University

At present, SAR polarimetric calibration algorithms assume that distributed targets satisfy reciprocity. However, not all distributed targets satisfy the reciprocity principle. To address this problem, this paper presents a calibration method based on a reciprocity judgment of distributed targets. According to the angle between the polarization scattering matrix of each pixel and its reciprocal matrix, pixels with small angles are selected as the initial reciprocity calibration samples. Then, on the basis of target coherence, a subset of these pixels is selected as the final calibration samples. Calibration of CECT-38 X-band fully polarimetric data was completed by combining a typical calibration algorithm with distributed targets selected by the proposed method. Experiments show that this method can be effectively applied to SAR polarimetric calibration using distributed targets.
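The pixel-selection step can be sketched as follows. This is only an illustrative reading of the abstract, not the paper's exact formulation: the 2x2 scattering matrix layout, the angle threshold, and the function names are assumptions. The nearest reciprocal matrix is taken here as the symmetric part (S + S^T)/2, and the angle between a pixel's matrix and that projection measures its departure from reciprocity.

```python
import numpy as np

def reciprocity_angle(S):
    """Angle (radians) between a 2x2 complex scattering matrix
    [[S_hh, S_hv], [S_vh, S_vv]] and its reciprocal (symmetric)
    projection; 0 means the pixel is fully reciprocal (S_hv == S_vh)."""
    s = S.flatten()
    S_rec = 0.5 * (S + S.T)              # nearest reciprocal matrix
    r = S_rec.flatten()
    cos = np.abs(np.vdot(s, r)) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def select_calibration_samples(matrices, angle_thresh=0.05):
    """Keep indices of pixels whose scattering matrices are nearly
    reciprocal; these form the initial calibration sample set."""
    return [i for i, S in enumerate(matrices)
            if reciprocity_angle(S) < angle_thresh]
```

A symmetric matrix yields an angle of zero, while a strongly non-reciprocal pixel (off-diagonal terms of opposite sign) yields a large angle and is excluded.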

Tan X.,Wuhan University | Di L.,George Mason University | Deng M.,George Mason University | Fu J.,China Electrical Power Research Institution | And 6 more authors.
Sustainability (Switzerland)

Since the Open Geospatial Consortium (OGC) proposed the geospatial Web Processing Service (WPS), standard OGC Web Service (OWS)-based geospatial processing has become the major type of distributed geospatial application. However, improving the performance and sustainability of distributed geospatial applications has become the dominant challenge for OWSs. This paper presents the construction of an elastic parallel OGC WPS service on a cloud-based cluster, along with the designs of a high-performance, cloud-based WPS service architecture, the scalability scheme of the cloud, and the algorithm for elastic parallel geoprocessing. Experiments on a remote sensing data processing service demonstrate that the proposed method can provide a higher-performance WPS service that uses fewer computing resources. The proposed method can also help institutions reduce hardware costs, increase hardware utilization, and conserve energy, which is important in building green and sustainable geospatial services and applications. © 2015 by the authors.
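The elastic-scaling idea behind such a service can be illustrated with a minimal controller sketch. The jobs-per-worker capacity, the cluster bounds, and the function name are assumptions for illustration only, not the paper's actual cloud scalability scheme.

```python
def scale_decision(queued_jobs, jobs_per_worker=4,
                   min_workers=1, max_workers=16):
    """Return a target worker count for an elastic WPS cluster:
    enough workers to keep per-worker load at or below
    jobs_per_worker, bounded by the cluster's min/max size."""
    needed = -(-queued_jobs // jobs_per_worker)   # ceiling division
    return max(min_workers, min(max_workers, needed))
```

A controller like this, run periodically against the request queue, lets the cluster grow under load and shrink back to the minimum when idle, which is where the resource and energy savings come from.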

Xiong J.,Kunming University of Science and Technology | Fang Y.,Kunming University of Science and Technology | Jin B.,Yunnan Provincial Geomatics Center | Zhao Z.,Kunming University of Science and Technology
Proceedings of the 2012 4th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2012

This paper proposes a new method for the automatic generation of a digital terrain model (DTM) from airborne laser scanning data. First, regular grids are generated and the points in each grid cell are filtered. Then the grid size is changed and the points are filtered again on the new grids; this process is repeated. Data measured by a LiteMapper-5600 system over a city are used to evaluate the method. The results show that this method can remove most non-ground points effectively and generate a high-accuracy DTM in urban areas. Copyright © 2012 by The Institute of Electrical and Electronics Engineers, Inc.
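The coarse-to-fine grid filtering can be sketched as below. The cell-size schedule, the height tolerance, and the keep-lowest-points rule are assumed details, since the abstract does not specify the per-cell filter; this is a sketch of the general technique, not the authors' exact algorithm.

```python
import numpy as np

def grid_filter(points, cell_sizes=(8.0, 4.0, 2.0), height_tol=0.5):
    """Iteratively filter LiDAR points toward ground candidates:
    on each pass, keep points within height_tol of the lowest point
    in their grid cell, then repeat with a finer grid."""
    pts = np.asarray(points, dtype=float)          # columns: x, y, z
    for cell in cell_sizes:
        ij = np.floor(pts[:, :2] / cell).astype(int)
        cells = {}
        for k, key in enumerate(map(tuple, ij)):   # group points by cell
            cells.setdefault(key, []).append(k)
        keep = np.zeros(len(pts), dtype=bool)
        for idx in cells.values():
            zmin = pts[idx, 2].min()               # lowest point in cell
            for k in idx:
                if pts[k, 2] - zmin <= height_tol:
                    keep[k] = True
        pts = pts[keep]
    return pts
```

Points on roofs or vegetation sit well above the cell minimum and are discarded early on the coarse grid; the finer passes refine the ground surface where terrain varies within a coarse cell.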

Li Z.,George Mason University | Yang C.,George Mason University | Jin B.,George Mason University | Jin B.,Yunnan Provincial Geomatics Center | And 5 more authors.

Geoscience observations and model simulations are generating vast amounts of multidimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amounts of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists. © 2015 Li et al.
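The MapReduce pattern the framework builds on can be illustrated with a toy map/reduce over chunked values. This sketch uses Python threads in place of a Hadoop/HBase cluster and is only structural: the chunking, the statistic computed, and the function names are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def mapper(chunk):
    """Map step: compute a partial (sum, count) for one data chunk."""
    return (sum(chunk), len(chunk))

def reducer(a, b):
    """Reduce step: merge two partial results."""
    return (a[0] + b[0], a[1] + b[1])

def parallel_mean(chunks, workers=4):
    """MapReduce-style mean over chunked values: chunks are mapped
    in parallel, then the partials are merged by the reducer."""
    with ThreadPoolExecutor(workers) as ex:
        partials = list(ex.map(mapper, chunks))
    total, count = reduce(reducer, partials)
    return total / count
```

In a real deployment each chunk would be a spatial/temporal tile read from HBase and the mapper a geoscience operator, but the split-map-merge structure is the same.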

Gui Z.,Hubei Engineering University | Gui Z.,George Mason University | Yu M.,George Mason University | Yang C.,George Mason University | And 9 more authors.

Dust storms have serious disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity, and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating each subdomain to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost of each computing node to minimize total execution time and reduce overall communication cost for the entire simulation. This research introduces three algorithms that optimize the allocation by considering spatial and communicational constraints: 1) an Integer Linear Programming (ILP) based algorithm from a combinatorial optimization perspective; 2) a K-Means and Kernighan-Lin combined heuristic algorithm (K&K) integrating geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared based on different factors. Further, we adopt the K&K algorithm for an experiment of dust model simulation with the non-hydrostatic mesoscale model (NMM-dust) and compare its performance with the MPI default sequential allocation. The results demonstrate that the K&K method significantly improves simulation performance through better subdomain allocation.
This method can also be adopted for other relevant atmospheric and numerical modeling. © 2016 Gui et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
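The spatial-clustering stage of a K&K-style allocation can be sketched with plain K-Means over subdomain centroids; grouping spatially adjacent subdomains on the same node reduces cross-node halo exchange. The deterministic initialization and the omission of the Kernighan-Lin refinement step are simplifications for illustration, not the paper's full algorithm.

```python
import numpy as np

def kmeans_allocate(centroids, n_nodes, iters=50):
    """Assign subdomain centroids to n_nodes computing nodes with
    plain K-Means, so that spatially adjacent subdomains share a
    node and inter-node communication is reduced."""
    pts = np.asarray(centroids, dtype=float)
    centers = pts[:n_nodes].copy()           # deterministic init
    labels = np.zeros(len(pts), dtype=int)
    for _ in range(iters):
        # distance of every centroid to every node's center
        d = np.linalg.norm(pts[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_nodes):             # recenter non-empty groups
            if np.any(labels == k):
                centers[k] = pts[labels == k].mean(axis=0)
    return labels
```

A Kernighan-Lin pass would then swap boundary subdomains between nodes to balance load and further cut the communication cost.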
