Xiong J.,Kunming University of Science and Technology |
Fang Y.,Kunming University of Science and Technology |
Jin B.,Yunnan Provincial Geomatics Center |
Zhao Z.,Kunming University of Science and Technology
Proceedings of the 2012 4th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2012 | Year: 2012
This paper proposes a new method for automatically generating a digital terrain model (DTM) from airborne laser scanning data. First, regular grids are generated and the points in each grid cell are filtered. Then the grid size is changed and the points are filtered again on the new grids, repeating the process. Data measured by a LiteMapper-5600 over a city are used to evaluate the method. The results show that the method can remove most non-ground points effectively and generate a high-accuracy DTM in urban areas. Copyright © 2012 by The Institute of Electrical and Electronics Engineers, Inc.
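The iterative grid filtering outlined in the abstract can be sketched as below. This is a minimal illustration assuming one simple rule (keep only points close to the lowest point in each cell); the function names, tolerance, and cell-size schedule are invented for illustration, not taken from the paper.

```python
import numpy as np

def grid_filter(points, cell_size, height_tol):
    """Keep points within height_tol of the lowest point in their grid cell."""
    ij = np.floor(points[:, :2] / cell_size).astype(int)   # cell index per point
    keep = np.zeros(len(points), dtype=bool)
    for cell in np.unique(ij, axis=0):
        in_cell = np.all(ij == cell, axis=1)
        zmin = points[in_cell, 2].min()                    # local ground estimate
        keep |= in_cell & (points[:, 2] <= zmin + height_tol)
    return points[keep]

def iterative_filter(points, cell_sizes, height_tol=0.5):
    """Repeat the grid filter while the cell size changes, as the abstract describes."""
    for size in cell_sizes:
        points = grid_filter(points, size, height_tol)
    return points
```

A coarse first pass removes large off-terrain objects such as buildings; subsequent finer passes retain more detail of the ground surface.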
Jin B.-X.,Yunnan Provincial Geomatics Center |
Fang Y.-M.,Kunming University of Science and Technology |
Song W.-W.,Kunming University of Science and Technology
Transactions of Nonferrous Metals Society of China (English Edition) | Year: 2011
The digital mine is the inevitable outcome of mine informatization and is also a complicated systems engineering effort. First, for the 3D visualization applications of the digital mine, an integrated ground-underground visualization framework model was proposed based on a mine entity database, which effectively resolved the visualization problem and improved professional analytical capability. Second, to address the irregularity, non-uniformity and dynamics of mine entities, a mixed modeling method based on entity characteristics was put forward, realizing the 3D expression of mine entities. Finally, a 3D visualization project for a copper mine was studied experimentally; satisfactory results were acquired, validating the rationality of the visualization model and the feasibility of the 3D modeling. © 2011 The Nonferrous Metals Society of China.
Tan X.,Wuhan University |
Di L.,George Mason University |
Deng M.,George Mason University |
Fu J.,China Electrical Power Research Institution |
And 6 more authors.
Sustainability (Switzerland) | Year: 2015
Since the Open Geospatial Consortium (OGC) proposed the geospatial Web Processing Service (WPS), standard OGC Web Service (OWS)-based geospatial processing has become the major type of distributed geospatial application. However, improving the performance and sustainability of distributed geospatial applications has become the dominant challenge for OWSs. This paper presents the construction of an elastic parallel OGC WPS service on a cloud-based cluster, along with the designs of a high-performance, cloud-based WPS service architecture, the scalability scheme of the cloud, and the algorithm for elastic parallel geoprocessing. Experiments with a remote sensing data processing service demonstrate that the proposed method provides a higher-performance WPS service while using fewer computing resources. It can also help institutions reduce hardware costs, raise the rate of hardware usage, and conserve energy, which is important for building green and sustainable geospatial services and applications. © 2015 by the authors.
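The abstract does not detail the elastic scaling algorithm itself; a minimal sketch of one plausible policy (scaling the worker-node count with the request backlog, clamped to cluster limits) might look like the following, with all names and parameters hypothetical:

```python
import math

def target_nodes(queued_jobs, jobs_per_node, min_nodes=1, max_nodes=8):
    """Number of worker nodes to run for the current request backlog:
    a simple proportional scale-out/scale-in rule, clamped to cluster limits."""
    needed = math.ceil(queued_jobs / jobs_per_node) if queued_jobs else min_nodes
    return max(min_nodes, min(max_nodes, needed))
```

A controller would periodically evaluate such a rule and acquire or release cloud instances accordingly, which is how elasticity reduces idle resource usage.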
Li Z.,George Mason University |
Yang C.,George Mason University |
Jin B.,George Mason University |
Jin B.,Yunnan Provincial Geomatics Center |
And 5 more authors.
PLoS ONE | Year: 2015
Geoscience observations and model simulations are generating vast amounts of multidimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amounts of data is both computing- and data-intensive, and the analytics require complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. The framework leverages cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers; a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data; and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this framework significantly improves the efficiency of big geoscience data analytics by reducing data processing time and simplifying analytical procedures for geoscientists. © 2015 Li et al.
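The MapReduce pattern the framework builds on can be illustrated with a toy map/reduce pass, here averaging observations per grid cell. This is a generic sketch in plain Python, not the paper's HBase/Hadoop implementation, and the record layout is invented for illustration:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (grid_cell, value) pairs from raw observations."""
    for cell_id, value in records:
        yield cell_id, value

def reduce_phase(pairs):
    """Reduce: group emitted pairs by key and average each group."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

records = [("cell-1", 10.0), ("cell-1", 20.0), ("cell-2", 5.0)]
means = reduce_phase(map_phase(records))
```

In a real deployment the map and reduce phases run in parallel across the cluster nodes where the data are stored, which is the source of the speedup the paper reports.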
Gui Z.,Hubei Engineering University |
Gui Z.,George Mason University |
Yu M.,George Mason University |
Yang C.,George Mason University |
And 9 more authors.
PLoS ONE | Year: 2016
Dust storms have serious, disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity, and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high-performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating the subdomains to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communication among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost of each computing node to minimize the total execution time and reduce the overall communication cost of the entire simulation. This research introduces three algorithms that optimize the allocation under spatial and communication constraints: 1) an Integer Linear Programming (ILP)-based algorithm from a combinatorial optimization perspective; 2) a combined K-Means and Kernighan-Lin heuristic algorithm (K&K) that integrates geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared with respect to different factors. Further, we adopt the K&K algorithm for an experiment of dust model simulation with the non-hydrostatic mesoscale model (NMMdust) and compare its performance with the MPI default sequential allocation. The results demonstrate that the K&K method significantly improves the simulation performance through better subdomain allocation.
This method can also be adopted for other relevant atmospheric and numerical modeling. © 2016 Gui et al.
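The geometric stage of the K&K algorithm can be sketched as a plain K-Means grouping of subdomain cells into spatially compact clusters. This sketch covers only that K-Means stage; the Kernighan-Lin refinement and the computing/communication cost model are omitted, and all names and parameters are illustrative:

```python
import random

def kmeans_partition(cells, k, iters=20, seed=0):
    """Group subdomain cells (x, y) into k spatially compact clusters,
    so each cluster can be assigned to one computing node."""
    rng = random.Random(seed)
    centers = rng.sample(cells, k)
    for _ in range(iters):
        # assign each cell to its nearest cluster center
        labels = [min(range(k),
                      key=lambda c: (x - centers[c][0])**2 + (y - centers[c][1])**2)
                  for x, y in cells]
        # move each center to the mean of its assigned cells
        for c in range(k):
            members = [cells[i] for i, lab in enumerate(labels) if lab == c]
            if members:
                centers[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels
```

Spatially compact clusters keep most neighboring subdomains on the same node, which reduces the inter-node communication that the paper identifies as a bottleneck.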
Liao L.,Wuhan University |
Li P.,Wuhan University |
Yang J.,Wuhan University |
Chang H.,Yunnan Provincial Geomatics Center
Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University | Year: 2015
Current SAR polarimetric calibration algorithms assume that the distributed targets used satisfy reciprocity, but not all distributed targets satisfy the reciprocity principle. To solve this problem, this paper presents a calibration method based on reciprocity judgment of distributed targets. According to the angle between the polarization scattering matrix and the reciprocal matrix of each pixel, pixels with small angles are selected as the initial reciprocity calibration samples. Then, based on target coherence, some of these pixels are selected as the final calibration samples. Calibration of CECT-38 X-band fully polarimetric data was completed by combining a typical calibration algorithm with distributed targets selected by the proposed method. Experiments show that the method can be well applied to distributed-target SAR polarimetric calibration.
Shi-Hua L.,Yunnan Normal University |
Shi-Hua L.,Yunnan Provincial Geomatics Center |
Bao-Xuan J.,Yunnan Provincial Geomatics Center |
Jun-Song Z.,Yunnan Provincial Geomatics Center |
And 4 more authors.
International Journal of Earth Sciences and Engineering | Year: 2016
Changes in lake area can reflect changes in the regional environment and climate, and studying them is of great significance for regional climate and sustainable development research. Based on RS and GIS technology, the Normalized Difference Water Index (NDWI) was used to extract the boundary of Fuxian Lake from multi-temporal remote sensing images (MSS, TM, ETM+ and OLI) covering 1974 to 2014, and available ancillary geographic data, field survey data and related research literature were combined to verify the results. The area and volume of Fuxian Lake were calculated for 14 dates, and the spatiotemporal change trend of the lake was analyzed. Meanwhile, using the relevant meteorological data for the Fuxian Lake watershed from 1974 to 2014 and a multiple linear regression model, the main driving factors of the spatiotemporal change trend were preliminarily discussed. The results show that: (1) since 1974, the area and volume of Fuxian Lake have shown an obvious overall downward trend; (2) climate change significantly affected the area and volume of Fuxian Lake from 1974 to 2014, and among the meteorological factors the main drivers are annual precipitation, annual mean ground temperature, annual ground evaporation and surface temperature. © 2016 CAFET-INNOVA TECHNICAL SOCIETY. All rights reserved.
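The NDWI water-extraction step can be sketched with McFeeters' standard formulation, NDWI = (Green - NIR) / (Green + NIR), thresholded at zero; this is a minimal sketch, and per-sensor band handling for MSS/TM/ETM+/OLI is omitted:

```python
import numpy as np

def ndwi(green, nir):
    """McFeeters' NDWI: water pixels tend toward positive values."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-12)   # epsilon avoids division by zero

def water_mask(green, nir, threshold=0.0):
    """Binary water/land mask from an NDWI threshold."""
    return ndwi(green, nir) > threshold
```

Counting the water pixels in such a mask and multiplying by the pixel footprint gives the lake area for each acquisition date, which is how a multi-temporal area series like the one above is built.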