Tanerguclu T., Turkish Academy of Sciences |
Maras H., General Command of Mapping |
Gencer C., Gazi University |
Aygunes H., Cankaya University
Information Systems Frontiers | Year: 2012
In this study, a decision support system (DSS) based on the interactive use of location models and geographical information systems (GIS) was developed to determine the optimal positions for air defence weapons and radars. In the location model, the fire units are treated as the facilities to be located and the possible approach routes of air vehicles as demand points. Accounting for the probability that fire from the units will miss its targets, the objective is to determine the positions at which the weapons cover the maximum number of approach routes, while respecting the military principles governing the tactical use and deployment of units. Compared with the conventional method, the proposed methodology yields a more reliable, faster, and more efficient solution. Moreover, with the DSS, a battery commander responsible for air defence can determine, among the alternative positions he has identified, the optimal weapon and radar positions that maximally cover the possible approach routes. He can also make such decisions very quickly without visiting the terrain to be defended, and hence without exposure to enemy threats. In the decision support system, the digital elevation model is analysed using MapObjects 2.0, the mathematical model is solved using LINGO 4.0 optimization software, and the user interface and data transfer are supported by Visual Basic 6.0. © Springer Science+Business Media, LLC 2012.
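As a toy illustration of the kind of location model described above, the sketch below enumerates small subsets of candidate weapon positions and scores them by expected route coverage, where each position covers a route with a given hit probability (one minus the miss probability). All positions, routes, and probabilities are invented for illustration; the study's actual model and military deployment constraints are not reproduced here.

```python
from itertools import combinations

# Hypothetical candidate positions and the approach routes (demand points)
# each one can cover, with a hit probability per route. Illustrative only.
coverage = {
    "pos_A": {"route_1": 0.8, "route_2": 0.6},
    "pos_B": {"route_2": 0.9, "route_3": 0.7},
    "pos_C": {"route_1": 0.5, "route_3": 0.9, "route_4": 0.6},
}

def covered_value(positions):
    """Expected coverage: for each route, the probability that at least
    one selected fire unit hits, i.e. 1 - product of miss probabilities."""
    routes = {r for p in positions for r in coverage[p]}
    total = 0.0
    for r in routes:
        miss = 1.0
        for p in positions:
            miss *= 1.0 - coverage[p].get(r, 0.0)
        total += 1.0 - miss
    return total

def best_sites(k):
    """Enumerate all k-subsets of candidate positions (feasible only for
    small instances; the study solves the real model with LINGO)."""
    return max(combinations(coverage, k), key=covered_value)

print(best_sites(2))
```

For realistic instance sizes this brute-force enumeration would be replaced by an integer-programming formulation, as the DSS does via its optimization solver.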
Aktug B., General Command of Mapping |
Kaypak B., Ankara University |
Celik R.N., Technical University of Istanbul
Journal of Seismology | Year: 2010
The 3 February 2002 Çay earthquake (Mw ~6.7) occurred on the fault segment between Eber and Akşehir Lakes and was followed by a large aftershock (Mw ~5.6) near the western end of the fault and two sequential aftershocks. We computed the coseismic surface displacements from static GPS measurements to determine the fault geometry parameters and uniform slip components. The coseismic displacements were obtained by combining the regional pre-earthquake and post-earthquake GPS data. Fault geometry and slip were obtained through inversion of the GPS data, modelling the events as elastic dislocations in a half-space and assuming all four events took place on the same fault plane. The results suggest that a single-segment fault ~33 km long and dipping ~43° northward suffices to model the dislocation, assuming a uniform slip distribution with 0.51 m of dip slip and 0.26 m of left-lateral slip extending down to a depth of ~11.5 km, which is consistent with the seismological evidence. The results also confirm normal faulting on the eastern flank of the Isparta Angle, which has long been assumed to be a thrusting structure. While the available data cannot resolve the four individual events that occurred on the same day, an attempted distributed slip model differentiates dip slip and left-lateral slip near the hypocenter, with maximum values of ~1 m and 0.6 m, respectively. © 2009 Springer Science+Business Media B.V.
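A minimal sketch of the first step described above: the static coseismic offset at a GPS site is simply the post-event minus the pre-event position. The site name and coordinates below are invented, and the full elastic-dislocation inversion is not attempted here.

```python
import math

# Hypothetical pre- and post-earthquake site coordinates in a local
# east-north-up frame (metres); values are illustrative only.
pre  = {"SITE1": (100.000, 200.000, 50.000)}
post = {"SITE1": (100.120, 199.910, 49.980)}

def coseismic_displacement(site):
    """Static offset = post-event minus pre-event position; returns the
    ENU displacement vector and its magnitude in metres."""
    dx = [b - a for a, b in zip(pre[site], post[site])]
    return dx, math.hypot(*dx)

vec, mag = coseismic_displacement("SITE1")
```

In practice the pre- and post-event coordinates would each come from a combined regional GPS solution, and the resulting offsets would feed the half-space dislocation inversion.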
Sahin H., General Command of Mapping |
Kulur S., Technical University of Istanbul
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives | Year: 2012
Thanks to the nature of graphics processing, newly released products offer highly parallel processing units with high memory bandwidth and computational power exceeding a teraflop. Modern GPUs are not only powerful graphics engines but also highly parallel programmable processors, with far higher computational throughput and memory bandwidth than central processing units (CPUs). Data-parallel computation can be described briefly as mapping data elements to parallel processing threads. The rapid development of GPU programmability and capability has attracted the attention of researchers dealing with complex problems that require heavy computation. This interest has given rise to the concepts of "General Purpose Computation on Graphics Processing Units (GPGPU)" and "stream processing". Graphics processors are powerful yet inexpensive and widely available hardware, and have therefore become an alternative to conventional processors. Graphics chips, once fixed-function hardware, have been transformed into modern, powerful, and programmable processors to meet broader needs. Particularly in recent years, the use of graphics processing units for general-purpose computation has drawn researchers and developers in this direction. The biggest problem is that graphics processing units use programming models unlike current programming methods. Efficient GPU programming therefore requires re-coding the existing algorithm with the limitations and structure of the graphics hardware in mind. These many-core processors cannot be programmed with traditional methods; in particular, event-driven programming cannot be used for them. GPUs are especially effective when the same computing steps are repeated over many data elements and high accuracy is needed.
This makes the computation both faster and more accurate. Compared with GPUs, CPUs, which execute one computation at a time according to the flow control, are slower. This capability can be exploited in various applications of computer technology. This study covers how the general-purpose parallel programming and computational power of GPUs can be used in photogrammetric applications, especially direct georeferencing. The direct georeferencing algorithm was coded using the GPGPU method and the CUDA (Compute Unified Device Architecture) programming language, and the results were compared with a traditional CPU implementation. In a second application, projective rectification was coded using the GPGPU method and CUDA, and sample images of various sizes were processed and the results evaluated. The GPGPU method is particularly useful for repeating the same computations over highly dense data, thus reaching the solution quickly. © 2012 ISPRS.
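To illustrate the data-parallel pattern the abstract describes, the sketch below applies a projective rectification homography to each output pixel coordinate independently; because no pixel depends on another, each evaluation would map to one CUDA thread on a GPU. The homography matrix is an arbitrary illustrative example, not from the paper, and the sketch is in plain Python rather than CUDA.

```python
# Projective rectification maps each pixel (u, v) through a 3x3
# homography H. Every pixel is independent -- exactly the data-parallel
# pattern a CUDA kernel exploits (one thread per pixel).
# H is an arbitrary illustrative matrix, not from the paper.
H = [[1.0, 0.1, 5.0],
     [0.0, 1.0, -3.0],
     [0.0, 0.001, 1.0]]

def warp_pixel(uv):
    """Per-pixel 'kernel': homogeneous transform then perspective divide."""
    u, v = uv
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

# On a CPU this is a serial loop; on a GPU each call becomes a thread.
pixels = [(u, v) for v in range(2) for u in range(3)]
warped = list(map(warp_pixel, pixels))
```

The CUDA version of this kernel would compute `u` and `v` from the thread and block indices instead of iterating, which is the re-coding step the abstract refers to.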
Sahin I., General Command of Mapping
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives | Year: 2013
The increase in the number of satellites and the use of digital cameras in aerial photography have spread the use of satellite images and oriented aerial photographs as near-real-time, accessible, and cost-effective spatial data. Co-registered images and aerial photos, corrected for height variations and orthogonality (scale), have become an essential input for geographical information systems and spatial decision making through their integration with other spatial data. Beyond that, with web-based access and query facilities, images and photographs form an infrastructure for other spatial information. Although the problem of aerial photo ortho-rectification was solved long ago, problems related to the storage, management, and processing of huge numbers of photos and images, and to user access, have emerged. These issues concern numerous private and governmental institutions. Some governmental organizations and private companies have recently gained the technical ability to perform this work, which has led to a significant increase in the amount of aerial photography taken and processed per year for the whole country. The General Command of Mapping has been using a digital aerial camera for photography since 2008. The total area covered by the satellite images purchased for various purposes and the aerial photographs taken for revision purposes or at the request of governmental and private institutions has reached 200,000 km2. It is estimated that coloured, high-resolution orthophotos of the whole country can be achieved within four years, provided that annual production continues at the same rate. From the numbers above, it is clear that the orthophoto production procedure must be improved so that orthophotos can be produced in the same year as the photography.
Studies on the storage, management, and presentation of these huge numbers of orthophoto images to users must be started immediately. In this study, metadata components of the produced orthophotos, compatible with international standards, have been defined; a relational database has been created to keep complete and accurate metadata; and a user interface has been developed to insert the metadata into the database. The developed software saves considerable time in creating and querying the metadata.
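A minimal sketch of the kind of relational metadata store described above, using SQLite; the table and field names are illustrative (loosely ISO 19115-flavoured), not the schema actually implemented in the study.

```python
import sqlite3

# Illustrative orthophoto metadata table; field names are assumptions,
# not the study's actual schema.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE orthophoto_metadata (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        acquisition_date TEXT,     -- ISO 8601 date
        gsd_cm REAL,               -- ground sample distance, cm
        crs TEXT                   -- coordinate reference system code
    )""")
con.execute(
    "INSERT INTO orthophoto_metadata (title, acquisition_date, gsd_cm, crs) "
    "VALUES (?, ?, ?, ?)",
    ("Sheet_X_orthophoto", "2012-06-01", 30.0, "EPSG:5254"),
)
# A metadata query like the user interface might issue: find all
# orthophotos at 45 cm GSD or finer.
rows = con.execute(
    "SELECT title, gsd_cm FROM orthophoto_metadata WHERE gsd_cm <= 45"
).fetchall()
```

A production system would add the full standard metadata element set and indexes on the commonly queried fields; the point here is only the insert-then-query workflow the interface automates.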
Kayi A., General Command of Mapping |
Yilmaz A., General Command of Mapping
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives | Year: 2014
An earthquake struck Van on 23 October 2011 at 13:41 local time. The local magnitude, moment magnitude, and depth of the earthquake were Ml 6.7, Mw 7.0, and 19.07 km, respectively. The Van city centre and its surrounding villages were affected by this destructive earthquake; many buildings were ruined and approximately 600 people died. The acquisition and use of geospatial data are crucial for managing such natural disasters. In this paper, the role of national and international geospatial data in the management of the Van earthquake is investigated. Through international collaboration with the Charter, pre- and post-earthquake satellite images were acquired within 24 hours of the earthquake. The General Command of Mapping (GCM), the national mapping agency of Turkey, also produced high-resolution multispectral orthophotos of the region. The Charter delivered its orthophotos over 26-28 October 2011. Reacting quickly after the earthquake, GCM planned flights over the 1296 km2 disaster area to acquire aerial photos. The aerial photos were acquired on 24 October 2011 (one day after the earthquake) with an UltraCamX large-format digital aerial camera: 152 images were taken at 30 cm ground sample distance (GSD) with 30% sidelap and 60% overlap. On the evening of the same flight day, orthophotos were produced without ground control points by direct georeferencing, and GCM supplied them to the disaster management authorities. Archive orthophotos at 45 cm GSD, acquired in 2010, were also used as a reference to assess the effects of the disaster.
Torun A., General Command of Mapping |
Boyaci D., General Command of Mapping
International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives | Year: 2012
The immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date, and complete data sets within a sustainable framework. Data quality has been studied and researched for more than two decades; there are countless references on its semantics, its conceptual and logical representations, and many applications to spatial databases and GIS. However, there is a gap between research and practice in spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to both academia and industry, but usually in different contexts. Research on spatial data quality has identified several practically useful issues, such as descriptive information, metadata, fulfilment of spatial relationships among data, integrity measures, and geometric constraints. Industry and data producers realize these in three stages: pre-, co-, and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, modelling, and data dictionary and schema creation. The co-data-capturing stage covers general rules of spatial relationships; data- and model-specific rules such as topologic and model-building relationships and geometric thresholds; data extraction guidelines; and the object-object, object-belonging-class, object-non-belonging-class, and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checking compliance with the general and specific rules. Vector data quality criteria differ between producers and users, but they are generally driven by the needs, expectations, and feedback of the users. This paper presents a practical method that closes the gap between theory and practice.
Turning spatial data quality concepts into applications requires the conceptual, logical, and, most importantly, physical existence of the data model, rules, and knowledge of their realization as geospatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geospatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. We first introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers on both data capturing and QC, and finally QA to fulfil user needs. We then introduce our practical new approach, which divides quality into three phases. Finally, we discuss the implementation of our approach to realize the metrics, measures, and thresholds of the quality definitions. In particular, geometric and semantic quality and the quality control procedures that can be performed by producers are discussed. Some applicable best practices that we have experienced in quality control techniques and in defining the regulations that set out the objectives and data production procedures are given in the final remarks. These quality control procedures should include visual checks of the source data, captured vector data, and printouts; automatic checks performed by software; and semi-automatic checks involving quality control personnel. Together, they should ensure the geometric, semantic, attribution, and metadata quality of the vector data. © 2012 ISPRS.
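As a toy illustration of an automatic post-data-capturing QC pass, the sketch below checks two of the rule types mentioned above: attribute completeness and a geometric threshold. The rules, feature codes, and threshold values are invented and do not reflect the actual MGCP/DFDD specification.

```python
import math

# Toy QC pass: flag features that violate an attribute completeness rule
# or a minimum vertex-spacing threshold. Rules and data are illustrative.
MIN_SEGMENT_M = 1.0
REQUIRED_ATTRS = {"fcode", "name"}

features = [
    {"fcode": "AP030", "name": "road_1",
     "geometry": [(0, 0), (10, 0), (10.4, 0)]},   # short final segment
    {"fcode": "AP030",                             # missing 'name'
     "geometry": [(0, 0), (5, 5)]},
]

def qc_report(feature):
    """Return a list of rule violations for one vector feature."""
    errors = []
    missing = REQUIRED_ATTRS - feature.keys()
    if missing:
        errors.append(f"missing attributes: {sorted(missing)}")
    pts = feature["geometry"]
    for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
        if math.dist((x1, y1), (x2, y2)) < MIN_SEGMENT_M:
            errors.append(f"segment below {MIN_SEGMENT_M} m threshold")
    return errors

reports = [qc_report(f) for f in features]
```

A real benchmark suite would add topology rules (e.g. no self-intersections, class-class relationships) on top of such per-feature checks, with the semi-automatic cases routed to QC personnel.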
Yildiz H., General Command of Mapping
Studia Geophysica et Geodaetica | Year: 2012
The Gravity field and steady-state Ocean Circulation Explorer (GOCE) is the first satellite mission to observe gravity gradients from space, primarily for the determination of high-precision global gravity field models. However, the GOCE gradients, with their dense data distribution, may provide better predictions of the regional gravity field than those obtained using a spherical harmonic Earth Geopotential Model (EGM). This is investigated in the Auvergne test area using Least Squares Collocation (LSC) with GOCE vertical gravity gradient anomalies (Tzz), removing the long-wavelength part from EGM2008 and the short-wavelength part by residual terrain modelling (RTM). The results show that terrain effects on the vertical gravity gradient are significant at satellite altitude, reaching 0.11 Eötvös units (E.U.) in mountainous areas. Removing the RTM effects from GOCE Tzz leads to significant improvements in the LSC predictions of surface gravity anomalies and quasi-geoid heights. Comparison with ground-truth data shows that, using LSC, surface free-air gravity anomalies and quasi-geoid heights are recovered from GOCE Tzz with standard deviations of 11 mGal and 18 cm respectively, better than those obtained using GOCE EGMs, demonstrating that information beyond the maximal degree of the GOCE EGMs is present. Investigation of covariance functions created separately from GOCE Tzz and from terrestrial free-air gravity anomalies suggests that both give almost identical predictions. However, with the covariance function obtained from GOCE Tzz, the predicted formal average error estimates are considerably larger than the standard deviations of the predicted-minus-observed gravity anomalies. Therefore, if error estimates are needed, GOCE Tzz should be used with caution to determine covariance functions in areas where surface gravity anomalies are not available.
© 2012 Institute of Geophysics of the ASCR, v.v.i.
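The LSC prediction step used above can be written as s_hat = C_sx (C_xx + D)^-1 x, where C_xx is the signal covariance between observations, D the noise covariance, and C_sx the covariance between the prediction point and the observations. The sketch below shows the mechanics on a two-observation case; all covariance values and observations are made-up numbers.

```python
# Minimal Least Squares Collocation sketch for two observations.
# All numbers are illustrative, not real gravity data.
C_xx = [[4.0, 1.0],
        [1.0, 4.0]]   # signal covariance between the two observations
D    = [[0.5, 0.0],
        [0.0, 0.5]]   # observation noise covariance
C_sx = [2.0, 1.5]     # covariance of prediction point vs observations
x    = [10.0, 6.0]    # observed anomalies

def lsc_predict(C_xx, D, C_sx, x):
    """s_hat = C_sx (C_xx + D)^-1 x, with the 2x2 inverse in closed form."""
    a, b = C_xx[0][0] + D[0][0], C_xx[0][1] + D[0][1]
    c, d = C_xx[1][0] + D[1][0], C_xx[1][1] + D[1][1]
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    w = [inv[0][0] * x[0] + inv[0][1] * x[1],
         inv[1][0] * x[0] + inv[1][1] * x[1]]
    return C_sx[0] * w[0] + C_sx[1] * w[1]

pred = lsc_predict(C_xx, D, C_sx, x)
```

In the study the covariances come from an empirical covariance function (fitted to GOCE Tzz or to terrestrial anomalies), and the matrices are far larger; the algebra is the same.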
Lenk O., General Command of Mapping
Journal of Geodynamics | Year: 2013
In recent years, the Gravity Recovery and Climate Experiment (GRACE) has provided a new tool to study terrestrial water storage (TWS) variations at medium and large spatial scales, providing quantitative measures of TWS change. Linear trends in TWS variations in Turkey were estimated using GRACE observations for the period March 2003 to March 2009. GRACE showed a significant decrease in TWS in the southern part of the central Anatolian region, at a rate of up to 4 cm/year. The Global Land Data Assimilation System (GLDAS) model also captured this TWS decrease, but with underestimated trend values. The GLDAS model represents only part of the total TWS variation, namely the sum of soil moisture (2 m column depth) and snow water equivalent, ignoring groundwater variations. Therefore, GLDAS-derived TWS variations were subtracted from GRACE-derived TWS variations to estimate groundwater storage variations. The results revealed that the decreasing TWS trends observed by GRACE in the southern part of central Anatolia are largely explained by decreasing groundwater trends, which were confirmed by the limited well groundwater-level data available in the region. © 2012 Elsevier Ltd.
Aktug B., Bogazici University |
Parmaksiz E., General Command of Mapping |
Kurt M., General Command of Mapping |
Lenk O., General Command of Mapping |
and 3 more authors.
Journal of Geodynamics | Year: 2013
Central Anatolia plays a key role in connecting theories about the subduction of the African Plate along the Hellenic and Cyprian Arcs and the collision of the Arabian indenter along the Bitlis-Zagros Thrust Zone. Lying between the North Anatolian and East Anatolian megashear zones, the neotectonics of the seismically less active Central Anatolia is often described in terms of tectonic escape or extrusion tectonics. Although the available GPS studies dating back to the early 1990s reported coherent rotation, they mostly focused on the seismically more active and more populated Western Anatolia and lack sufficient spatial resolution to quantify second-order structures such as the Tuz Gölü Fault Zone, the Central Anatolia Fault Zone (comprising the Ecemiş and Erciyes Faults), the Ezinepazari Fault, and their related basins and associated processes. Moreover, the new dense GPS velocity field of Central Anatolia exhibits systematic local patterns of internal deformation that are inconsistent with either coherent rotation or translation. The velocity gradients computed along the rotation profiles of Central Anatolia show smooth, roughly westward increments which cannot be explained by a simple rotation/transport of the Central Anatolia Basin. Furthermore, estimating and removing an Euler rigid-body rotation rate computed from the sites in the middle part of Central Anatolia reduces the velocity discrepancies between the eastern and western parts of Central Anatolia to a few millimetres but leaves systematic residuals. Upon completion of the Turkish National Fundamental GPS Network (TNFGN) in 1999, early revision surveys were carried out in the Marmara region because of the 1999 Marmara earthquakes.
Additional observations were carried out in Central Anatolia, resulting in a velocity field of unprecedented spatial density with an average inter-station distance of 30-50 km. We computed the horizontal velocity field with respect to a no-net-rotation frame, to Eurasia, and to a computed Anatolia Euler pole. Two distinct models of Anatolian neotectonics, microplate and continuum deformation, were tested through rigid-body Euler rotations, block modelling, and strain analysis. Decomposing the Eurasia-fixed velocity field into rigid rotations and residuals reveals systematic residuals of up to 5 mm/yr with respect to a computed best-fit Euler pole located at 31.682°N ± 0.05, 31.613°E ± 0.02, with a rotation rate of 1.380°/Myr ± 0.01. The relative velocities computed along rotation paths exhibit westward-increasing linear gradients of 0.7-1.3 mm per 100 km, depending on the latitude, which is mechanically inconsistent with the assumptions of coherent transport or rigid rotation due to an extrusion in the east. Moreover, the strain analysis shows E-W extension rates of up to 100 nanostrain/yr along approximately N-S striking faults in the region from west of Karliova to the Isparta Angle, another indication of partitioned extensional strain across Central Anatolia. Compressional strains were also obtained near the eastern branch of the Isparta Angle, Tuz Gölü, and southern Anatolia. In this study, we provide new quantitative evidence that the deformation in Central Anatolia is not uniform and is possibly driven by extension through slab pull and/or suction in the west-southwest and compression in the south, rather than by a simple coherent rotation and/or translation/transport of Anatolia driven by an extrusion process in the east.
We also propose that the tectonics of Central Anatolia comprises a dominant tensional driving force along the Hellenic Arc in the southwest and a restraining belt along the Cyprian Arc in the south. © 2012 Elsevier Ltd.
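The rigid-rotation prediction that the analysis tests against can be sketched as v = ω × r. The sketch below uses the best-fit pole quoted above (31.682°N, 31.613°E, 1.38°/Myr) but an illustrative site location and a spherical-Earth simplification; the study's actual decomposition works with geodetic coordinates and full covariances.

```python
import math

# Predicted rigid-body velocity at a site from an Euler pole, v = omega x r.
# Pole values follow the best-fit pole quoted in the abstract; the site
# location and spherical Earth are illustrative simplifications.
R = 6.371e6  # mean Earth radius, m

def unit_vec(lat_deg, lon_deg):
    """Unit position vector in Earth-centred Cartesian coordinates."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def euler_velocity(pole_lat, pole_lon, rate_deg_per_myr, site_lat, site_lon):
    """Return the predicted velocity vector in mm/yr."""
    w = math.radians(rate_deg_per_myr) / 1e6            # rad/yr
    px, py, pz = (w * c for c in unit_vec(pole_lat, pole_lon))
    rx, ry, rz = (R * c for c in unit_vec(site_lat, site_lon))
    # cross product omega x r; metres/yr converted to mm/yr
    return ((py * rz - pz * ry) * 1000,
            (pz * rx - px * rz) * 1000,
            (px * ry - py * rx) * 1000)

v = euler_velocity(31.682, 31.613, 1.38, 39.0, 33.0)    # illustrative site
speed = math.sqrt(sum(c * c for c in v))                # mm/yr
```

Subtracting such predicted velocities from the observed field is the decomposition step whose residuals (up to 5 mm/yr) argue against a purely rigid rotation of Central Anatolia.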
Yildiz H., General Command of Mapping |
Andersen O.B., Technical University of Denmark |
Simav M., General Command of Mapping |
Aktug B., Bogazici University |
Ozdemir S., General Command of Mapping
Advances in Space Research | Year: 2013
The differences between coastal altimetry and tide gauge sea level time series between March 1993 and December 2009 are used to estimate rates of vertical land motion (VLM) at three tide gauge locations along the southwestern coasts of Turkey. The CTOH/LEGOS along-track coastal altimetry retrieves altimetric sea level anomalies closer to the coast than the standard along-track altimetry products. However, using altimetry very close to the coast is not found to improve the results. On the contrary, the gridded and interpolated AVISO merged product exhibits the best agreement with the tide gauge data, as it provides the smoothest variability in both space and time compared with along-track altimetry data. The Antalya gauge to the south (in the Mediterranean Sea) and the Mentes/Izmir gauge to the west (in the Aegean Sea) both show subsidence, while the Bodrum tide gauge (in the Aegean Sea) shows no significant vertical land motion. The results are compared and assessed against three independent geophysical estimates of vertical land motion, such as those from GPS; the glacial isostatic adjustment (GIA) effect in the region is negligible. The VLM estimates from altimetry and tide gauge data agree well both with GPS-derived vertical velocity estimates and with those inferred from geological and archaeological investigations. © 2012 COSPAR. Published by Elsevier Ltd. All rights reserved.
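The altimetry-minus-tide-gauge principle described above can be sketched as follows: altimetry senses absolute sea level, while a tide gauge senses sea level relative to the land, so the trend of their difference estimates vertical land motion (negative values indicating subsidence). The series below are synthetic annual means, not the CTOH/LEGOS or AVISO data.

```python
# Vertical land motion from the trend of (altimetry - tide gauge).
# Synthetic annual-mean series in cm; a real analysis would use monthly
# series with the seasonal cycle removed.
def trend(t, y):
    """Ordinary least-squares slope of y against t."""
    tm = sum(t) / len(t)
    ym = sum(y) / len(y)
    num = sum((a - tm) * (b - ym) for a, b in zip(t, y))
    den = sum((a - tm) ** 2 for a in t)
    return num / den

years      = list(range(1993, 2003))
altimetry  = [0.0, 0.3, 0.6, 0.9, 1.2, 1.5, 1.8, 2.1, 2.4, 2.7]  # sea level rise
tide_gauge = [0.0, 0.8, 1.6, 2.4, 3.2, 4.0, 4.8, 5.6, 6.4, 7.2]  # rise + subsidence

# The gauge records a faster apparent rise because the land is sinking.
vlm = trend(years, [a - g for a, g in zip(altimetry, tide_gauge)])  # cm/yr
```

Here the synthetic gauge rises 0.5 cm/yr faster than the altimetry, so the method recovers a VLM of -0.5 cm/yr, i.e. subsidence.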