Darmstadt, Germany

Walther M., AGT Group R&D GmbH
CINTI 2012 - 13th IEEE International Symposium on Computational Intelligence and Informatics, Proceedings | Year: 2012

Product information search has become one of the most important application areas of the Web. Especially for pricey technical products, consumers tend to carry out intensive research prior to an actual purchase. However, the vast amount of available data about such products and their various representations may easily overwhelm potential customers. In this paper, we develop a comprehensive technique for extracting product specifications about arbitrary technical products from web pages in a largely unsupervised manner. The technique is based on a clustering approach that uses structural and visual features of web page elements. The resulting detailed information sets allow a potential consumer to compare products effectively while saving the manual extraction work. © 2012 IEEE.
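
The core idea of clustering page elements by structural and visual features can be illustrated with a toy sketch. This is not the paper's algorithm; the element tuples, the (tag, font-size) feature choice, and the "largest cluster = spec table" heuristic are illustrative assumptions.

```python
# Hypothetical sketch: group page elements by simple structural/visual
# features (tag name, font size) so that specification cells, which share
# the same markup, fall into one homogeneous cluster.
elements = [
    ("td", 12, "Display: 5.1 in"),
    ("td", 12, "RAM: 4 GB"),
    ("h1", 24, "Acme Phone X"),
    ("td", 12, "Battery: 3000 mAh"),
    ("p", 14, "Great phone, buy now!"),
]

clusters = {}
for tag, size, text in elements:
    clusters.setdefault((tag, size), []).append(text)

# Heuristic: the largest homogeneous cluster holds the spec entries
spec_cluster = max(clusters.values(), key=len)
print(spec_cluster)   # → ['Display: 5.1 in', 'RAM: 4 GB', 'Battery: 3000 mAh']
```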

Ros S.P., University of Murcia | Lischka M., AGT Group R&D GmbH | Marmol F.G., NEC Europe Ltd.
Proceedings of ACM Symposium on Access Control Models and Technologies, SACMAT | Year: 2012

The amount of private information on the Internet is constantly increasing with the explosive growth of cloud computing and social networks. XACML is one of the most important standards for specifying access control policies for web services. As the number of XACML policies grows rapidly, evaluation time increases accordingly. The XEngine approach rearranges the matching tree according to the attributes used in the target sections, but for speed reasons it supports only equality of attribute values. For fast termination, it transforms the combining algorithms into a first-applicable policy, which does not handle obligations correctly. Our approach supports all comparison functions defined in XACML as well as obligations. In this paper, we propose an optimization of XACML policy evaluation based on two tree structures. The first, called the Matching Tree, enables fast searching for applicable rules. The second, called the Combining Tree, is used for the evaluation of the applicable rules. Finally, we propose an exploration method for the Matching Tree based on the binary search algorithm. The experimental results show that our approach is orders of magnitude faster than Sun PDP. Copyright 2012 ACM.
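
The binary-search idea behind the Matching Tree can be sketched in miniature. This is not the paper's data structure: it handles only a single numeric attribute with equality targets, and the rule names are invented for illustration.

```python
import bisect

# Toy sketch: rules indexed by one numeric attribute value. Sorting the
# target values lets us find every applicable rule with binary search
# instead of a linear scan over all rules.
class MatchingTree:
    def __init__(self, rules):
        # rules: iterable of (attribute_value, rule_id)
        self.rules = sorted(rules)
        self.keys = [v for v, _ in self.rules]

    def applicable(self, value):
        """Return the ids of all rules whose target equals `value`."""
        lo = bisect.bisect_left(self.keys, value)
        hi = bisect.bisect_right(self.keys, value)
        return [rid for _, rid in self.rules[lo:hi]]

tree = MatchingTree([(18, "deny-minors"), (18, "log-access"), (65, "senior-rate")])
print(tree.applicable(18))   # → ['deny-minors', 'log-access']
```

The applicable subset would then be handed to a separate combining structure (the paper's Combining Tree) for evaluation.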

Hahn J., TU Darmstadt | Debes C., AGT Group R&D GmbH | Leigsnering M., TU Darmstadt | Zoubir A.M., TU Darmstadt
Digital Signal Processing: A Review Journal | Year: 2014

Hyperspectral imaging (HSI) is an emerging technique, which provides the continuous acquisition of electromagnetic waves, usually covering the visible as well as the infrared light range. Many materials can be easily discriminated by means of their spectra, rendering HSI an interesting method for the reliable classification of contents in a scene. Due to the large amount of data generated by HSI, effective compression algorithms are required. The computational complexity as well as the potentially high number of sensors render HSI an expensive technology. It is thus of practical interest to reduce the number of required sensor elements as well as the computational complexity, either for cost or for energy reasons. In this paper, we present two different systems that acquire hyperspectral images with fewer samples than the actual number of pixels, i.e. in a low-dimensional representation. First, a design based on compressive sensing (CS) is explained. Second, adaptive direct sampling (ADS) is utilized to obtain coefficients of hyperspectral images in the 3D (Haar) wavelet domain, simplifying the reconstruction process significantly. Both approaches are compared with conventionally captured images with respect to image quality and classification accuracy. Our results based on real data show that in most cases only 40% of the samples suffice to obtain high quality images. Using ADS, the rate can be reduced even further. Further results confirm that, although the number of acquired samples is dramatically reduced, we can still obtain high classification rates. © 2013 Elsevier Inc.
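
The compressive sensing principle of recovering a signal from fewer measurements than pixels can be shown in a deliberately tiny example. This is not the paper's system: the sensing matrix, the length-4 signal, and the 1-sparse recovery rule (pick the column most correlated with the measurements) are all illustrative assumptions.

```python
# Toy sketch: recover a 1-sparse signal of length 4 from only 2 samples.
phi = [[1.0, 0.0, 1.0,  1.0],
       [0.0, 1.0, 1.0, -1.0]]        # hypothetical 2x4 sensing matrix
x = [0.0, 0.0, 3.0, 0.0]             # sparse "image": one active coefficient

y = [sum(p * v for p, v in zip(row, x)) for row in phi]   # y = Phi @ x

# 1-sparse recovery: the active column is the one most correlated with y,
# and its coefficient follows from a least-squares fit on that column.
corr = [abs(sum(phi[i][j] * y[i] for i in range(2))) for j in range(4)]
j_star = corr.index(max(corr))
coef = (sum(phi[i][j_star] * y[i] for i in range(2))
        / sum(phi[i][j_star] ** 2 for i in range(2)))
print(j_star, coef)   # → 2 3.0
```

Real CS reconstructions solve an L1-regularized problem over many coefficients; this sketch only shows why undersampling can still pin down a sparse signal.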

Fandos R., TU Darmstadt | Debes C., AGT Group R&D GmbH | Zoubir A.M., TU Darmstadt
Signal Processing | Year: 2013

We address two fundamental design issues of a classification system: the choice of the classifier and the dimensionality of the optimal feature subset. Resampling techniques are applied to estimate both the probability distribution of the misclassification rate (or any other figure of merit of a classifier) as a function of the feature set size, and the probability distribution of the optimal dimensionality given a classification system and a misclassification rate. The latter allows for the estimation of confidence intervals for the optimal feature set size. Based on the former, a quality assessment for the classifier performance is proposed. Traditionally, the comparison of classification systems is carried out for a fixed feature set. However, a different set may yield different results. The proposed method compares the classifiers independently of any pre-selected feature set. The algorithms are tested on 80 sets of synthetic examples and six standard databases of real data. The simulated data results are verified by an exhaustive search of the optimum and by two feature selection algorithms for the real data sets. © 2013 Elsevier B.V. All rights reserved.
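
The resampling idea, estimating the full distribution of a misclassification rate rather than a single number, can be sketched with a plain bootstrap. This is not the paper's procedure; the per-sample outcomes and the 12% error rate are invented for illustration.

```python
import random

random.seed(1)
# Hypothetical per-sample outcomes: 1 = misclassified, 0 = correct
outcomes = [1] * 12 + [0] * 88        # observed error rate: 0.12

def bootstrap_error_rates(outcomes, n_boot=2000):
    """Resample outcomes with replacement to estimate the
    distribution of the misclassification rate."""
    n = len(outcomes)
    return sorted(sum(random.choices(outcomes, k=n)) / n
                  for _ in range(n_boot))

rates = bootstrap_error_rates(outcomes)
# Percentile bootstrap confidence interval for the error rate
lo, hi = rates[int(0.025 * len(rates))], rates[int(0.975 * len(rates))]
```

The same mechanism, applied per feature set size, yields the confidence intervals for the optimal dimensionality described in the abstract.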

Mostafa A.A., TU Darmstadt | Debes C., AGT Group R&D GmbH | Zoubir A.M., TU Darmstadt
IEEE Transactions on Geoscience and Remote Sensing | Year: 2012

A scheme for target detection using segmentation by classification is proposed. The scheme is applied to through-the-wall microwave images obtained using frequency-domain back-projection in a wideband radar. We consider stationary targets, for which Doppler and change-detection-based techniques are inapplicable. The proposed scheme uses features from polarimetric images to segment and classify the image observations into target, clutter, and noise segments. We map target polarization signatures from co-polarized and cross-polarized target returns to a pixel-by-pixel feature space, then oversegment the image into homogeneous regions called superpixels based on this feature space. The features of each superpixel are subsequently used to group homogeneous superpixels into clusters. The clusters are then classified using decision trees. Real data collected using an indoor radar imaging scanner are used for performance validation. © 2012 IEEE.
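
The oversegmentation step, grouping adjacent pixels with similar feature values into homogeneous superpixels, can be illustrated in one dimension. This is a toy stand-in, not the paper's superpixel algorithm; the feature values and tolerance are assumptions.

```python
# Hypothetical 1-D illustration: merge adjacent pixels whose polarimetric
# feature values differ by at most `tol` into one "superpixel".
features = [0.1, 0.12, 0.11, 0.9, 0.88, 0.05, 0.07]

def oversegment(feats, tol=0.2):
    """Return lists of pixel indices forming homogeneous segments."""
    segments, current = [], [0]
    for i in range(1, len(feats)):
        if abs(feats[i] - feats[current[-1]]) <= tol:
            current.append(i)      # similar to its neighbour: same segment
        else:
            segments.append(current)
            current = [i]          # feature jump: start a new segment
    segments.append(current)
    return segments

print(oversegment(features))   # → [[0, 1, 2], [3, 4], [5, 6]]
```

In the paper, per-superpixel features feed a clustering stage and then a decision-tree classifier; this sketch only shows the homogeneity criterion.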

Braun J., TU Darmstadt | Horsch M., TU Darmstadt | Wiesmaier A., AGT Group R&D GmbH
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2012

Recent attacks on the German identity card show that a compromised client computer allows for PIN compromise and man-in-the-middle attacks on eID cards. We present a selection of new solutions to that problem which do not require changes in the card specification. All presented solutions protect against PIN compromise attacks, some of them additionally against man-in-the-middle attacks. © 2012 Springer-Verlag.

Debes C., AGT Group R&D GmbH | Zoubir A.M., TU Darmstadt | Amin M.G., Villanova University
IEEE Transactions on Geoscience and Remote Sensing | Year: 2012

We consider the problem of through-the-wall radar imaging (TWRI), in which polarimetric imaging is used for automatic target detection. Two generalized statistical detectors are proposed which perform joint detection and fusion of a set of multipolarization radar images. The first detector is an extension of a previously proposed iterative target detector for multiview TWRI. This extension allows the detector to automatically adapt to statistics that may vary, depending on target locations and electromagnetic-wave polarizations. The second detector is based on Bayes' test and is of interest when target pixel occupancies are known from, e.g., secondary data. Properties of the proposed detectors are delineated and demonstrated by real data measurements using wideband sum-and-delay beamforming, acquired in a semicontrolled lab environment. We examine the performance of the proposed detectors when imaging both metal objects and humans. © 2012 IEEE.
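
The Bayes-test flavor of the second detector can be sketched as a per-pixel posterior-odds comparison. This is not the paper's detector: the Gaussian target/noise models, prior, and parameter values are illustrative assumptions.

```python
import math

# Toy sketch of a Bayesian pixel test: declare "target" when the posterior
# odds under hypothetical Gaussian target vs. noise models exceed one.
def gauss_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def detect(pixel, p_target=0.3, mu_target=5.0, mu_noise=0.0, var=1.0):
    """Return True if the pixel is more likely target than noise."""
    num = p_target * gauss_pdf(pixel, mu_target, var)
    den = (1 - p_target) * gauss_pdf(pixel, mu_noise, var)
    return num > den

print(detect(4.2), detect(0.5))   # → True False
```

The detectors in the paper additionally fuse several polarization channels and adapt their statistics per image region; the odds comparison above is only the core decision rule.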

Deisenroth M.P., TU Darmstadt | Deisenroth M.P., University of Washington | Turner R.D., Winton Capital | Turner R.D., University of Cambridge | And 3 more authors.
IEEE Transactions on Automatic Control | Year: 2012

We propose a principled algorithm for robust Bayesian filtering and smoothing in nonlinear stochastic dynamic systems when both the transition function and the measurement function are described by non-parametric Gaussian process (GP) models. GPs are gaining increasing importance in signal processing, machine learning, robotics, and control for representing unknown system functions by posterior probability distributions. This modern way of system identification is more robust than finding point estimates of a parametric function representation. Our principled filtering/smoothing approach for GP dynamic systems is based on analytic moment matching in the context of the forward-backward algorithm. Our numerical evaluations demonstrate the robustness of the proposed approach in situations where other state-of-the-art Gaussian filters and smoothers can fail. © 2011 IEEE.
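
The building block of such GP dynamic systems, a posterior prediction from a Gaussian process, can be shown on a two-point toy problem. This is not the paper's filter; the squared-exponential kernel, noise-free observations, and data values are assumptions, and the 2x2 Gram system is solved by hand to keep the sketch dependency-free.

```python
import math

# Minimal sketch: GP posterior mean at a test input from two observations,
# squared-exponential kernel, noise-free toy data.
def k(a, b, ell=1.0):
    return math.exp(-0.5 * (a - b) ** 2 / ell ** 2)

X, y = [0.0, 2.0], [0.0, 4.0]     # training inputs / targets (assumed)
x_star = 1.0                       # test input

# Solve K @ alpha = y for the 2x2 Gram matrix via its explicit inverse
K11, K12, K22 = k(X[0], X[0]), k(X[0], X[1]), k(X[1], X[1])
det = K11 * K22 - K12 * K12
a0 = ( K22 * y[0] - K12 * y[1]) / det
a1 = (-K12 * y[0] + K11 * y[1]) / det

# Posterior mean: k(x*, X) @ alpha
mean = k(x_star, X[0]) * a0 + k(x_star, X[1]) * a1
```

The paper propagates full Gaussian beliefs through such GP models analytically (moment matching); this sketch only shows the posterior-mean prediction that those updates build on.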

Huber M.F., AGT Group R&D GmbH
IEEE Transactions on Automatic Control | Year: 2012

In the considered linear Gaussian sensor scheduling problem, only one sensor out of a set of sensors performs a measurement at each time step. To minimize the estimation error over multiple time steps in a computationally tractable fashion, the so-called information-based pruning algorithm is proposed. It utilizes the information matrices of the sensors and the monotonicity of the Riccati equation. This allows ordering sensors according to their information contribution and excluding many of them from scheduling. Additionally, a tight lower bound is calculated for the branch-and-bound search, which further improves the pruning performance. © 2012 IEEE.
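
The intuition behind information-based ordering can be shown in the scalar case, where a sensor's information contribution reduces to the inverse of its measurement variance. This is a toy stand-in, not the paper's matrix-valued algorithm; the sensor names and variances are assumptions.

```python
# Scalar toy version: a sensor with larger information (1/variance) always
# yields a smaller posterior variance, so information-dominated sensors can
# be pruned before the branch-and-bound search over schedules.
def posterior_var(prior_var, meas_var):
    """One scalar Kalman measurement update (information form)."""
    return 1.0 / (1.0 / prior_var + 1.0 / meas_var)

sensors = {"A": 0.5, "B": 2.0, "C": 1.0}    # measurement variances (assumed)
ranked = sorted(sensors, key=lambda s: 1.0 / sensors[s], reverse=True)
print(ranked)   # → ['A', 'C', 'B']
```

In the paper the same ordering argument is made with information matrices and the monotonicity of the Riccati equation, which is what makes the pruning sound over multiple time steps.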

Yin F., TU Darmstadt | Debes C., AGT Group R&D GmbH | Zoubir A.M., TU Darmstadt
IEEE Transactions on Signal Processing | Year: 2012

We propose a parametric waveform design approach for improved detection of extended targets embedded in uncorrelated signal-dependent clutter and noise, whose spectral densities are assumed to be known. Unlike canonical waveform design approaches, the transmit waveform is represented as a weighted linear combination of discrete prolate spheroidal sequences. In the optimization problem, the probability of detection is maximized with respect to the weighting factors of the associated discrete prolate spheroidal sequences under the transmit energy constraint. The weighting factors, which are determined numerically, lead directly to the desired transmit waveform in the time domain. In comparison to the canonical waveform design approaches, the extra step of time sequence synthesis is avoided and the loss in probability of detection incurred therein is remedied. Simulation results demonstrate the improvement in the probability of detection for the proposed approach. However, the improvement comes at the cost of higher computational complexity. © 1991-2012 IEEE.
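
The parameterization itself, a waveform as a weighted sum of basis sequences under an energy constraint, is easy to sketch. This is not the paper's design: sinusoids stand in for the discrete prolate spheroidal sequences, and the weights are arbitrary assumptions rather than the result of any optimization.

```python
import math

# Hypothetical sketch: build a transmit waveform as a weighted linear
# combination of basis sequences (sinusoids standing in for DPSS here),
# then normalize to satisfy a unit transmit-energy constraint.
N = 8
basis = [[math.sin(math.pi * (k + 1) * (n + 0.5) / N) for n in range(N)]
         for k in range(3)]
w = [0.8, 0.5, 0.2]                        # weighting factors (assumed)

s = [sum(wk * bk[n] for wk, bk in zip(w, basis)) for n in range(N)]
energy = sum(v * v for v in s)
s = [v / math.sqrt(energy) for v in s]     # enforce unit energy

print(round(sum(v * v for v in s), 6))     # → 1.0
```

In the paper, the weights themselves are the optimization variables, chosen numerically to maximize the probability of detection; the sketch only shows the representation and the energy constraint.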
