Lausanne, Switzerland

Rohac J.,Czech Technical University | Rerabek M.,Czech Technical University | Rerabek M.,Multimedia Signal Processing Group | Hudec R.,Czech Technical University
Acta Polytechnica

This paper presents the idea of a multi-functional wide-field star tracker (WFST) and describes the current state of the art in this field. The idea comes from a proposal submitted to ESA at the beginning of 2011. Star trackers (STs) usually have more than one objective lens with a small field of view. They provide very precise attitude information in space by consecutively evaluating star positions. Our WFST concept will combine the functions of several instruments, e.g., an ST, a horizon sensor, and an all-sky photometry camera. The WFST will use a fish-eye lens. There is no comparable product on the present-day market. Nowadays, spacecraft have to carry several instruments for these applications, which increases the weight of the instrumentation and reduces the weight available for the payload.

Ewert S.,University of Bonn | Ewert S.,Multimedia Signal Processing Group | Muller M.,Saarland University | Dannenberg R.B.,Carnegie Mellon University
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

The general goal of music synchronization is to align multiple information sources related to a given piece of music. This becomes a hard problem when the various representations to be aligned reveal significant differences not only in tempo, instrumentation, or dynamics but also in structure or polyphony. Because of the complexity and diversity of music data, one cannot expect to find a universal synchronization algorithm that yields reasonable solutions in all situations. In this paper, we present a novel method for automatically identifying the reliable parts of alignment results. Instead of relying on a single strategy, our idea is to combine several types of conceptually different synchronization strategies within an extensible framework, thus accounting for various musical aspects. Looking for consistencies and inconsistencies across the synchronization results, our method automatically classifies the alignments locally as reliable or critical. Considering only the reliable parts yields a high-precision partial alignment. Moreover, the identification of critical parts is also useful, as they often reveal musically interesting deviations between the versions to be aligned. © 2011 Springer-Verlag Berlin Heidelberg.
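As a loose illustration of the consistency idea described in this abstract (a sketch, not the paper's actual algorithm), one can compare several alignment curves frame by frame and flag positions where the strategies disagree. All data, names, and the tolerance below are assumptions for demonstration.

```python
import numpy as np

# Hypothetical alignment curves: for each frame of version A, the frame
# of version B estimated by three different synchronization strategies.
alignments = [
    np.array([0, 2, 4, 6, 8, 10, 12]),
    np.array([0, 2, 4, 9, 11, 11, 12]),  # disagrees around frames 3-4
    np.array([0, 2, 4, 6, 8, 10, 12]),
]

def classify_reliability(alignments, tol=1):
    """Mark a frame reliable when all strategies agree within `tol`."""
    stacked = np.stack(alignments)           # shape: (strategies, frames)
    spread = stacked.max(axis=0) - stacked.min(axis=0)
    return spread <= tol                     # True = reliable, False = critical

print(classify_reliability(alignments))      # frames 3-4 are flagged critical
```

Only the frames marked reliable would then contribute to the high-precision partial alignment; the critical frames point to structural deviations worth inspecting.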

Kroupi E.,Multimedia Signal Processing Group | Hanhart P.,Multimedia Signal Processing Group | Lee J.-S.,Yonsei University | Rerabek M.,Multimedia Signal Processing Group | Ebrahimi T.,Multimedia Signal Processing Group
Multimedia Tools and Applications

As immersive technologies aim to provide higher-quality multimedia experiences, it is important to understand the quality of experience (QoE) perceived by users of various multimedia rendering schemes, in order to design and optimize human-centric immersive multimedia systems. In this study, various QoE-related aspects, such as depth perception, sensation of reality, content preference, and perceived quality, are investigated and compared for the presentation of 2D and 3D contents. Since implicit QoE assessment offers essential advantages over explicit assessment, the way these QoE-related aspects influence the brain and periphery is also investigated. In particular, two classification schemes using electroencephalography (EEG) and peripheral signals (electrocardiography and respiration) are carried out, to explore whether it is possible to automatically recognize the QoE-related aspects under investigation. In addition, a decision-fusion scheme is applied to EEG and peripheral features, to explore the advantage of integrating information from the two modalities. The results reveal that the highest monomodal average informedness is achieved in the high beta EEG band (0.14 ± 0.09, p < 0.01), when recognizing sensation of reality. The highest and significantly non-random multimodal average informedness is achieved when the high beta EEG band is fused with peripheral features (0.17 ± 0.1, p < 0.01), for the case of sensation of reality. Finally, a temporal analysis is conducted to explore how the EEG correlates of sensation of reality change over time. The results reveal that the right cortex is more involved when sensation of reality is low, and the left when it is high, indicating that approach- and withdrawal-related processes occur during sensation of reality. © 2015 Springer Science+Business Media New York
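To illustrate the two measures this abstract mentions (a minimal sketch under assumed data, not the paper's pipeline), decision-level fusion can be as simple as averaging the class probabilities from the two modalities, and informedness is Youden's J statistic, which is zero at chance level.

```python
import numpy as np

def fuse(p_eeg, p_periph):
    """Average the two modalities' class-probability estimates."""
    return (p_eeg + p_periph) / 2.0

def informedness(y_true, y_pred):
    """Youden's J = sensitivity + specificity - 1 (0 = chance level)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn) + tn / (tn + fp) - 1

# Assumed per-trial probabilities of "high sensation of reality"
p_eeg = np.array([0.8, 0.4, 0.6, 0.2])
p_periph = np.array([0.7, 0.3, 0.4, 0.1])
y_true = np.array([1, 0, 1, 0])

y_pred = (fuse(p_eeg, p_periph) >= 0.5).astype(int)
print(y_pred, informedness(y_true, y_pred))
```

Averaging probabilities is one common fusion rule; the study compares such fused scores against the best single modality using informedness rather than plain accuracy, since informedness discounts chance agreement.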
