Mignone V.,Research and Technical Innovation Center |
Vazquez-Castro M.A.,Autonomous University of Barcelona |
Stockhammer T.,Nomor Research GmbH
Proceedings of the IEEE | Year: 2011
The first two generations of TV broadcasting, analog and digital, are almost history, at least from the standardization perspective; both satellite distribution generations have addressed mass markets. The next generation of TV systems is close to prime-time deployment: HDTV, 3DTV, interactive services, hybrid services, and further innovations will dominate next-generation TV. With the efficiency and maturity of DVB-S2, satellite distribution is a cost-efficient and well-established broadcast technology with significant potential for extensions. This paper first explores the future perspectives of digital TV and HDTV broadcasting, considering ongoing and future standardization activities. Particular attention is paid to innovative solutions based on adaptive modulation and coding, source and channel coding, and error-resilience techniques for satellite TV transmission. Beyond broadcast TV, the perspectives of hybrid TV and IPTV in a satellite scenario are also considered, with their pros and cons, examining whether satellite IPTV will compete with conventional broadcast satellite TV services, as already happens in terrestrial scenarios. © 2011 IEEE.
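The adaptive modulation and coding (ACM) principle mentioned in the abstract can be illustrated with a short sketch: the transmitter selects the most spectrally efficient MODCOD whose SNR threshold the current link satisfies. The thresholds and efficiencies below are rough placeholders for illustration, not the normative DVB-S2 values.

```python
# Illustrative ACM lookup table: (name, required Es/N0 in dB,
# spectral efficiency in bit/s/Hz). Values are placeholders, not
# the normative DVB-S2 figures.
MODCODS = [
    ("QPSK 1/2",    1.0, 0.99),
    ("QPSK 3/4",    4.0, 1.49),
    ("8PSK 2/3",    6.6, 1.98),
    ("16APSK 3/4", 10.2, 2.97),
]

def select_modcod(esn0_db):
    """Return the highest-efficiency MODCOD whose SNR threshold is met,
    or None if the link cannot sustain even the most robust mode."""
    best = None
    for name, threshold, efficiency in MODCODS:
        if esn0_db >= threshold:
            best = (name, efficiency)
    return best
```

As the link SNR degrades (e.g., rain fade), the same lookup automatically falls back to a more robust, lower-rate MODCOD, which is the core mechanism that makes DVB-S2-style distribution efficient across varying channel conditions.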
Viering I.,Nomor Research GmbH |
Lobinger A.,Nokia Inc. |
Stefanski S.,Nokia Inc.
Eurasip Journal on Wireless Communications and Networking | Year: 2010
A novel theoretical framework for uplink simulations is proposed. It enables investigations that must cover very long (real-) time spans while maintaining a certain level of accuracy in terms of radio resource management, quality of service, and mobility. This is of particular importance for simulations of self-organizing networks. Conventional system-level simulators are unsuitable for this purpose because their runtimes lie far beyond real time, while simpler, snapshot-based tools lack the required accuracy. The runtime improvements are achieved by deriving abstract theoretical models of the MAC-layer behavior. The focus of this work is Long Term Evolution (LTE), and the most important uplink effects are considered: fluctuating interference, power control, power limitation, adaptive transmission bandwidth, and control-channel limitations. Limitations of the abstract models are discussed as well. Exemplary results are given at the end to demonstrate the capability of the derived framework. Copyright © 2010 Ingo Viering et al.
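Two of the uplink effects listed above, power control and power limitation, follow a simple closed-form rule in LTE (open-loop fractional power control, 3GPP TS 36.213), which is the kind of behavior an abstract MAC model can capture directly. A minimal sketch, with purely illustrative parameter values:

```python
import math

def pusch_power_dbm(p0_dbm, alpha, pathloss_db, num_prbs, p_max_dbm=23.0):
    """LTE open-loop fractional uplink power control: transmit power
    grows with the allocated bandwidth (number of PRBs) and with a
    fraction alpha of the path loss, capped at the terminal's maximum
    power (the power-limitation effect)."""
    p = p0_dbm + 10.0 * math.log10(num_prbs) + alpha * pathloss_db
    return min(p_max_dbm, p)

# Illustrative setting: P0 = -80 dBm, alpha = 0.8, 110 dB path loss,
# 10 PRBs allocated by the adaptive transmission bandwidth mechanism.
p = pusch_power_dbm(-80.0, 0.8, 110.0, 10)
```

With alpha < 1, cell-edge users compensate only part of their path loss, which limits the interference they inject into neighbor cells, exactly the fluctuating-interference coupling the abstract framework has to model.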
Fehske A.J.,TU Dresden |
Viering I.,Nomor Research GmbH |
Viering I.,TU Munich |
Voigt J.,Actix GmbH |
And 3 more authors.
Proceedings of the IEEE | Year: 2014
Increasing the spatial reuse of frequency spectrum by deploying more access points has historically been the most effective means of improving the capacity of any cellular communication network. Today's mobile networks face a proliferation of data services and a demand for data traffic that has grown strongly over several years. As a result, increasing network capacity through the deployment of small, lower-power nodes is of key importance for mobile network operators. Although such small access points are conceptually equivalent to conventional cellular base stations in many ways, the expected large number of small cells, as well as their much more dynamic and unplanned deployment, raises a variety of challenges in the area of network management. This paper discusses these challenges and reviews state-of-the-art modeling as well as selected network management techniques. © 2014 IEEE.
Awada A.,Nokia Inc. |
Wegmann B.,Nokia Inc. |
Viering I.,Nomor Research GmbH |
Klein A.,TU Darmstadt
IEEE International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC | Year: 2010
Game theory provides an adequate methodology for analyzing topics in communication systems that involve trade-offs, such as load balancing. To balance the load in the network, users are handed over from highly loaded cells to less loaded neighbors, improving capacity utilization and Quality of Service (QoS). The algorithm that calculates how much load each cell should accept or offload may differ if the base stations come from distinct vendors, which in turn may affect network performance. In this paper, we study the load balancing problem using a game-theoretic approach where, in the worst case, each cell independently decides on the amount of load that maximizes its payoff in an uncoordinated way, and we investigate whether the resulting Nash equilibrium would erode the achievable gains. Moreover, we alter the behavior of the players using a linear pricing technique to obtain a more desirable equilibrium. Simulation results for a Long Term Evolution (LTE) network show that the Nash equilibrium point can still provide a remarkable increase in capacity compared to a system without load balancing, with only a slight degradation in performance with respect to the equilibrium achieved by linear pricing. ©2010 IEEE.
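The effect of linear pricing on a single cell's offloading decision can be sketched with a toy best-response computation. The quadratic payoff, target load, and price below are illustrative assumptions, not the paper's actual model:

```python
def best_response(load, target, price):
    """Toy payoff for a cell choosing how much load x to offload:
    payoff(x) = -(load - x - target)**2 - price * x.
    The quadratic term pulls the remaining load toward a target value;
    the linear term is the price paid per offloaded unit. Setting the
    derivative to zero gives x = load - target - price/2, which is then
    clipped to the feasible range [0, load]."""
    x = load - target - price / 2.0
    return max(0.0, min(load, x))

# Without pricing, the overloaded cell sheds all of its excess load onto
# the neighbor; a linear price makes it internalize part of the cost,
# shifting the equilibrium toward a smaller handed-over amount.
x_free   = best_response(1.0, 0.5, 0.0)   # best response at price 0
x_priced = best_response(1.0, 0.5, 0.2)   # best response under pricing
```

Raising the price parameter moves each player's best response, and hence the resulting equilibrium, away from purely selfish offloading, which is the mechanism the abstract describes for obtaining a more desirable equilibrium.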
Awada A.,TU Darmstadt |
Wegmann B.,Nokia Inc. |
Viering I.,Nomor Research GmbH |
Klein A.,TU Darmstadt
IEEE Transactions on Vehicular Technology | Year: 2013
Initially, the deployment of the Long-Term Evolution (LTE) system will be concentrated in areas with high user traffic, overlaying the legacy second-generation (2G) or third-generation (3G) mobile systems. Consequently, the limited LTE coverage will result in many inter-radio-access-technology (RAT) handovers from LTE to 3G systems and vice versa. Trouble-free operation of inter-RAT handovers requires the optimization of the handover parameters of each cell in both RATs. Current network planning and optimization methods provide a fixed network-wide setting for all the handover parameters of the cells. Cells that later show considerable mobility problems in operational mode are manually optimized with the aid of drive tests and expert knowledge. This manual optimization of the handover parameters requires permanent human intervention and increases the operational expenditure (OPEX) of mobile operators. Moreover, the interoperability of several RATs further enlarges the handover parameter space, which makes manual optimization difficult and almost impracticable. To reduce OPEX and achieve better network performance, we propose in this paper a self-optimizing algorithm in which each cell in a RAT updates its handover parameters in an autonomous and automated manner depending on its traffic and mobility conditions. The proposed algorithm uses a feedback controller to update the handover parameters as a means of providing a steady improvement in network performance. In the context of control theory, the feedback controller consists of a proportional control block, which regulates the change in the magnitude of each handover parameter, and a gain scheduler, which modifies the parameters of the proportional control block depending on the mobility conditions in each cell.
To benchmark the design of the proposed algorithm, we apply two general, non-self-optimizing algorithms, Taguchi's method and simulated annealing, to optimize the handover parameters. Simulation results show that the proposed self-optimizing algorithm reaches a stable optimized operating point with cell-specific handover parameter settings, which considerably reduces the number of mobility failure events in the network compared with three fixed settings of the handover parameters. Moreover, we show that the proposed self-optimizing algorithm outperforms Taguchi's method and simulated annealing when applied to a mobility robustness optimization (MRO) problem. © 1967-2012 IEEE.
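The controller structure described above, a proportional control block driven by a mobility error signal and a gain scheduler, can be sketched as follows. The error definition, gains, sign convention, and bounds are illustrative assumptions, not the paper's exact design:

```python
def update_handover_offset(offset_db, too_late, too_early, handovers,
                           high_mobility, k_slow=0.5, k_fast=1.0,
                           lo=-5.0, hi=5.0):
    """Proportional control with gain scheduling (illustrative sketch).
    The error is the normalized imbalance between too-late and too-early
    handover failures observed in a cell; the gain scheduler selects a
    larger proportional gain for high-mobility cells so they adapt
    faster. Under the sign convention assumed here, a positive error
    (too-late failures dominate) pushes the offset up, i.e. toward
    earlier handover triggering."""
    if handovers == 0:
        return offset_db          # no traffic: keep the current setting
    error = (too_late - too_early) / handovers
    k = k_fast if high_mobility else k_slow   # gain scheduling
    new_offset = offset_db + k * error        # proportional update
    return max(lo, min(hi, new_offset))       # keep within valid range
```

Because each cell runs this update on its own counters, the parameter settings become cell-specific over time, which is what lets the scheme outperform any single network-wide fixed setting.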
Liebl G.,Nomor Research GmbH |
De Moraes T.M.,Nomor Research GmbH |
Soysal A.,Nomor Research GmbH |
Seidel E.,Nomor Research GmbH
IEEE Wireless Communications and Networking Conference, WCNC | Year: 2012
In this work, we revisit the issue of fair resource allocation in relay-enhanced wireless networks, focusing on Type-1a relays as proposed for LTE-Advanced. These operate as out-band relays, i.e., the backhaul and the relay access link use separate carrier frequencies. If carrier aggregation is applied at the macro base station, the backhaul carrier may also carry part of the macro access link. Assuming full-buffer traffic on the downlink, we demonstrate how resource partitioning strategies at the base station, similar to those proposed for in-band relays, can also be applied in the out-band case. Furthermore, we propose that for out-band relays the backhaul link be considered directly in the regular frequency-selective scheduling process for the best performance-versus-complexity trade-off. The presented results include the resource consumption and achievable throughput for a hot-spot scenario with two out-band relay nodes, as well as a comparison to the in-band case assuming the same overall resource budget. © 2012 IEEE.
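The idea of handling the relay backhaul inside the regular scheduler can be sketched with a standard proportional-fair metric, where the backhaul link simply competes like any other user of the carrier. The scheduler form and the numbers below are illustrative, not the paper's implementation:

```python
def pf_pick(inst_rates, avg_throughputs):
    """One scheduling instant of a proportional-fair scheduler: select
    the link with the highest ratio of instantaneous achievable rate to
    long-term average throughput. A relay backhaul link enters the same
    metric as the direct macro users, so no separate static resource
    partition is required."""
    return max(range(len(inst_rates)),
               key=lambda i: inst_rates[i] / avg_throughputs[i])

# Links 0-1 are macro users, link 2 is the relay backhaul (illustrative).
chosen = pf_pick([1.0, 2.0, 6.0], [1.0, 4.0, 3.0])
```

Because the backhaul aggregates the traffic of all relay-attached users, its instantaneous rate demand is typically high, and letting it compete in the frequency-selective metric gives it resources exactly when its channel is good, which is the performance-versus-complexity argument made in the abstract.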
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2007.1.5 | Award Amount: 3.26M | Year: 2008
In the forthcoming age, where everyone may be a content producer, mediator and consumer, SEA aims to offer a new experience of personalised, seamless content delivery, maintaining the integrity and, wherever applicable, enriching the perceived QoS (PQoS) of the media across the whole distribution chain. SEA is a project focused on seamless, personalised, trusted and PQoS-optimised multimedia content delivery across broadband networks, varying from broadcasting to P2P topologies.
SEA's motivation is to implement a context-aware networking delivery platform, focusing on four key principles:
- Multi-layered/-viewed content coding, considering the evolving H.264 SVC/MVC and their emerging successors, as the major foreseen delivery technologies over heterogeneous networks/terminals and large audiences.
- Multi-source/-network content streaming, offering on-the-fly content adaptation, increased scalability and enriched PQoS by dynamically combining content layers or representations of the same resource, transmitted from multiple sources and/or received over multiple networks.
- Cross-network/-layer optimisation. The network/terminal heterogeneity, also engaging P2P overlays and serving different qualities and views, will require cross-layer optimisation, traffic adaptation and optimal use of the available network/terminal resources.
- Content protection. A hybrid solution for personalised content protection by means of a combination of streaming encryption, content protection and rights management for new media, covering not only the legacy content creation chain but also private multimedia content.
SEA will test/validate the developed technologies over three interconnected testbeds: a) a real-time emulation lab, b) a world-wide extended P2P testbed (PlanetLab) and c) a real 2G/3G/4G/WLAN mobile trial.
SEA will eventually provide citizens with the means to offer personalised A/V user-centric services, improving their quality of life, entertainment and safety.
Nomor Research GmbH | Date: 2011-06-08
An apparatus (100) for providing a control signal for a transceiver (200) of a mobile communication network. The apparatus (100) comprises a means (110) for simulating the mobile communication network in accordance with a predetermined simulation setting, the mobile communication network having a plurality of virtual transceivers. The apparatus (100) further comprises a means (120) for mapping the transceiver (200) to one of the plurality of virtual transceivers and a means (130) for determining a signal of the mapped virtual transceiver based on the simulation describing the behavior of the virtual transceiver in the mobile communication network based on the predetermined simulation setting. The apparatus (100) further comprises a means (140) for processing the signal to obtain the control signal and a means (150) for outputting the control signal to the transceiver (200).
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: ICT-14-2014 | Award Amount: 7.92M | Year: 2015
The key objective of 5G NORMA is to develop a conceptually novel, adaptive and future-proof 5G mobile network architecture. The architecture enables unprecedented levels of network customisability, ensuring that stringent performance, security, cost and energy requirements are met, and provides an API-driven architectural openness, fuelling economic growth through over-the-top innovation. With 5G NORMA, leading players in the mobile ecosystem aim to underpin Europe's leadership position in 5G. Relevant to the strands 'Radio network architecture and technologies' and 'Convergence beyond last mile', the 5G NORMA architecture will provide the adaptability needed to efficiently handle the diverse requirements and traffic-demand fluctuations resulting from heterogeneous and changing service portfolios. Rather than following the 'one system fits all services' paradigm of current architectures, 5G NORMA will allow the mechanisms executed for a given service to be adapted to the specific service requirements, resulting in a novel 'service- and context-dependent adaptation of network functions' paradigm. The technical approach is based on the innovative concept of adaptive (de)composition and allocation of mobile network functions, which flexibly decomposes the mobile network functions and places the resulting functions in the most appropriate location. By doing so, access and core functions no longer (necessarily) reside in different locations, which is exploited to jointly optimise their operation when possible. The adaptability of the architecture is further strengthened by the innovative software-defined mobile network control and mobile multi-tenancy concepts, and underpinned by corroborating demonstrations. 5G NORMA will ensure the economic sustainability of network operation and open opportunities for new players, while leveraging the efficiency of the architecture to do so in a cost- and energy-effective way.
Agency: European Commission | Branch: FP7 | Program: CP | Phase: ICT-2009.1.1 | Award Amount: 14.40M | Year: 2010
The first 3GPP Long Term Evolution standard version is complete and ready to be deployed. Although it increases peak data rate and spectral efficiency compared to legacy techniques, cell-edge and average user throughputs are still significantly lower than the peak rates. In the LTE-Advanced Study Item, ways to extend LTE are being explored. However, some of the considered techniques are complex, and significant research efforts are needed to bring these techniques to reality.

The main ARTIST4G project objective is to improve the ubiquitous user experience of cellular mobile radio communications systems by satisfying the following requirements:
- High spectral efficiency and user data rate across the whole coverage area
- Fairness between users
- Low cost per information bit
- Low latency

This objective will be achieved by developing innovative concepts out of promising ideas from the research ecosystem, and benchmarking them against the state-of-the-art. The technologies identified to fulfil the above requirements are:
- Interference avoidance
- Interference exploitation
- Advanced relay techniques

ARTIST4G will build upon projects such as EASY-C, where first steps towards integration of these technologies in cellular systems have been made, but also address new aspects like:
- Advanced multi-cell scheduling for adaptive and efficient usage of interference management and relaying techniques in appropriate scenarios
- Impact of the innovative concepts on the network architecture

ARTIST4G will not only use theoretical analysis and simulations to develop and validate innovative concepts based on these technologies, but also enable proof-of-concept via hardware prototypes and field trials in a representative testbed.

It is expected that ARTIST4G will create a major impact on standardization and provide the partners with a technological head-start that will strengthen the European position in cellular communications.