iMinds is a Flemish non-profit research institute founded by the Flemish Government, with a focus on information and communication technology in general and applications of broadband technology in particular. iMinds offers companies and organizations active support in research and development, and brings together companies, authorities, and non-profit organizations to join forces on research projects.
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2007.1.1 | Award Amount: 4.84M | Year: 2008
The DICONET proposal targets a novel approach to optical networking, providing a disruptive solution for the development of the core network of the future. It is the vision and goal of our consortium to provide ultra-high-speed end-to-end connectivity with quality of service and high reliability, through optimised protocols and routing algorithms that complement a flexible control and management plane for the future network infrastructure. We plan to investigate, design, implement and test new routing and wavelength assignment algorithms that take as constraints the physical impairments arising in transparent core networks. These algorithms will be incorporated into a novel dynamic network planning tool that considers dynamic traffic characteristics, varying physical impairment and component characteristics, and a reconfigurable optical layer. The use of this planning tool, in conjunction with proper extensions to the control plane of core optical networks that will be designed, implemented and tested by our consortium, will make it possible to realize the vision of transparency while offering efficient resource utilization and strict quality-of-service guarantees based on service level agreements. The combination of the tools, algorithms and protocols that will be developed by the uniquely qualified DICONET consortium, together with new technologies and architectures considered as enablers for the network of the future, will help overcome the expected long-term limitations of current core network capabilities. The DICONET scope and objectives address dynamic cross-layer network planning and optimization while considering the development of a future transport network infrastructure that ensures fail-safe network configuration and operation. Our approach will contribute a basic element to achieving the resilience and transparency of the Future Internet.
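The impairment-aware routing and wavelength assignment (RWA) idea at the heart of DICONET can be illustrated with a minimal sketch. This is not the project's algorithm; it combines shortest-path routing, a crude transparent-reach limit standing in for accumulated physical impairments, and first-fit wavelength assignment. All names and the reach model are illustrative assumptions.

```python
import heapq

def shortest_path(graph, src, dst):
    # Dijkstra over a dict-of-dicts graph: graph[u][v] = link length in km.
    # Assumes dst is reachable from src.
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]

def assign_wavelength(links_in_use, path, num_wavelengths):
    # First-fit under the wavelength-continuity constraint of a transparent
    # network: pick the lowest-index wavelength free on every hop.
    hops = [frozenset(h) for h in zip(path, path[1:])]
    for w in range(num_wavelengths):
        if all(w not in links_in_use.get(h, set()) for h in hops):
            for h in hops:
                links_in_use.setdefault(h, set()).add(w)
            return w
    return None  # request blocked

def ia_rwa(graph, src, dst, links_in_use, num_wavelengths, max_reach_km):
    # Impairment-aware RWA: reject lightpaths whose length exceeds the
    # transparent reach (a stand-in for a real physical-layer model).
    path, length = shortest_path(graph, src, dst)
    if length > max_reach_km:
        return None
    w = assign_wavelength(links_in_use, path, num_wavelengths)
    return (path, w) if w is not None else None
```

A real planning tool would replace the reach check with per-lightpath estimates of noise, dispersion and crosstalk, but the control flow - route, validate feasibility, assign a wavelength - is the same.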
Agency: Cordis | Branch: FP7 | Program: NOE | Phase: ICT-2007.1.1 | Award Amount: 4.75M | Year: 2008
The BONE proposal builds on the foundations laid by the ePhoton/ONe projects in the previous Framework Programme. That Network of Excellence brought together over several years the research activities within Europe in the field of optical networks, and the BONE project intends to consolidate this effort by stimulating more intensive collaboration and exchange of researchers, and by building Virtual Centres of Excellence that can serve European industry with education and training, research tools and test labs, and pave the way to new technologies and architectures.
The Network of the Future, the central theme of this Call, will have to cope with a wide variety of applications running on a wide variety of terminals, with an increasing number of connected devices and increasing speeds and data loads. The BONE proposal does not look into issues such as convergence between mobile and fixed networks, nor does it consider optimised broadband access in the last mile using technologies such as DSL, cable, WiMAX, WiFi, PLC, etc. The BONE proposal looks further into the future and takes as the final Network of the Future:
- a high-capacity, flexible, reconfigurable and self-healing optical core and metro network which supports the transport of massive amounts of data;
- an FTTx solution in which the x is as close as possible to the home, at the home, or even in the home. From this point the user is connected using terminal-specific technologies (wireless to handheld devices, fibre to the home cinema, wireless to the laptop, a fixed connection to the desktop, etc.).
BONE clearly acknowledges the existing technologies and recognizes that users also require the mobility of wireless access, but this mobile connection ends at a gateway or access point; from there a fixed connection is required, and this fixed connection will ultimately be an optical link.
Vermeeren G.,Interdisciplinary Institute for BroadBand Technology IBBT |
Joseph W.,Interdisciplinary Institute for BroadBand Technology IBBT |
Martens L.,Interdisciplinary Institute for BroadBand Technology IBBT
Bioelectromagnetics | Year: 2013
Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied to the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly across the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with respect to the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. © 2012 Wiley Periodicals, Inc.
De Florio V.,University of Antwerp |
De Florio V.,Interdisciplinary Institute for BroadBand Technology IBBT
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Year: 2010
At our behest or otherwise, while our software is being executed, a huge variety of design assumptions is continuously matched against the truth of the current condition. While standards and tools exist to express and verify some of these assumptions, in practice most of them end up either sifted off or hidden between the lines of our code. Across the system layers, a complex and at times obscure web of assumptions determines how well our software matches its deployment platforms and run-time environments. Our position is that it becomes increasingly important to be able to design software systems with architectural and structuring techniques that allow software to be decomposed to reduce its complexity, without hiding vital hypotheses and assumptions in the process. In this paper we discuss this problem, introduce three potentially dangerous consequences of its denial, and propose three strategies to facilitate their treatment. Finally, we propose our vision of a new holistic approach to software development that overcomes the shortcomings of fragmented views of the problem of assumption failures. © 2010 Springer-Verlag Berlin Heidelberg.
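One way to keep a design assumption from being "hidden between the lines" is to state it explicitly and check it at run time. The following is a minimal illustrative sketch, not a technique from the paper: a decorator (hypothetical name `assumes`) that attaches a named assumption to a function and fails loudly when the deployment environment violates it.

```python
def assumes(predicate, description):
    # Make a design assumption explicit: record it next to the code that
    # relies on it, and check it on every call instead of leaving it implicit.
    def wrap(fn):
        def inner(*args, **kwargs):
            if not predicate():
                raise AssertionError(f"assumption violated: {description}")
            return fn(*args, **kwargs)
        inner.__name__ = fn.__name__
        inner.assumption = description  # introspectable at audit time
        return inner
    return wrap

# Example: code that silently relies on a 64-bit platform now says so.
import sys

@assumes(lambda: sys.maxsize > 2**32, "deployment platform is 64-bit")
def index_large_buffer(buf, i):
    return buf[i]
```

The point is not the mechanism (contracts, asserts, or configuration probes would serve equally well) but that the assumption survives decomposition as a first-class, checkable artifact.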
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2007.1.1 | Award Amount: 4.98M | Year: 2008
The SOCRATES project investigates the application of self-organisation methods, which include mechanisms for self-optimisation, self-configuration and self-healing, as a promising opportunity to automate wireless access network planning and optimisation, thus substantially reducing Operational Expenditure (OPEX) and improving network coverage, resource utilisation and service quality. Fundamental drivers for the deployment of self-organisation methods are the complexity of contemporary heterogeneous access network technologies, the growing diversity of offered services and the need for enhanced competitiveness.

SOCRATES' technological focus is on the self-configuration and self-optimisation of site and radio resource management parameters of 3GPP Long Term Evolution (LTE). Directed by a set of use cases where the application of self-organisation methods is anticipated to have significant potential, novel methods for efficient and effective self-organisation are developed, with due attention given to the retrieval and processing of the required measurements.

Self-organisation in wireless access networks is a challenging topic: besides the intrinsically difficult issues concerning measurement and control, the project faces highly complex systems with a multitude of tuneable parameters and intricate interdependencies.

As part of the project, a validation and demonstration of the developed self-organisation methods is carried out through extensive simulation experiments, assessing the achievable cost reductions and performance enhancements. The implementation and operational impact of the developed concepts and methods is investigated by analysing the residual radio network planning process, the operations, administration and maintenance architecture, and the protocol interfaces.
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2007.1.1 | Award Amount: 5.05M | Year: 2008
Thin client solutions (i.e. processing delegated primarily to a remote server) have been extremely successful in wired LAN settings because of cost reductions, inherent data security and privacy, more efficient use of resources, and ubiquitous data and service access.

Despite these successes in the wired scenario, solutions that also perform well in a wireless wide area network still do not exist, because of the different characteristics of the network and devices. The ambition of the MobiThin STREP project is to extend existing wired thin client solutions to wireless mobile devices.

Intelligent distribution of demanding services and all existing legacy applications to mobile devices over state-of-the-art telecommunications networks is an important rationale behind several research initiatives. The major blockers to such efficient, high-quality service delivery are the inherent characteristics of the wireless medium, as well as the resource constraints typical of wireless terminals (energy consumption and input/output mechanisms). MobiThin takes up this challenge, pursuing thin-client-based solutions optimized for wireless wide area networks.

MobiThin - driven by a strong consortium focused on thin client computing - will develop an end-to-end solution and address all important blockers to the wide adoption of the wireless thin client computing paradigm. These include architecture and technology issues (wireless medium optimization, dedicated video codecs and user pattern research, software/middleware, performance- and energy-saving-oriented solutions) as well as economic ones (business roles and models). In addition to making scientific and technological progress, the project will demonstrate an integrated prototype for the wireless thin client.

Combined with the existing wired technologies, the MobiThin impact is to:
o run your applications EVERYWHERE
o CONNECT mobile and wireless workers
o use every DEVICE on every network
Heylen R.,Interdisciplinary Institute for BroadBand Technology IBBT |
Burazerovic D.,Interdisciplinary Institute for BroadBand Technology IBBT |
Scheunders P.,Interdisciplinary Institute for BroadBand Technology IBBT
IEEE Transactions on Geoscience and Remote Sensing | Year: 2011
We present a new algorithm for linear spectral mixture analysis, which is capable of supervised unmixing of hyperspectral data while respecting the constraints on the abundance coefficients. This simplex-projection unmixing algorithm is based upon the equivalence of the fully constrained least squares problem and the problem of projecting a point onto a simplex. We introduce several geometrical properties of high-dimensional simplices and combine them to yield a recursive algorithm for solving the simplex-projection problem. A concrete implementation of the algorithm for large data sets is provided, and the algorithm is benchmarked against well-known fully constrained least squares unmixing (FCLSU) techniques, on both artificial data sets and real hyperspectral data collected over the Cuprite mining region. Unlike previous algorithms for FCLSU, the presented algorithm requires no optimization steps and is completely analytical, greatly reducing the required processing power. © 2006 IEEE.
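To make the simplex-projection idea concrete: the simplest instance of the problem is the Euclidean projection of a point onto the standard probability simplex (nonnegative components summing to one), which is exactly the constraint set of the abundance coefficients. The sketch below uses the classical sort-based method, not the recursive geometric algorithm of the paper, and is purely illustrative.

```python
def project_to_simplex(y):
    # Euclidean projection of y onto {a : a_i >= 0, sum(a_i) = 1}
    # via the classical sort-and-threshold method: find the largest
    # prefix of the sorted values that stays positive after a uniform
    # shift theta enforcing the sum-to-one constraint.
    u = sorted(y, reverse=True)
    cumsum, s = [], 0.0
    for ui in u:
        s += ui
        cumsum.append(s)
    rho = 0
    for j in range(len(u)):
        if u[j] + (1.0 - cumsum[j]) / (j + 1) > 0:
            rho = j
    theta = (cumsum[rho] - 1.0) / (rho + 1)
    return [max(yi - theta, 0.0) for yi in y]
```

In fully constrained unmixing, the simplex is spanned by the endmember spectra rather than the unit vectors, which is where the paper's high-dimensional simplex geometry comes in; the primitive above only shows the canonical case.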
Heylen R.,Interdisciplinary Institute for BroadBand Technology IBBT |
Scheunders P.,Interdisciplinary Institute for BroadBand Technology IBBT
IEEE Geoscience and Remote Sensing Letters | Year: 2012
Recently, several nonlinear techniques have been proposed in hyperspectral image processing for classification and unmixing applications. A popular data-driven approach for treating nonlinear problems employs the geodesic distances on the data manifold as property of interest. These geodesic distances are approximated by the shortest path distances in a nearest neighbor graph constructed in the data cloud. Although this approach often works well in practical applications, the graph-based approximation of these geodesic distances often fails to capture correctly the true nonlinear structure of the manifold, causing deviations in the subsequent algorithms. On the other hand, several model-based nonlinear techniques have been introduced as well and have the advantage that one can, in theory, calculate the geodesic distances analytically. In this letter, we demonstrate how one can calculate the true geodesics, and their lengths, on any manifold induced by a nonlinear hyperspectral mixing model. We introduce the required techniques from differential geometry, show how the constraints on the abundances can be integrated in these techniques, and present a numerical method for finding a solution of the geodesic equations. We demonstrate this technique on the recently developed generalized bilinear model, which is a flexible model for the nonlinearities introduced by secondary reflections. As an application of the technique, we demonstrate that multidimensional scaling applied to these geodesic distances can be used as a preprocessing step to linear unmixing, yielding better unmixing results on nonlinear data when compared to principal component analysis and outperforming ISOMAP. © 2012 IEEE.
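The graph-based approximation that the letter contrasts with analytic geodesics can be sketched briefly: build a nearest-neighbour graph on the data cloud and take shortest-path distances as geodesic estimates (the ISOMAP-style approach). This is an illustrative stand-in, not the letter's differential-geometric method.

```python
import heapq
import math

def knn_graph(points, k):
    # Symmetric k-nearest-neighbour graph over the data cloud;
    # edge weights are Euclidean distances. O(n^2 log n) - fine for a sketch.
    n = len(points)
    adj = {i: {} for i in range(n)}
    for i in range(n):
        nearest = sorted((math.dist(points[i], points[j]), j)
                         for j in range(n) if j != i)
        for w, j in nearest[:k]:
            adj[i][j] = w
            adj[j][i] = w
    return adj

def graph_geodesic(adj, src):
    # Dijkstra: shortest-path distances in the graph approximate
    # geodesic distances on the underlying manifold.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

The letter's point is that when the mixing model is known (e.g. the generalized bilinear model), the geodesics can instead be computed analytically from the induced manifold, avoiding the approximation error this graph construction introduces on sparse or unevenly sampled data.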
Verdegem P.,Interdisciplinary Institute for BroadBand Technology IBBT |
Verdegem P.,Uppsala University |
De Marez L.,Interdisciplinary Institute for BroadBand Technology IBBT
Technovation | Year: 2011
In the contemporary ICT environment, we are confronted with a growing number of failing innovations. New technological innovations often fail because too much attention is still given to (technical) product-related features without taking into account the most important parameters of user acceptance. In addition, suppliers of ICT products often lack accurate insight into the distinct profiles of their (potential) target audience. In this article, theoretical considerations and empirical results on this matter are highlighted. First, an approach is proposed in which more traditional and often scattered visions of adoption determinants are broadened into an integrated framework. This approach provides a stronger base for better targeting of (new) users of technologies. Second, the authors elaborate on this by rethinking these determinants with regard to later adopters. Later adopters (or even non-adopters/non-users) are often ignored in technology acceptance research. However, especially for policy purposes, understanding why people do not adopt or do not use ICT is highly relevant to the development of an inclusive information society. Both approaches are illustrated by case studies starting from a common list of nineteen ICT appropriation determinants. This framework enables better profiling of both earlier and later adopters and allows recommendations to be formulated on how to bring innovations to market. In summary, this contribution offers an integrated approach to technology acceptance research by bridging the gap between a market-oriented and a policy-oriented point of view. © 2011 Elsevier Ltd. All rights reserved.
Agency: Cordis | Branch: FP7 | Program: CP | Phase: ICT-2007.1.1 | Award Amount: 16.47M | Year: 2008
The ALPHA project addresses the challenges of building the future access network and all types of in-building networks for home and office environments. The proposal supports the evolution towards a cognitive network by dynamically utilising the resources of an optical network infrastructure to support a heterogeneous environment of wired and wireless technologies.
The project investigates innovative architectural and transmission solutions based on the manifold of optical fibre types (single-mode, multi-mode and plastic) as well as wireless technology, to support both wired and wireless services in a converged network infrastructure. The focus is on using the newest physical-layer achievements and adequate management and control algorithms to reach an unprecedented end-to-end provisioned capacity for access and in-building networks at a fraction of the price of today's technologies, and to simultaneously include the transport of existing 2G/3G and Beyond 3G (B3G) signals, whether they are Internet Protocol (IP) or non-IP-based.

The project starts by analysing the potential future bandwidth and quality-of-service (QoS) requirements which may be posed by future services in the scope of access and in-building networks, such as Ultra HD video, local storage area networks, remote medical applications and others, and by mapping those requirements into network specifications. The questions of the best applicable media, the need for optical-layer dynamics, the compatibility of network types at the physical layer, the foundations for better QoS provisioning, and the embedding of 2G/3G and B3G signals into the networks are then addressed within the project.

The project pursues experimental validation of close-to-maturity technologies in laboratory tests and field trials by intensively exploiting the three project testbeds. The project also includes long-term research activities aimed at improving the existing technologies, and follows an intensive dissemination and standardisation strategy.