The Infocomm Development Authority of Singapore (IDA) is a statutory board of the Singapore government under the Ministry of Communications and Information. It was formed in 1999 when the government merged the National Computer Board and the Telecommunication Authority of Singapore in response to the growing convergence of information technology and telephony. IDA is responsible for the development and growth of the infocomm sector in Singapore, functioning as the country's infocomm industry champion, the national infocomm master-planner and developer, and the Government Chief Information Officer.
Agency: European Commission | Branch: FP7 | Program: CSA | Phase: INFRA-2007-3.3;INFRA-2007-3.0-03 | Award Amount: 1.45M | Year: 2008
The EUAsiaGrid proposal contributes to the aims of the EU Research Infrastructures FP7 Programme by promoting international interoperation between similar infrastructures, with the aim of reinforcing the global relevance and impact of European e-Infrastructures. The project's main goal is to pave the way towards an Asian e-Science Grid Infrastructure, in synergy with the other European Grid initiatives in Asia, namely EGEE-III via its Asia Federation, and both the EUChinaGRID and EU-IndiaGRID projects and their eventual follow-on efforts. Taking advantage of existing global Grid technologies, with specific emphasis on the European experience with the gLite middleware and the applications running on top of it, the project plans to encourage federating approaches across scientific disciplines and communities. EUAsiaGrid will act as a support action, aiming to define and implement a policy to promote the gLite middleware, developed within the EU EGEE project, across Asian countries. Its main actions will be dissemination, training, support for scientific applications, and monitoring of results.
Avetisyan A.I.,Russian Academy of Sciences |
Campbell R.,University of Illinois at Urbana-Champaign |
Gupta I.,University of Illinois at Urbana-Champaign |
Heath M.T.,University of Illinois at Urbana-Champaign |
And 14 more authors.
Computer | Year: 2010
Open Cirrus is a cloud computing testbed that, unlike existing alternatives, federates distributed data centers. It aims to spur innovation in systems and applications research and catalyze development of an open source service stack for the cloud. © 2006 IEEE.
Li Z.,Institute of High Performance Computing of Singapore |
Cai W.,Nanyang Technological University |
Turner S.J.,Nanyang Technological University |
Li X.,Infocomm Development Authority of Singapore (iDA) |
And 2 more authors.
ACM Transactions on Modeling and Computer Simulation | Year: 2015
Parallel and distributed simulations (High-Level Architecture (HLA)-based simulations) employing optimistic synchronization allow federates to advance simulation time freely, at the risk of overoptimistic executions and execution rollbacks. As a result, simulation performance may degrade significantly when the simulation workload is imbalanced among federates. In this article, we investigate the execution of parallel and distributed simulations on clouds and data centers with Virtual Execution Environments (VEEs). To speed up simulation execution, an Adaptive Resource Provisioning Mechanism in Virtual Execution Environments (ArmVee) is proposed. It is composed of a performance monitor and a resource manager: the former measures federate performance transparently to the simulation application, while the latter distributes available resources among federates based on the measured performance. Federates with different simulation workloads are thus able to advance their simulation times at comparable speeds, avoiding the time and resources wasted on overoptimistic executions and rollbacks. ArmVee is evaluated using a real-world simulation model with various workload inputs and parameter settings. The experimental results show that ArmVee speeds up simulation execution significantly; in addition, it greatly reduces memory usage and is scalable. © 2015 ACM.
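The abstract's division of labor between a performance monitor and a resource manager can be sketched as a simple feedback loop. This is an illustration only, not ArmVee's actual algorithm: the function name, the proportional-share policy, and the input format are assumptions.

```python
# Hypothetical sketch: reallocate CPU shares so that slow federates
# (small simulation-time advance per monitoring interval) receive more
# resources, letting all federates advance at comparable speeds.

def rebalance(progress, total_shares=100.0):
    """progress: dict of federate id -> simulation-time advance in the
    last interval, as reported by the (assumed) performance monitor.
    Returns a dict of federate id -> CPU share out of total_shares."""
    # Weight each federate inversely to its measured advance: lagging
    # federates get proportionally larger shares.
    weights = {f: 1.0 / max(adv, 1e-9) for f, adv in progress.items()}
    norm = sum(weights.values())
    return {f: total_shares * w / norm for f, w in weights.items()}

# Federate "fed_b" advanced least, so it receives the largest share.
shares = rebalance({"fed_a": 2.0, "fed_b": 1.0, "fed_c": 4.0})
```

In this sketch, rebalancing would run periodically in the resource manager, using whatever measurements the monitor exposes; the inverse-proportional weighting is one plausible policy among many.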
Bock G.-W.,Sungkyunkwan University |
Lee J.,Kyung Hee University |
Kuan H.-H.,Infocomm Development Authority of Singapore (iDA) |
Kim J.-H.,Sungkyunkwan University
Decision Support Systems | Year: 2012
This research evaluates the effects of the antecedents of online trust in the context of multi-channel retailers at different phases, taking into consideration the moderating effects of product type. The results reveal, first, that multi-channel retailers' trust is transferred from the offline channel to the online channel. Second, the customers' initial interaction with the retailers ameliorates the effects of non-direct experience, such as Internet-based structural assurance and word-of-mouth. As product uncertainty increases, the effects of word-of-mouth, offline trust, and efficacy of sanctions on online trust are greater for experience products than for search products. © 2012 Elsevier B.V. All rights reserved.
Chia Y.-K.,IBM |
Chong H.-F.,Infocomm Development Authority of Singapore (iDA)
IEEE Transactions on Information Theory | Year: 2015
We consider the Wyner-Ziv and two-way source coding problems with the erasure distortion measure. We characterize the rate-distortion regions for these settings when the source and side information satisfy a positivity condition. Using our results, we show that, contrary to recent conjectures in the literature, feedback from the decoder to the encoder does not reduce the overall rate required in the Wyner-Ziv setting with erasure distortion when the positivity condition is satisfied. Finally, we extend our techniques to characterize the rate-distortion regions of two multiterminal source coding settings, the Heegard-Berger setting and the cascade source coding setting, when the positivity condition is satisfied. © 1963-2012 IEEE.
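For readers unfamiliar with the distortion measure named in the abstract, the standard definition of erasure distortion is sketched below (this is the textbook form, not necessarily the exact variant used in the paper):

```latex
% Reconstruction alphabet is the source alphabet plus an erasure symbol e,
% \hat{\mathcal{X}} = \mathcal{X} \cup \{e\}. The erasure distortion measure:
d(x,\hat{x}) =
\begin{cases}
0, & \hat{x} = x, \\
1, & \hat{x} = e, \\
\infty, & \text{otherwise.}
\end{cases}
```

The infinite penalty forces the decoder either to reproduce a symbol exactly or to declare an erasure; the distortion constraint then bounds the fraction of erased symbols.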
Duan R.,Institute of High Performance Computing of Singapore |
Goh R.S.M.,Institute of High Performance Computing of Singapore |
Rachmawati L.,Rolls-Royce |
Wang L.,Institute of High Performance Computing of Singapore |
And 6 more authors.
Proceedings of the International Conference on Cloud Computing Technology and Science, CloudCom | Year: 2015
As scientific applications like Computational Fluid Dynamics (CFD) simulations generate more and more data, co-processing becomes the most cost-effective way to process the vast amounts of data generated by these simulations. In a co-processing environment, analysis and/or visualization of intermediate results occurs concurrently with the simulation itself. Improved efficiency and early insight into the simulation process and results are potential advantages over post-processing, where analysis and/or visualization are performed after the simulation completes. To enable co-processing, however, intermediate data needs to be shared between simulation and data analysis, and some degree of coordination may be required to maintain the correctness of both. The overhead incurred to facilitate data sharing and coordination may well offset the benefits gained, particularly on distributed, large-scale systems, where workload sharing, processor affinity, and data locality significantly affect overall performance. In this paper, we propose a co-processing framework to address these issues. The empirical benchmarking results suggest that the co-processing overhead scales well with system size, that turnaround time improves by about 20% compared to post-processing, and that the co-processing framework allows simulation and data analysis tasks to scale up to their individual limits. © 2014 IEEE.
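The data sharing and coordination described in the abstract can be illustrated with a minimal producer/consumer handoff. This sketch is not the paper's framework: the bounded queue, the sentinel convention, and the stand-in simulation and analysis are all assumptions made for illustration.

```python
# Minimal co-processing sketch: the simulation loop publishes each
# intermediate result into a bounded queue, and a separate analysis
# thread consumes it concurrently with the next simulation step.
import queue
import threading

buf = queue.Queue(maxsize=1)    # bounded buffer: simple coordination
results = []                    # analysis output, collected as we go

def analyze():
    while True:
        step, data = buf.get()
        if step is None:        # sentinel: simulation has finished
            break
        # Stand-in "analysis": reduce the intermediate data to a mean.
        results.append((step, sum(data) / len(data)))

t = threading.Thread(target=analyze)
t.start()
for step in range(3):           # stand-in "simulation" loop
    data = [step * i for i in range(1, 5)]   # intermediate result
    buf.put((step, data))       # blocks if analysis lags behind
buf.put((None, None))           # signal completion
t.join()
```

The bounded queue is the coordination mechanism here: when analysis falls behind, the simulation blocks on `put`, which is exactly the kind of overhead the paper's framework aims to keep small relative to the co-processing gains.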