Greene P.R., BGKT Consulting Engineers |
Vigneau E.S., Biomedical Science |
Greene J., Computer Science
Clinical and Experimental Optometry | Year: 2015
Background: This project relates prevalence-time data, incidence-rate data, age of onset, system plateau level and system time constant, using exponential equations, as they apply to progressive myopia, useful over several decades. Methods: Cross-sectional refractive data are analysed for nine studies with a total of 444.6K subjects (345, 981, 7.6K, 39, 421K, 383, 2K, 12K, 255), with ages ranging from five to 39 years. Basic exponential equations allow calculation of the prevalence-versus-time function Pr(t) as a percentage and the incidence rate function In(t) (percentage per year), system time constant t0 (years), onset age t1 (years) and saturation plateau level (percentage). Results: The prevalence of myopia as a function of time Pr(t) (years) and the incidence of myopia as a function of time In(t) (percentage per year) are generated continuously and compared with prevalence/incidence data from various reports investigating student populations. For a general medical condition, typical values for the time constant t0 may range from one week to five years, depending on the condition. Typical plateau levels for myopia may range from 35 to 95 per cent. Herein, data from nine demographic studies of myopia are analysed for prevalence Pr(t) with an accuracy within 14 per cent and incidence In(t) within 2.6 per cent per year, with onset t1 = 1.5 years and time constant t0 = 4.5 years. By comparison, linear regression can predict the prevalence of myopia Pr(t) within 11 per cent and estimates a constant incidence rate for myopia In(t) of 4.7 per cent per year (95 per cent CI: 2.1 to 7.3 per cent per year). Conclusions: The initial incidence rate at onset age In(t1) and the system time constant t0 are inversely related. For myopia, the onset age, time constant and saturation plateau level are fundamental system parameters derived from age-specific prevalence and incidence data. © 2015 Optometry Australia.
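A minimal sketch of the kind of first-order exponential system the abstract describes, assuming the standard form Pr(t) = plateau · (1 − exp(−(t − t1)/t0)) with In(t) as its derivative (the exact equations are not reproduced in the abstract, so this form is an assumption); it also recovers the stated inverse relation between initial incidence and time constant, since In(t1) = plateau/t0:

```python
import math

def prevalence(t, plateau=95.0, t1=1.5, t0=4.5):
    """Assumed first-order form Pr(t) = plateau * (1 - exp(-(t - t1)/t0)),
    in per cent; zero before the onset age t1 (years)."""
    if t < t1:
        return 0.0
    return plateau * (1.0 - math.exp(-(t - t1) / t0))

def incidence(t, plateau=95.0, t1=1.5, t0=4.5):
    """In(t) = dPr/dt, in per cent per year; largest at onset, where
    In(t1) = plateau / t0 -- so the initial incidence rate and the
    time constant are inversely related, as the abstract concludes."""
    if t < t1:
        return 0.0
    return (plateau / t0) * math.exp(-(t - t1) / t0)

# With the abstract's fitted values (t1 = 1.5 y, t0 = 4.5 y, plateau 95%),
# initial incidence is 95 / 4.5 ~ 21 per cent per year.
assert abs(incidence(1.5) - 95.0 / 4.5) < 1e-9
```

Doubling t0 while holding the plateau fixed halves In(t1), which is the inverse relation in the Conclusions.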
Olsson C., Lund University |
Ulen J., Lund University |
Boykov Y., Computer Science |
Kolmogorov V., Institute of Science and Technology
Proceedings of the IEEE International Conference on Computer Vision | Year: 2013
Energies with high-order non-submodular interactions have been shown to be very useful in vision due to their high modeling power. Optimization of such energies, however, is generally NP-hard. A naive approach that works for small problem instances is exhaustive search, that is, enumeration of all possible labelings of the underlying graph. We propose a general minimization approach for large graphs based on enumeration of labelings of certain small patches. This partial enumeration technique reduces complex high-order energy formulations to pairwise Constraint Satisfaction Problems with unary costs (uCSP), which can be solved efficiently using standard methods such as TRW-S. Our approach outperforms a number of existing state-of-the-art algorithms on well-known difficult problems (e.g. curvature regularization, stereo, deconvolution), giving near-global minima at better speed. Our main application of interest is curvature regularization. In the context of segmentation, our partial enumeration technique allows curvature to be evaluated directly on small patches using a novel integral-geometry approach. © 2013 IEEE.
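The partial-enumeration idea can be illustrated on a toy problem (this is not the paper's implementation, and all costs below are made up): a binary 1-D signal with a triple-clique "curvature" term is converted to a pairwise problem whose variables are overlapping length-3 patches, with the high-order term absorbed into a patch unary cost and only hard consistency constraints between overlapping patches. On a chain the resulting uCSP is solved exactly by dynamic programming; the paper would use TRW-S on loopy graphs.

```python
from itertools import product

# Toy high-order energy over a binary 1-D signal: triple-clique
# "curvature" terms |x[i] - 2x[i+1] + x[i+2]| plus per-pixel unaries.
n = 6
unary = [[0, 1], [2, 0], [0, 2], [1, 0], [0, 1], [1, 0]]

def energy(x):
    e = sum(unary[i][x[i]] for i in range(n))
    e += sum(abs(x[i] - 2 * x[i + 1] + x[i + 2]) for i in range(n - 2))
    return e

# Partial enumeration: one CSP variable per overlapping length-3 patch.
# The triple term becomes a *unary* cost on the patch variable.
patch_labels = list(product([0, 1], repeat=3))

def patch_cost(i, p):
    c = abs(p[0] - 2 * p[1] + p[2]) + unary[i][p[0]]
    if i == n - 3:  # last patch also pays for its trailing pixels
        c += unary[i + 1][p[1]] + unary[i + 2][p[2]]
    return c

# Chain structure => exact minimization by dynamic programming over
# patches, enforcing agreement on the two shared pixels.
dp = {p: patch_cost(0, p) for p in patch_labels}
for i in range(1, n - 2):
    dp = {p: min(dp[q] for q in patch_labels if q[1:] == p[:2])
             + patch_cost(i, p)
          for p in patch_labels}
min_patch = min(dp.values())

# Sanity check against exhaustive search over all 2^n labelings.
min_exhaustive = min(energy(x) for x in product([0, 1], repeat=n))
assert min_patch == min_exhaustive
```

The reduction is exact: every consistent patch sequence corresponds to exactly one pixel labeling with the same total cost.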
Fleischer L., Computer Science
Workshop on Analytic Algorithmics and Combinatorics 2010, ANALCO 2010 | Year: 2010
Inspired by problems in data center scheduling, we study the submodularity of certain scheduling problems as a function of the set of machine capacities, and the corresponding implications. In particular, we:
• give a short proof that, as a function of the excess vector, maximum generalized flow is submodular and minimum-cost generalized flow is supermodular;
• extend Wolsey's approximation guarantees for submodular covering problems to a new class of problems we call supermodular packing problems;
• use these results to obtain tighter approximation guarantees for several data center scheduling problems.
© 2010 by SIAM: Society for Industrial and Applied Mathematics. All rights reserved.
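For reference, the submodularity property these results rely on is f(A ∪ B) + f(A ∩ B) ≤ f(A) + f(B) for all sets A, B; supermodularity reverses the inequality. A brute-force checker on a tiny ground set (illustrative only, unrelated to the generalized-flow functions in the paper):

```python
from itertools import chain, combinations

def subsets(ground):
    return chain.from_iterable(combinations(ground, r)
                               for r in range(len(ground) + 1))

def is_submodular(f, ground):
    """Check f(A | B) + f(A & B) <= f(A) + f(B) for all pairs (brute force;
    exponential, so only sensible for tiny ground sets)."""
    sets = [frozenset(s) for s in subsets(ground)]
    return all(f(a | b) + f(a & b) <= f(a) + f(b) + 1e-9
               for a in sets for b in sets)

# Coverage functions are the classic submodular example; |S|^2 is
# supermodular, so it fails the submodularity check.
universe = {1: {'x', 'y'}, 2: {'y', 'z'}, 3: {'z'}}
coverage = lambda s: len(set().union(*(universe[i] for i in s)) if s else set())

assert is_submodular(coverage, universe.keys())
assert not is_submodular(lambda s: len(s) ** 2, universe.keys())
```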
Zahran M., Computer Science
IEEE Pulse | Year: 2016
In the computing community, the brain is often regarded as the ultimate computer. Brain-inspired machines are believed to be more efficient than the traditional von Neumann computing paradigm, which has been the dominant model since the dawn of computing. Recently, there have been many claims about attempts to build brain-inspired machines. But one question in particular needs to be considered thoroughly before we embark on creating these so-called brain-inspired machines: inspired by what, exactly? Do we want to build a full replica of the human brain, assuming we have the required technology? © 2016 IEEE.
Friedman R., Computer Science |
Kogan A., Computer Science
Proceedings of the IEEE Symposium on Reliable Distributed Systems | Year: 2012
This paper investigates a novel, efficient approach to utilizing multiple radio interfaces for enhancing the performance of reliable multicasts from a single sender to a group of receivers. In the proposed scheme, one radio channel (and interface) is dedicated solely to transmissions of recovery information. We apply this concept to both ARQ and hybrid ARQ+FEC protocols, formally analyzing the number of packets each receiver needs to process in both our approach and the common single-channel approach. We also present a corresponding efficient protocol and study its performance by simulation. Both the formal analysis and the simulations demonstrate the benefits of our scheme. © 2012 IEEE.
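A toy packet-counting model of the dual-channel idea (an assumed independent-loss model with idealized one-shot ARQ, not the paper's formal analysis): on a shared channel every receiver processes every retransmission, whereas with a dedicated recovery channel a receiver only processes recovery packets for its own losses.

```python
import random

def simulate(n_packets=1000, receivers=8, loss=0.1, seed=7):
    """Count packets processed per receiver under two designs:
    (a) one shared channel carrying data plus every retransmission,
    (b) a dedicated recovery channel that only loss-affected receivers
    tune in to. Retransmissions are assumed to always succeed."""
    rng = random.Random(seed)
    losses = [[rng.random() < loss for _ in range(receivers)]
              for _ in range(n_packets)]
    retx = sum(1 for pkt in losses if any(pkt))  # one retransmission per lost packet
    single = [n_packets + retx] * receivers      # everyone hears every retransmission
    dual = [n_packets + sum(pkt[r] for pkt in losses)
            for r in range(receivers)]           # only one's own losses
    return single, dual

single, dual = simulate()
assert sum(dual) < sum(single)  # the dedicated channel cuts the processing load
```

Under this toy model each receiver's extra work drops from "all losses anywhere" to "its own losses", which is the qualitative effect the abstract reports.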
Sala A., Computer Science |
Cao L., Computer Science |
Wilson C., Computer Science |
Zablit R., Computer Science |
And 2 more authors.
Proceedings of the 19th International Conference on World Wide Web, WWW '10 | Year: 2010
Access to realistic, complex graph datasets is critical to research on social networking systems and applications. Simulations on graph data provide critical evaluation of new systems and applications ranging from community detection to spam filtering and social web search. Due to the high time and resource costs of gathering real graph datasets through direct measurements, researchers are anonymizing and sharing a small number of valuable datasets with the community. However, performing experiments on shared real datasets has three key disadvantages: graphs can be de-anonymized to reveal private information, distributing large datasets is increasingly costly, and the small number of available social graphs limits the statistical confidence in the results. The use of measurement-calibrated graph models is an attractive alternative to sharing datasets. Researchers can "fit" a graph model to a real social graph, extract a set of model parameters, and use them to generate multiple synthetic graphs statistically similar to the original graph. While numerous graph models have been proposed, it is unclear whether they can produce synthetic graphs that accurately match the properties of the original graphs. In this paper, we explore the feasibility of measurement-calibrated synthetic graphs using six popular graph models and a variety of real social graphs gathered from the Facebook social network, ranging from 30,000 to 3 million edges. We find that two models consistently produce synthetic graphs with common graph metric values similar to those of the original graphs. However, only one produces high-fidelity results in our application-level benchmarks. While this shows that graph models can produce realistic synthetic graphs, it also highlights the fact that current graph metrics remain incomplete, and some applications expose graph properties that do not map to existing metrics. © 2010 International World Wide Web Conference Committee (IW3C2).
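The fit-then-generate workflow can be sketched with the simplest possible model, Erdős–Rényi with a single edge-probability parameter (far simpler than the six models the paper evaluates); the "measured" graph here is itself synthetic, standing in for a real social graph:

```python
import random

def fit_er(n, edges):
    """Calibrate an Erdos-Renyi model to a measured graph: the only
    parameter is the edge probability p = m / C(n, 2)."""
    return 2 * len(edges) / (n * (n - 1))

def generate_er(n, p, seed=None):
    """Generate a synthetic graph from the fitted parameter."""
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

# "Measure" a graph, fit the model, regenerate, and compare a simple
# graph metric (mean degree) between original and synthetic graphs.
n = 200
original = generate_er(n, 0.1, seed=1)   # stand-in for a measured graph
p_hat = fit_er(n, original)
synthetic = generate_er(n, p_hat, seed=2)

mean_deg = lambda edges: 2 * len(edges) / n
assert abs(mean_deg(original) - mean_deg(synthetic)) < 4.0
```

As the paper's findings suggest, matching a coarse metric like mean degree is easy; matching application-level behavior is the hard part, which is why richer models and benchmarks are needed.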
Wu Z., Computer Science |
Wang J., Computer Science
Proceedings - 2010 IEEE/ACM International Conference on Green Computing and Communications, GreenCom 2010, 2010 IEEE/ACM International Conference on Cyber, Physical and Social Computing, CPSCom 2010 | Year: 2010
Power management is becoming very important in data centers. Cloud computing is a promising new paradigm that appeals to many large companies. Applying power management to cloud computing has been proposed as a form of green computing. Cloud computing, due to its dynamic structure and its online-service properties, differs from current data centers in terms of power management. To better manage the power consumption of web services in cloud computing with dynamic user locations and behaviors, we propose a power budgeting design at the logical level, using a distribution tree. By setting up multiple trees, we can differentiate and analyze the effect of workload types and Service Level Agreements (SLAs, e.g. response time) in terms of power characteristics. Based on these, we introduce classified power capping for different services as the control reference to maximize power saving when there are mixed workloads. © 2010 IEEE.
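A minimal sketch of budgeting power over a distribution tree (hypothetical structure and numbers; the SLA-aware reweighting and surplus redistribution the paper's design implies are omitted): each node grants the minimum of its budget and aggregate demand, and splits the grant among its children in proportion to their demand.

```python
def demand(node):
    """Aggregate demand in watts; leaves report measured demand."""
    if 'children' in node:
        return sum(demand(c) for c in node['children'])
    return node['demand']

def allocate(node, budget):
    """One top-down pass over the distribution tree: grant
    min(budget, demand) and split it proportionally to child demand."""
    d = demand(node)
    node['grant'] = min(budget, d)
    for c in node.get('children', []):
        allocate(c, node['grant'] * demand(c) / d if d else 0.0)
    return node['grant']

# Two service classes competing for a 100 W budget under 120 W demand.
web = {'name': 'web', 'demand': 80.0}
batch = {'name': 'batch', 'demand': 40.0}
root = {'name': 'dc', 'children': [web, batch]}
allocate(root, 100.0)
assert root['grant'] == 100.0
assert abs(web['grant'] - 100.0 * 80 / 120) < 1e-9
```

Classified power capping would replace the proportional split with per-class weights or caps derived from each service's SLA.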
Sirisha A., Computer Science |
Kumari G.G., BITS Pilani Hyderabad Campus
Proceedings of the 2nd International Conference on Trendz in Information Sciences and Computing, TISC-2010 | Year: 2010
As the cloud is an emerging computing paradigm, it throws open various challenges and issues. The major issue hindering wider adoption of cloud computing is security. There are numerous cloud security issues, of which this paper addresses the problem of insecure APIs. APIs act as the interface between the cloud provider and the customer, and the security of cloud computing depends largely on the security of these APIs. Hence, a strong API access control mechanism is required. This paper proposes a two-stage access control mechanism implemented at the API level using the Role-Based Access Control (RBAC) model. © 2010 IEEE.
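A minimal sketch of a two-stage check of the kind described, with hypothetical keys, roles and operations: stage one authenticates the presented API key and maps it to a role; stage two applies an RBAC permission check of the requested operation against that role.

```python
# Hypothetical role-permission and key-role tables (illustrative only).
ROLE_PERMS = {'admin': {'create_vm', 'delete_vm', 'list_vm'},
              'auditor': {'list_vm'}}
API_KEYS = {'k-123': 'admin', 'k-456': 'auditor'}

def authorize(api_key, operation):
    """Stage 1: authenticate the key and resolve its role.
    Stage 2: RBAC check of the operation against the role's permissions."""
    role = API_KEYS.get(api_key)
    if role is None:
        return False  # stage 1 failure: unknown caller
    return operation in ROLE_PERMS.get(role, set())

assert authorize('k-123', 'delete_vm')       # admin may delete
assert not authorize('k-456', 'delete_vm')   # auditor may only list
assert not authorize('bogus', 'list_vm')     # unauthenticated key rejected
```

Separating the two stages means a stolen but unmapped key fails early, and role changes take effect without reissuing keys.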
Ramos-Amezquita A., Computer Science
Proceedings of Meetings on Acoustics | Year: 2013
The present work shows the results obtained in collaboration with the government of the state of Guanajuato in Mexico, in a project that sought to include the acoustical analysis of archaeological sites as a tool for gathering information on the historical social use of the areas in question. To that end, an acoustical characterization was carried out for three archaeological sites recently opened to the public in the state: Cañada de la Virgen, Peralta and Plazuelas. Results include 3D modeling of the areas of interest and simulation of their acoustic response using the software EASE. Specific acoustic parameters were extracted from the simulations and then compared against archaeological hypotheses on the use of such spaces as areas of public appearances and performance, ethno-musicological reports on the type and use of musical instruments, and other archaeological findings in the area, in order to support or disprove these hypotheses. © 2013 Acoustical Society of America.
Friedman R., Computer Science |
Kogan A., Computer Science |
Krivolapov Y., Technion - Israel Institute of Technology
Proceedings - IEEE INFOCOM | Year: 2011
This paper describes a combined power and throughput performance study of WiFi and Bluetooth usage in smartphones. The study reveals several interesting phenomena and tradeoffs. The conclusions from this study suggest preferred usage patterns, as well as operative suggestions for researchers and smartphone developers. © 2011 IEEE.