Alazri A.S., Applied Information Sciences
2016 11th International Conference for Internet Technology and Secured Transactions, ICITST 2016 | Year: 2016
This paper highlights submarine cables and their security trends. It first introduces the history of cables and their development, then discusses the main structure of fiber-optic cable. Finally, threats to and vulnerabilities of submarine cables are presented in detail, supported by examples from around the world such as natural disasters and marine habitats, commercial fishing, anchoring, and oil and gas development. © 2016 Infonomics Society.
Kavitha B., Bharathiar University |
Karthikeyan D.S., Applied Information Sciences |
Sheeba Maybell P., Karpagam University
Knowledge-Based Systems | Year: 2012
In the real world, one must routinely deal with uncertainty where security is concerned. Intrusion detection systems pose a particular challenge in handling uncertainty, because the knowledge used to classify behaviour patterns as normal or abnormal is imprecise. In this paper we introduce an approach to intrusion detection using a Neutrosophic Logic Classifier, an extension and combination of fuzzy logic, intuitionistic logic, paraconsistent logic, and the three-valued logics that use an indeterminate value. It is capable of handling fuzzy, vague, incomplete, and inconsistent information within one framework. This new approach increases the detection rate and significantly decreases the false alarm rate. The proposed method tripartitions the dataset into normal, abnormal, and indeterministic records based on the degrees of membership of truth, indeterminacy, and falsity. The method was tested on the KDD Cup 99 dataset. The Neutrosophic Logic Classifier generates neutrosophic rules to determine an intrusion in progress, and an improvised genetic algorithm is adopted to detect the potential rules that perform better classification. This paper demonstrates the efficiency of handling uncertainty in intrusion detection using a Neutrosophic Logic Classifier based intrusion detection system. © 2011 Elsevier B.V. All rights reserved.
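The tripartition step the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: the membership values, thresholds, and decision rule below are all hypothetical stand-ins for what the neutrosophic rules would compute.

```python
# Sketch of neutrosophic tripartition: each record carries degrees of
# membership for truth (t), indeterminacy (i), and falsity (f), and is
# routed to normal / abnormal / indeterminate. Thresholds are illustrative.

def tripartition(t, i, f, t_min=0.6, f_min=0.6):
    """Assign a record to a class from its (t, i, f) memberships."""
    if t >= t_min and t > f:
        return "normal"
    if f >= f_min and f > t:
        return "abnormal"
    return "indeterminate"   # left for further rule-based analysis

# Three hypothetical records as (t, i, f) triples.
records = [(0.8, 0.1, 0.1), (0.2, 0.1, 0.7), (0.4, 0.5, 0.4)]
labels = [tripartition(t, i, f) for t, i, f in records]
print(labels)  # ['normal', 'abnormal', 'indeterminate']
```

The indeterminate bucket is the point of the method: records that fuzzy two-class schemes would force into normal or abnormal are held out for the classifier's neutrosophic rules instead.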
Orvatinia M., Applied Information Sciences |
Heydarianasl M., Bushehr Azad University
Sensors and Actuators, A: Physical | Year: 2012
A novel method for the detection of continuous infrared (IR) radiation by pyroelectric detectors is presented. Instead of the conventional modulation of IR radiation by mechanical parts, which are complicated and unreliable, the temperature of the detector is modulated by a thermoelectric cooler to activate it. This new method is expected to eliminate the main inherent limitation on the application of these detectors. An equivalent electrical circuit was proposed to simulate the thermal and electrical behavior of the detector. A prototype sensor was fabricated and its transient responses to different levels of IR radiation were recorded. The model-based calculations were fitted to the measured data, and the fitting parameters were taken as the model parameters. Good agreement between the experimental data and the analytical calculations confirmed the validity of the model. It was also demonstrated that the method increases the detection speed of the sensor: the improved speed is more than three orders of magnitude better than that of other types of thermal detectors. © 2011 Elsevier B.V. All rights reserved.
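The physical idea can be illustrated with a toy first-order thermal model (not the paper's equivalent circuit; every parameter value below is invented for illustration): a pyroelectric element only produces current while its temperature changes, so under constant IR the signal decays away, but a thermoelectric cooler that keeps modulating the element's temperature sustains a signal.

```python
# Toy model: pyroelectric current i = p * A * dT/dt, with the element's
# temperature governed by a single thermal RC stage. Sinusoidal TEC heat
# pumping stands in for the thermoelectric modulation. Values are illustrative.
import math

R_th, C_th = 50.0, 2e-3          # thermal resistance (K/W) and capacitance (J/K)
p, A = 3e-8, 1e-6                # pyroelectric coefficient (C/m^2/K), area (m^2)
P_ir = 1e-4                      # constant (unchopped) absorbed IR power (W)
f_mod = 10.0                     # TEC modulation frequency (Hz)

dt, T = 1e-4, 0.0                # time step (s), temperature rise above ambient (K)
currents = []
for n in range(20000):           # simulate 2 s (many thermal time constants)
    t = n * dt
    P_tec = 5e-4 * math.sin(2 * math.pi * f_mod * t)   # TEC heat pumping
    dT = (P_ir + P_tec - T / R_th) / C_th * dt         # forward-Euler step
    currents.append(p * A * dT / dt)                   # i = p*A*dT/dt
    T += dT

# With modulation, dT/dt keeps oscillating, so the steady-state current swings
# around zero instead of decaying to nothing as it would for constant IR alone.
print(max(currents[10000:]), min(currents[10000:]))
```

The same loop with `P_tec = 0` gives a current that vanishes after a few thermal time constants, which is exactly the limitation the temperature-modulation scheme removes.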
Mohammad A.H., Applied Information Sciences |
Zitar R.A., New York Institute of Technology
Applied Soft Computing Journal | Year: 2011
Spam is a serious, universal problem that affects almost all computer users. It troubles not only ordinary internet users but also companies and organizations, costing them a huge amount of money in lost productivity, wasted user time, and wasted network bandwidth. Many studies indicate that spam costs organizations billions of dollars yearly. This work presents several modifications to a machine learning method inspired by the human immune system, the artificial immune system (AIS), an emerging method that still needs further investigation and demonstration. Core modifications were applied to the standard AIS with the aid of a genetic algorithm (GA). An artificial neural network (ANN) is also applied to spam detection in a new manner. The SpamAssassin corpus is used in all our simulations. The standard AIS relies on several user-defined parameters, such as the culling time of old lymphocytes; here, a genetically optimized AIS determines the culling time instead of using a user-defined value. A new way to check antibodies in the AIS is also introduced, making the system able to accept types of messages that were previously considered spam; this is accomplished by introducing what we call a "rebuild time". Moreover, adaptive weighting of lymphocytes is used to modify the selection opportunities for different gene fragments. Core modifications are also applied to the ANN neurons, allowing them to change over time and replacing useless layers; this approach is called the Continuous Learning Approach Artificial Neural Network (CLA-ANN). The final results are compared and analyzed, showing that both systems, GA-optimized spam detection and ANN-based spam detection, achieve promising scores compared with the standard AIS and other known methods. © 2011 Elsevier B.V.
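The AIS ingredients the abstract names, weighted lymphocyte detectors and the culling of old ones after a culling time, can be sketched as follows. This is a toy illustration only: the token-matching rule, threshold, weights, and the `culling_time` value are hypothetical, whereas in the paper the culling time is selected by the GA rather than fixed by hand.

```python
# Toy AIS spam filter: lymphocytes are weighted token detectors that age with
# every message and are culled once they exceed the culling time.
from dataclasses import dataclass

@dataclass
class Lymphocyte:
    pattern: str      # token the detector matches
    weight: float     # adaptive weight (its selection opportunity)
    age: int = 0      # messages seen since the detector was created

def score(message, detectors):
    tokens = set(message.lower().split())
    return sum(d.weight for d in detectors if d.pattern in tokens)

def step(message, detectors, culling_time):
    """Classify one message, age every detector, cull the old ones."""
    is_spam = score(message, detectors) >= 1.0   # illustrative threshold
    for d in detectors:
        d.age += 1
    survivors = [d for d in detectors if d.age < culling_time]
    return is_spam, survivors

detectors = [Lymphocyte("viagra", 0.9), Lymphocyte("winner", 0.6)]
spam, detectors = step("You are a WINNER claim viagra now", detectors,
                       culling_time=50)
print(spam, len(detectors))   # True 2
```

In the paper's GA-optimized variant, `culling_time` (and, via the adaptive weighting, each detector's `weight`) would be evolved rather than hard-coded, and the "rebuild time" would let the population re-admit message types it once rejected.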
Wenk B., Applied Information Sciences
2010 IEEE Education Engineering Conference, EDUCON 2010 | Year: 2010
Open educational resources (OER) can significantly reduce the time required to prepare lectures. The prerequisites are that a desired resource can be found quickly and that its adequacy for the intended purpose can be estimated easily. Ideally, the resource should also be suitable for modification. In the first part we outline the requirements for the sourcing, storing, retrieval, and exchange of open educational resources, considering technical and legal aspects. In the second part we present a case study focusing on the user-level perspective: we describe the search for a particular OER (an online Moodle tutorial), the analysis of the resource found, its modification, and the publishing of the modified resource to a repository. © 2010 IEEE.
Evjen S., Applied Information Sciences
Library and Information Science Research | Year: 2015
In terms of political perceptions, library building projects appear to be similar across different contexts. Qualitative interviews with local politicians were employed to examine attitudes towards public libraries and library development in three cities building new central libraries: Aarhus, Denmark; Birmingham, UK; and Oslo, Norway. Applying an institutional perspective, the analysis focuses on norms, legitimization, and organizational change. Findings show shared views on the role and mission of the library. The informants primarily pointed to citizens' democratic rights and their country's democratic tradition when legitimizing public funding for libraries in general. However, argumentation for local library building projects was connected to city development and the desire to portray a city as oriented towards knowledge and culture. © 2015 Elsevier Inc.
Arike Y., Applied Information Sciences
Letters in Mathematical Physics | Year: 2012
We show that the space of logarithmic intertwining operators among logarithmic modules for a vertex operator algebra is isomorphic to the space of 3-point conformal blocks over the projective line. This can be viewed as a generalization of Zhu's result for ordinary intertwining operators among ordinary modules. © 2012 Springer.
Tian T., Applied Information Sciences |
Qi W.-F., Applied Information Sciences
IEEE Transactions on Information Theory | Year: 2013
Let n be a positive integer. An NFSR of n stages is called irreducible if the family of output sequences of any NFSR of fewer than n stages is not included in that of the NFSR. In this paper, we prove that the density of irreducible NFSRs of n stages is larger than 0.39. This implies that one can expect to find an irreducible NFSR of n stages among three randomly chosen NFSRs of n stages. © 1963-2012 IEEE.
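The closing remark is simple arithmetic on the stated density bound, which a couple of lines make explicit:

```python
# With density p > 0.39 of irreducible NFSRs among NFSRs of n stages,
# a sample of three random NFSRs is expected to contain more than one.
p = 0.39
expected = 3 * p                    # expected irreducible NFSRs in a sample of 3
at_least_one = 1 - (1 - p) ** 3     # probability the sample contains at least one
print(round(expected, 2), round(at_least_one, 3))  # 1.17 0.773
```

Since the expected count exceeds 1 (and the probability of at least one hit is above 77%), three random draws suffice on average.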
News Article | April 19, 2012
This guest post is from Vishwas Lele, CTO at Applied Information Sciences, a provider of software and systems engineering services to government agencies. At AIS we work on IT initiatives across several U.S. Federal agencies, including the Departments of Defense, Homeland Security, and Justice. With our work in mind, I thought I would share some thoughts on the White House's recent Big Data announcement. While much has been said about the grand visions behind this initiative, my focus in this post is the usefulness of Big Data in medium-to-large tactical and operational IT projects. Big Data is not just for canonical use cases (such as genome data analysis or video and image analysis), but is equally important for Federal agencies in accomplishing their core missions. The majority of the applications we build are targeted towards implementing new (or optimizing existing) business processes, and even these applications can generate a lot of data.

Strategic value amid tactical challenges

But there are challenges here. Given that the primary driver for these initiatives is to improve efficiency (and compliance), the data analysis part is often an afterthought. Even in cases where due importance is given to data analysis, the data collection strategy flows directly from the existing requirements. For instance, the grain (least count) of our data sets is governed by the level of drill-down that users have asked for today. Similarly, the amount of historical data that is kept around is governed by the parameters used for capacity planning. These decisions are a result of limited resources (such as storage infrastructure) and the traditionally non-trivial cost of preparing data for analysis. These costs have arisen because traditional BI tools require data to be organized in well-defined structures.

Bringing analysis within Federal reach

But now things may change.
The advent of Big Data can bring the tools for arbitrarily large data collection and analysis within the reach of Federal agencies, even when resource-bound as discussed above. This is possible through the adoption of open source frameworks such as Hadoop or Storm, commodity hardware, and the familiar SQL-like query constructs provided by tools such as Hive. Using an ODBC driver for Hive to import the results of a Hadoop query into Excel for further analysis extends the life and usefulness of the data collected, and can be done affordably. The advent of “Hadoop-as-a-service” from public cloud providers such as Microsoft and Amazon can greatly lower costs as well. The existence of such cloud solutions means that agencies without a continuous need for Big Data can use Hadoop on an as-needed basis, and agencies that cannot move to a public cloud environment for security reasons can benefit from community cloud-based Hadoop-as-a-service offerings.

A private sector example

The CIO of travel services provider Orbitz decided to harness data that was going uncollected and unanalyzed. He initiated a big data strategy that allowed Orbitz to keep logs of user activity indefinitely (prior to this initiative, logs were kept only for a fixed number of days). This change caused the collection volume to grow from 7 TB to 750 TB. However, big data techniques made it possible for Orbitz not only to manage this data volume but to turn it into key insights about its customers.

A public sector commonality

AIS believes that Federal agencies can apply similar Big Data techniques to increase insight by harnessing currently uncollected data. For example, financial agencies can use Big Data to improve fraud detection, and law enforcement agencies can improve open-source intelligence collection and analysis. Hopefully the spotlight on Big Data as a result of the recent White House announcement will encourage Federal agencies to take notice.
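The Hive workflow described above ultimately reduces to SQL-like aggregation over raw logs. As a stand-in for a Hive query (using Python's built-in sqlite3 instead of a Hadoop cluster, with a hypothetical log schema), this is the shape of analysis an agency might run before pulling the results into Excel via ODBC:

```python
# Stand-in for a Hive aggregation query: sqlite3 replaces Hive, and the
# access_log table and its columns are hypothetical. The point is the
# SQL-like drill-down, which finer-grained, longer-retained logs enrich.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE access_log (agency TEXT, status INTEGER, bytes INTEGER)")
conn.executemany("INSERT INTO access_log VALUES (?, ?, ?)", [
    ("DHS", 200, 512), ("DHS", 404, 0), ("DOJ", 200, 2048), ("DOJ", 200, 128),
])

rows = conn.execute(
    "SELECT agency, COUNT(*) AS requests, SUM(bytes) AS total_bytes "
    "FROM access_log GROUP BY agency ORDER BY agency"
).fetchall()
print(rows)  # [('DHS', 2, 512), ('DOJ', 2, 2176)]
```

The same GROUP BY would run essentially unchanged in HiveQL over terabytes of logs, which is what makes the "keep everything, decide the grain later" strategy practical.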