Tabuk, Saudi Arabia

Abbas A.,University of Balamand | Nasri A.,Fahad Bin Sultan University
Computer-Aided Design and Applications | Year: 2017

As originally conceived, T-splines generalize both NURBS and subdivision surfaces. Central to T-splines is the local knot refinement algorithm, which appears to carry over the local character of knot insertion for B-spline and NURBS curves. However, the mathematical decisiveness of curve knot insertion is nowhere to be seen in previously published versions of T-spline local refinement. In this respect, this paper gives a tutorial exposition of T-spline local refinement, interpreted in the spirit of a belief-revision metaphor, and provides a detailed implementation designed along the architecture of rule-based systems. Both belief revision and rule-based systems are classical topics in traditional artificial intelligence research. © 2017 CAD Solutions, LLC
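
For contrast with T-spline local refinement, here is a minimal sketch of the curve knot insertion the abstract refers to (Boehm's algorithm for B-spline curves); the function name and NumPy setup are illustrative, not the paper's code.

```python
import numpy as np

def insert_knot(p, U, P, u):
    """Insert the knot u once into a degree-p B-spline curve (Boehm's
    algorithm). U is the knot vector (length len(P) + p + 1) and P the
    control points; the curve itself is left unchanged."""
    U, P = np.asarray(U, float), np.asarray(P, float)
    k = np.searchsorted(U, u, side="right") - 1       # span: U[k] <= u < U[k+1]
    Q = np.empty((len(P) + 1, P.shape[1]))
    Q[: k - p + 1] = P[: k - p + 1]                   # points before the span: unchanged
    for i in range(k - p + 1, k + 1):                 # only p points are rewritten
        a = (u - U[i]) / (U[i + p] - U[i])
        Q[i] = (1.0 - a) * P[i - 1] + a * P[i]
    Q[k + 1:] = P[k:]                                 # points after the span: unchanged
    return np.insert(U, k + 1, u), Q
```

The point of the sketch is the locality: inserting one knot rewrites only p control points around the affected span, which is the behavior T-spline local refinement tries to reproduce on a T-mesh.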


Innab N.,University of New England of Australia | Kayed A.,Fahad Bin Sultan University | Sajeev A.S.M.,University of New England of Australia
Proceedings of 2012 IEEE International Conference on Information Science and Technology, ICIST 2012 | Year: 2012

Ontology provides a means to describe concepts effectively, and it has become an increasingly useful tool for understanding concepts in various fields of information systems and technology. The aim of this paper is to build and evaluate an ontology that standardizes the concepts and semantics of requirements-modelling notations, in order to provide a common understanding of those concepts among software engineers. This ontology will make modelling-diagram concepts easier to learn for new system developers, and will also allow software engineers to move easily from one modelling notation to another. © 2012 IEEE.
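
The intended use, moving between notations through shared concepts, can be pictured with a toy fragment; the concept and notation names below are illustrative assumptions, not the paper's ontology.

```python
# Toy fragment of a requirements-modelling ontology: each shared concept
# maps to its concrete name in two notations. (Illustrative only; the
# paper's actual concepts and vocabulary are not reproduced here.)
ONTOLOGY = {
    "ExternalAgent":   {"use_case": "Actor",       "dfd": "External Entity"},
    "Interaction":     {"use_case": "Use Case",    "dfd": "Process"},
    "InformationFlow": {"use_case": "Association", "dfd": "Data Flow"},
}

def translate(term, source, target):
    """Map a notation-specific term to its equivalent in another notation
    by going through the shared ontology concept."""
    for names in ONTOLOGY.values():
        if names.get(source) == term:
            return names.get(target)
    return None

print(translate("Actor", "use_case", "dfd"))   # -> External Entity
```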


Haydar M.,Fahad Bin Sultan University | Haydar M.,University of Montréal | Sahraoui H.,University of Montréal
Information and Software Technology | Year: 2013

Context: In the past decade, the World Wide Web has undergone rapid change. Web sites have evolved from static information pages into dynamic, service-oriented applications used for a broad range of daily activities. Thorough analysis and verification of web applications therefore help assure the deployment of high-quality applications.

Objectives: This paper presents an approach to the formal verification and validation of existing web applications. The approach uses execution traces of a web application to automatically generate a communicating-automata model. The obtained model is used to model check the application against predefined properties, to perform regression testing, and for documentation.

Methods: Traces are collected by monitoring a web application while it is explored by a user or a program. An automata-based model is derived from the collected traces by mapping the pages of the application under test to states, and the links and forms used to browse the application to transitions between those states. Properties express correctness and quality requirements on web applications and may concern all states of the model; in many cases they concern only a proper subset of the states, in which case the model is refined to designate the subset of global states of interest. The related problem of specifying properties in Linear Temporal Logic (LTL) over only a subset of the states of a system is solved by specialized operators that allow properties to be specified over propositional scopes in a concise and intuitive way. Each scope constitutes the subset of states that satisfy a propositional logic formula.

Results: An implementation of the verification approach using the model checker Spin is presented, an integrated toolset is developed, and empirical results are reported. Linear Temporal Logic is also extended with propositional scopes.

Conclusion: A formal approach is developed that builds a finite-automata model tuned to the features of web applications to be validated, while delegating property verification to an existing model checker. The problem of specifying LTL properties over a subset of a system's states is addressed with a generic, practical solution that requires no changes to the system model, by defining specialized LTL operators over scopes. © 2013 Elsevier B.V. All rights reserved.
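
A minimal sketch of the trace-to-automaton mapping described under Methods (pages become states; followed links and submitted forms become labelled transitions). The trace format is an assumption, and exporting the model for Spin is left out.

```python
from collections import defaultdict

# Each monitored event is (current_page, action, next_page); the page and
# action names below are made up for illustration.
traces = [
    ("login",   "submit_form",  "home"),
    ("home",    "click_search", "results"),
    ("results", "click_item",   "detail"),
    ("detail",  "click_back",   "results"),
]

def build_automaton(traces):
    """Map pages to states and links/forms to labelled transitions."""
    states, transitions = set(), defaultdict(set)
    for src, action, dst in traces:
        states.update((src, dst))
        transitions[src].add((action, dst))
    return states, transitions

states, transitions = build_automaton(traces)
for src in sorted(transitions):
    for action, dst in sorted(transitions[src]):
        print(f"{src} --{action}--> {dst}")
```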


Al-Degs Y.S.,Hashemite University | Al-Ghouti M.,Fahad Bin Sultan University | Al-Ghouti M.,Industrial Chemistry Center | Salem N.,Industrial Chemistry Center
Food Analytical Methods | Year: 2011

The frying qualities of palm and soybean oils were determined using infrared spectroscopy and multivariate calibration. Compared to soybean oil, palm oil is more resistant to chemical and physical changes, which is attributed to the higher degree of unsaturation of soybean oil (61.9%) relative to palm oil (13.8%). After 48 h in service, the oil samples were effectively clustered into two groups by principal component analysis, indicating that both oils still maintained their chemical identities. Partial least squares regression (PLS1 and PLS2), along with mid-FTIR data, was used to predict the free fatty acid content, viscosity, and total polar compounds of the used oils without running expensive standard procedures. PLS1 and PLS2 outperformed PCR and MLR in predicting the quality indicators of the frying oils. For palm oil, at the optimum calibration conditions, the obtained accuracies (SD) were 105.6% (0.05), 99.8% (1.10), and 103.9% (0.16) for free fatty acid, viscosity, and total polar compounds, respectively. The proposed method is simple, inexpensive, and of comparable accuracy/precision to the standard procedures often used for monitoring frying oils. © 2011 Springer Science+Business Media, LLC.
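
A minimal sketch of the multivariate-calibration step with scikit-learn's PLS implementation; the synthetic spectra, component count, and array shapes are stand-ins for the paper's mid-FTIR data and reference assays.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Stand-ins for the real data: 40 oil samples x 600 wavenumbers, and three
# reference quality indicators per sample (FFA, viscosity, TPC).
X = rng.normal(size=(40, 600))
Y = rng.normal(size=(40, 3))

pls1 = PLSRegression(n_components=5).fit(X, Y[:, 0])   # PLS1: one response
pls2 = PLSRegression(n_components=5).fit(X, Y)         # PLS2: all responses jointly

print(pls1.predict(X[:2]).ravel())   # predicted FFA for two samples
print(pls2.predict(X[:2]))           # predicted FFA, viscosity, TPC
```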


Evers F.,Heinrich Heine University Düsseldorf | Zunke C.,Heinrich Heine University Düsseldorf | Hanes R.D.L.,Heinrich Heine University Düsseldorf | Bewerunge J.,Heinrich Heine University Düsseldorf | And 4 more authors.
Physical Review E - Statistical, Nonlinear, and Soft Matter Physics | Year: 2013

The dynamics of individual colloidal particles in random potential energy landscapes was investigated experimentally and by Monte Carlo simulations. The value of the potential at each point in the two-dimensional energy landscape follows a Gaussian distribution. The width of the distribution, and hence the degree of roughness of the energy landscape, was varied and its effect on the particle dynamics studied. This situation represents an example of Brownian dynamics in the presence of disorder. In the experiments, the energy landscapes were generated optically using a holographic setup with a spatial light modulator, and the particle trajectories were followed by video microscopy. The dynamics is characterized using, e.g., the time-dependent diffusion coefficient, the mean squared displacement, the van Hove function, and the non-Gaussian parameter. In both experiments and simulations, the dynamics is initially diffusive, shows an extended subdiffusive regime at intermediate times, and recovers diffusive motion at very long times. The dependence of the long-time diffusion coefficient on the width of the Gaussian distribution agrees with theoretical predictions. Compared to the dynamics in a one-dimensional potential energy landscape, the localization at intermediate times is weaker and the diffusive regime at long times is reached earlier, owing to the possibility of avoiding local maxima in two-dimensional energy landscapes. © 2013 American Physical Society.
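
A minimal Metropolis Monte Carlo sketch of a single particle on a two-dimensional Gaussian random landscape, with the squared displacement as observable; the lattice discretization, roughness value, and single-run statistics are simplifying assumptions, not the paper's simulation protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

L, eps = 256, 2.0                        # lattice size; roughness in units of kT
V = eps * rng.normal(size=(L, L))        # Gaussian-distributed potential values
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

steps = 20000
pos = np.zeros(2, dtype=int)
traj = np.empty((steps, 2), dtype=int)

for t in range(steps):                   # Metropolis dynamics on the lattice
    new = pos + moves[rng.integers(4)]
    dV = V[tuple(new % L)] - V[tuple(pos % L)]
    if dV <= 0 or rng.random() < np.exp(-dV):
        pos = new
    traj[t] = pos

sq_disp = ((traj - traj[0]) ** 2).sum(axis=1)
print(sq_disp[::5000])   # in practice, average over many runs and landscapes
```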


Al-Degs Y.S.,Hashemite University | Al-Ghouti M.,Fahad Bin Sultan University | Walker G.,Queen's University of Belfast
Journal of Thermal Analysis and Calorimetry | Year: 2012

Higher heating value (HHV) is probably the most important property of a fuel. A bomb calorimeter and derived empirical formulae are often used for accurate determination of the HHV of fuels. A useful empirical equation was derived to estimate the HHV of petro-diesels from their C and H contents: HHV (MJ/kg) = 0.3482 C + 1.1887 H, with r² = 0.9956. The derived correlation was validated against the most common formulae in the literature, the Boie and Channiwala-Parikh correlations. Accordingly, accurate determination of C and H contents suffices to estimate HHV and avoids the use of a bomb calorimeter. However, accurate estimation of C and H contents ordinarily requires expensive and laborious gas chromatographic techniques. In this work, chemometrics offered a simple method for HHV determination of petro-diesels without a bomb calorimeter or even gas chromatography: PLS-1 calibration was used instead of gas chromatography to obtain C and H contents from the non-selective mid-infrared (MIR) spectra of petro-diesels, and HHV was then estimated from the empirical equation above. The proposed method predicts the HHV of petro-diesels with high accuracy and precision at modest analysis cost, and may be extended to other fuels. © Akademiai Kiado, Budapest, Hungary 2011.
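
The correlation is simple enough to apply directly once C and H are known; a minimal sketch (the example composition is illustrative, not a value from the paper):

```python
def hhv_mj_per_kg(carbon_pct, hydrogen_pct):
    """Estimate the higher heating value (MJ/kg) of a petro-diesel from its
    C and H mass percentages, using the paper's correlation
    HHV = 0.3482*C + 1.1887*H (r^2 = 0.9956)."""
    return 0.3482 * carbon_pct + 1.1887 * hydrogen_pct

# Typical diesel-like composition (illustrative values only):
print(hhv_mj_per_kg(86.0, 13.5))   # ~46.0 MJ/kg
```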


Kayed A.,Fahad Bin Sultan University | El-Qawasmeh E.,King Saud University | Qawaqneh Z.,Jordan University of Science and Technology
Information and Management | Year: 2010

Many web search engines retrieve enormous amounts of irrelevant information in answer to users' queries. The semantic web provides a promising approach to improving the search operation. For specific domains, ontologies can capture concepts to help machines deal with data semantically. Our aim in writing this paper was to show how to measure the closeness (relevancy) of retrieved web sites to user query-concepts and re-rank them accordingly. We therefore proposed a new relevancy measure to re-rank retrieved documents. We termed the approach "ontology concepts" and applied it to the domain of electronic commerce. Results suggested that we could re-rank the retrieved documents (web sites) according to their relevancy to the search query. Our method depends on the frequency of the ontology concepts in the retrieved documents and uses this frequency to compute their relevancy. © 2010 Elsevier B.V. All rights reserved.
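
A minimal sketch of frequency-based re-ranking in the spirit of the approach; the concept list, documents, and raw-count score are illustrative assumptions, not the paper's measure.

```python
import re

# Toy e-commerce ontology concepts and two retrieved "documents".
CONCEPTS = {"price", "payment", "shipping", "order", "invoice"}

docs = {
    "site_a": "Compare price and payment options; shipping on every order.",
    "site_b": "Our team history and office locations.",
}

def relevancy(text):
    """Count occurrences of ontology concepts in the document text."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(1 for w in words if w in CONCEPTS)

reranked = sorted(docs, key=lambda d: relevancy(docs[d]), reverse=True)
print(reranked)   # ['site_a', 'site_b']
```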


Yasseen Z.,French Institute for Research in Computer Science and Automation | Verroust-Blondet A.,French Institute for Research in Computer Science and Automation | Nasri A.,Fahad Bin Sultan University
Pattern Recognition | Year: 2016

One of the main challenges in shape matching is overcoming intra-class variation, where objects that are conceptually similar have significant geometric dissimilarity. The key to a solution is incorporating the structure of the object into the shape descriptor; this structure is customarily described by a connectivity graph extracted from the object's skeleton. From a slightly different perspective, the structure may also be viewed as the arrangement of protruding parts along the boundary. This arrangement conveys not only the ordering of the protruding parts in the anticlockwise direction but also these parts at different levels of detail. In this paper, we propose a shape matching method that estimates the distance between two objects by conducting a part-to-part matching analysis between their visually protruding parts. We start with a skeleton-based segmentation of the shape inspired by the Chordal Axis Transform. We then extract the segments that represent the protruding parts of the silhouette at varied levels of detail. Each of these parts is described by a feature vector, so a shape is described by the feature vectors of its parts together with their angular and linear proximities to each other. Using dynamic programming, our algorithm finds a minimal-cost correspondence between parts. Our experimental evaluations validate the proposition that part correspondence allows conceptual matching of geometrically dissimilar shapes. © 2016 Elsevier Ltd.
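
A minimal dynamic-programming sketch of minimal-cost part correspondence in the spirit of the matching step; the feature vectors, Euclidean match cost, and flat skip penalty are illustrative assumptions (the paper's descriptor also uses angular and linear proximities between parts).

```python
import numpy as np

def match_parts(A, B, skip_cost=1.0):
    """Edit-distance-style DP over two sequences of part feature vectors:
    at each step, match a pair of parts or leave a part unmatched."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    n, m = len(A), len(B)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, :] = skip_cost * np.arange(m + 1)            # skip leading parts of B
    D[:, 0] = skip_cost * np.arange(n + 1)            # skip leading parts of A
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = np.linalg.norm(A[i - 1] - B[j - 1])
            D[i, j] = min(D[i - 1, j - 1] + match,    # match parts i and j
                          D[i - 1, j] + skip_cost,    # part i left unmatched
                          D[i, j - 1] + skip_cost)    # part j left unmatched
    return D[n, m]

# Two toy shapes described by 3 and 4 part feature vectors:
print(match_parts([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1]],
                  [[0.1, 0.8], [0.4, 0.5], [0.6, 0.4], [0.9, 0.2]]))
```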


Gourtsoyannis E.,Fahad Bin Sultan University
Advances in Space Research | Year: 2010

The ancient Greek astronomical calculator known as the Antikythera Mechanism has been analyzed using geometrical, calculus-based, trigonometric, and complex-variable methods. This analysis demonstrates that the Mechanism modeled the variations in the Moon's angular velocity, as seen from the Earth, to better than 1 part in 200. A major implication of this analysis is that the Antikythera Mechanism of the 2nd century BCE modeled the anomalistic motion of the Moon more accurately than Ptolemy's account of Hipparchus's theory of the 2nd century CE. In the present work, mathematics, astronomy, history, and the methodology of the sciences combine in the study of a unique artifact, preserved for posterity in an ancient ship that sank in the Mediterranean 2100 years ago and recovered by Greek sponge divers at the dawn of the 20th century. © 2010 COSPAR. Published by Elsevier Ltd.


Bayoud H.A.,Fahad Bin Sultan University
Lecture Notes in Electrical Engineering | Year: 2013

The parameters of the two-parameter exponential distribution are estimated in this chapter from the Bayesian viewpoint, based on complete, Type-I, and Type-II censored samples. Bayes point estimates and credible intervals for the unknown parameters are proposed under suitable priors and the squared error loss function. An illustrative example is provided to motivate the proposed Bayes point estimates and credible intervals. Various Monte Carlo simulations are also performed to compare the performance of the classical and Bayes estimates. © 2013 Springer Science+Business Media Dordrecht.
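
A minimal sketch of Bayes point estimation for the complete-sample case: under squared error loss the Bayes estimate is the posterior mean, approximated here on a grid with flat priors over a bounded range. The priors, grid, and ranges are assumptions; the chapter's specific priors and its censored-sample cases are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
mu_true, theta_true = 1.0, 2.0
x = mu_true + rng.exponential(theta_true, size=30)   # complete sample

# Two-parameter exponential likelihood: f(x) = (1/theta) exp(-(x-mu)/theta),
# valid for mu <= min(x). Posterior on a grid, flat priors (an assumption).
mu = np.linspace(0.0, x.min(), 200)[:, None]
theta = np.linspace(0.05, 10.0, 400)[None, :]
loglik = -len(x) * np.log(theta) - (x.sum() - len(x) * mu) / theta
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Posterior means = Bayes estimates under squared error loss.
mu_bayes = (post.sum(axis=1) * mu.ravel()).sum()
theta_bayes = (post.sum(axis=0) * theta.ravel()).sum()
print(mu_bayes, theta_bayes)
print(x.min(), (x - x.min()).mean())   # classical (ML) estimates, for comparison
```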
