Al-Madinah International University

www.mediu.edu.my
Shah Alam, Malaysia

MEDIU is an independent, non-profit educational institution located in Malaysia. The university was established in 2006, founded on Islamic principles and values. MEDIU is licensed by the Malaysian Ministry of Higher Education, and its programs are accredited by the Malaysian Qualifications Agency. The university is managed by Rector Prof. Dr. Mohammad Khalifa Al-Tamimi.


Hanum H.M.,University Technology of MARA | Nasaruddin S.,University Technology of MARA | Bakar Z.A.,Al-Madinah International University
2016 3rd International Conference on Information Retrieval and Knowledge Management, CAMP 2016 - Conference Proceedings | Year: 2016

Prosodic phrasing is useful for segmenting lengthy spontaneous speech into smaller, meaningful utterances without analysing linguistic information. A simpler approach is presented to identify and classify the boundaries in prosodic phrasing using pitch and intensity patterns and pause duration in Malay speech sentences. We also propose a listening test that allows trained listeners to classify the boundaries as minor or major breaks. This cheaper and faster approach proves useful for under-resourced languages such as Malay, which lack a comprehensive prosodically annotated corpus. Word-related pitch, intensity and duration features are extracted from the targeted sentences and phrase breaks. A speech corpus is developed from the targeted breaks of 100 speech sentences evaluated in the listening test. Instead of labelling phrase breaks using linguistic and phonetic meaning, the proposed listening test allows phrase breaks to be labelled as perceived by listeners. In addition, the results can be used as preliminary information for evaluating boundary saliency at the targeted boundary locations. © 2016 IEEE.
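The break classification described above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's method: the thresholds, the `pitch_reset_hz` feature, and the function names are all invented for illustration.

```python
# Hypothetical sketch of boundary classification from pause duration and
# pitch reset; thresholds (in ms and Hz) are invented for illustration.

def classify_break(pause_ms, pitch_reset_hz, minor_pause=150, major_pause=400):
    """Label a word boundary as 'none', 'minor', or 'major'."""
    if pause_ms >= major_pause or (pause_ms >= minor_pause and pitch_reset_hz > 30):
        return "major"
    if pause_ms >= minor_pause or pitch_reset_hz > 15:
        return "minor"
    return "none"

# three word boundaries with (pause duration, pitch reset) features
boundaries = [(80, 5), (200, 10), (500, 40)]
labels = [classify_break(p, r) for p, r in boundaries]
print(labels)  # ['none', 'minor', 'major']
```

In the paper the labels come from the listening test rather than fixed thresholds; a sketch like this would only serve as an automatic baseline to compare against listener judgments.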


Sazali S.S.,University Technology of MARA | Rahman N.A.,University Technology of MARA | Bakar Z.A.,Al-Madinah International University
2016 3rd International Conference on Information Retrieval and Knowledge Management, CAMP 2016 - Conference Proceedings | Year: 2016

Natural Language Processing (NLP) is an important field of research in Computer Science. NLP is the process of analysing texts based on a set of theories and technologies, and recent studies have focused increasingly on Information Extraction (IE). Information Extraction involves several steps, commonly known as tasks: named entity recognition, relation detection and classification, temporal and event processing, and template filling. Recent research on the Malay language has focused mainly on newspaper articles; since this experiment targets classical documents, the best existing method for extracting nouns needs to be identified. This paper proposes research on extracting nouns from Malay classical documents. The results show that the experiment using Noun Extraction with Morphological Rules (verb, adjective and noun affixes) has a 77.61% chance of identifying a noun that can contribute to the existing Malay noun list. As no complete Malay noun list or dictionary exists to serve as a guide, the extracted results still need to be judged by language experts. © 2016 IEEE.
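The affix-rule idea can be sketched as follows. This is an illustrative stand-in, not the paper's rule set: the prefix and suffix lists below are a small sample of common Malay noun affixes (pe-, ke-, juru-, -an) chosen for the example.

```python
# Illustrative sketch of affix-based noun identification for Malay.
# The affix lists are simplified examples, not the paper's actual rules.

NOUN_PREFIXES = ("pe", "ke", "juru")
NOUN_SUFFIXES = ("an",)

def looks_like_noun(word):
    """Heuristically flag a Malay word as a noun by its affixes."""
    w = word.lower()
    return w.startswith(NOUN_PREFIXES) or w.endswith(NOUN_SUFFIXES)

words = ["pelajar", "makanan", "lari"]
print([w for w in words if looks_like_noun(w)])  # ['pelajar', 'makanan']
```

A real rule set would also handle verb and adjective affixes to rule words out, which is presumably why the paper reports a probability (77.61%) rather than a certainty.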


Azizan A.,University Technology of MARA | Bakar Z.A.,Al-Madinah International University | Noah S.A.,National University of Malaysia
2016 3rd International Conference on Information Retrieval and Knowledge Management, CAMP 2016 - Conference Proceedings | Year: 2016

Query reformulation techniques based on an ontological approach have been studied as a method to improve retrieval effectiveness. However, evaluations of these techniques have primarily compared retrieval with and without the ontology. The aim of this paper is to present, evaluate and compare the proposed technique across four different reformulation possibilities. In this study we propose combining ontology terms with keywords from the query to reformulate new queries. The experimental results show that reformulation using ontology terms alone increases recall but decreases precision. However, better results were obtained when the ontology terms were combined with the query's keywords. © 2016 IEEE.
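The reformulation variants being compared can be sketched as below. The tiny ontology mapping and the query are invented for illustration; the paper's actual ontology and term-selection logic are not specified in the abstract.

```python
# Minimal sketch of the reformulation variants: keywords only, ontology
# terms only, and the combination. The ontology mapping is invented.

ONTOLOGY = {"heart attack": ["myocardial infarction", "cardiac arrest"]}

def reformulate(query, use_keywords=True, use_ontology=True):
    terms = []
    if use_keywords:
        terms.extend(query.split())          # keep the original keywords
    if use_ontology:
        for concept, related in ONTOLOGY.items():
            if concept in query:             # expand matched concepts
                terms.extend(related)
    return " ".join(terms)

q = "heart attack symptoms"
print(reformulate(q, use_keywords=False))  # ontology terms only
print(reformulate(q))                      # combined query
```

The abstract's finding maps directly onto these variants: the ontology-only query broadens the search (higher recall, lower precision), while the combined query keeps the original keywords as an anchor.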


Abdulghafoor O.B.,Al-Madinah International University | Shaat M.M.R.,Catalonia Technology Center of Telecomunications | Ismail M.,National University of Malaysia | Nordin R.,National University of Malaysia
2016 IEEE 3rd International Symposium on Telecommunication Technologies, ISTT 2016 | Year: 2017

Cognitive radio (CR) has been proposed to solve the problem of spectrum under-utilisation by opportunistically accessing unutilised bands. This research examines the problem of resource allocation in OFDM-CR networks. The main objective of this study is to provide better management of the interference to the primary user (PU) while keeping the complexity of the proposed algorithm low. This objective is achieved by adopting a pricing scheme to develop a better power allocation algorithm with respect to interference management. The performance of the proposed power allocation algorithm is tested for OFDM-based CRNs and compared with a number of related algorithms from the literature. The simulation results show excellent performance for the proposed algorithm compared with the algorithms presented in the literature, with lower computational complexity, O(NM) + O(N log(N)), than the optimal solution. © 2016 IEEE.
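The pricing idea can be illustrated with a price-aware water-filling sketch. This is a hedged reconstruction, not the paper's algorithm: the objective (sum rate minus an interference price per unit power), the KKT form p_i = max(0, 1/(λ + c_i) − 1/g_i), and all numbers are assumptions for illustration.

```python
# Hypothetical price-aware water-filling over N subcarriers. Each price
# c_i penalizes power that would interfere with the primary user; prices
# must be positive. The water level is found by bisection.

def allocate_power(gains, prices, total_power, iters=100):
    """Return p_i = max(0, 1/(lam + c_i) - 1/g_i) with lam chosen so that
    sum(p) equals total_power when the budget is binding."""
    def powers(lam):
        return [max(0.0, 1.0 / (lam + c) - 1.0 / g)
                for g, c in zip(gains, prices)]
    if sum(powers(0.0)) <= total_power:   # budget not binding
        return powers(0.0)
    lo, hi = 0.0, 1e6
    for _ in range(iters):                # bisection on the multiplier
        mid = (lo + hi) / 2
        if sum(powers(mid)) > total_power:
            lo = mid
        else:
            hi = mid
    return powers(hi)

p = allocate_power(gains=[2.0, 1.0, 0.5], prices=[0.1, 0.1, 0.1], total_power=1.0)
print([round(x, 3) for x in p], round(sum(p), 3))  # → [0.75, 0.25, 0.0] 1.0
```

The sort-free bisection here is O(N) per step; a sorted implementation of the same allocation is what gives the O(N log(N)) term quoted in the abstract.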


Al-geelani N.A.,Al-Madinah International University | Piah M.A.M.,University of Technology Malaysia | Abdul-Malek Z.,University of Technology Malaysia
Electrical Engineering | Year: 2017

Partial discharges (PDs) emit energy in several ways, producing electromagnetic emissions in the form of radio waves, light and heat, and acoustic emissions in the audible and ultrasonic ranges. These emissions enable us to detect, locate, measure, and analyse PD activity in order to identify faults before failures develop, because once present, the damage caused by PDs always increases, leading to asset losses, outages, protection-system failure, disaster, and huge energy losses. It is therefore of great importance to identify different types of PDs and to assess their severity. This paper investigates the acoustic emissions associated with corona discharge (CD) from different types of sources in the time domain and uses them to detect, identify, and characterize the acoustic signals due to CD activity, which usually takes place on polluted glass insulators used in high-voltage transmission lines, and hence to differentiate abnormal operating conditions from normal ones. A laboratory experiment was conducted by preparing prototypes of the discharge. This study suggests a feature extraction and classification algorithm for CD classification. A wavelet signal-processing toolbox is used to recover the CD acoustic signals by eliminating the noisy portion and to reduce the dimensions of the feature input vector. The proposed model is shown to characterize the PD activity with a high degree of integrity, which is attributed to the effect of the wavelet technique. The test results show that the proposed approach is efficient and reliable. © 2017 Springer-Verlag Berlin Heidelberg
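The wavelet denoising step can be sketched with a one-level Haar transform and soft thresholding. This is a toy stand-in for a full wavelet toolbox: the Haar basis, the threshold value, and the test signal are all assumptions, and real work would use multi-level decomposition on the actual acoustic recordings.

```python
# Minimal one-level Haar transform with soft thresholding of the detail
# coefficients, as a stand-in for wavelet denoising. Input length must
# be even for this simplified version.

def haar_step(x):
    """One level of the Haar transform: pairwise averages and details."""
    avg = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    det = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return avg, det

def soft(c, t):
    """Soft threshold: shrink a coefficient toward zero by t."""
    return 0.0 if abs(c) <= t else c - t if c > 0 else c + t

def denoise(x, t=0.5):
    avg, det = haar_step(x)
    det = [soft(d, t) for d in det]
    # inverse Haar: x0 = avg + det, x1 = avg - det
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out

sig = [1.0, 1.2, 5.0, 5.1, 1.1, 0.9]
print(denoise(sig))  # small pairwise fluctuations are smoothed away
```

Discarding small detail coefficients is also what reduces the dimensionality of the feature vector fed to the classifier, as the abstract notes.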


Al Moaiad Y.,Al-Madinah International University | Abu Bakar Z.,Al-Madinah International University | Al-Sammarraie N.A.,Al-Madinah International University
ICOS 2016 - 2016 IEEE Conference on Open Systems | Year: 2017

The problem arises when there is no tool that can help users make the best decision in choosing a provider that meets their requirements satisfactorily. Available tools only compare the services of providers without considering user requirements. The objective of this paper is therefore to present a tool, called PTUR, that meets the requirements and satisfaction of the user by ranking the selected providers according to the user's requirements. To fulfil this objective, information on user requirements and the functional services of providers is gathered. The prominent information required from the providers is the speed of the central processing unit (CPU), the size of the random-access memory (RAM), the size of the solid-state drive (SSD), the bandwidth in bits per second (bit/s), and the cost of the service. PTUR then stores the providers' functional services and the user's requirements, performs weighting using a linear equation, and ranks the providers by summing the weighted services for each provider. The results display the ranking of the providers based on the user's requirements, and the user can either accept PTUR's recommendation or decide to select another provider. PTUR also saves time for the user and can be used by providers to benchmark or improve their services. © 2016 IEEE.
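The weight-and-sum ranking can be sketched as below. This is a hypothetical illustration of PTUR's scheme: the provider data, the requirement values, and the particular linear weighting (offered/required ratio, inverted for cost) are all invented, since the abstract does not give the exact equation.

```python
# Hypothetical PTUR-style ranking: weight each provider's services against
# the user's requirements with a linear ratio, then sum per provider.
# Provider data, requirements, and the weighting formula are invented.

providers = {
    "A": {"cpu_ghz": 3.2, "ram_gb": 16, "ssd_gb": 512, "cost": 40},
    "B": {"cpu_ghz": 2.4, "ram_gb": 32, "ssd_gb": 256, "cost": 25},
}
requirement = {"cpu_ghz": 3.0, "ram_gb": 16, "ssd_gb": 256, "cost": 30}

def score(services, req):
    """Sum of offered/required ratios; cost is inverted (cheaper is better)."""
    s = 0.0
    for key, need in req.items():
        have = services[key]
        s += need / have if key == "cost" else have / need
    return s

ranking = sorted(providers, key=lambda p: score(providers[p], requirement),
                 reverse=True)
print(ranking)  # → ['B', 'A']
```

Here provider B wins despite a slower CPU because its RAM and price exceed the requirement by a wide margin; a real PTUR would presumably let the user weight the five services unequally.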


Ismail R.,University Technology of MARA | Rahman N.A.,University Technology of MARA | Abu Bakar Z.,Al-Madinah International University
ICOS 2016 - 2016 IEEE Conference on Open Systems | Year: 2017

Ontology learning is the field of extracting ontological elements to form an ontology. Identification of concepts is the main activity within ontology learning, and diverse methods can be used to find concepts. One such method is the collocation learning technique, which uses statistical scores to test the strength of the connection between terms. In the English-translated Quran, the single term Allah occurs most frequently. This high frequency makes the term Allah a concept but ignores the multi-word terms related to Allah. This paper proposes a method to extract concepts based on collocations of terms related to Allah, using the N-gram method. The results show that the collocation method is able to identify terms related to Allah as concepts. © 2016 IEEE.
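The N-gram collocation idea can be sketched as below. The toy corpus is invented, and raw bigram counts stand in for the statistical association scores (e.g. log-likelihood) that collocation learning would normally use.

```python
# Sketch of N-gram collocation extraction around a target term. Raw
# counts stand in for a proper statistical association score.

from collections import Counter

def bigrams_with(tokens, target):
    """Count the bigrams that contain the target term."""
    pairs = zip(tokens, tokens[1:])
    return Counter(bg for bg in pairs if target in bg)

text = "allah is merciful and allah is forgiving praise allah".split()
counts = bigrams_with(text, "allah")
print(counts.most_common(2))  # top bigram: ('allah', 'is'), count 2
```

Frequent bigrams such as these are the multi-word candidates that a frequency count over single terms alone would miss.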


Othman N.-I.,University Technology of MARA | Bakar Z.A.,Al-Madinah International University
AIP Conference Proceedings | Year: 2017

Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested on worked solutions for solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking-engine prototype developed from the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors. © 2017 Author(s).
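The tokenize-then-compare-multisets step can be sketched as below. This is a simplified stand-in for SCCS: the tokenizer and the overlap score are invented, and the paper's actual similarity evaluation within the Multiset framework may differ.

```python
# Rough sketch of multiset-based similarity between a student's equation
# and a reference step: tokenize, then compare token multisets. The
# tokenizer and score are simplified stand-ins for the SCCS technique.

from collections import Counter
import re

def tokens(expr):
    """Split an equation into number, variable, and operator tokens."""
    return Counter(re.findall(r"\d+|[a-z]+|[+\-*/=()]", expr))

def similarity(a, b):
    """Multiset overlap: |A ∩ B| / max(|A|, |B|)."""
    ta, tb = tokens(a), tokens(b)
    inter = sum((ta & tb).values())
    return inter / max(sum(ta.values()), sum(tb.values()))

print(round(similarity("2*x+3=7", "2*x=7-3"), 3))  # → 0.857
print(round(similarity("2*x+3=7", "x=2"), 3))      # → 0.429
```

Unlike a mathematical-equivalence check, this structural score rewards a step for reusing the right symbols, which is what allows quantitative partial credit per line of working.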


Liban A.,Al-Madinah International University | Hilles S.M.S.,Al-Madinah International University
Proceedings - 2014 5th IEEE Control and System Graduate Research Colloquium, ICSGRC 2014 | Year: 2014

Securing databases against frequent attacks is a major concern; attackers usually intend to steal private information and damage databases. These days, web applications are widely used as intermediaries between users and databases, and they are heavily used by e-commerce companies, whose applications need a secured database to keep sensitive and confidential information. Since blind SQL injection attacks emerged as a way of accessing the database through the application rather than directly through the database itself, they have become popular among hackers and malicious users. Many detection tools have been developed to handle this problem, but they have limitations. This study enhances an SQL-injection vulnerability scanning tool for the automatic creation of SQL-injection attacks (MySQL Injector) using time-based attacks with an inference binary search algorithm. It covers four types of blind SQL injection attacks: true/false, true-error, time-based, and ORDER BY attacks. The tool automates blind SQL injection attacks to check for blind SQL injection vulnerabilities in PHP-based websites that use MySQL databases. Forty-four vulnerable websites and thirty non-vulnerable websites were tested to ensure the accuracy of the tool. The results show 93% accuracy in detecting the vulnerability, while MySQL Injector achieves 84%. © 2014 IEEE.
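The inference binary search idea can be shown in the abstract: blind injection recovers a secret one character at a time by asking the application only yes/no (or fast/slow) questions, so binary search over the character range needs only O(log N) requests per character. The sketch below uses a local stand-in oracle, not any real query; the function names are invented.

```python
# Abstract sketch of inference by binary search: recover a character code
# by asking an oracle only "is the secret greater than k?" questions, the
# way time-based blind injection infers one character per O(log N)
# requests. The oracle here is a local stand-in, not an attack.

def infer_char(oracle, lo=0, hi=127):
    """Binary-search a character code using a greater-than oracle."""
    queries = 0
    while lo < hi:
        mid = (lo + hi) // 2
        queries += 1
        if oracle(mid):        # True means "secret > mid"
            lo = mid + 1
        else:
            hi = mid
    return chr(lo), queries

secret = ord("S")
char, n = infer_char(lambda k: secret > k)
print(char, n)  # 'S' recovered in 7 oracle queries
```

In the time-based variant, "True" corresponds to a deliberately delayed response, which is why the scanner can detect the vulnerability even when the page output never changes.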


Hilles S.M.S.,Al-Madinah International University | Maidanuk V.P.,Vinnytsia National Technical University
ARPN Journal of Engineering and Applied Sciences | Year: 2014

This paper presents work on image coding, which has attracted much research attention aimed at improving image quality after compression. Work in this area consumes substantial computing resources, and the research concerns not only the search for a suitable mathematical transformation but also the characteristics of visual perception of image features and the fail-safe transmission of images over communication channels. Several image-coding methods based on neural networks, using two-dimensional SOFM Kohonen maps, have been suggested and investigated. The proposed coding schemes apply vector quantization both to the original image and to the spatial-frequency image components derived from contour-adaptive two-dimensional analysis and synthesis, and the computational cost of compression based on Kohonen maps is calculated. The methods are characterized by a high level of adaptation due to the introduction of a training stage, which increases the compression ratio and the quality of the image restored after coding. A modified image-multiplexing method is also presented, whose characteristic feature is the vector digitizing of image components. The paper considers the coding of photo-realistic images presented in digital form. A further characteristic of the method is the use of pairwise exchange, which increases the processing speed when sorting data arrays. The results show the image quality after compression, and the differences, i.e. lost pixels, between the image before and after compression are considered. The proposed method may be useful to researchers in image representation, image coding, and related fields. © 2006-2014 Asian Research Publishing Network (ARPN).
