Centre for Modelling, Simulation and Design
Browsing Centre for Modelling, Simulation and Design by Author "Almuttairi, Rafah M."
A two phased service oriented broker for replica selection in data grids (2013-01-01)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Rao, C. R.; Agarwal, Arun; Buyya, Rajkumar
Replica selection is one of the fundamental problems in Data Grid environments. This work designs a Two-phased Service Oriented Broker (2SOB) for replica selection, focusing on investigating, selecting, modifying, and experimenting with non-conventional approaches applied to the relevant selection techniques. The motivation of this work is to introduce a novel Service Oriented Broker for replica selection in Data Grids. The main characteristics of 2SOB are scalability, reliability, availability, efficiency, and ease of deployment. 2SOB consists of two phases. The first is a coarse-grain phase, used to sift out replica sites with low latency (uncongested network links) and distinguish them from replicas with high latency (congested network links); procedurally, this is done using the association-rules concept from data mining. The second is a fine-grain phase, used to extract the replicas admissible for user requirements by applying a Modified Minimum Cost and Delay policy (MMCD). Both phases have been designed, simulated, coded, and then validated using real data from the EU Data Grid; the first phase was applied to real network data from CERN (February 2011). Experiments comparing the broker with contemporary selection methods of other brokers showed appreciable results. Using the proposed broker it is possible to speed up the execution of Data Grid jobs by reducing transfer time. © 2012 Elsevier B.V. All rights reserved.
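The two-phase pipeline described in the abstract can be sketched as follows. This is an illustrative sketch only: the published coarse-grain phase mines association rules over network history, for which a simple latency threshold stands in here, and the cost formula below is an assumed stand-in for MMCD, whose exact form is defined in the paper.

```python
# Illustrative two-phase replica selection in the spirit of 2SOB.
# Assumptions: a latency threshold replaces the association-rule filter,
# and the fine-grain cost is a hypothetical transfer-time estimate.

def coarse_grain(replicas, latency_threshold_ms=50.0):
    """Phase 1: keep only replica sites on uncongested (low-latency) links."""
    return [r for r in replicas if r["latency_ms"] <= latency_threshold_ms]

def fine_grain(replicas, file_size_mb):
    """Phase 2: pick the candidate with the lowest estimated transfer cost
    (transfer time plus propagation delay)."""
    def cost(r):
        transfer_s = file_size_mb / r["bandwidth_mbps"]   # size / rate
        return transfer_s + r["latency_ms"] / 1000.0      # add link delay
    return min(replicas, key=cost)

def select_replica(replicas, file_size_mb, latency_threshold_ms=50.0):
    candidates = coarse_grain(replicas, latency_threshold_ms)
    if not candidates:          # fall back to the full set if all are congested
        candidates = replicas
    return fine_grain(candidates, file_size_mb)

sites = [
    {"site": "cern",  "latency_ms": 12.0,  "bandwidth_mbps": 40.0},
    {"site": "in2p3", "latency_ms": 30.0,  "bandwidth_mbps": 80.0},
    {"site": "ral",   "latency_ms": 140.0, "bandwidth_mbps": 200.0},  # congested
]
print(select_replica(sites, file_size_mb=500.0)["site"])  # → in2p3
```

Note how the congested high-bandwidth site is excluded in phase 1 even though it would win a bandwidth-only comparison; this mirrors the abstract's point that sifting congested links must precede cost ranking.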
Enhanced data replication broker (2011-12-26)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Raghavendra Rao, Chillarige
The Data Replication Broker is one of the most important components in data grid architecture, as it reduces latencies related to file access and file (replica) transfers, and thus enhances performance by avoiding congestion of a single site by numerous requesters. To facilitate access and transfer of the data sets, replicas are distributed across multiple sites. The effectiveness of a replica selection strategy in a data replication broker depends on its ability to serve the requirements posed by users' jobs or grid applications. Most jobs are required to be executed at a specific execution time. To achieve the QoS perceived by users, a replica selection strategy should take response-time metrics into account. Total execution time needs to factor in latencies due to network transfer rates and latencies due to search and location: network resources affect the speed of moving the required data, and searching methods can reduce the scope for replica selection. In this paper we propose an approach that extends the data replication broker with policies that factor in user quality of service by reducing time costs when transferring data. The extended broker uses a replica selection strategy called the Efficient Set Technique (EST) that adapts its criteria dynamically so as to best approximate application providers' and clients' requirements. A realistic model of the data grid was created to simulate and explore the performance of the proposed model. The policy proved an effective means of improving network traffic performance, as indicated by the improvement in the speed and cost of transfers by brokers. © 2011 Springer-Verlag.
New replica selection technique for binding replica sites in data grids (2010-12-01)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Chillarige, Raghavendra Rao; Almahna, Mahdi S.
The objective in Data Grids is to reduce access and file (replica) transfer latencies, as well as to avoid congestion of a single site by numerous requesters. To facilitate access and transfer of the data, the files of the Data Grid are distributed across multiple sites. The effectiveness of a replica selection strategy in data grids depends on its ability to serve the requirements posed by users' jobs. Most jobs are required to be executed at a specific execution time. To achieve the QoS perceived by users, a replica selection strategy should take response-time metrics into account. Total execution time needs to factor in latencies due to network transfer rates and latencies due to search and location: network resources affect the speed of moving the required data, and searching methods can reduce the scope for replica selection. This paper presents a replica selection strategy that adapts its criteria dynamically so as to best approximate application providers' and clients' requirements. We introduce a new selection technique, the Efficient Set Technique (EST), that shows improved performance over the more common algorithms. © 2010 University of Basrah.
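The dynamic-criteria idea behind EST can be illustrated with a small sketch. The criteria (bandwidth and latency), the weights, and the deadline switch below are assumptions for illustration only; the actual criteria and adaptation rule are defined in the paper.

```python
# Illustrative multi-criteria replica scoring that re-weights its criteria
# per job, in the spirit of EST's dynamic adaptation. All weights and the
# "deadline_tight" switch are assumed for this sketch.

def normalise(values, lower_is_better=False):
    """Scale a list of criterion values into [0, 1], higher = better."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1.0 - s for s in scaled] if lower_is_better else scaled

def est_select(replicas, deadline_tight):
    """Score each replica on bandwidth (higher better) and latency (lower
    better); weight latency more heavily when the job deadline is tight."""
    bw = normalise([r["bandwidth_mbps"] for r in replicas])
    lat = normalise([r["latency_ms"] for r in replicas], lower_is_better=True)
    w_bw, w_lat = (0.3, 0.7) if deadline_tight else (0.7, 0.3)
    scores = [w_bw * b + w_lat * l for b, l in zip(bw, lat)]
    return replicas[scores.index(max(scores))]

sites = [
    {"site": "a", "bandwidth_mbps": 100.0, "latency_ms": 90.0},
    {"site": "b", "bandwidth_mbps": 60.0,  "latency_ms": 10.0},
]
print(est_select(sites, deadline_tight=True)["site"])   # → b
print(est_select(sites, deadline_tight=False)["site"])  # → a
```

The same replica set yields a different winner depending on the job's requirements, which is the sense in which the selection criteria "adapt dynamically".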
Replica selection in data grids using preconditioning of decision attributes by K-means clustering (K-RSDG) (2010-12-01)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Rao, C. R.
This paper extends the applicability of the Rough Set Replica Selection Strategy in Data Grids (RSDG), proposed previously, to situations where the history of replica sites is unavailable. Grey-based rough set theory is applied using replica information alone as input data. The decision attributes are derived by applying a grey-based K-means clustering algorithm to the input data; each of the K cluster labels represents a class of the decision attribute in the rough set's decision table. Compared to the previous work, experiments on synthetic data show an improvement in overall performance. © 2010 IEEE.
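The preconditioning step can be sketched as follows: cluster the replica observations with K-means and take each row's cluster label as its decision class in the rough-set decision table. Plain (non-grey) K-means on two assumed features is used here for brevity; the paper's grey-based variant and attribute set differ.

```python
# Illustrative K-means preconditioning: cluster labels become the decision
# attribute of a rough-set decision table. Features and data are made up.
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Basic Lloyd's K-means; returns one cluster label per point."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by Euclidean distance
        labels = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # update step: move each centroid to the mean of its members
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = tuple(sum(x) / len(members)
                                     for x in zip(*members))
    return labels

# (latency_ms, bandwidth_mbps) observations for six replica sites
rows = [(10, 90), (12, 85), (11, 95), (80, 20), (85, 15), (90, 25)]
decision = kmeans(rows, k=2)
# Rows sharing a label share a decision class ("good" vs "poor" replicas)
# in the resulting decision table.
```

With K = 2 the low-latency/high-bandwidth sites end up in one class and the congested sites in the other, giving the decision column that RSDG needs when no site history is available.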
Rough set based quality of service design for service provisioning in clouds (2011-10-19)
Ganghishetti, Praveen; Wankar, Rajeev; Almuttairi, Rafah M.; Rao, C. Raghavendra
Quality of Service (QoS) is a broad term describing the overall experience a user or application receives over a network. A rough set based approach is used to design a modified Cloud QoS Management Strategy (MC-QoSMS). MC-QoSMS is a component of a cloud broker that allocates resources based on the Service Level Agreement between users and providers for Infrastructure as a Service (IaaS) provisioning. The concept of a reduct from rough set theory is used to allocate the best service provider to the cloud user with minimum search time. The performance of the proposed system has been analyzed in terms of the number of requests; for 100 requests, the system outperformed a random algorithm by 25% and a round-robin algorithm by 30%. © 2011 Springer-Verlag.
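The role of the reduct in cutting search time can be illustrated with a small sketch. The provider table, the SLA, and the reduct `("cpu", "ram")` below are all assumed examples; the point is only that matching on the reduct attributes, rather than on every advertised attribute, finds a suitable IaaS provider with fewer comparisons.

```python
# Illustrative reduct-based provider lookup in the spirit of MC-QoSMS.
# Providers, SLA values, and the reduct itself are hypothetical.

providers = [
    {"name": "p1", "cpu": 4, "ram": 8,  "disk": 100, "price": 0.12},
    {"name": "p2", "cpu": 8, "ram": 16, "disk": 200, "price": 0.30},
    {"name": "p3", "cpu": 8, "ram": 16, "disk": 500, "price": 0.25},
]

def match(provider, sla, attributes):
    """A provider satisfies the SLA if it meets every compared attribute."""
    return all(provider[a] >= sla[a] for a in attributes)

def select_provider(providers, sla, reduct):
    """Check only the reduct attributes, then break ties on price."""
    candidates = [p for p in providers if match(p, sla, reduct)]
    return min(candidates, key=lambda p: p["price"]) if candidates else None

sla = {"cpu": 8, "ram": 16}
best = select_provider(providers, sla, reduct=("cpu", "ram"))
print(best["name"])  # → p3
```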
Rough set clustering approach to replica selection in data grids (RSCDG) (2010-12-01)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Chillarige, Raghavendra Rao
In data grids, a fast and proper replica selection decision leads to better resource utilization, owing to reduced latency in accessing the best replicas, and speeds up the execution of data grid jobs. In this paper, we propose a new strategy that improves replica selection in data grids with the help of the reduct concept of Rough Set Theory (RST). Using the QuickReduct algorithm, unsupervised clusters are converted into supervised reducts. A rule algorithm is then used to obtain optimal rules that derive usage patterns from the data grid information system. The experiments are carried out using the Rough Set Exploration System (RSES) tool. © 2010 IEEE.
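The QuickReduct step can be sketched compactly. The decision table below is a made-up toy; `gamma` is the standard rough-set dependency degree |POS_R(D)| / |U|, i.e. the fraction of rows whose condition-attribute values determine the decision unambiguously, and QuickReduct greedily adds the attribute that raises it most.

```python
# Minimal QuickReduct sketch over a toy replica decision table.

def gamma(table, attrs, decision):
    """Dependency degree of the decision on the attribute subset `attrs`."""
    groups = {}
    for row in table:
        key = tuple(row[a] for a in attrs)
        groups.setdefault(key, set()).add(row[decision])
    consistent = sum(1 for row in table
                     if len(groups[tuple(row[a] for a in attrs)]) == 1)
    return consistent / len(table)

def quickreduct(table, conditions, decision):
    """Greedily add the attribute that most raises gamma until the subset
    matches the dependency of the full condition set."""
    full = gamma(table, conditions, decision)
    reduct = []
    while gamma(table, reduct, decision) < full:
        best = max((a for a in conditions if a not in reduct),
                   key=lambda a: gamma(table, reduct + [a], decision))
        reduct.append(best)
    return reduct

table = [
    {"latency": "low",  "load": "low",  "secure": "y", "select": "yes"},
    {"latency": "low",  "load": "high", "secure": "y", "select": "yes"},
    {"latency": "high", "load": "low",  "secure": "y", "select": "no"},
    {"latency": "high", "load": "high", "secure": "n", "select": "no"},
]
print(quickreduct(table, ["latency", "load", "secure"], "select"))  # → ['latency']
```

In this toy table, latency alone already determines the selection decision, so the reduct discards the other two condition attributes; selection rules then need to test only the reduct attributes.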
Smart replica selection for data grids using rough set approximations (RSDG) (2010-12-01)
Almuttairi, Rafah M.; Wankar, Rajeev; Negi, Atul; Rao, C. R.
The best-replica selection problem is one of the important aspects of the data management strategy of data grid infrastructure. Recently, rough set theory has emerged as a powerful tool for problems that require making an optimal choice among a large enumerated set of options. In this paper, we propose a new replica selection strategy using a grey-based rough set approach. First, rough set theory is used to nominate a number of replicas (candidate ideal replicas) via the lower approximation. Next, linguistic variables are used to represent the attribute values of the resources (files) in the rough set decision table to obtain a precise selection, because some attribute values, such as security and availability, need to be expressed as linguistic variables (grey numbers): replica managers' judgments on such attributes often cannot be estimated by exact numerical (integer) values. The best replica site is then decided by grey relational analysis based on grey numbers. Our results show improved performance compared to the previous work in this area. © 2010 IEEE.
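The final ranking step, grey relational analysis (GRA), can be sketched as follows. Whitened (crisp) numbers stand in for the paper's grey numbers, the attribute set is assumed, and `rho = 0.5` is the conventional distinguishing coefficient.

```python
# Illustrative grey relational analysis: rank candidate replicas by their
# grey relational grade against an ideal reference series. Inputs are
# assumed, already-normalised attribute values in [0, 1].

def gra_rank(candidates, reference, rho=0.5):
    """Return (index of best candidate, list of grades)."""
    # deviation sequences from the ideal reference replica
    deltas = [[abs(v - r) for v, r in zip(c, reference)] for c in candidates]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)

    def grade(row):
        # grey relational coefficient per attribute, averaged into a grade
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        return sum(coeffs) / len(coeffs)

    grades = [grade(row) for row in deltas]
    return grades.index(max(grades)), grades

# (bandwidth, availability, security) scores per site; the reference is
# the ideal best value of each attribute
sites = [(0.9, 0.8, 0.7), (0.5, 0.9, 0.6), (0.8, 0.6, 0.9)]
ideal = (1.0, 1.0, 1.0)
best, grades = gra_rank(sites, ideal)
print(best)  # → 0
```

The site whose attribute profile tracks the ideal series most closely gets the highest grade; with genuine grey numbers the deviations would be computed on intervals rather than on the crisp values used here.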