Accepted Papers

  • ECSM: Energy Efficient Clustering Scheme for Mobile M2M Communication Networks
    Mohammed Saeed Al-kahtani, Salman bin Abdulaziz University, Saudi Arabia
    ABSTRACT

    Scheduling the active and idle periods of machine type communication (MTC) devices such as RFID tags, sensors and smart meters is critically important for achieving energy efficiency in the emerging machine-to-machine (M2M) communication networks, which comprise thousands of resource-constrained MTC devices (i.e., devices with low data rate, energy and bandwidth). However, only a few studies exist in the literature on node scheduling schemes for M2M communication networks. Most of these schemes consider only the energy efficiency of MTC devices and do not support mobility. Thus, we introduce an energy efficient node scheduling scheme for mobile M2M (ECSM) communication networks. The ECSM scheduling scheme selects a minimum number of active MTC devices in each cluster and increases the probability of network coverage. Simulation results show that the ECSM scheduling scheme outperforms the existing cluster-based and well-known mobility-centric LEACH-M and LEACH-ME schemes in terms of network energy consumption and lifetime.

  • A Novel Security Architecture for Responsibility Aspect in Cloud Computing Applications
    Mohammad Reza Ahmadi1 and Amir Ahmadi2, 1Research Institute of ICT (ITRC), Tehran, Iran and 2Urmia University, Urmia, Iran
    ABSTRACT

    Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort [1]. Advances in virtualization, storage, connectivity and processing power combine to create a new ecosystem for cloud computing. It is an internet-based service delivery model which provides computing and storage services for users in all markets, including information technology, finance, health care, business and government. This new economic model for computing has found fertile ground and is attracting massive global investment. Although the benefits of cloud computing are clear, so is the need to develop proper security for cloud implementations. Cloud security is becoming a key differentiator and competitive edge between cloud providers. This paper discusses the main concepts of cloud computing and proposes a security architecture for a cloud computing framework. It focuses on cloud architecture and its relation with virtualization, as well as the security issues and responsibilities of providers and users, with the view of a secure cloud environment.

  • A Distributed Agent Media Access Framework
    Craig M. Gelowitz, Luigi Benedicenti and Raman Paranjape, University of Regina, Canada
    ABSTRACT

    This paper presents a distributed mobile software agent framework for media access. The framework uses mobile agent characteristics to enhance the access and delivery of media. Migrating agents to the media source and destination devices enables the framework to dynamically discover the properties of devices and media within networks. The mobile agents in the framework make decisions and work together to enable access and delivery of media through transcoding based on the data path and device constraints.

  • Mathematical Modelling And Analysis Of Network Service Failure In Data Centre
    Malik Usman Dilawar and Faiza Ayub Syed, University of Engineering and Technology, Pakistan
    ABSTRACT

    The world has become a global village. With the advent of technology, the concept of cloud computing has evolved considerably. Cloud computing offers various benefits in terms of storage, computation, cost and flexibility. It focuses on delivering a combination of technological components such as applications, platforms, infrastructure, security and web-hosted services over the internet. One of the major elements of cloud computing infrastructure is the data centre. Companies host their applications and services online through data centres, whose probability of downtime is expected to be very low. Since a data centre consists of a large number of servers, the rate of service failure is usually high. In this paper we analyse the service failure rate of a conventional data centre. The fault trend of network failures is modelled by assuming their occurrence to be a Poisson process. Accurate prediction of the fault rate helps in managing the upgrade, replacement and other administrative issues of data centre components.
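
    The abstract does not give the estimation details, but the core of a Poisson fault-trend model is small enough to sketch. The following Python fragment (with hypothetical failure counts, not the paper's data) fits the Poisson rate by maximum likelihood, which for a Poisson sample is simply the sample mean, and evaluates the probability of different daily failure counts.

        import math

        # Hypothetical daily counts of network service failures in a data centre.
        daily_failures = [2, 0, 1, 3, 1, 0, 2, 1, 4, 1]

        # Maximum-likelihood estimate of the Poisson rate: the sample mean.
        lam = sum(daily_failures) / len(daily_failures)

        def poisson_pmf(k, lam):
            """P(K = k) for a Poisson process with rate lam per day."""
            return math.exp(-lam) * lam ** k / math.factorial(k)

        # Probability of a failure-free day, and of seeing more than 3 failures.
        p_zero = poisson_pmf(0, lam)
        p_gt3 = 1 - sum(poisson_pmf(k, lam) for k in range(4))
        print(f"rate = {lam:.2f}/day, P(0) = {p_zero:.3f}, P(>3) = {p_gt3:.3f}")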

  • Unsupervised Topic-Sensitive Parser
    Dongchen Li and Xiantao Zhang, Peking University, China
    ABSTRACT

    Modern statistical parsers are trained on large annotated corpora (treebanks). These treebanks usually consist of sentences addressing different topics (e.g. finance, politics, sports), which implies that the statistics gathered by current statistical parsers are mixtures of topics of language use. Previous work on parser domain adaptation focuses on modeling the differences among corpus collections (genres). However, source and target document collections for parsers involve heterogeneous topics, which may vary widely in their lexical choices and stylistic preferences. We study this problem as a new task: unsupervised multiple topic-sensitive grammar learning and parsing. We propose an approach that biases parsing systems toward relevant topics based on topic-specific contexts, where topics are induced in an unsupervised way using topic models; this can be thought of as inducing subcorpora for adaptation without any human annotation. We use these topic distributions to induce topic-sensitive grammars and directly incorporate them into our weighted sum model for amalgamating their predictions. Our experiments show that introducing topic sensitivity by automatically exploiting corpora can improve significantly over a tough, state-of-the-art baseline.
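
    The weighted sum model is described only at a high level; a minimal sketch of the amalgamation step, assuming a topic posterior from a topic model such as LDA and per-topic parse scores (all names and values below are hypothetical), might look as follows in Python.

        # Hypothetical topic posterior for a document, e.g. from LDA inference.
        topic_posterior = {"finance": 0.6, "politics": 0.3, "sports": 0.1}

        # Hypothetical log-probabilities assigned to one candidate parse by
        # topic-specific grammars induced from the corresponding subcorpora.
        parse_scores = {"finance": -42.1, "politics": -45.7, "sports": -50.3}

        def weighted_sum_score(posterior, scores):
            """Amalgamate topic-specific predictions, weighted by topic mass."""
            return sum(posterior[t] * scores[t] for t in posterior)

        # The parse maximizing this score wins among candidate parses.
        combined = weighted_sum_score(topic_posterior, parse_scores)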

  • Identification of the Best and the Worst Time to Tweet: An Experimental Study
    Basit Shahzad and Esam Alwagait, King Saud University, Saudi Arabia
    ABSTRACT

    The increase in Twitter's popularity has been phenomenal over time, and Tweets are now not only a means of status updates and one-on-one communication but are widely used for trend setting and marketing. The probability that a Tweet will be seen by a user who was offline when it was posted is very low. In order to increase the throughput of responses, it is important to determine the number of individuals online so that a Tweet is seen by the maximum number of users. This research focuses on identifying individual users from Saudi Arabia based on the parameters set for the conduct of this study. Time-stamped data for 1000 selected individuals is retrieved from Twitter and analyzed accordingly. The number of online users is observed by recording the 'last seen' status. The retrieval of data is based on a number of experiments that were run at the same time on all days of the week to reduce inconsistent patterns. The data is then analyzed to find the time slots where the online user percentage is higher than in other time slots. The results of the study identify and recommend the timings when Tweets are better valued and have considerable impact.

  • Multilevel Algorithms For The Clustering Problem
    Noureddine Bouhmala, Vestfold University College, Norway
    ABSTRACT

    Data mining is concerned with the discovery of interesting patterns and knowledge in data repositories. Cluster analysis, which belongs to the core methods of data mining, is the process of discovering homogeneous groups called clusters. Given a dataset and some measure of similarity between data objects, the goal in most clustering algorithms is to maximize both the homogeneity within each cluster and the heterogeneity between different clusters. In this work, a multilevel approach to the clustering problem is introduced. The approach looks at the clustering problem as a hierarchical optimization process going through different levels, evolving from a coarse-grain to a fine-grain strategy. The clustering problem is solved by first reducing the problem level by level to a coarser problem where an initial clustering is computed. The clustering of the coarser problem is then mapped back level by level to obtain a better clustering of the original problem by refining the intermediate clusterings obtained at the various levels. In this paper, a multilevel genetic algorithm and a multilevel K-means algorithm are introduced for solving the clustering problem. A benchmark using a number of data sets collected from a variety of domains is used to compare the effectiveness of the hierarchical approach against its single-level counterpart.
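
    As an illustration of the coarsen-solve-refine scheme the abstract describes, here is a minimal multilevel K-means sketch in Python/NumPy. The coarsening by nearest-neighbour pair merging and the refinement schedule are illustrative choices, not the authors' exact algorithm; it assumes the dataset has at least k points.

        import numpy as np

        def lloyd(X, centers, iters=10):
            """Standard K-means (Lloyd) refinement from given initial centers."""
            for _ in range(iters):
                d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
                labels = d.argmin(1)          # assign points to nearest center
                for j in range(len(centers)):
                    if (labels == j).any():
                        centers[j] = X[labels == j].mean(0)
            return centers, labels

        def coarsen(X):
            """Roughly halve the problem by greedily merging each point with
            its nearest unmatched neighbour."""
            n = len(X)
            used, merged = np.zeros(n, bool), []
            d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            np.fill_diagonal(d, np.inf)
            for i in range(n):
                if used[i]:
                    continue
                used[i] = True
                j = np.where(~used, d[i], np.inf).argmin()
                if np.isfinite(d[i, j]) and not used[j]:
                    used[j] = True
                    merged.append((X[i] + X[j]) / 2)
                else:
                    merged.append(X[i])
            return np.array(merged)

        def multilevel_kmeans(X, k, levels=3):
            # Coarsening phase: level-by-level reduction to a smaller problem.
            pyramid = [X]
            for _ in range(levels):
                if len(pyramid[-1]) <= 4 * k:
                    break
                pyramid.append(coarsen(pyramid[-1]))
            # Initial clustering at the coarsest level, then refine upwards.
            rng = np.random.default_rng(0)
            coarse = pyramid[-1]
            centers = coarse[rng.choice(len(coarse), k, replace=False)]
            for level in reversed(pyramid):
                centers, labels = lloyd(level, centers)
            return centers, labels

        X = np.random.default_rng(2).normal(size=(300, 2)).astype(float)
        centers, labels = multilevel_kmeans(X, k=5)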

  • Design of substrate integrated waveguide power divider, circulator and coupler in [10-15] GHz band
    Bouchra Rahali and Mohammed Feham, STIC Laboratory, Algeria
    ABSTRACT

    The Substrate Integrated Waveguide (SIW) technology is an attractive approach for the design of high-performance microwave and millimeter-wave components, as it combines the advantages of planar technology, such as low fabrication costs, with the low loss inherent to the waveguide solution. In this study, a substrate integrated waveguide power divider, circulator and coupler are designed and optimized in the [10-15] GHz band with the Ansoft HFSS code. Through this modeling, design considerations and results are discussed and presented. Attractive features, including compact size and planar form, make these device structures easy to integrate in planar circuits.

  • Study of Integration Working Method—Example Analysis of Data Exchange
    Luo Ya-guo, Xi'an College of Arts and Science, China
    ABSTRACT

    Integration Working Method is a software working method to enhance data exchange work. It has five fundamental phases and one integration working controller to control and coordinate the overall development work. Integration Working Method uses the 3B-Method to generalize the work of solving the problem in each phase. The 3B-Method helps in implementing a module-based development approach and provides firmer control over the data exchange work. Because of the module-driven approach, the risk associated with cost and time is limited to individual modules only. The method ensures the overall quality of the software system and reduces data exchange cost and time by considering the changing requirements of customers, together with risk assessment, identification, evaluation and the composition of relative concerns at each phase of development work. We have implemented the Integration Working Method in the design of a real-world information system and evaluated this implementation against the initial project estimation.

  • Rule-Based Model Via Rough Sets To MIMO Discrete-Time Nonlinear Dynamical System
    Carlos A. M. Pinheiro and Rubiane H. Oliveira, Federal University of Itajuba (UNIFEI), Brazil
    ABSTRACT

    This paper suggests a method to develop rule-based models for systems with multiple inputs and outputs using concepts from rough sets. The rules provide relations among variables and give a mechanism to link granular descriptions of the models with the computational procedures used. An estimation procedure is applied to compute values from numeric representations encoded by rule sets. The method is useful for developing models of nonlinear systems. An example of a multi-input multi-output (MIMO) discrete-time nonlinear dynamical system is presented.

  • Fault Tolerance Mobile Agent Execution System (FTMAS) Modeling and Performance Analysis
    Amal Al Dweik, Palestine Polytechnic University, Palestine
    ABSTRACT

    The mobile agent paradigm is said to offer performance advantages over other paradigms due to its features, especially in network and internet applications. To integrate mobile agents into such applications, the fault tolerance of an agent is one of the most important issues. In this paper we analyze the Fault Tolerant Mobile Agent execution System (FTMAS) model proposed in [1] and study its performance mathematically. The paper defines and analyzes the migration time, round-trip time, transfer time and system throughput. These are tested against three performance metrics: the average connectivity of the network, the agent size and the probability of failure. The performance analysis shows that the proposed FTMAS is a promising model in the field of fault-tolerant mobile agent (MA) execution.

  • Multi-Dimensional Customization Modelling Based On Metagraph For Saas Multi-Tenant Applications
    Ashraf A. Shahin1,2, 1Al Imam Mohammad Ibn Saud Islamic University (IMSIU), Kingdom of Saudi Arabia and 2Institute of Statistical Studies & Research, Egypt
    ABSTRACT

    Software as a Service (SaaS) is a new software delivery model in which pre-built applications are delivered to customers as a service. SaaS providers aim to attract a large number of tenants (users) with minimal system modifications to achieve economies of scale. To achieve this aim, SaaS applications have to be customizable to meet the requirements of each tenant. However, due to the rapid growth of SaaS, SaaS applications can have thousands of tenants with a huge number of ways to customize applications. Modularizing such customizations is still a highly complex task. Additionally, due to the wide variation in tenant requirements, no single customization model is appropriate for all tenants. In this paper, we propose a multi-dimensional customization model based on metagraphs. The proposed model addresses the modelling of variability among tenants, describes customizations and their relationships, and guarantees the correctness of SaaS customizations made by tenants.

  • Reduced Massive Data Sets Using Improved Algorithms Based on Core-Sets
    LACHACHI Nour-Eddine, Oran University, Algeria
    ABSTRACT

    The Minimal Enclosing Ball (MEB) is a spherically shaped boundary around a normal dataset, used to separate this set from abnormal data. MEB has a limitation when dealing with large datasets, in that the computational load increases drastically as the training data size becomes large. To handle this problem in the huge datasets used in different domains, we propose two improved approaches using the k-NN clustering method. They produce reduced data optimally matched to the input demands of systems with different backgrounds, such as UBM architectures in language recognition and identification systems. In this paper, we improve a technique for reducing massive data sets using core-sets. To this end, the training data, learned by Support Vector Machines (SVMs), are partitioned among several data sources. Computation of such SVMs is achieved by finding a core-set for the image of the data. Experiments on speech data eliminated all noisy data.
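
    The abstract does not spell out the core-set construction; the classic Badoiu-Clarkson iteration is the standard way to obtain a (1+eps)-approximate MEB from a small core-set, and a minimal NumPy sketch of it (not necessarily the authors' variant) is:

        import numpy as np

        def meb_coreset(X, eps=0.05):
            """(1+eps)-approximate Minimal Enclosing Ball via the
            Badoiu-Clarkson core-set iteration: repeatedly pull the centre
            towards the farthest point; the touched points form a core-set
            of size O(1/eps^2)."""
            c = X[0].astype(float)
            core = {0}
            for i in range(1, int(np.ceil(1.0 / eps ** 2)) + 1):
                far = int(((X - c) ** 2).sum(1).argmax())  # farthest point
                core.add(far)
                c += (X[far] - c) / (i + 1)                # step towards it
            r = np.sqrt(((X - c) ** 2).sum(1).max())
            return c, r, sorted(core)

        # Usage: the ball (c, r) encloses all points; only the core-set
        # points are needed to reproduce it, which is what enables training
        # on a drastically reduced data set.
        X = np.random.default_rng(1).normal(size=(1000, 8))
        c, r, core = meb_coreset(X, eps=0.1)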

  • Backtracking Based Integer Factorisation, Primality Testing and Square Root Calculation
    Mohammed Golam Kaosar, Charles Sturt University, Australia
    ABSTRACT

    Breaking a big integer into two factors has been a famous problem in the fields of mathematics and cryptography for years. Many crypto-systems use such a big number as their key, or part of a key, on the assumption that it is so big that the fastest factorisation algorithms running on the fastest computers would take an impractically long period of time. Hence, many efforts have been made over the decades to break those crypto-systems by finding the two factors of an integer. In this paper, a new factorisation technique is proposed which is based on the concept of backtracking. Binary bit-by-bit operations are performed to find two factors of a given integer. This proposed solution can be applied in computing square roots, primality testing, finding prime factors of integer numbers, etc. If the proposed solution is proven to be efficient enough, it may break the security of many crypto-systems. Implementation and performance comparison of the technique are left for future research.
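
    The paper's exact procedure is not given in the abstract, but the bit-by-bit backtracking idea can be sketched: fix the factors' bits from the least significant end upward and prune any partial assignment whose product disagrees with n modulo the bits fixed so far. A minimal Python illustration (an assumption about the approach, not the authors' algorithm), restricted to odd n:

        def factor_backtrack(n):
            """Find p, q > 1 with p * q == n by fixing bits of both factors
            from the least significant upward, backtracking on mismatch."""
            assert n % 2 == 1 and n > 1

            def extend(p, q, bit):
                if p * q == n and p > 1 and q > 1:
                    return p, q
                if p * q >= n or (1 << bit) > n:   # dead branch: prune
                    return None
                mod = 1 << (bit + 1)
                for a in (0, 1):
                    for b in (0, 1):
                        p2, q2 = p | (a << bit), q | (b << bit)
                        # Low bits of p2*q2 must already match those of n.
                        if (p2 * q2) % mod == n % mod:
                            found = extend(p2, q2, bit + 1)
                            if found:
                                return found
                return None

            return extend(1, 1, 1)   # both factors are odd, so bit 0 is 1

        print(factor_backtrack(3 * 5))       # e.g. (5, 3)
        print(factor_backtrack(101 * 103))   # (101, 103) in some order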

  • Factors Affecting BMI: A Comprehensive Study
    Mohsin Manshad Abbasi, University of Azad Jammu & Kashmir, Pakistan
    ABSTRACT

    Living beings are vulnerable to disease, and humans are no exception. Most diseases today are caused by unhealthy, non-hygienic food or a sedentary lifestyle. In some situations both of these factors work together: people get used to eating oily, fatty products while leading very inactive lives, which can result in conditions such as heart disease. According to some surveys, the highest percentage of deaths in the world is due to heart disease. This paper is based on the relation between a very simple measure of health called the Body Mass Index (BMI) and the diseases that may result from abnormal BMI values. BMI is a standard measure relating an individual's weight to height. A standardized range of values is considered normal, and values outside it are considered under or over normal. For this study a questionnaire was designed comprising the most common factors that affect BMI values and was circulated among a population to analyze which factors are more dominant in affecting the BMI of that population. After analyzing the dominant factors, the advice of health physicians was sought on normalizing the dominant factors affecting the BMI values of that population. Studies have been made in the past to analyze the effect of BMI on health and how to normalize BMI values. However, none of them considered as many BMI factors, or in as detailed a manner, as this study.
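
    For reference, the BMI computation the study relies on is straightforward; a minimal Python sketch using the standard formula and the WHO category cut-offs (the "standardized range of values" the abstract mentions):

        def bmi(weight_kg, height_m):
            """Body Mass Index: weight divided by height squared (kg/m^2)."""
            return weight_kg / height_m ** 2

        def category(b):
            # WHO ranges: <18.5 underweight, 18.5-24.9 normal, 25-29.9 overweight.
            if b < 18.5:
                return "underweight"
            if b < 25:
                return "normal"
            if b < 30:
                return "overweight"
            return "obese"

        b = bmi(82, 1.75)
        print(f"BMI = {b:.1f} ({category(b)})")   # BMI = 26.8 (overweight)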

  • Improving Processes in Small Businesses
    Shafiq Ur Rehman, Brunel, UK
    ABSTRACT

    Most organizations in the Pakistani software industry have fewer than a hundred employees; we term these "small organizations" or "small businesses". Generally, small organizations have many constraints and limitations, including budget, resources and product time to market, which prevent them from initiating improvement programs. These programs are intended to improve different areas within an organization. As organizations continue to grow, it becomes necessary for them to have well-defined processes along with programs to continuously improve these processes and inject quality into them. Having quality processes ensures the quality of the software and helps reduce extra cost and rework, increases productivity and shortens the time to market. The objective of this research is to analyze how small organizations can improve their processes despite their limitations. We gather information from small organizations in Pakistan in order to understand the environment, culture, issues and the way they operate. We then present the ρ (RHO) process improvement model, which is tailor-made for small organizations. The goal is to make small organizations productive, competitive and agile by improving the quality of their processes and products.

  • An Empirical Research on the Effects of Mobile Payment Adoption
    Shafiq Ur Rehman, Brunel, UK
    ABSTRACT

    The purpose of this research is to understand what factors facilitate the adoption of mobile payment in the Pakistani context and the potential issues caused by the adoption of mobile payments. Usability is a significant issue in m-payment systems and affects their acceptance. Prototyping procedures aid the life cycle of software and electronic product development by diagnosing usability problems and providing metrics for making comparative decisions. Usability depends on the affective appeal and visual reliability of the design. The m-payment usability dimension evaluates the usability of mobile devices for the purpose of assessment among users, and identifies diagnostic information on usability factors to improve specific usability scopes and the associated interface design. The contributions of this research will ease the adoption process by providing factors of user satisfaction and will help in the proactive management of upcoming issues.

  • Parameter Space and Comparative Analyses of Energy Aware Sensor Communication Protocols
    Ittipong Khemapech, University of the Thai Chamber of Commerce, Thailand
    ABSTRACT

    Energy conservation is one of the important issues in communication protocol development for Wireless Sensor Networks (WSNs). WSNs are shared-medium systems; consequently, a Medium Access Control (MAC) protocol is required to resolve contention. The features of the MAC, together with the application behaviour, determine the communication states, which have different power requirements. The power level used for a transmission affects both the effective range of the transmission and the energy used. The Power & Reliability Aware Protocol (PoRAP) has been developed to provide efficient communication by means of energy conservation without sacrificing reliability. It is compared to Carrier Sense Multiple Access (CSMA), Sensor-MAC (S-MAC) and Berkeley-MAC (B-MAC) in terms of energy consumption. The analysis begins with a parameter space study to discover which attributes affect energy conservation performance. Parameter settings from the Great Duck Island deployment are used for the comparative study. According to the results, PoRAP consumes the least amount of communication energy and is applicable when the percentage of slot usage is high.

  • Software Reliability Improvement using Empirical Bayesian Method
    D. Vivekananda Reddy and A. Ramamohan Reddy, Sri Venkateswara University College of Engineering
    ABSTRACT

    The main objective of any software testing is to improve software reliability. Many previous testing methods did not pay much attention to improving the software testing strategy on the basis of software reliability improvement. The reason is that the relationship between software testing and software reliability is very complex, owing to the complexity of software products and the development processes involved. However, any testing strategy intended to improve reliability must be able to predict that reliability. For this purpose an approach called model predictive control is used, which provides a good framework for improving that predictive effect. The main issue in model predictive control is how to estimate the parameter of concern. In this case, the Empirical Bayesian method is used to estimate the parameter of concern: reliability. The proposed software reliability improvement using the Empirical Bayesian method can optimize the test allocation scheme online. The case study shows that a software testing method that finds more defects than others does not necessarily achieve higher reliability. It also shows that the proposed approach gives better results, in the sense of improving reliability, than random testing.
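
    The abstract leaves the estimator unspecified; one common empirical Bayes setup for count data is a Gamma prior over a Poisson defect rate, with the prior fitted from the data itself by the method of moments. A minimal sketch with hypothetical defect counts (an illustration of empirical Bayes shrinkage, not the paper's exact model or data):

        import numpy as np

        # Hypothetical defect counts found in equal-effort test runs of modules.
        counts = np.array([0, 2, 1, 5, 0, 3, 1, 0, 2, 7])

        # Empirical Bayes with a Gamma(alpha, beta) prior on each Poisson rate:
        # fit the prior to the data by the method of moments, using the
        # Gamma-Poisson marginal relation v = m + m^2 / alpha.
        m, v = counts.mean(), counts.var()
        beta = m / max(v - m, 1e-9)
        alpha = m * beta

        # Posterior mean rate for a module with count x over t test runs is
        # (alpha + x) / (beta + t): raw estimates shrink towards the prior mean.
        post_mean = (alpha + counts) / (beta + 1)
        print(alpha, beta, post_mean.round(2))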

  • A 10-bit 25MSPS Low Power Pipeline ADC for Mobile HDTV Receiver System
    Pratik Akre, Krushna Dongre and K. Sarangam, Yeshwantrao Chavan College of Engineering, Nagpur, India
    ABSTRACT

    This paper describes a 10-bit 25 MSPS analog-to-digital converter (ADC) for a mobile HDTV receiver system. The ADC is based on a 4-3-3 bits-per-stage pipeline architecture. The proposed pipelined ADC adopts an optimized stage resolution based on the power consumption of the sample-and-hold circuit and the comparator. At the target sampling rate of 25 MS/s, measured results show that the converter consumes 12.36 mW from a 1.8 V power supply and achieves 56 dB SNR and 60 dB SFDR.

  • Conflict Resolution and Service Recommendation in Context Aware Environment
    U. P. Kulkarni and Thyagaraju G. S., SDMCET, India
    ABSTRACT

    The research work presented in this paper proposes algorithms to resolve service conflicts among users and recommend appropriate services in a context-aware environment. The proposed algorithms are designed using contextual parameters such as personal, social, location, environmental, temporal, mood and activity. In addition to the contextual parameters, the proposed algorithms utilize explicit or implicit user ratings and watching history to resolve any conflict while recommending services. The proposed algorithms are formulated by exploiting AI techniques such as fuzzy logic, Bayesian probability and rough-set-based decision rules. In the presented research work, the usage of the algorithms is demonstrated in the design of a context-aware TV. The goal of the research work is to enable context-aware applications to offer socialized and personalized services to single or multiple users by resolving service conflicts among users.

  • Modeling User Context using CEAR Diagram
    Ravindra Dastikop, Thyagaraju G. S. and U. P. Kulkarni, SDM College of Engineering and Technology, India
    ABSTRACT

    Even though the number of context-aware applications is increasing day by day along with their users, there is still no generic programming paradigm for context-aware applications. This situation could be remedied by designing and developing an appropriate context model and programming paradigm for context-aware applications. In this paper we propose a static context model and metrics for validating the expressiveness and understandability of the model. The proposed context model is a way of describing the situation of a user using context entities, attributes and relationships. The model, which is an extended and hybrid version of the ER model, the ontology model and the graphical model, is specifically meant for expressing and understanding the user situation in a context-aware environment. The model is useful for understanding context-aware problems, preparing documentation and designing programs and databases. The model makes use of the context entity attribute relationship (CEAR) diagram for representing the associations between context entities and attributes. We have identified a new set of graphical notations for improving the expressiveness and understandability of context from the end-user perspective.

  • A Machine Learning Model for Stock Market Price Prediction
    Omar S. Soliman and Mustafa Abdul Salam, Cairo University, Egypt
    ABSTRACT

    Stock market prediction is the act of trying to determine the future value of a company stock or another financial instrument traded on a financial exchange. Successful prediction of a stock's future price will maximize investors' gains. This paper proposes a machine learning model to predict stock market prices. The proposed algorithm integrates Particle Swarm Optimization (PSO) and the Least Squares Support Vector Machine (LS-SVM). The PSO algorithm is employed to optimize the LS-SVM, which predicts daily stock prices based on the study of historical stock data and technical indicators. PSO selects the best combination of free parameters for the LS-SVM to avoid over-fitting and local minima and to improve prediction accuracy. The proposed model was applied and evaluated using twelve benchmark financial datasets and compared with an artificial neural network trained with the Levenberg-Marquardt (LM) algorithm. The obtained results showed that the proposed model has better prediction accuracy and demonstrated the potential of the PSO algorithm for optimizing the LS-SVM.
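
    A compact sketch of the PSO-over-LS-SVM idea: each particle encodes the LS-SVM free parameters (regularization gamma and RBF width sigma), fitness is validation error, and the LS-SVM itself is fitted by solving its linear KKT system. The toy series, window features, search ranges and PSO constants below are illustrative assumptions, not the paper's data or settings.

        import numpy as np

        rng = np.random.default_rng(0)

        def rbf(A, B, sigma):
            d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d / (2 * sigma ** 2))

        def lssvm_fit(X, y, gamma, sigma):
            """LS-SVM regression: solve the (n+1)x(n+1) linear KKT system."""
            n = len(X)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = A[1:, 0] = 1.0
            A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
            sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
            b, alpha = sol[0], sol[1:]
            return lambda Xq: rbf(Xq, X, sigma) @ alpha + b

        def val_error(params, Xtr, ytr, Xva, yva):
            gamma, sigma = np.exp(params)       # search in log-space
            pred = lssvm_fit(Xtr, ytr, gamma, sigma)(Xva)
            return ((pred - yva) ** 2).mean()

        def pso(fitness, dim=2, n=15, iters=30, w=0.7, c1=1.5, c2=1.5):
            pos = rng.uniform(-3, 3, (n, dim))
            vel = np.zeros((n, dim))
            pbest, pcost = pos.copy(), np.array([fitness(p) for p in pos])
            g = pbest[pcost.argmin()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((n, dim)), rng.random((n, dim))
                vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
                pos = np.clip(pos + vel, -5, 5)
                cost = np.array([fitness(p) for p in pos])
                better = cost < pcost
                pbest[better], pcost[better] = pos[better], cost[better]
                g = pbest[pcost.argmin()].copy()
            return g

        # Toy "price" series stands in for historical data plus indicators.
        t = np.arange(120, dtype=float)
        y = np.sin(t / 10) + 0.05 * rng.normal(size=120)
        X = np.stack([y[i:i + 5] for i in range(110)])   # 5-day windows
        target = y[5:115]                                # next-day value
        Xtr, ytr, Xva, yva = X[:80], target[:80], X[80:], target[80:]

        best = pso(lambda p: val_error(p, Xtr, ytr, Xva, yva))
        gamma, sigma = np.exp(best)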

  • A Method to Identify Potential Ambiguous Malay Words through Ambiguity Attributes Mapping: An Exploratory Study
    Hazlina Haron and Abdul Azim Abd. Ghani, Universiti Putra Malaysia, Malaysia
    ABSTRACT

    We describe here a methodology to identify a list of ambiguous Malay words that are commonly used in Malay documentation such as requirement specifications. We compiled several relevant and appropriate requirement quality attributes and sentence rules from the previous literature and adapted them to come up with a set of ambiguity attributes that best suit Malay words. The extracted potentially ambiguous Malay words are then mapped onto the constructed ambiguity attributes to confirm their vagueness. The list is then verified by Malay linguistics experts. This paper aims to publish a list of potentially ambiguous Malay words in an attempt to assist writers in avoiding vague words when documenting Malay requirement specifications, as well as in any other related Malay documentation. The result of this study is a list of 120 potentially ambiguous Malay words that can act as a guideline in writing Malay sentences.

  • An Explicit Trust Model Towards Better System Security
    Orhio Mark Creado, Balasubramaniam Srinivasan, Phu Dung Le and Jefferson Tan, Monash University, Australia
    ABSTRACT

    Trust is an absolute necessity for digital communications, but it is often viewed as an implicit singular entity. The use of the internet as the primary vehicle for information exchange has made accountability and verifiability of system code almost obsolete. This paper proposes a novel approach towards enforcing system security by requiring the explicit definition of trust for all operating code. By identifying the various classes and levels of trust required within a computing system, trust is defined as a combination of individual characteristics. Trust is then represented as a calculable metric obtained through the collective enforcement of each of these characteristics to varying degrees. System security is achieved by allowing trust to be a constantly evolving aspect of each operating code segment, capable of getting stronger or weaker over time.

  • User preferred color combination design using interactive Genetic Algorithm
    Tad Gonsalves, Sophia University, Japan
    ABSTRACT

    Selecting the right combination of colors in designing a product is often a difficult task. In this paper, the author proposes a decision support system for modeling color combination design suited to personal preferences using an interactive Genetic Algorithm (iGA). iGA differs from the traditional GA in that it leaves the evaluation of the objective function to the personal preferences of the user. The iGA interactive system is capable of creating an unlimited number of color combination options taking into consideration the preferences of the user. The user chooses and indicates his/her preferences and directs the process of optimizing the color combination. The final outcome of this user-system interaction is a color combination design which the user might not even have imagined before he/she began interacting with the system. An application of the proposed interactive system for selecting color combinations of clothing items is also presented. Users find the system efficient, user-friendly and responsive in real time, without causing any user fatigue.
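
    A minimal console sketch of the iGA loop: the user's ratings replace the objective function, and ordinary selection, crossover and mutation evolve the palettes. Population size, mutation step and the three-generation cap are illustrative choices, not the paper's settings.

        import random

        random.seed(42)
        POP, GENES = 6, 3          # 6 palettes per round, 3 colours per palette

        def random_palette():
            return [tuple(random.randrange(256) for _ in range(3))
                    for _ in range(GENES)]

        def crossover(a, b):
            cut = random.randrange(1, GENES)
            return a[:cut] + b[cut:]

        def mutate(p, rate=0.2):
            return [tuple(min(255, max(0, c + random.randint(-40, 40)))
                          if random.random() < rate else c for c in colour)
                    for colour in p]

        population = [random_palette() for _ in range(POP)]
        for gen in range(3):                 # short rounds limit user fatigue
            print(f"--- generation {gen} ---")
            scores = []
            for i, p in enumerate(population):
                print(i, " ".join("#%02x%02x%02x" % c for c in p))
                scores.append(float(input(f"rate palette {i} (0-10): ")))
            # Fitness is the user's rating; select, recombine, mutate.
            ranked = [p for _, p in sorted(zip(scores, population),
                                           key=lambda t: -t[0])]
            parents = ranked[:POP // 2]
            population = parents + [
                mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]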

  • Open Multiprocessing Aided Overlapped Motion Compensated Temporal Interpolation
    Madiha Sher1, Nasru Minallah1, Muhammad Asif Manzoor2, Munaza Sher1, 1University of Engineering & Technology, Pakistan, 2Umm Al-Qura University, Kingdom of Saudi Arabia
    ABSTRACT

    Many of today's multimedia applications demand low bit rate transmission of video sequences due to the limited bandwidth of transmission channels. Video compression is particularly required by these applications for the reception of acceptable video quality at the receiver. An important part of many video compression techniques is motion compensation. Overlapped Motion Compensated Temporal Interpolation (OMCTI) is a block-based search approach for the temporal interpolation of skipped frames. It generates interpolated frames with considerably improved video quality at the receiver. Motion compensation is a computationally complex and data-intensive operation. Multi-core processors have captured a major portion of the market due to their enhanced computational capabilities, since the growth in single-core microprocessor performance is limited by semiconductor scaling and the associated power and thermal challenges; multi-core CPUs have thus become the mechanism for enhancing processor performance beyond these limits. Parallel processing is changing the way we live. In this work, we speed up motion compensation by leveraging multi-core processors, and an OpenMP-based multithreaded approach is established to reduce the computational complexity of OMCTI. The performance of the proposed multi-core technique is evaluated against a single-core baseline in order to analyze the performance tradeoffs. The paper concludes with a discussion of the experimental results: the multi-core implementation achieves performance enhancements of 30%-50% in different scenarios, while the single-core baseline improves by at most 5%.

  • Dynamic Abstraction Model Checking
    Wanwu Wang, Iowa State University, USA
    ABSTRACT

    This paper presents a new approach to model checking using two abstractions: Universal Abstraction and Existential Abstraction. These new techniques can check both Existential fragment of Computational Tree Logic (ECTL) and Universal fragment of Computational Tree Logic (ACTL) specifications. I developed a model checker, called LOTUS, building upon these new techniques with a traditional fixed-point algorithm on Linux. Experimental results confirmed the feasibility and validity of this new dynamic model checking technique. In this paper, I aim to provide a complete picture of this dynamic model checking algorithm, ranging from design details to implementation-related issues and experiments on the Dining Philosophers Problem.

  • On Demand-based Frequency Allocation to Mitigate Interference in Femto-Macro LTE Cellular Network
    Shahadate Rezvy, Middlesex University, UK
    ABSTRACT

    Long Term Evolution (LTE) has introduced femtocell technology into cellular mobile communication systems in order to enhance indoor coverage. A femtocell is a low-power, very small and cost-effective cellular base station used in indoor environments. However, the impact of femtocells on the performance of the conventional macrocell system leads to an interference problem between femtocells and pre-existing macrocells, as they share the same licensed frequency spectrum. In this paper, we propose an efficient method to mitigate interference and improve system capacity in existing femto-macro two-tier networks. In our proposed scheme, we use a novel frequency planning for two-tier cellular networks using a frequency reuse technique, where macro base stations allocate frequency sub-bands to femtocell users on a demand basis through femtocell base stations. This novel frequency reuse technique aims to mitigate interference while improving system throughput.

  • Word Hypothesis and Information Retrieval for Syntactic analysis of Kannada Script
    Keshava Prasanna, S. C. Sharma and Thungamani M., BMS Institute of Technology, India
    ABSTRACT

    The output of even the most effectively designed OCR (Optical Character Recognition) module is not 100% accurate, and hence errors occur in the identification of letters, in turn leading to erroneous words. This motivates the use of spell checkers for syntactic analysis of the words output by the OCR, and the need to verify the grammatical correctness of the sentences formed using the candidate words. Spell checkers can be used to eliminate typographic errors and form the heart of modern-day natural language processing. An input word is taken from the user and searched for in a static data dictionary. The data dictionary is implemented using a Ternary Search Tree (TST) and a suffix tree as the primary data structures.
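
    A minimal Python sketch of the TST used for dictionary lookup (insertion and membership testing only; the suffix tree side of the system is omitted):

        class TSTNode:
            __slots__ = ("ch", "lo", "eq", "hi", "is_word")
            def __init__(self, ch):
                self.ch, self.is_word = ch, False
                self.lo = self.eq = self.hi = None

        class TernarySearchTree:
            """Dictionary lookup structure: each node branches on less-than,
            equal and greater-than comparisons with one character."""
            def __init__(self):
                self.root = None

            def insert(self, word):
                self.root = self._insert(self.root, word, 0)

            def _insert(self, node, word, i):
                ch = word[i]
                if node is None:
                    node = TSTNode(ch)
                if ch < node.ch:
                    node.lo = self._insert(node.lo, word, i)
                elif ch > node.ch:
                    node.hi = self._insert(node.hi, word, i)
                elif i + 1 < len(word):
                    node.eq = self._insert(node.eq, word, i + 1)
                else:
                    node.is_word = True
                return node

            def contains(self, word):
                node, i = self.root, 0
                while node:
                    if word[i] < node.ch:
                        node = node.lo
                    elif word[i] > node.ch:
                        node = node.hi
                    elif i + 1 < len(word):
                        node, i = node.eq, i + 1
                    else:
                        return node.is_word
                return False

        # Usage: flag OCR output words missing from the dictionary.
        tst = TernarySearchTree()
        for w in ("ಕನ್ನಡ", "ಲಿಪಿ", "ಪದ"):     # sample Kannada dictionary entries
            tst.insert(w)
        print(tst.contains("ಲಿಪಿ"), tst.contains("ಲಿಪ"))   # True False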

  • Mining Developer Communication Data Streams
    Andy M. Connor, Jacqui Finlay and Russel Pears, Auckland University of Technology, New Zealand
    ABSTRACT

    This paper explores the concept of modelling a software development project as a process that results in the creation of a continuous stream of data. In terms of the Jazz repository used in this research, one aspect of that stream of data is developer communication. Such data can be used to create an evolving social network characterized by a range of metrics. This paper presents the application of data stream mining techniques to identify the most useful metrics for predicting build outcomes. Results are presented from applying the Hoeffding Tree classification method in conjunction with the Adaptive Sliding Window (ADWIN) method for detecting concept drift. The results indicate that only a small number of the metrics considered have any significance for predicting the outcome of a build over time.
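
    ADWIN's core idea, checking whether two sub-windows of recent observations have significantly different means and discarding the older part when they do, can be sketched compactly. This is a simplified illustration only: the real ADWIN uses exponential histograms for efficiency, and a Hoeffding tree would supply the build-outcome predictions being monitored.

        import math
        from collections import deque

        class SimpleADWIN:
            """Simplified ADWIN-style detector: keep a window of recent build
            outcomes (1 = success, 0 = failure) and drop the oldest part when
            two sub-windows differ by more than a Hoeffding-style bound."""
            def __init__(self, delta=0.002, max_len=500):
                self.delta = delta
                self.window = deque(maxlen=max_len)

            def add(self, x):
                self.window.append(x)
                return self._detect_and_shrink()

            def _detect_and_shrink(self):
                w = list(self.window)
                n = len(w)
                for split in range(5, n - 5):        # skip tiny sub-windows
                    n0, n1 = split, n - split
                    m0 = sum(w[:split]) / n0
                    m1 = sum(w[split:]) / n1
                    inv = 1 / n0 + 1 / n1
                    eps = math.sqrt(0.5 * inv * math.log(4 / self.delta))
                    if abs(m0 - m1) > eps:           # drift: forget the past
                        for _ in range(split):
                            self.window.popleft()
                        return True
                return False

        # Usage: feed a stream of build outcomes; True marks detected drift.
        stream = [1] * 60 + [0] * 40
        det = SimpleADWIN()
        drifts = [i for i, x in enumerate(stream) if det.add(x)]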

  • Securing Mobile Cloud Using Finger Print Authentication
    Iehab ALRassan and Hanan AlShaher, King Saud University, Saudi Arabia
    ABSTRACT

    Mobile cloud computing has become part of mobile users' daily transactions. Mobile devices with internet capabilities have increased the use of mobile cloud computing. Due to hardware limitations, mobile devices cannot install and run applications that require heavy CPU processing or extensive memory. Cloud computing allows mobile users to synchronize their data with remote storage and to utilize applications that require heavy CPU processing or extensive memory, such as Microsoft Office or Adobe Photoshop, as they would run on a desktop computer. The combination of cloud computing and mobile computing introduces mobile cloud computing, which also presents new security threats, such as unauthorized access to resources that exist in the mobile cloud. Protecting mobile cloud computing from illegitimate access has become an important concern for mobile users. This paper proposes and implements a new user authentication mechanism for mobile cloud computing using a fingerprint recognition system. Fingerprint images of mobile users can be captured and processed using the mobile phone camera to access mobile cloud computing. The implementation of the proposed solution on different mobile operating systems and devices shows a security enhancement in mobile cloud computing with an accepted performance level.

  • Effect of colours in manual data typing
    Melih Kirlidog, Marmara University, Turkey
    ABSTRACT

    Although there is a large body of literature on research into colour in human-computer interaction, the overwhelming majority of the literature emphasizes cognition by computer users. However, colour is also important in this interaction when users manually type data into a computer. This paper investigates the effect of colour combinations on manual data typing. To this end, three experiments were conducted in which the subjects were requested to read several texts with different colour combinations and re-type them on the same screen. Typing accuracy and speed are measured as the dependent variables across the different colour combinations. In the first experiment, the display and input windows were close to each other, and in the second they were located at opposite ends of the screen. The third experiment was a subset of the first with reversed foreground and background colours. It was found that different colour combinations had varying effects on data typing performance, and that the proximity of the display and input windows was not a significant factor for typing accuracy on a 17-inch screen. The effect of reversing the foreground and background colours was inconclusive with the colour combinations used.

  • National Intelligence Grid
    D. Haritha and M. Krishna Madhav, SRK Institute of Technology, India
    ABSTRACT

    The National Intelligence Grid (NATGRID) aims to ensure a readily available, real-time information sharing platform between the intelligence agencies and e-governance organizations of India. NATGRID mainly deals with integrating different government, quasi-government and private e-governance organizations in India. This paper aims to provide a framework for integrating the various departments and providing information access to 11 Indian investigation agencies such as the National Investigation Agency (NIA), the Intelligence Bureau and the Research and Analysis Wing. NATGRID uses the Aadhaar ID, a 12-digit unique number issued by the Unique Identification Authority of India (UIDAI) to all residents of India. The intelligence agencies are given privileges to access only the necessary databases, based on their functionalities. The agencies can also interact with other agencies and share critical information. This paper addresses the analysis and design of the NATGRID system. The design, linkage and interaction among the 21 databases are discussed. The necessary architectural design and a preliminary implementation are also addressed.

  • Task Pooler: A System for Proactive Desktop Assistance
    Hamid Turab Mirza1, Ling Chen2, Ibrar Hussain2 and Gencai Chen2, 1COMSATS Institute of Information Technology, Pakistan and 2Zhejiang University, China
    ABSTRACT

    There has been a growing interest in assisting knowledge workers by reusing their own work experiences. However, due to the highly unstructured and multitasking nature of knowledge work, it is very challenging to capture the essence of users' experiences and then automatically deliver the aggregated knowledge as lightweight assistance. In this paper we present a new technique that first takes advantage of the temporal aspects of user behaviour to identify the activities of desktop users; second, it transforms the extracted activities into human-readable tasks; and finally, it translates the outcomes of individual experiences into an enriched reusable task repository. We then present "Task Pooler", an interactive task assistance system embodying these ideas, along with a user study to test the validity of the approach, and present its results.

  • Private Cloud for Thin Clients and Handheld Devices
    Amogh Antarkar, Harshad Karbhare, Pratik Ghorpade and Sanjay Thakare, RSCOE, University of Pune, India
    ABSTRACT

    Cloud computing has become one of the most discussed terms in the computing world, with its immense scalability, high processing capabilities and cost efficiency. Contemporary virtualization technologies are restricted to desktop environments and are not used for thin clients and handheld devices. This paper explores how to set up a virtual desktop environment for handheld devices and thin clients. In the process, the paper discusses the conception of cloud technology and how it can impact enterprises at the macro and micro levels. Cloud computing technologies and frameworks have evolved over time. Through our research and findings we discuss how an open-source cloud can be built for an organization. We discuss components such as Xen, Xen ARM, Eucalyptus and Open vSwitch that are needed to set up a private cloud. The paper overcomes the limitations of BYOD (bring your own device) using the power of virtualization.

  • A Framework with Scalable Storage for Simulation of Highly Detailed Protocol Stacks in MANETs
    M. Colizza1, F. Santucci1, V. Palma2, A. Neri2, C. Armani3, M. Faccio1 and L. De Nardis4, 1University of L'Aquila, Italy, 2University of ROMA TRE, Italy, 3Selex Elsag, Italy and 4Sapienza University of Rome, Italy
    ABSTRACT

    Simulation experiments are widely used in the domain of Mobile Ad hoc Networks (MANETs) to assess the performance of design activities. These experiments must model the network topology, network traffic, routing and other network protocols, node mobility, physical layer issues, including the radio frequency channel, terrain and antenna properties, and, perhaps, energy and battery characteristics. Accurate models are needed in order to run high-fidelity simulations; furthermore, different models need to be defined to emulate realistic scenarios. Consequently, each simulation run requires hardware and software resources appropriate to support the computational effort and the management of the large amount of data that the simulations generate. An approach to the management of simulation data has recently been defined and named the MANET Simulation Data Storage Methodology (MSDSM) [3]. In the present paper we propose a consistent framework that is based on MSDSM and specifically provide a multi-threaded implementation of the System for Storage Management (SSM). We then provide an environment for the development of a complete system for setting up test beds for the validation of protocol stacks.

  • Image Shadow Detection and Classification Based on Rough Sets
    Tie-Min Chen, Jian-Fang Deng and Jun-Liang Pan, Hunan Mobile Communication Company, China
    ABSTRACT

    The disadvantages of domestic and overseas shadow image edge detection algorithms are analyzed. A novel shadow detection algorithm based on edge growing and rough set theory, together with a subsequent solution, is proposed. We describe how to detect image edges using the condition attributes of rough sets. In addition, a method of thinning and connecting shadow edges using edge growing from the edge nodes is proposed. As can be seen from the experimental analysis, the proposed method has better performance in edge detection and image segmentation.

  • Towards A Semantic for UML Activity Diagram Based on Institution Theory for Its Transformation to Event-B Model
    Aymen Achouri, University of Tunis, Tunisia
    ABSTRACT

    In this article, we define an approach for model transformation. We use the example of the UML Activity Diagram (UML AD) and Event-B as source and target formalisms. Before doing the transformation, a formal semantics is given to the source formalism. We use institution theory to define the intended semantics. With this theory, we gain an algebraic specification for the formalism. Thus, the source formalism is defined with its own natural semantic meaning, without any intermediate semantics. Model transformation is performed by a set of transformation schemas which preserve the semantics expressed in the source model during the transformation process. The generated model, expressed in the Event-B language, is used for the formal verification of the source model. As a result, for a model expressed in this formalism, its verification can be seen as the verification of the semantically equivalent Event-B model. Thus, in the present work we combine institution theory, the Event-B method and graph grammars to develop an approach supporting the specification, transformation and verification of UML AD.

  • Energy Efficient load Balanced Routing Protocol for Wireless Sensor Networks
    Alghanmi Ali Omar and ChongGun Kim, Yeungnam University, South Korea
    ABSTRACT

    Due to the enormous range of applications of wireless sensors, research on wireless sensor networks has remained active throughout the past two decades. Because of the miniaturization of sensor nodes and their limited batteries, energy efficiency and energy balancing are much-needed properties for extending the lifetime of sensor networks. This study proposes an energy-aware directional routing protocol for stationary wireless sensor networks. The routing algorithm is non-table-driven and destination-aware; packet-forwarder nodes are selected on the basis of an admissible heuristic logical distance, and the packet forwarding direction is also determined by a very simple method. The algorithm is designed for 1-hop, 2-hop and combined '2-hop & 1-hop' communication methods. The energy balancing mechanism of this paper is based on two state thresholds, and simulation results show its superiority over existing directional routing protocols for wireless sensor networks.

  • Performance Analysis of an Integrated Cellular and Ad Hoc Relay System
    Madhu Jain1 and Nidhi Agarwal2, 1IIT-Roorkee, India and 2I.B.S, India
    ABSTRACT

    In this paper, we propose a new wireless system architecture based on the integration of cellular and modern ad hoc relaying technologies. It can efficiently balance traffic loads and share channel resources between cells by using ad hoc relaying stations to relay traffic from one cell to another dynamically. However, application demand and allocation could lead to congestion if the network has to maintain high resources for the quality of service (QoS) requirements of the applications. In our system, a handoff area and a queue are taken into consideration, and new and handoff calls are given priority accordingly. We analyze the system performance in terms of the call blocking probability and queueing delay for new call requests and the call dropping probability for handoff requests. Numerical illustrations are provided with the help of the Successive Over-Relaxation (SOR) method. In order to improve the performance of the base station, the trade-off between the number of service channels and the QoS of the base station must be considered.
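
    The abstract names Successive Over-Relaxation as the numerical workhorse; a minimal generic SOR solver in Python/NumPy (the toy system below merely stands in for the model's balance equations):

        import numpy as np

        def sor(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
            """Successive Over-Relaxation for A x = b: a Gauss-Seidel sweep
            with each update over-relaxed by omega (1 < omega < 2)."""
            x = np.zeros_like(b, dtype=float)
            for _ in range(max_iter):
                x_old = x.copy()
                for i in range(len(b)):
                    s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                    x[i] = (1 - omega) * x_old[i] + omega * (b[i] - s) / A[i, i]
                if np.linalg.norm(x - x_old, np.inf) < tol:
                    return x
            return x

        # Usage on a small diagonally dominant system (a stand-in for the
        # steady-state balance equations of the queueing model):
        A = np.array([[4.0, -1, 0], [-1, 4, -1], [0, -1, 4]])
        b = np.array([2.0, 4, 10])
        print(sor(A, b))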

  • A study of the performance parameters for recommendation algorithm evaluation
    Chhavi Rana and Sanjay Kumar Jain
    ABSTRACT

    The enormous amount of Web data has challenged its efficient usage in the past few years. As such, a range of techniques has been applied to tackle this problem; prominent among them are personalization and recommender systems. In fact, these are the tools that assist users in finding relevant information on the web. Most e-commerce websites apply such tools in one way or another. In the past decade, a large number of recommendation algorithms have been proposed to tackle such problems. However, there has not been much research on evaluation criteria for these algorithms. As such, traditional accuracy and classification metrics are still used for evaluation purposes, providing only a static view. This paper studies how the evolution of user preferences over a period of time can be mapped in a recommender system using a new evaluation methodology that explicitly uses the time dimension. We also present different types of experimental setups that are generally used for recommender system evaluation. Furthermore, an overview of major accuracy metrics, and of metrics that go beyond the scope of accuracy as researched in the past few years, is also discussed in detail.

  • Process-Driven Software Development Methodology for Enterprise Information System
    Kwan Hee Han, Gyeongsang National University, Korea
    ABSTRACT

    In today's process-centered business organizations, it is imperative that enterprise information systems be converted from task-centered to process-centered systems. Traditional software development methodology is function-oriented: each function manages its own data, which results in redundancy because data belonging to one object are stored by several functions. Proposed in this paper is a process-driven software development methodology in which the business process is a major concern and workflow functionalities are identified and specified throughout the entire development life cycle. In the proposed methodology, the development process, modeling tools and deliverables are clarified explicitly. The proposed methodology can serve as a guideline for practitioners involved in enterprise software development, of which workflow is an essential part.

  • Development and Evaluation of a Web Based Question Answering System for Arabic Language
    Heba Kurdi, Sara Alkhaider and Nada Alfaifi, Al Imam Muhammad Ibn Saud Islamic University, SA
    ABSTRACT

    Question Answering (QA) systems are gaining great importance due to the increasing amount of web content and the high demand for digital information that regular information retrieval techniques cannot satisfy. A question answering system enables users to have a natural language dialog with the machine, which is required for virtually all emerging online service systems on the internet. The need for such systems is higher in the context of the Arabic language. This is because of the scarcity of Arabic QA systems, which can be attributed to the great challenges they present to the research community, including the particularities of Arabic, such as short vowels, the absence of capital letters and its complex morphology. In this paper, we report the design and implementation of an Arabic web-based question answering system, which we call "JAWEB", the Arabic word for the verb "answer". Unlike other Arabic question-answering systems, JAWEB is a web-based application, so it can be accessed at any time and from anywhere. Evaluating JAWEB showed that it gives the correct answer with 100% recall and 80% precision on average. When compared to ask.com, the well-established web-based QA system, JAWEB provided 15-20% higher recall. These promising results give clear evidence that JAWEB has great potential as a QA platform and is much needed by Arabic-speaking internet users across the world.

  • Human Interaction in the Regulation of Telecommunications Infrastructure Deployment in South Africa
    Sharol Sibongile Mkhomazi1 and Tiko Iyamu2, 1Tshwane University of Technology, South Africa and 2Polytechnic of Namibia, Namibia
    ABSTRACT

    Telecommunications is increasingly vital to society at large and has become essential to business, academic and social activities. Because access to telecommunications is a necessity, its deployment requires regulations and policy; otherwise, the deployment of the infrastructure would contribute to environmental and human complexities rather than ease of use. However, the formulation of regulation and policy for telecommunication infrastructure deployment involves agents such as people and processes. The roles of these agents are critical, and not as straightforward as they might be believed to be. This can be attributed to different factors, as the agents produce and reproduce themselves over time. This paper presents the results of a study which focused on the roles of agents in the formulation of regulation and policy for telecommunication infrastructure deployment. In the study, the interactions that take place among human and non-human agents were investigated. The study employed the duality of structure from Structuration Theory as a lens to understand the effectiveness of interactions in the formulation of regulations, and how policy is used to facilitate the deployment of telecommunications infrastructure in the South African environment.
    Keywords: Regulatory Authority, Telecommunications, Infrastructure sharing, Structuration Theory, Human Interaction