Accepted Papers

  • Human Odour Based Biometric Authentication System
    S.Divakar1, S.Selvakumar2 and T.Vignesh Ramkumar1, 1Adhiparasakthi College of Engineering, India and 2IFET Engineering College, India

    Biometrics is the current buzzword in the user authentication domain. Fingerprint and retinal scans, examples of biometric systems in use today, have the drawback that they are not foolproof. Recent surveys have revealed the uniqueness of human odour, which may join this elite list in the near future. The advantage lies in the fact that it is impossible to replicate human odour. This paper deals with the feasibility of creating a model system that authenticates people based on their body odour. The challenge is in designing a sensor that identifies every human by his or her scent. An abstract model of a system that implements this sensing and identification is proposed here. Such an authentication system is very useful in safeguarding bank vaults and documents of international repercussions from potentially smart anti-social crooks.

  • A Survey on the Application of Soft Computing, Datamining and Swarm Intelligence for Stock Market Forecasting
    Partha Roy, Ramesh Kumar, Sanjay Sharma and M.K.Kowar, Bhilai Institute of Technology, India

    This paper surveys recent literature on the application of soft computing, data mining and swarm intelligence to stock market forecasting. Its contribution is to explore the key areas where research is being undertaken and to identify the degree of success associated with the different research approaches.

  • Improving QoS For VANET In City Environment
    Anchala Ameena Mohammadi, G.S.Raj and R.Vimal Karthick, Vel Tech University, India

    Vehicular ad-hoc networks (VANETs) allow vehicles to form a self-organized network without the need for permanent infrastructure. As a prerequisite to communication, an efficient route between network nodes must be established, and it must adapt to the rapidly changing topology of vehicles in motion. Improving efficiency and fairness in the city environment has become a hot topic owing to its unique challenges, such as limited transmission distance, high mobility and poor link quality. As a result, providing quality of service in the city environment plays a great role in Intelligent Transportation Systems (ITS) [11]. Since there are more obstacles in the city environment, most existing protocols fail to provide high efficiency. The main goal of this paper is to propose a new ad hoc routing protocol, GCTAR, with improved performance in the highly mobile city environment of VANETs. GCTAR inherits the characteristics of geographic routing and adds features such as maintaining a cache of successful routes between source and destination pairs.

  • Image Segmentation by Modified MAP-ML Estimations
    M.S.Karande1 and D. B. Kshirsagar2, 1K.K. Wagh Polytechnic, India and 2S.R.E.S's COE, India

    Though numerous algorithms exist to perform image segmentation, several issues relate to their execution time. Image segmentation can be posed as a labeling problem under a probability framework. To estimate the label configuration, an iterative optimization scheme is implemented that alternately carries out the maximum a posteriori (MAP) estimation and the maximum likelihood (ML) estimation. In this paper the technique is modified so that it performs segmentation within a stipulated time period. Extensive experiments show that the results obtained are comparable with those of existing algorithms. The algorithm executes faster than the existing algorithm and gives automatic segmentation without any human intervention, with results that match image edges closely, in line with human perception.

  • Hybrid Double Layered Security against Data corruption and Node Compromises with Minimum Energy Consumption in WSN
    Geethu K Mohan and K R Ramesh Babu , Government Engineering College Painavu , India

    A Wireless Sensor Network (WSN) is a collection of sensors that are heterogeneous in nature. Data are sensed from the external environment, traverse the network and finally reach the sink. The central problem addressed here is data integrity throughout the network, from data sensing until the data reach the sink. If the data are corrupted, a considerable amount of energy is wasted each time the data are forwarded to the next node. The critical data-corruption attack is carried out by compromised nodes, and various strategies have been introduced to identify corrupted data and compromised nodes. This paper proposes a hybrid double-layered security strategy for sensed data. The first layer of security appends a keyed-hash message authentication code (HMAC) to the sensed data using the Secure Hash Algorithm (SHA-2/512), a robust algorithm that ensures message security throughout the network. The second layer is implemented as a variation of the ConstrAined Random Perturbation based pairwise keY (CARPY) mechanism. In the CARPY+ mechanism a successful key exchange between sender and receiver proves the sender node's identity; any failure in comparing the key extracted from the received message identifies the sender as a malicious node. The proposed methodology improves network performance by avoiding data corruption at the network layer and at the same time identifies compromised nodes.
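
    The first security layer described above can be sketched in a few lines; the key handling and packet layout here are illustrative assumptions, not taken from the paper.

```python
import hmac
import hashlib

TAG_LEN = 64  # SHA-512 digest length in bytes

def protect_reading(shared_key, reading):
    """Layer one: append an HMAC-SHA-512 tag to a sensed reading."""
    tag = hmac.new(shared_key, reading, hashlib.sha512).digest()
    return reading + tag

def verify_reading(shared_key, packet):
    """Return the reading if the tag verifies, else None (corruption suspected)."""
    reading, tag = packet[:-TAG_LEN], packet[-TAG_LEN:]
    expected = hmac.new(shared_key, reading, hashlib.sha512).digest()
    return reading if hmac.compare_digest(tag, expected) else None
```

    Any node that forwards a packet can re-verify the tag, so corrupted data is dropped early instead of wasting energy on further hops.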

  • Detection of Flood Attacks in DTN Using Rate Limiter Technique
    T.Abhishek Kumar, C.Balamurugan and M.Viswanathan, Vel Tech University, India

    A disruption-tolerant network (DTN) is a network designed so that intermittent communication problems have very little effect on the outcome. However, owing to limited network resources such as buffer space and bandwidth, a DTN is liable to flood attacks, in which attackers weigh the network down with so many packets that legitimate packets can no longer be sent or received between nodes. Flood attacks launched by outsiders (unauthorized nodes) can be prevented by authentication techniques, but attacks launched by insiders (authorized nodes) cannot. Many methods have been adopted to prevent flood attacks in other networks, but none has been deployed successfully for DTNs. To protect resources and defend against flood attacks, a rate-limiting technique should be adopted in which each node is set up with a limit L on the number of packets it can send into the network and a limit R on the number of replicas that can be created for each packet. We also introduce a technique for detecting application-level flood attacks and routing misconduct caused by malicious nodes.
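
    The rate limits L (packets sent) and R (replicas per packet) described above can be sketched as a small per-node bookkeeping class; the class and counter names are illustrative, not from the paper.

```python
class RateLimiter:
    """Per-node limits: L packets originated, R replicas created per packet."""

    def __init__(self, L, R):
        self.L, self.R = L, R
        self.sent = 0        # packets originated so far (reset each interval)
        self.replicas = {}   # packet id -> replicas created so far

    def can_send(self, packet_id):
        """Allow originating a new packet only while under the limit L."""
        if self.sent >= self.L:
            return False     # exceeding L is evidence of a flood attack
        self.sent += 1
        self.replicas.setdefault(packet_id, 0)
        return True

    def can_replicate(self, packet_id):
        """Allow creating a replica only while under the limit R."""
        count = self.replicas.get(packet_id, 0)
        if count >= self.R:
            return False     # exceeding R is evidence of replica flooding
        self.replicas[packet_id] = count + 1
        return True
```

    A node whose observed traffic exceeds its claimed L or R can then be flagged as a flood attacker.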

  • Energy and Latency Aware Application Mapping Algorithm & Optimization for Homogeneous 3D Network on Chip
    Vaibhav Jha, Sunny Deol and G K Sharma, ABV-Indian Institute of Info. Tech. & Mgmt., India

    Energy efficiency is one of the most critical issues in the design of a System on Chip. In a Network on Chip (NoC) based system, energy consumption is influenced dramatically by the mapping of Intellectual Property (IP) cores, which affects the performance of the system. In this paper we test previously proposed algorithms and introduce a new energy-efficient mapping algorithm for the 3D NoC architecture. In addition, a hybrid method has been implemented using a bio-inspired optimization (particle swarm optimization) technique. The proposed algorithm has been implemented and evaluated on randomly generated benchmarks and real-life applications such as MMS, Telecom and VOPD. It has also been tested on the E3S benchmark and compared with the existing algorithms (spiral and crinkle), showing better reduction in communication energy consumption and improvement in system performance. Experimental results show an average reduction in communication energy consumption of 19% over spiral and 17% over crinkle, a reduction in communication cost of 24% and 21%, and a reduction in latency of 24% and 22%, respectively. Optimizing our work and the existing methods with the bio-inspired technique and comparing them yields an average energy reduction of 18% and 24%.

  • An Efficient Feature Selection in Classification of Audio Files
    Jayita Mitra1 and Diganta Saha2, 1Camellia Institute of Technology, India and 2Jadavpur University, India

    In this paper we focus on an efficient feature selection method for the classification of audio files. The main objectives are feature selection and extraction. We select a set of features for further analysis, which represent the elements of the feature vector. By the extraction method we compute a numerical representation that characterizes the audio using an existing toolbox. In this study Gain Ratio (GR) is used as the feature selection measure; GR selects the splitting attribute that separates the tuples into different classes. Pulse clarity is considered as a subjective measure and is used to calculate the gain of features of audio files. The splitting criterion is employed in the application to identify the class, or music genre, of a specific audio file from the testing database. Experimental results indicate that by using GR the application produces satisfactory results for music genre classification. After dimensionality reduction, the best three features are selected out of the various features of an audio file, and with this technique we obtain more than 90% successful classification.
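
    The gain-ratio measure used to rank features can be sketched as follows; this is a minimal textbook version on discrete values, with the audio toolbox and actual features omitted.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain_ratio(attribute_values, labels):
    """Information gain of splitting on the attribute, normalized by the
    split's own entropy (its 'split information')."""
    n = len(labels)
    by_value = {}
    for v, y in zip(attribute_values, labels):
        by_value.setdefault(v, []).append(y)
    cond = sum(len(part) / n * entropy(part) for part in by_value.values())
    gain = entropy(labels) - cond
    split_info = entropy(attribute_values)
    return gain / split_info if split_info else 0.0
```

    Ranking features by this score and keeping the top three corresponds to the dimensionality reduction step described above.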

  • Enhanced Acknowledgement Based Intrusion Detection For Manets
    T.Archana and N.Rajkumar, Vel Tech University, India


  • An Effective way of Mining Knowledge from Heterogeneous data sources
    Nathiya M and S.Koteeswaran, Vel Tech University, India

    Big data concerns large-volume, growing data sets from multiple, autonomous sources. With the fast development of networking, data and information storage, and data collection capacity, big data is now expanding in all science and engineering domains, including the physical, research and biomedical sciences. This article demonstrates the HACE theorem, which characterizes the features of the big data revolution, and proposes a big data processing model from the data mining perspective. The model includes demand-driven gathering of information sources, knowledge mining and data analysis, user interest modelling, and security considerations. We also analyze the challenging issues present in the data-driven model and the big data revolution.

  • Study on Performance Improvement of Oil Paint Image Filter Algorithm Using Parallel Pattern Library
    Siddhartha Mukherjee, Samsung R&D Institute, India

    This paper gives a detailed study of the performance of an oil-paint image filter algorithm with various parameters, applied to an image in the RGB model. Oil-paint image processing is very performance-hungry, so this research tries to find improvement using the parallel pattern library.

  • Detecting Pareto Type II Software Reliability Using SPRT
    Sitakumari Kotha1, Anusha Chalasani1 and Satyaprasad Ravi2, 1Velagapudi Ramakrishna Siddhartha Engineering College, India and 2Acharya Nagarjuna University, India

    As the volume of data and software on the internet increases day by day, people need tools and mechanisms to assess software reliability, since classical hypothesis testing takes a long time to reach a conclusion. By adopting sequential analysis from statistics, it can be decided very quickly whether developed software is reliable or unreliable. To implement this, the Sequential Probability Ratio Test (SPRT) is applied to the Pareto Type II model. In this paper, the performance of SPRT on time-domain data is evaluated using the Pareto Type II model with order statistics. The results are analyzed for four different data sets, with parameters estimated using maximum likelihood estimation. The experimental results elucidate that software reliability can be assessed with comparatively few observations.
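
    The SPRT decision rule itself can be sketched on simple Bernoulli failure data; the Pareto Type II likelihoods and the failure probabilities p0 and p1 below are illustrative stand-ins, not the paper's model.

```python
from math import log

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT on a stream of failure indicators (1 = failure).

    p0: failure probability under 'software is reliable',
    p1: failure probability under 'software is unreliable' (p1 > p0).
    Returns a decision as soon as a boundary is crossed, else 'continue'.
    """
    lower = log(beta / (1 - alpha))        # accept-reliable boundary
    upper = log((1 - beta) / alpha)        # accept-unreliable boundary
    llr = 0.0                              # accumulated log-likelihood ratio
    for x in samples:
        llr += log(p1 / p0) if x else log((1 - p1) / (1 - p0))
        if llr <= lower:
            return 'reliable'
        if llr >= upper:
            return 'unreliable'
    return 'continue'
```

    The early stopping is the point of the method: a decision is often reached after far fewer observations than a fixed-sample classical test would need.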

  • A Cloud based Personalized Health Care System
    P.Evangeline Jasmine and R. Kavitha, Vel Tech University, India

    The health care sector is one of the largest service industries in the world and relies mainly on information technology and management systems to provide better service and accurate information to patients. However, existing systems focus mainly on patient health records and the electronic medical record within a single organization. To develop a public health care system, we propose a cloud-based health care system built on cloud computing infrastructure in conjunction with the semantic web and machine learning algorithms. Cloud-based storage makes the information available to anyone who is taking care of the patient, allowing health care practitioners to view patient history, diagnoses and treatment, and also enables remote health consultation. Several key techniques for service composition, supporting branch and parallel structures, are used to improve the proposed public health care platform.

  • Next Generation e-ticketing System
    Abdul Mateen Ansari and Aftab Alam, King Khalid University, Saudi Arabia

    The Indian Railways is attempting to modernize its reservation systems, since rail has become the best mode of transportation available in the country for common people. It would be awkward simply to increase ticket fares to meet the expenditure incurred by maintenance, the large workforce and expansion activities. In recent moves, Indian Railways has awarded contracts to IT companies to enhance its passenger reservation system, introduce software-aided train scheduling, install Wi-Fi services in selected trains and enable e-ticketing via smartphones. The Railways plans to inaugurate a new ticket reservation system that will sell 7,200 tickets per minute, against 2,000 for the current system, and handle about 1.2 lakh concurrent connections on its web servers, against 40,000 today. The railway is modernizing itself to embrace cutting-edge technologies such as cloud computing for better efficiency and lower cost, and this cloud technology will be employed extensively in railway reservation. In this research we investigate the impact of a cloud-based railway reservation system on passengers as well as the government. We explain a Service Oriented Architecture (SOA) for a cloud-based e-ticketing railway reservation system, how IRCTC (Indian Railway Catering and Tourism Corporation) can make tatkal booking easier, the features of the proposed system architecture, its benefits, and the issues and challenges involved.

  • Survey paper on event detection techniques in disaster recovery
    Apurva Saoji and Poonam Lambhate, JSPM, India

    Wireless sensor networks are seeing tremendous growth today thanks to low-cost sensors and well-planned techniques. Wireless sensor networks (WSNs) are large networks made up of many sensor nodes with the power to sense the environment and communicate with an administrator. This technology can be used to detect particular events and thereby help manage disasters. Various event detection techniques have been put forward and have contributed effectively to disaster management. In this survey paper we present the wireless sensor network architecture and study various techniques used for event detection in WSNs, such as support vector machines, feed-forward neural networks and other machine learning techniques. We also introduce the open issues and challenges faced by these techniques and show how machine learning, with the help of neural networks, addresses them effectively. Finally, several open research questions in wireless sensor networks and event detection are suggested and put forward.

  • Optimized Algorithm for Improving QoS in Mobile Ad Hoc Network
    R.T.Thivya lakshmi, R.Srinivasan and G.S.Raj, Vel Tech University, India

    A mobile ad hoc network is a collection of independent mobile nodes that communicate with each other via radio waves. Mobile nodes within radio range of each other can communicate directly, whereas others need the aid of intermediate nodes to route their packets. Each node has a wireless interface to communicate with the others. These networks are fully distributed and can work anywhere without the help of fixed infrastructure such as access points or base stations; all networking functions, such as routing and packet forwarding, are performed by the nodes themselves in a self-organizing manner. For these reasons, securing a mobile ad hoc network is very challenging. We propose an ant colony optimization technique together with a swarm intelligence mechanism to ensure the quality-of-service parameters and also to enhance MANET security.

  • A Fragile ROI-Based Medical Image Watermarking Technique with Tamper Detection and Recovery
    R. Eswaraiah1 and E. Sreenivasa Reddy2, 1Vasireddy Venkatadri Institute of Technology, India and 2Acharya Nagarjuna University, India

    In telemedicine it is common practice to exchange medical images between hospitals at different locations over the internet, which may introduce changes into the images. In this scenario, ensuring the authentication and integrity of medical images transferred across insecure channels such as the internet using watermarking has become a very popular area of research. Many lossless watermarking techniques have been developed for tamper detection and recovery of medical images to avoid misdiagnosis. In this paper we propose a Region of Interest (ROI) and Least Significant Bit (LSB) based lossless, fragile watermarking technique for tamper detection and recovery of medical images. Run Length Encoding (RLE) is used to increase the embedding capacity. Experimental results reveal that any tampering with the ROI of a medical image is identified and recovered without any loss.
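
    The LSB embedding at the heart of such schemes can be sketched as follows; a flat list of grayscale pixel values stands in for the image region, and the ROI hashing and RLE compression steps are omitted.

```python
def embed_lsb(pixels, bits):
    """Hide watermark bits in the least significant bit of successive pixels."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b   # clear the LSB, then set it to the bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits watermark bits from the pixel LSBs."""
    return [p & 1 for p in pixels[:n_bits]]
```

    Because only the least significant bit changes, each pixel value moves by at most 1, which is why the technique is attractive for medical images where visual fidelity matters.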

  • Enhancing Driving Direction using PkNN shortest path
    Pushkala.S, Keerthika.M, Herlin Shiny.A and Sundara Meena.V, Manakula Vinayagar Institute of Technology, India

    Digital ecosystems, inspired by natural systems, aim to address the complexities of the digital world: they are self-organizing, scalable and sustainable. Spatial networks, consisting of geospatial objects and the paths that link them, form a digital ecosystem in the field of geoinformatics. With the recent spread of mobile devices on inexpensive wireless networks, applications that access interest objects and the paths between them in the spatial world are increasingly in demand. In this project we introduce the concept of the path-based k nearest neighbour (pkNN) query. Given a set of candidate interest objects, a query point and the number of objects k, pkNN finds the shortest path that goes through all k interest objects, with the minimum distance among all possible paths. pkNN is useful when users would like to visit all k interest objects one by one from the query point, in which case pkNN gives them the shortest path. We address the complexities of the pkNN method, covering issues such as looping paths, U-turns and the possibility of encountering local minima. Our performance evaluation shows that pkNN performs well across various object densities on the map, thanks to our proposed pruning methods that reduce the search space.
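
    A brute-force version of the pkNN idea can be sketched over a table of precomputed network distances; the pruning methods the abstract mentions would cut this factorial search space down, and are not reproduced here.

```python
from itertools import permutations

def pknn(dist, query, objects):
    """Shortest path from `query` that visits all k interest objects.

    `dist` maps (node, node) pairs to precomputed shortest-path distances
    on the road network (e.g. obtained via Dijkstra). Tries every visiting
    order and keeps the cheapest one.
    """
    best_order, best_len = None, float('inf')
    for order in permutations(objects):
        length, prev = 0.0, query
        for node in order:
            length += dist[(prev, node)]
            prev = node
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len
```

    The returned order is the sequence in which the user should visit the k interest objects to minimize total travel distance.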

  • Communicating Via Ant Colony Algorithm and Detecting Attackers through Tracking Program
    V.Ezhilarasi, E. Vinothini, K.Prabisha and D. Nagamani Abirami, Manakula Vinayagar Institute of Technology, India

    In the past few decades, migration from wired to wireless networks has been a global trend. One of the most important and widely used applications of wireless networking is the mobile ad hoc network (MANET). MANETs are considered the future of wireless networks, consisting entirely of mobile nodes that communicate on the move without any base stations. Because of their dynamic nature, MANETs are typically not very secure, so it is important to be cautious about what data is sent over them. Enhanced Adaptive ACKnowledgment (EAACK), designed specifically for MANETs, provides a secure intrusion-detection system to protect a MANET from attacks. Since the existing system cannot efficiently find the attacker, our proposed model provides an efficient communication path using an ant colony algorithm that leaves duplicate messages along the nodes it travels, just as ants leave pheromones for later ants to sense the path. This helps us sense the attacker and the node from which the attack took place. A tracking program sent along from source to destination waits for an acknowledgement from the destination; if no acknowledgement arrives, it resends to the destination until one does, which helps find the attacker quickly. We employ the Elliptic Curve Cryptography (ECC) algorithm to ensure the security of the message sent along with the acknowledgement.

  • Scrutinizing Vulnerable Attacks in Cloud Infrastructure
    Shabana.M, I.Saranya, A.Nasreen and S.Jayamoorthy, Manakula Vinayagar Institute of Technology, India

    In recent years network users have moved to cloud infrastructure for data maintenance, as it keeps a copy of the original data in centralized storage and provides services to end users. Such services are called Software-as-a-Service (SaaS), Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS); EMC^2 is one example of such a service provider. Security of these services plays a vital role in current research issues such as distributed denial-of-service (DDoS) attacks and low-frequency vulnerability scanning in the network, but detecting zombies (the early stage of an attack) is extremely difficult. To prevent such vulnerabilities in the cloud, we focus on a multiphase distributed vulnerability detection, measurement and countermeasure selection mechanism called NICE, designing an OpenFlow network that provides APIs to build a monitoring and control plane over distributed cloud data security. In addition, we introduce the Attack Graph Model (AGM) to prevent the attacks mentioned above, such as DDoS and zombies, and propose Counter Attack Authentication Metrics (CAAM) to prevent attacks on cloud services. Comparing CAAM with our existing method and outlining the AGM shows that our proposed method provides greater security for the cloud data service.

  • Ascertaining Security in Online Reputation Systems Using Rate Auditing Tool (RAT)
    Dhivyalakshmi.D , Kalaivani.K , Tamilarasi.V and Bhavani. P, Manakula Vinayagar Institute of Technology, India

    With the rapid development of reputation systems in various online social networks, manipulations of such systems are evolving quickly. In this paper we propose the scheme TATA, short for joint Temporal And Trust Analysis, which protects reputation systems from a new angle: the combination of time-domain anomaly detection and Dempster-Shafer theory-based trust computation. Real user attack data collected from a cyber competition is used to construct the testing dataset. Compared with two representative reputation schemes and our previous scheme, TATA achieves significantly better performance in identifying items under attack, detecting malicious users who insert dishonest ratings, and recovering reputation scores. In our proposed model we implement a Rate Auditing Tool (RAT) to monitor every rating manipulation. It checks whether the time between login and logout matches the stipulated time for viewing a video or message before giving a rating; it monitors whether the video was fully, or at least partly, viewed and weighs the rating accordingly; it checks whether the logins providing ratings come from the same location or from widely varying geographical locations each time; and it checks whether multiple logins from the same IP address are giving ratings continuously. In this way the attacker is detected and countermeasures are applied.
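
    Two of the RAT checks described above can be sketched as simple predicates; the session fields, thresholds and IP log layout here are illustrative assumptions, not taken from the paper.

```python
def audit_rating(session, video_length, min_fraction=0.5):
    """Accept a rating only if the session watched enough of the video.

    `session` carries 'login' and 'logout' timestamps in seconds; the
    minimum viewing fraction is an assumed policy parameter.
    """
    watched = session['logout'] - session['login']
    return watched >= min_fraction * video_length

def flag_ip(ratings_by_ip, threshold=5):
    """Return IP addresses that submit ratings continuously (possible attacker)."""
    return {ip for ip, events in ratings_by_ip.items() if len(events) > threshold}
```

    Ratings failing either check would be discarded before they reach the reputation computation.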

  • Automated Segmentation and Detection of Brain Tumor Tissue Using SVM in MR Images
    R.Preetha1 and G.R.Suresh2, 1Anna University, India and 2Easwari Engineering College, India

    Tumor detection in medical images is important for medical practice. Image segmentation is essential, and challenging, for visualizing human tissue when analyzing MR images; in brain MR images the boundary of tumor tissue is highly irregular. Deformable models and region-based methods are extensively used in medical image segmentation to locate the tumor boundary, but problems associated with the nonlinear distribution of real data, user interaction and poor convergence to the boundary region limit their usefulness. Clustering of brain tumor tissues using Fuzzy C-means is robust and effective for tumor localization, and SVM is a more accurate classifier that can act as an expert assistant to medical practitioners. Fuzzy C-means clustering, extended with feature extraction and SVM classification, is very promising in the field of brain tumor detection.

  • An Efficient Method for Retinal Blood Vessels Segmentation using Log-Gabor Filter
    S. Batmavady, L. Parvathavarthiny and V.Geetha, Pondicherry Engineering College, India

    Diagnosing retinal diseases is a recent technological advancement in eye care. It enables the optometrist to capture a digital image of the retina, blood vessels and optic nerve located at the back of the eye. Retinal blood vessel segmentation plays an important role in diagnosing these pathologies. Conventional methods use an ensemble classifier of bagged and boosted decision trees to segment and detect abnormalities of the retina, and in the latest literature, segmentation of the retinal image is carried out using the Gabor filter. In the proposed work, a Log-Gabor filter is used instead, which overcomes the disadvantages of the Gabor filter.
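
    The radial log-Gabor transfer function, whose zero response at DC is its key advantage over the ordinary Gabor filter, can be sketched in one dimension; the centre frequency f0 and bandwidth ratio below are illustrative values.

```python
from math import exp, log

def log_gabor(f, f0=0.1, sigma_ratio=0.65):
    """Radial log-Gabor transfer function G(f).

    G(f) = exp(-(ln(f/f0))^2 / (2 (ln sigma_ratio)^2)); by construction
    G(0) = 0, so the filter has no DC component, unlike a Gabor filter.
    """
    if f <= 0:
        return 0.0
    return exp(-(log(f / f0) ** 2) / (2 * log(sigma_ratio) ** 2))
```

    In 2-D vessel segmentation this radial profile is combined with an angular component and applied in the frequency domain at several orientations.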

  • High Level View of Cloud Security: Issues and Solutions
    Venkata Narasimha Inukollu1 , Sailaja Arsi1 and Srinivasa Rao Ravuri2, 1Texas Tech University, USA and 2Cognizant Technology Solutions, India

    In this paper we discuss security issues for cloud computing and the MapReduce and Hadoop environment, along with various approaches to cloud computing and Hadoop security. Today, cloud computing security is developing at a rapid pace, encompassing computer security, network security and information security. Cloud computing plays a vital role in protecting data, applications and the related infrastructure with the help of policies, technologies and controls.

  • A Deadline Based Improved Maximum Signal-to-Interference Ratio Scheduler at Subscriber Station for Real Time Services in WiMAX
    Divya Saxena1 and Arun Kumar Kulshrestha2, 1GLA University, India and 2MITM, India

    Worldwide Interoperability for Microwave Access (WiMAX) provides high-rate data access at all times with mobility support and maintains fairness among different service flows with Quality of Service (QoS). Various scheduling algorithms exist in the literature to deal with bursty network traffic in WiMAX. The goals of any scheduler are to attain the most favourable use of resources, to ensure QoS, to increase throughput and to decrease end-to-end delay while maintaining low computational complexity. In this paper, a Deadline-based Improved maximum Signal-to-Interference Ratio (DImSIR) scheduling algorithm for real-time services in WiMAX is proposed. In the proposed approach, based on the basic maximum Signal-to-Interference Ratio (mSIR) and Improved maximum Signal-to-Interference Ratio (ImSIR) approaches, a deadline is calculated for the real-time Polling Service (rtPS) traffic at the Subscriber Stations (SSs) to guarantee the delay requirement, and data packets are served in the next frame accordingly. With this approach, all SSs in the network have a high chance of obtaining bandwidth on the basis of deadline within their category, i.e. high, medium or low. In addition, we present an extensive comparison of DImSIR with some of the best existing scheduling algorithms.
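
    The deadline-augmented mSIR selection can be sketched as a two-tier sort; the request tuple layout and the single urgency cut-off are simplifying assumptions, not the paper's exact formulation.

```python
def dimsir_schedule(requests, frame_deadline):
    """Order rtPS requests for the coming frame.

    Each request is a tuple (ss_id, sir, deadline). Requests whose deadline
    falls inside the coming frame are served first, highest SIR first (the
    mSIR rule); the remaining requests follow, also ordered by SIR.
    """
    urgent = [r for r in requests if r[2] <= frame_deadline]
    later = [r for r in requests if r[2] > frame_deadline]
    urgent.sort(key=lambda r: -r[1])
    later.sort(key=lambda r: -r[1])
    return urgent + later
```

    Compared with pure mSIR, a low-SIR subscriber station with an imminent deadline is no longer starved, which is the delay guarantee the abstract refers to.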

  • A Unidirectional Data-flow Model for Cloud Data Security with User Involvement during Data Transit
    Bhargav J Bhatkalkar , St. Joseph Engineering College, India

    Traditional computational models are rapidly shifting from centralized to distributed computing paradigms, and as a result the buzz of cloud computing is heard everywhere these days. The main concern in a cloud computing environment is securing user data, which is often moved back and forth between the Cloud Service Vendor (CSV) and the Cloud Service User (CSU). The degree of trust a CSU places in a CSV varies with the sensitivity of the data: a CSU may or may not trust the CSV. In the latter case, the CSU may use a security service provided by a Third Party (TP), such as a certification authority, that both the CSU and CSV trust; then again, the CSU may not even trust the TP, depending on how critical the data is. To provide flexible and secure management of the CSU's data, the proposed model explicitly considers the degree of trust the CSU places in both the CSV and the TP. The movement of CSU data within the CSV's premises is also strictly controlled with the involvement of the CSU, so that data is never moved arbitrarily without the CSU's consent. The majority of data flows among entities in the proposed model are kept unidirectional, both to block reverse transmission of sensitive information and to remove the return path that hidden viruses, Trojans, malicious instructions or other intrusion attempts could use against the secure data source. The security mechanisms suggested for realizing the proposed model are widely accepted and practically proven. The proposed data security model ensures privacy and security of data on both the CSV and CSU sides.

  • SQLIXSS: Survey on Detecting SQL Injection And Cross Site Scripting In Multitier Web Applications
    R.Stalinbabu and P. Chellammal, J.J. College of Engineering and Technology, India

    Internet services and applications have become an inextricable part of daily life, enabling communication and the management of personal information from anywhere. To accommodate this increase in application and data complexity, web services have moved to a multitiered design wherein the web server runs the application front-end logic and data are outsourced to a database or file server. This paper presents DoubleGuard, an IDS that models the network behavior of user sessions across both the front-end web server and the back-end database. By monitoring both web requests and the subsequent database requests, it is able to ferret out attacks that an independent IDS would not be able to identify. Furthermore, we quantify the limitations of any multitier IDS in terms of training sessions and functionality coverage. We implemented DoubleGuard using an Apache web server with MySQL.
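    A toy sketch of the cross-tier session-modeling idea behind such a multitier IDS. The request/query representation is a simplification I am assuming for illustration, not DoubleGuard's actual model:

    ```python
    def build_model(training_sessions):
        """Learn, from clean training sessions, which DB queries each web
        request type is allowed to trigger."""
        model = {}
        for web_req, db_queries in training_sessions:
            model.setdefault(web_req, set()).update(db_queries)
        return model

    def is_intrusion(model, web_req, db_queries):
        """Flag a session whose DB queries deviate from the learned mapping,
        e.g. an injected query the front end never legitimately issues."""
        allowed = model.get(web_req, set())
        return any(q not in allowed for q in db_queries)
    ```
    
    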

  • Integration of Sensor Network With Cloud Computing for Patient Monitoring
    S.Janani Devi, Dr.G.M.Tamil Selvan and M.Suresh, Bannari Amman Institute of Technology, India

    In past years, patient observation was done manually or by using a wireless Body Sensor Network that is observed by medical organization agents. A mesh network is used for reading the physiological parameters and obtaining a clear description of the patient using wireless sensors. Intelligent agents are proposed for alerting the medical organization and for data aggregation. A cloud is also proposed for supporting the healthcare community and remote or mobile patient monitoring.

  • WarningBird MailAlert Based Malicious URLs Blocker System in Twitter
    S.Saranya, Prist University, India

    Twitter is prone to malicious tweets containing URLs for spam, phishing, and malware distribution. Conventional Twitter spam detection schemes utilize account features, such as the ratio of tweets containing URLs and the account creation date, or relation features in the Twitter graph. These detection schemes are ineffective against feature fabrications or consume much time and resources. Conventional suspicious URL detection schemes utilize several features including lexical features of URLs, URL redirection, HTML content, and dynamic behavior. However, evading techniques such as time-based evasion and crawler evasion exist. In this paper, we propose WARNINGBIRD, a suspicious URL detection system for Twitter. Our system investigates correlations among URL redirect chains extracted from several tweets. Because attackers have limited resources and usually reuse them, their URL redirect chains frequently share the same URLs. We develop methods to discover correlated URL redirect chains using the frequently shared URLs and to determine their suspiciousness.

    We collect numerous tweets from the Twitter public timeline and use them to build a statistical classifier. Evaluation results show that our classifier accurately and efficiently detects suspicious URLs. WARNINGBIRD runs as a near real-time system for classifying suspicious URLs in the Twitter stream. In this project, I propose to block the malicious URLs and to provide a mail alert when malicious URLs occur in the Twitter stream.
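    The core correlation step, finding URLs reused across several redirect chains, can be sketched as follows. The chain representation and the sharing threshold are assumptions for illustration, not WARNINGBIRD's exact features:

    ```python
    from collections import defaultdict

    def frequently_shared_urls(chains, min_share=2):
        """URLs that appear in at least `min_share` distinct redirect chains;
        attackers with limited resources tend to reuse these."""
        seen_in = defaultdict(set)
        for i, chain in enumerate(chains):
            for url in set(chain):
                seen_in[url].add(i)
        return {u for u, idxs in seen_in.items() if len(idxs) >= min_share}

    def correlated_chains(chains, min_share=2):
        """Group the indices of chains that pass through a frequently shared URL."""
        shared = frequently_shared_urls(chains, min_share)
        groups = defaultdict(list)
        for i, chain in enumerate(chains):
            for url in chain:
                if url in shared:
                    groups[url].append(i)
        return dict(groups)
    ```
    
    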

  • Spread of Influence on Information Propagation: Target Viral Marketing through Influential Nodes in Social Network
    Windelle John G. Vega, Emiliana B. Lappay, Tefany L. Balawen, Norman Christian D. Sy and Luisa B. Aquino, Saint Louis University, Philippines

    Social networking sites are widely used nowadays in many areas, most especially communication, information dissemination and marketing. In social network analysis, one of the most studied problems is information dissemination, which is often related to target viral marketing. The aim of this study is to provide a framework that consolidates the use of NodeXL as a tool for analyzing various data, determining the key influential nodes using different metrics, and simulating the diffusion process using two models (ICM and LTM). We first gather data from a Facebook network and perform analysis using NodeXL, after which we measure the metrics of each node to determine the key influential nodes over the network; lastly, we simulate the diffusion process using the Linear Threshold Model and the Independent Cascade Model. The results serve as the basis for assessing the effectiveness of the proposed framework for target viral marketing, and they show that the network structure has a great impact on the diffusion process.
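    For reference, one run of the Independent Cascade Model mentioned above can be sketched as below; the uniform activation probability `p` is a simplifying assumption (ICM generally allows per-edge probabilities):

    ```python
    import random

    def independent_cascade(graph, seeds, p=0.1, rng=None):
        """One ICM simulation: each newly activated node gets a single chance
        to activate each inactive neighbour with probability p."""
        rng = rng or random.Random(0)
        active, frontier = set(seeds), list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v in graph.get(u, ()):
                    if v not in active and rng.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        return active
    ```

    Averaging the size of `active` over many runs estimates the influence spread of a seed set, which is how influential nodes are compared.
    
    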

  • Organization Mapping Using Social Network Graph Analysis
    Edward Benedict Suyu, Virna Addatu, Shierly Maccarubo, Darryl Erwin Vargas and Luisa B. Aquino, University of Saint Louis, Philippines

    This paper studies the utilization and weight factors of graph metrics in the visualization and mapping of sub-networks that exist in a given network of connected people. We utilize algorithms such as the Clauset-Newman-Moore algorithm and the Harel-Koren fast multi-scale layout, and graph metrics such as degrees and centralities. The study showed that using such methods yields network visualizations that are useful for investigative purposes.

  • A Survey of Sentiment Analysis of Customer Reviews
    Saurabh Marathe, Akshay Jagtap and Akash Jaiswal, Maharashtra Institute of Technology, India

    The field of sentiment analysis has built up over the past decade to analyze such textual data and extract information oriented to the needs of customers. The sentiment found within comments, feedback or critiques provides useful indicators for many different purposes. Many companies are devoting resources to this field, both to improve sales and to learn the shortcomings of a product, which can be identified through feature-by-feature sentiment analysis. The system would be useful for companies to know the quality of their products. The entire process involves four steps: review pre-processing (POS tagging), feature-based sentiment extraction, word scoring based on comparison with the WordNet library, and polarity detection using the algorithm. We test this algorithm on Amazon's customer reviews for one specific product.
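    The final scoring-and-polarity step might look like the sketch below. A tiny hand-made lexicon stands in for WordNet-derived scores, and the negation handling is a hypothetical simplification:

    ```python
    # toy lexicon standing in for WordNet-derived word scores (assumed values)
    LEXICON = {"good": 1, "great": 2, "poor": -1, "terrible": -2}
    NEGATORS = {"not", "never", "no"}

    def polarity(review):
        """Sum per-word scores, flipping the sign of the word after a negator,
        then map the total to a polarity label."""
        score, flip = 0, False
        for tok in review.lower().split():
            if tok in NEGATORS:
                flip = True
                continue
            s = LEXICON.get(tok, 0)
            score += -s if flip else s
            flip = False
        return "positive" if score > 0 else "negative" if score < 0 else "neutral"
    ```
    
    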

  • A Novel Image Compression Technique Using Wavelets
    Aisha Fernandes and Wilson Jeberson, Sam Higginbotom Institute of Agriculture, Technology & Sciences, India

    Despite the rapid progress in processor speeds and digital communication system performance, demand for data storage capacity and data transmission bandwidth continues to surpass the capabilities of available technologies. In this paper we propose an algorithm for image compression using the Antonini 7/9 filters in the wavelet domain. The wavelet transform decomposes the image into a set of sub-bands, and the resulting coefficients are then quantized. Using an error metric, we approximate the quantization error of the reconstructed coefficients and thus minimize the distortion for a given compression percentage. Experimental results demonstrate that the proposed algorithm is efficient and produces a reconstructed image of better quality than JPEG compression. Our algorithm achieves a compression ratio of 60-90 and a PSNR of 34-47 dB.
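    The transform-quantize-reconstruct pipeline can be illustrated with a one-level 1D Haar transform, used here only as a simple stand-in for the Antonini 7/9 filters the paper actually employs:

    ```python
    def haar_1d(signal):
        """One level of the 1D Haar wavelet transform: pairwise averages
        (low-pass sub-band) and differences (high-pass sub-band)."""
        avg = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        det = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
        return avg, det

    def quantize(coeffs, step):
        """Uniform scalar quantization; larger steps give more compression."""
        return [round(c / step) for c in coeffs]

    def dequantize(q, step):
        return [v * step for v in q]

    def haar_inverse(avg, det):
        """Invert haar_1d, reconstructing the (possibly lossy) signal."""
        out = []
        for a, d in zip(avg, det):
            out += [a + d, a - d]
        return out
    ```
    
    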

  • Multiple DAG Applications Scheduling on a Cluster of Processors
    Uma Boregowda1 and Venugopal Chakravarthy2, 1Malnad College of Engineering, Hassan, India and 2Sri Jayachamarajendra College of Engineering, India

    Many computational solutions can be expressed as Directed Acyclic Graphs (DAGs), in which nodes represent tasks to be executed and edges represent precedence constraints among tasks. A cluster of processors is a resource shared among several users, hence the need for a scheduler that deals with multi-user jobs presented as DAGs. The scheduler must find the number of processors to be allotted to each DAG and schedule tasks on the allotted processors. In this work, a new method to find the optimal and maximum number of processors that can be allotted to a DAG is proposed. Regression analysis is used to find the best way to share the available processors among a suitable number of submitted DAGs. An instance of a scheduler for each DAG schedules its tasks on the allotted processors. Towards this end, a new framework to receive online submissions of DAGs, allot processors to each DAG and schedule tasks is proposed and evaluated using a simulator. This space-sharing of processors among multiple DAGs shows better performance than the other methods found in the literature. Because of space-sharing, an online scheduler can be used for each DAG within its allotted processors. The use of an online scheduler overcomes the drawbacks of static scheduling, which relies on inaccurate estimates of computation and communication costs. Thus the proposed framework is a promising solution for performing online scheduling of tasks using static information of the DAG, a kind of hybrid scheduling.
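    The per-DAG scheduler instance could be a classic list scheduler, sketched below. This ignores communication costs and is not the authors' algorithm, only an illustration of scheduling one DAG on its allotted processors:

    ```python
    import heapq
    from collections import deque

    def list_schedule(tasks, deps, cost, procs):
        """Schedule DAG tasks on `procs` processors: process tasks in topological
        order, placing each ready task on the earliest-free processor.
        Returns task -> (processor, start, finish)."""
        indeg = {t: 0 for t in tasks}
        succ = {t: [] for t in tasks}
        for a, b in deps:                       # a must finish before b starts
            succ[a].append(b)
            indeg[b] += 1
        ready = deque(t for t in tasks if indeg[t] == 0)
        free = [(0.0, p) for p in range(procs)]  # (available_time, processor)
        heapq.heapify(free)
        finish, sched = {}, {}
        while ready:
            t = ready.popleft()
            avail, p = heapq.heappop(free)
            start = max(avail, max((finish[a] for a, b in deps if b == t), default=0.0))
            end = start + cost[t]
            sched[t], finish[t] = (p, start, end), end
            heapq.heappush(free, (end, p))
            for s in succ[t]:
                indeg[s] -= 1
                if indeg[s] == 0:
                    ready.append(s)
        return sched
    ```
    
    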

  • Handwritten Irregular Hindi Character Recognition Using Genetic Algorithm
    Prateek Mishra and Md. Tanwir Uddin Haider, National Institute of Technology Patna, India

    Devanagari Hindi is the most popular language in India. The proposed Devanagari Hindi character recognition system is an intelligent system capable of recognizing offline Hindi characters written by any individual in his own writing style. In Hindi, a character may be written in various styles and with different sorts of irregularities, which makes recognition more challenging. In this paper, a genetic algorithm is used to recognize irregular Hindi characters efficiently. The motivation behind using a genetic algorithm comes from the fact that different styles of writing a character can be genetically combined to produce various new and unknown styles. These newly generated styles can then be used as sample data to match against the input data. A genetic algorithm can be especially useful when the sample data is very large and the input data is rich with irregularities. In this paper, some cases where irregular characters may lead to a poor recognition rate are discussed, and an approach using a genetic algorithm is proposed to deal with this problem.
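    The "genetically combine writing styles" idea can be sketched as below, with styles as abstract feature vectors and a fitness that rewards closeness to the input character. The representation and operators are hypothetical, not the paper's:

    ```python
    import random

    def crossover(style_a, style_b, rng):
        """Combine two known writing styles at a random cut point,
        producing a new, previously unseen style."""
        cut = rng.randrange(1, len(style_a))
        return style_a[:cut] + style_b[cut:]

    def fitness(style, sample):
        """Higher (less negative) when the style matches the input character."""
        return -sum((s - x) ** 2 for s, x in zip(style, sample))

    def evolve(population, sample, generations=20, seed=0):
        """Keep the best half each generation and refill with crossovers."""
        rng = random.Random(seed)
        for _ in range(generations):
            population.sort(key=lambda s: fitness(s, sample), reverse=True)
            parents = population[: len(population) // 2]
            children = [crossover(rng.choice(parents), rng.choice(parents), rng)
                        for _ in range(len(population) - len(parents))]
            population = parents + children
        return max(population, key=lambda s: fitness(s, sample))
    ```
    
    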

  • Efficient Technique for Web Image Mining
    Praveen Kumar and Md. Tanwir Uddin Haider, National Institute of Technology Patna, India

    Web image mining is a growing area in the present environment. It refers to the use of mining techniques on the web to find hidden information present in an image, including text. In this paper a literature survey of web image mining is presented. Web image mining is a technique for searching, retrieving and accessing data from an image. There are two types of web image mining techniques, i.e., text-based web image mining and image-based web image mining. The objective of this paper is to present the tools and techniques used in past and current work, and to give a summarized report of the overall development in web image mining.

  • Farm Advisory System for Farmers of Northeastern states of India
    Yengkhom Ranjan Singh, Centre for Development of Advanced Computing (C-DAC), India

    Agriculture means jobs and income for the Northeastern states of India. Rice is the most important food crop. However, rice production in this region is still below the all-India average rice productivity [Anonymous, 2000]. This could be attributed to traditional cultivation methods that might not always result in good yield, the unavailability of experts, and non-optimum utilization of tools and techniques. The currently available agricultural services have a few limitations. Information dissemination follows a push model, which fails to share information at the right time with the right group of farmers who would otherwise be impacted. Farmers require timely, accurate and location-specific information from agricultural experts on different aspects of farming, such as pest, disease, weed and fertilizer management, for their crops. On the other side, the complexity of the whole farming process is growing, because it is constrained by many factors such as requirements, goals and regulations that the farmer must satisfy or consider. To provide such information and to achieve an optimal crop plan, automation is provided by computer-based systems termed advisory systems. An advisory system supports farmers in getting expert advice on many activities in the farming process. With this system, farmers can access virtual agricultural experts as and when needed.

  • Low Leakage Domino Based Circuit for Wide Fan-In OR Gates
    Sindhu S and Arun Prasath C, K.S.R College of Engineering, India

    Dynamic logic is widely used in many applications because of its high speed and small area. As technology scales down, the size of the transistor has been shrinking, which leads to reduced supply voltage and threshold voltage (Vth) for low-power circuits. Reduction in threshold voltage exponentially increases the subthreshold leakage current, which is the major concern in wide fan-in dynamic gates. We therefore propose a new technique for a low-leakage dynamic CMOS circuit. The stacking effect of NMOS transistors in the pull-down network (PDN) provides a high-resistance path to leakage and improves the dynamic performance of the circuit. The proposed method, implemented with Tanner EDA v13.0, reduces the leakage current by about 9.89% compared with the conventional circuit and reduces the Power Delay Product (PDP).

  • Iris detection and tracking based on Hough Transform
    Meet Pandya and Narendra Patel, Birla Vishvakarma Mahavidyalaya, India

    Various methods have been proposed for detecting iris features in facial images. However, for real-time applications, these methods consume a lot of computational power. The proposed algorithm uses a modified approach that is comparatively faster than existing methods of iris detection. The proposed method is based on the generalized Hough transform for circles, which is proven to be robust under various test conditions. The accuracy of the algorithm depends on the iris features extracted in the initial stage. These results are later used in the tracking stage, which improves the speed of tracking.
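    The circle Hough transform underlying this approach can be illustrated at a fixed radius: each edge pixel votes for every centre lying one radius away, and the accumulator peak is the detected iris centre. The angular step and grid resolution below are arbitrary choices for the sketch:

    ```python
    import math
    from collections import Counter

    def hough_circles(edge_points, radius, grid=1):
        """Vote for circle centres at a fixed radius; the accumulator cell
        with the most votes is the best centre estimate."""
        acc = Counter()
        for x, y in edge_points:
            for deg in range(0, 360, 5):
                t = math.radians(deg)
                cx = round((x - radius * math.cos(t)) / grid)
                cy = round((y - radius * math.sin(t)) / grid)
                acc[(cx, cy)] += 1
        return acc.most_common(1)[0][0]
    ```

    In practice the search also iterates over candidate radii, and restricting it to a window around the previous detection is what makes the tracking stage fast.
    
    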

  • Satisfactory Result in Web Database Search needs Support of Users Query Information
    Pramod kumar Ghadei and Dr.S.Sridhar, Sathyabama University, India

    Nowadays, the advancement of web technologies is much needed for providing better satisfaction to the web user community. Due to the rapid growth in the seriousness and requirements of data processing for vital information extraction, we need better techniques for reconstructing our web databases. Considering the sheer size and volume of databases, retrieval of the appropriate information requested by users has become a tedious process. Web users are faced with the challenge of overloading: too many result sets are returned for the queries they ask. Furthermore, too few or no results are returned if an overly specific query is asked. This paper proposes a ranking algorithm that gives priority to a user's present search and also utilizes past profile information in order to obtain the appropriate results for a user's query.

  • Speed Control of BLDC Motor Using Fuzzy Logic Controller Based on Sensorless Technique
    Sriram J and Sureshkumar K, Meenakshi College of Engineering, India

    Brushless DC (BLDC) motors are very popular and are replacing brushed motors in numerous applications due to their superior electrical and mechanical characteristics and trouble-free construction. This paper presents a BLDC motor sensorless speed control system with a fuzzy logic implementation. A sensorless technique based on back-EMF sensing and rotor position detection with a high starting torque is suggested. The rotor position is aligned at standstill without an additional sensor. Also, the stator current can be easily adjusted by modulating the pulse width of the switching devices during alignment, which helps reduce the cost and complexity of the drive system without compromising performance. The design, analysis and simulation of the proposed system are done using MATLAB version 2010a, and the simulation results of the sensored drive using a PI controller and the sensorless drive using the proposed method are analyzed.

  • Auto Scaling of Web Based Services in Cloud Computing using Request Redirection Management Algorithms
    Sudarshan N. Sutar and Shyam Sunder Gupta , Siddhant C.O.E., India

    In the present era, cloud computing is widely applied in the service and development fields. Web service providers are taking advantage of cloud computing for better services, since cloud computing offers infrastructure with several facilities. Cloud service providers struggle to handle peak loads of web requests at critical schedules of demand. Today's Internet application servers rely on distributed, load-blind algorithms; current solutions are based on server load and threshold values. We consider front-end servers that dispatch web requests among application servers. These solutions apply algorithms in which no timing information about loads is monitored, and load-blind algorithms are inadequate to cope with the request demand patterns of rich Internet applications. In this paper, we study request redirection algorithms with respect to the factors responsible for performance when redirecting web requests among a set of servers. We propose preparing a table of the request redirection distribution log, with reference to a performance gain prediction algorithm, to reduce the response time of requests redirected to another server. This table helps minimize redirection overhead in dynamic request redirection management in cloud computing. We observed that the modified algorithm and the associated table show better results compared with request management algorithms that work on the basis of thresholds, improving response time and load balancing in cloud computing.

  • Detecting Anomalies in Equally Typed Function Arguments Using Static Analysis
    J. Albert Mayan1, T. Ravi2 and A.Arul Raga Jeevini1, 1Sathyabama University, India and 2Srinivasa Institute of Engineering and Technology, India

    Functions contain multiple arguments. When calling these functions, the programmer must pass the arguments in the expected order. For statically typed programming languages, the compiler helps the programmer check the types of the arguments. But if the arguments are all of the same type, how can the programmer check that they are passed in the correct order? This paper presents two simple and effective program analyses for detecting problems related to equally typed function arguments. The key idea is a novel method of static program analysis that detects problems related to the order of equally typed arguments. It first gathers, using name extraction, the identifier names that programmers have given to function arguments. The anomaly detection analysis then searches for anomalies in the order of arguments by leveraging these names, while the naming-bug detection analysis detects bugs in the argument names used in programs. The analyses address problems of program correctness and understandability. We evaluate the approach on real-world programs written in Java.
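    A minimal sketch of the name-based anomaly idea: if swapping two equally typed arguments makes their names line up much better with the declared parameter names, the call is suspicious. The string-similarity measure and threshold are assumptions of this sketch, not the paper's analysis:

    ```python
    import difflib

    def argument_order_anomaly(param_names, arg_names, threshold=0.6):
        """Return a pair of argument indices whose swap would make the argument
        names match the parameter names much better, or None if the order looks fine."""
        def score(args):
            return sum(difflib.SequenceMatcher(None, p, a).ratio()
                       for p, a in zip(param_names, args))
        best = score(arg_names)
        for i in range(len(arg_names)):
            for j in range(i + 1, len(arg_names)):
                swapped = list(arg_names)
                swapped[i], swapped[j] = swapped[j], swapped[i]
                if score(swapped) > best + threshold:
                    return (i, j)      # swapping i and j looks intended
        return None
    ```
    
    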

  • Recognizing and Mask Removal in 3D Faces Even in Presence of Occlusions
    L.Lakshmanan1, D.C.Tomar2 and J.Jasmine1, 1Sathyabama University, India and 2Jerusalem College of Engineering, India

    3D face recognition has been developed beyond 2D face recognition to achieve better accuracy in handling facial features. It avoids the unsuspected failures of 2D face recognition under pose variation, changes in the face, and varying lighting conditions. To improve the accuracy of 3D face recognition for custom images, a known formal method is used. When 3D images are considered for face recognition, occlusions (extra objects that obstruct the face, e.g., eyeglasses, caps, mobile phones, etc.) are one of the greatest challenges for face recognition systems. A masked projection technique that can cope with missing data is proposed, achieving high accuracy in the face recognition system using classification algorithms. Furthermore, a regional approach is utilized to improve classification performance, where different regions serve as separate classifiers.

  • Preferences Databases for First-Class Citizens
    A. Mary Posonia1, V.L.Jyothi2 and E. Baby Salini1, 1Sathyabama University, India and 2Jeppiaar Engineering College, India

    Preference queries need to be processed inside the DBMS. We introduce a relational data model extended with preference queries, based on a set of special operators and their algebraic properties. We propose several query optimizations for extended query plans, and we describe a query execution algorithm. We have implemented the framework and methods in PrefDB, which allows efficient evaluation of preferential queries using a relational database. Our experimental evaluation using two real-world movie data sets demonstrates the feasibility of our framework.

  • TV Program Detection Using Effective Machine Learning Classifiers
    R. Subbulakshmi, Dr.MCET, India

    Item detection from tweets is a common task for understanding the current movies/topics attracting a large number of common users. However, the unique characteristics of tweets (short and noisy content, and a large data volume) make item detection a challenging task. Existing techniques for item detection use a battery of one-class classifiers with keyword matching techniques together with an SVM classifier. Those techniques provide better accuracy, but the extracted features are found to be noisy, which is a major limitation of the SVM classifier.

    In this system, an SVM classifier with genetic algorithm optimization is proposed. In the GA optimization we use the accuracy of the SVM as the fitness function, so only the best features are selected. This improves the accuracy of item detection, and the system also provides user ratings based on the polarity of tweets. The system is expected to improve classification accuracy when GA is combined with SVM.
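    The GA wrapper around the classifier can be sketched as below: chromosomes are feature bitmasks and the fitness is the classifier's accuracy on the selected features. The `eval_acc` callback stands in for training and scoring an SVM, which is outside this sketch:

    ```python
    import random

    def ga_select_features(X, y, eval_acc, n_feats, pop=12, gens=15, seed=1):
        """GA feature selection: evolve bitmasks over the feature set, using
        classifier accuracy (eval_acc) as the fitness function."""
        rng = random.Random(seed)
        def rand_mask():
            return tuple(rng.randint(0, 1) for _ in range(n_feats))
        def mutate(m):
            i = rng.randrange(n_feats)             # flip one feature bit
            return m[:i] + (1 - m[i],) + m[i + 1:]
        population = [rand_mask() for _ in range(pop)]
        for _ in range(gens):
            population.sort(key=lambda m: eval_acc(X, y, m), reverse=True)
            keep = population[: pop // 2]          # elitist selection
            population = keep + [mutate(rng.choice(keep)) for _ in keep]
        return max(population, key=lambda m: eval_acc(X, y, m))
    ```
    
    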
