Browsing by Author "Pais, A.R."
Now showing 1 - 20 of 151
Item A Boosting-Based Hybrid Feature Selection and Multi-Layer Stacked Ensemble Learning Model to Detect Phishing Websites (Institute of Electrical and Electronics Engineers Inc., 2023) Lakshmana Rao, L.R.; Rao, R.S.; Pais, A.R.; Gabralla, L.A.
Phishing is a type of online scam in which the attacker tries to trick you into giving away personal information, such as passwords or credit card details, by posing as a trustworthy entity such as a bank, email provider, or social media site. These attacks have been around for a long time and, unfortunately, continue to be a common threat. In this paper, we propose a boosting-based multi-layer stacked ensemble learning model that uses a hybrid feature selection technique to select the features relevant for classification. The dataset with the selected features is sent to classifiers at different layers, where the predictions of lower layers are fed as input to the upper layers for phishing detection. Experimental analysis shows that the proposed model achieved an accuracy ranging from 96.16% to 98.95% without feature selection across different datasets, and from 96.18% to 98.80% with feature selection. The proposed model was compared with baseline models and outperformed them by a significant margin. © 2013 IEEE.

Item A Chinese Remainder Theorem based key management algorithm for hierarchical wireless sensor network (Springer Verlag, 2015) Bhaskar, P.K.; Pais, A.R.
Wireless Sensor Networks (WSNs) are networks of sensors with low computation, storage and battery power. Hierarchical WSNs are heterogeneous networks of sensors with different capabilities which form a hierarchy to achieve energy efficiency. Key management algorithms are at the center of WSN security protocols and involve key pre-distribution, shared key discovery, key revocation, and refreshing.
Due to resource constraints in WSNs, achieving a perfect key management scheme is quite challenging. In this paper, a new key management scheme for hierarchical WSNs based on the Chinese Remainder Theorem is proposed. An experimental setup was created to evaluate the scheme. The results indicate that it establishes keys with minimal computation, communication and storage cost at each node, and that it is scalable and resilient to different attacks. © Springer International Publishing Switzerland 2015.

Item A Fine Grain Attribute Enabled Access Control (Institute of Electrical and Electronics Engineers Inc., 2018) Bhusare, S.S.; Pais, A.R.
The Attribute-Based Access Control (ABAC) model has attracted many researchers to devise techniques that improve on its current performance and policy modeling. The XACML policy language standard used by the attribute-based access model uses rules and policy sets to govern the access mechanism. The policy sets are complex rules consisting of several combinations of attributes and attribute values. Different techniques, including policy compression, have been proposed to date to improve performance over the standard ABAC implementation. However, no research has modeled the policy store based on attributes. This paper makes the first attempt at the design of an attribute-enabled access control mechanism that takes advantage of an attribute store residing alongside the policy. The proposed model can also accommodate policies defined in the currently used enterprise model, ABAC. This work also presents a policy conversion method from ABAC to the proposed model. A performance analysis using real-life and synthetic datasets shows the model's applicability to real-world scenarios and the characterized policy sets it intends to satisfy. A comparison of decision evaluation over the proposed model with the other model shows its correctness.
This paper also suggests further research on the modeling of attribute stores and policy sets for access decision computation. © 2018 IEEE.

Item A framework for intrusion tolerance in cloud computing (2011) Karande, V.M.; Pais, A.R.
Cloud computing has been envisioned as the next-generation architecture and one of the fastest growing segments of IT enterprises. No matter how much is invested in cloud intrusion detection and prevention, cloud infrastructure remains vulnerable to attacks. Intrusion tolerance in cloud computing is a fault-tolerant design approach to defend cloud infrastructure against malicious attacks. To ensure dependability, we present a framework that maps the existing Malicious and Accidental Fault Tolerance for Internet Applications (MAFTIA) intrusion tolerance framework, with its dependability attributes of availability, authenticity, reliability, integrity, maintainability and safety, onto the new cloud computing environment. The proposed framework has been validated by integrating the Intrusion Tolerance via Threshold Cryptography (ITTC) mechanism into a simulated cloud environment, and its performance is analyzed. © 2011 Springer-Verlag.

Item A heuristic technique to detect phishing websites using TWSVM classifier (Springer Science and Business Media Deutschland GmbH, 2021) Rao, R.S.; Pais, A.R.; Anand, P.
Phishing websites are on the rise and are often hosted on compromised domains, so that legitimate behavior is embedded into the designed phishing site to evade detection. Traditional heuristic techniques using HTTPS, search engines, page ranking and WHOIS information may fail to detect phishing sites hosted on compromised domains. Moreover, list-based techniques fail to detect phishing sites when the target website is not in the whitelisted data.
In this paper, we propose a novel heuristic technique using TWSVM to detect both maliciously registered phishing sites and sites hosted on compromised servers, overcoming the aforementioned limitations. Our technique detects phishing websites hosted on compromised domains by comparing the log-in page and home page of the visited website. Hyperlink- and URL-based features are used to detect maliciously registered phishing sites. We used different versions of support vector machines (SVMs) for the classification of phishing websites and found that the twin support vector machine (TWSVM) classifier outperformed the other versions, with an accuracy of 98.05% and recall of 98.33%. © 2020, Springer-Verlag London Ltd., part of Springer Nature.

Item A hybrid super learner ensemble for phishing detection on mobile devices (Nature Research, 2025) Rao, R.S.; Kondaiah, C.; Pais, A.R.; Lee, B.
In today's digital age, the rapid increase in online users and massive network traffic has made ensuring security more challenging. Among the various cyber threats, phishing remains one of the most significant. Phishing is a cyberattack in which attackers steal sensitive information, such as usernames, passwords, and credit card details, through fake web pages designed to mimic legitimate websites. These attacks primarily occur via emails or websites. Several anti-phishing techniques, such as blacklist-based, source code analysis, and visual-similarity-based methods, have been developed to counter phishing websites. However, these methods have specific limitations, including vulnerability to zero-day attacks, susceptibility to drive-by downloads, and high detection latency. Furthermore, many of these techniques are unsuitable for mobile devices, which face additional constraints such as limited RAM, smaller screen sizes, and lower computational power.
To address these limitations, this paper proposes a novel hybrid super learner ensemble model named Phish-Jam, a mobile application specifically designed for phishing detection on mobile devices. Phish-Jam utilizes a super learner ensemble that combines predictions from diverse Machine Learning (ML) algorithms to classify legitimate and phishing websites. By extracting features from URLs, including handcrafted features, transformer-based text embeddings, and features from other Deep Learning (DL) architectures, the proposed model offers several advantages: fast computation, language independence, and robustness against accidental malware downloads. Experimental analysis shows that the super learner ensemble achieved an accuracy of 98.93%, precision of 99.15%, MCC of 97.81% and F1 score of 99.07%. © The Author(s) 2025.

Item A LDPC codes based Authentication Scheme (Institute of Electrical and Electronics Engineers Inc., 2020) Kittur, A.S.; Kauthale, S.; Pais, A.R.
Verifying multiple digital signatures in a batch to reduce verification time and computation has interested researchers for many years. Batch verification schemes have been proposed for popular digital signature algorithms such as DSS, RSA, ECDSA and others. If there are any bad signatures in a given batch, the batch verification test fails, but the test does not indicate the location of the bad signatures. In the literature, there are very few efficient schemes that locate the index of the bad signature(s) in a given batch, and these existing schemes perform poorly when the number of bad signatures is unknown or when the entire batch is faulty. To overcome these disadvantages, we propose a new Low-Density Parity-Check (LDPC) code based verification scheme to locate the index of the bad signature(s). Our proposed scheme outperforms the other bad-signature identification schemes.
A comparative analysis of our scheme with the other schemes is provided. A further advantage of the scheme is that it removes all transmission errors in the received batch of signatures. © 2020 IEEE.

Item A new batch verification scheme for ECDSA* signatures (Springer, 2019) Kittur, A.S.; Pais, A.R.
In this paper, we propose an efficient batch verification algorithm for ECDSA* (a modified version of the Elliptic Curve Digital Signature Algorithm) signatures. Our scheme is efficient for both single and multiple signers. ECDSA* accelerates the verification of ECDSA signatures by more than 40%. The highlighting feature of our proposed scheme is its efficiency for varied batch sizes; its performance remains consistent for higher batch sizes (≥ 8) too. The scheme is resistant to forgery attacks by either the signer or an intruder. Our paper also discusses possible attacks on ECDSA signatures and how our scheme resists them. © 2019, Indian Academy of Sciences.

Item A New Combinatorial Design Based Data En-Route Filtering Scheme for Wireless Sensor Networks (Institute of Electrical and Electronics Engineers Inc., 2019) Kumar, A.; Pais, A.R.
Wireless sensor networks are susceptible to report fabrication attacks, where an adversary can use compromised nodes to flood the network with false reports. En-route filtering is a mechanism for dropping bogus/false reports while they are being forwarded towards the sink. The majority of proposed en-route filtering schemes are probabilistic, where the originality of forwarded reports is checked with a fixed probability by intermediate nodes; thus, false reports can travel multiple hops before being dropped. A few deterministic en-route filtering schemes have also been proposed, but all such schemes need to send reports through fixed paths.
To overcome the above-mentioned limitations of existing en-route filtering schemes, we propose a novel deterministic en-route filtering scheme. In the proposed scheme, secret keys are allocated to sensor nodes based on a combinatorial design, which ensures direct communication between any two nodes without adding more key storage overhead. We provide an in-depth analysis of the proposed scheme. It significantly outperforms existing schemes in terms of the expected filtering position of false reports and is more resilient to selective forwarding and report disruption attacks, while performing neck-and-neck with existing schemes in terms of protocol overheads. © 2018 IEEE.

Item A new combinatorial design based key pre-distribution scheme for wireless sensor networks (Springer Verlag, 2019) Kumar, A.; Pais, A.R.
In this paper we present a new Combinatorial Design based Key Pre-Distribution scheme (CD-KPD). For this scheme, the network region is divided into cells of equal size, and each cell has two types of sensor nodes: normal sensor nodes and cluster heads. Within a particular cell, normal sensor nodes can communicate with each other directly, while cluster heads are used for inter-cell communication. To ensure secure communication, we use CD-KPD to assign keys to all sensor nodes, including cluster heads. We further modify CD-KPD into a Combinatorial Design based Reduced Key Pre-Distribution scheme (CD-RKPD) by reducing the number of keys stored in each cluster head; CD-RKPD is needed when the inter-cell communication of each cell is limited to its Lee sphere region. We give an in-detail analysis of both proposed schemes, measuring their resiliency by calculating the fraction of links disrupted and the fraction of cells disconnected when a few sensor nodes in the network are compromised.
We found that CD-KPD and CD-RKPD outperform the scheme of Ruj and Roy (ACM Trans Sens Netw 6(1):4, 2009) by 59% and 6.5% respectively in terms of global resiliency, and by 5% and 9.7% respectively in terms of the fraction of cells disconnected in the network. Further, both proposed schemes achieve higher resiliency than the majority of existing schemes. © 2018, Springer-Verlag GmbH Germany, part of Springer Nature.

Item A new hybrid key pre-distribution scheme for wireless sensor networks (Springer New York LLC, 2019) Kumar, A.; Pais, A.R.
This article presents a novel hybrid key pre-distribution scheme based on combinatorial design keys and pair-wise keys. For the presented scheme, the deployment zone is divided into equal-sized cells. We use combinatorial design based keys to secure intra-cell communication, which helps maintain low key storage overhead in the network. For inter-cell communication, each cell maintains multiple associations with all other cells within communication range, and these associations are secured with pair-wise keys. This helps ensure high resiliency against compromised sensor nodes in the network. We provide an in-depth analysis of the presented scheme, measuring its resiliency by calculating the fraction of links affected and the fraction of nodes disconnected when an adversary compromises some sensor nodes. We find that the presented scheme has higher resiliency and lower storage overhead than the majority of existing schemes. © 2018, Springer Science+Business Media, LLC, part of Springer Nature.

Item A new probabilistic rekeying method for secure dynamic groups (2008) Joshi, S.; Pais, A.R.
Logical Key Hierarchy (LKH) is a basic method for secure multicast group rekeying. LKH maintains a balanced tree which provides a uniform cost of O(log N) for compromise recovery, where N is the group size.
However, it does not distinguish the behavior of group members even though they have different probabilities of joining or leaving. When members have diverse change probabilities, the gap between LKH and the optimal rekeying algorithm grows. The Probabilistic optimization of LKH (PLKH) scheme optimized rekey cost by organizing the LKH tree according to user rekey characteristics. In this paper, we concentrate on further reducing the rekey cost by organizing the LKH tree with respect to the compromise probabilities of members, using new join and leave operations. Simulation results show that our scheme performs 18% to 29% better than PLKH and 32% to 41% better than LKH.

Item A new probabilistic rekeying method for secure multicast groups (2010) Pais, A.R.; Joshi, S.
The Logical Key Hierarchy (LKH) is the most widely used protocol for multicast group rekeying. LKH maintains a balanced tree that provides a uniform cost of O(log N) for compromise recovery, where N is the group size. However, it does not distinguish the behavior of group members even though they may have different probabilities of joining or leaving. When members have diverse change probabilities, the gap between LKH and the optimal rekeying algorithm grows. The Probabilistic optimization of LKH (PLKH) scheme optimized rekey cost by organizing the LKH tree according to user rekey characteristics. In this paper, we concentrate on further reducing the rekey cost by organizing the LKH tree with respect to the rekey probabilities of members, using new join and leave operations. Simulation results show that our scheme performs 18% to 29% better than PLKH and 32% to 41% better than LKH. © 2010 Springer-Verlag.

Item A Novel Cancelable Fingerprint Template Generation Mechanism Using Visual Secret Sharing (Springer Science and Business Media Deutschland GmbH, 2024) Muhammed, A.; Pais, A.R.
In fingerprint-based authentication systems, cancelable fingerprint templates are generated to protect the fingerprint information.
In this paper, we propose a novel cancelable fingerprint template scheme using Visual Secret Sharing (VSS). Using VSS, each fingerprint image is encrypted into different shares; these shares are stored in distinct databases and treated as the fingerprint template. Traditional VSS schemes suffer from pixel expansion and contrast reduction; we use grid-based VSS and data embedding mechanisms to overcome these limitations. The proposed fingerprint templates satisfy the ideal properties of cancelable templates, namely non-invertibility, diversity, and revocability, without degrading the performance of the authentication system. To speed up template generation and reconstruction, we use a General Purpose Graphics Processing Unit (GPGPU) to perform the operations. The experimental evaluation validates that the reconstructed fingerprints perform equivalently to the original fingerprints, with improved security. © Springer Nature Switzerland AG 2024.

Item A novel Deep Learning architecture for lung cancer detection and diagnosis from Computed Tomography image analysis (Elsevier Inc., 2024) Crasta, L.J.; Neema, R.; Pais, A.R.
Timely identification and evaluation of lung nodules, which are precursors to lung cancer, can significantly reduce the incidence rate. Computed Tomography (CT) is the primary technique used for lung cancer screening due to its high resolution. Identifying white, spherical shadows in CT images as lung nodules is essential for accurately detecting lung cancer. Convolutional Neural Network (CNN) based methods have performed better than traditional techniques in various medical image applications. However, challenges remain due to insufficient annotated datasets, significant intra-class variations, and substantial inter-class similarities, which hinder their practical use.
Manually labeling the position of nodules on CT slices is critical for distinguishing between benign and malignant cases, but it is an unreliable and time-consuming process. Insufficient data and class imbalance are the primary factors that can result in overfitting and below-par performance. This paper presents a novel Deep Learning (DL) framework to detect and classify lung cancer in input CT images. It introduces a 3D-VNet architecture for accurate segmentation of pulmonary nodules and a 3D-ResNet architecture for their classification. The segmentation model achieves a Dice Similarity Coefficient (DSC) of 99.34% on the LUNA16 dataset while reducing false positives to 0.4%. The classification model achieves accuracy, sensitivity, and specificity of 99.2%, 98.8%, and 99.6%, respectively. The 3D-VNet network outperforms previous segmentation methods by accurately calibrating lung nodules of various sizes and shapes with excellent robustness, and the classification metrics show that the proposed method outperforms current approaches in accuracy, specificity, sensitivity and F1 score. © 2024 The Authors.

Item A Novel Fingerprint Image Enhancement based on Super Resolution (Institute of Electrical and Electronics Engineers Inc., 2020) Muhammed, A.; Pais, A.R.
The fingerprint is the most common and broadly accepted biometric trait used for personal authentication. In fingerprint-based authentication, the feature extraction module extracts features, and these extracted characteristics are used for authentication. Feature extraction depends heavily on the quality of the fingerprint image. In practice, however, obtaining a good-quality fingerprint image is not always possible, and a notable number of collected fingerprints are of poor quality. Accurately extracting fingerprint characteristics from a low-quality fingerprint image is a challenging problem, and fingerprint enhancement is introduced to resolve this issue.
Hence, in this paper, we introduce a fingerprint enhancement technique using a Deep Convolutional Neural Network (DCNN) that improves image quality. The proposed method consists of super-resolution, followed by filtering and enhancement, and provides better results than conventional fingerprint enhancement methods. The experimental results show that the proposed strategy improves the visual clarity of low-quality images and reduces error rates during fingerprint matching. © 2020 IEEE.

Item A novel fingerprint template protection and fingerprint authentication scheme using visual secret sharing and super-resolution (Springer, 2021) Muhammed, A.; Mhala, N.C.; Pais, A.R.
The fingerprint is the most recommended and widely practiced biometric trait for personal authentication. Most fingerprint authentication systems rely on minutiae as the characteristic for authentication, and these characteristics are stored as fingerprint templates in a database. However, databases are not fully secure and can be compromised. Recent studies reveal that if a person's minutiae points are leaked, the fingerprint can be reconstructed from them. Similarly, if fingerprint records are lost, the damage is permanent: there is no mechanism to replace a fingerprint, as it is part of the human body. Hence, there is a need to secure the fingerprint template in the database. In this paper, we introduce a novel fingerprint template protection and fingerprint authentication scheme using visual secret sharing and super-resolution. During enrollment, a secret fingerprint image is encrypted into n shares, each stored in a distinct database. During authentication, the shares are collected from the various databases, and the original secret fingerprint image is restored using a multiple-image super-resolution procedure. The experimental results show that the reconstructed fingerprints are similar to the original fingerprints.
The proposed method is robust, secure, and efficient in terms of fingerprint template protection and authentication. © 2020, Springer Science+Business Media, LLC, part of Springer Nature.

Item A novel technique for defeating virtual keyboards - Exploiting insecure features of modern browsers (2011) Nadkarni, T.S.; Mohandas, R.; Pais, A.R.
Advancement in technology is a necessity of our time, but as new techniques are introduced, new security vulnerabilities are discovered and exploited in practice. In this paper we present a new approach to defeating virtual keyboards using a new method for capturing parts of the browser screen. The page rendered in the browser is captured using the canvas element provided by HTML5. We detail how this functionality can be exploited and describe a malicious extension we created for the Mozilla Firefox browser that captures screenshots of web pages rendered in the browser and sends them to a remote server. In addition, we suggest mitigation strategies to prevent misuse of such browser functionality. © 2011 Springer-Verlag.

Item A partial key pre-distribution based en-route filtering scheme for wireless sensor networks (Springer Science and Business Media Deutschland GmbH, 2021) Kumar, A.; Bansal, N.; Pais, A.R.
Compromised sensor nodes can be used to inject false (bogus) reports into wireless sensor networks (WSNs), which can cause the sink to make wrong decisions. En-route filtering is a method to detect and filter false reports in WSNs. Most existing en-route filtering schemes use probabilistic approaches, where filtering of false reports is based on a fixed probability; thus false reports can travel multiple hops before being dropped. In this article we seek to overcome the limitations of existing schemes and reduce the overall key storage overhead in the cluster heads.
In this article we propose a combinatorial design based partial en-route filtering scheme (CD-PEFS) which filters fabricated reports deterministically. CD-PEFS reduces the energy requirements of the network through early detection and elimination of false reports. The adoption of combinatorial design based keys removes the shared key discovery phase from the network, which considerably reduces the communication overhead. We carried out a detailed analysis of CD-PEFS against an increasing number of compromised sensor nodes and found that our scheme performs better than existing schemes in terms of filtering efficiency while maintaining low key storage overhead, and that its performance is on par with existing schemes in terms of other protocol overheads. © 2020, Springer-Verlag GmbH Germany, part of Springer Nature.

Item A random key generation scheme using primitive polynomials over GF(2) (Springer Verlag, 2016) Singh, I.; Pais, A.R.
A new key generation algorithm is proposed using primitive polynomials over the Galois Field GF(2). In this approach, the MD5 algorithm is used to digest the system time and the IP address of the system; the combination of these digest values acts as the random seed for the key generation process. The randomness test for the generated key is performed using the Blum Blum Shub (BBS), Micali-Schnorr and Mersenne Twister (MT19937) PRNG algorithms. The generated keys are compared on the basis of 2-bit, 3-bit, 4-bit and 8-bit count values of 0s and 1s. We use the chi-squared test, R-squared test and standard deviation to check the randomness of the generated key. Analyzing the results against these three criteria, we observe that the proposed algorithm achieves lower dispersion in 72.5% of the test cases, a lower error rate in 61.6% of the test cases and a higher fitness value in 68.3% of the test cases.
© Springer Nature Singapore Pte Ltd. 2016.
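The last item above builds its key generator on primitive polynomials over GF(2). As a generic illustration of why primitivity matters for such generators, here is a minimal sketch of a linear feedback shift register (LFSR) keystream generator; it is not the paper's algorithm (the paper seeds its generator with MD5 digests of the system time and IP address), and the tap positions, register width and function names here are illustrative assumptions:

```python
# Illustrative sketch only: a Fibonacci LFSR whose taps (8, 4, 3, 2) come
# from the primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 over GF(2).
# Because the feedback polynomial is primitive, any nonzero 8-bit state
# cycles through all 2^8 - 1 = 255 nonzero states before repeating.
# (A bare LFSR is linear and NOT cryptographically secure on its own;
# real designs post-process or combine its output.)

def lfsr_step(state: int, taps=(8, 4, 3, 2), width: int = 8) -> int:
    """Advance the register one step: shift right, feed the XOR of the
    tapped bits back in at the top."""
    fb = 0
    for t in taps:                              # XOR together the tapped bits
        fb ^= (state >> (width - t)) & 1
    return (state >> 1) | (fb << (width - 1))

def keystream_bytes(seed: int, n: int, width: int = 8) -> bytes:
    """Emit n keystream bytes; the output bit each step is the register LSB."""
    if seed == 0:
        raise ValueError("the all-zero state is a fixed point")
    state, out = seed, bytearray()
    for _ in range(n):
        byte = 0
        for _ in range(8):
            byte = (byte << 1) | (state & 1)    # take the output bit
            state = lfsr_step(state, width=width)
        out.append(byte)
    return bytes(out)
```

With a non-primitive feedback polynomial the state would fall into a shorter cycle; with a primitive one, starting from state 1, `lfsr_step` first returns to 1 only after exactly 255 steps, which is the maximal period for an 8-bit register.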
