Faculty Publications

Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736

Publications by NITK Faculty

Search Results

Now showing 1 - 8 of 8
  • Item
BBRv1 vs BBRv2: Examining Performance Differences through Experimental Evaluation
(IEEE Computer Society, 2020) Nandagiri, A.; Tahiliani, M.P.; Misra, V.; Ramakrishnan, K.K.
BBR, a congestion control algorithm proposed by Google, regulates the source sending rate by deriving an estimate of the bottleneck's available bandwidth and the RTT of the path. The initial version of BBR, called BBRv1, was found to be unfair, getting higher than the fair share of bandwidth when co-existing on bottleneck links with other congestion control algorithms. It also performs poorly in networks whose routers have shallow buffers. To overcome these concerns, a newer version, called BBRv2, has been proposed. Our goal in this paper is to understand the differences between the two versions and examine the primary reasons behind the improvement in performance of BBRv2. We present an experimental evaluation of BBRv1 and BBRv2, evaluating their fairness across connections using the same protocol (intra-protocol fairness) and using different protocols (inter-protocol fairness), as well as delay and link utilization. Experiments with shallow and deep buffers show that BBRv2 is most effective when it uses Explicit Congestion Notification (ECN), but fairness issues continue to exist in BBRv2 when ECN is disabled. A concern for BBRv2 is that it is somewhat complex to deploy in Wide Area Networks (WAN) because of its dependency on the DCTCP-style reduction of the congestion window, which is primarily usable in low-feedback-delay Data Center Networks. © 2020 IEEE.
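Intra- and inter-protocol fairness of the kind evaluated here is commonly quantified with Jain's fairness index; the paper does not specify its metric, so the sketch below is only illustrative, and the throughput values are hypothetical:

```python
def jain_fairness(throughputs):
    """Jain's fairness index: 1.0 = perfectly fair split, 1/n = maximally unfair."""
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

# Two flows sharing a bottleneck: an uneven split vs. an even one (Mbps, hypothetical).
print(jain_fairness([9.0, 1.0]))  # well below 1.0 -> unfair
print(jain_fairness([5.0, 5.0]))  # exactly 1.0 -> fair
```

The index is scale-invariant, so it compares fairness across runs with different absolute bottleneck capacities.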
  • Item
    FMCW Radar-Based UAV Detection and Tracking Using Transfer Learning
    (Institute of Electrical and Electronics Engineers Inc., 2024) Sreekumar, S.; Shashank, S.K.; Srihari, P.; Nandagiri, A.; Vandana, G.S.; Pardhasaradhi, B.P.; Cenkarmaddi, L.R.
This study presents a novel method for detecting and tracking unmanned aerial vehicles (UAVs) by combining transfer-learning neural networks with Frequency Modulated Continuous Wave (FMCW) radar. The system utilizes a 60 GHz Texas Instruments IWR6843ISK radar with a DCA1000 board to capture raw radar signals, which are subsequently processed to generate range-angle heat maps. Ground truth data for UAV positioning is meticulously obtained using a dual GPS setup, where one GPS is stationed at the radar and the other is mounted on the UAV. The processed range-angle heat maps serve as the input for various transfer learning models, including DenseNet, InceptionV3, MobileNet, ResNet, and VGG, which are employed to estimate the range and angle of the UAV. The results emphasize the potential of transfer learning in improving radar signal processing by demonstrating the effectiveness of these models in attaining accurate UAV detection and tracking. This approach is pivotal for applications requiring precise UAV monitoring, offering a robust solution for scenarios where traditional radar systems may fall short. The study underscores the advantages of leveraging transfer learning for improved radar-based UAV detection and sets the stage for future advancements in autonomous aerial monitoring and surveillance systems. © 2024 IEEE.
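The range-angle heat maps that feed the transfer-learning models can be produced with a range FFT over fast-time samples followed by an angle FFT across RX antennas; the sketch below uses a single synthetic chirp with hypothetical signal parameters, not the IWR6843ISK processing chain:

```python
import numpy as np

def range_angle_heatmap(adc, n_range=64, n_angle=64):
    """adc: complex array of shape (n_samples, n_rx) -- one chirp across RX antennas.
    Range FFT along fast-time samples, then angle FFT across the antenna array."""
    r = np.fft.fft(adc, n=n_range, axis=0)                          # range bins
    a = np.fft.fftshift(np.fft.fft(r, n=n_angle, axis=1), axes=1)   # angle bins
    return 20 * np.log10(np.abs(a) + 1e-12)                         # magnitude in dB

# Synthetic single target: a beat frequency (range) plus a phase ramp across 4 RX antennas.
n_s, n_rx = 128, 4
t = np.arange(n_s)[:, None]
k = np.arange(n_rx)[None, :]
adc = np.exp(2j * np.pi * (0.1 * t + 0.2 * k))   # hypothetical target signature
hm = range_angle_heatmap(adc)
print(hm.shape)  # (64, 64)
```

Such heat maps are then treated as images, which is what makes ImageNet-pretrained backbones like MobileNet applicable via transfer learning.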
  • Item
    FMCW Radar-Based Detection and Tracking of Drones Using DBSCAN Clustering and Extended Kalman Filter for Anti-Drone Defense Systems
    (Institute of Electrical and Electronics Engineers Inc., 2024) Srihari, P.; Vandana, G.S.; Kumar, U.; Nandagiri, A.; Pardhasaradhi, B.P.; Cenkarmaddi, L.R.
This paper aims to develop a radar-based detection and tracking system to mitigate the threats posed by drones, particularly those carrying malicious payloads. Due to the limitations of cameras in adverse weather and the high costs of LiDAR systems, radar technology is employed as a cost-effective alternative. The system utilizes a 3D FFT followed by CA-CFAR for drone range-azimuth detections. The range-azimuth detections are clustered using DBSCAN. We simplified the extended target tracking problem into point target tracking based on the drone's size, with the DBSCAN cluster center acting as the measurement for the tracker. The tracking algorithm combines an Extended Kalman Filter (EKF) with Global Nearest Neighbor (GNN) data association. Experiments were conducted using a 77 GHz AWR1642 radar sensor to track a hexacopter-type micro drone within a range of 10 m to 100 m. The results demonstrated effective tracking capabilities, with the radar sensors successfully generating tracks. This study highlights the viability of radar-based systems for anti-drone applications, offering a practical solution for enhancing infrastructure security against potential drone threats. © 2024 IEEE.
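The point-target simplification — the DBSCAN cluster center standing in for the whole drone as the tracker's measurement — can be sketched with a linear constant-velocity Kalman filter; the paper's actual tracker is an EKF with GNN association over range-azimuth measurements, so this is only a minimal stand-in with hypothetical noise covariances:

```python
import numpy as np

def cluster_center(points):
    """Point-target simplification: use the cluster centroid as the measurement."""
    return np.mean(points, axis=0)

class CVKalman:
    """Constant-velocity Kalman filter on state (x, y, vx, vy).
    A linear sketch of the EKF stage; covariances below are hypothetical."""
    def __init__(self, x0, dt=0.1):
        self.x = np.array([x0[0], x0[1], 0.0, 0.0])
        self.P = np.eye(4) * 10.0
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)            # measure position only
        self.Q = np.eye(4) * 0.01        # process noise
        self.R = np.eye(2) * 0.5         # measurement noise

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

detections = np.array([[10.1, 5.0], [10.0, 5.1], [9.9, 4.9]])  # one DBSCAN cluster (m)
kf = CVKalman(cluster_center(detections))
est = kf.step(np.array([10.2, 5.1]))  # next-frame cluster center
print(est)
```

GNN association would sit between clustering and the update step, assigning each frame's cluster centers to existing tracks by nearest statistical distance.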
  • Item
    Real-Time UAV Altitude Estimation and Data Transmission via mmWave Radar and Edge Computing
    (Institute of Electrical and Electronics Engineers Inc., 2024) Vandana, G.S.; Srihari, P.; Kumar, U.; Nandagiri, A.; Pardhasaradhi, B.P.; Cenkarmaddi, L.R.
    This paper presents a novel approach for UAV altitude estimation and data transmission using a 60 GHz IWR6843 mmWave radar mounted on a micro-drone, coupled with a Raspberry Pi edge device. The radar, configured in a long-range mode, leverages its high accuracy in altitude measurement, surpassing the performance of traditional UAV altimeters. The radar altimeter data is processed on the Raspberry Pi and wirelessly transmitted to the cloud, from which it can be accessed by a ground station for real-time monitoring and analysis. To validate the accuracy of the radar-based altitude measurements, GPS data is simultaneously recorded on the UAV, serving as a ground truth reference. Experimental results demonstrate that the radar-based measurements closely match the GPS-derived altitudes, showcasing the effectiveness of the proposed system. This approach not only improves altitude estimation accuracy but also enhances the reliability of UAV operations in various environments. Potential applications of this system include precision agriculture, disaster management, and search and rescue operations, where accurate altitude data is critical for mission success. The integration of mmWave radar with edge computing and cloud-based data management opens new avenues for real-time UAV monitoring and autonomous navigation. © 2024 IEEE.
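Validating the radar-based altitudes against the GPS ground truth reduces to an error metric such as RMSE over time-aligned samples; the readings below are hypothetical, not the paper's data:

```python
import math

def rmse(radar_alt, gps_alt):
    """Root-mean-square error of radar altitudes against GPS ground truth."""
    return math.sqrt(sum((r - g) ** 2 for r, g in zip(radar_alt, gps_alt)) / len(radar_alt))

radar = [10.2, 15.1, 19.8, 25.3]   # hypothetical radar altimeter readings (m)
gps   = [10.0, 15.0, 20.0, 25.0]   # simultaneous GPS ground truth (m)
print(rmse(radar, gps))            # sub-metre error in this toy example
```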
  • Item
    MIMO-SAR Image Reconstruction Experiment Using Back-Projection Algorithm with Automotive Radar for ADAS Applications
    (Institute of Electrical and Electronics Engineers Inc., 2024) Jena, P.; Singh, A.; Vandana, S.G.; Nandagiri, A.; Srihari, P.; Pardhasaradhi, B.; Cenkarmaddi, L.R.
Synthetic aperture radar (SAR) imaging has numerous uses in surface mapping, civil infrastructure, remote sensing, and terrain monitoring. Despite the benefits of multiple-input multiple-output (MIMO) in automotive radars, they are primarily used to provide range, azimuth, and elevation information for automotive applications. Obtaining acceptable angular resolution is a recurring challenge for automotive radar in vehicle-to-vehicle, vehicle-to-ground, vehicle-to-guardrail, and vehicle-to-tunnel discrimination. The purpose of this work is to demonstrate MIMO-SAR for finer angular resolution utilizing the 77-GHz Texas Instruments (TI) frequency-modulated continuous wave (FMCW) AWR1642 radar. SAR and MIMO radar topologies are used to increase the synthetic or virtual aperture while maintaining adequate angular resolution. SAR images are reconstructed from experimental data using a back-projection algorithm. The findings are presented for SAR, MIMO, and MIMO-SAR. Furthermore, the experimental demonstration of MIMO-SAR using 77 GHz automotive radar verifies the prior modeling results. In addition, MIMO-SAR has been shown to provide better angular resolution than SAR and MIMO approaches. This algorithm's superior performance makes it appropriate for the automotive industry to perform SAR imaging on ego-corner deployed short-range radars (SRR) to image guard rails, crossing vehicles, and vulnerable road users (VRUs). © 2024 IEEE.
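Time-domain back-projection coherently sums, for each image pixel, the range-compressed sample at that pixel's round-trip distance from each aperture position. A minimal sketch with a synthetic point target (geometry, bin spacing, and aperture are all hypothetical):

```python
import numpy as np

def backprojection(range_profiles, positions, grid_x, grid_y, dr):
    """range_profiles: (n_pos, n_bins) complex range-compressed data,
    positions: (n_pos, 2) radar positions along the synthetic aperture,
    dr: range-bin spacing in metres. Returns the image magnitude."""
    img = np.zeros((len(grid_y), len(grid_x)), dtype=complex)
    for profile, pos in zip(range_profiles, positions):
        for iy, y in enumerate(grid_y):
            for ix, x in enumerate(grid_x):
                b = int(round(np.hypot(x - pos[0], y - pos[1]) / dr))
                if b < len(profile):
                    img[iy, ix] += profile[b]   # coherent sum at the pixel's range bin
    return np.abs(img)

# Synthetic aperture along x, one point target at (0, 5) m.
dr = 0.1
positions = np.array([[x, 0.0] for x in np.linspace(-1, 1, 9)])
profiles = np.zeros((9, 128), dtype=complex)
target = np.array([0.0, 5.0])
for i, p in enumerate(positions):
    profiles[i, int(round(np.hypot(*(target - p)) / dr))] = 1.0
gx = np.linspace(-1, 1, 21); gy = np.linspace(4, 6, 21)
img = backprojection(profiles, positions, gx, gy, dr)
peak = np.unravel_index(np.argmax(img), img.shape)
print(gx[peak[1]], gy[peak[0]])   # peak lands near the target (0, 5)
```

In the MIMO-SAR case, the same sum also runs over the virtual TX-RX channel pairs, which is what enlarges the effective aperture.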
  • Item
    Cyber Attacking Active FMCW Radar Signal AoA Estimation Using Passive FMCW Radar for ADAS Applications
    (Institute of Electrical and Electronics Engineers Inc., 2024) Prakash, A.S.; Vandana, S.G.; Nandagiri, A.; Srihari, P.; Pardhasaradhi, B.; Cenkarmaddi, L.R.
Millimeter-wave (mmWave) radars are a popular choice for Advanced Driver Assistance Systems (ADAS) that identify and track objects in the field of view. These mmWave radars (the primary radar on ego vehicles) are susceptible to interference signals from other mmWave radars (secondary radars on traffic participant vehicles) in the vicinity, which can result in false detection and tracking triggers. Knowing the interference signal's angle of arrival (AoA) is critical for locating the secondary radar source. This study discusses experiments with AoA estimation of interference signals created by secondary radars while the primary radar is in a passive state. We performed a 3-dimensional Fast Fourier Transform (FFT) on the received I-Q data and used a range-angle heatmap image to determine the signal's spatial pattern. The 3D FFT (range FFT on time-domain ADC samples, velocity FFT on chirps, and angle FFT across antennas) calculates the AoA of the signals. In this experiment, the 77 GHz IWR1642 primary radar is in passive mode, while the other 77 GHz secondary radars (AWR1642 and AWR2944) are in active mode, mounting an interference attack. We also experimented with different ranges (2 m, 3 m, 5 m, and 8 m) and azimuths to determine the stealthiness of the attack. AoA estimation for passive radar is a good fit for identifying spurious sources/illuminators of opportunity, electronic counter-countermeasures (ECCM), source localization, knowledge-aided passive radar systems, and cognitive radar development. © 2024 IEEE.
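The angle-FFT stage of the 3D FFT maps the peak bin to an AoA via arcsin of the normalized spatial frequency; a sketch for a uniform linear array with half-wavelength spacing (array size, FFT length, and the 20-degree source are hypothetical):

```python
import numpy as np

def estimate_aoa(rx_snapshot, n_fft=64, d=0.5):
    """Angle FFT across a uniform linear array (element spacing d in wavelengths).
    The peak bin gives the spatial frequency; AoA follows via arcsin."""
    spec = np.fft.fftshift(np.fft.fft(rx_snapshot, n=n_fft))
    k = np.argmax(np.abs(spec)) - n_fft // 2   # signed frequency bin
    return np.degrees(np.arcsin(k / (n_fft * d)))

# Simulated interference arriving from 20 degrees on a 4-element half-wavelength array.
theta = np.radians(20.0)
rx = np.exp(2j * np.pi * 0.5 * np.sin(theta) * np.arange(4))
print(round(estimate_aoa(rx), 1))   # close to the true 20 degrees
```

With only 4 physical antennas the angular resolution is coarse; zero-padding the FFT refines the peak location but not the underlying resolution.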
  • Item
    Automotive Radar Signal Authentication via Correlation and Power Spectral Density
    (Institute of Electrical and Electronics Engineers Inc., 2024) Vishnu Prasad, P.; Vandana, G.S.; Nandagiri, A.; Srihari, P.; Pardhasaradhi, B.; Cenkarmaddi, L.R.
Because of their comprehensive target detection, classification, and tracking capabilities, mm-wave radars are becoming increasingly popular in advanced driver assistance systems (ADAS). Unfortunately, these radars are vulnerable to attacks such as jamming and spoofing. This research presents a simple and low-cost radar signal authentication method that can be used in automotive radar receivers lacking external hardware or networking systems. The proposed technique classifies incoming signals as interference-free or not using correlation and power spectral density (PSD) measurements, and it can be deployed swiftly via a firmware update. As an example, the Texas Instruments (TI) IWR1642 frequency modulated continuous wave (FMCW) radar is tested in both non-jamming and jamming situations. The return signals are processed to obtain the correlation and PSD observations and thereby classify the signals. © 2024 IEEE.
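A minimal version of the correlation-plus-PSD check correlates the return against the expected chirp and thresholds a periodogram PSD estimate; the signals, thresholds, and jammer model below are hypothetical, not the paper's calibration:

```python
import numpy as np

def psd(signal):
    """Periodogram estimate of the power spectral density."""
    return np.abs(np.fft.fft(signal)) ** 2 / len(signal)

def is_interference_free(rx, reference, psd_thresh, corr_thresh=0.8):
    """Flag a return as clean when it correlates with the expected chirp
    and its PSD shows no excess spectral peak (both thresholds hypothetical)."""
    corr = np.abs(np.vdot(rx, reference)) / (np.linalg.norm(rx) * np.linalg.norm(reference))
    return bool(corr >= corr_thresh and psd(rx).max() <= psd_thresh)

t = np.arange(256) / 256
chirp = np.exp(2j * np.pi * (10 * t + 20 * t ** 2))       # reference FMCW chirp
clean = chirp * 0.9                                        # attenuated true return
jammed = chirp + 3.0 * np.exp(2j * np.pi * 60 * t)         # strong CW jammer added
thresh = psd(chirp).max() * 1.5
print(is_interference_free(clean, chirp, thresh))   # clean return passes
print(is_interference_free(jammed, chirp, thresh))  # jammer concentrates power, fails
```

The appeal of this style of check is that both statistics reuse FFT hardware already present in the radar's processing chain, which is why a firmware-only deployment is plausible.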
  • Item
    Depth Information Fusion Using Radar-LiDAR-Camera Experimental Setup for ADAS Applications
    (Institute of Electrical and Electronics Engineers Inc., 2024) Dayananda, N.B.; Srivastava, N.; Achala, G.; Nandagiri, A.; Srihari, P.; Pardhasaradhi, B.; Cenkarmaddi, L.R.
Improved scene perception makes the safe driving of automotive vehicles (AVs) feasible. The most common automotive sensors for AV perception for detection, classification, and tracking are the Light Detection And Ranging (LiDAR), radar, and camera sensors. The most reliable sensors for determining range are LiDAR and radar. In this research, we use referencing from camera-based object recognition to fuse the LiDAR and radar point cloud data. To minimize any unintended effects from sensor orientation and sampling time, all three sensors are installed, calibrated, and time-aligned for this experiment. Subsequently, the obtained camera sensor data is subjected to object detection using a MobileNet-based deep neural network (DNN). The radar and LiDAR point cloud data are projected with the two-dimensional bounding box width, length, and height used for object recognition. Following that, the range information from the radar and LiDAR is retrieved and combined using a weighted average fusion algorithm. This experiment is run on the ROS platform, using an AWR1642 radar sensor and a RealSense LiDAR Camera L515 sensor. Object detection from the camera, combined with fusion of the radar and LiDAR data, is a promising algorithm for the Advanced Driver Assistance System (ADAS) emergency brake assist (EBA) function. © 2024 IEEE.
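A common form of weighted average fusion weights each sensor's range by its inverse variance; the paper does not state its weights, so the variances below are hypothetical:

```python
def fuse_ranges(r_radar, var_radar, r_lidar, var_lidar):
    """Inverse-variance weighted average of radar and LiDAR range estimates.
    The lower-variance sensor dominates the fused result."""
    w_r = 1.0 / var_radar
    w_l = 1.0 / var_lidar
    return (w_r * r_radar + w_l * r_lidar) / (w_r + w_l)

# Hypothetical readings for one camera-detected object: LiDAR assumed less noisy here.
print(fuse_ranges(12.4, 0.09, 12.1, 0.01))  # fused range sits closer to the LiDAR value
```

Inverse-variance weighting is the minimum-variance linear combination of two unbiased, independent estimates, which makes it a natural default when per-sensor noise models are available.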