<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Autonomous Systems, Sensor Fusion, Digital Twins]]></title><description><![CDATA[Documenting Innovation in Real-Time]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/blog</link><generator>RSS for Node</generator><lastBuildDate>Thu, 09 Apr 2026 04:05:19 GMT</lastBuildDate><atom:link href="https://raffayhassan772.wixsite.com/autonomous-systems/blog-feed.xml" rel="self" type="application/rss+xml"/><item><title><![CDATA[YOLOv5n vs YOLOv8n: Generational Performance Evaluation in CARLA Simulation]]></title><description><![CDATA[Phase: 1 (Simulation Extended Model Evaluation) Focus: Architecture comparison + adverse weather robustness + real-time stability Overview Following the initial YOLO implementation documented in the collision avoidance scenarios, this analysis compares two generational models: YOLOv5n (2020) and YOLOv8n (2023). The objective is to evaluate whether the newer YOLOv8 architecture provides meaningful improvements over the mature YOLOv5 for real-time collision detection. Both models were...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/yolov5n-vs-yolov8n-generational-performance-evaluation-in-carla-simulation</link><guid isPermaLink="false">69cfd6ea462bc80100bf2262</guid><pubDate>Fri, 03 Apr 2026 15:26:18 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_97efe3dfa89144a19b5d3cab03e3850b~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[First Real Obstacle Tests: What the Data Shows]]></title><description><![CDATA[With the reactive controller in place and the thresholds corrected, two test sessions were run on 1 April 2026.
The first session placed multiple obstacles across the car's path to force repeated avoidance manoeuvres over an extended run. The second was a single-obstacle test to validate a clean detection and escape cycle. Both sessions were logged to CSV and analysed after each run. This post walks through what the data shows. The Test Setup Test 1 placed multiple obstacles at various...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/first-real-obstacle-tests-what-the-data-shows</link><guid isPermaLink="false">69ce3b612a4608ae001c45ed</guid><pubDate>Thu, 02 Apr 2026 10:46:57 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_8999efd048b442d2bc0c1b611fde62e7~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Rethinking the Avoidance Logic: A Smarter, More Fluid Approach]]></title><description><![CDATA[The previous post documented getting everything off the bench and onto the car for the first time. The state machine worked in the sense that the car did reverse, turn, and escape. But there was a persistent problem: in anything tighter than a wide open space, the car got confused. It would reverse correctly, then get stuck trying to turn in one direction repeatedly. It would glide toward an obstacle and not stop in time. The root cause was not a single bug. 
It was an architectural decision....]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/rethinking-the-avoidance-logic-a-smarter-more-fluid-approach</link><guid isPermaLink="false">69ce31162a4608ae001c281a</guid><pubDate>Thu, 02 Apr 2026 09:45:23 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_e889b0b49c6e41e7a2e3166e3c9cba90~mv2.png/v1/fit/w_1000,h_648,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[From Bench to Car: The System That Finally Drives Itself]]></title><description><![CDATA[Image 1: RC Car with Components The previous post documented bench testing in full. The LiDAR was tuned with a persistence filter and a narrow forward FOV. The radar pipeline had MTI clutter suppression, CFAR detection, and a nearest-neighbour tracker. YOLO was running on CUDA with bounding-box distance estimation. The sensor health monitoring layer was built. Everything was sitting on a desk pointing at a wall. The next step was getting it on the car. That sentence sounds simple. It was...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/from-bench-to-car-the-system-that-finally-drives-itself</link><guid isPermaLink="false">69ca0ed38bb7e54baa6b7857</guid><pubDate>Mon, 30 Mar 2026 06:02:08 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_6f8e098556aa4ac3bf74e19aef7ce830~mv2.jpeg/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Why CARLA Is Not the Digital Twin in My Project]]></title><description><![CDATA[When the term digital twin is mentioned in autonomous vehicle research, many people assume it refers to a simulation environment. In many cases, this leads to the assumption that the CARLA simulator itself acts as the digital twin. However, in my project this interpretation is not entirely accurate.
CARLA plays an important role in the development process, but it is not the digital twin. Understanding this distinction was one of the most important conceptual steps in the project. What CARLA...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/why-carla-is-not-the-digital-twin-in-my-project</link><guid isPermaLink="false">69b2dc9519a7028f2bc1cb64</guid><pubDate>Thu, 12 Mar 2026 15:34:11 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Why My Autonomous Vehicle Project Is Different From Existing Solutions]]></title><description><![CDATA[Autonomous driving has become one of the most rapidly developing areas of robotics and artificial intelligence. Companies such as Waymo, Tesla, and Mobileye have spent years developing sophisticated perception systems capable of detecting objects, predicting movement, and making complex driving decisions. At first glance, building an autonomous vehicle perception system might appear to simply replicate what these companies have already achieved. However, my final-year project takes a...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/why-my-autonomous-vehicle-project-is-different-from-existing-solutions</link><guid isPermaLink="false">69b2db84ecfce39c49aa05b0</guid><pubDate>Thu, 12 Mar 2026 15:30:10 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Tuning LiDAR, Radar, and Adding Camera-Based Distance to the Stack]]></title><description><![CDATA[The hardware is all there. The LiDAR is spinning, the radar is streaming tracks, and everything lands in one dashboard on the Jetson. But getting the sensors to a point where they actually mean something, where a warning is a warning and silence is silence, took a lot longer than getting them plugged in.
This post covers what I changed on the LiDAR filtering, the radar signal processing pipeline, how YOLO estimates distance from bounding boxes, and the sensor health monitoring layer I...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/tuning-lidar-radar-and-adding-camera-based-distance-to-the-stack</link><guid isPermaLink="false">69b047d7be29996eed565d88</guid><pubDate>Tue, 10 Mar 2026 16:55:49 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[YOLOv8 on Jetson CUDA: Trying to Fix the LiDAR Noise Problem With a Camera]]></title><description><![CDATA[The LiDAR had a problem I couldn't ignore: it couldn't tell a real obstacle from a reflection. A shiny floor, a piece of glass, even the wrong angle of light: all of them would produce a confident STOP alert from nothing. The persistence filter helped, but it didn't solve the root issue. The solution was to add a camera and make the LiDAR and camera cross-validate each other. An obstacle only triggers an alert if both sensors agree something is there. The LiDAR provides range, the camera...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/yolov8-on-jetson-cuda-trying-to-fix-the-lidar-noise-problem-with-a-camera</link><guid isPermaLink="false">69a6f096b220091e29d7d023</guid><pubDate>Tue, 03 Mar 2026 14:48:57 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_3621c22f91f3431a91f391852ba8286d~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Fusing LiDAR and Radar Into One Collision Decision]]></title><description><![CDATA[With the LiDAR running on the Jetson and the radar streaming tracks from the Pi, the next step was getting them talking to each other. Two independent sensors, two separate data streams, and I needed one unified answer: is the path ahead safe or not?
Still all bench testing at this stage: sensors on a desk pointed across the room. The goal here was to get the fusion logic, threading model, and live GUI working correctly before anything goes near the actual car. How the Two Devices Connect The...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/fusing-lidar-and-radar-into-one-collision-decision</link><guid isPermaLink="false">69a6eeb97c30ee37982fce7f</guid><pubDate>Tue, 03 Mar 2026 14:29:49 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_9c81e24e23a64eee8fce8566ac65e34f~mv2.jpg/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Concept Milestone Poster Reflection Blog]]></title><description><![CDATA[Last week I presented my research poster on “Sensor-Driven Digital Twin Framework for Collision Prevention in Autonomous Systems.” The poster focused on my investigation into improving perception reliability in autonomous vehicles through multi-sensor fusion and simulation-based validation. The main concept of my poster was to demonstrate how combining camera-based computer vision, LiDAR distance sensing, and mmWave radar velocity measurement can improve robustness compared to single-sensor...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/concept-milestone-poster-reflection-blog</link><guid isPermaLink="false">69a6ed6affff75773eb67173</guid><pubDate>Tue, 03 Mar 2026 14:20:09 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_93d990cdd1ff4480b0ba8d6067e92565~mv2.jpg/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Building the Sensor Foundation: LD06 LiDAR and 60 GHz Radar]]></title><description><![CDATA[I've been working on the hardware phase: three sensors, two boards, and one unified dashboard to tie it all together.
Before any of it goes near the actual car though, I wanted to build and validate each piece on the bench first. No point bolting things to a chassis if the software isn't working yet. This first post covers the two range sensors: the LD06 LiDAR and the BGT60TR13C 60 GHz radar on the DreamHAT+ board. They do very different things: the LiDAR sees geometry, the radar sees velocity...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/building-the-sensor-foundation-ld06-lidar-and-60-ghz-radar</link><guid isPermaLink="false">69a1c2779d34acb7c434914f</guid><pubDate>Fri, 27 Feb 2026 16:41:16 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_10ccedade2bf4b8fa8c3b354f5299307~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Phase 2 Real-World Hardware Platform for Autonomous Collision Avoidance]]></title><description><![CDATA[After developing and validating the perception algorithms in simulation using the CARLA autonomous driving environment, the project progressed to a real-world deployment on physical hardware. This phase focuses on transferring the digital twin concepts into a functioning autonomous system capable of sensing and reacting to real obstacles. Unlike simulation, real environments introduce sensor noise, communication delays, imperfect measurements, and physical constraints.
Therefore, Phase 2...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/phase-2-real-world-hardware-platform-for-autonomous-collision-avoidance</link><guid isPermaLink="false">69a1c01150fcf4a4279fde5c</guid><pubDate>Fri, 27 Feb 2026 16:10:52 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_16d57c9414014c589c268406a6bc34d2~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[YOLO Implementation and Performance Analysis Pipeline]]></title><description><![CDATA[Phase: 1 (Simulation Baseline) Focus: YOLOv8n inference + lane-relevance filtering + performance reporting Overview This blog documents the YOLO implementation used, including the offline playback pipeline designed to evaluate detection behaviour and real-time feasibility. The aim was not only to run detections, but to quantify performance (latency/FPS) and reduce irrelevant detections using a lane-focused filtering approach. Why YOLO in This Phase? YOLO provides fast object detection and...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/yolo-implementation-and-performance-analysis-pipeline</link><guid isPermaLink="false">69935a52cf429c4fcb4cad21</guid><pubDate>Mon, 16 Feb 2026 18:24:19 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_29554ae3b8f1428197c208f256380f38~mv2.png/v1/fit/w_1000,h_612,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Collision Avoidance Scenarios with Multi-Sensor Fusion in CARLA]]></title><description><![CDATA[Phase: 1 (Scenario Baseline) Focus: Lane following + radar/LiDAR fusion + TTC + emergency braking + avoidance manoeuvre Overview This blog documents the Phase 1 scenario implementation of an autonomous collision avoidance pipeline in CARLA.
The focus was to create repeatable scenarios, apply multi-sensor fusion, and validate safety behaviours (emergency braking and post-stop manoeuvres) under both normal and adverse weather conditions. Scenario Design (Town04) Town04 was selected due to its...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/collision-avoidance-scenarios-with-multi-sensor-fusion-in-carla</link><guid isPermaLink="false">699352d47ae93e4cf2f8654f</guid><pubDate>Mon, 16 Feb 2026 17:52:48 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_3b45265d65f7415888d372cd98516336~mv2.png/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[What Is the Digital Twin in My Project?]]></title><description><![CDATA[One of the most important conceptual clarifications in my final-year project was understanding that the Digital Twin is not a separate software application like CARLA, Unreal Engine, or a commercial industrial platform. Instead, the Digital Twin is an architectural layer within the system. This distinction is fundamental to understanding the contribution of the project. Digital Twin ≠ Simulator It is easy to assume that CARLA itself is the Digital Twin, since it provides: A 3D virtual...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/what-is-the-digital-twin-in-my-project</link><guid isPermaLink="false">698c931a5ce248ef44890388</guid><pubDate>Wed, 11 Feb 2026 14:33:32 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Project Setup Finalisation]]></title><description><![CDATA[Sensor-Driven Digital Twin for Collision Prevention The infrastructure for the Sensor-Driven Digital Twin project has now been fully finalised. With the system architecture clearly defined and all major components allocated to dedicated hardware, the project is ready to transition into structured implementation. 
The setup phase focused on establishing a scalable, low-latency architecture capable of supporting both simulation-based testing and real sensor-driven operation. Dedicated Linux GPU...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/project-setup-finalisation</link><guid isPermaLink="false">698c8f18d9c323c10df5d18b</guid><pubDate>Wed, 11 Feb 2026 14:16:43 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_49955b56bb5448cab75f196026f2f1c1~mv2.png/v1/fit/w_883,h_488,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Weekly Reflection: Peer Feedback on My Project Idea]]></title><description><![CDATA[This week, I took part in a cross-disciplinary peer feedback session where students from different engineering departments discussed and reviewed each other’s project ideas. The aim of the activity was to gather constructive feedback and refine our project direction through discussion. I presented my research project on developing a sensor-driven digital twin for collision prevention, which combines computer vision, LiDAR, and mmWave radar to evaluate safety behaviour in simulation and on a...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/weekly-reflection-peer-feedback-on-my-project-idea</link><guid isPermaLink="false">698469f753744f609cc26c5c</guid><pubDate>Thu, 05 Feb 2026 10:01:05 GMT</pubDate><enclosure url="https://static.wixstatic.com/media/7531f0_48e37b2a440346a0804d35d17c08698c~mv2.jpeg/v1/fit/w_1000,h_1000,al_c,q_80/file.png" length="0" type="image/png"/><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Research Day: Is Sensor Fusion Better Than Computer Vision Alone?]]></title><description><![CDATA[One of the key questions explored in this project is whether sensor fusion provides a more reliable collision-prevention strategy than relying solely on computer vision.
With modern deep-learning models such as YOLO achieving strong real-time performance, it is reasonable to question whether additional sensors are necessary. Computer vision is highly effective at identifying what is present in the environment. Vision-based models can detect and classify vehicles, pedestrians, and obstacles...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/research-day-is-sensor-fusion-better-than-computer-vision-alone</link><guid isPermaLink="false">698347f499fa9e70ea64d0e3</guid><pubDate>Wed, 04 Feb 2026 13:22:19 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item><item><title><![CDATA[Aims and Objectives]]></title><description><![CDATA[Project Aim The aim of this project is to design, implement, and evaluate a sensor-driven digital twin for collision prevention, with a particular focus on assessing whether multi-sensor fusion offers a more reliable and robust safety strategy than computer vision alone. The project uses simulation and scaled hardware experiments to enable safe, repeatable testing of perception and decision-making behaviour in autonomous systems. Project Objectives To achieve this aim, the project is...
One of the biggest challenges in this field is validating perception and decision-making systems reliably, especially when...]]></description><link>https://raffayhassan772.wixsite.com/autonomous-systems/post/starting-my-research-project-journey-building-a-sensor-driven-digital-twin-for-vehicle-safety</link><guid isPermaLink="false">6979fb4fff1fc7c00331b0c8</guid><pubDate>Wed, 28 Jan 2026 12:04:45 GMT</pubDate><dc:creator>Raffay Hassan</dc:creator></item></channel></rss>