
Phase 2: Real-World Hardware Platform for Autonomous Collision Avoidance

  • Writer: Raffay Hassan
  • Feb 27
  • 3 min read

After developing and validating the perception algorithms in simulation using the CARLA autonomous driving environment, the project progressed to a real-world deployment on physical hardware. This phase focuses on transferring the digital twin concepts into a functioning autonomous system capable of sensing and reacting to real obstacles.

Unlike simulation, real environments introduce sensor noise, communication delays, imperfect measurements, and physical constraints. Therefore, Phase 2 emphasizes robust sensor fusion, distributed processing, and real-time decision making on embedded edge devices.


System Architecture Overview

The hardware platform consists of a scaled RC vehicle carrying multiple sensors and two computing units. Processing is distributed to improve efficiency and reliability.

Jetson Orin Nano (Primary Node)

  • Camera input and YOLO object detection

  • LiDAR processing for distance measurement

  • Sensor fusion

  • Time-to-Collision (TTC) calculation

  • Final collision avoidance decisions

Raspberry Pi 5 (Secondary Node)

  • mmWave radar processing

  • Relative velocity estimation

  • Transmission of radar data to the Jetson via IP networking

Radar measurements are sent over the network using the Jetson’s IP address, enabling wireless communication without a direct cable connection.
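The exact wire format of those radar messages is not specified in this post; as one illustration, the structured payload could be a small JSON record like the following (the field names and schema are assumptions, not the project's actual format):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class RadarTarget:
    """One radar detection (hypothetical schema for Pi-to-Jetson messages)."""
    timestamp: float      # seconds since epoch, Pi clock
    range_m: float        # radial distance to the target (metres)
    velocity_mps: float   # relative (Doppler) velocity; negative = approaching
    snr_db: float         # detection quality

def encode(target: RadarTarget) -> bytes:
    """Serialise a target for transmission over TCP or UDP."""
    return json.dumps(asdict(target)).encode("utf-8")

def decode(payload: bytes) -> RadarTarget:
    """Recover the target record on the Jetson side."""
    return RadarTarget(**json.loads(payload.decode("utf-8")))
```

Keeping the payload human-readable JSON makes remote monitoring and debugging easier, at a small bandwidth cost compared with a packed binary format.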


RC Vehicle Platform

The project uses a high-performance off-road RC chassis capable of supporting approximately 8–12 kg of payload. This vehicle serves as a scaled physical testbed for autonomous driving research.

Key characteristics include:

  • Four-wheel drive traction

  • Independent suspension for uneven terrain

  • High-torque motor with electronic speed controller

  • Steering servo for precise control

  • Sufficient space for mounting sensors and computing hardware

The platform allows repeatable real-world experiments beyond simulation environments.


Image 1: RC Car Base

NVIDIA Jetson Orin Nano: Central AI Processor

The Jetson Orin Nano functions as the main processing unit responsible for perception fusion and decision making. Its GPU acceleration enables real-time execution of deep learning models while maintaining efficiency suitable for mobile robotics.

Processes running on the Jetson include:

  • Camera capture and preprocessing

  • YOLO object detection for semantic understanding

  • LD06 LiDAR processing for geometric perception

  • Integration of radar data received from the Raspberry Pi

  • Time-to-Collision computation

  • Collision avoidance decision logic

This node acts as the “brain” of the system.

LD06 360° LiDAR: Distance and Geometry

The LD06 scanning LiDAR provides continuous 360-degree distance measurements in a horizontal plane. Within this project, LiDAR data is transformed into the vehicle’s coordinate frame to identify obstacles within a forward driving corridor.

Advantages of LiDAR:

  • High accuracy distance measurement

  • Independence from lighting conditions

  • Reliable detection of static obstacles

  • Real-time spatial awareness

LiDAR determines how far objects are from the vehicle and whether they lie directly in its path.
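As a sketch of that corridor check, the scan can be transformed into the vehicle frame and filtered to points ahead of the car (the corridor width, range limit, and zero-offset mounting are assumptions for illustration):

```python
import math

def corridor_obstacles(scan, half_width_m=0.25, max_range_m=4.0):
    """Filter a 360° scan down to points inside a forward driving corridor.

    `scan` is a list of (angle_deg, distance_m) pairs in the sensor frame,
    with 0° taken as straight ahead (any mounting offset already applied).
    Returns (x, y) points, x forward and y left, that lie in the corridor.
    """
    hits = []
    for angle_deg, dist in scan:
        a = math.radians(angle_deg)
        x = dist * math.cos(a)   # forward distance
        y = dist * math.sin(a)   # lateral offset
        if 0.0 < x <= max_range_m and abs(y) <= half_width_m:
            hits.append((x, y))
    return hits
```

The nearest hit's x-coordinate is then the obstacle distance fed into the TTC calculation described below.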

Camera + YOLO: Semantic Perception

A forward-facing camera paired with a YOLO deep learning model provides semantic understanding of the environment. While LiDAR determines distance, computer vision identifies object types.

This enables classification of:

  • Pedestrians

  • Vehicles

  • Static structures

  • Non-threatening objects

Semantic perception enhances decision making and supports future behavioral prediction capabilities.
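One way this classification could feed the avoidance logic is a simple relevance filter over the detector's output (the class names follow the COCO label set that standard YOLO models emit; the threat list and confidence threshold are assumptions):

```python
# Classes treated as collision-relevant (subset of the COCO labels).
THREAT_CLASSES = {"person", "car", "truck", "bicycle", "motorcycle"}

def relevant_detections(detections, min_conf=0.5):
    """Keep only confident, threat-relevant detections.

    `detections` is a list of (class_name, confidence) pairs, as produced
    by the detector after non-max suppression.
    """
    return [(cls, conf) for cls, conf in detections
            if cls in THREAT_CLASSES and conf >= min_conf]
```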


Image 2: Jetson Orin Nano and LD06 LiDAR

mmWave Radar + Raspberry Pi 5: Velocity Sensing

A 60 GHz mmWave radar module connected to a Raspberry Pi 5 measures the relative velocity of objects using Doppler shift. Radar complements LiDAR by providing motion information that optical sensors cannot directly measure.

Capabilities include:

  • Direct measurement of approach speed

  • Operation in poor visibility conditions

  • Detection of moving objects

  • Extended sensing range

The Raspberry Pi processes radar signals locally and transmits structured target data to the Jetson over the network.
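The underlying physics is the standard Doppler relation: for a carrier frequency f_c, a measured shift f_d corresponds to a relative radial velocity v = f_d · c / (2 f_c), where the factor of 2 accounts for the round trip of the radar wave. A minimal sketch for the 60 GHz module:

```python
C = 299_792_458.0   # speed of light, m/s
F_CARRIER = 60e9    # 60 GHz mmWave carrier

def doppler_velocity(doppler_shift_hz: float) -> float:
    """Relative radial velocity from the measured Doppler shift.

    Positive shift means the target is approaching; the factor of 2
    reflects the two-way path of the reflected wave.
    """
    return doppler_shift_hz * C / (2.0 * F_CARRIER)
```

At 60 GHz, roughly 400 Hz of shift corresponds to 1 m/s of closing speed, so even slow approaches produce an easily measurable signal.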


Image 3: Raspberry Pi and Radar

Inter-Device Communication via IP Networking

The Raspberry Pi communicates with the Jetson using standard IP networking rather than a dedicated cable link. The Pi sends radar measurements to the Jetson’s IP address using network protocols such as TCP or UDP.

This approach provides:

  • Wireless flexibility

  • Remote monitoring capability

  • Reduced wiring complexity

  • Scalability to multi-device systems

Distributed processing ensures that each device handles specialized tasks while the Jetson performs final fusion and decision making.
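A minimal sketch of this link using UDP datagrams (the port number and JSON payload are illustrative assumptions; the project may equally use TCP for reliable delivery):

```python
import json
import socket

RADAR_PORT = 5005  # arbitrary port, chosen for illustration

def send_radar_reading(sock, jetson_ip, reading):
    """Pi side: push one radar reading to the Jetson as a JSON datagram."""
    sock.sendto(json.dumps(reading).encode("utf-8"), (jetson_ip, RADAR_PORT))

def receive_radar_reading(sock, bufsize=1024):
    """Jetson side: block until one radar datagram arrives, then decode it."""
    payload, _addr = sock.recvfrom(bufsize)
    return json.loads(payload.decode("utf-8"))
```

Both ends create their socket with `socket.socket(socket.AF_INET, socket.SOCK_DGRAM)`; the Jetson binds its socket to `("", RADAR_PORT)` so the Pi only needs the Jetson's IP address. UDP suits this use case because a lost reading is simply superseded by the next one a few milliseconds later.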

Time-to-Collision (TTC) Computation

Collision risk is assessed on the Jetson using fused sensor data. LiDAR supplies obstacle distance while radar provides relative velocity. TTC is calculated as:

TTC = Distance / Closing Speed

This metric indicates how urgently the vehicle must respond. Thresholds classify the situation into safe, warning, or emergency conditions, triggering actions such as slowing or stopping.
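The rule above, together with illustrative thresholds (the actual threshold values used on the vehicle are assumptions here), can be sketched as:

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """TTC = distance / closing speed; infinite if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps

def classify(ttc_s: float, warn_s: float = 3.0, emergency_s: float = 1.0) -> str:
    """Map TTC onto the three response levels used by the decision logic."""
    if ttc_s <= emergency_s:
        return "emergency"   # e.g. hard stop
    if ttc_s <= warn_s:
        return "warning"     # e.g. slow down
    return "safe"
```

Guarding against a non-positive closing speed matters in practice: a receding or stationary target would otherwise produce a negative or undefined TTC and a spurious emergency stop.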

From Simulation (CARLA) to Physical Deployment

Phase 1 of the project involved developing and validating algorithms in the CARLA simulation environment. Simulation allowed rapid testing under controlled conditions using a digital twin of urban scenarios.

Transitioning to real hardware introduces:

  • Sensor noise and calibration challenges

  • Communication delays

  • Environmental variability

  • Physical dynamics of the vehicle

Successfully deploying the system demonstrates that the architecture functions beyond simulation and validates the digital twin methodology.

 
 
 
