
Aims and Objectives

  • Writer: Raffay Hassan
  • Feb 4
  • 1 min read

Project Aim

The aim of this project is to design, implement, and evaluate a sensor-driven digital twin for collision prevention, with a particular focus on assessing whether multi-sensor fusion offers a more reliable and robust safety strategy than computer vision alone. The project uses simulation and scaled hardware experiments to enable safe, repeatable testing of perception and decision-making behaviour in autonomous systems.

Project Objectives

To achieve this aim, the project is structured around the following objectives:

  • Design a sensor-driven digital twin capable of representing the perception state of a vehicle using both simulated and real sensor inputs.

  • Implement a camera-based perception module using a YOLO object detection model to identify dynamic obstacles in real time.

  • Integrate LiDAR sensing to provide accurate, metric distance measurements to detected objects.

  • Incorporate mmWave radar data to estimate relative object velocity and improve collision-risk assessment.

  • Fuse camera, LiDAR, and radar data to compute Time-To-Collision (TTC) as a physically meaningful safety metric.

  • Apply collision-prevention decisions, such as braking or avoidance, within a simulated environment to observe realistic vehicle responses.

  • Compare the performance of vision-only and sensor-fusion-based approaches under identical test scenarios using the digital twin.

  • Evaluate system performance in terms of detection reliability, response latency, and overall collision-avoidance behaviour.

  • Demonstrate scalability by transitioning from simulation-only validation to experiments on a scaled hardware platform.
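The fusion objective above combines LiDAR range and radar velocity into a single Time-To-Collision value. A minimal sketch of that calculation is shown below; the `FusedTrack` type, field names, and the 2-second braking threshold are illustrative assumptions, not the project's actual implementation.

```python
# Sketch of the TTC fusion step (assumed interface, not the project's code).
# LiDAR supplies metric range to the object, mmWave radar supplies the
# relative closing velocity, and the camera/YOLO stage is assumed to have
# already associated both measurements with one detected object.

from dataclasses import dataclass

TTC_BRAKE_THRESHOLD_S = 2.0  # hypothetical safety threshold, in seconds


@dataclass
class FusedTrack:
    label: str           # class name from the YOLO detector, e.g. "car"
    range_m: float       # distance to the object from LiDAR, in metres
    closing_mps: float   # relative approach speed from radar (+ = closing)


def time_to_collision(track: FusedTrack) -> float:
    """TTC = range / closing speed; infinite if the object is not approaching."""
    if track.closing_mps <= 0.0:
        return float("inf")
    return track.range_m / track.closing_mps


def should_brake(track: FusedTrack, threshold_s: float = TTC_BRAKE_THRESHOLD_S) -> bool:
    """Trigger the collision-prevention action when TTC drops below the threshold."""
    return time_to_collision(track) < threshold_s


car = FusedTrack(label="car", range_m=12.0, closing_mps=8.0)
print(time_to_collision(car))  # 1.5 (seconds)
print(should_brake(car))       # True
```

Because TTC depends on a metric distance and a relative velocity, neither of which a monocular camera measures directly, this is also where the sensor-fusion approach is expected to differ most from the vision-only baseline.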




Autonomous Systems, Sensor Fusion, Digital Twins

 

© 2026 by Department of Science and Technology
