
Building the Sensor Foundation: LD06 LiDAR and 60 GHz Radar

  • Writer: Raffay Hassan
  • Feb 27
  • 5 min read

Updated: Mar 30

I've been working on the hardware phase: three sensors, two boards, and one unified dashboard to tie it all together. Before any of it goes near the actual car, though, I wanted to build and validate each piece on the bench first. No point bolting things to a chassis if the software isn't working yet.

This first post covers the two range sensors: the LD06 LiDAR and the BGT60TR13C 60 GHz radar on the DreamHAT+ board. They do very different things: the LiDAR sees geometry, the radar sees velocity. Getting both working independently was the foundation everything else builds on.


The LD06 LiDAR


The LD06 is a compact 360° spinning rangefinder. The goal was to detect what's in the path ahead of the car and split the view into three zones (left, centre, right) so the car knows not just that something is there but roughly where.

What sounds straightforward turned into a solid debugging session involving byte order, coordinate flips, and a sensor that thought everything was behind it.

Parsing the Packets

The LD06 sends 47-byte packets over serial at 230400 baud. Each packet has 12 distance measurements spread across an angular sweep. Bytes 0-1 are fixed headers, bytes 4-5 give the start angle, bytes 42-43 the end angle — both divided by 100 to get degrees. The 12 measurements sit in bytes 6-41, three bytes each: distance low, distance high, confidence.
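A minimal parser for that layout might look like this. It's a sketch, not the project's actual code: the little-endian byte order and the linear interpolation of angles across the 12 measurements are assumptions, and `parse_ld06_packet` is an illustrative name.

```python
import struct

def parse_ld06_packet(packet: bytes):
    """Parse one 47-byte LD06 packet into (angles_deg, distances_mm, confidences).

    Layout per the post: bytes 0-1 header, bytes 4-5 start angle,
    bytes 42-43 end angle (hundredths of a degree), bytes 6-41 twelve
    3-byte measurements: distance low byte, distance high byte, confidence.
    Little-endian byte order is an assumption here.
    """
    if len(packet) != 47:
        raise ValueError("expected a 47-byte packet")
    start_angle = struct.unpack_from("<H", packet, 4)[0] / 100.0
    end_angle = struct.unpack_from("<H", packet, 42)[0] / 100.0
    # Handle wrap-around when the sweep crosses 360 degrees
    sweep = (end_angle - start_angle) % 360.0
    angles, distances, confidences = [], [], []
    for i in range(12):
        off = 6 + 3 * i
        dist = packet[off] | (packet[off + 1] << 8)  # mm, low byte first
        angles.append((start_angle + sweep * i / 11) % 360.0)
        distances.append(dist)
        confidences.append(packet[off + 2])
    return angles, distances, confidences
```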


Image 1: Lidar Test


Everything Was Appearing Behind Me

When I first ran the detector, every obstacle showed up behind the origin in the scatter plot. Put something in front of the sensor and it'd appear with a negative x coordinate.

The problem was that the LiDAR's zero-degree reference points opposite to what I wanted as "forward." It's a 180° offset that needs correcting regardless of how the sensor eventually gets mounted:

Applied during angle decoding: angles = (angles + ANGLE_OFFSET_DEG) % 360.0

After that one constant, the scatter plot immediately made sense.


Image 2: Lidar live forward obstacle detection.

Killing the False Positives

First test on a shiny desk: constant STOP alerts from nothing. Single stray points (reflections, dust, cable edges) were enough to trigger it.

The fix was a persistence filter. An obstacle has to appear in 3 consecutive frames before it counts, and needs to be gone for 4 frames before the alert clears. A zone also needs at least 3 points in it to be a real detection.


That basically killed the false positives. I also narrowed the field of view from 360° to just ±20° forward: everything behind and to the sides is irrelevant for a forward-moving car, and cutting it out cleaned up the readings noticeably.
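The persistence logic can be sketched like this (class and attribute names are illustrative, not the project's actual code):

```python
class PersistenceFilter:
    """Debounce zone detections: an obstacle must appear in 3 consecutive
    frames to trigger, and be absent for 4 frames to clear, matching the
    thresholds described above. A zone also needs a minimum point count.
    """

    def __init__(self, frames_to_confirm=3, frames_to_clear=4, min_points=3):
        self.frames_to_confirm = frames_to_confirm
        self.frames_to_clear = frames_to_clear
        self.min_points = min_points
        self.hit_count = 0
        self.miss_count = 0
        self.active = False

    def update(self, points_in_zone: int) -> bool:
        """Feed one frame's point count for a zone; returns alert state."""
        if points_in_zone >= self.min_points:
            self.hit_count += 1
            self.miss_count = 0
            if self.hit_count >= self.frames_to_confirm:
                self.active = True
        else:
            self.miss_count += 1
            self.hit_count = 0
            if self.miss_count >= self.frames_to_clear:
                self.active = False
        return self.active
```

A single stray point never reaches the `min_points` threshold, so it resets the hit counter instead of advancing it.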

The LiDAR GUI


The standalone LiDAR script has a live Tkinter GUI with a Matplotlib scatter plot. Points coloured by zone, two horizontal threshold lines, car silhouette at the origin. Watching this while moving objects around in front of the sensor made it much easier to understand what the code was actually seeing and catch remaining issues before moving on.


Part Two: The 60 GHz Radar


The radar is a BGT60TR13C on the DreamHAT+ board, sitting on a Raspberry Pi 5. Where the LiDAR tells you geometry, radar tells you motion. It sees velocity, which is what you need for Time-To-Collision. It also works through dust, smoke and poor lighting, useful for a car that might be driving somewhere with LiDAR-unfriendly conditions.


All the radar signal processing runs on the Pi. It streams clean track data to the Jetson over UDP rather than pushing raw samples across the network.


Image 3: mmWave Radar

How FMCW Radar Works

FMCW stands for Frequency Modulated Continuous Wave. The radar transmits a chirp: a signal that sweeps linearly from low to high frequency over a very short time. When it bounces off something and returns, it arrives slightly delayed. Mixing the outgoing and incoming signals produces a beat frequency proportional to that delay, which is proportional to distance.


Repeat the chirp and compare the phase between successive returns: the rate of phase change is the Doppler shift, which gives velocity. Use multiple receive antennas, and the phase differences between them give the angle of arrival.

Three FFTs, three different pieces of information from the same raw data.
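As a rough numerical sketch of the beat-frequency relationship: for a linear chirp with bandwidth B and duration T_c, the slope is S = B / T_c, and a target at range R produces a beat frequency f = 2RS/c. The chirp parameters below are illustrative, not the BGT60TR13C's actual configuration.

```python
C = 3.0e8  # speed of light, m/s

def beat_to_range(f_beat_hz, bandwidth_hz, chirp_time_s):
    """Range from beat frequency for a linear FMCW chirp:
    slope S = B / T_c, delay tau = f_beat / S, range R = c * tau / 2."""
    slope = bandwidth_hz / chirp_time_s
    return C * f_beat_hz / (2.0 * slope)

def range_resolution(bandwidth_hz):
    """Minimum resolvable range separation: c / (2B)."""
    return C / (2.0 * bandwidth_hz)
```

With an illustrative 5 GHz sweep over 100 µs, a 1 m target shows up as a beat frequency of roughly 333 kHz, and the range resolution works out to about 3 cm.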


The Signal Processing Pipeline

The BGT60TR13C has 1 TX and 3 RX antennas, running 64 chirps of 128 samples each. Each frame is a 3D array with shape (3, 64, 128). Processing runs three FFT stages:

Range FFT — FFT along the samples axis. Each output bin is a range gate.


Doppler FFT — FFT along the chirps axis. Combines with range to give a range-Doppler map per antenna.


Angle FFT — FFT across the three antennas. Resolves angle of arrival, giving a full range-Doppler-angle cube.
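The three stages can be sketched with NumPy. This is a simplified sketch: windowing, calibration and the exact bin counts are omitted or assumed, and `process_frame` is an illustrative name, not the project's code.

```python
import numpy as np

N_RX, N_CHIRPS, N_SAMPLES = 3, 64, 128  # frame shape from the post

def process_frame(frame: np.ndarray) -> np.ndarray:
    """Run the three FFT stages on one (3, 64, 128) radar frame."""
    assert frame.shape == (N_RX, N_CHIRPS, N_SAMPLES)
    # 1. Range FFT along the samples axis; keep positive-frequency bins
    range_fft = np.fft.fft(frame, axis=2)[:, :, : N_SAMPLES // 2]
    # 2. Doppler FFT along the chirps axis, shifted so zero velocity is centred
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=1), axes=1)
    # 3. Angle FFT across the three RX antennas, zero-padded for finer bins
    cube = np.fft.fftshift(np.fft.fft(doppler_fft, n=32, axis=0), axes=0)
    return cube  # shape (32, 64, 64): angle x Doppler x range
```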


Tracking

Raw detections jump around frame to frame. A simple nearest-neighbour tracker turns them into stable objects:

  • Detections near an existing track get associated to it

  • Unmatched detections start tentative tracks

  • Tentative tracks get confirmed after 3 consecutive frames

  • Confirmed tracks get deleted after 5 frames without a match

The 3-frame confirmation is important: it filters most transient reflections before they ever become visible as tracks.
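A minimal version of those rules might look like this. It is a sketch under simplifying assumptions: the gate distance is a guess, and associating on range alone is cruder than a real tracker, which would presumably use range, angle and velocity together.

```python
from dataclasses import dataclass, field
from itertools import count

_ids = count(1)

@dataclass
class Track:
    range_m: float
    angle_deg: float
    id: int = field(default_factory=lambda: next(_ids))
    hits: int = 1
    misses: int = 0
    confirmed: bool = False

class NearestNeighbourTracker:
    """Associate by nearest range within a gate, confirm after 3
    consecutive hits, delete after 5 consecutive misses."""

    def __init__(self, gate_m=0.5, confirm_hits=3, delete_misses=5):
        self.gate_m = gate_m
        self.confirm_hits = confirm_hits
        self.delete_misses = delete_misses
        self.tracks: list[Track] = []

    def update(self, detections):
        """detections: list of (range_m, angle_deg) tuples for one frame."""
        unmatched = list(detections)
        for tr in self.tracks:
            # Find the closest detection within the gate
            best, best_d = None, self.gate_m
            for det in unmatched:
                d = abs(det[0] - tr.range_m)  # crude 1D range distance
                if d < best_d:
                    best, best_d = det, d
            if best is not None:
                unmatched.remove(best)
                tr.range_m, tr.angle_deg = best
                tr.hits += 1
                tr.misses = 0
                if tr.hits >= self.confirm_hits:
                    tr.confirmed = True
            else:
                tr.misses += 1
                tr.hits = 0
        # Unmatched detections start tentative tracks
        self.tracks.extend(Track(r, a) for r, a in unmatched)
        # Drop tracks with too many consecutive misses
        self.tracks = [t for t in self.tracks if t.misses < self.delete_misses]
        return [t for t in self.tracks if t.confirmed]
```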


Time-To-Collision

Once you have range and velocity per track, TTC is straightforward:


def calculate_ttc(range_m, velocity_mps):
    if velocity_mps >= 0:
        return float('inf')  # stationary or moving away
    return -range_m / velocity_mps


TTC thresholds map to alert levels:


TTC_IMMINENT_S = 1.5 # red — under 1.5 seconds to impact

TTC_CAUTION_S = 3.0 # amber — under 3.0 seconds


There's also a range fallback for when velocity is near zero — anything within 50cm is IMMINENT regardless, within 1.2m is at least CAUTION.
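Putting the thresholds and the range fallback together, the level mapping might look like this. The SAFE/CAUTION/IMMINENT names and the numbers follow the post; `alert_level` and the exact precedence of the checks are assumptions.

```python
TTC_IMMINENT_S = 1.5     # red: under 1.5 seconds to impact
TTC_CAUTION_S = 3.0      # amber: under 3.0 seconds
RANGE_IMMINENT_M = 0.5   # range fallback from the post
RANGE_CAUTION_M = 1.2

def alert_level(range_m, ttc_s):
    """Map a track's TTC and range onto SAFE / CAUTION / IMMINENT,
    applying the range fallback for slow or stationary targets."""
    if ttc_s < TTC_IMMINENT_S or range_m < RANGE_IMMINENT_M:
        return "IMMINENT"
    if ttc_s < TTC_CAUTION_S or range_m < RANGE_CAUTION_M:
        return "CAUTION"
    return "SAFE"
```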


Image 4: Radar GUI with live detections.

Streaming to the Jetson

The Pi sends JSON track data to the Jetson over UDP port 9576 after every radar frame:


track_data = {
    "timestamp": time.time(),
    "tracks": [
        {
            "id": track.id,
            "range_m": track.range_m,
            "velocity_mps": track.velocity_mps,
            "angle_deg": track.angle_deg,
            "ttc_s": track.ttc_s,
            "level": track.collision_level
        }
        for track in confirmed_tracks
    ]
}
sock.sendto(json.dumps(track_data).encode(), (JETSON_IP, UDP_PORT))
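On the Jetson side, the matching receiver can stay simple. This is a sketch: the function names and the bind-to-all-interfaces address are assumptions, not the project's actual code.

```python
import json
import socket

UDP_PORT = 9576  # port from the post

def decode_track_packet(data: bytes) -> list:
    """Decode one JSON track packet into a list of track dicts."""
    msg = json.loads(data.decode())
    return msg.get("tracks", [])

def listen_for_tracks(handler, timeout_s=None):
    """Receive track packets and pass the decoded list to handler.
    Runs until a socket timeout (if one is set)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", UDP_PORT))
    sock.settimeout(timeout_s)
    try:
        while True:
            data, _addr = sock.recvfrom(65536)
            handler(decode_track_packet(data))
    except socket.timeout:
        pass
    finally:
        sock.close()
```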


Keeping the FFT pipeline on the Pi was a deliberate choice: the Jetson is already handling LiDAR, camera inference, fusion and the GUI. The Pi 5 handles the full FFT pipeline comfortably, and the UDP hop adds under 1 ms of latency.


Bench Testing Observations


Indoors with the sensor pointing across a room, it sees a lot. Walls, furniture, the floor: all of them produce reflections. In one test session I had 41 active tracks, all SAFE. That's normal for an open room. When it's eventually on a car pointing down a corridor, the geometry is much more constrained and the track count drops significantly.


With both sensors validated independently, Part 2 covers bringing them together: fusing LiDAR and radar into a single collision decision, and building the live dashboard that shows both sensor views at the same time.


 
 
 
