
Automated Driving with MATLAB

Course Highlights

This two-day course provides hands-on experience with developing and verifying
automated driving perception algorithms. Examples and exercises demonstrate the
use of appropriate MATLAB® and Automated Driving Toolbox™ functionality.


Topics include:

  • Labeling of ground truth data

  • Visualizing sensor data

  • Detecting lanes and vehicles

  • Processing lidar point clouds

  • Tracking and sensor fusion

  • Generating driving scenarios and modeling sensors

Who Should Attend


Prerequisite

Course Benefits

Course Outline

Day 1 of 2

Labeling of Ground Truth Data

Objective: Label ground truth data in a video or sequence of images interactively. Automate the labeling with detection and tracking algorithms.

  • Overview of the Ground Truth Labeler app

  • Label regions of interest (ROIs) and scenes

  • Automate labeling

  • View/export ground truth results
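As a taste of this workflow, the sketch below opens the Ground Truth Labeler app and inspects labels after export; the video file name and the variable gTruth are placeholders, not course material.

    % Open the Ground Truth Labeler app on a video (file name is a placeholder).
    groundTruthLabeler('myDrivingVideo.mp4')

    % After labeling in the app and exporting to the workspace as a groundTruth
    % object (assumed here to be named gTruth), the results can be inspected:
    labelDefs = gTruth.LabelDefinitions;   % table of defined labels and their types
    labelData = gTruth.LabelData;          % ROI and scene labels over time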

 

Visualizing Sensor Data

Objective: Visualize camera frames, radar, and lidar detections. Use appropriate coordinate systems to transform image coordinates to vehicle coordinates and vice versa.

  • Create bird’s eye plot

  • Plot sensor coverage areas

  • Visualize detections and lanes

  • Convert from vehicle to image coordinates

  • Annotate video with detections and lane boundaries
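A minimal sketch of these steps appears below; the mounting positions, coverage parameters, and camera intrinsics are illustrative assumptions, not values from the course.

    % Bird's-eye plot with a radar coverage area and some detections.
    bep = birdsEyePlot('XLim', [0 60], 'YLim', [-20 20]);
    caPlotter = coverageAreaPlotter(bep, 'DisplayName', 'Radar coverage');
    plotCoverageArea(caPlotter, [3.7 0], 50, 0, 30);   % position, range, yaw, field of view
    detPlotter = detectionPlotter(bep, 'DisplayName', 'Radar detections');
    plotDetection(detPlotter, [20 1; 35 -2]);          % made-up positions in vehicle coordinates

    % Convert a point from vehicle to image coordinates with a monocular camera model.
    intrinsics = cameraIntrinsics([800 800], [320 240], [480 640]);   % assumed intrinsics
    camera = monoCamera(intrinsics, 1.5);                             % camera 1.5 m above the ground
    imagePoint = vehicleToImage(camera, [10 0 0]);                    % a point 10 m ahead of the ego vehicle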

 

Detecting Lanes and Vehicles

 

Objective: Segment and model parabolic lane boundaries. Use pretrained object detectors to detect vehicles.

  • Perform bird’s eye view transform

  • Detect lane features

  • Compute lane model

  • Validate lane detection with ground truth

  • Detect vehicles with pretrained object detectors
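One possible version of this pipeline is sketched below; the monoCamera object camera, the RGB frame frame, and the numeric parameters are assumptions for illustration.

    % Bird's-eye-view transform of a camera frame.
    outView = [3 30 -6 6];                                    % [xmin xmax ymin ymax] in meters
    birdsEyeConfig = birdsEyeView(camera, outView, [250 NaN]);
    birdsEyeImage = transformImage(birdsEyeConfig, frame);

    % Detect lane-marker features and fit a parabolic lane boundary model.
    bw = segmentLaneMarkerRidge(rgb2gray(birdsEyeImage), birdsEyeConfig, 0.25);
    [r, c] = find(bw);
    xyPoints = imageToVehicle(birdsEyeConfig, [c r]);         % feature points in vehicle coordinates
    boundaries = findParabolicLaneBoundaries(xyPoints, 3, 'MaxNumBoundaries', 2);

    % Detect vehicles with a pretrained ACF vehicle detector.
    detector = vehicleDetectorACF;
    [bboxes, scores] = detect(detector, frame);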

Processing Lidar Point Clouds

 

Objective: Work with lidar data stored as 3-D point clouds. Import, visualize, and process point clouds by segmenting them into clusters. Register point clouds to align and build an accumulated point cloud map.

  • Import and visualize point clouds

  • Preprocess point clouds

  • Segment objects from lidar sensor data 

  • Build a map from lidar sensor data
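A compressed sketch of this workflow follows; the point-cloud file names and the numeric thresholds are placeholders.

    % Import and visualize a lidar point cloud.
    ptCloud = pcread('lidarScan1.pcd');        % placeholder file name
    pcshow(ptCloud)

    % Preprocess: downsample, then remove the ground plane.
    ptCloudDown = pcdownsample(ptCloud, 'gridAverage', 0.2);
    [~, ~, nonGroundIdx] = pcfitplane(ptCloudDown, 0.2, [0 0 1]);   % plane roughly normal to z
    nonGround = select(ptCloudDown, nonGroundIdx);

    % Segment the remaining points into object clusters.
    labels = pcsegdist(nonGround, 0.5);        % cluster points closer than 0.5 m

    % Register a second scan and accumulate a simple map.
    ptCloud2 = pcread('lidarScan2.pcd');       % placeholder file name
    tform = pcregistericp(ptCloud2, ptCloud);
    map = pcmerge(ptCloud, pctransform(ptCloud2, tform), 0.1);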

Day 2 of 2

Fusing Sensor Detections and Tracking

Objective: Create a multi-object tracker to fuse information from multiple sensors such as camera, radar, and lidar.

  • Track multiple objects 

  • Preprocess detections 

  • Utilize Kalman filters 

  • Manage multiple tracks

  • Track with multi-object tracker
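The sketch below constructs a small multi-object tracker and updates it with one hand-made detection; the threshold and measurement values are assumptions.

    % Multi-object tracker using constant-velocity extended Kalman filters.
    tracker = multiObjectTracker('FilterInitializationFcn', @initcvekf, ...
                                 'AssignmentThreshold', 30);

    % Preprocessed detections are packaged as objectDetection objects; here a
    % single made-up detection at position [10 5 0] meters, reported at time 0.
    detections = {objectDetection(0, [10; 5; 0])};

    % Update the tracker; it handles assignment, confirmation, and deletion of tracks.
    confirmedTracks = updateTracks(tracker, detections, 0);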

 

Tracking Extended Objects

Objective: Create a probability hypothesis density tracker to track extended objects and estimate their spatial extent.

  • Define sensor configurations

  • Track extended objects

  • Estimate spatial extent
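A construction-only sketch is given below, assuming a single sensor with index 1 and the default extended-object (GGIW-PHD) filter; stepping the tracker with real detections needs more setup than shown here.

    % Describe the sensor to the tracker (sensor index 1, default properties).
    config = trackingSensorConfiguration(1);
    config.IsValidTime = true;     % the sensor reports at every tracker update

    % Probability hypothesis density (PHD) tracker for extended objects; the default
    % GGIW-PHD filter estimates each object's kinematics and spatial extent jointly.
    tracker = trackerPHD('SensorConfigurations', config);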

 

Generating Driving Scenarios and Modeling Sensors

 

Objective: Create driving scenarios and synthetic radar and camera sensor detections interactively to test automated driving perception algorithms.

 

  • Overview of the Driving Scenario Designer app

  • Create scenarios with roads, actors, and sensors

  • Simulate and visualize scenarios

  • Generate detections and export scenarios

  • Test algorithms with scenarios
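The Driving Scenario Designer app is opened with the drivingScenarioDesigner command; the sketch below shows the equivalent programmatic flow, with illustrative road geometry, trajectories, and sensor placement.

    % Straight two-lane road with an ego vehicle and a lead vehicle.
    scenario = drivingScenario;
    road(scenario, [0 0 0; 80 0 0], 'Lanes', lanespec(2));

    ego = vehicle(scenario, 'ClassID', 1);
    trajectory(ego, [1 -2 0; 75 -2 0], 15);    % waypoints and speed (m/s)

    lead = vehicle(scenario, 'ClassID', 1);
    trajectory(lead, [20 -2 0; 79 -2 0], 10);

    % Synthetic vision sensor mounted near the front of the ego vehicle.
    sensor = visionDetectionGenerator('SensorIndex', 1, 'SensorLocation', [1.9 0]);

    % Step the simulation and generate detections of the other actors.
    while advance(scenario)
        dets = sensor(targetPoses(ego), scenario.SimulationTime);
    end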