Sensor Fusion and Tracking Toolbox
Design and simulate multisensor tracking and positioning systems
Sensor Fusion and Tracking Toolbox™ includes algorithms and tools for the design, simulation, and analysis of systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.
The toolbox includes multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that let you evaluate fusion architectures using real and synthetic data. With Sensor Fusion and Tracking Toolbox you can import and define scenarios and trajectories, stream signals, and generate synthetic data for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors. You can also evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots.
For simulation acceleration or desktop prototyping, the toolbox supports C code generation.

Trajectory and Scenario Generation
Generate ground-truth waypoint-based and rate-based trajectories and scenarios. Model platforms and targets for tracking scenarios.
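The core idea behind a waypoint-based trajectory can be sketched in a few lines: interpolate ground-truth positions between timestamped waypoints. This is an illustrative Python sketch with a hypothetical `waypoint_position` helper, not the toolbox's trajectory objects:

```python
def waypoint_position(times, waypoints, t):
    """Return the (x, y) ground-truth position at time t by linearly
    interpolating between timestamped 2-D waypoints."""
    if t <= times[0]:
        return waypoints[0]
    if t >= times[-1]:
        return waypoints[-1]
    for i in range(len(times) - 1):
        t0, t1 = times[i], times[i + 1]
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)  # fraction of the way along this leg
            x0, y0 = waypoints[i]
            x1, y1 = waypoints[i + 1]
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

# A platform that flies east for 10 s, then north for 10 s.
times = [0.0, 10.0, 20.0]
waypoints = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
print(waypoint_position(times, waypoints, 5.0))  # midway along the first leg
```

Rate-based trajectories extend this by specifying velocities and turn rates instead of fixed waypoints; the interpolation then follows the kinematic model rather than straight segments.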

Sensor Models
Simulate measurements from IMU (accelerometer, gyroscope, magnetometer), GPS receivers, radar, sonar, and IR under different environmental conditions.
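A sensor model of this kind corrupts a ground-truth quantity with instrument errors. The sketch below, assuming a constant-bias-plus-white-noise accelerometer model (a deliberate simplification; real IMU models also include scale factors, misalignment, and bias drift), shows the general shape. The `accel_measurement` helper is hypothetical, not the toolbox API:

```python
import random

def accel_measurement(true_accel, bias, noise_std, rng):
    """Simulate a 3-axis accelerometer reading: truth + constant bias
    + zero-mean Gaussian noise on each axis (units: m/s^2)."""
    return [a + bias + rng.gauss(0.0, noise_std) for a in true_accel]

rng = random.Random(42)          # seeded for reproducibility
truth = [0.0, 0.0, 9.81]         # stationary sensor measuring gravity
meas = accel_measurement(truth, bias=0.05, noise_std=0.01, rng=rng)
print(meas)
```

Environmental conditions enter such models through the parameters, for example temperature-dependent bias or degraded noise figures.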

Inertial Sensor Fusion
Estimate orientation and position over time with algorithms that are optimized for different sensor configurations, output requirements, and motion constraints.
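One of the simplest orientation-fusion schemes is a complementary filter: trust the integrated gyroscope rate over short horizons and the accelerometer's gravity direction over long horizons. This is a minimal single-axis sketch with assumed values, not one of the toolbox's fusion filters:

```python
def complementary_pitch(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the gyro-propagated pitch (weight alpha) with the pitch
    implied by the accelerometer's gravity vector (weight 1 - alpha)."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Stationary sensor at a true pitch of 0.1 rad: the gyro reads zero rate,
# the accelerometer keeps reporting 0.1 rad, and the estimate converges.
est = 0.0
for _ in range(200):
    est = complementary_pitch(est, gyro_rate=0.0, accel_pitch=0.1, dt=0.01)
print(est)
```

The blend weight `alpha` encodes the motion constraint: a higher value trusts the gyro more, which suits dynamic motion where accelerometer readings are contaminated by linear acceleration.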

Estimation Filters
Use Kalman, particle, and multiple-model filters for different motion and measurement models.
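The predict/update cycle shared by these filters can be shown in its simplest form: a scalar Kalman filter with a random-walk motion model. This is an illustrative sketch, with hypothetical noise values, not a toolbox filter object:

```python
def kalman_step(x, P, z, Q, R):
    """One Kalman filter cycle for a scalar random-walk state.
    x: state estimate, P: estimate variance,
    z: measurement, Q: process noise, R: measurement noise."""
    # Predict: the random-walk model leaves x unchanged, uncertainty grows.
    P = P + Q
    # Update: Kalman gain weighs measurement against prediction.
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1 - K) * P
    return x, P

# Noisy measurements of a quantity whose true value is 1.0.
x, P = 0.0, 1.0
for z in [1.1, 0.9, 1.05, 0.95]:
    x, P = kalman_step(x, P, z, Q=0.01, R=0.1)
print(x, P)
```

Particle and multiple-model filters generalize this same cycle to nonlinear, non-Gaussian, and maneuvering-target settings where a single linear-Gaussian model is inadequate.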

Multi-Object Tracking
Create multi-object trackers that fuse information from multiple sensors and maintain single or multiple hypotheses about the objects they track.
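At the heart of any multi-object tracker is data association: deciding which detection belongs to which track. The sketch below uses greedy nearest-neighbor assignment with a distance gate on 1-D example data; production trackers solve this assignment problem globally (and multi-hypothesis trackers defer the decision entirely), so treat this as an illustration, not the toolbox algorithm:

```python
def associate(tracks, detections, gate=5.0):
    """Greedily assign each detection to the closest unassigned track
    within the gate. Returns {detection_index: track_index}; detections
    outside every track's gate stay unassigned (track-initiation candidates)."""
    assignments = {}
    used = set()
    for d_idx, d in enumerate(detections):
        best, best_dist = None, gate
        for t_idx, t in enumerate(tracks):
            if t_idx in used:
                continue
            dist = abs(t - d)
            if dist < best_dist:
                best, best_dist = t_idx, dist
        if best is not None:
            assignments[d_idx] = best
            used.add(best)
    return assignments

# Two tracks at positions 0 and 10; three detections, one far from both.
print(associate([0.0, 10.0], [9.5, 0.3, 42.0]))
```

The unassigned detection at 42.0 is exactly the case a tracker handles by spawning a tentative new track, while tracks that repeatedly go unmatched are deleted.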

Visualization and Analytics
Analyze and compare the performance of inertial filters and multi-object tracking systems.
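The simplest such comparison is root-mean-square error between an estimated trajectory and ground truth. This is a generic metric sketched in Python for illustration (tracking benchmarks typically use richer assignment-aware metrics as well):

```python
import math

def rmse(truth, estimate):
    """Root-mean-square error between a ground-truth sequence and an
    estimated sequence of the same length."""
    n = len(truth)
    return math.sqrt(sum((t - e) ** 2 for t, e in zip(truth, estimate)) / n)

print(rmse([0.0, 1.0, 2.0], [0.0, 1.0, 4.0]))
```

Comparing RMSE across filter configurations on the same synthetic scenario gives a quick, reproducible accuracy benchmark.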
