Train, test, and validate perception systems

Equipped with validated sensor models and diverse 3D worlds, Spectral executes physics-based sensor simulations to accelerate perception system development with high-fidelity synthetic data.

Key features

Library of validated sensor models that teams can tune to represent specific sensor hardware
Programmatically generated ground truth labels such as bounding boxes, depth, semantic segmentation, and optical flow
Procedurally generated 3D worlds customized to an operational design domain (ODD)
Easy variation of environmental conditions such as weather and lighting
Extensive library of physically based rendering (PBR) materials and assets
Real-time sensor simulation and multi-GPU sharding to enable deployment on hardware-in-the-loop (HIL) rigs
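As an illustration of the environment-variation features above, a test matrix over weather and lighting can be generated as a simple grid of scenario configurations. The field names and condition values here are hypothetical, not Spectral's actual API:

```python
from itertools import product

# Hypothetical environmental axes; the real condition names depend on the simulator.
weathers = ["clear", "rain", "fog", "snow"]
lightings = ["noon", "dusk", "night"]

# The cross product yields one scenario config per (weather, lighting) pair.
scenarios = [
    {"weather": w, "lighting": l, "world_seed": i}
    for i, (w, l) in enumerate(product(weathers, lightings))
]

print(len(scenarios))  # 4 weathers x 3 lightings = 12 scenario configs
```

Enumerating conditions this way makes coverage explicit: every combination is rendered and tested, rather than whatever weather happened to occur during field drives.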

Iterate on perception modules in simulation

Create dedicated test scenarios to exercise new functionality without the delays of field testing. Leverage exact control over sensors, 3D worlds, and environmental conditions.
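One way to realize that exact control is to treat each scenario as a declarative, versionable spec, so a failing case reproduces bit-for-bit on the next run. The schema below is an illustrative sketch, not Spectral's actual scenario format:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SensorPose:
    # Extrinsics: mounting position (m) and pitch (deg). Illustrative fields only.
    x: float = 0.0
    y: float = 0.0
    z: float = 1.5
    pitch_deg: float = 0.0

@dataclass(frozen=True)
class Scenario:
    # Hypothetical scenario spec: every knob is pinned down explicitly,
    # so the same seed and conditions always produce the same world.
    world_seed: int
    weather: str
    lighting: str
    camera: SensorPose = field(default_factory=SensorPose)

night_rain = Scenario(world_seed=42, weather="rain", lighting="night")
```

Because the spec is frozen and fully explicit, it can be committed alongside the perception code it exercises.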

Comprehensively test robustness to long-tail cases

Test edge cases that are difficult or dangerous to test in the real world. Compute metrics to quantify performance, and automatically execute scenarios on each commit to prevent regressions.
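A per-commit regression gate can be as simple as computing a metric on a scenario's output and failing the build below a threshold. This sketch uses 2D bounding-box IoU as the metric; the boxes and the 0.7 threshold are illustrative values, not Spectral outputs:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Synthetic ground-truth label vs. a detector's prediction (illustrative values).
gt = (10.0, 10.0, 50.0, 50.0)
pred = (12.0, 11.0, 52.0, 49.0)

score = iou(gt, pred)
# In CI, a failed assertion blocks the commit that caused the regression.
assert score >= 0.7, f"regression: IoU {score:.2f} below threshold"
```

In practice the gate would aggregate such metrics across the full scenario suite, but the failure mechanism (assert, block merge) is the same.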

Optimize sensor configurations

Prototype new sensors in simulation to test their impact on perception performance. Automatically vary sensor intrinsics and extrinsics to discover optimal configurations.
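Discovering an optimal configuration can be framed as a sweep over extrinsics, scoring each candidate with a perception metric measured in simulation. The scoring function below is a toy stand-in; in a real sweep, each candidate pose would drive a full sensor simulation and perception run:

```python
from itertools import product

def coverage_score(height_m, pitch_deg):
    # Toy stand-in for a simulated perception metric (e.g., detection recall).
    # Peaks at a hypothetical 1.6 m mount height and 5-degree downward pitch.
    return -((height_m - 1.6) ** 2) - ((pitch_deg - 5.0) ** 2) / 100.0

# Candidate camera extrinsics: mounting heights (m) and pitch angles (deg).
heights = [1.2, 1.4, 1.6, 1.8]
pitches = [0.0, 5.0, 10.0]

best = max(product(heights, pitches), key=lambda hp: coverage_score(*hp))
print(best)  # (1.6, 5.0): the candidate with the highest toy score
```

The same pattern extends to intrinsics (focal length, field of view) and to smarter search strategies than an exhaustive grid.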