Challenges

When teams develop perception modules for advanced driver-assistance systems (ADAS) and automated driving (AD) software, real-world testing can be slow, expensive, and non-deterministic. The diversity of operational design domains (ODDs) and the long tail of safety-critical events require engineering teams to run a diverse set of tests against conditions that are not always observable in the real world.

Why Sensor Sim?

Applied Intuition Sensor Sim allows perception engineers, sensor hardware engineers, and validation teams to test machine learning (ML)-based and classical sensor suites and perception systems. Sensor Sim provides:
Validated sensor models and diverse, customizable 3D worlds for ADAS and AD sensor simulation
Programmatically generated ground truth labels ready for ML systems, such as bounding boxes, depth, semantic segmentation, and optical flow (see the sketch after this list)
High realism combined with real-time simulation performance
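To make the ground truth labels concrete, here is a minimal sketch of the kind of container types a downstream training pipeline might use to hold per-frame labels. The class and field names (BoundingBox3D, FrameLabels, and their attributes) are hypothetical illustrations, not part of Sensor Sim's API or label format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

import numpy as np


@dataclass
class BoundingBox3D:
    """One labeled object in a simulated frame (hypothetical schema)."""
    category: str              # e.g. "vehicle", "pedestrian"
    center_xyz: np.ndarray     # (3,) position in the ego frame, meters
    size_lwh: np.ndarray       # (3,) length, width, height, meters
    yaw: float                 # heading about the up axis, radians


@dataclass
class FrameLabels:
    """Ground truth bundle for a single simulated camera frame."""
    boxes: List[BoundingBox3D] = field(default_factory=list)
    depth: Optional[np.ndarray] = None          # (H, W) depth in meters
    semantic_seg: Optional[np.ndarray] = None   # (H, W) class IDs
    optical_flow: Optional[np.ndarray] = None   # (H, W, 2) pixel motion


# Example: one labeled vehicle plus dense per-pixel labels for a 720p frame.
labels = FrameLabels(
    boxes=[BoundingBox3D("vehicle",
                         center_xyz=np.array([12.0, -1.5, 0.8]),
                         size_lwh=np.array([4.6, 1.9, 1.6]),
                         yaw=0.05)],
    depth=np.zeros((720, 1280), dtype=np.float32),
    semantic_seg=np.zeros((720, 1280), dtype=np.uint8),
    optical_flow=np.zeros((720, 1280, 2), dtype=np.float32),
)
print(len(labels.boxes), labels.depth.shape, labels.optical_flow.shape)
```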

Benefits

Develop safer systems

Increase test coverage to catch and root cause defects earlier in the development process.

Reduce testing costs

Save millions annually by shifting 90% of sensing and perception development, testing, validation, and data collection from the real world into simulation.

Accelerate time to market

Develop, test, and validate sensing and perception systems faster and shorten time to deployment.

Key components

Physically based rendering

Multi-spectral rendering enables teams to accurately model camera, lidar, and radar sensors across the electromagnetic spectrum, as well as ultrasonic sensors. Use path tracing to maximize fidelity and hybrid ray tracing to maximize performance. Sensor Sim rendering minimizes the domain gap so that ML-based perception systems see simulated data the same way they see real data.
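As an illustration of the fidelity-versus-performance choice described above, the sketch below shows how a team might record a per-modality rendering plan. The RenderMode names and the render_plan mapping are assumptions for illustration only, not Sensor Sim configuration options.

```python
from enum import Enum


class RenderMode(Enum):
    PATH_TRACING = "path_tracing"       # highest physical fidelity
    HYBRID_RAY_TRACING = "hybrid_rt"    # rasterization + ray tracing for speed


# Hypothetical per-modality choices: offline camera validation runs favor
# fidelity, while large-scale regression runs favor throughput.
render_plan = {
    "camera": RenderMode.PATH_TRACING,
    "lidar": RenderMode.HYBRID_RAY_TRACING,
    "radar": RenderMode.HYBRID_RAY_TRACING,
    "ultrasonic": RenderMode.HYBRID_RAY_TRACING,
}

for modality, mode in render_plan.items():
    print(f"{modality}: {mode.value}")
```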

Real-time performance

Sensor Sim is optimized for real-time performance. It enables sensor hardware-in-the-loop (HIL) testing and cost-effective large-scale multi-sensor software-in-the-loop (SIL) testing. Directly configure simulations to ensure performance and fidelity are tuned to the specific system, test case, and budget.
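A minimal sketch of what tuning performance to a real-time budget can look like in practice, assuming a fixed simulation tick rate and hypothetical per-sensor render costs; the numbers and sensor names are illustrative, not measurements of Sensor Sim.

```python
# Budget wall-clock render time so a multi-sensor SIL run stays real time.
TICK_RATE_HZ = 10                      # simulation step frequency
BUDGET_MS = 1000.0 / TICK_RATE_HZ      # wall-clock budget per tick

# Hypothetical measured per-sensor render cost at the chosen fidelity.
sensor_cost_ms = {
    "front_camera": 28.0,
    "rear_camera": 22.0,
    "roof_lidar": 31.0,
    "front_radar": 6.0,
}

total_ms = sum(sensor_cost_ms.values())
print(f"per-tick cost: {total_ms:.1f} ms, budget: {BUDGET_MS:.1f} ms")
if total_ms > BUDGET_MS:
    # Over budget: lower fidelity (e.g. switch a camera to hybrid ray
    # tracing or reduce samples per pixel) or parallelize across GPUs.
    print("over real-time budget; reduce fidelity or add compute")
```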

Hardware-specific sensors

Applied Intuition has developed hardware-specific sensor models in collaboration with partners such as Valeo and Ouster and validated them against their real-world counterparts. Parameterize custom sensor models from datasheets and refine them using experimental data. Alternatively, directly co-simulate external models using Sensor Sim’s ray tracing API.
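The sketch below illustrates the general idea of parameterizing a sensor model from datasheet values, using a spinning lidar as an example. The class, its fields, and the numbers are hypothetical and are not drawn from any partner's datasheet or from Sensor Sim's model interface.

```python
from dataclasses import dataclass


@dataclass
class LidarDatasheetParams:
    """Hypothetical datasheet-derived parameters for a spinning lidar model."""
    channels: int              # number of laser channels
    horizontal_fov_deg: float
    vertical_fov_deg: float
    rotation_rate_hz: float
    range_m: float             # maximum specified range
    range_accuracy_m: float    # 1-sigma range accuracy from the datasheet
    wavelength_nm: float

    def points_per_second(self, firings_per_rotation: int) -> int:
        """Rough point budget implied by the datasheet numbers."""
        return int(self.channels * firings_per_rotation * self.rotation_rate_hz)


# Illustrative values only.
lidar = LidarDatasheetParams(
    channels=128, horizontal_fov_deg=360.0, vertical_fov_deg=45.0,
    rotation_rate_hz=10.0, range_m=200.0, range_accuracy_m=0.03,
    wavelength_nm=865.0,
)
print(lidar.points_per_second(firings_per_rotation=2048))
```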

Diverse simulation content

Sensor Sim’s premade library of 3D assets, 3D worlds, and multi-spectral rendering materials covers 80% of robotics ODDs out of the box. Customize Sensor Sim to the long tail of your ODD using flexible editors and tools that leverage both generative AI and rule-based proceduralism to construct new 3D worlds.
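To give a feel for rule-based proceduralism, here is a minimal sketch of scattering roadside assets along a road segment with a seeded random generator so the generated layout is reproducible. The asset names, spacing rules, and function are assumptions for illustration, not Sensor Sim tooling.

```python
import random

ASSETS = ["tree_oak", "street_lamp", "trash_bin", "parked_car"]


def scatter_roadside(length_m: float, spacing_m: float, seed: int):
    """Place assets along a straight road edge with jittered spacing."""
    rng = random.Random(seed)   # seeded so the world regenerates identically
    placements = []
    s = 0.0
    while s < length_m:
        asset = rng.choice(ASSETS)
        lateral = rng.uniform(2.5, 5.0)   # offset from the lane edge, meters
        jitter = rng.uniform(-1.0, 1.0)   # break up perfectly even spacing
        placements.append((asset, round(s + jitter, 2), round(lateral, 2)))
        s += spacing_m
    return placements


for asset, station, offset in scatter_roadside(100.0, spacing_m=12.0, seed=7):
    print(f"{asset:12s} at s={station:6.2f} m, offset={offset:.2f} m")
```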

Automated performance evaluation

Use programmatically generated ground truth labels, performance metrics, and simulation observers to continuously and comprehensively evaluate sensor, perception, and full-system performance. Analyze results with both traditional charts and conversational AI that lets you ask questions about your data.
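As one example of the kind of metric an automated evaluation can compute from ground truth labels, the sketch below scores a detector's 2D box against the simulator's label using intersection over union (IoU). The function and the example boxes are illustrative; Sensor Sim's metric set and interfaces are not shown here.

```python
def iou_2d(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0


# Compare one detection against the simulator's ground truth box.
gt_box = (100.0, 80.0, 180.0, 160.0)
pred_box = (110.0, 85.0, 190.0, 150.0)
print(f"IoU = {iou_2d(pred_box, gt_box):.3f}")  # e.g. flag frames below 0.5
```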

Get started with Sensor Sim

Learn how to accelerate the development of safety-critical sensing and perception systems through simulation-first development, testing, and validation.
Contact us