Author: Christoforos Chatzikomis
Head of Simulation and Validation Platform, SODA.AUTO
The SODA ecosystem focuses on enabling software-defined vehicle (SDV) development through open tools for simulation, integration, and validation. SODA.V provides an end-to-end framework covering requirements definition, validation, and verification, with full traceability across development stages. Within this framework, SODA.Sim supports simulation-based development and testing, while close integration with SODA.Rig enables hardware-in-the-loop system testing.

As part of ongoing collaboration with the Autoware community, this work explores how SODA.Sim can be used alongside Autoware to support camera-based evaluation workflows.
This post describes an initial integration of SODA.Sim with Autoware, focusing on using a simulated camera sensor as input and running camera-based modules without modifying existing network weights.
SODA.Sim Overview
SODA.Sim is an open-source vehicle simulator built on Unreal Engine 5. It provides configurable environments and sensor simulation with high-quality visual rendering, and exposes ROS 2 interfaces for sensor data and system interaction.
The simulator follows a modular architecture, allowing vehicle models, sensor configurations, and environment components to be composed and extended independently. SODA.Sim includes an extensive, extensible library of sensor and vehicle models, enabling users to configure simulation setups that match different system architectures and testing needs.
In this work, SODA.Sim is used to generate forward-facing monocular camera images under varying conditions.
Integration Scope
The goal of this integration is to validate sensor-level compatibility between SODA.Sim and Autoware. The current scope includes:
- Publishing simulated camera images and calibration data over ROS 2
- Subscribing to these topics in Autoware using standard message types
- Running Autoware camera-based modules on simulated input
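The calibration data exchanged here is the standard pinhole model carried in sensor_msgs/CameraInfo, whose 3x3 intrinsic matrix K can be derived from the camera's resolution and horizontal field of view. The sketch below shows that standard relationship; the resolution and FOV values are illustrative assumptions, not taken from the actual configuration.

```python
import math

def pinhole_intrinsics(width, height, hfov_deg):
    """Build the 3x3 pinhole intrinsic matrix K (row-major, as stored in
    sensor_msgs/CameraInfo.k) from image size and horizontal FOV."""
    # Focal length in pixels from the horizontal field of view
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = fx  # square pixels assumed
    cx, cy = width / 2.0, height / 2.0  # principal point at image center
    return [fx, 0.0, cx,
            0.0, fy, cy,
            0.0, 0.0, 1.0]

# Illustrative values: a 1920x1080 image with a 90 degree horizontal FOV
K = pinhole_intrinsics(1920, 1080, 90.0)
```

Because the simulator publishes this matrix alongside the images, downstream Autoware modules can consume the calibration without any simulator-specific handling.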
Visual Configuration and Scenario Setup
SODA.Sim provides a visual editor that allows users to configure simulation assets and parameters without modifying code. In this integration, the editor was used to define the ego vehicle configuration, including sensor placement and vehicle properties.
The visual configuration workflow supports:
- Selection and configuration of the ego vehicle
- Placement and parameterization of camera sensors
- Configuration of the vehicle dynamics model
- Creation and modification of simulation scenarios, including environment selection and traffic setup

This visual approach enables rapid iteration on vehicle and sensor configurations, as well as efficient exploration of different scenarios, while maintaining a consistent interface to downstream software such as Autoware.
Camera Sensor Integration
A forward-facing monocular camera is simulated in SODA.Sim and publishes standard sensor_msgs/Image and sensor_msgs/CameraInfo messages over ROS 2. These topics are consumed directly by Autoware without changes to its internal interfaces.
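On the consuming side, a sensor_msgs/Image arrives as a flat byte buffer described by its height, width, encoding, and step (bytes per row, which may include padding). The sketch below shows how such a payload maps to an HxWx3 array, handling only the bgr8 encoding; the tiny synthetic frame at the end is an illustrative stand-in, not simulator output.

```python
import numpy as np

def image_msg_to_array(height, width, step, encoding, data):
    """Decode the flat byte buffer of a sensor_msgs/Image into an HxWx3
    array, honoring per-row padding via `step`. Only bgr8 is handled."""
    if encoding != "bgr8":
        raise ValueError(f"unsupported encoding: {encoding}")
    buf = np.frombuffer(data, dtype=np.uint8).reshape(height, step)
    # Drop any row padding beyond width * 3 channel bytes
    return buf[:, : width * 3].reshape(height, width, 3)

# Tiny synthetic 2x2 frame with 2 bytes of row padding (step = 8)
raw = bytes(range(16))
img = image_msg_to_array(2, 2, 8, "bgr8", raw)
```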
Environmental Variations
Simulation parameters were adjusted to introduce:
- Time-of-day changes, affecting lighting, shadows, and surface reflections
- Weather variations
- Different simulated maps and environments
Across these variations, the simulated camera continued to provide valid image streams.
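One lightweight way to confirm that a stream stays usable across such variations is a per-frame sanity check on the decoded images. The layout and brightness thresholds below are illustrative assumptions rather than values from the actual setup, and the synthetic frames merely stand in for day and night renders.

```python
import numpy as np

def frame_is_valid(img, min_mean=1.0, max_mean=254.0):
    """Heuristic per-frame check: correct layout, and neither fully black
    nor fully saturated (either would suggest a broken render)."""
    if img.ndim != 3 or img.shape[2] != 3 or img.dtype != np.uint8:
        return False
    mean = float(img.mean())
    return min_mean <= mean <= max_mean

# Synthetic frames standing in for day and night renders
day = np.full((4, 4, 3), 180, dtype=np.uint8)
night = np.full((4, 4, 3), 12, dtype=np.uint8)
black = np.zeros((4, 4, 3), dtype=np.uint8)
```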
AutoSpeed on Simulated Input
Autoware’s AutoSpeed module was executed on simulated camera data. Existing camera-based networks were run on simulated images without modifying their weights, and no rendering artifacts were observed that would interfere with network execution.
Intended Usage and Next Steps
This integration is intended for Autoware developers and researchers working with camera-based and sensor-driven modules. It enables execution of existing Autoware pipelines on simulated sensor data, with a focus on evaluating data flow, module interoperability, and system configuration. The setup supports testing under controlled variations in lighting, weather, and environment, using consistent interfaces between the simulator and Autoware.
Ongoing work will extend the integration with SODA.Sim to incorporate additional simulated sensors, expand interaction with Autoware modules, and connect Autoware control outputs back to the simulator to enable closed-loop control in simulation. The setup can then be used for integration testing and functional evaluation of end-to-end pipelines in simulated or hybrid (SIL/HIL) environments.
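The closed-loop direction can be pictured with a toy loop in which a controller output is applied back to the simulated vehicle state on each tick. The proportional speed controller and first-order vehicle model below are deliberately simplified stand-ins for illustration, not Autoware or SODA.Sim interfaces.

```python
def closed_loop_speed(target, v0=0.0, kp=0.5, dt=0.1, steps=100):
    """Toy closed loop: each tick, the controller computes an acceleration
    command from the speed error and the 'simulator' integrates it."""
    v = v0
    for _ in range(steps):
        accel_cmd = kp * (target - v)  # controller output from speed error
        v += accel_cmd * dt            # simulator applies the command
    return v

# Converges toward the 10 m/s target over the simulated horizon
final = closed_loop_speed(10.0)
```

In the actual integration, the controller side would be an Autoware module and the integration step would be SODA.Sim's vehicle dynamics model, connected over the same ROS 2 interfaces used for the sensor data.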
Summary
This work presents an initial camera-level integration of SODA.Sim with Autoware. It shows how simulated camera data can be used to run existing Autoware modules under different environmental conditions.
The setup makes it easier to test the same software across multiple scenarios and to move between simulation and later validation stages, including real and mixed setups, without changing interfaces or system structure.