
AutoSpeed and Object Finder: An Open Source, Camera-Only Neural Network Framework

Author: ADMIN

Another milestone from the Privately Owned Vehicle (PoV) Working Group at the Autoware Foundation 🚗

Today, we’re showcasing AutoSpeed with Object Finder — an open source, camera-only neural network framework designed to detect critical obstacles in real-world driving scenes.

From a single front-facing camera, the system:
🔵 Detects all relevant foreground objects
🔴 Identifies the closest in-path object
🟡 Tracks cut-in and cut-out vehicles — without explicit lane detection
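The real system does this with a neural network, but the "closest in-path object" idea can be illustrated with a simple geometric sketch. Everything below is hypothetical: the `Detection` fields, the corridor width, and the selection rule are assumptions for illustration, not the AutoSpeed implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str               # e.g. "car", "pedestrian" (hypothetical schema)
    distance_m: float        # estimated longitudinal distance to the object
    lateral_offset_m: float  # estimated lateral offset from the ego centerline

def closest_in_path(detections, corridor_half_width_m=1.8):
    """Pick the nearest detection inside an assumed ego-path corridor.

    This is a placeholder heuristic: objects whose lateral offset falls
    within the corridor are considered "in path", and the closest one wins.
    """
    in_path = [d for d in detections if abs(d.lateral_offset_m) <= corridor_half_width_m]
    return min(in_path, key=lambda d: d.distance_m, default=None)

dets = [
    Detection("car", 42.0, 0.3),      # in path, far
    Detection("car", 18.5, -0.9),     # in path, nearest
    Detection("cyclist", 12.0, 4.2),  # adjacent lane: closer, but not in path
]
print(closest_in_path(dets).distance_m)  # → 18.5
```

Note that the cyclist is nearer in absolute terms but is correctly excluded, which is why "closest" and "closest in-path" are distinct outputs.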

This enables key ADAS & autonomy features
🔸 Forward Collision Warning
🔸 Automatic Emergency Braking
🔸 Autonomous Cruise Control

Works on all road types
🔹 Highways, urban streets, unstructured roads
🔹 Even where lane lines are missing or unreliable

Robust in the real world
🌧️ Rain, water droplets on the lens
🌙 Nighttime & low-light conditions
🛣️ Straight roads and high-curvature scenarios

Smart perception, simplified sensing
📏 Distance & relative speed estimated directly
📷 No LiDAR. No Radar. Just vision.
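To make "relative speed estimated directly" concrete: once per-frame distance estimates exist, relative speed follows by differencing them over time. The function below is only an illustrative sketch under assumed inputs (a list of per-frame distances and a frame rate); the actual network regresses these quantities rather than computing them this way.

```python
def relative_speed_mps(distances_m, fps=30.0):
    """Estimate relative speed (m/s) from a sequence of per-frame
    distance estimates. Negative values mean the object is closing in.

    Averages the frame-to-frame deltas to smooth single-frame noise,
    then scales by the frame rate to convert to metres per second.
    """
    if len(distances_m) < 2:
        return 0.0  # not enough samples to difference
    deltas = [b - a for a, b in zip(distances_m, distances_m[1:])]
    return sum(deltas) / len(deltas) * fps

# An object closing at 0.2 m per frame at 30 fps closes at about 6 m/s
print(relative_speed_mps([30.0, 29.8, 29.6, 29.4]))  # ≈ -6.0
```

In practice a tracker would filter these estimates over many frames; this sketch just shows why a single camera plus time is enough in principle, with no LiDAR or radar range measurements.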

Beyond vehicles 🚲🚶
The system reliably detects pedestrians, bicyclists, and other critical road users — supporting safer decision-making in complex urban traffic.

AutoSpeed + Object Finder together form a powerful open source vision stack for real-time autonomous driving — built by the community, for the community, and ready for OEMs and Tier-1 suppliers to experiment, extend, and deploy.

The link to the AutoSpeed network GitHub page is in the first comment. Check it out!

👏 Huge thanks to the contributors who made this possible:
🏆 Atanasko Boris Mitrev — AutoSpeed neural network development & training
🏆 Pranav Doma — Object Finder module

🔜 More PoV Working Group breakthroughs are coming soon.