Releases: Tinker-Twins/AutoDRIVE-AVLDC
AutoDRIVE-AVLDC v0.2.0
Updated release for the AutoDRIVE-AVLDC development and integration. This release was showcased at the VIPR-GS Annual Review 2024.
This release hosts teleoperation and automatic emergency braking (AEB) demos for the OpenCAV digital twin, developed using AutoDRIVE Simulator, the AutoDRIVE Python API, the AVL Python API, and AVL Model.CONNECT.
The teleoperation demo is implemented as follows:
- Teleoperation: A graphical user interface (GUI) developed using AVL Model.CONNECT allows a human operator to monitor key performance indicators (KPIs), such as the distance to collision (DTC), and control the throttle, brake, steering, and handbrake inputs of the vehicle digital twin in real time.
| OpenCAV Teleoperation Demo Result | | |
| --- | --- | --- |
| AutoDRIVE Simulator | AutoDRIVE Devkit | AVL Model.CONNECT |
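The teleoperation data path can be pictured as a loop that reads raw operator inputs from the GUI and relays them to the digital twin as normalized commands. A minimal sketch of that mapping is below; the function names and dictionary keys are illustrative assumptions, not the actual AutoDRIVE or Model.CONNECT interfaces.

```python
# Hypothetical relay step between the Model.CONNECT GUI and the vehicle
# digital twin. Only the clamping/normalization logic is shown; the
# transport (simulator connection, GUI callbacks) is omitted.

def clamp(value, lo, hi):
    """Constrain a raw operator input to the valid command range."""
    return max(lo, min(hi, value))

def relay_commands(operator_inputs):
    """Map raw GUI inputs to normalized vehicle commands.

    Throttle and brake are normalized to [0, 1], steering to [-1, 1],
    and the handbrake is a boolean flag; missing inputs default to
    neutral values.
    """
    return {
        "throttle": clamp(operator_inputs.get("throttle", 0.0), 0.0, 1.0),
        "brake": clamp(operator_inputs.get("brake", 0.0), 0.0, 1.0),
        "steering": clamp(operator_inputs.get("steering", 0.0), -1.0, 1.0),
        "handbrake": bool(operator_inputs.get("handbrake", False)),
    }
```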
The AEB demo is implemented in two different capacities:
- Emulation: The ADAS/AD stack receives "ground-truth" measurements in the form of a distance to collision (DTC) metric. The planning and control modules use this information to engage the brakes if a collision is imminent.
- Stimulation: The ADAS/AD stack receives raw sensor data in the form of an RGB camera frame. The perception stack uses a deep learning model for object detection and computes the DTC metric based on the type of object, its detection confidence, and its relative size in the frame. The planning and control modules use this information to engage the brakes if a collision is imminent.
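In emulation mode, the braking decision reduces to comparing the reported DTC against the ego vehicle's stopping distance. The sketch below illustrates such a rule under stated assumptions (a constant-deceleration stopping model with a fixed safety margin); it is not the release's actual control logic.

```python
# Illustrative emulation-mode AEB decision: engage full braking when the
# ground-truth DTC (m) falls within the distance needed to stop from the
# current speed (m/s) at a constant deceleration, plus a safety margin.

def aeb_command(dtc, speed, decel=6.0, margin=2.0):
    """Return a full-brake command if a collision is imminent, else None.

    `decel` is the assumed braking deceleration in m/s^2 and `margin`
    is an extra safety buffer in meters; both values are illustrative.
    """
    stopping_distance = speed ** 2 / (2.0 * decel)
    if dtc <= stopping_distance + margin:
        return {"throttle": 0.0, "brake": 1.0}  # imminent: engage brakes
    return None  # no intervention needed
```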
| OpenCAV AEB Demo Result: Emulation Mode | | |
| --- | --- | --- |
| AutoDRIVE Simulator | AutoDRIVE Devkit | AVL Model.CONNECT |
| OpenCAV AEB Demo Result: Stimulation Mode | | |
| --- | --- | --- |
| AutoDRIVE Simulator | AutoDRIVE Devkit | AVL Model.CONNECT |
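In stimulation mode, a DTC estimate can be derived from a detection's class, confidence, and apparent size using a pinhole-camera approximation (distance ≈ focal length × real height / pixel height). The sketch below shows that computation; the per-class heights, focal length, and confidence threshold are illustrative assumptions, and the actual perception stack uses a trained deep learning detector.

```python
# Illustrative stimulation-mode DTC estimate from a single detection,
# based on a pinhole-camera model and assumed real-world object heights.

KNOWN_HEIGHTS_M = {"car": 1.5, "pedestrian": 1.7, "truck": 3.5}

def estimate_dtc(class_name, confidence, bbox_height_px,
                 focal_length_px=1000.0, min_confidence=0.5):
    """Estimate distance to collision (m) from one detection.

    distance ≈ focal_length_px * real_height_m / bbox_height_px.
    Detections below the confidence threshold, or of classes without a
    known real-world height, are ignored (returns None).
    """
    if confidence < min_confidence or class_name not in KNOWN_HEIGHTS_M:
        return None
    return focal_length_px * KNOWN_HEIGHTS_M[class_name] / bbox_height_px
```

With these assumed values, a car whose bounding box spans 100 px would be estimated at 15 m; the resulting DTC then feeds the same planning and control modules as in emulation mode.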
AutoDRIVE-AVLDC v0.1.0
Initial release for the AutoDRIVE-AVLDC development and integration. This release hosts an automatic emergency braking (AEB) demo for the OpenCAV digital twin, developed using the AutoDRIVE Python API as well as the AutoDRIVE ROS API. The AEB demo is implemented in two different capacities:
- Emulation: The ADAS/AD stack receives "ground-truth" measurements in the form of a distance to collision (DTC) metric. The planning and control modules use this information to engage the brakes if a collision is imminent.
- Stimulation: The ADAS/AD stack receives raw sensor data in the form of an RGB camera frame. The perception stack uses a deep learning model for object detection and computes the DTC metric based on the type of object, its detection confidence, and its relative size in the frame. The planning and control modules use this information to engage the brakes if a collision is imminent.
| Emulation | Stimulation |
| --- | --- |