Journal articles

SynWoodScape: Synthetic Surround-View Fisheye Camera Dataset for Autonomous Driving

Abstract: Surround-view cameras are a primary sensor for automated driving, used for near-field perception. They are among the most commonly deployed sensors on commercial vehicles, primarily for parking visualization and automated parking. Four fisheye cameras with a 190° field of view cover the 360° around the vehicle. Due to their high radial distortion, standard algorithms do not extend easily to these images. Previously, we released the first public fisheye surround-view dataset, named WoodScape. In this work, we release a synthetic version of the surround-view dataset that addresses many of its weaknesses and extends it. Firstly, it is not possible to obtain ground truth for pixel-wise optical flow and depth in a real dataset. Secondly, WoodScape did not have all four cameras annotated simultaneously, in order to sample diverse frames; as a result, multi-camera algorithms could not be designed to produce a unified output in bird's-eye-view space, which the new dataset enables. We implemented surround-view fisheye geometric projections in the CARLA simulator matching WoodScape's configuration and created SynWoodScape. We release 80k images from the synthetic dataset with annotations for 10+ tasks. We also release the baseline code and supporting scripts.
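
To make the abstract's mention of fisheye geometric projections concrete, the sketch below projects 3D points in camera coordinates onto a fisheye image using a radially symmetric polynomial model, the style of intrinsic calibration used for WoodScape-like cameras. The coefficient vector `k`, the principal point, and the function name are illustrative assumptions, not values or code from the released dataset.

```python
import numpy as np

def project_fisheye(points_cam, k, cx, cy):
    """Project 3D points (N, 3), expressed in camera coordinates with the
    optical axis along +z, onto a fisheye image using a fourth-order
    polynomial radial model r(theta) = k1*theta + ... + k4*theta^4.
    """
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    # Angle between the incoming ray and the optical axis; arctan2 keeps
    # rays beyond 90 degrees valid, which a 190 degree lens requires.
    theta = np.arctan2(np.hypot(x, y), z)
    # Radial distance from the principal point given by the distortion model.
    r = k[0] * theta + k[1] * theta**2 + k[2] * theta**3 + k[3] * theta**4
    # A radially symmetric lens preserves the azimuth of the ray.
    phi = np.arctan2(y, x)
    u = cx + r * np.cos(phi)
    v = cy + r * np.sin(phi)
    return np.stack([u, v], axis=1)

# Example with made-up coefficients and a 1280x966 image.
pts = np.array([[1.0, 0.5, 2.0], [-2.0, 0.2, 0.5]])
uv = project_fisheye(pts, k=[330.0, -10.0, 20.0, -5.0], cx=640.0, cy=483.0)
print(uv)
```

Rendering such a projection inside a simulator (as done for SynWoodScape in CARLA) amounts to resampling the scene through this mapping so that the synthetic images share the distortion characteristics of the real WoodScape cameras.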

https://hal-normandie-univ.archives-ouvertes.fr/hal-03749088
Contributor: Paul Honeine
Submitted on: Wednesday, August 10, 2022 - 12:39:22 PM
Last modification on: Saturday, August 13, 2022 - 4:17:40 AM


Citation

Ahmed Rida Sekkat, Yohan Dupuis, Varun Ravi Kumar, Hazem Rashed, Senthil Yogamani, et al. SynWoodScape: Synthetic Surround-View Fisheye Camera Dataset for Autonomous Driving. IEEE Robotics and Automation Letters, IEEE, 2022, 7 (3), pp. 8502-8509. ⟨10.1109/LRA.2022.3188106⟩. ⟨hal-03749088⟩
