Conference papers

SynWoodScape: Synthetic Surround-view Fisheye Camera Dataset for Autonomous Driving

Abstract: Surround-view cameras are a primary sensor for automated driving, used for near-field perception. They are among the most commonly deployed sensors in commercial vehicles, primarily for parking visualization and automated parking. Four fisheye cameras with a 190° field of view cover the full 360° around the vehicle. Due to their high radial distortion, standard algorithms do not extend easily to them. Previously, we released WoodScape, the first public fisheye surround-view dataset. In this work, we release a synthetic version of the surround-view dataset that addresses many of its weaknesses and extends it. First, it is not possible to obtain pixel-wise ground truth for optical flow and depth in a real-world dataset. Second, WoodScape did not have all four cameras annotated simultaneously, in order to sample diverse frames; this means multi-camera algorithms could not be designed to produce a unified output in bird's-eye-view space, which the new dataset enables. We implemented surround-view fisheye geometric projections in the CARLA simulator, matching WoodScape's configuration, and created SynWoodScape. We release 80k images from the synthetic dataset with annotations for 10+ tasks. We also release the baseline code and supporting scripts.
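To illustrate the kind of fisheye geometric projection the abstract refers to, below is a minimal sketch of a polynomial radial projection model, in the spirit of the WoodScape camera model, where the image radius is a polynomial in the ray's angle of incidence. The function name, coefficient values, and principal point here are illustrative assumptions, not the actual SynWoodScape calibration.

```python
import numpy as np

def project_fisheye(points_cam, k, cx, cy):
    """Project 3D points (camera frame, Z forward) onto a fisheye image.

    Radial model: rho(theta) = k1*theta + k2*theta^2 + k3*theta^3 + k4*theta^4,
    where theta is the angle between the incoming ray and the optical axis.
    Coefficients k are illustrative, not a real calibration.
    """
    X, Y, Z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    # Angle of incidence from the optical axis; large theta is where
    # fisheye distortion departs most from a pinhole model.
    theta = np.arctan2(np.hypot(X, Y), Z)
    # Image-plane radial distance from the polynomial model.
    rho = sum(ki * theta ** (i + 1) for i, ki in enumerate(k))
    # Azimuth of the ray around the optical axis is preserved.
    phi = np.arctan2(Y, X)
    u = cx + rho * np.cos(phi)
    v = cy + rho * np.sin(phi)
    return np.stack([u, v], axis=1)

# Illustrative intrinsics (hypothetical values).
k = [340.0, 20.0, -5.0, 0.5]   # polynomial coefficients k1..k4
cx, cy = 640.0, 480.0          # principal point

pts = np.array([[0.0, 0.0, 1.0],    # on the optical axis
                [1.0, 0.0, 1.0]])   # 45 degrees off-axis
uv = project_fisheye(pts, k, cx, cy)
```

A point on the optical axis maps to the principal point, and points farther off-axis map to larger radii; with a 190° field of view, rays with theta beyond 90° (i.e. slightly behind the camera) can still land inside the image, which is what breaks standard pinhole-based algorithms.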
Contributor: Paul Honeine
Submitted on: Wednesday, August 10, 2022 - 3:30:44 PM
Last modification on: Saturday, August 13, 2022 - 4:17:40 AM




  • HAL Id: hal-03749224, version 1


Ahmed Rida Sekkat, Yohan Dupuis, Varun Ravi Kumar, Hazem Rashed, Senthil Yogamani, et al. SynWoodScape: Synthetic Surround-view Fisheye Camera Dataset for Autonomous Driving. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2022), Oct 2022, Kyoto, Japan. ⟨hal-03749224⟩


