Rendering Physically Correct Raindrops on Windshields for Robustness Verification of Camera-based Object Recognition

by Alexander von Bernuth, Georg Volk, and Oliver Bringmann
In 2018 IEEE Intelligent Vehicles Symposium (IV), pages 922–927, 2018.

Abstract

Recent developments in the field of autonomous cars indicate that these vehicles will appear on the streets of every city in the near future. Urban driving demands zero error tolerance. In order to guarantee safety requirements, self-driving cars and their software have to pass exhaustive tests under as many different conditions as possible. The more versatile the considered influences and the more thorough the tests conducted under those influences, the safer the car will drive under real conditions. Unfortunately, it is very time- and resource-intensive to record the same test set of images over and over again, each time producing, or hoping for, specific conditions, especially when using real test vehicles. This is where environment simulation comes into play. This research investigates the simulation of environmental influences that may affect the sensors used in autonomous vehicles, in particular how raindrops resting on a windshield affect cameras, as they may occlude large parts of the field of view. We propose a novel method to render these raindrops using Continuous Nearest Neighbor search, leveraging the benefits of R-trees. The 3D scene in front of the camera, which is generated from stereo images, is reflected in these drops in a physically correct manner. This leads to near photo-realistic simulated results. The derived images may be used to extend the training data sets used for machine learning without being forced to capture new real pictures.
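The abstract's nearest-neighbor queries on R-trees rest on best-first traversal of a bounding-box hierarchy. The following is a minimal illustrative sketch of that principle only, not the authors' implementation: the `Node` layout, the hand-built tree, and all point data are assumptions made up for this example.

```python
import heapq
import itertools
import math

class Node:
    """Illustrative R-tree-like node (not the paper's data structure)."""
    def __init__(self, bbox, children=None, point=None):
        self.bbox = bbox          # (xmin, ymin, xmax, ymax)
        self.children = children or []
        self.point = point        # set only for leaf entries

def mindist(q, bbox):
    """Smallest possible distance from query point q to any point in bbox."""
    qx, qy = q
    xmin, ymin, xmax, ymax = bbox
    dx = max(xmin - qx, 0.0, qx - xmax)
    dy = max(ymin - qy, 0.0, qy - ymax)
    return math.hypot(dx, dy)

def nearest(root, q):
    """Best-first search: always expand the entry with the smallest mindist.

    Because mindist lower-bounds the true distance, the first leaf entry
    popped from the priority queue is an exact nearest neighbor.
    """
    counter = itertools.count()   # tie-breaker so heapq never compares Nodes
    heap = [(mindist(q, root.bbox), next(counter), root)]
    while heap:
        d, _, node = heapq.heappop(heap)
        if node.point is not None:
            return node.point, d
        for child in node.children:
            heapq.heappush(heap, (mindist(q, child.bbox), next(counter), child))
    return None, math.inf

def leaf(p):
    """Wrap a point as a degenerate-bbox leaf entry."""
    return Node((p[0], p[1], p[0], p[1]), point=p)

# Tiny hand-built tree with two leaf groups of 2D points (made-up data).
left = Node((0, 0, 2, 2), [leaf((1, 1)), leaf((2, 2))])
right = Node((8, 8, 10, 10), [leaf((8, 9)), leaf((10, 10))])
root = Node((0, 0, 10, 10), [left, right])

print(nearest(root, (3, 3)))  # nearest stored point and its distance
```

The same priority-queue traversal generalizes to the Continuous Nearest Neighbor variant the paper names, where the query is a segment (e.g. a camera ray) rather than a single point; only the distance bound changes.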