Out-of-Distribution Detection and Feature Identification

Automated Workflow for Out-of-Distribution Detection using β-Variational Autoencoders

Machine learning components have shown remarkable performance in several perception and control tasks, such as NVIDIA's DAVE-2 self-driving system. However, incidents like Tesla's self-driving accident and Uber's autonomous car crash have shown these components to be susceptible to Out-of-Distribution (OOD) data. Moreover, the black-box nature of these components makes them difficult to test and verify. To address this, we use a generative model called the β-Variational Autoencoder (β-VAE) to detect OOD data and identify the factor responsible for it. For example, if an autonomous vehicle is trained on images of daytime scenes but encounters images of evening scenes, its performance on those images will degrade. For the vehicle's safety, it must detect that the operating scene has changed from the training conditions and that the time-of-day factor is responsible for the problem.
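As a concrete illustration, below is a minimal sketch (not the project's released code) of how a trained β-VAE encoder can score incoming camera frames: the per-dimension KL divergence between the approximate posterior and the standard-normal prior serves as an OOD score, and the dimension with the largest deviation points at the generative factor (e.g., precipitation or time of day) that drifted. The encoder architecture, latent size, and input resolution are illustrative assumptions.

```python
# Sketch only: a stand-in beta-VAE encoder and a KL-based OOD score.
import torch
import torch.nn as nn

class BetaVAEEncoder(nn.Module):
    """Toy convolutional encoder; stands in for the trained beta-VAE encoder."""
    def __init__(self, latent_dim: int = 30):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.LazyLinear(latent_dim)       # mean of q(z|x)
        self.fc_logvar = nn.LazyLinear(latent_dim)   # log-variance of q(z|x)

    def forward(self, x):
        h = self.backbone(x)
        return self.fc_mu(h), self.fc_logvar(h)

def per_dim_kl(mu: torch.Tensor, logvar: torch.Tensor) -> torch.Tensor:
    """KL( N(mu, sigma^2) || N(0, 1) ) computed separately for each latent dimension."""
    return 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0)

@torch.no_grad()
def ood_score(encoder: nn.Module, image: torch.Tensor):
    """Return (total KL score, per-dimension KL) for a batch of images."""
    mu, logvar = encoder(image)
    kl = per_dim_kl(mu, logvar)          # shape: (batch, latent_dim)
    return kl.sum(dim=1), kl

if __name__ == "__main__":
    encoder = BetaVAEEncoder(latent_dim=30)
    frame = torch.rand(1, 3, 64, 64)     # placeholder camera frame (e.g., from CARLA)
    total, per_dim = ood_score(encoder, frame)
    # A detection threshold would be calibrated on in-distribution training data.
    suspect_dim = per_dim.argmax(dim=1)
    print(f"OOD score {total.item():.2f}, most-deviant latent dim {suspect_dim.item()}")
```

In practice the suspect latent dimensions are mapped back to scene features (rain, brightness, time of day) during training, so a large deviation in a particular dimension implicates a particular factor.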

Demonstration

In the video, the precipitation increases to a high value at t = 13 seconds. Images of heavy rain were not included in the training data of the machine learning controller driving the AV, so we use the β-VAE detector to identify OOD images in the CARLA simulation. As the precipitation value increases, both the detector martingale and the martingale of the precipitation reasoner increase (the martingale is used to perform time-series detection; see the recommended reading for details). You can find more videos and the implementation in our GitHub repository.
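The following is a minimal sketch (assumed details, not the released code) of a martingale test in the style of inductive conformal anomaly detection: each new frame's OOD score is converted to a p-value against calibration scores collected in-distribution, and a simple-mixture (power) martingale over a sliding window grows when the stream stops looking exchangeable with the calibration data, i.e., when the scene drifts out of distribution. The window size and calibration data here are placeholders.

```python
# Sketch only: p-values from calibration scores and a power martingale over a window.
import numpy as np

def p_value(score: float, calibration_scores: np.ndarray) -> float:
    """Fraction of calibration scores at least as extreme as the new score."""
    n = len(calibration_scores)
    return (np.sum(calibration_scores >= score) + 1) / (n + 1)

def power_martingale(p_values, epsilons=np.linspace(0.01, 1.0, 100)) -> float:
    """Simple-mixture martingale: average of power martingales over epsilon."""
    p = np.clip(np.asarray(p_values), 1e-6, 1.0)
    # For each epsilon: product over the window of epsilon * p_i^(epsilon - 1).
    per_eps = np.prod(epsilons[:, None] * p[None, :] ** (epsilons[:, None] - 1.0), axis=1)
    return float(per_eps.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    calibration = rng.normal(loc=10.0, scale=1.0, size=500)  # in-distribution OOD scores
    window = []                                              # sliding window of p-values
    for t, score in enumerate([10.1, 9.8, 10.3, 14.0, 15.2, 16.5, 17.0]):
        window.append(p_value(score, calibration))
        window = window[-20:]                                # window length is an assumption
        m = power_martingale(window)
        print(f"t={t}: score={score:.1f}, log-martingale={np.log(m):.2f}")
```

Running one such martingale on the detector's overall OOD score and another on the score of the precipitation reasoner is what lets both values rise together as the rain intensity increases in the demonstration.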

Shreyas Ramakrishna
Senior Architect, System Safety Engineer