Authors: Rachel Blin, Samia Ainouz, Stéphane Canu, Fabrice Meriaudeau

Description: Road scene analysis is a fundamental task for both autonomous vehicles and ADAS systems. Nowadays, autonomous vehicles can reliably detect objects in the scene in good weather, but improvements are still needed when visibility is degraded. It is often claimed that combining non-conventional sensors (infrared, LiDAR, etc.) with classical vision enhances road scene analysis, yet this holds only when conditions are optimal. In this work, we present the improvements achieved using polarimetric imaging in the challenging setting of adverse weather conditions. This rich modality is known for its ability to describe an object not only by its intensity but also by its physical properties, even under poor illumination and strong reflections. Experimental results on our new multimodal dataset show that polarimetric imaging provides features that generalize to both good and adverse weather conditions. By combining polarimetric images with an adapted learning model, the detection tasks in adverse weather conditions were improved by about 27%.
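As background on the "physical information" mentioned above (a sketch, not the authors' code): division-of-focal-plane polarimetric cameras typically record intensities behind polarizers oriented at 0°, 45°, 90° and 135°, from which the linear Stokes parameters, degree of linear polarization (DoLP) and angle of polarization (AoP) are derived. The helper below is illustrative only:

```python
import numpy as np

def stokes_dolp(i0, i45, i90, i135):
    """Compute linear Stokes parameters, DoLP and AoP from intensities
    measured behind polarizers at 0, 45, 90 and 135 degrees.
    Illustrative helper, not the authors' code."""
    s0 = i0 + i90                    # total intensity
    s1 = i0 - i90                    # horizontal vs vertical polarization
    s2 = i45 - i135                  # +45 vs -45 diagonal polarization
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-8)  # in [0, 1]
    aop = 0.5 * np.arctan2(s2, s1)   # angle of polarization, radians
    return s0, s1, s2, dolp, aop

# Example: fully horizontally polarized light (i0=1, i90=0, i45=i135=0.5)
s0, s1, s2, dolp, aop = stokes_dolp(
    np.float64(1.0), np.float64(0.5), np.float64(0.0), np.float64(0.5)
)
```

Channels such as DoLP and AoP are what make polarimetric features informative under strong reflections (e.g. wet roads), where intensity alone is ambiguous.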