Thanks to advances in robotics and intelligent transportation systems (ITS), the autonomous vehicles of the future are gradually becoming a reality. Since autonomous vehicles will have to behave safely in the presence of other vehicles, pedestrians and other fixed and moving objects, one of their most important requirements is to perceive effectively both their own motion and the surrounding environment. In this thesis, we first investigated how bio-inspired visual sensors called Local Motion Sensors (LMSs), which compute 1-D optic flow (OF) from a few pixels on the basis of findings on the fly's visual system, could be used to improve automatic parking maneuvers. For this purpose, we developed a low-computational-cost method for detecting and tracking a parking spot in real time, using only the 1-D OF measured around the vehicle together with the vehicle's longitudinal velocity and steering angle. Highly simplified 2-D parking simulations were first performed in MATLAB/Simulink, and some preliminary experiments were then carried out on a vehicle equipped with two 6-pixel LMSs.

As the main challenge for visual sensors is to operate correctly under high-dynamic-range lighting conditions, we also worked on a novel bio-inspired auto-adaptive silicon retina designed and developed by our laboratory in collaboration with the Center of Particle Physics of Marseille (CPPM). We successfully tested this silicon retina, showing that its novel pixel, called M2APix (Michaelis-Menten Auto-Adaptive Pixel), can auto-adapt over a 7-decade light range and respond appropriately to step changes of up to 3 decades, while remaining sensitive to contrasts as low as 2%.
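The scale invariance underlying this kind of auto-adaptation can be illustrated with the steady-state Michaelis-Menten normalization. The sketch below is illustrative only: the exponent `n`, the adaptation gain `alpha`, and the function names are assumptions chosen for the example, not the actual M2APix circuit parameters.

```python
def mm_response(I, sigma, n=0.7, Vm=1.0):
    """Steady-state Michaelis-Menten normalization.

    V = Vm * I^n / (I^n + sigma^n), where sigma is the adaptation
    state that tracks the mean light level. Because the output depends
    only on the ratio I/sigma, a given contrast produces the same
    response at any mean luminance -- the basis of multi-decade
    auto-adaptation. (n and Vm are illustrative values, not the
    M2APix circuit parameters.)
    """
    return Vm * I**n / (I**n + sigma**n)

def adapt_sigma(sigma, I, alpha=0.05):
    """First-order low-pass adaptation of sigma toward the input level
    (a simple stand-in for the pixel's slower adaptation pathway)."""
    return sigma + alpha * (I - sigma)
```

Once `sigma` has converged to the local mean luminance, a 2% contrast step (I = 1.02 * sigma) yields the same output change whether the mean level sits at the bottom or the top of the operating range, which is how the response to contrast can stay constant across decades of illuminance.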
We subsequently developed and tested a novel optic flow sensor based on this auto-adaptive retina, together with a new robust method for computing the optic flow, which offers several advantages over previously developed optic flow sensors, such as robustness to the light levels, textures and vibrations encountered on the road. To assess the performance of this novel sensor and show how it can serve robotic and automotive applications such as visual odometry, we built a car-like robot, called BioCarBot, which estimates its velocity and steering angle by means of an extended Kalman filter (EKF), using only the optic flow measurements delivered by two downward-facing sensors of this kind. Indoor and outdoor experiments were successfully carried out over a 7-decade light-level range and on various textures, demonstrating the promise of these sensors for odometry-based applications.
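To illustrate the kind of estimation involved, the sketch below implements a minimal EKF that recovers forward speed and steering angle from two downward-facing optic flow measurements, assuming a kinematic bicycle model. All geometry (sensor height `H`, lateral offset `D`, wheelbase `L`), the noise settings, and the function names are illustrative assumptions, not BioCarBot's actual parameters or the method used in the thesis.

```python
import math

# Illustrative geometry (assumed, not BioCarBot's actual values):
H = 0.10   # height of the downward-facing sensors above the ground [m]
D = 0.05   # lateral offset of each sensor from the centerline [m]
L = 0.25   # wheelbase of the car-like robot [m]

def predict_flow(v, delta):
    """Optic flow [rad/s] expected at the left/right downward sensors.

    Kinematic bicycle model: yaw rate = v*tan(delta)/L, so the ground
    speed under a sensor offset by +/-D is v*(1 +/- D*tan(delta)/L),
    and its translational optic flow is that speed divided by H."""
    k = D * math.tan(delta) / L
    return [v * (1.0 + k) / H, v * (1.0 - k) / H]

def flow_jacobian(v, delta):
    """2x2 Jacobian of predict_flow with respect to the state (v, delta)."""
    t = math.tan(delta)
    sec2 = 1.0 + t * t                     # derivative of tan(delta)
    return [[(1.0 + D * t / L) / H,  v * D * sec2 / (L * H)],
            [(1.0 - D * t / L) / H, -v * D * sec2 / (L * H)]]

def mul(A, B):                              # 2x2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def T(A):                                   # 2x2 transpose
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

def inv(A):                                 # 2x2 matrix inverse
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

def ekf_step(x, P, z, q=1e-2, r=1e-4):
    """One EKF cycle: random-walk prediction, then optic flow update."""
    # Prediction: state unchanged, covariance inflated by process noise q.
    P = [[P[0][0] + q, P[0][1]], [P[1][0], P[1][1] + q]]
    Hj = flow_jacobian(x[0], x[1])
    zp = predict_flow(x[0], x[1])
    y = [z[0] - zp[0], z[1] - zp[1]]        # innovation
    S = mul(mul(Hj, P), T(Hj))              # innovation covariance
    S = [[S[0][0] + r, S[0][1]], [S[1][0], S[1][1] + r]]
    K = mul(mul(P, T(Hj)), inv(S))          # Kalman gain
    x = [x[0] + K[0][0] * y[0] + K[0][1] * y[1],
         x[1] + K[1][0] * y[0] + K[1][1] * y[1]]
    KH = mul(K, Hj)
    I_KH = [[1.0 - KH[0][0], -KH[0][1]], [-KH[1][0], 1.0 - KH[1][1]]]
    P = mul(I_KH, P)                        # covariance update
    return x, P
```

With two sensors and a 2-state model, the system is fully observable whenever v is nonzero, so feeding the filter measurements generated from a true state drives the estimate toward the true speed and steering angle over successive iterations.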