Interest in autonomous robotics is continually expanding, especially in the domain of micro air vehicles. Much research focuses on these small aircraft, aiming both to miniaturize them and to make their navigation more autonomous. To do so, this research must address many challenges, such as increasing the power-to-mass ratio, precisely stabilizing flight, and extending flight endurance. But the real challenge in navigating these micro robots is perhaps to give them a highly reliable and reactive visual perception of their environment. The conventional solution is visual perception by means of optical cameras. While these cameras provide rich information through high-resolution images, this type of perception requires extensive computing resources to process the massive flow of data generated. It therefore currently appears difficult to miniaturize a microdrone equipped with such a system while keeping it reactive to an unknown and unpredictable environment. Nature, however, shows that some winged insects have managed to reconcile miniaturization with high-speed flight despite their tiny size. The honeybee, a flying insect weighing about 100 mg and measuring about 13 mm in length, can fly several kilometers at a maximum speed of about 8 m/s without colliding with surrounding obstacles. Many studies have shown that this ability relies mainly on the detection of visual motion, also called optic flow. This PhD thesis explores a parsimonious vision system dedicated to short-range navigation, using innovative self-adaptive visual sensors composed of only 12 pixels whose optical properties are inspired by those of the honeybee. Two optic flow measurement algorithms are first compared under ideal conditions over 5 decades of irradiance and 3 decades of optical velocity, then tested under real flight conditions.
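To illustrate the kind of low-cost optic flow measurement such few-pixel sensors allow, the classic time-of-travel principle can be sketched as follows: the angular velocity of a contrast passing over two adjacent photoreceptors is the inter-receptor angle divided by the delay between the two signals. This is a minimal sketch under assumed conventions; the function name, threshold scheme, and parameters are illustrative, not the thesis's actual algorithms.

```python
import numpy as np

def time_of_travel_optic_flow(sig_a, sig_b, dt, delta_phi_deg, threshold):
    """Estimate optic flow (deg/s) from two adjacent photoreceptor signals.

    A moving contrast excites receptor A, then receptor B. The optic flow
    is the inter-receptor angle delta_phi_deg divided by the measured delay
    between the two threshold crossings. Illustrative sketch only.
    """
    def first_crossing(sig):
        # Index of the first sample at or above the detection threshold.
        above = np.flatnonzero(np.asarray(sig) >= threshold)
        return int(above[0]) if above.size else None

    ia, ib = first_crossing(sig_a), first_crossing(sig_b)
    if ia is None or ib is None or ib == ia:
        return None  # no measurable delay between the two signals
    delay = (ib - ia) * dt  # seconds between the two crossings
    return delta_phi_deg / delay
```

For example, with a 4 degree inter-receptor angle and a contrast reaching the second pixel 20 ms after the first, the estimated flow is 200 deg/s. Its appeal for microdrones is that it needs only two signals and a subtraction, not a full image pipeline.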
The more robust and efficient of the two algorithms, thanks to its very low computational requirements, was embedded on board a micro quadrotor weighing about 400 g and equipped with a parsimonious 96-pixel visual system stabilized by an articulated gimbal in roll and pitch to compensate for the quadrotor's rotations. The navigation strategies observed in honeybees were simulated in virtual environments (tunnels 6 m or 12 m long with a minimum cross-section of 25 cm or 50 cm), and the feasibility of optic flow detection on board a micro quadrotor was demonstrated under real flight conditions in an experimental room (a 4 m flight at a minimum distance of 50 cm). Coupled with honeybee-inspired navigation strategies, this innovative visual system dedicated to motion perception will in the near future enable navigation in cluttered or confined environments.
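The honeybee-inspired corridor strategies mentioned above are often described as two proportional regulations: balancing the left and right lateral optic flows to center the vehicle, and holding their sum at a setpoint so that forward speed drops in narrow sections. A minimal sketch of such a controller follows; the sign conventions, gains, and function name are illustrative assumptions, not the thesis's actual control laws.

```python
def centering_commands(flow_left, flow_right, flow_setpoint,
                       k_lat=1.0, k_fwd=1.0):
    """Toy corridor-following controller based on lateral optic flows.

    Convention (assumed): a positive lateral command moves the vehicle
    to the right. Flying closer to the left wall raises the left flow,
    which pushes the vehicle rightward until the two flows balance.
    The forward command regulates the flow sum toward a setpoint, so
    the vehicle naturally slows down as the corridor narrows.
    """
    lateral = k_lat * (flow_left - flow_right)           # centering term
    forward = k_fwd * (flow_setpoint - (flow_left + flow_right))  # speed term
    return lateral, forward
```

With such a scheme the distance to the walls never needs to be measured explicitly: both commands are computed directly from the same few optic flow signals the parsimonious sensor already provides.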