This paper presents a new bio-inspired artificial eye composed of only 24 pixels, able to locate a target accurately within its small field of view. The proposed eye combines 4 hyperacute minimalist eyes [1], extending previous works [2,3] to the 2-D localization case. The linearity and the fields of view of the visual sensors were improved thanks to new fusion algorithms and an efficient calibration procedure. The eye, oriented toward the ground, is mounted on a very light custom-made pan-tilt system which enables it to track a moving target faithfully and smoothly. It is shown here that such a system can be embedded onto a small quadrotor to achieve accurate positioning with respect to the target. The quadrotor used in this work is an open-source platform described in detail in [4], and the algorithms are implemented thanks to a new fast-prototyping open-source Matlab/Simulink toolbox [5]. The attitude of the robot is estimated using a complementary filter [6] implemented with quaternions, and attitude setpoints are tracked using a quaternion-based geometric controller [7]. The positions and translational speeds are estimated using an EKF fusing the visual information. The decoupling system is enhanced with bio-inspired reflexes which efficiently stabilize the gaze of the robot during manoeuvres and disturbances [8,9,10]. The use of very few pixels allows the visual algorithm to run at a refresh rate as high as 400 Hz. This high refresh rate, coupled with very fast mechanical active decoupling, makes the aerial robot able to track a target moving at speeds up to 200°/s, and to reach rotational speeds of 800°/s during disturbance rejection. Finally, we demonstrate experimentally that such a system achieves accurate visual hovering and tracking.
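To make the attitude-estimation step concrete, the following is a minimal sketch of a Mahony-style quaternion complementary filter of the kind cited in [6], fusing a rate gyro with an accelerometer gravity reference. It is not the authors' implementation: the gain `kp`, the time step `dt`, and the [w, x, y, z] body-to-world quaternion convention are assumptions made for illustration.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def complementary_filter_step(q, gyro, accel, kp=1.0, dt=0.0025):
    """One update of a Mahony-style complementary filter.

    q     : current attitude estimate, unit quaternion (body to world)
    gyro  : angular rate in body frame [rad/s]
    accel : accelerometer reading in body frame (gravity reference)
    kp, dt: illustrative gain and sample time (assumed values)
    """
    w, x, y, z = q
    # Gravity direction predicted in the body frame (third row of R(q))
    v_hat = np.array([2*(x*z - w*y), 2*(y*z + w*x), w*w - x*x - y*y + z*z])
    # Innovation: cross product of measured and predicted gravity directions
    a = accel / np.linalg.norm(accel)
    e = np.cross(a, v_hat)
    # Gyro rate corrected by the proportional attitude-error feedback
    omega = gyro + kp * e
    # First-order quaternion integration, then renormalization
    q_dot = 0.5 * quat_mult(q, np.concatenate(([0.0], omega)))
    q = q + q_dot * dt
    return q / np.linalg.norm(q)
```

With zero gyro bias and the robot at rest, repeated calls drive the roll/pitch estimate toward the accelerometer's gravity direction, while the gyro term dominates during fast manoeuvres; yaw remains unobservable from gravity alone, which is why such filters are typically complemented by another heading reference.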
The positioning and tracking performances of the proposed algorithms were evaluated in a flying arena equipped with 17 Vicon T-40s cameras, delivering a highly reliable ground truth.

References

[1] S. Viollet, "Vibrating makes for better seeing: from the fly's micro eye movements to hyperacute visual sensors," Frontiers in Bioengineering and Biotechnology, vol. 2, no. 9, 2014.
[2] L. Kerhuel, S. Viollet, and N. Franceschini, "The VODKA sensor: A bio-inspired hyperacute optical position sensing device," IEEE Sensors Journal, vol. 12, no. 2, pp. 315-324, Feb. 2012.
[3] R. Juston, L. Kerhuel, N. Franceschini, and S. Viollet, "Hyperacute edge and bar detection in a bioinspired optical position sensing device," IEEE/ASME Transactions on Mechatronics, vol. 19, no. 3, pp. 1025-1034, June 2014.
[4] A. Manecy, N. Marchand, F. Ruffier, and S. Viollet, "X4-MaG: A low-cost open-source micro-quadrotor and its Linux-based controller," International Journal of Micro Air Vehicles (IJMAV), accepted for publication.
[5] A. Manecy, N. Marchand, and S. Viollet, "RT-MaG: An open-source SIMULINK toolbox for real-time robotic applications," IEEE International Conference on Robotics and Biomimetics (ROBIO), 2014, Bali, Indonesia.
[6] R. Mahony, T. Hamel, and J.-M. Pflimlin, "Nonlinear complementary filters on the special orthogonal group," IEEE Transactions on Automatic Control, vol. 53, no. 5, pp. 1203-1218, 2008.
[7] T. Lee, M. Leoky, and N. McClamroch, "Geometric tracking control of a quadrotor UAV on SE(3)," 49th IEEE Conference on Decision and Control (CDC), 2010, pp. 5420-5425. doi:10.1109/CDC.2010.5717652.
[8] A. Manecy, S. Viollet, and N. Marchand, "Bio-inspired hovering control for an aerial robot equipped with a decoupled eye and a rate gyro," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012, Vilamoura, Algarve, Portugal, pp. 1110-1117. doi:10.1109/IROS.2012.6385853.
[9] A. Manecy, R. Juston, N. Marchand, and S. Viollet, "Decoupling the eye: A key toward a robust hovering for sighted aerial robots," Advances in Aerospace Guidance, Navigation and Control, part III, Springer, 2013, pp. 317-336. doi:10.1007/978-3-642-38253-6_20.
[10] A. Manecy, N. Marchand, and S. Viollet, "Hovering by gazing: A novel strategy for implementing saccadic flight-based navigation in GPS-denied environments," International Journal of Advanced Robotic Systems, 2014, 11:66. doi:10.5772/58429.