Navigating insects such as desert ants are known to robustly estimate their position relative to their nest while foraging for food, over several hundred meters across hostile environments, by means of very low-resolution visual information processing. This tour de force is a great source of inspiration for designing smart, parsimonious and robust solutions enabling robots of any size to navigate in global navigation satellite system (GNSS)-denied or cluttered environments. In this study, we introduce a new insect-inspired omnidirectional visual sensor (640×120 pixels; 120 fps). Its inter-pixel angle is 0.6° and its acceptance angle is 1.5°, values comparable to those observed in predatory flying insects. This sensor was embedded on board the AntBot robot, a six-legged walking robot mimicking desert ants at the morphological, locomotive and sensing levels. Despite the residual visual oscillations of the field of view while walking, the robot successfully detected fixed obstacles and located itself with an error as low as 25 ± 10 cm, which corresponds to an average error of only 3 strides (hexapod stride length: 8.2 cm) after a 9 m-long journey. This suggests that low-acuity visual sensors, which inherently require few computational resources, are good candidates for ant-like familiarity-based navigation in cluttered environments.
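As a sanity check on the figures quoted above, the localization error can be expressed both in strides and as a fraction of the journey length. This is a minimal back-of-envelope sketch using only the numbers stated in the text (the "relative error" quantity is a derived convenience, not a figure from the study):

```python
# Back-of-envelope check of the reported localization figures.
# All input values are taken from the text; relative_error is derived here.

homing_error_cm = 25.0      # mean localization error (± 10 cm)
stride_length_cm = 8.2      # hexapod stride length
journey_length_cm = 900.0   # 9 m-long journey

strides = homing_error_cm / stride_length_cm          # error expressed in strides
relative_error = homing_error_cm / journey_length_cm  # error as a fraction of path length

print(f"error in strides: {strides:.1f}")       # ~3.0 strides, as stated
print(f"relative error: {relative_error:.1%}")  # ~2.8% of the journey length
```

The rounded result (about 3 strides, under 3% of the total path) matches the figures given in the abstract.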