Event-based visual guidance inspired by honeybees in a 3D tapered tunnel

  • Serres, J.
  • Raharijaona, T.
  • Vanhoutte, E.
  • Ruffier, F.


In view of neuro-ethological findings on honeybees and our previously developed vision-based autopilot, in-silico experiments were performed in which a "simulated bee" was made to travel along a doubly tapering tunnel under an autopilot that included, for the first time, event-based controllers. The "simulated bee" was equipped with:

  • a minimalistic compound eye comprising 10 local motion sensors measuring the optic flow magnitude,
  • two optic flow regulators updating the control signals whenever specific optic flow criteria changed,
  • and three event-based controllers acting on the error signals, each one in charge of its own translational dynamics.

A MORSE/Blender-based simulation engine delivered what each of the 20 "simulated photoreceptors" saw in the tunnel, which was lined with high-resolution natural 2D images. The "simulated bee" managed to travel safely along the doubly tapering tunnel without requiring any speed or distance measurements, using only a Gibsonian point of view, by:

  • concomitantly adjusting its side thrust, vertical lift, and forward thrust whenever a change was detected in the optic-flow-based error signals,
  • avoiding collisions with the surfaces of the doubly tapering tunnel and decreasing or increasing its speed depending on the clutter rate perceived by the motion sensors.
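The central mechanism described above is the event-based update rule: each control signal is held constant and recomputed only when its optic-flow error signal changes appreciably. The following is a minimal Python sketch of that idea, not the authors' implementation; the class name, the proportional control law, and the set-point, gain, and event-threshold values are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of an event-based
# optic-flow regulator: the control signal is recomputed only when the
# optic-flow error signal changes beyond a threshold, and held otherwise.

class EventBasedOFRegulator:
    def __init__(self, of_setpoint, gain, event_threshold):
        self.of_setpoint = of_setpoint          # optic-flow set-point (rad/s)
        self.gain = gain                        # assumed proportional gain
        self.event_threshold = event_threshold  # minimum error change triggering an event
        self.last_error = 0.0
        self.command = 0.0                      # last command, held between events

    def update(self, of_measured):
        """Return a thrust command; recompute it only when an 'event' occurs."""
        error = self.of_setpoint - of_measured
        # Event condition: the error signal has changed appreciably
        if abs(error - self.last_error) > self.event_threshold:
            self.command = self.gain * error    # recompute the control signal
            self.last_error = error
        return self.command                     # otherwise hold the last value


# Illustrative use: one such controller per translational degree of
# freedom (side thrust, vertical lift, forward thrust), here the forward axis.
forward = EventBasedOFRegulator(of_setpoint=4.6, gain=0.8, event_threshold=0.05)
of_samples = [4.0, 4.0, 4.9, 4.9, 4.2]  # made-up optic-flow readings (rad/s)
for of in of_samples:
    print(forward.update(of))  # command changes only on the 1st, 3rd, and 5th samples
```

Holding the command between events is what makes the scheme event-based: computation and actuation are triggered by changes in the optic-flow criterion rather than by every simulation tick.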