When insects or pilots fly forward, the image of the environment sweeps backward across their visual field, forming an "optic flow" (OF) that depends on both the groundspeed and the distance to the ground (or to lateral obstacles). We recently presented explicit control schemes explaining how insects may navigate on the sole basis of OF cues, without requiring any distance or speed sensors: how they take off and land, follow terrain, avoid the lateral walls of a corridor, and control their forward speed automatically. An optic flow regulator is a feedback control system that holds the perceived OF at a reference value by adjusting either the lift, the forward thrust, or the lateral thrust. Three such OF regulators account for various insect flight patterns observed over the ground and over still water, under calm and windy conditions, and in straight and tapered corridors. We simulated these OF-based autopilots and implemented them on board two kinds of aerial vehicles: a miniature helicopter for ground avoidance and a miniature hovercraft for lateral obstacle avoidance and cruise control in straight or tapered corridors. The OF sensors we use are electronic sensors that we originally designed in 1986 and have since miniaturized using microcontroller, LTCC, or FPGA technology. They rely on a travel-time principle inspired by the housefly's motion-sensitive neurons, which we had previously analyzed by single-neuron recording combined with single-photoreceptor stimulation. The simple, parsimonious control schemes described here require no conventional avionic devices such as emissive range finders or groundspeed sensors based on Doppler radar or GPS. They meet the low avionic payload requirements of autonomous micro-air and space vehicles, while remaining consistent with the neural repertoire of flying insects.
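The ventral OF regulation principle can be illustrated with a toy simulation. This is a minimal sketch under simplifying assumptions (first-order height dynamics, proportional control, Euler integration), not the authors' published controller: the ventral OF is groundspeed over height, and a proportional feedback on the OF error commands a climb or descent rate in place of a full lift model.

```python
# Toy sketch of a ventral optic-flow regulator (illustrative assumption,
# not the published control scheme). The ventral OF seen by a downward-
# looking eye is omega = v / h (rad/s); the regulator holds omega at a
# setpoint by commanding a vertical speed proportional to the OF error.

def ventral_optic_flow(v, h):
    """Ventral optic flow in rad/s: groundspeed v (m/s) over height h (m)."""
    return v / h

def simulate(v_profile, h0=1.0, omega_set=2.0, k=0.5, dt=0.01):
    """Return the height trajectory under proportional OF regulation.

    v_profile: groundspeed at each time step (m/s)
    omega_set: OF setpoint (rad/s); k: proportional gain; dt: step (s)
    """
    h = h0
    heights = []
    for v in v_profile:
        omega = ventral_optic_flow(v, h)   # "measured" ventral OF
        # Climb when the OF exceeds the setpoint (too low / too fast),
        # descend when it falls below (too high / too slow).
        h += k * (omega - omega_set) * dt
        heights.append(h)
    return heights
```

With a constant groundspeed, the height settles at v/omega_set, so slowing down produces a proportional descent; this is one way to picture how an OF regulator can yield terrain following and automatic landing without any explicit height or speed measurement.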
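The travel-time principle behind the OF sensors can likewise be sketched. In this hedged, simplified model (threshold crossings on idealized noise-free signals; the real sensors use analog processing of photoreceptor outputs), two photoreceptors separated by a small angle delta_phi see the same contrast edge at two successive instants, and the angular speed is the inter-receptor angle divided by the travel time.

```python
# Simplified sketch of the travel-time principle (an assumption for
# illustration, not the actual sensor circuit): a contrast edge crosses
# receptor A, then receptor B; angular speed = delta_phi / travel time.

def travel_time_speed(sig_a, sig_b, dt, delta_phi, threshold=0.5):
    """Estimate angular speed (rad/s) from two sampled photoreceptor signals.

    sig_a, sig_b: samples from the leading and trailing receptors
    dt: sample period (s); delta_phi: inter-receptor angle (rad)
    Returns None when no valid A-then-B crossing is found.
    """
    def first_crossing(sig):
        for i, x in enumerate(sig):
            if x >= threshold:
                return i
        return None

    i_a = first_crossing(sig_a)
    i_b = first_crossing(sig_b)
    if i_a is None or i_b is None or i_b <= i_a:
        return None
    travel_time = (i_b - i_a) * dt
    return delta_phi / travel_time
```

The estimate is inversely proportional to the travel time, so a faster-moving image (higher OF) yields a shorter delay between the two receptor responses.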