An ever-increasing number of human-machine interfaces have embraced touchscreens as their central component: smartphones, laptops, public terminals, and more. Their success has been particularly noticeable in the automotive industry, where physical buttons have been replaced by touchscreens that control multiple elements of the driving environment. However, unlike physical buttons, these interfaces possess no tangible elements that allow users to feel where the commands are. Without tactile feedback, users must rely on visual cues, and simple adjustment tasks become significant distractions that can lead to dangerous situations while driving. Recently, haptic touchscreens have emerged to restore tangibility to these interfaces by rendering the sensation of textures and shapes through friction modulation. However, we still lack a good understanding of how humans perceive these synthetic textures, an understanding that is crucial for designing meaningful and intuitive haptic interfaces. In this thesis, I first show that the perception thresholds of friction-modulated textures are similar to vibrotactile thresholds. I then investigate the perception of haptic gradients, i.e., textures whose spatial frequency changes gradually, and deduce a law describing the minimal exploration distance required to perceive a given gradient. This law resembles the auditory perception of rhythm variations, which suggests common mechanisms between the two modalities. Finally, I demonstrate that gradient haptic feedback can guide a user to adjust a setting on an interface without vision. These findings shed new light on haptic perception and its multisensory interactions, and open up new possibilities for human-machine interaction.