This paper explores a way for drones to communicate with one another visually, instead of relying on radio, by borrowing signalling strategies from animals such as bees, deer, and peacocks. The drones use movement patterns and LED lights as a kind of shared "body language," and an AI system translates what each drone sees into an appropriate visual response.
The approach lets a swarm of drones keep coordinating even when normal wireless links are jammed or unavailable. Early tests suggest that motion-based signals are recognized reliably, while LED-only signals are less so; the method looks promising but remains experimental.
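The see-signal-respond loop the summary describes can be sketched as a simple rule table mapping an observed visual signal to a swarm action. This is a minimal illustrative sketch: the signal names, the action names, and the rule table itself are assumptions for demonstration, not the paper's actual signal vocabulary or its AI translation system.

```python
# Hypothetical sketch of a visual-signal response loop.
# Signal and action names below are illustrative assumptions,
# not the paper's actual vocabulary.

RESPONSE_RULES = {
    "waggle_circle": "converge",    # e.g. a bee-inspired recruitment motion
    "tail_flash": "disperse",       # e.g. a deer-inspired alarm signal
    "fan_display": "hold_station",  # e.g. a peacock-inspired presence display
}


def respond(observed_signal: str) -> str:
    """Map an observed visual signal to a swarm action.

    Unrecognized signals fall back to holding position, a
    conservative default when the classifier is unsure.
    """
    return RESPONSE_RULES.get(observed_signal, "hold_station")


print(respond("waggle_circle"))  # converge
print(respond("unknown_blob"))   # hold_station
```

In a real system the string input would come from a perception model classifying another drone's motion or LED pattern, which is where the reported gap between motion-based and LED-only reliability would show up.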


