Although deep learning has received extensive attention and achieved excellent performance in various scenarios, it remains vulnerable to adversarial examples. In particular, physical attacks pose a greater threat than digital attacks. However, existing research pays little attention to physical attacks on object detection in remote sensing images (RSIs). In this work, we systematically analyze universal adversarial patch attacks for multi-scale objects in the remote sensing field. Adversarial attacks on RSIs face two challenges. On the one hand, remote sensing images contain far more objects than natural images, so it is difficult for an adversarial patch to take effect on all objects when attacking a detector on RSIs. On the other hand, the wide range of photography platform altitudes causes object sizes to vary greatly, which makes it challenging to generate a universal adversarial perturbation for multi-scale objects. To this end, we propose an adversarial attack method on object detection for remote sensing data. A key idea of the proposed method is a novel optimization of the adversarial patch: we attack as many objects as possible by formulating a joint optimization problem. Besides, we introduce a scale factor to generate a universal adversarial patch that adapts to multi-scale objects, which ensures the patch remains valid for objects of different sizes in the real world. Extensive experiments demonstrate the superiority of our method over state-of-the-art methods on YOLO-v3 and YOLO-v5. In addition, we validate the effectiveness of our method in real-world applications.
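The two key ideas above, a joint objective over all detected objects and a scale factor that resizes the patch to each object, can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual implementation: the helper names `resize_patch` and `joint_attack_loss` are hypothetical, the resize uses simple nearest-neighbor sampling, and the gradient-based patch update is omitted.

```python
import numpy as np

def resize_patch(patch, scale):
    """Rescale a square patch by a scale factor (hypothetical helper).

    Mimics adapting one universal patch to objects of different sizes:
    the same patch content is resampled (nearest neighbor) to the target size.
    """
    n = patch.shape[0]
    m = max(1, int(round(n * scale)))
    # map each output pixel back to a source pixel index
    idx = np.clip((np.arange(m) / scale).astype(int), 0, n - 1)
    return patch[np.ix_(idx, idx)]

def joint_attack_loss(objectness_scores):
    """Joint objective over ALL objects in the image (hypothetical form).

    Instead of suppressing one object at a time, the loss aggregates the
    detector's objectness scores for every object, so minimizing it pushes
    the patch to hide as many objects as possible simultaneously.
    """
    return float(np.mean(objectness_scores))

# One universal patch, applied at two object scales
patch = np.random.rand(32, 32)
small = resize_patch(patch, 0.5)   # for a distant / small object -> 16x16
large = resize_patch(patch, 2.0)   # for a close / large object   -> 64x64

# Joint loss over the (mock) objectness scores of all objects in a scene
loss = joint_attack_loss(np.array([0.9, 0.7, 0.8]))
```

In a real attack, `loss` would be minimized by gradient descent on the patch pixels, with the scale factor sampled per object so that one patch stays effective across the size range induced by varying platform altitudes.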