Submitted:
23 July 2024
Posted:
24 July 2024
Abstract
Keywords:
1. Introduction
1.1. Need and Problem Statement
- Train an AI-based classifier to recognize ‘good’ and ‘bad’ flight behaviour in real-time, aiding in the development of a virtual AI co-pilot for immediate feedback;
- Develop a mission planning tool assigning the best pilot for specific missions based on performance scores and stress models;
- Identify and rank human factors impacting flight performance, linking stress factors to performance;
- Update training procedures to account for human factors, improving pilot awareness and performance.
- A drone simulator tracking operator performance under realistic conditions in standardized environments (discussed in Section 3);
- A method for quantitative evaluation of drone-based video quality (discussed in Section 3.3);
- A methodology for modelling human performance in drone operations, relating human factors to operator performance (discussed in Section 4);
- An AI co-pilot offering real-time flight performance guidance (discussed in Section 5.1);
- A flight assistant tool for generating optimal flight trajectories (discussed in Section 5.2);
- A mission planning tool for optimal pilot assignment (discussed in Section 5.3);
- An iterative training improvement methodology based on quantitative input (discussed in Section 5.4).
1.2. Related Work in the Domain of Drone Simulation
1.3. Related Work in the Domain of Quantitative Evaluation of Drone Pilot Performance
1.4. Related Work in the Domain of Video Quality Analysis
- Full-Reference Methods: These extract data from high-quality, non-degraded source signals and are often derivatives of PSNR [35], commonly used for video codec evaluation (a toy PSNR sketch follows this list);
- Reduced-Reference Methods: These extract data from a side channel containing signal parameter data;
- No-Reference Methods: These evaluate video quality without using any source information.
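As a concrete toy example of the full-reference family, the sketch below computes PSNR between a reference frame and a degraded copy. This is our illustration, not code from the paper; the 8-bit peak value and frame sizes are assumptions.

```python
import numpy as np

def psnr(reference: np.ndarray, degraded: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio between two equally sized 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy usage: a random reference frame and a noisy copy of it.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, noisy):.1f} dB")
```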
1.5. Related Work in the Domain of Human Performance Modelling for Drone Operations
1.6. Related Work in the Domain of AI Assistance for Drone Operations
2. Overview of the Proposed Evaluation Framework and Its Position Relative to the State of the Art
3. Virtual Environment for Quantitative Assessment
3.1. Software Framework
3.1.1. PX4
3.1.2. Mavlinkrouter
3.1.3. Mavlink Interface
3.1.4. Standard Scripted Scenarios
3.1.5. Unreal Engine
- Next to the standard First-Person-View camera, we introduce a ground-based observer viewpoint camera, which renders the environment as seen through the eyes of the pilot. The simulator starts each operation in this ground-based observer viewpoint. As the environment is very large and most operations take place Beyond Visual Line Of Sight (BVLOS), the operator will usually switch to the First-Person-View after a while.
- Measure the number of collisions with the environment, as this is a parameter for the pilot performance assessment
- Measure the geo-location of any target (static camp or dynamic enemies)
- Calculate at any moment the minimum distance to any enemy
- Calculate the detectability of the drone, using a detectability model that is a function of the distance between the drone and the enemy and of the noise model of the drone type used [77,78] (a simplified sketch follows this list).
- Sound an alarm when the drone is detected
- Include a battery depletion timer (set at 25 min). The drone crashes if it is not landed within the set time.
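The detectability bullet above can be illustrated with a deliberately simplified sketch. The spherical-spreading law, the source levels, and the hearing threshold below are placeholder assumptions for illustration only, not the calibrated aural detectability models of [77,78].

```python
import math

# Hypothetical sound source levels at 1 m for two drone types (dB SPL).
DRONE_NOISE_DB_AT_1M = {"small_quad": 70.0, "large_quad": 80.0}

def received_level_db(source_db_at_1m: float, distance_m: float) -> float:
    """Spherical spreading: level drops ~6 dB per doubling of distance."""
    return source_db_at_1m - 20.0 * math.log10(max(distance_m, 1.0))

def is_detected(drone_type: str, distance_to_enemy_m: float,
                hearing_threshold_db: float = 40.0) -> bool:
    """True if the drone's sound at the enemy position exceeds the threshold."""
    level = received_level_db(DRONE_NOISE_DB_AT_1M[drone_type], distance_to_enemy_m)
    return level >= hearing_threshold_db

# Example: sound the alarm when the closest enemy can hear the drone.
if is_detected("small_quad", distance_to_enemy_m=25.0):
    print("ALARM: drone detected")
```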
- We want to avoid measuring the side-effects of virtual embodiment, to which some pilots may be subject.
- Virtual reality would obstruct the use of exteroceptive sensing tools for measuring the physiological state of the pilot during the test.
3.1.6. Standard Test Environment
- A standardised visual acuity object. It consists of a mannequin with a letter written on its chest plate, as shown in Figure 4. The user is requested to read out the letter (as in an ophthalmologist's exam). The letters can be changed dynamically for every simulation, and the mannequins can be spread randomly over the environment. To increase the level of difficulty, some of the mannequins are placed inside enclosures, so that the letter is legible only under certain viewing angles, making the task even harder for the drone pilot.
- An enemy camp. It consists of a series of tents, military installations and guarded watchtowers, as shown in Figure 4.
3.1.7. AirSim
3.1.8. Dynamic Environment Generator
3.1.9. QGroundControl
3.1.10. Logging Systems
- The MavLink interface records a series of parameters related to the drone itself (its position, velocity and acceleration, control parameters, vibrations, ...) from the MavLink datastream.
- The Dynamic Environment Generator records the environmental conditions (wind direction and speed, density of rain, snow and fog, ...) and the presence of auditory disruptions (type of audio track, sound intensity, ...).
- The Unreal-based simulation records the number of collisions with the environment, the geo-location of all targets, the minimum distance to any enemy and the detectability of the drone (a minimal logging sketch follows this list).
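As an illustration of the first logging channel, the sketch below records a few attitude and velocity parameters from a MAVLink stream to CSV using pymavlink. The connection address, the chosen message types and the CSV layout are our assumptions, not the project's actual logger.

```python
import csv
from pymavlink import mavutil

# Connect to the MAVLink stream forwarded by the router (address is an assumption).
conn = mavutil.mavlink_connection("udp:127.0.0.1:14550")
conn.wait_heartbeat()

with open("flight_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_boot_ms", "roll", "pitch", "yaw", "vx", "vy", "vz"])
    for _ in range(1000):  # log a fixed number of samples for this example
        att = conn.recv_match(type="ATTITUDE", blocking=True)
        pos = conn.recv_match(type="LOCAL_POSITION_NED", blocking=True)
        writer.writerow([att.time_boot_ms, att.roll, att.pitch, att.yaw,
                         pos.vx, pos.vy, pos.vz])
```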
3.1.11. Video Quality Analysis Tool - See Section 3.2
3.1.12. A Mission Performance Analysis Tool - See Section 4.1
3.2. Quantitative Evaluation of Drone-Based Video Quality
3.2.1. Concept
3.2.2. Methodology
- We assume that the drone camera is always directed at the target. This assumption simplifies the algorithm by avoiding (dynamic) viewpoint adjustments based on drone movement. It is a realistic scenario: in actual operations, a separate camera gimbal operator typically ensures that the camera remains focused on the target. The task can also be automated using visual servoing methodologies [67], which we assume to be in place in this paper.
- To ensure uniform perception of the target from various viewing angles, we assume the target has a perfect spherical shape. While this is an approximation and may differ for targets with non-spherical shapes, it is the most generic assumption and can be refined if specific target shapes are more applicable to particular uses.
- Since the zoom factor is not dynamically available to the algorithm, we assume a static zoom factor.
- The input parameters for the video quality assessment algorithm are the drone's position $\mathbf{p}_d(t)$ at a given time instance $t$ and the target's position $\mathbf{p}_t$, which is assumed to remain static throughout the video sequence.
- The number of pixels on target $Q_{\text{pix}}(t)$. It is well-known that for machine vision image interpretation algorithms (e.g., human detection [68], vessel detection [69]), the number of pixels on target is crucial for predicting the success of the image interpretation algorithm. Similarly, for human image interpretation, Johnson's criteria [70] indicate that the ability of human observers to perform visual tasks (detection, recognition, identification) depends on the image resolution on the target. Given a constant zoom factor, the number of pixels on target is inversely proportional to the distance between the drone and the target, such that:
$$Q_{\text{pix}}(t) = \frac{c}{\lVert \mathbf{p}_d(t) - \mathbf{p}_t \rVert} \tag{1}$$
where $c$ is a constant parameter ensuring that $Q_{\text{pix}}(t) \in [0, 1]$, dependent on the minimum distance between the drone and the target, the camera resolution, and the focal length.
- The data innovation $Q_{\text{inn}}(t)$. As discussed in the introduction, assessing the capability of drone operators to obtain maximum information about a target in minimal time is crucial. The data innovation metric evaluates the quality of new video data. This is achieved by maintaining a viewpoint history memory $\mathcal{A} = \{\mathbf{a}(t_1), \dots, \mathbf{a}(t_{k-1})\}$, which stores all normalized incident angles of previous viewpoints. The current incident angle $\mathbf{a}(t_k)$ is compared to this memory by calculating the norm of the difference between the current and previous incident angles. The data innovation is the smallest of these norms, representing the distance to the closest viewpoint on a unit sphere:
$$Q_{\text{inn}}(t_k) = \min_{i < k} \, \lVert \mathbf{a}(t_k) - \mathbf{a}(t_i) \rVert \tag{2}$$
New viewpoints should be as distinct as possible from existing ones, as expressed by (2).
- The trajectory smoothness $Q_{\text{smooth}}(t)$. High-quality video requires a smooth drone trajectory over time. Irregular motion patterns make the video signal difficult to interpret by human operators or machine vision algorithms. The metric evaluates trajectory smoothness by maintaining a velocity profile $\mathcal{V} = \{\mathbf{v}(t_1), \dots, \mathbf{v}(t_{k-1})\}$, which stores all previous velocities. The current velocity $\mathbf{v}(t_k)$ is compared to the $n$ most recent velocities. The norm of the difference between the current and previous velocities is weighted by recency. The weighted sum of the $n$ most recent velocity differences measures changes in the motion profile and is inversely proportional to the trajectory smoothness:
$$Q_{\text{smooth}}(t_k) \propto \left( \sum_{i=1}^{n} w_i \, \lVert \mathbf{v}(t_k) - \mathbf{v}(t_{k-i}) \rVert \right)^{-1} \tag{3}$$
where the weights $w_i$ decrease for less recent samples (a numerical sketch of these three terms follows this list).
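A small numerical sketch of the three terms, following Equations (1)-(3) as written above; the normalization constant `c`, the recency weights and the history handling are illustrative assumptions.

```python
import numpy as np

def pixels_on_target(drone_pos, target_pos, c=10.0):
    """Eq. (1): inversely proportional to drone-target distance, capped at 1."""
    return min(1.0, c / np.linalg.norm(drone_pos - target_pos))

def data_innovation(drone_pos, target_pos, viewpoint_history):
    """Eq. (2): distance on the unit sphere to the closest previous viewpoint."""
    a = (drone_pos - target_pos) / np.linalg.norm(drone_pos - target_pos)
    if not viewpoint_history:
        return 1.0  # first viewpoint is maximally innovative
    return min(np.linalg.norm(a - a_prev) for a_prev in viewpoint_history)

def motion_irregularity(velocity, velocity_history, n=5):
    """Eq. (3): recency-weighted velocity changes; the inverse of smoothness."""
    recent = velocity_history[-n:]
    # Assumed weighting: the most recent sample gets weight 1, older ones less.
    weights = [1.0 / (len(recent) - i) for i in range(len(recent))]
    return sum(w * np.linalg.norm(velocity - v) for w, v in zip(weights, recent))
```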
3.2.3. Validation
4. Drone Operator Performance Modelling
4.1. Performance Analysis Tool
4.1.1. Metrics Definition
- GPS latitude, altitude and heading
- Velocity in the X, Y, Z direction
- Roll, pitch and yaw angles
- Velocity in roll, pitch and yaw
- Throttle level
- Climb rate
- Vibrations in the X, Y, Z direction
- Control stick position in the X, Y, Z and roll direction
4.1.2. Interface Design
- The total flight time in seconds;
- The total distance flown in meters;
- Whether the camp has been found and, if yes, the error on the distance measurement;
- Whether a video was recorded and, if yes, the video quality score according to Equation (4);
- The number and percentage of enemies identified and their localisation error;
- The mean and minimum enemy distance in meters;
- Whether the drone has been detected by enemies;
- The number of collisions;
- The performance score under multiple environmental conditions or under multiple human factors. Note that, as can be expected, the performance score in normal weather is better than the performance score in bad weather (a small grouping sketch follows this list).
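To illustrate the last item, a minimal sketch that groups hypothetical per-mission scores by environmental condition; the field names and values are invented for illustration.

```python
from collections import defaultdict

# Hypothetical per-mission records produced by the analysis tool.
missions = [
    {"pilot": "P01", "weather": "normal", "score": 0.81},
    {"pilot": "P01", "weather": "bad", "score": 0.64},
    {"pilot": "P02", "weather": "normal", "score": 0.77},
    {"pilot": "P02", "weather": "bad", "score": 0.59},
]

by_weather = defaultdict(list)
for m in missions:
    by_weather[m["weather"]].append(m["score"])

for weather, scores in by_weather.items():
    print(f"{weather}: mean score {sum(scores) / len(scores):.2f}")
```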
4.2. Human Performance Model
4.2.1. Methodology
4.2.2. Results & Discussion
- Measuring the stress level only before the test and not during the test may not yield appropriate measurements of stress levels. As discussed before, we intend to extend the simulator system with exteroceptive sensing systems that would enable the measurement of stress levels during flight, but this capability is not ready yet.
- The Holmes-Rahe Life Stress Inventory is likely not the optimal tool for quantifying stress levels in this context, as it emphasizes long-term life events rather than short-term stressors.
- The limited sample size of our pilot population, which may not be sufficient to yield statistically significant results.
- Another possible reason is that
5. Application Use Cases
5.1. AI Copilot for Drone Operator Assistance
5.1.1. Motivation and Concept
5.1.2. Methodology
- A SequenceInputLayer that handles sequences of the 26 flight parameters.
- A bidirectional long short-term memory layer that learns bidirectional long-term dependencies between time steps of time series.
- A fully connected layer that multiplies the input by a weight matrix and adds a bias vector.
- A softmax layer that applies a softmax function [81] to the input.
- A classification layer that computes the cross-entropy loss for classification tasks with mutually exclusive classes (a re-implementation sketch follows the category list below).
- Category 1: Novice pilots. This category also includes pilots experienced with fixed-wing drones but not rotary-wing drones, given the poor skill transfer across drone types observed and discussed in Section 4.2.2.
- Category 2: Competent pilots with experience flying rotary-wing drones.
- Category 3: Expert pilots, consisting of highly skilled pilots who regularly practice complex flight operations.
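The layer stack and the three output categories map naturally onto common deep learning frameworks. Below is a minimal PyTorch re-implementation sketch; the hidden size, the use of the last time step, and the training settings are our assumptions, and the original layer list uses a different toolbox's terminology.

```python
import torch
import torch.nn as nn

class PilotClassifier(nn.Module):
    """Sequence input -> BiLSTM -> fully connected -> class scores."""
    def __init__(self, n_features=26, hidden=64, n_classes=3):
        super().__init__()
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True,
                              bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)  # 2x for both directions

    def forward(self, x):            # x: (batch, time, 26 flight parameters)
        out, _ = self.bilstm(x)
        return self.fc(out[:, -1])   # logits from the last time step

model = PilotClassifier()
# CrossEntropyLoss fuses the softmax and cross-entropy layers listed above.
loss_fn = nn.CrossEntropyLoss()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)  # ADAM optimizer, cf. [82]

x = torch.randn(8, 100, 26)          # batch of 8 sequences, 100 time steps each
y = torch.randint(0, 3, (8,))        # pilot categories 0..2
loss = loss_fn(model(x), y)
loss.backward()
optim.step()
```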
5.1.3. Validation and Discussion
5.2. Automated Optimal Drone Trajectories for Target Observation
5.2.1. Methodology
- Line 2: As stated above, the algorithm starts from a simple drone motion model, which proposes a number of possible discrete locations the drone can move to, taking into account the flight dynamics constraints. In a first step, we perform a search over all possible new locations to assess which one is the best to move to. This means a brute-force search is used to find the optimal position (a minimal sketch of this loop follows the walkthrough). This is a quite simplistic approach, but we have opted for it because the number of possible locations is limited, so no advanced optimization scheme is required.
- Line 3: In a second step, the safety of the proposed new drone location is assessed. This analysis considers two different aspects:
  - The physical safety of the drone, which is in jeopardy if the drone comes too close to the ground. Therefore, a minimal distance from the ground is imposed, and proposed locations too close to the ground are disregarded.
  - The safety of the (stealth) observation operation, which is in jeopardy if the drone comes too close to the target, meaning that the target (in a military context, often an enemy) could hear or perceive the drone, violating the stealthiness of the operation. Therefore, a minimal distance between the drone and the target is imposed, and proposed locations too close to the target are disregarded.
- Line 7: The global objective video quality measure at the newly proposed location is calculated, following Equation (4).
- Line 8: The point with the highest video quality score is recorded.
- Lines 9-10: At this point, an optimal point $\mathbf{p}^*$ for the drone to move to has been selected. The viewpoint history memory and the velocity history memory are updated to include this new point.
- Line 11: The drone is moved to the new point $\mathbf{p}^*$, in order to prepare for the next iteration.
- Line 12: The point $\mathbf{p}^*$ is appended to the drone trajectory profile.
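A minimal sketch of one iteration of this loop, as announced in the Line 2 description above. The motion model, the safety margins and the `quality` callback (standing in for the Equation (4) measure, e.g. a combination of the three metrics sketched in Section 3.2.2) are assumptions for illustration.

```python
import itertools
import numpy as np

STEP = 2.0          # motion model: candidate moves of 2 m per axis (assumption)
MIN_GROUND = 5.0    # physical safety: minimum height above ground, m (assumption)
MIN_TARGET = 15.0   # operational safety: minimum stand-off distance, m (assumption)

def candidate_moves(pos):
    """Line 2: simple motion model proposing discrete neighbouring locations."""
    for d in itertools.product((-STEP, 0.0, STEP), repeat=3):
        if any(d):
            yield pos + np.array(d)

def is_safe(p, target):
    """Line 3: keep a minimal ground clearance and a minimal target distance."""
    return p[2] >= MIN_GROUND and np.linalg.norm(p - target) >= MIN_TARGET

def plan_step(pos, target, viewpoints, velocities, quality):
    """One iteration: brute-force search for the best next point (Lines 2-12)."""
    best, best_q = None, -np.inf
    for cand in candidate_moves(pos):
        if not is_safe(cand, target):
            continue
        q = quality(cand, cand - pos, target, viewpoints, velocities)  # Line 7
        if q > best_q:                                                 # Line 8
            best, best_q = cand, q
    if best is None:
        raise RuntimeError("no safe candidate location")
    a = (best - target) / np.linalg.norm(best - target)
    viewpoints.append(a)             # Lines 9-10: update the history memories
    velocities.append(best - pos)
    return best                      # Lines 11-12: move the drone, extend trajectory
```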
5.2.2. Validation and Discussion
5.3. Drone Mission Planning Tool
5.4. Incremental Improvement of Drone Operator Training Procedures
5.4.1. Enabling Fine-Grained Pilot Accreditation
5.4.2. Enabling Iterative Improvement of Training Procedures
5.4.3. Enabling Fine-Grained Pilot Performance Follow-Up
6. Conclusions
6.1. Discussion on the Proposed Contributions
6.2. Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| ADAM | Adaptive Moment Estimation |
| AI | Artificial Intelligence |
| APAS | Advanced Pilot Assistance Systems |
| ATM | Air Traffic Management |
| BVLOS | Beyond Visual Line Of Sight |
| CBRN | Chemical, Biological, Radiological and Nuclear |
| CSV | Comma-separated values |
| DEG | Dynamic Environment Generator |
| DJI | Da-Jiang Innovations |
| DRL | Drone Racing League |
| GB | Gigabyte |
| GPS | Global Positioning System |
| GUI | Graphical User Interface |
| HITL | Hardware In The Loop |
| HMI | Human Machine Interface |
| ISR | Intelligence, Surveillance, Reconnaissance |
| LiDAR | Light Detection and Ranging |
| MDPI | Multidisciplinary Digital Publishing Institute |
| MTE | Mission Task Element |
| NIST | National Institute of Standards and Technology |
| PSD | Power Spectral Density |
| PSNR | Peak Signal-to-Noise Ratio |
| R-CNN | Region Convolutional Neural Network |
| RC | Remote Control |
| RF | Radio Frequency |
| RMSE | Root Mean Square Error |
| ROC | Receiver Operating Characteristic |
| SITL | Software In The Loop |
| TCP | Transmission Control Protocol |
| TLX | Task Load Index |
| UAS | Unmanned Aircraft System |
| UAVs | Unmanned Aerial Vehicles |
| UDP | User Datagram Protocol |
| UTM | Unmanned Traffic Management |
| YOLO | You Only Look Once |
References
- Chow, E.; Cuadra, A.; Whitlock, C. Hazard Above: Drone Crash Database - Fallen from the skies. The Washington Post, 2016.
- Buric, M.; De Cubber, G. Counter Remotely Piloted Aircraft Systems. MTA Review 2017, 27, No. 1, Military Technical Academy Publishing House.
- Shively, J. Human Performance Issues in Remotely Piloted Aircraft Systems. In Proceedings of the ICAO Conference on Remotely Piloted or Piloted: Sharing One Aerospace System, Montreal, Canada, 23-25 March 2015.
- Lee, J.D.; Wickens, C.D.; Liu, Y.; Boyle, L.N. Designing for people: An introduction to human factors engineering. Charleston:CreateSpace, 2017.
- Fletcher, G. Pilot training review – interim report: literature review, Published by the British Civil Aviation Authority, 2017.
- Socha, V.; Socha, L.; Szabo, S.; Hana, K.; Gazda, J.; Kimlickova, M.; Vajdova, I.; Madoran, A.; Hanakova, L.; Nemec, V. Training of pilots using flight simulator and its impact on piloting precision. In Transport Means. Juodkrante: Kansas University of Technology. 2016, 374–379.
- Rostáš, J.; Kováčiková, M.; Kandera, B. Use of a simulator for practical training of pilots of unmanned aerial vehicles in the Slovak Republic. In Proceedings of the 19th IEEE International Conference on Emerging eLearning Technologies and Applications (ICETA), Košice, Slovakia, 11-12 November 2021; pp. 313–319.
- Shah, S.; Dey, D.; Lovett, C.; Kapoor, A. AirSim: High-Fidelity Visual and Physical Simulation for Autonomous Vehicles. In Field and Service Robotics. Springer Proceedings in Advanced Robotics, vol 5.; Hutter, M.; Siegwart, R., Eds.; Springer, Cham; 2017.
- Lee, N. Unreal Engine, A 3D Game Engine. In: Encyclopedia of Computer Graphics and Games.; Lee, N. (eds); Springer, Cham; 2023.
- Mairaj, A.; Baba, A.I.; Javaid, A.Y. Application specific drone simulators: Recent advances and challenges. Simulation Modelling Practice and Theory 2019, 94.
- DJI Flight Simulator. Available online: https://www.dji.com/be/downloads/products/simulator (accessed on 16 June 2024).
- The Drone Racing League Simulator. Available online: https://store.steampowered.com/app/641780/The_Drone_Racing_League_Simulator/ (accessed on 16 June 2024).
- Zephyr. Available online: https://zephyr-sim.com/ (accessed on 16 June 2024).
- droneSim Pro. Available online: https://www.dronesimpro.com/ (accessed on 16 June 2024).
- RealFlight. Available online: https://www.realflight.com/product/realflight-9.5s-rc-flight-sim-with-interlink-controller/RFL1200S.html (accessed on 16 June 2024).
- Cotting, M. An initial study to categorize unmanned aerial vehicles for flying qualities evaluation. In Proceedings of the 47th AIAA Aerospace Sciences Meeting including The New Horizons Forum and Aerospace Exposition, 2009.
- Holmberg, J.; Leonard, J.; King, D.; Cotting, M. Flying qualities specifications and design standards for unmanned air vehicles. In Proceedings of the AIAA Atmospheric Flight Mechanics Conference and Exhibit, 2008.
- Hall, C.; Southwell, J. Equivalent Safe Response Model for Evaluating the Closed Loop Handling Characteristics of UAS to Contribute to the Safe Integration of UAS into the National Airspace System. In Proceedings of the 11th AIAA Aviation Technology, Integration, and Operations (ATIO) Conference, including the AIAA Balloon Systems Conference, 2011.
- Schulze, P.C.; Miller, J.; Klyde, D.H.; Regan, C.D.; Alexandrov, N. System Identification of a Small UAS in Support of Handling Qualities Evaluations. In Proceedings of the AIAA Scitech 2019 Forum, 2019.
- Abdulrahim, M.; Bates, T.; Nilson, T.; Bloch, J.; Nethery, D.; Smith, T. Defining Flight Envelope Requirements and Handling Qualities Criteria for First-Person-View Quadrotor Racing. In Proceedings of the AIAA Scitech 2019 Forum, 2019.
- Greene, K.M.; Kunz, D.L.; Cotting, M.C. Toward a Flying Qualities Standard for Unmanned Aircraft. In Proceedings of the AIAA Atmospheric Flight Mechanics Conference, 2014.
- Klyde, D.H.; Schulze, P.C.; Mitchell, D.; Alexandrov, N. Development of a Process to Define Unmanned Aircraft Systems Handling Qualities. In Proceedings of the AIAA Atmospheric Flight Mechanics Conference, 2018.
- Sanders, F.C.; Tischler, M.; Berger, T.; Berrios, M.G.; Gong, A. System Identification and Multi-Objective Longitudinal Control Law Design for a Small Fixed-Wing UAV. In Proceedings of the AIAA Atmospheric Flight Mechanics Conference, 2018.
- Abdulrahim, M.; Dee, J.; Thomas, G.; Qualls, G. Handling Qualities and Performance Metrics for First-Person-View Racing Quadrotors. In Proceedings of the AIAA Atmospheric Flight Mechanics Conference, 2018.
- Herrington, S.M.; Hasan Zahed, M.J.; Fields, T. Pilot Training and Task Based Performance Evaluation of an Unmanned Aerial Vehicle. In Proceedings of the AIAA Scitech 2021 Forum, 11–21 January 2021.
- Ververs, P.M.; Wickens, C.D. Head-up displays: effect of clutter, display intensity and display location on pilot performance. Int. J. Aviat. Psychol. 1998, 8(4), 377–403.
- Smith, J.K.; Caldwell, J.A. Methodology for evaluating the simulator flight performance of pilots. Brooks City Base, TX: Air Force Research Laboratory. Report No. AFRL-HE-BR-TR-2004-0118, 2004.
- Hanson, C.; Schaefer, J., Burken, J. J.; Larson, D.; Johnson, M. Complexity and pilot workload metrics for the evaluation of adaptive flight controls on a full scale piloted aircraft. NASA Dryden Flight Research Center; Edwards, CA United States. Document ID. 20140005730, 2014.
- Field, E. J.; Giese, S. E. D. Appraisal of several pilot control activity measures. In Proceedings of the AIAA Atmospheric Flight mechanics conference and exhibit, San Francisco, California, 15–18 August 2005.
- Zahed, M.J.H.; Fields, T. Evaluation of pilot and quadcopter performance from open-loop mission-oriented flight testing. Journal of Aerospace Engineering 2021, 235(13), pp. 1817–1830.
- Hebbar, P.A.; Pashilkar, A.A. Pilot performance evaluation of simulated flight approach and landing manoeuvres using quantitative assessment tools. Sādhanā Academy Proceedings in Engineering Sciences 2017, 42, 405–415.
- Jacoff, A.; Mattson, P. Measuring and comparing small unmanned aircraft system capabilities and remote pilot proficiency. National Institute of Standards and Technology, 2020.
- Hoßfeld, T.; Keimel, C.; Hirth, M.; Gardlo, B.; Habigt, J.; Diepold, K.; Tran-Gia, P. Best practices for QoE crowdtesting: QoE assessment with crowdsourcing. IEEE Transactions on Multimedia 2014, 16, 541–558.
- Takahashi, A.; Hands, D.; Barriac, V. Standardization activities in the ITU for a QoE assessment of IPTV. IEEE Communications Magazine 2008, 46, 78–84.
- Winkler, S.; Mohandas, P. The evolution of video quality measurement: From PSNR to hybrid metrics. IEEE Transactions on Broadcasting 2008, 54, 660–668.
- Hulens, D.; Goedeme, T. Autonomous flying cameraman with embedded person detection and tracking while applying cinematographic rules. In Proceedings of the 14th Conference on Computer and Robot Vision (CRV2017), Edmonton, Canada, 16-19 May 2017; pp. 56–63.
- Doroftei, D.; De Cubber, G.; De Smet, H. A quantitative measure for the evaluation of drone-based video quality on a target. In Proceedings of the Eighteenth International Conference on Autonomic and Autonomous Systems (ICAS), Venice, Italy, 22-26 May 2022.
- Deutsch, S. UAV Operator Human Performance Models. BBN Report 8460, 2006.
- Bertuccelli, L.F.; Beckers, N.W.M.; Cummings, M.L. Developing operator models for UAV search scheduling. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Toronto, Canada, 2010.
- Wu, Y.; Huang, Z.; Li, Y.; Wang, Z. Modeling Multioperator Multi-UAV Operator Attention Allocation Problem Based on Maximizing the Global Reward. Mathematical Problems in Engineering 2016.
- Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload; Hancock, P.A., Meshkati, N., Eds.; Advances in Psychology, Vol. 52; pp. 139–183.
- Andrews, Jinan M. Human Performance Modeling: Analysis of the Effects of Manned-Unmanned Teaming on Pilot Workload and Mission Performance. Air Force Institute of Technology Theses and Dissertations, 2020.
- Wright, J.L.; Lee, J.; Schreck, J.A. Human-autonomy teaming with learning capable agents: performance and workload outcomes. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, 2021.
- Doroftei, D.; De Cubber, G.; De Smet, H. Human factors assessment for drone operations: towards a virtual drone co-pilot. In Proceedings of the AHFE International Conference on Human Factors in Robots, Drones and Unmanned Systems, 2023.
- Sakib, M.N.; Chaspari, T.; Ahn, C.; Behzadan, A. An experimental study of wearable technology and immersive virtual reality for drone operator training. In Proceedings of the 27th Int. Workshop on Intelligent Computing in Engineering, 2020, pp. 154–163.
- Sakib, M.N. Wearable technology to assess the effectiveness of virtual reality training for drone operators (Doctoral dissertation), 2019.
- Doroftei, D.; De Cubber, G.; De Smet, H. Reducing drone incidents by incorporating human factors in the drone and drone pilot accreditation process. In Proceedings of the AHFE 2020 Virtual Conference on Human Factors in Robots, Drones and Unmanned Systems, 16-20 July 2020, USA, 2020, pp. 71–77.
- Gupta, S. G.; Ghonge, M.; Jawandhiya, P. M. Review of unmanned aircraft system (UAS). International Journal of Advanced Research in Computer Engineering & Technology (IJARCET), 2013, 2.
- Hussein, M.; Nouacer, R.; Corradi, F.; Ouhammou, Y.; Villar, E.; Tieri, C.; Castiñeira, R. Key technologies for safe and autonomous drones. Microprocessors and Microsystems, 2021, 87.
- Chandran, N. K.; Sultan, M. T. H.; Łukaszewicz, A.; Shahar, F. S.; Holovatyy, A.; Giernacki, W. Review on Type of Sensors and Detection Method of Anti-Collision System of Unmanned Aerial Vehicle. Sensors, 2023, 23(15).
- Gupta, A.; Fernando, X. Simultaneous localization and mapping (SLAM) and data fusion in unmanned aerial vehicles: Recent advances and challenges. Drones, 2022, 6(4).
- Castro, G. G. D.; Berger, G. S.; Cantieri, A.; Teixeira, M.; Lima, J.; Pereira, A. I.; Pinto, M. F. Adaptive path planning for fusing rapidly exploring random trees and deep reinforcement learning in an agriculture dynamic environment UAVs. Agriculture, 2023, 13(2).
- Telli, K.; Kraa, O.; Himeur, Y.; Ouamane, A.; Boumehraz, M.; Atalla, S.; Mansoor, W. A comprehensive review of recent research trends on unmanned aerial vehicles (uavs). Systems, 2023, 11(8).
- Azar, A. T.; Koubaa, A.; Ali Mohamed, N.; Ibrahim, H. A.; Ibrahim, Z. F.; Kazim, M.; Casalino, G. Drone deep reinforcement learning: A review. Electronics, 2021, 10(9).
- Jawaharlalnehru, A.; Sambandham, T.; Sekar, V.; Ravikumar, D.; Loganathan, V.; Kannadasan, R.; Alzamil, Z. S. Target object detection from Unmanned Aerial Vehicle (UAV) images based on improved YOLO algorithm. Electronics, 2022, 11(15).
- McConville, A.; Bose, L.; Clarke, R.; Mayol-Cuevas, W.; Chen, J.; Greatwood, C.; Richardson, T. Visual odometry using pixel processor arrays for unmanned aerial systems in gps denied environments. Frontiers in Robotics and AI, 2020, 7.
- van de Merwe, K.; Mallam, S.; Nazir, S. Agent transparency, situation awareness, mental workload, and operator performance: A systematic literature review. Human Factors, 2024, 66(1), pp. 180–208.
- Woodward, J.; Ruiz, J. Analytic review of using augmented reality for situational awareness. IEEE Transactions on Visualization and Computer Graphics, 2022, 29(4), pp. 2166–2183.
- Van Baelen, D.; Ellerbroek, J.; Van Paassen, M. M.; Mulder, M. Design of a haptic feedback system for flight envelope protection. Journal of Guidance, Control, and Dynamics, 2020, 43(4), pp 700–714.
- Nguyen, T-T.; Crismer, A.; De Cubber, G.; Janssens, B.; Bruyninckx, H. Landing UAV on Moving Surface Vehicle: Visual Tracking and Motion Prediction of Landing Deck. In Proceedings of the IEEE/SICE International Symposium on System Integration (SII), 2024.
- Dutrannois, T.; Nguyen, T-T.; Hamesse, C.; De Cubber, G.; Janssens, B. Visual SLAM for Autonomous Drone Landing on a Maritime Platform. In Proceedings of the International Symposium for Measurement and Control in Robotics (ISMCR) – A Topical Event of Technical Committee on Measurement and Control of Robotics (TC17), International Measurement Confederation (IMEKO), 2022.
- Papyan, N.; Kulhandjian, M.; Kulhandjian, H.; Aslanyan, L. AI-Based Drone Assisted Human Rescue in Disaster Environments: Challenges and Opportunities. Pattern Recognition and Image Analysis, 2024, 34(1), pp. 169–186.
- Weber, U.; Attinger, S.; Baschek, B.; Boike, J.; Borchardt, D.; Brix, H.; Brüggemann, N.; Bussmann, I.; Dietrich, P.; Fischer, P.; Greinert, J. MOSES: A novel observation system to monitor dynamic events across Earth compartments. Bulletin of the American Meteorological Society, 2022, 103(2), pp. 339–348.
- Ramos, M. A.; Sankaran, K.; Guarro, S.; Mosleh, A.; Ramezani, R.; Arjounilla, A. The need for and conceptual design of an AI model-based Integrated Flight Advisory System. Journal of Risk and Reliability, 2023, 237(2), pp. 485–507.
- Doroftei, D.; De Cubber, G.; De Smet, H. Assessing Human Factors for Drone Operations in a Simulation Environment. In Proceedings of the Human Factors in Robots, Drones and Unmanned Systems – AHFE (2022) International Conference, New York, USA, 2022.
- Doroftei, D.; De Smet, H. Evaluating Human Factors for Drone Operations using Simulations and Standardized Tests. In Proceedings of the 10th International Conference on Applied Human Factors and Ergonomics (AHFE 2019), Washington DC, USA, 2019.
- De Cubber, G.; Berrabah, S.A.; Sahli, H. Color-based visual servoing under varying illumination conditions. Robotics and Autonomous Systems 2004, 47(4), pp. 225–249.
- De Cubber, G.; Marton, G. Human victim detection. In Proceedings of the Third International Workshop on Robotics for Risky Interventions and Environmental Surveillance-Maintenance (RISE), 2009.
- Marques, J.S.; Bernardino, A.; Cruz, G.; Bento, M. An algorithm for the detection of vessels in aerial images. In Proceedings of the 11th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), pp. 295–300, Aug 2014.
- Johnson, J. Analysis of image forming systems. In Proceedings of the Image Intensifier Symposium, pp. 244–273, 1958.
- Meier, L.; Honegger, D.; Pollefeys, M. PX4: A node-based multithreaded open source robotics framework for deeply embedded platforms. In 2015 IEEE international conference on robotics and automation (ICRA), 2015, pp. 6235–6240.
- Doroftei, D.; De Vleeschauwer, T.; Lo Bue, S.; Dewyn, M.; Vanderstraeten, F.; De Cubber, G. Human-Agent Trust Evaluation in a Digital Twin Context. In 30th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Vancouver, BC, Canada, 2021, pp. 203–207.
- Doroftei, D.; De Cubber, G.; Wagemans, R.; Matos, A.; Silva, E.; Lobo, V.; Guerreiro Cardoso, K.C.; Govindaraj, S.; Gancet, J.; Serrano, D. User-centered design. In Search and Rescue Robotics: From Theory to Practice; De Cubber, G., Doroftei, D., Eds.; IntechOpen: London, 2017; pp. 19–36.
- Holmes, T.H.; Rahe, R.H. The Social Readjustment Rating Scale. Journal of Psychosomatic Research, 1967, 11(2), pp. 213–218.
- Koubâa, A.; Allouch, A.; Alajlan, M.; Javed, Y.; Belghith, A.; Khalgui, M. Micro Air Vehicle Link (MAVlink) in a Nutshell: A Survey. IEEE Access, 2019, 7, 87658–87680.
- mavp2p. Available online: https://github.com/bluenviron/mavp2p (accessed on 03 July 2024).
- Garinther, G.R.; Kalb, I.J.T.; Hodge, D.C.; Price, G.R. Proposed aural non-detectability limits for army materiel, U.S. Army Human Engineering Laboratory, 1985.
- Doroftei, D.; De Cubber, G. Using a qualitative and quantitative validation methodology to evaluate a drone detection system. ACTA IMEKO, 2019, 8(4), pp. 20–27.
- De Cubber, G.; Shalom, R.; Coluccia, A.; Borcan, O.; Chamrád, R.; Radulescu, T.; Izquierdo, E.; Gagov, Z. The SafeShore system for the detection of threat agents in a maritime border environment. In IARP Workshop on Risky Interventions and Environmental Surveillance (Vol. 2), 2017.
- Ramirez-Atencia, C.; Camacho, D. Extending QGroundControl for automated mission planning of UAVs. Sensors, 2018, 18(7).
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006.
- Kingma, D.; Ba, J. Adam: A method for stochastic optimization. 3rd International Conference on Learning Representations, 2015.
- Alexis, K.; Nikolakopoulos, G.; Tzes, A. Model predictive quadrotor control: attitude, altitude and position experimental studies. IET Control Theory & Applications, 2012, 6(12), 1812-1827.
- Szolc, H.; Kryjak, T. Hardware-in-the-loop simulation of a UAV autonomous landing algorithm implemented in SoC FPGA. In 2022 Signal Processing: Algorithms, Architectures, Arrangements, and Applications, pp. 135–140.
- European Commission. Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the rules and procedures for the operation of unmanned aircraft.
| Human Factor | Importance level (0–100%) |
|---|---|
| Task Difficulty | 89% |
| Pilot Position | 83% |
| Pilot Stress | 83% |
| Pilot Fatigue | 83% |
| Pressure | 83% |
| Pilot subjected to water or humidity | 83% |
| Pilot subjected to temperature changes | 78% |
| Information location & organization & formatting of the controller display | 78% |
| Task Complexity | 78% |
| Task Duration | 78% |
| Pilot subjected to low quality breathing air | 72% |
| Pilot subjected to small body clearance | 72% |
| Ease-of-use of the controller | 72% |
| Pilot subjected to noise/dust/vibrations | 67% |
| Task Type | 67% |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

