Submitted:
20 December 2025
Posted:
23 December 2025
Abstract
Keywords:
1. Introduction
2. Materials and Methods
2.1. Biomedical Laboratory Participants
- How do you learn the lab work and the safety aspects?
  - Incidents and accidents
  - Reading safety data sheets
  - Reading safety standard operating procedures (SOPs)
  - From mentors
- Is there an incident reporting system in your organization?
- Is incident reporting encouraged in your organization?
- Is there fear associated with reporting incidents?
2.2. Aviation Industry Participants
- How can organizations address the fear of blame when reporting incidents?
- In your industry, how is action taken if an incident is reported?
- What motivation works best for incident reporting especially relating to near misses and unsafe practices?
- How does the culture of the organization help to encourage incident reporting?
- Are there clear principles of blame-free incident reporting, and are these principles communicated and followed within your organization?
- Does the ease of reporting and the paperwork that follows make a difference to reporting?
- How does your industry focus on Learning vs. Blame?
- How does your industry reward and support incident/near miss reporting?
3. Results
- One participant said the organization has no system to report incidents.
- Two participants said that the organization has a system to report incidents, which is encouraged by the manager.
- Twelve participants said the organization has a system to report incidents, but only those of a serious nature were reported.
- Blame assigned to the person and team
- Stigma attached to the person reporting
- The reporting staff may be deemed to be clumsy
- Mistakes were seen as personal failures
- Paperwork is very complicated
- Fear of losing the job
- Reporting resulted in more rules
- Organization emphasizes zero incident policy
- If it’s a small cut with a clean sharp instrument, I will only discuss it within my team and I will not report it. I would only report if it was anything more serious or the sharp instrument was infected.
- It’s very blame shifting; the safety department doesn’t think so much in terms of actual safety they just make rules. I guess that is the biggest impediment. If you report the incident, an email is sent with new rules as opposed to why this incident happened and what we can learn from it.
- Even if they say no blame will be assigned, the person about whom we report will feel bad and we cannot be sure that no action will be taken. If a senior does something, I am afraid it will have repercussions if I report. Therefore, there are people who see someone doing something wrong but don’t want to say it.
- Small confidential group discussion will be hugely beneficial because I am very sceptical that people will be willing to expose their frailties or errors by reporting incidents.
- While working with HIV (Human Immunodeficiency Virus), we have to wear safety goggles when retrieving virus samples from the liquid nitrogen tanks. A particular staff has done this many times and probably wondered why we need to wear safety goggles. One time while retrieving samples, the staff did not wear the safety goggles. The vials burst open and some drops got into the staff’s eyes. The funny thing is, the staff hesitated even to tell the supervisor that this tube of HIV broke, because of fear of being scolded. I told the staff that it was stupid because HIV can ruin the whole life and the incident should be reported so that the staff can seek medical attention. I was amazed that there was this hesitation and it took a lot of persuasion from me before the staff reported it.
- A staff was working with the dengue virus in the biosafety cabinet (primary protective equipment) and a needle scratched through the gloves. The staff felt a scratch but when the staff decontaminated and doffed the gloves and looked, there was no scratch on the finger, only the glove was punctured. The staff therefore chose not to report it because there was no break in the skin. According to protocol, the staff should have reported it anyway because it punctured the gloves and the staff felt the scratch even though it appeared not to break through the skin.
- Reporting culture depends on the Principal Investigator (head of the laboratory), and his/her attitude. For every experiment there are always some risky parts, for example while using the ultracentrifuge. When we are taught by the seniors they tell us to just approximately pour the liquid into the tubes and that there was no need to balance the tubes by weighing them as per the correct procedure. A staff was using it and the centrifuge exploded. As always, they accused the last person who used it before the explosion of not balancing the tubes. No one was balancing it and this was not corrected. After this incident, they set up a booking system but in practice, only a few responsible ones logged in.
- There are many instances with equipment. A student may have some problems, like equipment breaking during use, and they chose not to report it. We have to look at CCTV footage to see who was using the equipment. Even though we do not blame, if something happens, or a tube breaks within the centrifuge because it was not balanced properly, the person should be warned that this is not the proper way to use the centrifuge. Sometimes the work culture of dealing with things is incorrect; staff are doing it incorrectly and they teach this to newcomers. I also think that they don’t report due to multiple factors like the paperwork and they don’t want people fussing over them. Not reporting makes it easier and does not put the department under investigation. There is a tendency for the person to be blamed, leading to an inferiority complex. It should be done in a better manner, for example if there is an incident then we should discuss it without using names of the people involved but rather explain what happened and what might be the correct way to do it.
- As a supervisor, I encourage my team to report incidents, but at first the incident reporting was seen only negatively. It took 2-3 years before the team members understood that incident reports are not negative. Now incident reports have become a part of life in the laboratory. Team members understand that they have to report to improve the system and also to learn from mistakes. At first though, it was not easy, but it got easier for them and now team members submit incident reports even without being asked. However, even with better incident reporting practices, unless it is really bad like accidentally dropping a whole bottle of formaldehyde or something like that, people don’t report it. So, even with the current improved reporting practices, they do an assessment in their head about what to report. Of course if the incident is their fault they’re not happy about it, but they understand that this is how things happen here. It is not personal and you just have to follow this; sometimes people just make mistakes and things happen to us.
- Speaking as a supervisor, my facility does not impose a zero incident requirement. When I was a junior, there was a blame system in the institution. Because of growing in such a system where they blame you instead of correcting you or mitigating the problem, I started the blame free system within my own unit/section during the course of my career. In other units/sections, the staff still take it negatively because if they make an incident report it may affect their performance evaluation. In my section, we encourage incident reporting not to blame them but to mitigate the situation and to learn.
1. Just culture: a legal requirement anchored in the aviation Safety Management System (SMS)
A.
- All participants said that the introduction of just culture and the practice of blame-free reporting of safety occurrences came about in the wake of some serious accidents around 20 years ago. Participants talked about the stakeholders involved in the practice of just culture and the importance of trust among them. The aviation industry manages safety through a comprehensive SMS (IATA, 2025). They said that it took 15-20 years for the aviation industry to make just culture a part of their safety management system and legal requirements.
- In the late 1990s it became a mandatory requirement to report incidents. Two years later, there was an accident where the air traffic controllers were blamed and treated badly. Following this, a more formal process to protect controllers and to encourage them to report was developed. We call this ‘just culture’, which is a clearly written requirement in the law and safety manuals. Now when a frontline operator files a report, they know it won’t be a witch hunt.
- I have over 30 years’ experience in air traffic control. If we go back through the evolution of occurrence reporting, in the early 1990s people didn’t report. If you had a serious occurrence, you said nothing. There was a legal requirement that the radar tapes and the voice tapes be retained for 30 days and if there was an occurrence people would say nothing for the 30 days. After that you could talk about it because you knew the radar tapes would be recycled. This was called being part of the ’30-day club’. Then a requirement to report was introduced, and it became established in the mid-nineties. After that the industry got serious about safety and implemented a formal SMS.
- Just culture is a legal term within aviation regulation, and it states that you can’t be prosecuted for reporting safety occurrences. It has taken nearly 15 years to establish this.
- In the aviation industry, we really had to be very patient and accept that it took decades to implement this. I would suggest not being too optimistic and adjusting expectations to a realistic time path. Before we get such a system implemented, we must take time to explore the possibilities. What we have now in aviation is not perfect, but I think given the circumstances it’s the best we can realistically achieve. You must look at the industry’s expectations: what do safety, prevention, and collective learning mean for the authorities? Once you get that picture about the expectations and what is in the interest of the authorities, then I think we can carefully go to the next step. We must be patient, and it will take many years to establish just culture requirements in the legal system, which has not yet happened in all industries.
B.
- Just culture was explained by all participants as an organizational atmosphere of trust where people feel safe to report safety concerns and errors without fear of reprisal but are also held accountable for their actions. It fosters a learning environment by distinguishing between honest, unintentional mistakes and reckless or willful violations, encouraging systems to be examined rather than solely blaming individuals.
- If you made a mistake during a flight, expelling the person cannot address the problem. You would learn more by asking, “why did you do that?” This information can change the training, system support, or even the design.
- The information and data about the occurrences have more value than the punishment of people who make mistakes. But it is a sensitive balancing act and there’s no one-size-fits-all.
C.
- When asked if reporting would be as prevalent even if it was not part of the law and the SMS, participants replied as follows:
- I have worked before in an organization with blame culture and people would hide as much as they could, reporting only occurrences mandated in the manuals. Even if they reported they would provide very little information. Now I work for an internationally acclaimed airline and it’s a completely different world. Here the company says three things: if you’re not under the influence of any substances, if you’re not stealing, or if you’re not lying, nothing will happen to you, just report it. Just culture is taught when a worker is initially trained as well as in subsequent trainings and applies to all workers. In my current job the idea of reporting is to learn and to avoid the same thing happening again. This provides innovation of processes and makes work more efficient and safer.
- Yes, the organizations saw the added value of investing in blame-free reporting culture. Insurers are stakeholders in the aviation industry who bear financial risk. Based on what I have heard, the insurers were promoting just culture principles even before it went into law. This is in their own interest, because if you decrease the number of incidents and promote excellence, the financial risk can be reduced.
- There is a three-partner relationship: the reporter, the organization, and the authorities. The law is very clear that the organization must have a just culture policy anchored in the SMS. The same is for the authorities who must have a safety policy based on just culture principles and that’s why as regulators we are not allowed to share the information from individual occurrence reports with third parties, unless required by law.
2. Mindset and thinking
- There is something called telemetry [44], which transmits data to a remote location for further analysis. Since we know that everything is tracked, we might as well report it. That is the sentiment now.
- One way to make the change to just culture and encourage incident reporting is to make people understand that their reporting is what has improved not only safety but also efficiency. It may get them to think that if everybody is reporting it and we are finding better ways to do things, why don’t I do it too. So, it’s a form of “official gossip” which is positive for you and the organization. Every Friday, we get a curated report and everyone gets to read what is happening in the company. Often, not reporting gets a reprimand. The supervisor will say that if you had reported it would have been OK but by not reporting it an opportunity to learn and correct is lost. This spreads the message that the person is not being reprimanded for the error but being reprimanded for hiding it.
- There is very strong wording in the safety manual that we are given the ability to report and nothing will happen, thanks to unions and leadership style. I will give you one case example: we had a newly employed cabin crew member who reported fatigue a few times, and it led to the probation period being extended and they were not able to continue the job as cabin crew. In this case I heard that they were called to a meeting with the supervisor. If you report fatigue frequently, the boss will verify whether it is because you were out late with your family or friends and then reported fatigue the day after. In that case, it’s bad planning. But if you’re working long shifts for 3-5 days continuously and you were given a noisy hotel room and that is the reason for reporting fatigue, then it will not be a problem; they will investigate it. People understand that if they don’t report, nothing will happen but there also will not be any improvements in processes or safety.
- The first thing that comes to my mind is the question, why do we report? People should know that reporting has huge learning potential, so I think that what is so important with the reporting system is the mindset of collective learning. We can all learn to become better and tomorrow if you report something about me, that becomes an eye opener for me.
- So, there are a couple of things: if there’s no legal requirement for people to report, they’ll ask why they should bother to report. First, you must involve the frontline workers in the investigation process so that they can see it’s not a witch hunt but a blame-free learning environment. When the de-identified information comes back to the frontline operator as a learning opportunity, that’s when you get buy-in.
- There is an app which can show if a worker has been removed from duty due to any incident. Coworkers can see this and previously could take a screenshot and share it among themselves so that becomes like a reason for shame. Now they’ve changed the app in a way that when you try to take a screenshot, the screen goes black. This tells co-workers that the management takes confidentiality seriously and does not encourage spreading information like this.
- Airlines have worked very hard to change the mindset. They explain that error is part of our work and part of being human. It is important to acknowledge the error, report it and fix it rather than hide it. What we are looking at now is resilience, the ability to bounce back when you have made errors. In my previous blame-culture experience, I think workers feared the company not their colleagues. To achieve active reporting of incidents it is important that people feel safe, and they know that nothing will happen to them. Culture is very important; my current company has a very mature system and they are able to trickle down the blame-free reporting and just culture principles from the top. People report everything including efficient work practices as well as safety occurrences. The company decides how to disseminate it and whether it needs to be added to the safety manuals or the protocols.
- I have not seen anyone write a report about a co-worker. If they have any issues, they just talk about it openly or take it to the manager. We have a very mature way to sort out things. For example, if a junior pilot or cabin crew did something non-compliant, we talk about it and discuss how to ensure training and communication is adequate. We aim at developing the team instead of blaming the team.
- About eight years ago there was an occurrence that needed to be reported. The controller involved didn’t want to report it. After two or three days he was convinced by the rest of the team to report this event, and he did. Within the team if somebody saw something that they were uncomfortable with they would probably discuss with a colleague/supervisor.
3. Trust
- I think in 1989 there was a ground incident in a control tower which was investigated internally by the organization and it was deemed that there was no culpability. People were retrained and no blame was assigned. A few months later the state prosecutor became aware of the incident and decided to prosecute the staff involved. So, a legal case was taken against three members of staff as I understand it and two of them were convicted, but no sentence was given. What happened overnight in the organization was that reporting stopped, simply stopped [45].
- For incident reporting you need peer buy-in and experienced people who are involved in the fully transparent process. To apply just culture, we need a joint decision-making panel which includes frontline operators, so that it’s not perceived as a management tool.
- Regardless of the industry, an effective just culture policy and reporting system cannot be developed unless we first work on mutual trust. Without trust it’s basically a waste of time to start thinking about it. This requires a lot of time and patience because we can’t write trust into regulations and standard operating procedures (SOP).
- There is a specific paragraph in our manuals that says the internal just culture policies and regulation of an organization may only be introduced after consulting with employee representatives. This means the organization must have given the employees the opportunity to give their input. Reporter(s) must feel fully confident that they will not be blamed for acceptable errors. If they are going to be blamed for that then you will ruin your whole reporting culture.
- There was a serious occurrence a few years ago in which I expected that the controller would be blamed, but there was no blame. If you admit to a mistake your unit manager has a responsibility to debrief you on the occurrence; anecdotally controllers use the term ‘just cultured’ for this type of debrief. On the other hand, if you do not report an event, you can be prosecuted because you’re breaking the law. So, people are probably more afraid not to report than the consequences of reporting. The reporting system covers very minor events that have no significant impact, all the way to very serious events.
4. Categorizing the severity of the accident
- There is a chapter in our operations manual which is based on ICAO Annex 13 [43]. This annex covers aircraft incidents and accidents and lists around 50 topics on which filing a safety report is mandatory. In addition, the airlines have their own manual which clearly states what is a reportable safety occurrence. There are situations where the crew may discuss among themselves the need to report a particular occurrence and often, they err on the side of caution and report it anyway.
- We have manuals that clearly state what needs to be reported. When we report to the company, the company will decide on reporting to the authorities based on the manuals and the legal requirements.
5. The process of reporting and what is done with the data
- Reporting is quite easy and if you omit something it will be addressed afterwards, and you can give some extra information if required.
- There is a department that parses through the reports and performs a very systematic analysis. They use the data to pick up occurrences that can provide good learning material and use them for training and retraining.
- Trend analysis is very big because it’s data driven. Sometimes, we can’t be certain if an event is attributable to the person’s capability or it was just the luck of the draw. If the same person was involved in two or three things, the company would steer towards being conservative and assume that there is some capability issue and discuss, counsel, and retrain the person. Yes, sometimes it’s unfair and we almost sympathize with the person because we know that tomorrow if we mess up three times the same way we will not be spared ourselves; even the vice president’s son will not be spared. So, it’s that maturity of the system in dealing with these serious things.
- The large databases are used to provide safety and other bulletins periodically. We have a very mature way to sort out things, for example if you have a junior staff and they’re doing something incorrect then we talk and determine the reason. We address the problem instead of blaming the person or team.
- Reporting is confidential. Eventually when the report is made available for others to view, the identifiable information is removed. The person reporting also gets a notification to thank them for reporting and they may be given a file number of their report. That’s the beauty of it: once they take your name out of it you can see it’s available for everybody. I did my master’s degree based on this database.
- If you report a serious occurrence, you may get interviewed, debriefed and put under investigation. During the investigation you may not be allowed to work. After the investigation the company will decide what to do; most of the time they retrain you in the simulator, but they will not dismiss you, unless it is a deliberate violation.
- You can see your own report but not others’ reports. The safety department checks all the reports and maybe once or twice a year, we get bulletins. The bulletins will have information on the types of occurrences for example: air safety, fatigue, hazard and non-flight related. They will elaborate on specific examples to illustrate the issues and how to deal with it.
- We received around 30,000 reports last year and we code them into the types of event. For more serious events, if we need to, we will contact the controller’s units, but that is rare. We have three severity categories and if we’re not completely sure about the severity level, we always contact a third party called the national investigation board. The investigation board works fully independently of us so they can also take a critical view of our role as a regulator.
- Our main role as the regulatory authority is to oversee that the safety management system which the Air Navigation Service Provider [46] has implemented complies with legal regulations. We only contact the organization in case of an individual occurrence when there’s really a special training point or necessity to contact them because of their specific occurrence. If we have an audit for an air traffic control unit, one of our agenda points is a discussion or a meeting with the safety manager where we ask how many reports they received, and we cross check with our database. If the safety manager received 10 and we received 30 then we have to look at the discrepancy. So our main task is to audit whether the safety management system of the organization works properly.
- The current methods of training like Crew Resources Management [47] programs can improve the competencies of a population of pilots, but not of the individuals. Evidence-based training builds resilience, and this is becoming popular in aviation. In the US they call it Advanced Qualification Program [48]. It uses the large databases of occurrence reports and produces training programs for simulator training [49]. The databases are decoded into competencies and observable behaviors. This methodology complements the existing training programs to focus on specific requirements for specific individuals. For example, if the event happened because of lack of communication, they would implement communication strategies in the training for the pilots. The challenge is to have a very robust data collection system and software; it’s not a cheap process. The information is not only used to learn what people are doing incorrectly but also what people are doing correctly.
- The idea is to promote errors in the simulator so that they can learn in the simulator instead of the cockpit.
- We share the de-identified report information openly among the air traffic control and airport authorities. Mostly there’s one main service provider for each country. Within that organization there are safety manager(s), who meet regularly with international counterparts and are constantly in touch. They form working groups to share and discuss significant events and take the information back to their country for improving safety and efficiency. The culture that everybody can have such problems and they should talk about it and fix it is prevalent among air traffic controllers. If a pilot made a mistake, the controller would let the pilot know that it’s going to be reported. People talk openly about the requirement to report, so it’s known within the culture, within the workplace, that yes there is a strong reporting culture.
- There was this incident with a controller, which was not a wilful act; there were no contributing factors, it’s just something he didn’t spot. So, he said I’ve made this mistake and the organization said that was a bit silly, wasn’t it? He said I don’t know why I did it, but he told his story to the group and his story circulated amongst the teams and there was no blame and that was a significant moment within the organizational culture. The frontline operators saw this guy who was known as a good operator had made this mistake and wasn’t punished for it.
- The more serious events would be included in annual refresher training with opportunity to discuss. The rest would be communicated through emails with “read receipt” to make sure the controller has read it, and within the team there would possibly be some discussion. There is also an internal quarterly report categorized into the five key risk areas. These also go to the regulators; in addition, there is an annual report that also goes to the regulator. The regulatory function is a separate body; for certain serious occurrences we must inform the Accident Investigation Unit, which is a third party separate from us and the regulator.
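The triage workflow the aviation participants describe — code each report by event type, route uncertain severities to an independent board, conservatively flag repeat involvement for retraining, and de-identify reports before wider sharing — can be sketched as a minimal illustration. This is not any real system; all names, fields, and the three-level severity scheme are hypothetical stand-ins for the practices quoted above.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical three-level severity scheme, mirroring the "three severity
# categories" mentioned by the regulator participant.
SEVERITY_LEVELS = ("minor", "significant", "serious")

@dataclass
class OccurrenceReport:
    report_id: int
    reporter: str      # removed before the report is shared widely
    event_type: str
    severity: str      # one of SEVERITY_LEVELS, or "unsure"

def needs_third_party_review(report: OccurrenceReport) -> bool:
    # Reports whose severity is uncertain go to the independent investigation board.
    return report.severity not in SEVERITY_LEVELS

def repeat_involvement(reports, threshold=3):
    # Conservatively flag anyone involved in `threshold` or more events
    # for discussion, counselling, and retraining (not dismissal).
    counts = Counter(r.reporter for r in reports)
    return {person for person, n in counts.items() if n >= threshold}

def de_identify(report: OccurrenceReport) -> dict:
    # Strip the reporter's name before the report becomes available to everybody.
    return {"report_id": report.report_id,
            "event_type": report.event_type,
            "severity": report.severity}

reports = [
    OccurrenceReport(1, "A", "fatigue", "minor"),
    OccurrenceReport(2, "A", "air safety", "unsure"),
    OccurrenceReport(3, "A", "hazard", "minor"),
    OccurrenceReport(4, "B", "air safety", "serious"),
]
assert repeat_involvement(reports) == {"A"}
assert needs_third_party_review(reports[1])
assert "reporter" not in de_identify(reports[0])
```

The point of the sketch is the separation of concerns the interviews emphasize: severity triage and trend analysis operate on the data, while de-identification happens before anything reaches the wider workforce.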
6. Reward system for reporting and aim for zero incident statistics
- We don’t put up signs saying we aim for incident free situations; we have moved away from that. We are taught that you need to mitigate the threats and avoid errors; that’s the baseline of every operation. Errors will never be stopped and we can never be 100% error free, but we need to mitigate the threats that exist.
- In my previous company they made fake reports because the auditor would ask how it was possible to have no occurrence reports. In my current company there is no need to do that because there are lots of genuine reports to be shown during audits. We don’t have the culture of zero incident targets.
- No, we don’t have zero incident targets. It can be seen from air traffic management platforms that we had 1000 days without accident and then something really serious happened, so if we say zero incident target we are just playing with figures.
- I am a bit skeptical that people would then start to report just to get a reward; it’s not about the reward. So, it’s a kind of mindset which is so important; it has to be natural for the person to report because they want to share their experience so that other people will not face the same situation.
4. Discussion
- Involve key stakeholders—especially laboratory workers, managers, and leaders—to define what just culture means within the organization.
- Establish a simple incident reporting system, train a dedicated team to analyze reports, and translate findings into actionable learning outcomes.
- Share lessons learned widely with all staff to foster a collective learning culture.
- Build trust by clearly communicating that genuine mistakes can be reported without fear of blame or reprisal.
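The steps above — accept every report through a simple channel, have a dedicated team turn reports into learning outcomes, and share the lessons without names attached — can be sketched as a minimal reporting loop. This is an illustration only, assuming a hypothetical in-memory log; every name and field is invented, not taken from any system described in the interviews.

```python
def submit_report(log: list, description: str, reporter: str) -> int:
    """Accept every report; the reporter's identity is stored but never shared."""
    entry = {"id": len(log) + 1, "text": description, "_reporter": reporter}
    log.append(entry)
    # The reporter gets a file number back, as in the aviation accounts above.
    return entry["id"]

def lessons_for_all_staff(log: list) -> list:
    """De-identified learning bulletin: what happened, never who was involved."""
    return [{"id": e["id"], "text": e["text"]} for e in log]

log = []
ref = submit_report(log, "glove punctured while handling samples", "staff_1")
bulletin = lessons_for_all_staff(log)
assert ref == 1
assert all("_reporter" not in item for item in bulletin)
```

The design choice worth noting is that confidentiality is structural, not procedural: the sharing function simply has no access path to the reporter's name, which is one concrete way to back up the trust-building message in the last bullet.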
5. Conclusions
6. Limitations
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Wikipedia Contributors. Commercial Aviation. Wikipedia, The Free Encyclopedia. 2025. Available online: https://en.wikipedia.org/wiki/Commercial_aviation (accessed 2025).
- Barnett, A.; Reig Torra, J. Airline safety: Still getting better? Journal of Air Transport Management 2024, 119, 102641. [Google Scholar] [CrossRef]
- Clinton, J.; John, S.; Zorn, C. Analyzing aviation safety: Problems, challenges, opportunities. Research in Transportation Economics 2013, 43, 148–164. [Google Scholar] [CrossRef]
- International Civil Aviation Organization. State of Global Aviation Safety; International Civil Aviation Organization, Montréal, Canada. 2025. Available online: https://www.icao.int/sites/default/files/sp-files/safety/Documents/ICAO_SR_2025.pdf.
- Bieder, C. Safety Management Systems and their Origins: Insights from the Aviation Industry, 1st ed.; CRC Press, 2022. [Google Scholar] [CrossRef]
- Cheetham, J.; Ma, Y.; Murphy, M. Are planes crashing more often? BBC Verify. 2025. Available online: https://www.bbc.com/news/articles/c5ym8n4lzp6o.
- Enten, H. It feels like there are suddenly way more plane crashes and incidents. Here’s the truth. CNN Business (USA). 2025. Available online: https://edition.cnn.com/2025/02/19/business/airplane-crashes-statistics.
- Nanyonga, A.; Joiner, K. F.; Turhan, U.; Wild, G. Deep Learning Approaches for Classifying Aviation Safety Incidents: Evidence from Australian Data. AI 2025, 6(10), 251. [Google Scholar] [CrossRef]
- SKYbrary. Safety Occurrence Reporting. 2025. Available online: https://skybrary.aero/articles/safety-occurrence-reporting.
- Xing, Y.; Wu, Y.; Zhang, S.; Wang, L.; Cui, H.; Jia, B.; Wang, H. Discovering latent themes in aviation safety reports using text mining and network analytics. International Journal of Transportation Science and Technology 2024, 16, 292–316. [Google Scholar] [CrossRef]
- Xiong, M.; Wang, H.; Wong, Y. D.; Hou, Z. Enhancing aviation safety and mitigating accidents: A study on aviation safety hazard identification. Advanced engineering informatics 2024, 62, 102732. [Google Scholar] [CrossRef]
- European Union Aviation Safety Agency. Regulations . 2025. Available online: https://www.easa.europa.eu/en/regulations.
- FAA. AC 120-92D - Safety Management Systems for Aviation Service Providers; FAA, Ed.; 2024. [Google Scholar]
- Kıvanç, E.; Tuzkaya, G.; Vayvay, Ö. Safety management system and risk-based approach in aviation maintenance: A systematic literature review. Safety Science 2025, 184, 106755. [Google Scholar] [CrossRef]
- Çoban, R.; BÜKEÇ, C. M. Just Culture in Aviation: A Metaphorical Study on Aircraft Maintenance StudentsMaintenance Students. International Journal of Aviation, Aeronautics, and Aerospace 2024, 11(1). [Google Scholar] [CrossRef]
- Dekker, S. Restorative Just Culture: From Disciplinary Action to Meaningful Accountability; CRC Press, 2025.
- Pellegrino, F. The Just Culture Principles in Aviation Law: Towards a Safety-Oriented Approach; Springer International Publishing, 2019.
- Reason, J. Managing the Risks of Organizational Accidents; Routledge, 1997.
- Wikipedia Contributors. Just culture. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/w/index.php?title=Just_culture&oldid=1292104615 (accessed on 23 September 2025).
- International Civil Aviation Organization. Safety Management Manual (SMM), Doc 9859; ICAO, 2018.
- Kovacova, M.; Licu, A.; Balint, J. Just Culture – Eleven Steps Implementation Methodology for organisations in civil aviation – “JC 11”. Transportation Research Procedia 2019, 43, 104–112.
- SKYbrary. Just Culture. 2025.
- Al-Dmour, H.; AlKhawaldeh, H.; Al-Dmour, A.; Obidat, B.; Al-Dmour, R. The integrated role of Safety Management Systems (SMS) and risk management in achieving aviation sustainability. Discover Sustainability 2025, 6(1), 985.
- Mrusek, B.; Miller, M.; Olaganathan, R. Shared Leadership and Just Culture: Tools to Promote SMS Hazard Reporting. In 2020 IEEE Aerospace Conference, 7–14 March 2020; pp. 1–13.
- Ray, A. T.; Bhat, A. P.; White, R. T.; Nguyen, V. M.; Pinon Fischer, O. J.; Mavris, D. N. Examining the Potential of Generative Language Models for Aviation Safety Analysis: Case Study and Insights Using the Aviation Safety Reporting System (ASRS). Aerospace 2023, 10(9), 770.
- Vempati, L.; Woods, S.; Solano, R. C. Qualitative Analysis of General Aviation Pilots’ Aviation Safety Reporting System Incident Narratives Using the Human Factors Analysis and Classification System. The International Journal of Aerospace Psychology 2023, 33(3), 182–196.
- Markes, A.; Diab, M. Voluntary Incident Reporting in Health Care. The Journal of Bone and Joint Surgery 2025, 107(14), 1651–1656.
- Morrow, S.; Koves, G.; Barnes, V. Exploring the relationship between safety culture and safety performance in U.S. nuclear power operations. Safety Science 2014, 69, 37–47.
- International Organization for Standardization. ISO 35001:2019; Biorisk management for laboratories and other related organisations; ISO, 2019.
- World Health Organization. Laboratory Biosafety Manual, 4th ed.; 2020.
- Centers for Disease Control and Prevention. Biosafety in Microbiological and Biomedical Laboratories, 6th ed.; 2020. Available online: https://www.cdc.gov/labs/pdf/SF__19_308133-A_BMBL6_00-BOOK-WEB-final-3.pdf (accessed on 30 April 2025).
- Bayot, M. L.; King, K. C. Biohazard Levels; StatPearls Publishing, 2022.
- Ta, L.; Gosa, L.; Nathanson, D. A. Biosafety and Biohazards: Understanding Biosafety Levels and Meeting Safety Requirements of a Biobank. Methods in Molecular Biology 2019, 1897, 213–225.
- Perkins, D.; Danskin, K.; Rowe, A. E.; Livinski, A. A. The Culture of Biosafety, Biosecurity, and Responsible Conduct in the Life Sciences: A Comprehensive Literature Review. Applied Biosafety 2019, 24(1), 34–45.
- Ritterson, R.; Kingston, L.; Fleming, A. E. J.; Lauer, E.; Dettmann, R. A.; Casagrande, R. A Call for a National Agency for Biorisk Management. Health Security 2022, 20(2), 187–191.
- World Health Organization. Strengthening laboratory biological risk management. 2024. Available online: https://apps.who.int/gb/ebwha/pdf_files/WHA77/A77_R7-en.pdf.
- US Department of Labor. Incident Investigation. 2025. Available online: https://www.osha.gov/incident-investigation.
- Blacksell, S. D.; Dhawan, S.; Kusumoto, M.; Le, K. K.; Summermatter, K.; O’Keefe, J.; Kozlovac, J. P.; Almuhairi, S. S.; Sendow, I.; Scheel, C. M.; et al. Laboratory-acquired infections and pathogen escapes worldwide between 2000 and 2021: a scoping review. The Lancet Microbe 2024, 5(2), e194–e202.
- Zavaleta-Monestel, E.; Rojas-Chinchilla, C.; Anchía-Alfaro, A.; Quesada-Loría, D.; García-Montero, J.; Arguedas-Chacón, S.; Hanley-Vargas, G. Tracking the Threat, 50 Years of Laboratory-Acquired Infections: A Systematic Review. Acta Microbiologica Hellenica 2025, 70(2), 11.
- Manheim, D.; Lewis, G. High-risk human-caused pathogen exposure events from 1975-2016. F1000Research 2021, 10, 752.
- Vijayan, V. Understanding Work-as-Imagined and Work-as-Done in Biomedical Laboratories. Applied Biosafety 2025.
- UK Civil Aviation Authority. Occurrence Reporting for General Aviation; 2023.
- International Civil Aviation Organization. Annex 13: Aircraft Accident and Incident Investigation; 2016.
- Wikipedia Contributors. Telemetry. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Telemetry (accessed 2025).
- SKYbrary. B763, Delta Air Lines, Amsterdam Schiphol Netherlands, 1998 (Legal Process - Air Traffic Controller). 1998. Available online: https://skybrary.aero/articles/b763-delta-air-lines-amsterdam-schiphol-netherlands-1998-legal-process-air-traffic.
- Wikipedia Contributors. Air Navigation Service Provider. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Air_navigation_service_provider (accessed 2025).
- Wikipedia Contributors. Crew resource management. Wikipedia, The Free Encyclopedia, 2025 (accessed 2025).
- Federal Aviation Administration. Advanced Qualification Program (AQP); US Department of Transportation, 2024. Available online: https://www.faa.gov/training_testing/training/aqp.
- Wikipedia Contributors. Flight Simulator. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Flight_simulator (accessed on 11 December 2025).
- Nemmers, P. The Differences Between Incidents vs. Accidents in the Workplace; National Association of Safety Professionals, 2023.
- Chamberlain, A. T.; Burnett, L. C.; King, J. P.; Whitney, E. S.; Kaufman, S. G.; Berkelman, R. L. Biosafety Training and Incident-Reporting Practices in the United States: A 2008 Survey of Biosafety Professionals. Applied Biosafety 2009, 14(3), 135–143.
- Al-Firm, A. T.; Alshalawi, M.; Almarzouqi, M.; Alhuthil, R.; Qanbar, S.; Alsalmi, L.; Alaklabi, A. Perception of just culture among staff in a research organization. Industrial and Commercial Training 2025, 57(2), 232–241.
- Bükeç, C.; Çoban, R. A Qualitative Research on Factors Affecting Just Culture in Airlines. Anadolu Üniversitesi İktisadi ve İdari Bilimler Fakültesi Dergisi 2023, 24, 496–525.
- Murray, J. S.; Lee, J.; Larson, S.; Range, A.; Scott, D.; Clifford, J. Requirements for implementing a ‘just culture’ within healthcare organisations: an integrative review. BMJ Open Quality 2023, 12(2).
- Almansour, H. Barriers Preventing the Reporting of Incidents and Near Misses Among Healthcare Professionals. Journal of Health Management 2024, 26(1), 78–84.
- Moshiri, E.; Abbaszadeh, A.; Shahcheragh, S. H. Prioritizing just culture: A call to action for patient safety. Nursing Practice Today 2025, 12(2).
- Engeda, E. H. Incident Reporting Behaviours and Associated Factors among Nurses Working in Gondar University Comprehensive Specialized Hospital, Northwest Ethiopia. Scientifica 2016, 2016(1), 6748301.
- Oweidat, I.; Al-Mugheed, K.; Alsenany, S. A.; Abdelaliem, S. M. F.; Alzoubi, M. M. Awareness of reporting practices and barriers to incident reporting among nurses. BMC Nursing 2023, 22(1), 231.
- Aziida, N.; Joiner, K. F.; Turhan, U.; Wild, G. Deep Learning Approaches for Classifying Aviation Safety Incidents: Evidence from Australian Data. AI 2025, 6(10), 251.
- Zierman, R. Identifying Aircraft Damage Mitigating Factors with Explainable Artificial Intelligence (XAI): An Evidence-Based Approach to Rule-Making for Pilot Training Schools. Journal of Aviation/Aerospace Education and Research 2024, 33(4).
- de Kam, D.; Kok, J.; Grit, K.; Leistikow, I.; Vlemminx, M.; Bal, R. How incident reporting systems can stimulate social and participative learning: A mixed-methods study. Health Policy 2020, 124(8), 834–841.
- Snyder, B. C.; Wentzel, J. M.; Epstein, G. L.; Kadlec, R. P.; Parker, G. W. Trust, but Verify: A “Just Culture” Model for Oversight of Potentially High-Risk Life Sciences Research. Applied Biosafety 2025, 30(2), 17–111.
- van Baarle, E.; Widdershoven, G.; Molewijk, B. Just culture as dialogical learning: theoretical foundations and practical implications of restorative justice. Journal of Medical Ethics 2025, jme-2025-110761.
- Miaoulis, G.; Manev, I. M. Personal and Organizational Responsibility in the Delivery of Healthcare Services: Breaking the Code of Silence. Health Services Insights 2025, 18, 11786329251356095.
- Tasker, A.; Jones, J.; Brake, S. How effectively has a Just Culture been adopted? A qualitative study to analyse the attitudes and behaviours of clinicians and managers to clinical incident management within an NHS Hospital Trust and identify enablers and barriers to achieving a Just Culture. BMJ Open Quality 2023, 12(1).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
