Preprint
Article

This version is not peer-reviewed.

Practicing Just Culture in Biomedical Laboratories: Lessons from Aviation

Submitted: 20 December 2025

Posted: 23 December 2025


Abstract
A "just culture" is a workplace environment that balances accountability with learning, where employees feel safe to report errors and safety concerns without fear of blame. In commercial aviation, a just culture is a legal requirement anchored in the safety management system; it is not a feature of laboratory biorisk management systems. This study used interview data from fifteen biomedical laboratory workers and five aviation workers to understand the differences in incident reporting practices. Results show that laboratory workers were extremely reluctant to report safety incidents for fear of blame and stigma. Only two of the fifteen participants worked in a culture where reporting was encouraged. If an incident caused no ill health, it was not reported, missing very valuable learning opportunities. In aviation, by contrast, safety occurrence reporting was strongly encouraged, curated, and developed into targeted training programs. A key lesson biomedical laboratories can draw from this study is that, instead of spending time investigating every incident, reports can be collected and developed into learning opportunities. If management ensures that blame will not be assigned unless the intent was malicious or the act was willfully negligent, and if workers feel that collectively they can improve safety and productivity, they will start to report.

1. Introduction

Today, commercial aviation is one of the safest modes of transportation. Commercial aviation refers to the sector involving the operation of airlines, commercial airports, and aircraft for the purpose of transporting passengers and cargo for profit [1]. It is regulated and supported by each region’s national agency (for example, the United States Federal Aviation Administration and the European Union Aviation Safety Agency) as well as international agencies like the International Civil Aviation Organization (ICAO) and the International Air Transport Association (IATA). A decade ago, there was one accident for every 456,000 flights; today the five-year average is one accident for every 810,000 flights [2,3,4]. A passenger could on average choose one flight at random every day for 220,000 years before succumbing to a fatal accident [2,5]. Recent high-profile air accidents may cast doubt on these figures, but robust analysis of the data shows a downward trend in air accidents over the past two decades. Specifically, the figure for January 2025 (52) was lower than that for January 2024 (58) and January 2023 (70) [6,7].
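The 220,000-year figure can be sanity-checked with a back-of-envelope calculation (the one-flight-per-day schedule and a constant per-flight fatal-accident rate are illustrative assumptions, not claims from the cited sources):

```latex
% Illustrative assumptions: one flight per day, constant fatal-accident rate p = 1/N.
% The expected number of flights before a fatal accident is then N,
% so the expected waiting time in years is N / 365.
\[
  \underbrace{220{,}000\ \text{years}}_{\text{cited figure}}
  \times 365\ \tfrac{\text{flights}}{\text{year}}
  \;\approx\; 8.0 \times 10^{7}\ \text{flights}
\]
% That is, the cited figure implies a fatal-accident rate of roughly one per
% 80 million flights.
```

The gap between this implied rate and the one-per-810,000-flights figure is expected: the latter counts all accidents, while the 220,000-year figure concerns fatal accidents only, and most accidents are not fatal.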
One of the most important tools that has improved aviation safety is safety occurrence reporting and the collection of data, which is de-identified, curated, analyzed and used for the purposes of improving safety [4]. A large number of studies and dissertations have used these databases to learn and further improve aviation safety [8,9,10,11].
In the aviation industry, safety is uniformly administered through the implementation of a comprehensive safety management system. This is mandatory and is controlled by national and international regulatory requirements [5,12,13,14]. Over the past 20-30 years, the aviation industry has incorporated just culture principles into its safety management systems. The term just culture refers to an organizational culture of trust in which people feel safe to report safety concerns and errors without fear of reprisal. It fosters a learning environment by distinguishing between unintentional human errors, which require process improvement, and willful violations, which demand disciplinary action [15,16,17,18,19]. Just culture principles are written requirements in aviation industry regulations and safety management manuals. They are also integrated into training for all workers in aviation [20,21,22].
This approach gained traction in the 1990s, with growing acknowledgement of the benefits of safety occurrence reporting over the punishment of individuals, which cost the industry learning and improvement opportunities. This led senior leaders to design a system for non-punitive safety reporting, which eventually morphed into the just culture principles that are so well anchored in the aviation industry today [23,24]. While it took more than 20 years, the success of just culture implementation is widely accepted and has been very impactful [4,25,26]. While just culture practice started in the aviation industry, it has since been adopted by other industries, including nuclear power, maritime transportation, railways, and healthcare [27,28].
The term ‘biological laboratories’ can be applied to any laboratory that handles biological materials. The types of work undertaken in these laboratories include research, diagnostics, manufacturing, biorepositories, and others [29,30]. Typically, such laboratories can be divided into low-risk and high-risk laboratories, generally using the system of Biosafety Containment Levels 1 to 4, with 4 being the highest risk and requiring extensive mitigation measures to contain the risks [31]. Biosafety Level 2 containment is the most common type, encompassing most clinical diagnostic and university research laboratories [32,33]. The safety management system used in biomedical laboratories is called a biorisk management system (BMS). A BMS is a systematic approach to identify, assess, control, and monitor both safety and security risks associated with biological agents and toxins. Many countries have national regulations on handling biorisks in laboratories. These, together with internationally accepted guidance documents like the World Health Organization Laboratory Biosafety Manual [30], the Biosafety in Microbiological and Biomedical Laboratories, 6th Edition, published by the Centers for Disease Control and Prevention [31], and ISO standards (ISO 35001:2019) [29], form the basis for the structure of a BMS. These standards and guidance documents emphasize risk assessment and the implementation of suitable and sustainable risk mitigation measures as a core part of the BMS. The implementation of a BMS in high-risk laboratories (Biosafety Levels 3 and 4) is more tightly governed by national regulations than in laboratories classified as Biosafety Levels 1 and 2. This leaves many gaps in the implementation of a BMS even within one country, and these gaps are amplified globally [34,35,36].
Incident reporting is a part of the BMS; however, just culture is not included in it. The term safety occurrence [9] is synonymous with incident, a term used for all events that result in injury as well as close calls (sometimes called “near misses”) [37]. The term laboratory-acquired infection (LAI) is used when a worker contracts an infection from exposure to a pathogen in a laboratory setting, such as through accidental inoculation, inhalation, or ingestion. This is considered one of the most serious negative outcomes of a laboratory accident, because the infection can spread to the family and community through the infected person [38,39]. A current, albeit limited, initiative to report LAIs on a global scale is a publicly accessible international database on LAIs hosted on a Belgian server (https://www.biosafety.be/content/laboratory-acquired-infections-and-bio-incidents). However, this contains only a fraction of the LAIs that may have occurred throughout the world, and not enough information to use the data for in-depth analysis. Furthermore, there would be many near misses that could have led to an LAI but did not, and there is no information on these occurrences, which could have formed excellent learning and improvement opportunities. Additionally, several research papers have emphasized the need for international collaboration and transparency in understanding safety occurrences and incidents in biomedical laboratories so that lessons can be shared throughout the world [38,39,40].
In the field of biomedical laboratories, the term ‘just culture’ is almost nonexistent; a search of the literature yielded very little information about such a concept in the management of safety and security in biological laboratories. This was demonstrated in a recent study published by the author of this paper on Understanding Work-as-Imagined and Work-as-Done in Biomedical Laboratories [41]. In that study, it was noted that most workers in the laboratory do not like to report incidents or even accidents because they are afraid of blame. They would do a mental risk assessment of the incident to see if it was worth reporting. If it did not affect their health, they preferred not to report; in some cases, they decided not to report despite health risks, due to fear of blame and reprisal. This follow-on study analyzes how just culture practices adopted in the aviation industry can be applied to biomedical laboratories. The study is based on the incident reporting culture described by fifteen participants from biomedical laboratories and five from the commercial aviation industry, to compare and understand the challenges of implementing just culture in biomedical laboratories.

2. Materials and Methods

Convenience sampling was used to recruit a total of twenty participants from the author’s network via email or WhatsApp. Participants were from biomedical laboratories and the commercial aviation industry. Verbal consent was obtained prior to the one-time virtual interview, which lasted 45-60 minutes. Interviews were audio-recorded with participants’ consent, and all recordings were subsequently transcribed, anonymized, and stored in encrypted electronic format. The semi-structured one-on-one interview method gave the interviewer an opportunity to follow up on the participants’ experiences and knowledge and to pursue related topics. It also allowed the discovery of information that may have been important to the participant but was not known to the interviewer prior to the interview, thus allowing the interviewer to explore other dimensions of the research question. No confidential information is included in the preparation of this article.

2.1. Biomedical Laboratory Participants

Fifteen participants, whose roles were laboratory heads, laboratory workers (technicians and research assistants), and post-doctoral fellows working with infectious agents in Biosafety Level 2 laboratories, were interviewed. The definition of incidents used in this study was adopted from the U.S. Department of Labor Occupational Safety and Health Administration (OSHA). OSHA uses the term “incident” for all events where a worker was hurt, as well as close calls sometimes called “near misses” [37].
The questions and topics used for the interview were as follows:
  • How do you learn the lab work and the safety aspects?
    o Incidents and accidents
    o Reading safety data sheets
    o Reading safety standard operating procedures (SOPs)
    o From mentors
  • Is there an incident reporting system in your organization?
  • Is incident reporting encouraged in your organization?
  • Is there fear associated with reporting incidents?

2.2. Aviation Industry Participants

Five workers from the commercial aviation industry, whose roles were cockpit crew, air traffic controller, and regulatory air traffic control management, were interviewed.
The definition of safety occurrence used in this study is adopted from the UK Civil Aviation Authority’s guidance document entitled Occurrence Reporting for General Aviation, which states that “Occurrence means any safety-related event which endangers or which, if not corrected or addressed, could endanger an aircraft, its occupants or any other person and includes in particular an accident or serious incident” [9,42,43].
The questions and topics used for the interview were as follows:
  • How can organizations address the fear of blame when reporting incidents?
  • In your industry, how is action taken if an incident is reported?
  • What motivation works best for incident reporting especially relating to near misses and unsafe practices?
  • How does the culture of the organization help to encourage incident reporting?
  • Are there clear principles of blame-free incident reporting, and are these principles communicated and followed within your organization?
  • Does the ease of reporting and the paperwork that follows make a difference to reporting?
  • How does your industry focus on Learning vs. Blame?
  • How does your industry reward and support incident/near miss reporting?

3. Results

Data source 1: Fear of blame in reporting incidents and near misses in biomedical laboratories:
The 15 participants’ responses are as follows:
  • One participant said the organization has no system to report incidents.
  • Two participants said that their organization has a system to report incidents, and that reporting is encouraged by the manager.
  • Twelve participants said their organization has a system to report incidents, but only incidents of a serious nature were reported.
Focusing on the twelve responses where only incidents of a serious nature were reported, these participants performed a mental risk assessment on the nature of the incident to decide whether or not to report. The most common causes for not reporting were:
  • Blame assigned to the person and team
  • Stigma attached to the person reporting
  • The reporting staff may be deemed to be clumsy
  • Mistakes were seen as personal failures
  • Paperwork is very complicated
  • Fear of losing the job
  • Reporting resulted in more rules
  • Organization emphasizes zero incident policy
Below are specific examples of how participants described their mental risk assessment process and the decision to report. The responses are paraphrased for easy understanding.
  • If it’s a small cut with a clean sharp instrument, I will only discuss it within my team and I will not report it. I would only report if it was anything more serious or the sharp instrument was infected.
  • It’s very blame shifting; the safety department doesn’t think so much in terms of actual safety they just make rules. I guess that is the biggest impediment. If you report the incident, an email is sent with new rules as opposed to why this incident happened and what we can learn from it.
  • Even if they say no blame will be assigned, the person about whom we report will feel bad and we cannot be sure that no action will be taken. If a senior does something, I am afraid it will have repercussions if I report. Therefore, there are people who see someone doing something wrong but don’t want to say it.
  • Small confidential group discussion will be hugely beneficial because I am very sceptical that people will be willing to expose their frailties or errors by reporting incidents.
  • While working with HIV (Human Immunodeficiency Virus), we have to wear safety goggles when retrieving virus samples from the liquid nitrogen tanks. A particular staff member had done this many times and probably wondered why we need to wear safety goggles. One time while retrieving samples, the staff member did not wear the safety goggles. The vials burst open and some drops got into the staff member’s eyes. The funny thing is, the staff member hesitated even to tell the supervisor that this tube of HIV broke, because of fear of being scolded. I told the staff member that it was stupid, because HIV can ruin one’s whole life, and that the incident should be reported so that medical attention could be sought. I was amazed that there was this hesitation, and it took a lot of persuasion from me before the staff member reported it.
  • A staff member was working with the dengue virus in the biosafety cabinet (primary protective equipment) and a needle scratched through the gloves. The staff member felt a scratch, but after decontaminating and doffing the gloves and looking, found no scratch on the finger; only the glove was punctured. The staff member therefore chose not to report it because there was no break in the skin. According to protocol, it should have been reported anyway, because the needle punctured the gloves and the staff member felt the scratch, even though it appeared not to break through the skin.
  • Reporting culture depends on the Principal Investigator (head of the laboratory) and his/her attitude. For every experiment there are always some risky parts, for example while using the ultracentrifuge. When we are taught by the seniors, they tell us to just approximately pour the liquid into the tubes and that there is no need to balance the tubes by weighing them as per the correct procedure. A staff member was using the centrifuge and it exploded. As always, they accused the last person who used it before the explosion of not balancing the tubes. No one was balancing it, and this was not corrected. After this incident, they set up a booking system, but in practice only a few responsible ones logged in.
  • There are many instances with equipment. A student may have some problems, like equipment breaking during use, and they choose not to report it. We have to look at CCTV footage to see who was using the equipment. Even though we do not blame, if something happens, or a tube breaks within the centrifuge because you didn’t balance the centrifuge properly, the user should be warned that this is not the proper way to use the centrifuge. Sometimes the work culture of dealing with things is incorrect; staff are doing it incorrectly and they teach this to newcomers. I also think that they don’t report due to multiple factors, like the paperwork and not wanting people fussing over them. Not reporting makes it easier and does not put the department under investigation. There is a tendency for the person to be blamed, leading to an inferiority complex. It should be done in a better manner; for example, if there is an incident, then we should discuss it without using the names of the people involved, but rather explain what happened and what might be the correct way to do it.
  • As a supervisor, I encourage my team to report incidents, but at first incident reporting was seen only negatively. It took 2-3 years before the team members understood that incident reports are not negative. Now incident reports have become a part of life in the laboratory. Team members understand that they have to report to improve the system and also to learn from mistakes. At first it was not easy, but it got easier for them, and now team members submit incident reports even without being asked. However, even with better incident reporting practices, unless it is really bad, like accidentally dropping a whole bottle of formaldehyde, people don’t report it. So, even with the current improved reporting practices, they do an assessment in their head about what to report. Of course, if the incident is their fault they’re not happy about it, but they understand that this is how things happen here. It is not personal and you just have to follow this; sometimes people just make mistakes and things happen to us.
  • Speaking as a supervisor, my facility does not impose a zero incident requirement. When I was a junior, there was a blame system in the institution. Because I grew up in a system where they blame you instead of correcting you or mitigating the problem, I started a blame-free system within my own unit/section during the course of my career. In other units/sections, the staff still take it negatively, because if they make an incident report it may affect their performance evaluation. In my section, we encourage incident reporting not to blame people but to mitigate the situation and to learn.
Data source 2: Practice of Just Culture to encourage and enable reporting of safety occurrences in aviation industry:
All participants said that the law requires that air traffic management and the aviation industry adopt a just culture policy which is a part of the safety management system. Under these circumstances, not reporting safety occurrences is a non-conformity and against the law. All participants made it clear that arriving at work intoxicated, engaging in deliberate malicious acts, and committing acts of willful negligence fell outside the principles of just culture. Participants’ responses were collated into six themes listed below; each theme is accompanied by relevant paraphrased excerpts of the conversations.
1. Just culture: a legal requirement anchored in the aviation Safety Management System (SMS)
A. All participants said that the introduction of just culture and the practice of blame-free reporting of safety occurrences came about in the wake of some serious accidents around 20 years ago. Participants talked about the stakeholders involved in the practice of just culture and the importance of trust among them. The aviation industry manages safety through a comprehensive SMS (IATA, 2025). They said that it took 15-20 years for the aviation industry to make just culture a part of its safety management system and legal requirements.
  • In the late 1990s it became a mandatory requirement to report incidents. Two years later, there was an accident where the air traffic controllers were blamed and treated badly. Following this, a more formal process to protect controllers and to encourage them to report was developed. We call this ‘just culture’, which is a clearly written requirement in the law and safety manuals. Now when a frontline operator files a report, they know it won’t be a witch hunt.
  • I have over 30 years’ experience in air traffic control. If we go back through the evolution of occurrence reporting, in the early 1990s people didn’t report. If you had a serious occurrence, you said nothing. There was a legal requirement that the radar tapes and the voice tapes be retained for 30 days and if there was an occurrence people would say nothing for the 30 days. After that you could talk about it because you knew the radar tapes would be recycled. This was called being part of the ’30-day club’. Then there became a requirement to report, and it started to become established in the mid-nineties. After that the industry got serious about safety and implemented a formal SMS.
  • Just culture is a legal term within aviation regulation, and it states that you can’t be prosecuted for reporting safety occurrences. It has taken nearly 15 years to establish this.
  • In the aviation industry, we really had to be very patient and accept that it took decades to implement this. If I may suggest, we should not be too optimistic, and should adjust expectations to a realistic time path. Before we get such a system implemented, we must take time to explore the possibilities. What we have now in aviation is not perfect, but I think given the circumstances it’s the best we can realistically achieve. You must look at the industry’s expectations: what do safety, prevention, and collective learning mean for the authorities? Once you get that picture about the expectations and what is in the interest of the authorities, then I think we can carefully go to the next step. We must be patient, and it will take many years to establish just culture requirements in the legal system, which has not yet happened in all industries.
B. Just culture was explained by all participants as an organizational atmosphere of trust where people feel safe to report safety concerns and errors without fear of reprisal but are also held accountable for their actions. It fosters a learning environment by distinguishing between honest, unintentional mistakes and reckless or willful violations, encouraging systems to be examined rather than solely blaming individuals.
  • If you made a mistake during a flight, expelling the person cannot address the problem. You would learn more by asking, “why did you do that?” This information can change the training, the system support, or even the design.
  • The information and data about the occurrences have more value than the punishment of people who make mistakes. But it is a sensitive balancing act and there’s no one-size-fits-all.
C. When asked if reporting would be as prevalent even if it was not part of the law and the SMS, participants replied as follows:
  • I have worked before in an organization with a blame culture, and people would hide as much as they could, reporting only occurrences mandated in the manuals. Even if they reported, they would provide very little information. Now I work for an internationally acclaimed airline and it’s a completely different world. Here the company says three things: if you’re not under the influence of any substances, if you’re not stealing, and if you’re not lying, nothing will happen to you, just report it. Just culture is taught when a worker is initially trained as well as in subsequent trainings, and it applies to all workers. In my current job, the idea of reporting is to learn and to avoid the same thing happening again. This drives innovation of processes and makes work more efficient and safer.
  • Yes, the organizations saw the added value of investing in blame-free reporting culture. Insurers are a stakeholder in the aviation industry, who bear financial risk. Based on what I have heard, the insurers were promoting just culture principles even before it went into law. This is in their own interest, because if you decrease the number of incidents and promote excellence, the financial risk can be reduced.
  • There is a three-partner relationship: the reporter, the organization, and the authorities. The law is very clear that the organization must have a just culture policy anchored in the SMS. The same is for the authorities who must have a safety policy based on just culture principles and that’s why as regulators we are not allowed to share the information from individual occurrence reports with third parties, unless required by law.
2. Mindset and thinking
The aviation industry, and flight details especially, are heavily monitored. All participants said that the mindset among workers is that they are being monitored even without reporting, and that by reporting they are improving the system for safety and efficiency.
  • There is something called telemetry [44], which transmits data to a remote location for further analysis. Since we know that everything is tracked, we might as well report it. That is the sentiment now.
  • One way to make the change to just culture and encourage incident reporting is to make people understand that their reporting is what has improved not only safety but also efficiency. It may get them to think: if everybody is reporting and we are finding better ways to do things, why don’t I do it too? So, it’s a form of “official gossip” which is positive for you and the organization. Every Friday, we get a curated report, and everyone gets to read what is happening in the company. Often, not reporting gets a reprimand. The supervisor will say that if you had reported it would have been OK, but by not reporting, an opportunity to learn and correct is lost. This spreads the message that the person is not being reprimanded for the error but for hiding it.
  • There is very strong wording in the safety manual that we are given the ability to report and nothing will happen, thanks to unions and leadership style. I will give you one case example: we had a newly employed cabin crew member who reported fatigue a few times, and it led to the probation period being extended; they were not able to continue the job as cabin crew. In this case, I heard that they were called to a meeting with the supervisor. If you report fatigue frequently, the boss will verify whether it is because you were out late with your family or friends and then reported fatigue the day after. In that case, it’s bad planning. But if you’re working long shifts for 3-5 days continuously and you were given a noisy hotel room, and that is the reason for reporting fatigue, then it will not be a problem; they will investigate it. People understand that if they don’t report, nothing will happen, but there also will not be any improvements in processes or safety.
  • The first thing that comes to my mind is the question, why do we report? People should know that reporting has huge learning potential, so I think that what is so important with the reporting system is the mindset of collective learning. We can all learn to become better and tomorrow if you report something about me, that becomes an eye opener for me.
  • So, there are a couple of things: if there’s no legal requirement for people to report, they’ll say, why would I bother to report? First, you must involve the frontline workers in the investigation process so that they can see it’s not a witch hunt, that it’s a blame-free learning environment. When the de-identified information comes back to the frontline operator as a learning opportunity, that’s when you get buy-in.
  • There is an app which can show if a worker has been removed from duty due to an incident. Coworkers can see this, and previously they could take a screenshot and share it among themselves, so that it became a source of shame. Now they’ve changed the app so that when you try to take a screenshot, the screen goes black. This tells co-workers that the management takes confidentiality seriously and does not encourage spreading information like this.
  • Airlines have worked very hard to change the mindset. They explain that error is part of our work and part of being human. It is important to acknowledge the error, report it and fix it rather than hide it. What we are looking at now is resilience, the ability to bounce back when you have made errors. In my previous blame-culture experience, I think workers feared the company not their colleagues. To achieve active reporting of incidents it is important that people feel safe, and they know that nothing will happen to them. Culture is very important; my current company has a very mature system and they are able to trickle down the blame-free reporting and just culture principles from the top. People report everything including efficient work practices as well as safety occurrences. The company decides how to disseminate it and whether it needs to be added to the safety manuals or the protocols.
  • I have not seen anyone write a report about a co-worker. If they have any issues, they just talk about it openly or take it to the manager. We have a very mature way to sort out things. For example, if a junior pilot or cabin crew did something non-compliant, we talk about it and discuss how to ensure training and communication is adequate. We aim at developing the team instead of blaming the team.
  • About eight years ago there was an occurrence that needed to be reported. The controller involved didn’t want to report it. After two or three days he was convinced by the rest of the team to report this event, and he did. Within the team, if somebody saw something that they were uncomfortable with, they would probably discuss it with a colleague or supervisor.
3. Trust
All participants agreed that trust is an attribute of paramount importance because if the involved parties do not trust each other there will be no reporting.
  • I think in 1989 there was a ground incident in a control tower which was investigated internally by the organization, and it was deemed that there was no culpability. People were retrained and no blame was assigned. A few months later, the state prosecutor became aware of the incident and decided to prosecute the staff involved. So, a legal case was taken against three members of staff as I understand it, and two of them were convicted, but no sentence was given. What happened overnight in the organization was that reporting stopped, simply stopped [45].
  • For incident reporting you need peer buy-in and experienced people who are involved in the fully transparent process. To apply just culture, we need a joint decision-making panel which includes frontline operators, so that it’s not perceived as a management tool.
  • Regardless of the industry, an effective just culture policy and reporting system cannot be developed unless we first work on mutual trust. Without trust it’s basically a waste of time to start thinking about it. This requires a lot of time and patience because we can’t write trust into regulations and standard operating procedures (SOP).
  • There is a specific paragraph in our manuals that says the internal just culture policies and regulation of an organization may only be introduced after consulting with employee representatives. This means the organization must have given the employees the opportunity to give their input. Reporter(s) must feel fully confident that they will not be blamed for acceptable errors. If they are going to be blamed for that then you will ruin your whole reporting culture.
  • There was a serious occurrence a few years ago in which I expected that the controller would be blamed, but there was no blame. If you admit to a mistake your unit manager has a responsibility to debrief you on the occurrence; anecdotally controllers use the term ‘just cultured’ for this type of debrief. On the other hand, if you do not report an event, you can be prosecuted because you’re breaking the law. So, people are probably more afraid not to report than the consequences of reporting. The reporting system covers very minor events that have no significant impact, all the way to very serious events.
4. Categorizing the severity of the accident
All participants said that the reportable incidents, their categories, and their severities based on risk assessment are all written in the documents and that this is part of the safety management system. If in doubt, staff refer to these. Sometimes, the team will openly discuss among themselves the reportable nature of an incident and, if in doubt, are likely to err on the side of caution and make the report.
  • There is a chapter in our operations manual which is based on ICAO Annex 13 [43]. This annex is about aircraft incidents and accidents and lists around 50 topics on which filing a safety report is mandatory. In addition, the airlines have their own manual which clearly states what is a reportable safety occurrence. There are situations where the crew may discuss among themselves the need to report a particular occurrence and often, they err on the side of caution and report it anyway.
  • We have manuals that clearly state what needs to be reported. When we report to the company, the company will decide on reporting to the authorities based on the manuals and the legal requirements.
5. The process of reporting and what is done with the data
All participants said that there are thousands of reports filed every year, which are reviewed by a designated team. These large databases are curated, analyzed, and turned into valuable communications and training material. All participants said that before anything is shared, all identifying information is removed.
  • Reporting is quite easy and if you omit something it will be addressed afterwards, and you can give some extra information if required.
  • There is a department that parses through the reports and performs a very systematic analysis. They use the data to pick up occurrences that can provide good learning material and use them for training and retraining.
  • Trend analysis is very big because it’s data driven. Sometimes, we can’t be certain if an event is attributable to the person’s capability or it was just the luck of the draw. If the same person was involved in two or three things, the company would steer towards being conservative and assume that there is some capability issue and discuss, counsel, and retrain the person. Yes, sometimes it’s unfair and we almost sympathize with the person because we know that tomorrow if we mess up three times the same way we will not be spared ourselves; even the vice president’s son will not be spared. So, it’s that maturity of the system in dealing with these serious things.
  • The large databases are used to provide safety and other bulletins periodically. We have a very mature way to sort out things, for example if you have a junior staff and they’re doing something incorrect then we talk and determine the reason. We address the problem instead of blaming the person or team.
  • Reporting is confidential. Eventually when the report is made available for others to view, the identifiable information is removed. The person reporting also gets a notification to thank them for reporting and they may be given a file number of their report. That’s the beauty of it: once they take your name out of it you can see it’s available for everybody. I did my master’s degree based on this database.
  • If you report a serious occurrence, you may get interviewed, debriefed and put under investigation. During the investigation you may not be allowed to work. After the investigation the company will decide what to do; most of the time they retrain you in the simulator, but they will not dismiss you, unless it is a deliberate violation.
  • You can see your own report but not others’ reports. The safety department checks all the reports and maybe once or twice a year, we get bulletins. The bulletins will have information on the types of occurrences, for example: air safety, fatigue, hazard and non-flight related. They will elaborate on specific examples to illustrate the issues and how to deal with them.
  • We received around 30,000 reports last year and we code them into the types of event. For more serious events if we need to, we will contact the controller’s units but that is rare. We have three severity categories and if we’re not completely sure about its severity level, we always contact a third party called the national investigation board. The investigation board works fully independently of us so they can also have a critical view at our role as a regulator.
  • Our main role as the regulatory authority is to oversee that the safety management system which the Air Navigation Service Provider [46] has implemented complies with legal regulations. We only contact the organization in case of an individual occurrence when there is really a special training point or a necessity to contact them because of their specific occurrence. If we have an audit for an air traffic control unit, one of our agenda points is a discussion or a meeting with the safety manager where we ask how many reports they received, and we cross-check with our database. If the safety manager received 10 and we received 30 then we have to look at the discrepancy. So our main task is to audit whether the safety management system of the organization works properly.
  • The current methods of training like Crew Resources Management [47] programs can improve the competencies of a population of pilots, but not of the individuals. Evidence-based training builds resilience, and this is becoming popular in aviation. In the US they call it Advanced Qualification Program [48]. It uses the large databases of occurrence reports and produces training programs for simulator training [49]. The databases are decoded into competencies and observable behaviors. This methodology complements the existing training programs to focus on specific requirements for specific individuals. For example, if the event happened because of lack of communication, they would implement communication strategies in the training for the pilots. The challenge is to have a very robust data collection system and software; it’s not a cheap process. The information is not only used to learn what people are doing incorrectly but also what people are doing correctly.
  • The idea is to promote errors in the simulator so that they can learn in the simulator instead of the cockpit.
  • We share the de-identified report information openly among the air traffic control and airport authorities. Mostly there is one main service provider for each country. Within that organization there are safety manager(s), who meet regularly with international counterparts and are constantly in touch. They form working groups to share and discuss significant events and take the information back to their country for improving safety and efficiency. The culture that everybody can have such problems and should talk about them and fix them is prevalent among air traffic controllers. If a pilot made a mistake, the controller would let the pilot know that it is going to be reported. People talk openly about the requirement to report, so it is known within the culture, within the workplace, that there is a strong reporting culture.
  • There was this incident with a controller, which was not a wilful act; there were no contributing factors, it’s just something he didn’t spot. So, he said I’ve made this mistake and the organization said that was a bit silly, wasn’t it? He said I don’t know why I did it, but he told his story to the group and his story circulated amongst the teams and there was no blame and that was a significant moment within the organizational culture. The frontline operators saw this guy who was known as a good operator had made this mistake and wasn’t punished for it.
  • The more serious events would be included in annual refresher training with an opportunity to discuss. The rest would be communicated through emails with a “read receipt” to make sure the controller has read it, and within the team there would possibly be some discussion. There is also an internal quarterly report categorized into the five key risk areas. These also go to the regulators; in addition, there is an annual report that also goes to the regulator. The regulatory function is a separate body; for certain serious occurrences we must inform the Accident Investigation Unit, which is a third party separate from us and the regulator.
6. Reward system for reporting and aim for zero incident statistics
Four of the five participants said that there was no reward for reporting; people report because they have come to understand the importance of such a system in making the entire organization safe and efficient. The one participant whose organization provides recognition said that the only reward is a coffee coupon and nothing more. Participants felt that reporting must be done as part of the culture and mindset because it helps the organization as a whole; since they are not penalized for reporting, there need not be any reward system. In fact, two participants said that providing rewards may be counterproductive because people may make up reports of events that did not really happen.
  • We don’t put up signs saying we aim for incident-free situations; we have moved away from that. What we are taught is that you need to mitigate the threats and avoid errors; that’s the baseline of every operation. Errors will never be stopped; we can never be 100% error free, but we need to mitigate the threats that exist.
  • In my previous company they made fake reports because the auditor would ask how it was possible to have no occurrence reports. In my current company there is no need to do that because there are lots of genuine reports to be shown during audits. We don’t have the culture of zero incident targets.
  • No, we don’t have zero-incident targets. It can be seen from air traffic management platforms that we had 1000 days without an accident and then something really serious happened, so if we say zero incident target we are just playing with figures.
  • I am a bit skeptical because people may then start to report just to get their reward, but it’s not about the reward. It’s a kind of mindset which is so important; it has to be natural for the person to report because they want to share their experience so that other people will not face the same situation.

4. Discussion

This study shows that in biomedical laboratories, safety occurrence or incident reporting is not a common practice. If there is an incident or even an accident [50], workers do a mental risk assessment to decide whether reporting is worth it for them. If they fear ill health or injury, they are more likely to report, but if an incident did not result in any injury, they are very unlikely to report. Among the 15 participants from biomedical laboratories, there were only two laboratories in which the head of the lab encouraged reporting. Even in these cases, the neighboring laboratories in the same institution did not have a practice of reporting safety incidents. The most common reason for not reporting an incident was fear of blame and repercussions. Similar observations have been made in other studies: Chamberlain et al. [51] conclude that laboratory workers do not report incidents due to fear and embarrassment, and a more recent paper shows that 42.7% of questionnaire respondents did not feel comfortable discussing work incidents with supervisors, stemming from fear of blame or discouragement from coworkers [52].
Just culture was first introduced in aviation to improve safety and reduce accidents and has gone a long way in making commercial aviation a safe mode of transportation [2,15]. It is true that its implementation is not uniform across all airlines and aviation-related organizations in spite of requirements by national and international agencies [15,16,53]. Nevertheless, it continues to play a significant role in ensuring air travel safety. To understand how just culture may be implemented in biomedical laboratories and what concepts can be part of the initial steps, this study interviewed five workers from the commercial aviation industry. One of the key factors in the aviation industry is the inclusion of just culture principles in their regulations as well as in their safety manuals, which is not done in biomedical laboratories. Furthermore, the manuals as well as the regulations in aviation provide a categorization of a safety occurrence with clear guidelines as to what occurrence falls under mandatory reporting [43]. Reporting is done regularly; as one participant put it, “the crew may discuss among themselves the need to report a particular occurrence and often they err on the side of caution and report it anyway”.
Another sector from which laboratory biorisk management can learn is healthcare. Just culture is not a legal requirement in either of these industries. The inclusion of just culture in managing patient safety has been attempted for several years now, with inconsistent results, and this can provide valuable lessons on how to embark on this journey in biomedical laboratories. A recent publication [54] reviewing the existing literature on just culture implementation in healthcare organizations found a scarcity of empirical evidence. The paper identified four main themes that are crucial for the implementation of just culture, namely leadership commitment, education and training, accountability, and open communication.
Repeatedly in healthcare, the fear of blame and reprisal is seen to be the main reason for underreporting of safety lapses throughout the world [55,56]. Studies conducted among nursing staff found that fear of blame, lack of leadership support, workload, and the reporting system itself were among the barriers to reporting [57,58]. Studies also show that healthcare workers did not have the knowledge and skills required for effective incident reporting. Reporting practices were based on staff’s own assessment of the importance of the incident to the patient [55], much like the mental risk assessment laboratory workers perform when deciding whether to report an incident.
Any small step towards just culture implementation will be of great value in improving not only laboratory safety but also overall work processes. At the same time, if biorisk management is to adopt a just culture policy, it is important to understand that one size does not fit all. In aviation, many things are machine-based, computerized, and standardized. In biorisk management, while many procedures are standardized, there are individual variations based on the situation. In biomedical laboratories it may not be possible to standardize all the processes because the working environment, the equipment, and the layout of each laboratory are so different that over-standardization and simplification could lead to more errors. But the concepts of open communication, learning from incidents, and blame-free reporting remain the same.
In aviation the emphasis is not on investigating every reported safety occurrence but on collective analysis of the accumulated data. These large data sets are used to develop evidence-based training suited to different individuals and situations [59,60]. This was also mentioned by one participant who said that “Evidence-based training builds resilience, and this is becoming popular in aviation”. This is precisely the result of a study of the Netherlands healthcare system, in which the authors conclude that there is a need to rethink the emphasis on investigating every single safety occurrence reported. The authors are of the opinion that instead of struggling to analyze every single report, much benefit can be obtained by aggregating reports and looking at trends and practices to improve both safety and productivity, precisely the way it is done in aviation [61]. Biorisk management will benefit greatly by conserving the time spent investigating every single incident and instead developing large datasets that can be deidentified, analyzed, and used to collectively improve safety and performance. Initially these databases can start at the organization level and, as the system matures, perhaps move to the national level. Snyder et al. [62] discuss the need for an independent authority that could aid the shift in culture and practices required to implement just culture principles in biomedical laboratories; this too could be considered as the system matures.
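The deidentify-then-aggregate workflow described above can be sketched in a few lines of code, purely for illustration; the field names, incident categories, and example data below are hypothetical and are not drawn from any real reporting system.

```python
from collections import Counter

# Hypothetical sketch: strip identifying fields from incident reports,
# then aggregate the deidentified reports into per-category counts for
# trend analysis. All field names and categories are invented.

IDENTIFYING_FIELDS = {"reporter", "lab", "date"}

def deidentify(report):
    """Return a copy of the report with identifying fields removed."""
    return {k: v for k, v in report.items() if k not in IDENTIFYING_FIELDS}

def trend_counts(reports):
    """Count deidentified reports per incident category."""
    return Counter(deidentify(r).get("category", "uncategorized") for r in reports)

reports = [
    {"reporter": "A", "lab": "L1", "category": "sharps", "date": "2025-01-02"},
    {"reporter": "B", "lab": "L2", "category": "spill", "date": "2025-01-09"},
    {"reporter": "C", "lab": "L1", "category": "sharps", "date": "2025-02-11"},
]

counts = trend_counts(reports)  # e.g. Counter({'sharps': 2, 'spill': 1})
```

The point is not the code itself but the workflow it encodes: identifiers are removed before any analysis, and attention goes to recurring categories rather than to individual reports.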
Participants from aviation all agreed that trust is of paramount importance and that without trust just culture cannot be implemented. One participant noted that trust cannot be written into SOPs or manuals. Unless staff feel fully confident that they will not be blamed for acceptable errors, just culture cannot be implemented [16,63]. This code of silence needs to be broken through supportive management that encourages reporting, simplifies reporting processes, and prioritizes accountability [64].
A valuable source of guidance for biorisk management is the NHS Hospital Trust study by Tasker et al. [65], which examines how just culture can be implemented in practice. Drawing on their insights and the existing literature, biomedical laboratories can take the following steps:
  • Involve key stakeholders—especially laboratory workers, managers, and leaders—to define what just culture means within the organization.
  • Establish a simple incident reporting system, train a dedicated team to analyze reports, and translate findings into actionable learning outcomes.
  • Share lessons learned widely with all staff to foster a collective learning culture.
  • Build trust by clearly communicating that genuine mistakes can be reported without fear of blame or reprisal.

5. Conclusions

In biorisk management, we are at the earliest stage of adopting a just culture and benefit from a largely blank-slate environment. This calls for careful study, thoughtful planning, and deliberate implementation. Above all, trust—underpinned by visible leadership commitment—is essential. When workers are confident that reporting will not harm them and will collectively enhance safety and work practices, they are more likely to take pride in reporting. Making just culture a legal requirement anchored in the safety system will take time, as these concepts are still emerging. In the meantime, progress can begin with incremental steps at the departmental or organizational level, with successes documented and shared so others can see the value and adopt similar practices.

6. Limitations

The cohort studied is very small, but it is clear that the concept of just culture is almost nonexistent in biomedical laboratories. More research is needed to understand which aspects of just culture will work in biomedical laboratories and the timelines for progressive implementation.

Author Contributions

Conceptualization, data curation, formal analysis, investigation, methodology, project administration, visualization, writing—original draft, and writing—review and editing by V.V.

Funding

This research received no external funding.

Institutional Review Board Statement

The Parkway Independent Ethics Committee has approved this study (reference number PIEC/2024/52).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Wikipedia Contributors. Commercial Aviation. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Commercial_aviation (accessed 2025).
  2. Barnett, A.; Reig Torra, J. Airline safety: Still getting better? Journal of Air Transport Management 2024, 119, 102641. [Google Scholar] [CrossRef]
  3. Clinton, J.; John, S.; Zorn, C. Analyzing aviation safety: Problems, challenges, opportunities. Research in Transportation Economics 2013, 43, 148–164. [Google Scholar] [CrossRef]
  4. International Civil Aviation Organization. State of Global Aviation Safety; International Civil Aviation Organization, Montréal, Canada. 2025. Available online: https://www.icao.int/sites/default/files/sp-files/safety/Documents/ICAO_SR_2025.pdf.
  5. Bieder, C. Safety Management Systems and their Origins: Insights from the Aviation Industry, 1st ed.; CRC Press, 2022. [Google Scholar] [CrossRef]
  6. Cheetham, J.; Ma, Y.; Murphy, M. Are planes crashing more often? BBC Verify. 2025. Available online: https://www.bbc.com/news/articles/c5ym8n4lzp6o.
  7. Enten, H. It feels like there are suddenly way more plane crashes and incidents. Here’s the truth. CNN Business, 2025. Available online: https://edition.cnn.com/2025/02/19/business/airplane-crashes-statistics.
  8. Nanyonga, A.; Joiner, K. F.; Turhan, U.; Wild, G. Deep Learning Approaches for Classifying Aviation Safety Incidents: Evidence from Australian Data. AI 2025, 6(10), 251. [Google Scholar] [CrossRef]
  9. SKYbrary. Safety Occurrence Reporting. 2025. Available online: https://skybrary.aero/articles/safety-occurrence-reporting.
  10. Xing, Y.; Wu, Y.; Zhang, S.; Wang, L.; Cui, H.; Jia, B.; Wang, H. Discovering latent themes in aviation safety reports using text mining and network analytics. International Journal of Transportation Science and Technology 2024, 16, 292–316. [Google Scholar] [CrossRef]
  11. Xiong, M.; Wang, H.; Wong, Y. D.; Hou, Z. Enhancing aviation safety and mitigating accidents: A study on aviation safety hazard identification. Advanced engineering informatics 2024, 62, 102732. [Google Scholar] [CrossRef]
  12. European Union Aviation Safety Agency. Regulations. 2025. Available online: https://www.easa.europa.eu/en/regulations.
  13. FAA. AC 120-92D - Safety Management Systems for Aviation Service Providers; FAA, Ed.; 2024. [Google Scholar]
  14. Kıvanç, E.; Tuzkaya, G.; Vayvay, Ö. Safety management system and risk-based approach in aviation maintenance: A systematic literature review. Safety Science 2025, 184, 106755. [Google Scholar] [CrossRef]
  15. Çoban, R.; Bükeç, C. M. Just Culture in Aviation: A Metaphorical Study on Aircraft Maintenance Students. International Journal of Aviation, Aeronautics, and Aerospace 2024, 11(1). [Google Scholar] [CrossRef]
  16. Dekker, S. Restorative Just Culture: From Disciplinary Action to Meaningful Accountability; CRC Press, 2025. [Google Scholar] [CrossRef]
  17. Pellegrino, F. The Just Culture Principles in Aviation Law: Towards a Safety-Oriented Approach; Springer International Publishing, 2019. [Google Scholar] [CrossRef]
  18. Reason, J. Managing the Risks of Organizational Accidents; Routledge, 1997. [Google Scholar] [CrossRef]
  19. Wikipedia Contributors. Just culture. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/w/index.php?title=Just_culture&oldid=1292104615 (accessed on 23 September 2025).
  20. International Civil Aviation Organization. Safety Management Manual (SMM). In Doc 9859, (ICAO); I. C. A. O., Ed.; 2018. [Google Scholar]
  21. Kovacova, M.; Licu, A.; Balint, J. Just Culture – Eleven Steps Implementation Methodology for organisations in civil aviation – “JC 11”. Transportation Research Procedia 2019, 43, 104–112. [Google Scholar] [CrossRef]
  22. SKYbrary. Just Culture. 2025.
  23. Al-Dmour, H.; AlKhawaldeh, H.; Al-Dmour, A.; Obidat, B.; Al-Dmour, R. The integrated role of Safety Management Systems (SMS) and risk management in achieving aviation sustainability. Discover Sustainability 2025, 6(1), 985. [Google Scholar] [CrossRef]
  24. Mrusek, B.; Miller, M.; Olaganathan, R. Shared Leadership and Just Culture: Tools to Promote SMS Hazard Reporting. In 2020 IEEE Aerospace Conference, 7-14 March 2020, 2020; pp 1-13. [CrossRef]
  25. Ray, A. T.; Bhat, A. P.; White, R. T.; Nguyen, V. M.; Pinon Fischer, O. J.; Mavris, D. N. Examining the Potential of Generative Language Models for Aviation Safety Analysis: Case Study and Insights Using the Aviation Safety Reporting System (ASRS). Aerospace 2023, 10(9), 770. [Google Scholar] [CrossRef]
  26. Vempati, L.; Woods, S.; Solano, R. C. Qualitative Analysis of General Aviation Pilots’ Aviation Safety Reporting System Incident Narratives Using the Human Factors Analysis and Classification System. The International Journal of Aerospace Psychology 2023, 33(3), 182–196. [Google Scholar] [CrossRef]
  27. Markes, A.; Diab, M. Voluntary Incident Reporting in Health Care. The Journal of Bone and Joint Surgery 2025, 107(14), 1651–1656. [Google Scholar] [CrossRef] [PubMed]
  28. Morrow, S.; Koves, G.; Barnes, V. Exploring the relationship between safety culture and safety performance in U.S. nuclear power operations. Safety Science 2014, 69, 37–47. [Google Scholar] [CrossRef]
  29. International Organization for Standardization. ISO Standard No. 35001:2019; Biorisk management for laboratories and other related organisations. 2019.
  30. World Health Organization. Laboratory Biosafety Manual, 4th ed.; 2020. [Google Scholar]
  31. Centers for Disease Control and Prevention. Biosafety in Microbiological and Biomedical Laboratories, 6th ed.; 2020. Available online: https://www.cdc.gov/labs/pdf/SF__19_308133-A_BMBL6_00-BOOK-WEB-final-3.pdf (accessed on 30 April 2025).
  32. Bayot, M. L.; King, K. C. Biohazard Levels; StatPearls Publishing, 2022. [Google Scholar]
  33. Ta, L.; Gosa, L.; Nathanson, D. A. Biosafety and Biohazards: Understanding Biosafety Levels and Meeting Safety Requirements of a Biobank. Methods Mol Biol 2019, 1897, 213–225. [Google Scholar] [CrossRef]
  34. Perkins, D.; Danskin, K.; Rowe, A. E.; Livinski, A. A. The Culture of Biosafety, Biosecurity, and Responsible Conduct in the Life Sciences: A Comprehensive Literature Review. Applied biosafety 2019, 24(1), 34–45. [Google Scholar] [CrossRef]
  35. Ritterson, R.; Kingston, L.; Fleming, A. E. J.; Lauer, E.; Dettmann, R. A.; Casagrande, R. A Call for a National Agency for Biorisk Management. Health Security 2022, 20(2), 187–191. [Google Scholar] [CrossRef]
  36. World Health Organization. Strengthening laboratory biological risk management . 2024. Available online: https://apps.who.int/gb/ebwha/pdf_files/WHA77/A77_R7-en.pdf.
  37. US Department of Labor. Incident Investigation. 2025. Available online: https://www.osha.gov/incident-investigation.
  38. Blacksell, S. D.; Dhawan, S.; Kusumoto, M.; Le, K. K.; Summermatter, K.; O’Keefe, J.; Kozlovac, J. P.; Almuhairi, S. S.; Sendow, I.; Scheel, C. M.; et al. Laboratory-acquired infections and pathogen escapes worldwide between 2000 and 2021: a scoping review. The Lancet Microbe 2024, 5(2), e194–e202. [Google Scholar] [CrossRef]
  39. Zavaleta-Monestel, E.; Rojas-Chinchilla, C.; Anchía-Alfaro, A.; Quesada-Loría, D.; García-Montero, J.; Arguedas-Chacón, S.; Hanley-Vargas, G. Tracking the Threat, 50 Years of Laboratory-Acquired Infections: A Systematic Review. Acta Microbiologica Hellenica 2025, 70(2), 11. [Google Scholar] [CrossRef]
  40. Manheim, D.; Lewis, G. High-risk human-caused pathogen exposure events from 1975-2016. F1000Res 2021, 10, 752. [Google Scholar] [CrossRef]
  41. Vijayan, V. Understanding Work-as-Imagined and Work-as-Done in Biomedical Laboratories. Applied Biosafety 2025. [Google Scholar] [CrossRef]
  42. UK Civil Aviation Authority. Occurrence Reporting for General Aviation. 2023. [Google Scholar]
  43. International Civil Aviation Organization. Annex 13: Aircraft Accident and Incident Investigation; 2016. [Google Scholar]
  44. Wikipedia Contributors. Telemetry. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Telemetry (accessed 2025).
  45. SKYbrary. B763, Delta Air Lines, Amsterdam Schiphol Netherlands, 1998 (Legal Process - Air Traffic Controller) . 1998. Available online: https://skybrary.aero/articles/b763-delta-air-lines-amsterdam-schiphol-netherlands-1998-legal-process-air-traffic?__cf_chl_tk=HilAhBhOjVljbOw9XtyJEZZ7EVMNenYhkLx_WCaYZjo-1759371877-1.0.1.1-tg7FnXsX4Gqgla.i8xe915HLR4j24zFz2H99eiFZ0Fg.
  46. Wikipedia Contributors. Air Navigation Service Provider. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Air_navigation_service_provider (accessed 2025).
  47. Wikipedia Contributors. Crew resource management. Wikipedia, The Free Encyclopedia, 2025. (accessed 2025).
  48. Federal Aviation Administration. Advanced Qualification Program (AQP). US Department of Transportation, 2024. Available online: https://www.faa.gov/training_testing/training/aqp.
  49. Wikipedia Contributors. Flight Simulator. Wikipedia, The Free Encyclopedia, 2025. Available online: https://en.wikipedia.org/wiki/Flight_simulator (accessed on 11 December 2025).
  50. Nemmers, P. The Differences Between Incidents vs. Accidents in the Workplace; National Association of Safety Professionals, 2023. [Google Scholar]
  51. Chamberlain, A. T.; Burnett, L. C.; King, J. P.; Whitney, E. S.; Kaufman, S. G.; Berkelman, R. L. Biosafety Training and Incident-Reporting Practices in the United States: A 2008 Survey of Biosafety Professionals. Applied Biosafety 2009, 14(3), 135–143. [Google Scholar] [CrossRef] [PubMed]
  52. Al-Firm, A. T.; Alshalawi, M.; Almarzouqi, M.; Alhuthil, R.; Qanbar, S.; Alsalmi, L.; Alaklabi, A. Perception of just culture among staff in a research organization. Industrial and Commercial Training 2025, 57(2), 232–241. [Google Scholar] [CrossRef]
  53. Bükeç, C.; Çoban, R. A Qualitative Research on Factors Affecting Just Culture in Airlines. Anadolu Üniversitesi İktisadi ve İdari Bilimler Fakültesi Dergisi 2023, 24, 496–525. [Google Scholar] [CrossRef]
  54. Murray, J. S.; Lee, J.; Larson, S.; Range, A.; Scott, D.; Clifford, J. Requirements for implementing a ‘just culture’ within healthcare organisations: an integrative review. BMJ Open Qual 2023, 12(2). [Google Scholar] [CrossRef]
  55. Almansour, H. Barriers Preventing the Reporting of Incidents and Near Misses Among Healthcare Professionals. Journal of Health Management 2024, 26(1), 78–84. [Google Scholar] [CrossRef]
  56. Moshiri, E.; Abbaszadeh, A.; Shahcheragh, S. H. Prioritizing just culture: A call to action for patient safety. Nursing practice today 2025, 12(2). [Google Scholar] [CrossRef]
  57. Engeda, E. H. Incident Reporting Behaviours and Associated Factors among Nurses Working in Gondar University Comprehensive Specialized Hospital, Northwest Ethiopia. Scientifica 2016, 2016(1), 6748301. [Google Scholar] [CrossRef]
  58. Oweidat, I.; Al-Mugheed, K.; Alsenany, S. A.; Abdelaliem, S. M. F.; Alzoubi, M. M. Awareness of reporting practices and barriers to incident reporting among nurses. BMC Nursing 2023, 22(1), 231. [Google Scholar] [CrossRef]
  59. Aziida, N.; Joiner, K. F.; Turhan, U.; Wild, G. Deep Learning Approaches for Classifying Aviation Safety Incidents: Evidence from Australian Data. AI 2025, 6(10), 251, Publicly Available Content Database. [Google Scholar] [CrossRef]
  60. Zierman, R. Identifying Aircraft Damage Mitigating Factors with Explainable Artificial Intelligence (XAI): An Evidence-Based Approach to Rule-Making for Pilot Training Schools. Journal of aviation/aerospace education and research 2024, 33(4). [Google Scholar] [CrossRef]
  61. de Kam, D.; Kok, J.; Grit, K.; Leistikow, I.; Vlemminx, M.; Bal, R. How incident reporting systems can stimulate social and participative learning: A mixed-methods study. Health Policy 2020, 124(8), 834–841. [Google Scholar] [CrossRef] [PubMed]
  62. Snyder, B. C.; Wentzel, J. M.; Epstein, G. L.; Kadlec, R. P.; Parker, G. W. Trust, but Verify: A “Just Culture” Model for Oversight of Potentially High-Risk Life Sciences Research. Applied biosafety 2025, 30(2), 17–111. [Google Scholar] [CrossRef]
  63. van Baarle, E.; Widdershoven, G.; Molewijk, B. Just culture as dialogical learning: theoretical foundations and practical implications of restorative justice. Journal of Medical Ethics 2025, jme-2025-110761. [Google Scholar] [CrossRef]
  64. Miaoulis, G.; Manev, I. M. Personal and Organizational Responsibility in the Delivery of Healthcare Services: Breaking the Code of Silence. Health Services Insights 2025, 18, 11786329251356095. [Google Scholar] [CrossRef]
  65. Tasker, A.; Jones, J.; Brake, S. How effectively has a Just Culture been adopted? A qualitative study to analyse the attitudes and behaviours of clinicians and managers to clinical incident management within an NHS Hospital Trust and identify enablers and barriers to achieving a Just Culture. BMJ Open Qual 2023, 12(1). [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.