1. Introduction
For critically ill patients in the intensive care unit with severe bacterial infections, the rapid and targeted initiation of adequate antibiotic therapy is of utmost importance for the patient's outcome. However, antibiotic resistance has an increasingly important influence on the effectiveness of therapy and the chances of clinical recovery. The current worldwide increase in antibiotic-resistant bacteria has therefore been defined by the World Health Organization (WHO) as one of the 10 greatest global health threats to humanity [1,2,3], which requires decisive action through differentiated measures.
Antibiotic resistance is mainly caused by overuse or misuse of antibiotics. The choice of substance plays the major role here, but dosage aspects and the resulting low antibiotic levels in the blood or the target organ system are also relevant, as these can lead to the selection of resistant subpopulations of pathogens under antibiotic therapy. The choice of substance as well as the appropriate dosage of antibiotics is therefore important not only for patient outcome, but also for the development of resistance itself. Structured optimization measures against antibiotic resistance are therefore of utmost medical importance. In addition to pharmacological developments, the rational and responsible use of antibiotics is particularly important. This preventive approach to avoiding resistance is subsumed under the term Antibiotic Stewardship (ABS), also called antimicrobial stewardship (e.g. [4]).
The aim of ABS is to treat patients in the best possible way and at the same time prevent selection processes and resistance from occurring in the bacteria. To this end, it would be desirable to regularly evaluate and optimize the rational use of antimicrobial substances with an integrative approach that combines routine intensive care and microbiological data from the clinic with mathematical and methodological analyses: a clinical decision support system. To establish such a system, however, the hurdles arising from the complexity of the clinical situation as briefly outlined below must be overcome.
A successful ABS begins with the clear identification of the infection. In addition to the actual microbiological identification of the pathogen, ABS includes in particular the selection of the appropriate antibiotic, including pharmacokinetic and pharmacodynamic aspects. Clinically, the efficacy of a substance is determined by measuring the minimum inhibitory concentration (MIC). The MIC is the lowest concentration of an antibiotic that still prevents the replication of a pathogen in a culture. Clinical threshold values defined for the respective pathogens (in Europe by EUCAST, the European Committee on Antimicrobial Susceptibility Testing) then identify a pathogen as resistant (R) or sensitive (S) to the respective antibiotic tested and, in the form of the so-called antibiogram, support clinicians in selecting the correct substance [5]. However, the active substance concentration achieved in the target compartment is also of decisive importance [6,7].
Attempts to ensure sufficiently high antibiotic levels and thus a sufficient dosage have been made, for example, by determining the blood or tissue levels of a substance as part of therapeutic drug monitoring (TDM) and by any resulting dose adjustments [8,9]. A third antibiogram category "I" (for "increased exposure") also addresses the requirement for sufficient antibiotic levels; clinically, it means that the substance has been tested as sensitive but requires an increased dosage. With regard to the overall cohort, the local resistance situation in the form of ward- and/or hospital-specific resistance statistics also plays a relevant role.
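The S/I/R logic of the antibiogram amounts to comparing a measured MIC against two clinical breakpoints. The following minimal sketch illustrates this; the breakpoint values and the function name are hypothetical and do not correspond to real EUCAST breakpoints for any drug-pathogen pair.

```python
# Minimal sketch of antibiogram-style S/I/R classification from a measured MIC.
# The breakpoint values used below are purely illustrative, NOT real EUCAST data.

def classify_mic(mic_mg_per_l: float, s_breakpoint: float, r_breakpoint: float) -> str:
    """Classify an isolate as S (susceptible), I (susceptible at increased
    exposure), or R (resistant), given two hypothetical clinical breakpoints."""
    if mic_mg_per_l <= s_breakpoint:
        return "S"
    if mic_mg_per_l <= r_breakpoint:
        return "I"  # effective only with an increased dosage
    return "R"

# Illustrative breakpoints (mg/L) for a fictitious drug-pathogen pair
print(classify_mic(0.5, s_breakpoint=1.0, r_breakpoint=8.0))   # S
print(classify_mic(4.0, s_breakpoint=1.0, r_breakpoint=8.0))   # I
print(classify_mic(16.0, s_breakpoint=1.0, r_breakpoint=8.0))  # R
```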
As mentioned, the clinical requirements of ABS also include careful and appropriate microbiological diagnostics. A comprehensive prevalence and mortality study recently addressed which pathogens are the most problematic [3]. In summary, in 2019, approximately 14% of all deaths were due to a bacterial infection, based on the 33 most important pathogens. Furthermore, 56% of sepsis-associated deaths resulted from these 33 infections. It is also noteworthy that 55% of deaths from the 33 bacterial species were due to infections with only a small subset of these species.
Previously, the six pathogens of the so-called ESKAPE group (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species) were discussed as particularly critical, with the highest clinical relevance [2]. It almost goes without saying that successful ABS must keep an eye on the prevalence of these pathogens, which are classified as particularly dangerous, especially with regard to the development of resistance.
In a recent article, we presented first steps toward harnessing the complex dynamics of antibiotic resistance [10]. Those analyses were based on the clinical and microbiological data of a German hospital over an observation period of more than 7 years, which we evaluated descriptively and semi-quantitatively in order to obtain a basis for informed and intelligent action in terms of antibiotic stewardship. The main focus was on the particularly dangerous pathogens mentioned above. The aim of the present work is to extend the results of that study and deepen the insights by analyzing the same data set with a focus on the heterogeneity of antibiotic use.
So far, we have observed an increase in the resistance rate with increasing overall consumption, while increases over time independent of consumption are fairly moderate. Vancomycin and cefuroxime emerged as exceptions in the development of resistance, as resistance to these substances appears to decrease with increasing consumption. However, there have been substantial dose adjustments for these substances, which are likely to be decisive here. An intra-host increase in resistance was observed, driven by treatment duration on the one hand and by repeated treatments on the other.
Within the sub-cohort of patients treated ineffectively due to resistance, mortality increased on average, with ampicillin/sulbactam being a striking exception: patients with infections caused by ampicillin-resistant bacteria turned out to have a lower mortality rate. Globally, 5 million infection-related deaths per year are attributed to antimicrobial resistance (cf. [11]). However, this does not rule out the possibility of sporadic resistant germ variants that are less deadly than the susceptible variants. An interaction with a presumably greatly reduced microbiome is more likely. Further confounders can only be speculated upon.
The observed resistance rates of the eight most frequently administered antibiotics showed a temporal variability that includes random fluctuations as well as decidedly regular cycles. It has been argued that, in terms of evolutionary dynamics, it seems plausible that the proportion of resistant pathogen variants in a population follows a logistic growth process toward a carrying capacity, i.e., a stable equilibrium [11]. The fluctuations we observed do not support this presumably oversimplified evolutionary dynamic, which can at best be maintained with constant or even increasing consumption of the corresponding antibiotic. Rather, it seems likely that with declining antibiotic consumption, the susceptible pathogen variant can gain an evolutionary advantage. In other words, the development of resistance is not a strictly irreversible evolutionary process. The concept of equilibrium in the sense of an asymptotically stable fixed point of one-dimensional logistic dynamics does not reflect the coupled dynamics of antibiotic consumption and evolutionary resistance development. The assumption tested by the authors of [11] that the carrying capacity correlates with consumption, i.e., that the environmental capacity is a function of consumption, is not particularly meaningful.
However, we have not yet considered individual pathogen-antibiotic pairs (referred to in [11] as bug-drug pairs) in our evaluation, which we make up for in this publication. Specifically, we present a simple model that couples the "idle" dynamics in the form of a logistic growth process with differential antibiotic consumption. We consider this type of coupling to be more appropriate than a direct dependence of a presumed carrying capacity on consumption. In this regard, it is worth noting that the proportion of resistance to piperacillin/tazobactam across all pathogens shows remarkable constancy over time [10], suggesting that the (total) trajectory in this special case has indeed stabilized at a carrying capacity that has already been reached. This is not a contradiction, given the constant consumption of this substance over the entire observation period.
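The kind of coupling described above can be illustrated with a minimal numerical sketch. The functional form, parameters (alpha, c_neutral, k), and consumption series below are illustrative assumptions for exposition, not the model fitted in this study.

```python
# Illustrative sketch (not the fitted model of this study): logistic "idle"
# dynamics of the resistant fraction r(t), coupled to antibiotic consumption
# c(t) via the net growth rate. High consumption favors the resistant variant;
# low consumption lets the susceptible variant regain ground, so the process
# is reversible rather than a one-way march to the carrying capacity.

def simulate(r0, consumption, k=0.9, alpha=2.0, c_neutral=0.5, dt=0.1):
    """Euler integration of dr/dt = alpha*(c - c_neutral) * r * (1 - r/k)."""
    r = r0
    trajectory = [r]
    for c in consumption:
        growth = alpha * (c - c_neutral)      # sign depends on consumption level
        r += dt * growth * r * (1.0 - r / k)  # logistic saturation at capacity k
        r = min(max(r, 0.0), 1.0)             # keep the fraction within [0, 1]
        trajectory.append(r)
    return trajectory

up = simulate(0.05, [0.9] * 200)    # sustained high consumption: r rises toward k
down = simulate(0.60, [0.1] * 200)  # sustained low consumption: r recedes again
```

Under constant high consumption the resistant fraction approaches the carrying capacity, while under constant low consumption it declines, which is consistent with the observation that resistance development need not be irreversible.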
As also shown previously (cf. [10]), the time series associated with the various antibiotics showed pairwise time-lag correlations, which indicates the existence of cross-resistance mediated with a time delay. In particular, we refer back to these observations of complex time-delayed interactions in order to take a closer look at these relationships, specifically with regard to possible resonances between dynamic consumption patterns and the incidence of resistance.
In this context, it is worth recalling the idea of cyclical variation in antibiotic use to suppress resistance without dispensing with antibiotic treatment and without the pressure to constantly develop new active agents. We therefore revisit the most important basic ideas and existing work already described in our recently published papers to outline the background [10,12]. Cyclical allocation of antibiotics and so-called mixing (other terms are in circulation) are hospital strategies that aim to reduce the prevalence of resistant germs by means of spatial (between departments; mixing) or cyclical (temporal) variations in the consumption shares of different antibiotic groups. According to a systematic review [13], there is only one randomized controlled trial (RCT) on the effectiveness of such strategies. However, some studies have at least compared systematic cyclical administration strategies with standard administration between clinics or carried out before-and-after comparisons (cross-over) [13,14]. In the RCT, cycling performed worse than the control ABS [13].
Notably, in numerous studies, "mixing" was chosen as the control strategy, i.e. an alternating exchange between departments. Cycling, i.e. a temporally periodic change simultaneously across all departments, showed no difference to the mixing procedure, which is not surprising from a theoretical point of view if one assumes sufficiently isolated conditions between the departments. Nevertheless, all studies (RCT and cohort studies), including the cross-over studies, were considered together in the meta-analyses, regardless of whether they tested against "mixing" or "no strategy", which must be viewed very critically and neglects the important separate consideration of mixing and cycling. Of note, in the meta-analysis by [14], a distinction was made between the two control strategies as part of a secondary analysis. In Gram-positive bacteria, the cycling strategy showed a slightly stronger effect in terms of avoiding resistance. It appears that there is no general evaluation independent of other biological and medical boundary conditions. It is therefore possible that the cycling concept itself is not well thought out. The question of whether cycling or mixing contributes to the "rational and responsible use of antibiotics" has therefore not been conclusively clarified.
Cycling according to a fixed scheme (scheduled cycling) means that the informed, i.e. rational, use of antibiotics is deliberately avoided, so that conceptually a control rather than an intervention strategy is defined here. In this context, it is worth recalling that in a seminal cycling study [15], the authors investigated interrupted time series in the administration of aminoglycosides. These irregular antibiotic prescriptions followed an observed pattern in the emergence of resistant germs rather than a fixed periodic cycling scheme. In retrospect, it appears that the authors intuitively used the method now referred to as "clinical cycling", which is actually superior to scheduled cycling because it is based on clinical evidence for the necessity of adapting antibiotic consumption. Unsurprisingly, scheduled cycling is explicitly not recommended in a German S3 guideline [16], whose validity, incidentally, expired on 31 January 2024.
Despite the aforementioned counterarguments, the poor performance of "scheduled cycling" does not speak against cycling per se, but rather against cycling that is not carried out intelligently. One could even say that, conceptually, scheduled cycling corresponds more to a "placebo-like" (control) ABS, whereas clinical cycling based on clinical expertise corresponds to the intervention group. In the context of this interpretation, the many studies on cycling are in fact rather evaluation procedures for assessing whether the respective "clinical cycling" is viable on the basis of implicit clinical knowledge in comparison to a non-informed cycling strategy. In other words, despite the clinically contra-indicated settings of scheduled cycling programs, these strategies represent a kind of basic model structure whose quantitative description is the basis for describing more complex switching strategies (clinical cycling).
In our own preliminary work [12], we created and published a mathematical framework that is suitable for adequately quantifying the effect of clinical cycling and, in borderline cases, of scheduled cycling. Subsequent correlation analyses revealed a relationship between the heterogeneity of antibiotic consumption and the prevalence of resistant pathogens, indicating that increased heterogeneity reduces the prevalence of resistant germs. The heterogeneity changes on the pathogen side follow the changes on the consumption side. It is worth mentioning that most of the older studies rarely quantified the degree of mixing or cyclic variation. An exception is the study by [17], in which an antibiotic heterogeneity index (AHI) was used. This work was followed by only a few publications in which the AHI or a similar quantification of heterogeneity was discussed (e.g. [18]), and then only in passing. Of note, the AHI is invariant to swapping antibiotic classes: if the consumption shares of two antibiotics are swapped, the AHI remains unchanged. This has to be kept in mind when discussing cycling strategies based on the AHI. The view expressed in [19] that the focus in ABS has shifted from strategies of cyclical prescribing to a focus on heterogeneity sounds as if these were disjoint strategies. However, when correctly quantified, variations in the administration of antibiotics, including cyclical prescribing, result in a corresponding change in heterogeneity.
Measures of heterogeneity or, more generally, of diversity are frequently used in ecological studies to calculate "mixing", i.e. the degree of heterogeneity. These measures are related to entropies, which originated in statistical physics for the quantitative description of mixing processes (dispersions) and similar phenomena. The Shannon entropy known from information theory is, like the AHI, a global entropy, i.e. invariant to permutations of the species. Local heterogeneity measures should be used in order to record changes over time. The Kullback-Leibler entropy is such a local measure and has proven its worth in our preliminary work in the context of ABS as well as of spatio-temporal epidemic patterns [12,20]. However, the concrete choice of the final form of the analysis algorithms depends on specific conditions.
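The global-versus-local distinction can be illustrated with a small numerical sketch. The consumption shares below are invented for illustration: swapping the shares of two antibiotic classes leaves the (global, permutation-invariant) Shannon entropy unchanged, whereas the Kullback-Leibler divergence relative to the previous distribution registers the change.

```python
import math

def shannon(p):
    """Shannon entropy H(p) = -sum p_i log p_i: a global measure,
    invariant to permutations of the classes."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kullback_leibler(p, q):
    """KL divergence D(p||q): a 'local' measure that compares a
    distribution to a reference and thus detects reallocations."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

# Invented consumption shares of three antibiotic classes at two time points;
# the first two classes are swapped, as in one step of a cycling scheme.
before = [0.6, 0.3, 0.1]
after = [0.3, 0.6, 0.1]

swap_invisible_globally = abs(shannon(before) - shannon(after)) < 1e-12
swap_detected_locally = kullback_leibler(after, before) > 0
print(swap_invisible_globally, swap_detected_locally)
```

The same invariance applies to the AHI mentioned above, which is why a global index alone cannot distinguish a static consumption mix from an actively cycled one.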
Theoretically, it would be conceivable to extrapolate the observed mixing states of antibiotic consumption and predict the incidence of resistance. Alternatively, it appears appealing to construct possible interaction scenarios in such a way that an "optimal cycling regime", defined by the best possible reduction of resistance, is achieved. However, such a project is a real challenge due to the necessary constraints, such as those imposed by biological and clinical conditions and regulations. Our preliminary work represents an important step in this direction, also taking into account clinical constraints (antibiogram, pharmacokinetics and microbiological parameters, guidelines) [8,9,10,12,20]. In this publication, an exploratory observational study, we explore the relationship between antibiotic consumption heterogeneity and the emergence of resistance for the intensive care unit of a German university hospital in Bochum.