- Does Switching from Protease Inhibitors to Dolutegravir Reduce the HIV Reservoir or Host Immune Activation?
- AmpC β-lactamases: What the ID Clinician Needs to Know in 2019
- Antimicrobial Stewardship and Sepsis: A Great Debate
For a review of other recent research in the infectious diseases literature, see “In the Literature,” by Stanley Deresinski, MD, FIDSA, in each issue of Clinical Infectious Diseases.
Reviewed by Dr. Jennifer Brown
Switching patients with HIV infection from protease inhibitor (PI)-based antiretroviral regimens to integrase strand transfer inhibitor (INSTI)-based regimens can reduce the metabolic effects and drug interactions associated with PIs. Yet it is unclear whether switching to an INSTI can decrease the HIV reservoir or immune activation. To address these questions, Morón-López et al. conducted a randomized trial, reported in Clinical Infectious Diseases, in which HIV reservoirs and markers of activation in blood and ileum biopsies were evaluated longitudinally over 24 weeks in patients who either switched from a PI-based regimen to a dolutegravir (DTG)-based regimen or maintained their PI-based regimen.
Overall, 42 patients completed the study; 20 were switched to a DTG-based regimen and 22 continued their PI-based regimen. A subset of 33 patients underwent ileum biopsies. For HIV reservoir dynamics, quantification of 2-LTR circles in peripheral CD4+ T cells and ultrasensitive plasma viral load indicated that low-level viral replication continued regardless of whether patients switched to DTG. Significant differences between the groups were not observed for total HIV DNA in ileal CD4+ T cells or for total and integrated HIV DNA and unspliced cell-associated HIV RNA in peripheral CD4+ T cells. Likewise, no significant between-group differences were seen in markers of activation (percentage of CD38+ HLA-DR+ cells among CD4+ and CD8+ T cells in ileum biopsies and peripheral blood), in plasma inflammatory markers (interleukin 6, C-reactive protein, TNF-related apoptosis-inducing ligand, and interferon gamma-induced protein 10), or in the frequency of peripheral blood CD4+ and CD8+ T cells, CD4/CD8 ratios, or maturation subsets.
The findings of this study suggest that the HIV reservoir and host immune activation in the blood and ileum are not affected by switching patients from PI-based to INSTI-based antiretroviral regimens; they may inform clinical decision-making and point to directions for further research.
Reviewed by Dr. A. Krishna Rao, MS
β-lactam antibiotics are central to our treatment of many infections, especially Gram-negative infections caused by Enterobacteriaceae. However, resistance to β-lactam antibiotics through production of β-lactamases is a growing threat. Broadly, β-lactamase-mediated resistance falls into three types: AmpC enzymes, extended-spectrum β-lactamases (ESBLs), and carbapenemases. A recent review in Clinical Infectious Diseases updates us on what is known about the inducible, chromosomally encoded AmpC β-lactamases.
AmpC induction starts with a β-lactam antibiotic increasing cell-wall degradation products in the bacterial cytoplasm. This results in inhibition of AmpR, itself an inhibitor of AmpC production, thus de-repressing AmpC and consequently increasing its production. Although this comes at a fitness cost to the bacteria, AmpC hyperproduction will be maintained if the β-lactam exposure continues, which can select for resistant subpopulations during an infection. Many organisms have inducible AmpC production, most commonly Enterobacter cloacae, Klebsiella aerogenes, Citrobacter freundii, Serratia marcescens, Providencia stuartii, Pseudomonas aeruginosa, Hafnia alvei, and Morganella morganii, often referred to collectively as the ESCPM, SPACE, or SPICE organisms.
It is important to distinguish inducible AmpC-mediated resistance from ESBL production, as the latter may require carbapenem therapy. Phenotypic assays that detect AmpC cannot distinguish AmpC induction from other resistance mechanisms, and while molecular tests can, they are currently limited to research use. A clinician can often differentiate AmpC- from ESBL-mediated resistance by carefully examining the susceptibility results. (Table 2 in the review provides more insight on susceptibility patterns.)
Understanding which antibiotics induce AmpC production and which resist hydrolysis provides a starting point for antibiotic selection. The aminopenicillins, amoxicillin/clavulanate, and first- or second-generation cephalosporins are potent inducers and easily hydrolyzed, so they should be avoided. Third-generation cephalosporins, piperacillin-tazobactam, and aztreonam are only weak inducers but can still be hydrolyzed; isolates often test susceptible to these agents, but attention must be paid to the minimum inhibitory concentrations (MICs) during in vitro testing (and third-generation cephalosporins are best avoided regardless). Cefepime is both a weak inducer and resistant to hydrolysis, and thus usually a good treatment choice. Finally, carbapenems such as imipenem are strong inducers, but they are not hydrolyzed and generally retain potent activity against AmpC producers.
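The two-axis reasoning above (induction potential versus susceptibility to hydrolysis) can be summarized schematically. The following sketch is illustrative only, not clinical guidance; the agent groupings and conclusions are taken directly from the review's summary as described above, and the function and dictionary names are invented for this example:

```python
# Schematic summary of the review's inducer/hydrolysis matrix for AmpC producers.
# Illustrative only -- not clinical guidance; groupings follow the text above.
AGENT_PROFILES = {
    # agent group: (AmpC induction potential, readily hydrolyzed by AmpC?)
    "aminopenicillins":           ("potent", True),
    "amoxicillin/clavulanate":    ("potent", True),
    "1st/2nd-gen cephalosporins": ("potent", True),
    "3rd-gen cephalosporins":     ("weak",   True),
    "piperacillin-tazobactam":    ("weak",   True),
    "aztreonam":                  ("weak",   True),
    "cefepime":                   ("weak",   False),
    "carbapenems":                ("strong", False),
}

def ampc_risk_note(agent: str) -> str:
    """Map an agent group to the review's general take against AmpC producers."""
    induction, hydrolyzed = AGENT_PROFILES[agent]
    if hydrolyzed and induction == "potent":
        return "avoid: potent inducer and easily hydrolyzed"
    if hydrolyzed:
        return "caution: weak inducer but hydrolyzable; check MICs"
    # Not hydrolyzed: generally active whether induction is weak or strong.
    return "generally active: not hydrolyzed"

print(ampc_risk_note("cefepime"))
print(ampc_risk_note("3rd-gen cephalosporins"))
```

Note that hydrolysis susceptibility, not induction strength, drives the bottom line: carbapenems are strong inducers yet remain active because they resist hydrolysis.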
These observations are supported by both in vitro and clinical data; however, the clinical studies are observational rather than interventional, which limits their conclusions. While carbapenems and newer β-lactamase inhibitors (e.g., avibactam and vaborbactam) are very potent, avoiding their overuse is wise, lest resistance emerge. Whether to use piperacillin-tazobactam rather than cefepime remains controversial, and there is insufficient evidence to make a recommendation either way. Notably, non-β-lactam antibiotics such as fluoroquinolones and trimethoprim-sulfamethoxazole often retain activity and are appropriate either as step-down therapy or even as first-line treatment for mild or moderate infections.
So how is an ID physician to navigate this complex landscape? The authors conclude with this advice: Enterobacter species are the most likely to have inducible AmpC production, so avoid treating them with third-generation cephalosporins; laboratories should consider suppressing these results for non-urinary (i.e., systemic) isolates, as the susceptibility testing results can be misleading. Interventional studies including diverse species are lacking, but ongoing research may provide insights. Inducible AmpC production by the other SPACE organisms is less prominent, so for now treatment decisions for these organisms can be guided by in vitro susceptibility results, coupled with optimal dosing that accounts for elevated MICs, source control, and close monitoring.
Reviewed by Dr. Kelly Cawcutt, MS
Optimal management of sepsis has long been a holy grail in medicine. One area that remains fraught with debate is how to balance the need for emergent antimicrobial administration with the principles of antimicrobial stewardship. A recent point-counterpoint series published in CHEST explores whether broad-spectrum antibiotics should be routinely administered to all sepsis patients as soon as possible. Disselkamp et al. argue yes: early administration of broad-spectrum antibiotics increases the likelihood of adequate coverage and is associated with decreased mortality. This does not negate the need for a commitment to stewardship, but, the piece asks, “If we do not use antibiotics for patients with life-threatening organ dysfunction, who are we saving them for?”
In the counterpoint piece, Patel and Bergl argue that although delayed antibiotics may increase mortality in sepsis, the strong recommendation for empiric broad-antimicrobial therapy is inappropriate due to a paucity of high-quality evidence and risk of harm (such as adverse drug events and Clostridioides difficile infection) related to “indiscriminate broad-spectrum antibiotics.” They further argue that appropriate antibiotic therapy may not always be broad-spectrum, and that the assumption that broad-spectrum therapy is required is both flawed and potentially costly to patients and health care systems.
This debate, however, may not be the most pressing issue for our sepsis patients. In a recently published article in Critical Care Medicine, Kashiouris et al. demonstrate that delays in first antimicrobial execution (the time from order to administration) are common (affecting 50 percent of patients) and carry increased mortality (adjusted odds ratio 1.28 [95 percent confidence interval, 1.07–1.54; P = 0.007] for a delay of 1–2 hours), particularly for patients with many comorbidities. Regardless of which antibiotics were ordered, if they were not administered within the first hour, mortality continued to rise. These findings raise important questions: Do we have time to determine optimal antimicrobial therapy in sepsis when we frequently fail to administer it fast enough? And how do we weigh the greater good: immediate mortality or potential downstream morbidity and mortality from adverse events?