SARS-CoV-2 and COVID-19: An Evolving Review of Diagnostics and Therapeutics

This manuscript (permalink) was automatically generated from greenelab/covid19-review@26d9bf1 on July 30, 2023. It is also available as a PDF. Snapshots of individual sections have been published (16) or posted as preprints (7).

Authors

COVID-19 Review Consortium: Vikas Bansal, John P. Barton, Simina M. Boca, Joel D Boerckel, Christian Brueffer, James Brian Byrd, Stephen Capone, Shikta Das, Anna Ada Dattoli, John J. Dziak, Jeffrey M. Field, Soumita Ghosh, Anthony Gitter, Rishi Raj Goel, Casey S. Greene, Marouen Ben Guebila, Daniel S. Himmelstein, Fengling Hu, Nafisa M. Jadavji, Jeremy P. Kamil, Sergey Knyazev, Likhitha Kolla, Alexandra J. Lee, Ronan Lordan, Tiago Lubiana, Temitayo Lukan, Adam L. MacLean, David Mai, Serghei Mangul, David Manheim, Lucy D'Agostino McGowan, Jesse G. Meyer, Ariel I. Mundo, Amruta Naik, YoSon Park, Dimitri Perrin, Yanjun Qi, Diane N. Rafizadeh, Bharath Ramsundar, Halie M. Rando, Sandipan Ray, Michael P. Robson, Vincent Rubinetti, Elizabeth Sell, Lamonica Shinholster, Ashwin N. Skelly, Yuchen Sun, Yusha Sun, Gregory L Szeto, Ryan Velazquez, Jinhui Wang, Nils Wellhausen

Authors are ordered arbitrarily.

1 Pathogenesis, Symptomatology, and Transmission of SARS-CoV-2 through Analysis of Viral Genomics and Structure

1.1 Abstract

The novel coronavirus SARS-CoV-2, which emerged in late 2019, has since spread around the world and infected hundreds of millions of people with coronavirus disease 2019 (COVID-19). While this viral species was unknown prior to January 2020, its similarity to other coronaviruses that infect humans has allowed for rapid insight into the mechanisms that it uses to infect human hosts, as well as the ways in which the human immune system can respond. Here, we contextualize SARS-CoV-2 among other coronaviruses and identify what is known and what can be inferred about its behavior once inside a human host. Because the genomic content of coronaviruses, which specifies the virus’s structure, is highly conserved, early genomic analysis provided a significant head start in predicting viral pathogenesis and in understanding potential differences among variants. The pathogenesis of the virus offers insights into symptomatology, transmission, and individual susceptibility. Additionally, prior research into interactions between the human immune system and coronaviruses has identified how these viruses can evade the immune system’s protective mechanisms. We also explore systems-level research into the regulatory and proteomic effects of SARS-CoV-2 infection and the immune response. Understanding the structure and behavior of the virus serves to contextualize the many facets of the COVID-19 pandemic and can influence efforts to control the virus and treat the disease.

1.2 Importance

COVID-19 involves a number of organ systems and can present with a wide range of symptoms. From how the virus infects cells to how it spreads between people, the available research suggests that these patterns are very similar to those seen in the closely related viruses SARS-CoV-1 and possibly MERS-CoV. Understanding the pathogenesis of the SARS-CoV-2 virus also contextualizes how the different biological systems affected by COVID-19 connect. Exploring the structure, phylogeny, and pathogenesis of the virus therefore helps to guide interpretation of the broader impacts of the virus on the human body and on human populations. For this reason, an in-depth exploration of viral mechanisms is critical to a robust understanding of SARS-CoV-2 and, potentially, future emergent HCoV.

1.3 Introduction

The current coronavirus disease 2019 (COVID-19) pandemic, caused by the Severe acute respiratory syndrome-related coronavirus 2 (SARS-CoV-2) virus, represents an acute global health crisis. Symptoms of the disease can range from mild to severe or fatal (8) and can affect a variety of organs and systems (9). Outcomes of infection can include acute respiratory distress syndrome (ARDS) and acute lung injury, as well as damage to other organ systems (9, 10). Understanding the progression of the disease, including these diverse symptoms, depends on understanding how the virus interacts with the host. Additionally, the fundamental biology of the virus can provide insights into how it is transmitted among people, which can, in turn, inform efforts to control its spread. As a result, a thorough understanding of the pathogenesis of SARS-CoV-2 is a critical foundation on which to build an understanding of COVID-19 and the pandemic as a whole.

The rapid identification and release of the genomic sequence of the virus in January 2020 (11) provided early insight into the virus in a comparative genomic context. The viral genomic sequence clusters with known coronaviruses (order Nidovirales, family Coronaviridae, subfamily Orthocoronavirinae). Phylogenetic analysis of the coronaviruses reveals four major subclades, each corresponding to a genus: the alpha, beta, gamma, and delta coronaviruses. Among them, alpha and beta coronaviruses infect mammalian species, gamma coronaviruses infect avian species, and delta coronaviruses infect both mammalian and avian species (12). The novel virus now known as SARS-CoV-2 was identified as a beta coronavirus belonging to the B lineage based on phylogenetic analysis of a polymerase chain reaction (PCR) amplicon fragment from five patients along with the full genomic sequence (13). This lineage also includes the Severe acute respiratory syndrome-related coronavirus (SARS-CoV-1) that caused the 2002-2003 outbreak of Severe Acute Respiratory Syndrome (SARS) in humans (13). (Note that these subclades are not to be confused with variants of concern within SARS-CoV-2 labeled with Greek letters; i.e., the Delta variant of SARS-CoV-2 is still a beta coronavirus.)
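
As an illustration of the comparative genomic reasoning described above, the following minimal Python sketch (using Biopython and a hypothetical, pre-aligned FASTA file, not the analysis performed in the cited studies) places a novel genome among reference coronaviruses with a simple neighbor-joining tree.

```python
# Minimal sketch, assuming a locally prepared alignment: place a novel
# coronavirus genome among alpha/beta/gamma/delta references using a
# neighbor-joining tree. The file name and alignment step (e.g., MAFFT) are assumptions.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("coronavirus_refs_aligned.fasta", "fasta")

distance_matrix = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(distance_matrix)  # neighbor joining

Phylo.draw_ascii(tree)  # a SARS-CoV-2-like genome should cluster with lineage B beta coronaviruses
```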

Because viral structure and mechanisms of pathogenicity are highly conserved within the order, this phylogenetic analysis provided a basis for forming hypotheses about how the virus interacts with hosts, including which tissues, organs, and systems would be most susceptible to SARS-CoV-2 infection. Coronaviruses that infect humans (HCoV) are not common, but prior research into other HCoV such as SARS-CoV-1 and Middle East respiratory syndrome-related coronavirus (MERS-CoV), as well as other viruses infecting humans such as a variety of influenza species, established a strong foundation that accelerated the pace of SARS-CoV-2 research.

Coronaviruses are large viruses that can be identified by their distinctive “crown-like” shape (Figure 1). Their spherical virions, which range from 100 to 160 nanometers in diameter, are formed from lipid envelopes in which peplomers (protruding structures) of two to three spike (S) glycoproteins are anchored, creating the crown (14, 15). These spikes, which are critical both to viral pathogenesis and to the host immune response, have been visualized using cryo-electron microscopy (16). Because they induce the human immune response, they are also the target of many proposed therapeutic agents (2, 3). Viral pathogenesis is typically broken down into three major components: entry, replication, and spread (17). However, in order to draw a more complete picture of pathogenesis, it is also necessary to examine how infection manifests clinically, identify systems-level interactions between the virus and the human body, and consider the possible effects of variation or evolutionary change on pathogenesis and virulence. Thus, clinical medicine and traditional biology are both important pieces of the puzzle of SARS-CoV-2 presentation and pathogenesis.

1.4 Coronavirus Structure and Pathogenesis

1.4.1 Structure of Coronaviruses

Genome structure is highly conserved among coronaviruses, meaning that the relationship between the SARS-CoV-2 genome and its pathogenesis can be inferred from prior research in related viral species. The genomes of viruses in the Nidovirales order share several fundamental characteristics. They are non-segmented, which means the viral genome is a single continuous strand of RNA, and are enveloped, which means that the genome and capsid are encased by a lipid bilayer. Coronaviruses have large positive-sense RNA (ssRNA+) genomes ranging from 27 to 32 kilobases in length (18, 19). The SARS-CoV-2 genome lies in the middle of this range at 29,903 nucleotides (19). Genome organization is highly conserved within the order (18). There are three major genomic regions: one containing the replicase gene, one containing the genes encoding structural proteins, and interspersed accessory genes (18) (Figure 1). The replicase gene comprises about two-thirds of the genome and consists of two open reading frames that are translated with ribosomal frameshifting to produce two large polyproteins (18). These polyproteins are then cleaved into 16 non-structural proteins (nsp), except in gammacoronaviruses, where nsp1 is absent; the nsp form the replication machinery used to synthesize viral RNA (20). The remaining third of the genome encodes structural proteins, including the spike (S), membrane (M), envelope (E), and nucleocapsid (N) proteins. Additional accessory genes are sometimes present between these two regions, depending on the species or strain. Much attention has been focused on the S protein, which is a critical structure involved in cell entry.
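
The genome organization summarized above can be inspected directly from the public reference annotation. The short Python sketch below is a minimal example assuming a locally downloaded copy of RefSeq record NC_045512.2 (file path assumed); it simply lists the annotated coding regions and their lengths.

```python
# Minimal sketch: summarize annotated coding regions of the SARS-CoV-2 reference
# genome (RefSeq NC_045512.2) from a locally downloaded GenBank file (path assumed).
from Bio import SeqIO

record = SeqIO.read("NC_045512.2.gb", "genbank")
print(f"Genome length: {len(record.seq)} nt")  # expected: 29,903

for feature in record.features:
    if feature.type == "CDS":
        gene = feature.qualifiers.get("gene", ["?"])[0]
        print(f"{gene}\t{len(feature.location)} nt\t{feature.location}")
# ORF1ab (the replicase gene) spans roughly two-thirds of the genome; the
# structural genes S, E, M, and N occupy most of the final third.
```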

Figure 1: Structure of SARS-CoV-2 capsid and genome. A) The genomic structure of coronaviruses is highly conserved and includes three main regions. Open reading frames (ORF) 1a and 1b encode two polyproteins that are cleaved into the non-structural proteins (nsp). The nsp include enzymes such as RNA-dependent RNA polymerase (RdRp). The last third of the genome encodes structural proteins, including the spike (S), envelope (E), membrane (M), and nucleocapsid (N) proteins. Accessory genes can also be interspersed throughout the genome (18). B) The physical structure of the coronavirus virion, including the components determined by the conserved structural proteins S, E, M, and N. This figure was adapted from “Human Coronavirus Structure”, by BioRender.com (2020), retrieved from https://app.biorender.com/biorender-templates.

1.4.2 Pathogenic Mechanisms of Coronaviruses

While it is possible that SARS-CoV-1 and SARS-CoV-2, like most viruses, enter cells through endocytosis, a process conserved among coronaviruses enables them to target cells for entry through fusion with the plasma membrane (21, 22). Cell entry proceeds in three steps: binding, cleavage, and fusion. First, the viral spike protein binds to a host cell via a recognized receptor or entry point. Coronaviruses can bind to a range of host receptors (23, 24), with binding conserved only at the genus level (12). Viruses in the beta coronavirus genus, to which SARS-CoV-2 belongs, are known to bind to the CEACAM1 protein, 5-N-acetyl-9-O-acetyl neuraminic acid, and angiotensin-converting enzyme 2 (ACE2) (23). This recognition is driven by domains in the S1 subunit (25). SARS-CoV-2 has a high affinity for human ACE2, which is expressed in the vascular endothelium, other epithelial cells, and cardiovascular and renal tissues (26, 27), as well as many others (28). The binding process is guided by the molecular structure of the spike protein, which is organized into three segments: an ectodomain, a transmembrane anchor, and an intracellular tail (29). The ectodomain forms the crown-like structures on the viral membrane and contains two subdomains known as the S1 and S2 subunits (30). The S1 (N-terminal) domain forms the head of the crown and contains the receptor binding motif, and the S2 (C-terminal) domain forms the stalk that supports the head (30). The S1 subunit guides the binding of the virus to the host cell, and the S2 subunit guides the fusion process (29).

After the binding of the S1 subunit to an entry point, the spike protein of coronaviruses is often cleaved at the S1/S2 boundary into the S1 and S2 subunits by a host protease (25, 31, 32). This proteolytic priming is important because it prepares the S protein for fusion (31, 32). The two subunits remain bound by van der Waals forces, with the S1 subunit stabilizing the S2 subunit throughout the membrane fusion process (25). Cleavage at a second site within S2 (S2’) activates S for fusion by inducing conformational changes (25). Similar to SARS-CoV-1, SARS-CoV-2 exhibits redundancy in which host proteases can cleave the S protein (33). Both transmembrane serine protease 2 (TMPRSS-2) and cathepsins B/L have been shown to mediate SARS-CoV-2 S protein proteolytic priming, and small-molecule inhibition of these enzymes fully inhibited viral entry in vitro (33, 34). Other proteases known to cleave the S1/S2 boundary in coronaviruses include TMPRSS-4, trypsin, furin, cathepsins, and human airway trypsin-like protease (HAT) (34).

Unlike in SARS-CoV-1, a second cleavage site featuring a furin-like binding motif is also present near the S1/S2 boundary in SARS-CoV-2 (35). This site is found in HCoV belonging to the A and C lineages of beta coronavirus, including MERS-CoV, but not in the other known members of the B lineage of beta coronavirus that contains SARS-CoV-1 and SARS-CoV-2 (35). It is associated with increased virulence in other viral species (35) and may facilitate membrane fusion of SARS-CoV-2 in the absence of other proteases that prime the S1/S2 site (36). However, given that proteases such as HAT are likely to be present in targets like the human airway, the extent to which this site has had a real-world effect on the spread of SARS-CoV-2 was initially unclear (36). Subsequent research has supported this site as an important contributor to pathogenesis: in vitro analyses have reported that it bolsters pathogenicity specifically in cell lines derived from human airway cells (the Calu-3 cell line) (37–39) and that furin inhibitors reduced pathogenic effects in VeroE6 cells (40).

Electron microscopy suggests that in some coronaviruses, including SARS-CoV-1 and MERS-CoV, a six-helix bundle separates the two subunits in the postfusion conformation, and the unusual length of this bundle facilitates membrane fusion through the release of additional energy (12). The viral membrane can then fuse with the endosomal membrane to release the viral genome into the host cytoplasm. Once the virus enters a host cell, the replicase gene is translated and assembled into the viral replicase complex. This complex then synthesizes a double-stranded RNA (dsRNA) intermediate from the genomic ssRNA(+). The dsRNA intermediate is transcribed and replicated to create viral mRNAs and new ssRNA(+) genomes (18, 41). From there, the virus can spread into other cells. In SARS-CoV-2, the insertion of the furin-like binding site near the S1/S2 boundary is also thought to increase cell-cell adhesion, making it possible for the viral genome to spread directly from cell to cell rather than requiring virions to be released and to re-enter new cells (42). In this way, the genome of SARS-CoV-2 provides insight into the pathogenic behavior of the virus.

Evidence also suggests that SARS-CoV-2 may take advantage of the specific structure of endothelial cells to enter the circulatory system. Endothelial cells are specialized epithelial cells (43) that form a barrier between the bloodstream and surrounding tissues. The endothelium facilitates nutrient, oxygen, and cellular exchange between the blood and vascularized tissues (44). The luminal (interior) surface of the endothelium is lined with glycocalyx, a network of both membrane-bound and soluble proteins and carbohydrates, primarily proteoglycans and glycoproteins (45, 46). The glycocalyx varies in thickness from 0.5 microns in the capillaries to 4.5 microns in the carotid arteries and forms a meshwork that localizes both endothelial- and plasma-derived signals to the inner vessel wall (45). Heparan sulfate is the dominant proteoglycan in the glycocalyx, representing 50-90% of glycocalyx proteoglycan content (47). The SARS-CoV-2 spike protein can bind directly to heparan sulfate, which serves in part as a scaffolding molecule to facilitate ACE2 binding and entry into endothelial cells (46). A heparan sulfate binding site has also been identified near the ACE2 binding site on the viral receptor binding domain (RBD), and modeling has suggested that heparan sulfate binding yields an open conformation that facilitates binding to ACE2 on the cell surface (46). Degrading or removing heparan sulfate was associated with decreased binding (46). Heparan sulfate may also interact with the S1/S2 proteolytic cleavage site and other binding sites to promote binding affinity (48). Notably, treatment with soluble heparan sulfate or even heparin (a commonly used anti-coagulant and vasodilator that is similar in structure to heparan sulfate (49)) potently blocked spike protein binding and viral infection (46). This finding is particularly interesting because degradation of heparan sulfate in the glycocalyx has previously been identified as an important contributor to ARDS and sepsis (50), two common and severe outcomes of COVID-19, and suggests that heparan sulfate could be a target for pharmaceutical inhibition of cell entry by SARS-CoV-2 (51–55). Together, this evidence suggests that heparan sulfate can serve as an important adhesion molecule for SARS-CoV-2 cell entry. It may represent a therapeutic target but has not been pursued as much as other candidate targets (3).

1.4.3 Immune Evasion Strategies

Research in other HCoV provides some indication of how SARS-CoV-2 infection can proceed despite human immune defenses. Infecting the epithelium can help viruses such as SARS-CoV-1 bypass the physical barriers, such as mucus, that comprise the immune system’s first line of defense (56). Once the virus infiltrates host cells, it is adept at evading detection. CD163+ and CD68+ macrophages are especially crucial for the establishment of SARS-CoV-1 in the body (56). These cells most likely serve as viral reservoirs that help shield SARS-CoV-1 from the innate immune response. According to a study on the viral dissemination of SARS-CoV-1 in Chinese macaques, viral RNA could be detected in some monocytes throughout their differentiation into dendritic cells, but without evidence of active viral replication (56). This lack of active viral replication allows SARS-CoV-1 to escape the innate immune response because reduced levels of detectable viral RNA allow the virus to avoid both natural killer cells and Toll-like receptors (56). Even during replication, SARS-CoV-1 is able to mask its dsRNA from detection by the immune system. Although dsRNA is a pathogen-associated molecular pattern that would typically initiate a response from the innate immune system (57), in vitro analysis of nidoviruses including SARS-CoV-1 suggests that these viruses can induce the development of double-membrane vesicles that protect the dsRNA signature from being detected by the host immune system (58). This protective envelope can therefore insulate these coronaviruses from the innate immune system’s detection mechanism (59).

HCoVs are also known to interfere with the host immune response, rather than just evade it. For example, the virulence of SARS-CoV-2 is increased by nsp1, which can suppress host gene expression by stalling mRNA translation and inducing endonucleolytic cleavage and mRNA degradation (60). SARS-CoV-1 also evades the immune response by interfering with the induction of type I interferon (IFN) signaling, a mechanism that normally confers cellular resistance to viral infections; for example, SARS-CoV-1 promotes the ubiquitination and degradation of the RNA sensor adaptor molecules MAVS and TRAF3/6 (61). Additionally, MERS-CoV downregulates antigen presentation via MHC class I and MHC class II, which leads to a reduction in T cell activation (61). These evasion mechanisms, in turn, may facilitate systemic infection. Coronaviruses such as SARS-CoV-1 are also able to evade the humoral immune response through other mechanisms, such as inhibiting certain cytokine pathways or downregulating antigen presentation by infected cells (58).

1.4.4 Host Cell Susceptibility

ACE2 and TMPRSS-2 have been identified as the primary entry portal and as a critical protease, respectively, in facilitating the entry of SARS-CoV-1 and SARS-CoV-2 into a target cell (16, 33, 62–64). This finding has led to a hypothesized role for the expression of these molecules in determining which cells, tissues, and organs are most susceptible to SARS-CoV-2 infection. ACE2 is expressed in numerous organs, such as the heart, kidney, and intestine, but it is most prominently expressed in alveolar epithelial cells; this pattern of expression is expected to contribute to the virus’ association with lung pathology (26, 65, 66), as it did for SARS-CoV-1 (67). A retrospective observational study reported indirect evidence that certain antineoplastic therapies, such as the chemotherapy drug gemcitabine, may reduce the risk of SARS-CoV-2 infection in patients with cancer, possibly via decreased ACE2 expression (68). Additionally, the furin site insertion at the S1/S2 boundary means that SARS-CoV-2 does not require TMPRSS-2 when furin, a ubiquitously expressed endoprotease (69), is present, enabling cell-cell fusion independent of TMPRSS-2 availability (70).
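
Because susceptibility is hypothesized to track the expression of these entry factors, one simple way to formalize the reasoning above is to rank tissues or cell types by joint ACE2 and TMPRSS2 expression. The sketch below uses illustrative, made-up values rather than measured expression data.

```python
# Minimal sketch (illustrative values only): rank tissues by joint expression of
# the entry factors ACE2 and TMPRSS2 as a rough proxy for susceptibility.
import pandas as pd

expr = pd.DataFrame(
    {"ACE2": [12.0, 3.5, 1.2, 0.4], "TMPRSS2": [30.0, 8.0, 15.0, 0.9]},
    index=["alveolar epithelium", "intestine", "kidney", "immune cells"],
)

# Geometric mean as a simple co-expression score.
expr["entry_score"] = (expr["ACE2"] * expr["TMPRSS2"]) ** 0.5
print(expr.sort_values("entry_score", ascending=False))
```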

Clinical investigations of COVID-19 patients have detected SARS-CoV-2 transcripts in bronchoalveolar lavage fluid (BALF) (93% of specimens), sputum (72%), nasal swabs (63%), fibrobronchoscopy brush biopsies (46%), pharyngeal swabs (32%), feces (29%), and blood (1%) (71). Two studies reported that SARS-CoV-2 could not be detected in urine specimens (71, 72); however, a third study identified four urine samples (out of 58) that were positive for SARS-CoV-2 nucleic acids (73). Although respiratory failure remains the leading cause of death for COVID-19 patients (74), SARS-CoV-2 infection can damage many other organ systems including the heart (75), kidneys (76, 77), liver (78), and gastrointestinal tract (79, 80). As it becomes clear that SARS-CoV-2 infection can damage multiple organs, the scientific community is pursuing multiple avenues of investigation in order to build a consensus about how the virus affects the human body.

1.5 Clinical Presentation of COVID-19

SARS-CoV-2 pathogenesis is closely linked with the clinical presentation of the COVID-19 disease. Reports have described diverse symptom profiles associated with COVID-19, with a great deal of variability both within and between institutions and regions. Definitions for non-severe, severe, and critical COVID-19, along with treatment recommendations, are available from the World Health Organization living guidelines (81). A large study from Wuhan, China conducted early in the pandemic identified fever and cough as the two most common symptoms that patients reported at hospital admission (82), while a retrospective study in China described the clinical presentations of patients infected with SARS-CoV-2 as including lower respiratory tract infection with fever, dry cough, and dyspnea (shortness of breath) (83). This study (83) noted that upper respiratory tract symptoms were less common, suggesting that the virus preferentially targets cells located in the lower respiratory tract. However, data from the New York City region (84, 85) showed variable rates of fever as a presenting symptom, suggesting that symptoms may not be consistent across individuals. For example, even within New York City, one study (84) identified low oxygen saturation (<90% without the use of supplemental oxygen or ventilation support) in 20.4% of patients upon presentation, with fever being present in 30.7%, while another study (85) reported cough (79.4%), fever (77.1%), and dyspnea (56.5%) as the most common presenting symptoms; both of these studies considered only hospitalized patients. A later study reported radiographic findings such as ground-glass opacity and bilateral patchy shadowing in the lungs of many hospitalized patients, with most COVID-19 patients having lymphocytopenia, or low levels of lymphocytes (a type of white blood cell) (82). Patients may also experience loss of smell, myalgias (muscle aches), fatigue, or headache. Gastrointestinal symptoms can also present (86), and the CDC includes nausea and vomiting, as well as congestion and runny nose, on its list of symptoms consistent with COVID-19 (8). An analysis of an app-based survey of 500,000 individuals in the U.S. found that among those tested for SARS-CoV-2, a loss of taste or smell, fever, and a cough were significant predictors of a positive test result (87). It is important to note that in this study, the predictive value of symptoms may be underestimated if they are not specific to COVID-19. This underestimation could occur because the outcome measured was a positive, as opposed to a negative, COVID-19 test result, meaning an association would be more easily identified for symptoms that were primarily or exclusively found with COVID-19. At the time the surveys were conducted, due to limits in U.S. testing infrastructure, respondents typically needed to have some symptoms known to be specific to COVID-19 in order to qualify for testing. Widespread testing of asymptomatic individuals may therefore provide additional insight into the range of symptoms associated with COVID-19.
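
Symptom-based analyses like the app-based survey described above typically model test positivity as a function of reported symptoms. The sketch below illustrates the general logistic-regression approach on synthetic data; the symptom weights are assumptions for illustration and are not taken from the cited study.

```python
# Minimal sketch (synthetic data): symptoms as predictors of a positive
# SARS-CoV-2 test, illustrating the modeling approach rather than real results.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = rng.integers(0, 2, size=(n, 3))  # loss of taste/smell, fever, cough (0/1)
logit = -2.0 + 2.0 * X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, 2]  # assumed weights
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated test results

model = LogisticRegression().fit(X, y)
for name, coef in zip(["loss of taste/smell", "fever", "cough"], model.coef_[0]):
    print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")
```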

Consistent with the wide range of symptoms observed and the pathogenic mechanisms described above, COVID-19 can affect a variety of systems within the body in addition to causing respiratory problems (88). For example, COVID-19 can lead to acute kidney injury, especially in patients with severe respiratory symptoms or certain preexisting conditions (89). Some patients are at risk for collapsing glomerulopathy (90).

COVID-19 can also cause neurological complications (91–93), potentially including stroke, seizures, or meningitis (94, 95). One study of autopsy samples suggested that SARS-CoV-2 may be able to enter the central nervous system via the neural–mucosal interface (96). However, a study of 41 autopsied brains (97) found no evidence that the virus actually infects the central nervous system: although viral RNA was detected in some brain samples, it was present only in very small amounts, no viral protein was found, and the RNA may have derived from blood vessels or blood components rather than from the brain tissue itself. Instead, the neuropathological effects of COVID-19 are more likely to be caused indirectly by hypoxia, coagulopathy, or inflammatory processes rather than by infection of the brain (97). COVID-19 has been associated with an increased incidence of large vessel stroke, particularly in patients under the age of 40 (98), and other thrombotic events including pulmonary embolism and deep vein thrombosis (99). The mechanism behind these complications has been suggested to be related to coagulopathy, with reports indicating the presence of antiphospholipid antibodies (100) and elevated levels of d-dimer and fibrinogen degradation products in deceased patients (101). Other viral infections have been associated with coagulation defects and changes to the coagulation cascade; notably, SARS was also found to lead to disseminated intravascular coagulation and was associated with both pulmonary embolism and deep vein thrombosis (102). The mechanism behind these insults has been suggested to be related to inflammation-induced increases in the von Willebrand factor clotting protein, leading to a pro-coagulative state (102). Abnormal clotting (thromboinflammation or coagulopathy) has increasingly been discussed as a possible key mechanism in many cases of severe COVID-19 and may be associated with the high d-dimer levels often observed in severe cases (103–105). This excessive clotting in lung capillaries has been suggested to be related to dysregulated activation of the complement system, part of the innate immune system (106, 107).

Finally, concerns have been raised about long-term sequelae of COVID-19. Some COVID-19 patients have reported that various somatic symptoms (such as shortness of breath, fatigue, and chest pain) and psychological symptoms (such as depression, anxiety, or mild cognitive impairment) can last for months after infection (108). Such long-term effects occur in both adults (109) and children (110). Sustained symptoms affecting a variety of biological systems have been reported across many studies (e.g., (108, 111, 112)). The phenomenon of “long COVID” is not fully understood, although various possible explanations have been proposed, including damage caused by the immune response to infection, damage caused by the infection itself, and negative consequences of the experience of lengthy illness and hospitalization. However, a lack of consistency among definitions used in different studies makes it difficult to develop precise definitions or identify specific symptoms associated with long-term effects of COVID-19 (113, 114). Patient and family support groups for “long haulers” have been formed online, and patient-driven efforts to collect data about post-acute COVID-19 provide valuable sources of information (e.g., (111)). The specific relationship between viral pathogenesis and these reported sequelae remains to be uncovered, however.

1.5.1 Pediatric Presentation

The presentation of COVID-19 can vary greatly among pediatric patients and, in some cases, manifests in distinct ways from COVID-19 in adults. Evidence suggests that children and adolescents tend to have mostly asymptomatic infections and that those who are symptomatic typically exhibit mild illness (115–118). One review examined symptoms reported in 17 studies of children infected with SARS-CoV-2 during the early months of the COVID-19 epidemic in China and one study from Singapore (119). In the more than a thousand cases described, the most common reports were of mild symptoms such as fever, dry cough, fatigue, and nasal congestion and/or runny nose, while three children were reported to be asymptomatic. Severe lower respiratory infection was described in only one of the pediatric cases reviewed. Gastrointestinal symptoms such as vomiting or diarrhea were occasionally reported. Radiologic findings were not always reported in the case studies reviewed, but when they were mentioned they included bronchial thickening, ground-glass opacities, and/or inflammatory lesions (119). Neurological symptoms have also been reported (120).

These analyses indicate that most pediatric cases of COVID-19 are not severe. Indeed, it is estimated that less than 1% of pediatric cases result in critical illness (117, 121), although reporting suggests that pediatric hospitalizations may be greater with the emergence of the Delta variant of concern (VOC) (122–124). Serious complications and, in relatively rare cases, deaths have occurred (125). Of particular interest, children have occasionally experienced a serious inflammatory syndrome, multisystem inflammatory syndrome in children (MIS-C), following SARS-CoV-2 infection (126). This syndrome is similar in some respects to Kawasaki disease, including Kawasaki disease shock syndrome (127–129), and is thought to be a distinct clinical manifestation of SARS-CoV-2 due to its distinct cytokine profile and the presence of burr cells in peripheral blood smears (130, 131). MIS-C has been associated with heart failure in some cases (132). A small number of case studies have identified presentations similar to MIS-C in adults associated with SARS-CoV-2 infection (133–136). However, not all cases of severe COVID-19 in children are characterizable as MIS-C. A recent study (137) described demographic and clinical variables associated with MIS-C in comparison with non-MIS-C severe acute COVID-19 in young people in the United States. Efforts to characterize long-term sequelae of SARS-CoV-2 infection in children face the same challenges as in adults, but long-term effects remain a concern in pediatric patients (110, 138, 139), although some early studies have suggested that they may be less of a concern than in adults (140–142). Research is ongoing into the differences between the pediatric and adult immune responses to SARS-CoV-2, and future research may shed light on the factors that lead to MIS-C; it is also unknown whether the relative advantages of children against severe COVID-19 will remain in the face of current and future variants (143).

1.5.2 Cytokine Release Syndrome

The inflammatory response was identified early on as a potential driver of COVID-19 outcomes due to existing research in SARS and emerging research in COVID-19. While an insufficient inflammatory response is a concern because it will fail to eliminate the immune threat (144), excessive pro-inflammatory cytokine activity can cascade (145) and cause cell damage, among other problems (146). A dysregulated immune response can cause significant damage to the host (147–149), including pathogenesis associated with sepsis. Sepsis, which can lead to multi-organ failure and death (150, 151), is traditionally associated with bacterial infections. However, sepsis associated with viral infections may be underidentified (152), and sepsis has emerged as a major concern associated with SARS-CoV-2 infection (153). Hyperactivity of the pro-inflammatory response due to lung infection is commonly associated with acute lung injury and more rarely with the more severe manifestation, ARDS, which can arise from pneumonia, SARS, and COVID-19 (145, 150). Damage to the capillary endothelium can cause leaks that disrupt the balance between pro-inflammatory cytokines and their regulators (154), and heightened inflammation in the lungs can also serve as a source of systemic inflammation, or sepsis, and potentially multi-organ failure (150). The shift from local to systemic inflammation is a phenomenon often referred to broadly as a cytokine storm (150) or, more precisely, as cytokine release syndrome (155).

Cytokine dysregulation is therefore a significant concern in the context of COVID-19. In addition to the known role of cytokines in ARDS and lung infection more broadly, immunohistological analysis at autopsy of deceased SARS patients revealed that ACE2-expressing cells that were infected by SARS-CoV-1 showed elevated expression of the cytokines IL-6, IL-1β, and TNF-α (156). Similarly, the introduction of the S protein from SARS-CoV-1 to mouse macrophages was found to increase production of IL-6 and TNF-α (157). For SARS-CoV-2 infection leading to COVID-19, early reports described a cytokine storm syndrome-like response in patients with particularly severe infections (65, 158, 159). Sepsis has been identified as a major contributor to COVID-19-related death. Among patients hospitalized with COVID-19 in Wuhan, China, 112 out of 191 (59%) developed sepsis, including all 54 of the non-survivors (83).

While IL-6 is sometimes used as a biomarker for cytokine storm activity in sepsis (150), the relationship between cytokine profiles and the risks associated with sepsis may be more complex. One study of patients with and at risk for ARDS, specifically those who were intubated for mechanical ventilation, found that shortly after the onset of ARDS, anti-inflammatory cytokine concentrations in BALF increased relative to the concentrations of pro-inflammatory cytokines (154). The results suggest that an increase in pro-inflammatory cytokines such as IL-6 may signal the onset of ARDS, but recovery depends on an increased anti-inflammatory response (154). However, patients with severe ARDS were excluded from this study. Another analysis of over 1,400 pneumonia patients in the United States reported that IL-6, tumor necrosis factor (TNF), and IL-10 were elevated at intake in patients who developed severe sepsis and/or ultimately died (160). However, unlike the study analyzing pro- and anti-inflammatory cytokines in ARDS patients (154), this study reported that unbalanced pro-/anti-inflammatory cytokine profiles were rare. This discrepancy could be related to the fact that the sepsis study measured only three cytokines. Although IL-6 has traditionally been considered pro-inflammatory, its pleiotropic effects via both classical and trans-signaling allow it to play an integral role in both the inflammatory and anti-inflammatory responses (161), leading it to be associated with both healthy and pathological responses to viral threat (162). While the cytokine levels observed in COVID-19 patients fall outside of the normal range, they are not as high as typically found in patients with ARDS (163). Regardless of variation in the anti-inflammatory response, prior work has therefore made it clear that pulmonary infection and injury are associated with systemic inflammation and with sepsis. Inflammation has received significant interest both with regard to the pathology of COVID-19 and as a potential avenue for treatment, as the relationship between the cytokine storm and the pathophysiology of COVID-19 has led to the suggestion that a number of immunomodulatory pharmaceutical interventions could hold therapeutic value for the treatment of COVID-19 (3, 164).

1.6 Insights from Systems Biology

Systems biology provides a cross-disciplinary analytical paradigm through which the host response to an infection can be analyzed. This field integrates the “omics” fields (genomics, transcriptomics, proteomics, metabolomics, etc.) using bioinformatics and other computational approaches. Over the last decade, systems biology approaches have been used widely to study the pathogenesis of diverse types of life-threatening acute and chronic infectious diseases (165). Omics-based studies have also provided meaningful information regarding host immune responses and surrogate protein markers in several viral, bacterial and protozoan infections (166). Though the complex pathogenesis and clinical manifestations of SARS-CoV-2 infection are not yet fully understood, omics technologies offer the opportunity for discovery-driven analysis of biological changes associated with SARS-CoV-2 infection.

1.6.1 Transcriptomics

Through transcriptomic analysis, the effect of a viral infection on gene expression can be assessed. Transcriptomic analyses, whether conducted in vitro or in vivo, can potentially reveal insights into viral pathogenesis by elucidating the host response to the virus. For example, infection by some viruses, including the coronaviruses SARS-CoV-2, SARS-CoV-1, and MERS-CoV, is associated with the upregulation of ACE2 in human embryonic kidney cells and human airway epithelial cells (65). This finding suggests that SARS-CoV-2 facilitates the positive regulation of its own transmission between host cells (65). The host immune response also likely plays a key role in mediating infection-associated pathologies. Transcriptomics is therefore a critical tool for characterizing the host response in order to gain insight into viral pathogenesis, and its application is expected to provide novel insights into how hosts respond to SARS-CoV-2 infection and how these changes might influence COVID-19 outcomes.

Several studies have examined the cellular response to SARS-CoV-2 in vitro in comparison to other viruses. One study (167) compared the transcriptional responses of three human cell lines to SARS-CoV-2 and to other respiratory viruses, including MERS-CoV, SARS-CoV-1, Human parainfluenza virus 3, Respiratory syncytial virus, and Influenza A virus. The transcriptional responses differed among the viruses, with changes in differential expression specific to each infection type. Where SARS-CoV-2 was able to replicate efficiently, differential expression analysis revealed that the transcriptional response was significantly different from the response to all of the other viruses tested. A unique pro-inflammatory cytokine signature associated with SARS-CoV-2 was present in cells exposed to both high and low doses of the virus, with the cytokines IL-6 and IL1RA uniquely elevated in response to SARS-CoV-2 relative to other viruses. However, one cell line showed significant IFN-I or IFN-III expression when exposed to high, but not low, doses of SARS-CoV-2, suggesting that IFN induction is dependent on the extent of exposure. These results suggest that SARS-CoV-2 induces a limited antiviral state with low IFN-I or IFN-III expression and a moderate IFN-stimulated gene response, in contrast to other viruses. Other respiratory viruses have been found to encode antagonists to the IFN response (168, 169), including SARS-CoV-1 (170) and MERS-CoV (171).
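
The comparisons above rest on differential expression analysis between infected and mock-treated cultures. The sketch below shows only the core logic on synthetic values; published analyses of this kind rely on dedicated count-based frameworks (e.g., DESeq2 or edgeR) rather than the simple per-gene test used here.

```python
# Minimal sketch (synthetic data): log2 fold change and a simple per-gene test
# between infected and mock samples; not the statistical model of the cited studies.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
genes = [f"gene_{i}" for i in range(200)] + ["IL6", "IL1RN"]
mock = pd.DataFrame(rng.normal(5, 1, (202, 3)), index=genes)      # log2 expression
infected = pd.DataFrame(rng.normal(5, 1, (202, 3)), index=genes)
infected.loc[["IL6", "IL1RN"]] += 3  # simulate induction of inflammatory genes

log2fc = infected.mean(axis=1) - mock.mean(axis=1)
pvals = stats.ttest_ind(infected, mock, axis=1).pvalue
results = pd.DataFrame({"log2FC": log2fc, "p": pvals}).sort_values("p")
print(results.head())
```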

The same analysis (167) suggested that this muted IFN response was specific to cells expressing ACE2, as it was not observed in cells lacking expression of this protein except with ACE2 supplementation and at very high (10-fold increased) levels of SARS-CoV-2 exposure. In another study, direct stimulation with inflammatory cytokines such as type I interferons (e.g., IFNβ) was also associated with the upregulation of ACE2 in human bronchial epithelial cells, with treated groups showing four-fold higher ACE2 expression than control groups at 18 hours post-treatment (172). The hypothesis that SARS-CoV-2 suppresses the host IFN response was further supported by studies showing that several nsp in SARS-CoV-2 suppress interferon activity (173) and that the SARS-CoV-2 ORF3b gene suppresses IFNB1 promoter activity (IFN-I induction) more efficiently than the SARS-CoV-1 ORF3b gene (174). Taken together, these findings suggest that a unique cytokine profile is associated with the response to the SARS-CoV-2 virus, and that this response differs depending on the magnitude of exposure.

Susceptibility and IFN induction may also vary by cell type. Poly(A) bulk RNA-seq analysis of the dynamic transcriptional responses to SARS-CoV-2 and SARS-CoV-1 revealed negligible susceptibility of cells from the H1299 line (viral reads < 0.08% of total reads) compared to those from the Caco-2 and Calu-3 lines (viral reads > 10% of total reads) (175). This finding suggests that the risk of infection varies among cell types, and that cell type could influence which hosts are more or less susceptible. Based on visual inspection of microscopy images alongside transcriptional profiling, the authors also showed distinct responses among the host cell lines evaluated (175). In contrast to Caco-2, Calu-3 cells infected with SARS-CoV-2 showed signs of impaired growth and cell death at 24 hours post infection, as well as moderate IFN induction with a strong up-regulation of IFN-stimulated genes. Interestingly, the results were similar to those reported in Calu-3 cells exposed to much higher levels of SARS-CoV-2 (167), as described above. This finding suggests that IFN induction in Calu-3 cells is not dependent on the level of exposure, in contrast to A549-ACE2 cells. The discrepancy could be explained by the observations that Calu-3 cells are highly susceptible to SARS-CoV-2 and show rapid viral replication (34), whereas A549 cells are incompatible with SARS-CoV-2 infection (176). This discrepancy raises the concern that in vitro models may vary in their similarity to the human response, underscoring the importance of follow-up studies in additional models.
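
The susceptibility comparison above reduces to the fraction of sequencing reads that map to the viral genome in each cell line. The sketch below reproduces that arithmetic with illustrative read counts chosen to echo the reported thresholds (< 0.08% versus > 10%), not the actual counts from the study.

```python
# Minimal sketch (illustrative counts): viral read percentage per cell line.
import pandas as pd

counts = pd.DataFrame(
    {"viral_reads": [1_500, 2_400_000, 3_100_000],
     "total_reads": [20_000_000, 18_000_000, 22_000_000]},
    index=["H1299", "Caco-2", "Calu-3"],
)
counts["viral_pct"] = 100 * counts["viral_reads"] / counts["total_reads"]
print(counts)  # H1299 stays well below 0.1%; Caco-2 and Calu-3 exceed 10%
```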

As a result, transcriptional analysis of patient tissue is an important application of omics technology to understanding COVID-19. Several studies have collected blood samples from COVID-19 patients and analyzed them using RNA-Seq (177–182). Analyzing gene expression in the blood is valuable for understanding host-pathogen interactions because of the potential to identify alterations associated with the immune response and inflammation, among other processes (177). One study compared gene expression in 39 COVID-19 inpatients admitted with community-acquired pneumonia to that of control donors using whole blood cell transcriptomes (177). They also evaluated the effect of mild versus severe disease. A greater number of differentially expressed genes were found when comparing severe patients to controls than when comparing mild patients to controls. They also found that the transcriptional profiles clustered into five groups that could not be explained by disease severity alone. Most severe cases fell into two clusters associated with increased inflammation and granulocyte and neutrophil activation. The presence of these clusters suggests the possibility that personalized medicine could be useful in the treatment of COVID-19 (177). Longitudinal analysis of granulocytes from patients with mild versus severe COVID-19 revealed that granulocyte activation-associated factors differentiated the disease states, with greater numbers of differentially expressed genes early in the disease course (177). This study therefore revealed distinct patterns associated with COVID-19 and identified genes and pathways associated with each cluster.
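
The unsupervised grouping of patient blood transcriptomes described above can be sketched as a standard clustering workflow. The example below uses synthetic profiles (39 patients, 500 genes, five clusters, mirroring the numbers in the cited study) purely to illustrate the procedure, not to reproduce its results.

```python
# Minimal sketch (synthetic data): clustering whole-blood expression profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
profiles = rng.normal(size=(39, 500))          # 39 patients x 500 genes (synthetic)
scaled = StandardScaler().fit_transform(profiles)

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scaled)
print(np.bincount(labels))  # cluster sizes; membership can then be compared
                            # against severity, cell-type signatures, etc.
```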

Many other studies have also identified transcriptomic signatures associated with the immune response and inflammation, including analyses profiling the transcriptomes of BALF (179) and the nasopharynx (183). One study used single-cell transcriptomics to investigate brain and choroid plexus cells from COVID-19 patients compared to healthy controls and controls with influenza; among other signals of neuroinflammation, this study reported cortical T cells only in COVID-19 patients (184). Transcriptomic analysis can thus provide insight into the pathogenesis of SARS-CoV-2 and may also be useful in identifying candidate therapeutics (177).

1.6.2 Proteomics

Proteomic analysis offers an opportunity to characterize the response to a pathogen at the level of proteins rather than transcripts, complementing transcriptomic approaches. Especially early on, this primarily involved evaluating the effect of the virus on cell lines. One early proteomics study investigated changes associated with in vitro SARS-CoV-2 infection using Caco-2 cells (185). This study reported that SARS-CoV-2 induced alterations in multiple vital physiological pathways in the host cells, including translation, splicing, carbon metabolism, and nucleic acid metabolism. Another area of interest is whether SARS-CoV-2 is likely to induce changes similar to those induced by other HCoV. For example, because of the high level of sequence homology between SARS-CoV-2 and SARS-CoV-1, it has been hypothesized that sera from convalescent SARS-CoV-1 patients might show some efficacy in cross-neutralizing SARS-CoV-2-S-driven entry (33). However, despite the high level of sequence homology, certain protein structures might be immunologically distinct, which could preclude effective cross-neutralization between these related viruses (186). Consequently, proteomic analyses of SARS-CoV-1 might also provide some essential information regarding the new pathogen (187, 188).

Proteomics research has been able to get ahead of the timeline for the development of omics-level big data sets specific to SARS-CoV-2 by adopting a comparative bioinformatics approach. Data hubs such as UniProt (189), the NCBI Genome Database (190), The Immune Epitope Database and Analysis Resource (191), and The Virus Pathogen Resource (192) contain a wealth of data from studies in other viruses and even HCoV. Such databases facilitate the systems-level reconstruction of protein-protein interaction networks, providing opportunities to generate hypotheses about the mechanism of action of SARS-CoV-2 and identify potential drug targets. In an initial study (193), 26 of the 29 SARS-CoV-2 proteins were cloned and expressed in HEK293T kidney cells, allowing for the identification of 332 high-confidence human proteins interacting with them. Notably, this study suggested that SARS-CoV-2 interacts with innate immunity pathways. Ranking pathogens by the similarity between their interactomes and that of SARS-CoV-2 suggested West Nile virus, Mycobacterium tuberculosis, and human papillomavirus infections as the top three hits. The fact that the host-pathogen interactome of the bacterium Mycobacterium tuberculosis was found to be similar to that of SARS-CoV-2 suggests that changes related to lung pathology might be a significant contributor to these profiles. Additionally, it was suggested that the envelope protein, E, could disrupt host bromodomain-containing proteins, i.e., BRD2 and BRD4, that bind to histones, and that the spike protein could modulate viral fusion through the GOLGA7-ZDHHC5 acyl-transferase complex by increasing palmitoylation, a post-translational modification that affects how proteins interact with membranes (194).
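
Ranking pathogens by interactome similarity, as described above, amounts to comparing sets of targeted host proteins. The sketch below uses small, made-up protein sets and a Jaccard index purely to illustrate the comparison; the published analysis used its own similarity measure and full interactome data.

```python
# Minimal sketch (toy protein sets): rank pathogens by overlap of their host
# interactomes with that of SARS-CoV-2, using a Jaccard index for illustration.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

sars_cov_2 = {"BRD2", "BRD4", "ZDHHC5", "GOLGA7", "TBK1", "RAB7A"}  # illustrative
other_pathogens = {
    "West Nile virus": {"BRD4", "TBK1", "RAB7A", "STAT2"},
    "M. tuberculosis": {"TBK1", "RAB7A", "NFKB1"},
    "Human papillomavirus": {"BRD4", "TP53"},
}

for name, targets in sorted(other_pathogens.items(),
                            key=lambda kv: jaccard(sars_cov_2, kv[1]),
                            reverse=True):
    print(f"{name}: Jaccard = {jaccard(sars_cov_2, targets):.2f}")
```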

An example of an application of this in silico approach comes from another study (195), which used patient-derived peripheral blood mononuclear cells to identify 251 host proteins targeted by SARS-CoV-2. This study also reported that more than 200 host proteins were disrupted following infection. In particular, a network analysis showed that nsp9 and nsp10 interacted with NF-κB-repressing factor (NKRF), a transcriptional repressor of genes responsive to nuclear factor kappa-light-chain-enhancer of activated B cells (NF-κB). These NF-κB-responsive genes are important to pro-, and potentially also anti-, inflammatory signaling (196). This finding could explain the exacerbation of the immune response that shapes the pathology and the high cytokine levels characteristic of COVID-19, possibly due to the chemotaxis of neutrophils mediated by IL-8 and IL-6. Finally, it was suggested (197) that the E protein of both SARS-CoV-1 and SARS-CoV-2 has a conserved Bcl-2 Homology 3-like motif, which could inhibit anti-apoptosis proteins, e.g., BCL2, and trigger the apoptosis of T cells. Several compounds are known to disrupt the host-pathogen protein interactome, largely through the inhibition of host proteins. Therefore, this research identifies candidate targets for intervention and suggests that drugs modulating protein-level interactions between virus and host could be relevant to treating COVID-19.

As with other approaches, analyzing the patterns found in infected versus healthy human subjects is also important. SARS-CoV-2 infection has been associated with quantitative changes in transcripts, proteins, metabolites, and lipids in patient blood samples (198). One longitudinal study (199) compared COVID-19 patients to symptomatic controls who were PCR-negative for SARS-CoV-2. The longitudinal nature of this study allowed it to account for differences in the scale of inter- versus intraindividual changes. At the time of first sampling, common functions of proteins upregulated in COVID-19 patients relative to controls were related to immune system mediation, coagulation, lipid homeostasis, and protease inhibition. They compared these data to the patient-specific timepoints associated with the highest levels of SARS-CoV-2 antibodies and found that the actin-binding protein gelsolin, which is involved in recovery from disease, showed the steepest decline between those two time points. Immunoglobulins comprised the only proteins that were significantly different between the COVID-19 and control patients at both of these timepoints. The most significantly downregulated proteins between these time points were related to inflammation, while the most significantly upregulated proteins were immunoglobulins. Proteins related to coagulation also increased between the two timepoints. The selection of a symptomatic control cohort rather than healthy controls also suggests that the results are more likely to highlight the response to SARS-CoV-2 and COVID-19 specifically, rather than to disease more broadly. This study also compared the disease course in patients who ultimately survived to those who died and found that ITIH4, a protein associated with the inflammatory response to trauma, may be a biomarker useful for identifying patients at risk of death. These results indicate the value of studying patients in a longitudinal manner over the disease course. By revealing which proteins are perturbed during SARS-CoV-2 infection, proteomics-based analyses can thus provide novel insights into host-virus interactions and serve to generate new avenues of investigation for therapeutics.
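
The longitudinal design described above hinges on paired, within-patient comparisons between sampling timepoints. The sketch below illustrates that basic move with synthetic protein abundances; it is not the statistical pipeline of the cited study.

```python
# Minimal sketch (synthetic data): paired comparison of a protein's abundance
# between a patient's first sample and the peak-antibody timepoint.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
baseline = rng.normal(10, 1, 30)                           # abundance at first sampling
peak_antibody = baseline - 1.5 + rng.normal(0, 0.5, 30)    # simulated later decline

t_stat, p_value = stats.ttest_rel(baseline, peak_antibody)
print(f"paired t = {t_stat:.2f}, p = {p_value:.1e}")
```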

1.7 Viral Virulence

Like that of SARS-CoV-1, the entry of SARS-CoV-2 into host cells is mediated by interactions between the viral spike glycoprotein, S, and human ACE2 (hACE2) (25, 33, 200–205). Differences in how the S proteins of the two viruses interact with hACE2 could partially account for the increased transmissibility of SARS-CoV-2. Studies have reported conflicting binding constants for the S-hACE2 interaction, though they have agreed that the SARS-CoV-2 S protein binds hACE2 with affinity equal to, if not greater than, that of the SARS-CoV-1 S protein (16, 25, 203). The C-terminal domain of the SARS-CoV-2 S protein in particular was identified as the key region of the virus that interacts with hACE2, and the crystal structure of the C-terminal domain of the SARS-CoV-2 S protein in complex with hACE2 reveals stronger interaction and a higher affinity for receptor binding than that of SARS-CoV-1 (204). Among the 14 key binding residues identified in the SARS-CoV-1 S protein, eight are conserved in SARS-CoV-2, and the remaining six are semi-conservatively substituted, potentially explaining variation in binding affinity (25, 203). Studies of crystal structure have shown that the RBD of the SARS-CoV-2 S protein, like that of other coronaviruses, undergoes stochastic hinge-like movement that flips it from a “closed” conformation, in which key binding residues are hidden at the interface between protomers, to an “open” one (16, 25). Spike proteins cleaved at the furin-like binding site are substantially more likely to take an open conformation (66%) than those that are uncleaved (17%) (206). Because the RBD plays such a critical role in viral entry, blocking its interaction with ACE2 could represent a promising therapeutic approach. Nevertheless, despite the high structural homology between the SARS-CoV-2 RBD and that of SARS-CoV-1, monoclonal antibodies targeting the SARS-CoV-1 RBD failed to bind to the SARS-CoV-2 RBD (16). However, in early research, sera from convalescent SARS patients were found to inhibit SARS-CoV-2 viral entry in vitro, albeit with lower efficiency than they inhibited SARS-CoV-1 (33).
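
Statements such as "eight of 14 key binding residues are conserved" follow from a position-by-position comparison of aligned spike sequences. The sketch below tallies conservation over a handful of illustrative residues; the positions and amino acids shown are placeholders, not the actual residue set from the cited structural studies.

```python
# Minimal sketch (illustrative residues, not real data): tally conserved versus
# substituted residues at key receptor-binding positions of two aligned RBDs.
sars1_key_residues = {442: "Y", 472: "L", 479: "N", 480: "D", 487: "T", 491: "Y"}
sars2_equivalents  = {442: "Y", 472: "F", 479: "N", 480: "D", 487: "N", 491: "Y"}

conserved = [p for p, aa in sars1_key_residues.items() if sars2_equivalents[p] == aa]
substituted = [p for p in sars1_key_residues if p not in conserved]
print(f"conserved at {conserved}; substituted at {substituted}")
```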

Comparative genomic analysis reveals that several regions of the coronavirus genome are likely critical to virulence. The S1 domain of the spike protein, which contains the receptor binding motif, evolves more rapidly than the S2 domain (23, 24). However, even within the S1 domain, some regions are more conserved than others, with the receptor-binding regions in S1’s N-terminal domain (S1-NTD) evolving more rapidly than those in its C-terminal domain (S1-CTD) (24). Both the S1-NTD and S1-CTD are involved in receptor binding and can function as RBDs to bind proteins and sugars (23), but RBDs in the S1-NTD typically bind to sugars, while those in the S1-CTD recognize protein receptors (12). Viruses bind to protein receptors with higher affinity than to sugar receptors (12), which suggests that positive selection on or relaxed conservation of the S1-NTD might reduce the risk of a deleterious mutation that would prevent binding. The SARS-CoV-2 S protein also contains an RRAR furin recognition site at the S1/S2 junction (16, 25), setting it apart from both the bat coronavirus RaTG13, with which it shares 96% genome sequence identity, and SARS-CoV-1 (207). Such furin cleavage sites are commonly found in highly virulent influenza viruses (208, 209). The furin recognition site at the S1/S2 junction is likely to increase pathogenicity via destabilization of the spike protein during fusion to ACE2 and the facilitation of cell-cell adhesion (16, 25, 42, 206, 208, 209). These factors may influence the virulence of SARS-CoV-2 relative to other beta coronaviruses. Additionally, a major concern has been the emergence of SARS-CoV-2 variants with increased virulence. The extent to which evolution within SARS-CoV-2 may affect pathogenesis is reviewed below.
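
The RRAR furin recognition site discussed above is a short polybasic motif that can be located by a simple pattern scan. The sketch below searches an illustrative fragment of spike sequence for an R-X-X-R motif followed by serine; the fragment is shown for demonstration and should not be taken as an exact excerpt of the protein.

```python
# Minimal sketch: scan a spike fragment for a polybasic, furin-like R-X-X-R|S motif.
import re

spike_fragment = "TNSPRRARSVASQSII"  # illustrative S1/S2 junction context

for match in re.finditer(r"R..R(?=S)", spike_fragment):
    print(f"furin-like motif {match.group()} at position {match.start() + 1}")
```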

1.8 Molecular Signatures, Transmission, and Variants of Concern

Genetic variation in SARS-CoV-2 has been used to elucidate patterns over time and space. Many mutations are neutral in their effect and can be used to trace transmission patterns. Such signatures within SARS-CoV-2 have provided insights during outbreak investigations (210–212). Similar mutations observed in several patients may indicate that the patients belong to the same transmission group. The tracking of SARS-CoV-2 mutations is recognized as an essential tool for controlling future outbreaks and tracing the path of the spread of SARS-CoV-2. In the first months of the pandemic in early 2020, genomic surveillance efforts in Guangdong, China revealed that local transmission rates were low and that most cases arising in the province were imported (213). Since then, efforts have varied widely among countries: for example, the U.K. has coordinated a national database of viral genomes (214), but efforts to collect this type of data in the United States have been more limited (215). Studies have applied phylogenetic analyses of viral genomes to determine the sources of local COVID-19 outbreaks in Connecticut (USA) (216), the New York City area (USA) (217), and Iceland (218). There has been an ongoing effort to collect SARS-CoV-2 genomes throughout the COVID-19 outbreak, and as of summer 2021, millions of genome sequences had been collected from patients. The sequencing data can be found at GISAID (219), NCBI (220), and the COVID-19 Data Portal (221).
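
Transmission-cluster analyses of the kind described above start by calling substitutions relative to a reference genome and grouping samples that share the same mutation signature. The sketch below shows that logic on toy sequences; real pipelines operate on whole-genome alignments and curated variant calls.

```python
# Minimal sketch (toy sequences): call substitutions against a reference and
# group samples with identical mutation signatures into candidate clusters.
from collections import defaultdict

reference = "ATGGCTAGCTAG"
samples = {
    "patient_1": "ATGGCTCGCTAG",
    "patient_2": "ATGGCTCGCTAG",
    "patient_3": "ATGACTAGCTAG",
}

signatures = defaultdict(list)
for name, seq in samples.items():
    muts = tuple(f"{ref}{i + 1}{alt}"
                 for i, (ref, alt) in enumerate(zip(reference, seq)) if ref != alt)
    signatures[muts].append(name)

for muts, members in signatures.items():
    print(muts, "->", members)  # identical signatures suggest a shared transmission group
```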

Ongoing evolution can be observed in genomic data collected through molecular surveillance efforts. In some cases, mutations can produce functional changes that can impact pathogenesis. One early example is the spike protein mutation D614G, which appeared in March 2020 and became dominant worldwide by the end of May 2020 (222, 223). This variant was associated with increased infectivity and increased viral load, but not with more severe disease outcomes (222, 224). This increased infectivity is likely achieved by altering the conformation of the S1 domain to facilitate binding to ACE2 (224). Similarly, the N439K mutation within the RBD of the spike protein is likely associated with increased transmissibility and enhanced binding affinity for hACE2, although it is also not thought to affect disease outcomes (225). In contrast, a mutation in ORF8 that was identified in Singapore in the early months of 2020 was associated with cases of COVID-19 that were less likely to require treatment with supplemental oxygen (226), and a deletion surrounding the furin site insertion at the S1/S2 boundary has been identified only rarely in clinical settings (227), suggesting that these mutations may disadvantage viral pathogenesis in human hosts. Thus, mutations have been associated with both virological and clinical differences in pathogenesis.

Several VOCs have also been identified and designated through molecular surveillance efforts (228). The Alpha variant (lineage B.1.1.7) was first observed in the U.K. in October 2020 before it quickly spread around the world (229). Other variants meriting further investigation have also been identified, including the Beta variant (B.1.351 lineage) first identified in South Africa and the Gamma variant (P.1 lineage) initially associated with outbreaks in Brazil. These lineages share independently acquired mutations that may affect pathogenicity (230–234). For example, they are all associated with a greater binding affinity for hACE2 than that of the wildtype variant (232, 235, 236), but they were not found to have more efficient cell entry than the wildtype virus (237). A fourth VOC, the Delta variant (B.1.617.2 and AY.1, AY.2, and AY.3 lineages), was identified in India in late 2020 (238). Some of the mutations associated with this lineage may alter fusogenicity and enhance furin cleavage, among other effects associated with increased pathogenicity (239). The changes in these VOC demonstrate how ongoing evolution in SARS-CoV-2 can drive changes in how the virus interacts with host cells.

1.9 Quantifying Viral Presence

Assessing whether a virus is present in a sample is a more complex task than it initially seems. Many diagnostic tests rely on reverse transcription polymerase chain reaction (RT-PCR) to test for the presence versus absence of a virus (7). They may report the cycle threshold (Ct), which indicates the number of doubling cycles required for the target (in this case, SARS-CoV-2) to become detectable. A lower Ct therefore corresponds to a higher viral load. The Ct cutoff that corresponds to a positive result can vary widely but is often around 35. This information is sufficient to answer many questions, since the target must be present in the sample in order to be amplified by RT-PCR. For example, if a patient is presenting with COVID-19 symptoms, a positive RT-PCR test can confirm the diagnosis.
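
To make the relationship between Ct and viral load concrete, the short sketch below converts Ct values into approximate viral loads relative to a Ct 35 cutoff, assuming idealized doubling on each cycle. The function name, the reference Ct, and the assumption of perfect amplification efficiency are illustrative choices rather than properties of any specific assay.

```python
def relative_viral_load(ct, ct_reference=35.0, efficiency=1.0):
    """Approximate viral load relative to a reference Ct.

    Assumes the target roughly doubles once per cycle when efficiency = 1.0,
    so the load ratio is (1 + efficiency) ** (ct_reference - ct). A sample
    with Ct = 25 therefore carries roughly 2**10 (about 1000x) more target
    RNA than a sample at the Ct 35 cutoff.
    """
    return (1.0 + efficiency) ** (ct_reference - ct)


if __name__ == "__main__":
    for ct in (20, 25, 30, 35):
        print(f"Ct {ct}: ~{relative_viral_load(ct):,.0f}x the load at Ct 35")
```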

However, RT-PCR analysis alone cannot provide the information needed to determine whether a virus is present at sufficient levels to be infectious (240). Some studies have therefore taken the additional step of cultivating samples in vitro in order to observe whether cells become infected with SARS-CoV-2. One study collected upper respiratory tract samples from COVID-19 patients, analyzed them with RT-PCR to determine the cycle threshold, and then attempted to cultivate the SARS-CoV-2 virus in VeroE6 cells (240). This study found that out of 246 samples, less than half (103) produced a positive culture. Moreover, at a Ct of 35, only 5 out of 60 samples grew in vitro. Therefore, the RT-PCR-confirmed presence of SARS-CoV-2 in a sample does not necessarily indicate that the virus is present at a high enough concentration to grow and/or spread.

1.10 Mechanisms of Transmission

When a human host is infected with a virus and is contagious, person-to-person viral transmission can occur through several possible mechanisms. When a contagious individual sneezes, coughs, or exhales, they produce respiratory droplets that can contain a large number of viral particles (241). Viral particles can enter the body of a new host when they come in contact with the oral, nasal, eye, or other mucous membranes (241). The primary terms typically used to discuss the transmission of viruses via respiratory droplets are droplet, aerosol, and contact transmission (242). The distinction between droplet and aerosol transmission is typically anchored on whether a particle containing the virus is larger or smaller than 5 micrometers (μm) (243, 244). Droplet transmission typically refers to contact with large droplets that fall quickly to the ground at close range, such as breathing in droplets produced by a sneeze (241, 243). Aerosol transmission typically refers to much smaller particles (less than 5 μm) produced by sneezing, coughing, or exhaling (241, 242) that can remain suspended over a longer period of time and can potentially be moved by air currents (241). It is also possible that viral particles deposited on surfaces via large respiratory droplets could later be aerosolized (241). The transmission of viral particles that have settled on a surface is typically referred to as contact or fomite transmission (241, 245). Any respiratory droplets that settle on a surface could contribute to fomite transmission (241). Droplet and contact transmission are both well-accepted modes of transmission for many viruses associated with common human illnesses, including influenza and rhinovirus (241).

The extent to which aerosol transmission contributes to the spread of respiratory viruses is more widely debated. In influenza A, for example, viral particles can be detected in aerosols produced by infected individuals, but it is not clear to what extent these particles drive the spread of influenza A infection (241, 242, 246–248). Regardless of its role in the spread of influenza A, however, aerosol transmission likely played a role in outbreaks such as the 1918 Spanish Influenza (H1N1) and 2009 “swine flu” (pH1N1) (248). All three of these mechanisms have been identified as possible contributors to the transmission of HCoVs (241), including the highly pathogenic coronaviruses SARS-CoV-1 and MERS-CoV (249, 250). Transmission of SARS-CoV-1 is thought to proceed primarily through droplet transmission, but aerosol transmission is also considered possible (241, 251, 252), and fomite transmission may have also played an important role in some outbreaks (253). Similarly, the primary mechanism of MERS transmission is thought to be droplets because inter-individual transmission appears to be associated with close interpersonal contact (e.g., household or healthcare settings), but aerosolized particles of the MERS virus have been reported to persist much more robustly than influenza A under a range of environmental conditions (254, 255). However, few of these analyses have sought to grow positive samples in culture and thus to confirm their potential to infect new hosts.

Contact, droplet, and aerosol transmission are therefore all worth evaluating when considering possible modes of transmission for a respiratory virus like SARS-CoV-2. The stability of the SARS-CoV-2 virus both in aerosols and on a variety of surfaces was found to be similar to that of SARS-CoV-1 (256). Droplet-based and contact transmission were initially put forward as the greatest concern for the spread of SARS-CoV-2 (257), with droplet transmission considered the dominant mechanism driving the spread of the virus (258) because the risk of fomite transmission under real-world conditions is likely to be substantially lower than the conditions used for experimental analyses (259). The COVID-19 pandemic has, however, exposed significant discrepancies in how terms pertaining to airborne viral particles are interpreted in different contexts (243). The 5-μm distinction between “droplets” and “aerosols” is typical in the biological literature but is likely an artifact of historical science rather than a meaningful boundary in biology or physics (244). Additionally, various ambient conditions such as air flow can influence how particles of different sizes fall or spread (243). Despite initial skepticism about airborne transmission of SARS-CoV-2 through small particles (244), evidence now suggests that small particles can contribute to SARS-CoV-2 transmission (256, 260–262). For example, one early study detected SARS-CoV-2 viral particles in air samples taken from hospitals treating COVID-19 patients, although the infectivity of these samples was not assessed (263). Subsequently, other studies have been successful in growing SARS-CoV-2 in culture with samples taken from the air (264, 265) while others have not (266, 267) (see (268) for a systematic review of available findings as of July 2020). The fact that viable SARS-CoV-2 may exist in aerosolized particles calls into question whether some axioms of COVID-19 prevention, such as 2-meter social distancing, are sufficient (244, 264, 269).

1.10.1 Symptoms and Viral Spread

Other aspects of pathogenesis are also important to understanding how the virus spreads, especially the relationship between symptoms, viral shedding, and contagiousness. Symptoms associated with reported cases of COVID-19 range from mild to severe (8), but some individuals who contract COVID-19 remain asymptomatic throughout the duration of the illness (270). The incubation period, or the time period between exposure and the onset of symptoms, has been estimated at five to eight days, with means of 4.91 (95% confidence interval (CI) 4.35-5.69) and 7.54 (95% CI 6.76-8.56) reported in two different Asian cities and a median of 5 (IQR 1 to 6) reported in a small number of patients in a Beijing hospital (271, 272).

However, the exact relationship between contagiousness and viral shedding remains unclear. Estimates suggest that viral shedding can, in some cases, begin as early as 12.3 days (95% CI 5.9-17.0) before the onset of symptoms, although this was found to be very rare, with less than 0.1% of transmission events occurring 7 or more days before symptom onset (273). Transmissibility appeared to peak around the onset of symptoms (95% CI -0.9 - 0.9 days), and only 44% (95% CI 30–57%) of transmission events were estimated to occur from presymptomatic contacts (273). A peak in viral load corresponding to the onset of symptoms was also confirmed by another study (240). As these trends became apparent, concerns arose due to the potential for individuals who did not yet show symptoms to transmit the virus (274). Recovered individuals may also be able to transmit the virus after their symptoms cease. A study of the communicable period based on twenty-four individuals who tested positive for SARS-CoV-2 prior to or without developing symptoms estimated that individuals may be contagious for one to twenty-one days, but they note that this estimate may be low (270). In an early study, viral nucleic acids were reported to remain at observable levels in the respiratory specimens of recovering hospitalized COVID-19 patients for a median of 20 days and with a maximum observed duration through 37 days, when data collection for the study ceased (83).

As more estimates of the duration of viral shedding were released, they converged around approximately three weeks from first positive PCR test and/or onset of symptoms (which, if present, are usually identified within three days of the initial PCR test). For example, in some studies, viral shedding was reported for up to 28 days following symptom onset (275) and for one to 24 days from first positive PCR test, with a median of 12 days (72). On the other hand, almost 70% of patients were reported to still have symptoms at the time that viral shedding ceased, although all symptoms reduced in prevalence between onset and cessation of viral shedding (276). The median time that elapsed between the onset of symptoms and cessation of viral RNA shedding was 23 days and between first positive PCR test and cessation of viral shedding was 17 days (276). The fact that this study reported symptom onset to predate the first positive PCR test by an average of three days, however, suggests that there may be some methodological differences between it and related studies. Furthermore, an analysis of residents of a nursing home with a known SARS-CoV-2 case measured similar viral load in residents who were asymptomatic regardless of whether they later developed symptoms, and the load in the asymptomatic residents was comparable to that of residents who displayed either typical or atypical symptoms (277). Taken together, these results suggest that the presence or absence of symptoms is not a reliable predictor of viral shedding or of SARS-CoV-2 status (e.g., (278)). However, it should be noted that viral shedding is not necessarily a robust indicator of contagiousness. The risk of spreading the infection was low after ten days from the onset of symptoms, as viral load in sputum was found to be unlikely to pose a significant risk based on efforts to culture samples in vitro (275). The relationship between symptoms, detectable levels of the virus, and risk of viral spread is therefore complex.

The extent to which asymptomatic or presymptomatic individuals are able to transmit SARS-CoV-2 has been a question of high scientific and community interest. Early reports (February and March 2020) described transmission from presymptomatic SARS-CoV-2-positive individuals to close family contacts (279, 280). One of these reports (280) also included a description of an individual who tested positive for SARS-CoV-2 but never developed symptoms. Later analyses also sought to estimate the proportion of infections that could be traced back to a presymptomatic or asymptomatic individual (e.g., (281)). Estimates of the proportion of individuals with asymptomatic infections have varied widely. The proportion of asymptomatic individuals on board the Diamond Princess cruise ship, which was the site of an early COVID-19 outbreak, was estimated at 17.9% (282). In contrast, a model using the prevalence of antibodies among residents of Wuhan, China estimated a much higher rate of asymptomatic cases, at approximately 7 in 8, or 87.5% (283). An analysis of the populations of care homes in London found that, among the residents (median age 85), the rate of asymptomatic infection was 43.8%, and among the caretakers (median age 47), the rate was 49.1% (284). The duration of viral shedding may also be longer in individuals with asymptomatic cases of COVID-19 compared to those who do show symptoms (285). As a result, the potential for individuals who do not know they have COVID-19 to spread the virus raises significant concerns. In Singapore and Tianjin, two cities studied to estimate incubation period, an estimated 40-50% and 60-80% of cases, respectively, were considered to be caused by contact with asymptomatic individuals (271). An analysis of viral spread in the Italian town of Vo’, which was the site of an early COVID-19 outbreak, revealed that 42.5% of cases were asymptomatic and that the rate was similar across age groups (286). The argument was thus made that the town’s lockdown was imperative for controlling the spread of COVID-19 because it isolated asymptomatic individuals. While more models are likely to emerge to better explore the effect of asymptomatic individuals on SARS-CoV-2 transmission, these results suggest that strategies for identifying and containing asymptomatic but contagious individuals are important for managing community spread.

1.10.2 Estimating the Fatality Rate

Estimating the occurrence of asymptomatic and mild COVID-19 cases is important to identifying the mortality rate associated with COVID-19. The mortality rate of greatest interest would be the total number of fatalities as a fraction of the total number of people infected. One commonly reported metric is the case fatality rate (CFR), which compares the number of COVID-19 related deaths to the number of confirmed or suspected cases. However, in locations without universal testing protocols, it is impossible to identify all infected individuals because so many asymptomatic or mild cases go undetected. Therefore, a more informative metric is the infection fatality rate (IFR), which compares the known deaths to the estimated number of cases. It thus requires the same numerator as CFR, but divides by an approximation of the total number of cases rather than only the observed/suspected cases. IFR varies regionally, with some locations observed to have IFRs as low as 0.17% while others are as high as 1.7% (287). Estimates of CFR at the national and continental levels and of IFR at the continental level are maintained by the Centre for Evidence-Based Medicine (288). Several meta-analyses have also sought to estimate IFR at the global scale. These estimates have varied; one peer-reviewed study aggregated data from 24 other studies and estimated IFR at 0.68% (95% CI 0.53%–0.82%), but a preprint that aggregated data from 139 countries calculated a global IFR of 1.04% (95% CI 0.77%-1.38%) when false negatives were considered in the model (287, 289). A similar prevalence estimate was identified through a repeated cross-sectional serosurvey conducted in New York City that estimated the IFR as 0.97% (290). Examination of serosurvey-based estimates of IFR identified convergence on a global IFR estimate of 0.60% (95% CI 0.42%–0.77%) (287). All of these studies note that IFR varies widely by location, and it is also expected to vary with demographic and health-related variables such as age, sex, prevalence of comorbidities, and access to healthcare and testing (291). Estimates of infection rates are becoming more feasible as more data becomes available for modeling and will be bolstered as serological testing becomes more common and more widely available. However, this research may be complicated due to the emergence of variants over time, as well as the varying availability and acceptance of vaccines in different communities and locations.
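
The distinction between these two metrics can be illustrated with a short calculation. The numbers below are hypothetical and serve only to show how the same death count yields different rates depending on whether the denominator counts detected cases (CFR) or an estimate of all infections (IFR).

```python
def case_fatality_rate(deaths, confirmed_cases):
    """CFR: deaths divided by confirmed or suspected cases."""
    return deaths / confirmed_cases


def infection_fatality_rate(deaths, estimated_infections):
    """IFR: the same numerator, but divided by an estimate of all
    infections, including undetected asymptomatic and mild cases."""
    return deaths / estimated_infections


if __name__ == "__main__":
    # Hypothetical numbers for illustration only.
    deaths = 1_000
    confirmed = 50_000          # cases detected by testing
    estimated_total = 150_000   # e.g., a seroprevalence-based estimate of all infections
    print(f"CFR: {case_fatality_rate(deaths, confirmed):.2%}")              # 2.00%
    print(f"IFR: {infection_fatality_rate(deaths, estimated_total):.2%}")   # 0.67%
```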

1.11 Dynamics of Transmission

Disease spread dynamics can be estimated using R0, the basic reproduction number, and Rt, the effective reproduction number. Accurate estimates of both are crucial to understanding the dynamics of infection and to predicting the effects of different interventions. R0 is the average number of new (secondary) infections caused by one infected person, assuming a wholly susceptible population (292), and is one of the most important epidemiological parameters (293). A simple mechanistic model used to describe infectious disease dynamics is a susceptible-infected-recovered compartmental model (294, 295). In this model, individuals move through three states: susceptible, infected, and recovered; two parameters, \(\gamma\) and \(\beta\), specify the rate at which the infectious recover and the infection transmission rate, respectively, and R0 is estimated as the ratio \(\beta / \gamma\) (293, 296). A pathogen can invade a susceptible population only if R0 > 1 (293, 297). The spread of an infectious disease at a particular time t can be quantified by Rt, the effective reproduction number, which assumes that part of the population has already recovered (and thus gained immunity to reinfection) or that mitigating interventions have been put into place. For example, if only a fraction St of the population is still susceptible, Rt = St × R0. When Rt is greater than 1, an epidemic grows (i.e., the proportion of the population that is infectious increases); when Rt is less than 1, the proportion of the population that is infectious decreases. R0 and Rt can be estimated directly from epidemiological data or inferred using susceptible-infected-recovered-type models. To capture the dynamics of SARS-CoV-2 accurately, the addition of a fourth compartment, i.e., a susceptible-exposed-infectious-recovered model, may be appropriate because such models account for the relative lengths of incubation and infectious periods (298).
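
The compartmental framework described above can be sketched numerically. The following minimal susceptible-infected-recovered example uses hypothetical parameter values (\(\beta\) = 0.5 and \(\gamma\) = 0.2 per day, so R0 = 2.5) to show how R0 = \(\beta / \gamma\) and how Rt = (St/N) × R0 falls as the susceptible pool is depleted; it is a toy illustration, not a calibrated SARS-CoV-2 model.

```python
# A minimal SIR sketch (hypothetical parameters) illustrating how beta and
# gamma determine R0 = beta / gamma and how Rt = (S_t / N) * R0 falls as the
# susceptible pool is depleted.
import numpy as np
from scipy.integrate import odeint


def sir(y, t, beta, gamma, n):
    s, i, r = y
    ds = -beta * s * i / n
    di = beta * s * i / n - gamma * i
    dr = gamma * i
    return ds, di, dr


n = 1_000_000                   # population size
beta, gamma = 0.5, 0.2          # transmission and recovery rates (per day)
r0 = beta / gamma               # basic reproduction number = 2.5
y0 = (n - 10, 10, 0)            # nearly fully susceptible population
t = np.linspace(0, 180, 181)

s, i, r = odeint(sir, y0, t, args=(beta, gamma, n)).T
rt = (s / n) * r0               # effective reproduction number over time

print(f"R0 = {r0:.2f}")
print(f"Peak infectious fraction: {i.max() / n:.1%} on day {int(t[i.argmax()])}")
print(f"Rt at the epidemic peak: {rt[i.argmax()]:.2f}")  # close to 1 at the peak
print(f"Final fraction ever infected: {r[-1] / n:.1%}")
```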

Original estimates of R0 for COVID-19 lie in the range R0=1.4-6.5 (299–301). Variation in R0 is expected between different populations, and the estimated values of R0 discussed below are for specific populations in specific environments. The different estimates of R0 should not necessarily be interpreted as a range of estimates of the same underlying parameter. In one study of international cases, the predicted value was R0=1.7 (302). In China (both Hubei province and nationwide), the value was predicted to lie in the range R0=2.0-3.6 (299, 303, 304). Another estimate based on a cruise ship where an outbreak occurred predicted R0=2.28 (305). Susceptible-exposed-infectious-recovered model-derived estimates of R0 range from 2.0 - 6.5 in China (306–309) to R0=4.8 in France (310). Using the same model as for the French population, a study estimated R0=2.6 in South Korea (310), which is consistent with other studies (311). Finally, a meta-analysis of twelve studies published between January 1 and February 7, 2020 estimated the median R0 to be 2.79 (IQR 1.16) (300).

Inference of the effective reproduction number can provide insight into how populations respond to an infection and the effectiveness of interventions. In China, Rt was predicted to lie in the range 1.6-2.6 in January 2020, before travel restrictions (312). Rt decreased from 2.35 one week before travel restrictions were imposed (January 23, 2020) to 1.05 one week after. Using their model, the authors also estimated the probability of new outbreaks occurring. Assuming individual-level variation in transmission comparable to that of MERS or SARS, the probability of a single individual exporting the virus and causing a large outbreak is 17-25%, and assuming variation like that of SARS and transmission patterns like those observed for COVID-19 in Wuhan, the probability of a large outbreak occurring after ≥4 infections exist at a new location is greater than 50%. An independent study came to similar conclusions, finding Rt=2.38 in the two-week period before January 23 with a decrease to Rt = 1.34 (using data from January 24 to February 3) or Rt=0.98 (using data from January 24 to February 8) (301). In South Korea, Rt was inferred for February through March 2020 in two cities, Daegu (the center of the outbreak) and Seoul (311). Metro data was also analyzed to estimate the effects of social distancing measures. Rt decreased in Daegu from around 3 to <1 over the period that social distancing measures were introduced. In Seoul, Rt decreased slightly, but remained close to 1 (and larger than Rt in Daegu). These findings indicate that social distancing measures appeared to be effective in containing the infection in Daegu, but in Seoul, Rt remained above 1, meaning secondary outbreaks remained possible. The study also shows the importance of region-specific analysis: the large decline in case load nationwide was mainly due to the Daegu region and could mask persistence of the epidemic in other regions, such as Seoul and Gyeonggi-do. In Iran, estimates of Rt declined from 4.86 in the first week to 2.1 by the fourth week after the first cases were reported (313). In Europe, analysis of 11 countries inferred the dynamics of Rt over a time range from the beginning of the outbreak until March 28, 2020, by which point most countries had implemented major interventions (such as stay-at-home orders, public gathering bans, and school closures) (314). Across all countries, the mean Rt before interventions began was estimated as 3.87; Rt varied considerably, from below 3 in Norway to above 4.5 in Spain. After interventions, Rt decreased by an average of 64% across all countries, with mean Rt=1.43. The lowest predicted value was 0.97 for Norway and the highest was 2.64 for Sweden, which could be related to the fact that Sweden did not implement social distancing measures on the same scale as other countries. The study concluded that while large changes in Rt were observed, it was too early at that point to tell whether the interventions put into place would be sufficient to decrease Rt below 1.

Evolution within SARS-CoV-2 has also driven changes in the estimated reproduction number for different populations at different times. As of June 2021, the reproduction number had increased globally relative to 2020, and increased transmissibility over the wildtype variant was observed for the Alpha, Beta, Gamma, and Delta VOC (315). In the U.S. between December 2020 and January 2021, B.1.1.7 (Alpha) was estimated to have an increased transmission of 35 to 45% relative to common SARS-CoV-2 variants at the time, with B.1.1.7 becoming the dominant SARS-CoV-2 variant in some places at some timepoints (316). This lineage was estimated to have increased transmissibility of 43 to 90% in the U.K. (317). An estimate of the reproduction number of B.1.1.7 in the U.K. from September to December 2020 yielded 1.59 overall and between 1.56 and 1.95 in different regions of the country (234). The Delta variant is particularly transmissible, and it has been estimated to be twice as transmissible as the wildtype variant of SARS-CoV-2 (315). A review of the literature describing the Delta variant identified a mean estimated R0 of 5.08 (318). Such differences can affect fitness and therefore influence the relative contributions of different lineages to a given viral gene pool over time (319). Therefore, the evolution of the virus can result in shifts in the reproduction rate.

More generally, population-level epidemic dynamics can be both observed and modeled (296). Data and empirically determined biological mechanisms inform models, while models can be used to try to understand data and systems of interest or to make predictions about possible future dynamics, such as the estimation of capacity needs (320) or the comparison of predicted outcomes among prevention and control strategies (321, 322). Many current efforts to model Rt have also led to tools that assist the visualization of estimates in real time or over recent intervals (323, 324). These are valuable resources, yet it is also important to note that the estimates arise from models containing many assumptions and are dependent on the quality of the data they use, which varies widely by region.
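
Many of these tools estimate Rt from incidence data using a renewal-equation approach. The sketch below is a deliberately simplified version of that idea, with a hypothetical daily case series and generation-interval weights; it omits the reporting delays, smoothing, and uncertainty quantification that production-grade estimators handle, and should not be mistaken for the method used by any particular tool cited here.

```python
# A simplified renewal-equation estimate of Rt from a daily incidence series:
# Rt ~ I_t / sum_s(w_s * I_{t-s}), where w is the generation-interval
# distribution. The incidence series and interval weights below are
# hypothetical and chosen only for illustration.
import numpy as np


def estimate_rt(incidence, gen_interval):
    """Return naive Rt estimates aligned with the incidence series."""
    w = np.asarray(gen_interval, dtype=float)
    w = w / w.sum()                                # normalize the weights
    rt = np.full(len(incidence), np.nan)
    for t in range(len(w), len(incidence)):
        past = incidence[t - len(w):t][::-1]       # most recent day first
        expected = float(np.dot(w, past))          # current force of infection
        if expected > 0:
            rt[t] = incidence[t] / expected
    return rt


# Hypothetical daily case counts and a short discretized generation interval.
cases = np.array([5, 7, 10, 14, 20, 28, 38, 50, 62, 72, 78, 80, 78, 72])
gen_interval = [0.2, 0.4, 0.3, 0.1]                # weights for lags 1..4 days

for day, r in enumerate(estimate_rt(cases, gen_interval)):
    if not np.isnan(r):
        print(f"day {day}: Rt ~ {r:.2f}")
```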

1.12 Effect of Vaccines on Pathogenesis and Community Spread

The vaccine clinical trial data demonstrate a significant reduction in the likelihood of contracting symptomatic COVID-19, thereby succeeding in the primary goal of vaccination. The mRNA vaccines in particular were initially so effective in preventing disease that they were also assumed to reduce the likelihood of transmission (an assumption reflected, for example, in venues requiring proof of vaccination). However, in light of the reduced efficacy in response to VOC, it is especially important to consider whether this assumption is supported by the available evidence.

This question is made up of several components. The crux is whether vaccinated individuals with a SARS-CoV-2 infection, regardless of symptom status, are as contagious as unvaccinated, infected individuals. Additionally, as outlined above, an important qualification is that the variants of SARS-CoV-2 circulating at the time of each study must be considered in light of the effect of evolution on vaccine efficacy.

The phase II/III clinical trials evaluating the mRNA vaccines assessed vaccine efficacy based on COVID-19 diagnosis, thereby detecting only patients who received a diagnosis. In order to identify patients infected with SARS-CoV-2 who did not receive a diagnosis, such as those who did not develop symptoms, it would be necessary to conduct routine PCR testing even in the absence of symptoms. Prior to the development of vaccines, the evidence suggested that asymptomatic individuals could spread SARS-CoV-2. Investigation of viral dynamics of asymptomatic infection in early 2020 indicated that asymptomatic patients continued to shed the virus for a duration similar to that of symptomatic patients (325) (although viral shedding should not be conflated with contagiousness without further investigation). Another study found viral load to be higher in the nasopharyngeal/oropharyngeal samples of asymptomatic patients compared to symptomatic patients hospitalized due to symptoms and/or known exposure (326). However, the sample size in both of these studies was small, and a larger study found higher viral load in symptomatic than in asymptomatic cases (327), while a systematic review found a reduced probability of transmission from asymptomatic cases (328). While far from conclusive, these studies suggest that asymptomatic cases still carry a risk of transmitting SARS-CoV-2.

One important consideration is therefore how likely vaccinated individuals are to develop asymptomatic SARS-CoV-2. Considering asymptomatic cases is necessary to establish a more complete picture of efficacy with respect to spread. Routine testing of healthcare workers in California who had received an mRNA vaccine revealed a slightly higher absolute risk of testing positive than was identified in the phase II/III trials, although the extent to which asymptomatic infection influenced these numbers was not investigated (329). Another study analyzed the results of COVID-19 screening tests administered to asymptomatic individuals prior to receiving certain medical services at the Mayo Clinic in several locations across the United States. This study found patients who had received two doses of an mRNA vaccine to be 73% less likely to have asymptomatic COVID-19 than patients who had received zero doses (330). Because this study began on December 17, 2020, a date selected to coincide with the first day vaccines were available at the Mayo Clinic, this number may underestimate the efficacy of the vaccines given that many people eligible for early vaccination were at increased risk for exposure (e.g., healthcare workers and residents of long-term care facilities) (330). In Israel, a longitudinal study of nearly 12,000 healthcare workers found that of the 5,372 fully vaccinated people with Pfizer/BioNTech BNT162b2, 8 developed symptomatic COVID-19 (0.15%) and 19 developed asymptomatic COVID-19 (0.35%) (331). While the study itself analyzed the efficacy of the vaccine based on person-days, these findings also suggest that many or even the majority of SARS-CoV-2 infections in vaccinated individuals are likely to be asymptomatic. Therefore, in addition to the symptomatic cases reported by the vaccine clinical trials, these findings suggest that asymptomatic cases can also occur in vaccinated people. In the absence of symptoms, individuals are less likely to know to self-isolate, and therefore evaluating the effect of the vaccine on viral load is critical to understanding the role vaccinated individuals can play in spreading SARS-CoV-2.

Another question of interest is therefore whether vaccinated individuals positive for SARS-CoV-2 carry a similar viral load to unvaccinated individuals. Viral load is often approximated by cycle threshold (Ct), or the cycle at which viral presence is detected during RT-qPCR, with a lower Ct corresponding to a greater viral load. A prospective cohort study that evaluated front-line workers in six U.S. states from December 2020 to April 2021 reported a 40% reduction in viral load even with just a single dose of an mRNA vaccine (332). The vaccine also appeared to influence the time to viral clearance: the risk of having detectable levels of SARS-CoV-2 for more than one week was reduced by 66% in participants who had received at least one dose (332). However, this study compared the mean viral load across the two groups, meaning that these findings cannot be extrapolated across all points in the disease course. Similarly, between December 2020 and February 2021, positive RT-qPCR tests were analyzed for almost 5,000 Israeli patients (333). Ct was analyzed relative to when each patient received the first dose of the Pfizer mRNA vaccine. A sharp increase in Ct (corresponding to reduced viral load) was observed between days 11 and 12, consistent with what is known about the onset of immunity following vaccination. This pattern therefore suggested a direct effect of vaccination on viral load.

Other studies, however, have not offered support for a reduced viral load in breakthrough cases. In Singapore, which has strict protocols for screening individuals with potential COVID-19 exposure, a retrospective cohort of patients who tested positive for SARS-CoV-2 between April and June 2021 was analyzed to compare viral kinetics and symptom course between vaccinated and unvaccinated cases. Vaccinated individuals who tested positive experienced fewer symptoms than unvaccinated, SARS-CoV-2-positive individuals and were more likely to be asymptomatic (334). Additionally, this study analyzed Ct over time and found that, though the median values were similar between the two groups at disease onset, viral load appeared to decrease more rapidly in vaccinated cases (334). This study is likely to have evaluated a more accurate representation of all COVID-19 outcomes than has been feasible in most studies, but one limitation was that the RT-PCR reactions were conducted in many different facilities. A third study investigated viral load (as approximated by Ct) using samples processed in a single laboratory during the summer of 2021 (335). This study identified no significant differences in Ct between fully vaccinated and unvaccinated cases, but this study used samples sent for diagnosis and was not longitudinal. It offered the additional benefit of culturing samples to assess whether their Ct threshold was likely to represent contagiousness and found that SARS-CoV-2 could be cultured from 51 of 55 samples with Ct less than 25 (the cut-off used in many studies). Another study of samples collected at two sites in San Francisco, one of which tested only asymptomatic individuals, reported no difference in Ct between asymptomatic and symptomatic cases regardless of whether vaccination status was included in the model (336). Though each of these three studies offers distinct strengths and weaknesses, taken together, they suggest that viral load is likely to be similar in vaccinated and unvaccinated individuals, but that vaccinated individuals clear the virus more rapidly, meaning that the average viral load is lower over time.

Given the emergence of VOC, especially the Delta and Omicron variants, for which breakthrough infections are more common, the potential for vaccinated individuals to spread SARS-CoV-2 is not static over time. In fact, most of the studies reporting reduced viral load in vaccinated individuals collected samples before the Delta variant became dominant. The emergence of this variant may partially account for why more recent studies tend to find no difference in viral load between vaccinated and unvaccinated cases.

Taken together, these findings can provide some insight into how vaccines influence community spread. While vaccinated individuals may be more likely to experience asymptomatic infection, current evidence about viral load in asymptomatic versus symptomatic cases is ambiguous. Similarly, no conclusions can be drawn about whether viral load is different in vaccinated versus unvaccinated cases. Therefore, at present, the evidence suggests that vaccinated individuals who are infected can still contribute to community spread. The one potential mitigating factor supported at present is that differences in the viral kinetics may result in vaccinated cases infecting fewer individuals over time due to a more rapid decrease in viral load (334), although this study did not examine patterns in secondary transmission. Thus, the virological evidence suggests that public health measures such as masking and distancing remain important even in areas with high vaccination rates.

1.13 Conclusions

The novel coronavirus SARS-CoV-2 is the third HCoV to emerge in the 21st century, and research into previous HCoVs has provided a strong foundation for characterizing the pathogenesis and transmission of SARS-CoV-2. Critical insights into how the virus interacts with human cells have been gained from previous research into HCoVs and other viral infections. With the emergence of three devastating HCoV over the past twenty years, emergent viruses are likely to represent an ongoing threat. Contextualizing SARS-CoV-2 alongside other viruses serves not only to provide insights that can be immediately useful for combating this virus itself but may also prove valuable in the face of future viral threats.

Host-pathogen interactions provide a basis not only for understanding COVID-19, but also for developing a response. As with other HCoVs, the immune response to SARS-CoV-2 is likely driven by detection of its spike protein, which allows it to enter cells through ACE2. Epithelial cells have also emerged as the major cellular target of the virus, contextualizing the respiratory and gastrointestinal symptoms that are frequently observed in COVID-19. Many of the mechanisms that facilitate the pathogenesis of SARS-CoV-2 are currently under consideration as possible targets for the treatment or prevention of COVID-19 (2, 3). Research in other viruses also provides a foundation for understanding the transmission of SARS-CoV-2 among people and can therefore inform efforts to control the virus’s spread. Airborne forms of transmission (droplet and aerosol transmission) have emerged as the primary modes by which the virus spreads to new hosts. Asymptomatic transmission was also a concern in the SARS outbreak of 2002-03 and, as in the current pandemic, presented challenges for estimating rates of infection (337). These insights are important for developing a public health response, such as the CDC’s shift in its recommendations surrounding masking (338).

Even with the background obtained from research in SARS and MERS, COVID-19 has revealed itself to be a complex and difficult-to-characterize disease that has many possible presentations that vary with age. Variability in presentation, including cases with no respiratory symptoms or with no symptoms altogether, was also reported during the SARS epidemic at the beginning of the 21st century (337). The variability in both which symptoms present and their severity has presented challenges for public health agencies seeking to provide clear recommendations regarding which symptoms indicate SARS-CoV-2 infection and should prompt isolation. Asymptomatic cases add complexity both to efforts to estimate statistics such as R0 and Rt, which are critical to understanding the transmission of the virus, and to efforts to estimate the IFR, which is an important component of understanding its impact on a given population. The development of diagnostic technologies over the course of the pandemic has facilitated more accurate identification, including of asymptomatic cases (7). As more cases have been diagnosed, the health conditions and patient characteristics associated with more severe infection have also become more clear, although there are likely to be significant sociocultural elements that also influence these outcomes (339). While many efforts have focused on adults, and especially older adults because of the susceptibility of this demographic, additional research is needed to understand the presentation of COVID-19 and MIS-C in pediatric patients. As more information is uncovered about the pathogenesis of HCoV and SARS-CoV-2 specifically, the diverse symptomatology of COVID-19 has conformed, and likely will continue to conform, with the ever-broadening understanding of how SARS-CoV-2 functions within a human host.

While the SARS-CoV-2 virus is very similar to other HCoV in several ways, including in its genomic structure and the structure of the virus itself, there are also some differences that may account for differences in the COVID-19 pandemic compared to the SARS and MERS epidemics of the past two decades. The R0 of SARS-CoV-2 has been estimated to be similar to SARS-CoV-1 but much higher than that of MERS-CoV (340), although a higher R0 has been estimated for some VOC. While the structures of the viruses are very similar, evolution among these species may account for differences in their transmissibility and virulence. For example, the acquisition of a furin cleavage site at the S1/S2 boundary within the SARS-CoV-2 S protein may be associated with increased virulence. Additionally, concerns have been raised about the accumulation of mutations within the SARS-CoV-2 species itself, and whether these could influence virulence (341). These novel variants may be resistant to vaccines and antibody treatments such as Bamlanivimab that were designed based on the wildtype spike protein (3, 6). As a consequence of reliance on targeting the SARS-CoV-2 spike protein for many therapeutic and prophylactic strategies, increased surveillance is required to rapidly identify and prevent the spread of novel SARS-CoV-2 variants with alterations to the spike protein. The coming of age of genomic technologies has made these types of analyses feasible, and genomics research characterizing changes in SARS-CoV-2 along with temporal and spatial movement is likely to provide additional insights into whether within-species evolution influences the effect of the virus on the human host. Additionally, the rapid development of sequencing technologies over the past decade has made it possible to rapidly characterize the host response to the virus. For example, proteomics analysis of patient-derived cells revealed candidate genes whose regulation is altered by SARS-CoV-2 infection, suggesting possible approaches for pharmaceutical intervention and providing insight into which systems are likely to be disrupted in COVID-19 (195). As more patient data becomes available, the biotechnological advances of the 2000s are expected to allow for more rapid identification of potential drug targets than was feasible during the SARS or even the MERS epidemic.

Thus, the COVID-19 crisis continues to evolve, but the insights acquired over the past 20 years of HCoV research have provided a solid foundation for understanding the SARS-CoV-2 virus and the disease it causes. As the scientific community continues to respond to COVID-19 and to elucidate more of the relationships between pathogenesis, transmission, host regulatory responses, and symptomatology, this understanding will no doubt continue to evolve and to reveal additional connections among virology, pathogenesis, and health. This review represents a collaboration between scientists from diverse backgrounds to contextualize this virus at the union of many different biological disciplines (4). At present, understanding the SARS-CoV-2 virus and its pathogenesis is critical to a holistic understanding of the COVID-19 pandemic. In the future, interdisciplinary work on SARS-CoV-2 and COVID-19 may guide a response to a new viral threat.

2 Evolutionary Perspectives on SARS-CoV-2

2.1 Abstract

2.2 Importance

2.3 Introduction

The emergence of what is now known to be the pathogen SARS-CoV-2 has dramatically reshaped modern life for the past two years. The genomic revolution provided the tools needed to understand the virus in ways that were not feasible during previous pandemics. For example, the first genome sequence of the pathogen was released on January 3, 2020, providing valuable information about the pathogen within a month and a half of the first known cases. As the pandemic has unfolded, evolutionary questions and methods of investigation have framed the scientific approach to understanding the virus. These questions have evolved along with the pandemic. Thus far, six major evolutionary questions have emerged. The first was “what is it?”, the second “where did it come from?”, the third and fourth “whom does it affect?”, the fifth “how is it changing?”, and the sixth “what is next?” Evolutionary biology provides a framework through which these questions can be evaluated and explored.

2.4 Question 1: What Is It?

What is now known as SARS-CoV-2 emerged in November 2019 as an unknown pathogen causing a cluster of pneumonia cases in Wuhan, China. The initial genome sequence, which was released in early January 2020, revealed the pathogen to be a novel coronavirus (11). Although most coronaviruses show little transmission in humans, several human coronaviruses (HCoV) have been identified since the 1960s. Therefore, in the early days of the pandemic, many strategies to understand or manage the emergent viral threat focused on contextualizing it amongst better-studied coronaviruses.

Many people have previously been infected by an HCoV. Approximately one-third of common cold infections are thought to be caused by four seasonal HCoV: Human coronavirus 229E (HCoV-229E), Human coronavirus NL63 (HCoV-NL63), Human coronavirus OC43 (HCoV-OC43), and Human coronavirus HKU1 (HCoV-HKU1) (342–344). The first HCoV were identified in the 1960s: HCoV-229E in 1965 (345) and HCoV-OC43 in 1967 (346). Both of these viruses typically cause cold-like symptoms, including upper and lower respiratory infections (347–349), but they have also been associated with gastrointestinal symptoms (350). Two additional HCoV were subsequently identified (351, 352). In 2003, HCoV-NL63 (351) was first identified in a 7-month-old infant and then in clinical specimens collected from seven additional patients, five of whom were infants younger than 1 year old and the remainder of whom were adults. HCoV-HKU1 was identified in samples collected from a 71-year-old pneumonia patient in 2004 and then found in samples collected from a second adult patient (352). These viruses are associated with respiratory diseases of varying severity, ranging from common cold to severe pneumonia, with severe symptoms mostly observed in immunocompromised individuals (353), and also have gastrointestinal involvement in some cases (350).

In addition to these relatively mild HCoV, however, highly pathogenic human coronaviruses have been identified, including Severe acute respiratory syndrome-related coronavirus (SARS-CoV or SARS-CoV-1) and Middle East respiratory syndrome-related coronavirus (MERS-CoV) (250, 342, 354). At the time that SARS-CoV-1 emerged in the early 2000s, no HCoV had been identified in almost 40 years (250). The first case of SARS was reported in November 2002 in the Guangdong Province of China, and over the following month, the disease spread more widely within China and then into several countries across multiple continents (250, 340). Unlike previously identified HCoV, SARS was much more severe, with an estimated death rate of 9.5% (340). It was also highly contagious via droplet transmission, with a basic reproduction number (R0) of 4 (i.e., each person infected was estimated to infect four other people) (340).

However, the identity of the virus behind the infection remained unknown until April of 2003, when the SARS-CoV-1 virus was identified through a worldwide scientific effort spearheaded by the WHO (250). SARS-CoV-1 belonged to a distinct lineage from the two other HCoV known at the time (340). By July 2003, the SARS outbreak was officially determined to be under control, with the success credited to infection management practices (250). A decade later, a second outbreak of severe respiratory illness associated with a coronavirus emerged, this time in the Arabian Peninsula. This disease, known as Middle East respiratory syndrome (MERS), was linked to another novel coronavirus, MERS-CoV. The fatality rate associated with MERS is much higher than that of SARS, at almost 35%, but the disease is much less easily transmitted, with an R0 of 1 (340). Although MERS is still circulating, its low reproduction number has allowed for its spread to be contained (340). The COVID-19 pandemic is thus associated with the seventh HCoV to be identified and the fifth since the turn of the millennium, though additional HCoVs may be in circulation but remain undetected (e.g., (355)).

Following the release of the SARS-CoV-2 genome sequence, multiple research groups sequenced the genomes of SARS-CoV-2 specimens identified in clinical samples. These samples were primarily collected from patients’ lower respiratory tract, namely bronchoalveolar lavage fluid (BALF), and the upper respiratory tract, in the form of throat and nasopharyngeal swabs (19, 207, 356). Integration of these sequences allowed for a more complete picture of the viral genome. Analysis of the viral genome revealed significant sequence homology with two known HCoV: the novel coronavirus shared about 79% sequence identity with SARS-CoV-1 and 50% with MERS-CoV (19). Therefore, this early phylogenetic analysis of the novel coronavirus allowed its similarity to other, known viruses to be established. SARS-CoV-1 and MERS-CoV were ultimately managed largely through infection management practices (e.g., mask wearing) and properties of the virus itself (i.e., low rate of transmission), respectively (250, 340). Research in response to prior outbreaks of HCoV-borne infections, such as SARS and MERS, provided a strong foundation for hypotheses about the pathogenesis of SARS-CoV-2 as well as potential diagnostic and therapeutic approaches, as we review elsewhere (1, 3, 5, 6). Therefore, this phylogenetic information was valuable for gaining an understanding of the pathogen and identifying strategies to manage it.

2.5 Question 2: Where Did It Come From?

Despite the high degree of similarity to SARS-CoV-1, even greater sequence identity was observed between SARS-CoV-2 and zoonotic coronaviruses. A 2001 literature review estimated that 61% of human pathogens have a zoonotic origin (357). A zoonotic disease, or zoonosis, arises when a pathogen of animal origin can both a) infect and b) cause disease in humans (358). As a result, the risk of zoonotic disease increases when there is substantial interaction between humans and wildlife (358). Many factors can influence this human/wildlife interface and therefore the risk of zoonotic transmission events (358, 359).

In the SARS epidemic, SARS-CoV-1 was also thought to have emerged in a live animal market. A survey of a market in Shenzhen, China revealed that individuals from two carnivore species, namely several masked palm civets (Paguma larvata) and one raccoon dog (Nyctereutes procyonoides), were likely carriers of SARS-CoV-1, despite presenting as healthy (360). However, further analysis suggested that these species might be only intermediate hosts who were exposed in the market setting (361). A closely related virus was identified in Chinese horseshoe bats (Rhinolophus sinicus), but the sequence identity was only 88% with SARS-CoV-1 (362). Therefore, the species of origin for SARS-CoV-1 remains unresolved.

In the case of SARS-CoV-2, early interest in the source of the pathogen turned to live-animal markets in Wuhan (363, 364), where it would later emerge that many animals were sold in poor health and under poor hygienic conditions (365). A large percentage of early patients had visited the Huanan seafood market in Wuhan, and next-generation sequencing of samples collected from nine patients, eight of whom had visited the market, revealed extremely high sequence identity (99.98%), indicative of rapid spread (19). The sequence of the viral pathogen collected from these patients was also compared to known zoonotic pathogens. In particular, genomic research quickly highlighted significant similarity (about 88% sequence identity) between SARS-CoV-2 and bat-derived SARS-like coronaviruses, namely bat-SL-CoVZC45 and bat-SL-CoVZXC21 (19). Other analyses have reported even greater similarity between SARS-CoV-2 and the bat coronavirus BatCoV-RaTG13, with shared sequence identity as high as 96.2% (207, 211). Bats are well-established as a disease reservoir, including for RNA viruses (366–368). This evidence therefore suggested that the virus may have emerged as a result of zoonotic transfer of a virus from bats to humans, with the wildlife trade considered a potential source of exposure.

Nevertheless, some fragments of the genome differ between SARS-CoV-2 and RaTG13 by up to 17%, suggesting a complex natural selection process. Additionally, SARS-CoV-2 is closely related (91.02% sequence identity) to a novel coronavirus identified in Malayan pangolins (Manis javanica) infected with a respiratory disease in October 2019 (369). Although the genome-wide sequence identity between SARS-CoV-2 and this pangolin virus was lower than that with BatCoV-RaTG13, the particularly high similarity between the receptor binding domain (RBD) of its spike (S) gene and that of SARS-CoV-2 drew further attention (369, 370). The SARS-CoV-2 RBD differs from the pangolin coronavirus RBD by only one amino acid change (369), and the sequence identity between the regions is 97.4% (370). Pangolins were therefore identified as a potential intermediate host of SARS-CoV-2 between bats and humans.

However, data collected from May 2017 to November 2019 by a research team interested in tick-borne illnesses identified no bats or pangolins sold at these markets leading up to the emergence of COVID-19 (365). Additionally, endemic bat species are typically in hibernation at the time of year when SARS-CoV-2 emerged (19). Therefore, it is possible that animals associated with these markets were infected by bats, but it is not clear whether the disease emerged in a different location and/or whether it is associated with a different species. There were 38 species observed at the market in the 2.5 years leading up to the emergence of SARS-CoV-2, indicating significant diversity in the animals with which humans were interacting (365). As with SARS-CoV-1, the species of origin for SARS-CoV-2 therefore remains unresolved.

Genomic analyses and comparisons to other known coronaviruses suggest that SARS-CoV-2 is unlikely to have originated in a laboratory – either purposely engineered and released, or escaped – and instead evolved naturally in an animal host (371). However, potentially due to public misunderstanding about recombination and complex evolutionary processes like coevolution, the similarity to pangolin S has resulted in popular conspiracy theories that the virus did not arise naturally. The similarity of S to that of pangolin viruses could arise from either recombination or coevolution (211, 372), rather than requiring human intervention. Such suspicions may also have been fueled, in part, by the lack of well-characterized bat coronaviruses, which means that SARS-CoV-2 remains relatively divergent from the coronaviruses documented in surveys of bats (373). While it has been suggested that more thorough investigation of the origins of COVID-19 may have some value (374), in many cases, support for the “lab-leak” theory is politically motivated (375). A more robust panel of zoonotic viruses against which to compare SARS-CoV-2 would allow for conclusive dismissal of these politicized claims, underscoring another potential benefit of more thorough monitoring of zoonotic diseases. More importantly, it would allow researchers to better understand, and respond to, community concerns about potential emerging viral threats.

2.6 Question 3: Which Species Are Susceptible?

Given the strong evidence for a zoonotic origin of SARS-CoV-2, another evolutionary question that received significant attention, especially early on, was whether humans could infect other species with SARS-CoV-2. In the modern age, opportunities for human-to-animal transmission could arise through interactions with companion animals, zoo animals, house pests, urbanized wildlife, and livestock, as well as through hunting. Outbreaks of zoonotic diseases have been known to originate in environments such as zoos, farms, and petting zoos (376), indicating that disease transmission is likely to be possible in these contexts. Additionally, many coronaviruses infect animals and have been the subject of veterinary medical investigations and vaccine development efforts due to their effect on the health of companion and agricultural animals (377). Concerns about anthroponotic (human-to-animal) transmission focused on a few issues. First, if animal species were susceptible to COVID-19-like infection, then in addition to concerns about animal health, infections in livestock could have significant effects on food supply chains. Additionally, even if pathology in these species were limited, if they could serve as viral reservoirs, then they would pose additional risk to humans. The breadth of species susceptible to infection by a pathogen is known as the pathogen’s host range (378). Understanding the host-pathogen relationship throughout SARS-CoV-2’s host range can therefore offer valuable information for managing the spread of SARS-CoV-2.

The phylogeny of the species implicated in the origination of COVID-19 suggested that the host range of SARS-CoV-2 could encompass many species with a high level of interaction with humans. Humans last shared an ancestor with bats and pangolins almost 100 million years ago (379). Bats belong to the order Chiroptera and pangolins to Pholidota, which both belong to the clade Pegasoferae (380–382). They are closely related to many other species that have close relationships with humans, namely odd-toed ungulates (Euungulata) and carnivores (Carnivora) (380–383). The part of the evolutionary tree that includes both humans and the Pegasoferae encompasses many species of significant social and economic importance. Therefore, concerns were raised that the species with which humans have close interactions, many of which are much more closely related to bats and pangolins than humans are, could also be infected. It seemed plausible that the host range could include both livestock, some of which are odd-toed ungulates, and companion animals, many of which are carnivores. Infection of these animals was identified as a major concern (384).

Genomic analyses seeking to identify which species were likely to be susceptible focused largely on the comparative genetics of angiotensin-converting enzyme 2 (ACE2). ACE2 is the primary protein used by SARS-CoV-2 to enter the cell (see (1)). Recognition of this protein is largely determined by domains in the S1 subunit of the RBD (25). Alignment of the ACE2 sequence from 19 species revealed high conservation among mammals (385). This analysis suggested that non-human primates (three monkey and two ape species), companion animals (dogs and cats), and livestock (both odd- and even-toed ungulates) may all be susceptible to SARS-CoV-2 (385). Similarly, another study conducted an in silico analysis of ACE2 protein structures and their predicted binding to SARS-CoV-2 for 410 vertebrate species (386). The species identified as having the highest predicted binding affinities were all primates, including humans. Other taxa with high predicted affinities included other primates, rodents, even-toed ungulates (namely, several species of cetaceans and deer), and anteaters. Reindeer were the only domesticated species predicted to belong to either of these groups, but many common zoo animal species with threatened or worse IUCN risk status were identified as at risk.
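
To make the logic of these comparative analyses concrete, the sketch below computes percent identity across pre-aligned receptor fragments. The sequences, species labels, and the `percent_identity` helper are invented placeholders rather than data from the cited studies, which used full-length ACE2 alignments and, in some cases, structural modeling of the ACE2-spike interface.

```python
# Minimal sketch of the comparative logic: percent identity across pre-aligned
# receptor fragments. Sequences below are invented placeholders, not real ACE2 data.
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percentage of aligned positions with matching residues."""
    assert len(seq_a) == len(seq_b), "sequences must be pre-aligned"
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

aligned = {  # hypothetical aligned fragments, one per species
    "human": "TFLDKFNHEAEDLFYQ",
    "cat":   "TFLDKFNHEAEDLFYQ",
    "mouse": "NFLDKFNHEAEDLHYQ",
}
for species, seq in aligned.items():
    print(f"human vs {species}: {percent_identity(aligned['human'], seq):.1f}% identity")
```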

Considering the evidence generated by in silico studies, it may not be surprising that many cases of reverse zoonotic, or anthroponotic, SARS-CoV-2 transmission have been reported. Ferrets (Mustela furo) as well as cats and dogs were reported to be susceptible to SARS-CoV-2 in an experimental infection study (387). The earliest reported anthroponotic transmission events were observed in house pets, primarily cats (Felis catus) (388–390). Similarly, cases of SARS-CoV-2 infection have been reported in dogs (Canis familiaris): two of fifteen dogs monitored for SARS-CoV-2 by the Hong Kong Agriculture, Fisheries, and Conservation Department during the owners’ quarantine in March 2020 were found to be positive for SARS-CoV-2 (389). Comparing estimates across studies in which cats living with SARS-CoV-2-positive humans were tested suggests that 6 to 15% of house cats may become infected (388), and a large-scale study of pet dogs and cats in Italy suggested that 4.5% of cats and 12.8% of dogs from known COVID-19-positive households had developed antibodies to the virus (391). Some of these SARS-CoV-2-positive domestic carnivores have also shown clinical symptoms (392), and a pilot study of seven cats and three dogs found that cats, but not dogs, shed SARS-CoV-2 virus for several days after viral challenge, although none of the animals were symptomatic (393). A few dogs and cats have reportedly died after becoming infected with SARS-CoV-2, although in most cases it is unclear whether the virus was causally related to the death (394–399).

Domestic pests, on the other hand, seem to be less susceptible to SARS-CoV-2. In the comparative genomic analysis of ACE2, the two rodent species analyzed showed the greatest sequence divergence in ACE2, despite rodents being more closely related to humans than any of the other non-primate species included (385). This finding is supported by experimental evidence that SARS-CoV-2 cannot use mouse (Mus musculus) ACE2 for cell entry (207). Research using murine models to study SARS and COVID-19 therefore relies on transgenic mice designed to be sensitive to the virus (as summarized in (400)).

SARS-CoV-2 in livestock also raised concern because of the potential effect on the food supply. However, studies using in vivo viral challenge reported that livestock species in general do not develop clinical manifestations of SARS-CoV-2 and do not shed infectious virus (387, 401). In vitro exposure to SARS-CoV-2 suggested that sheep (Ovis aries), but not cattle (Bos taurus), might be susceptible to infection, but in vivo viral challenge suggested that sheep did not show notable susceptibility to infection (402). Similarly, analyses of antibody response (403) suggested that sheep exposed to a high level of human interaction did not appear to have developed infections. Following viral challenge of several species, including cattle, sheep, and horses (Equus ferus caballus), none were found to shed culturable levels of virus (401). Notably, despite the low risk posed by livestock themselves, working conditions in the meat-processing industry were associated with a very high risk of SARS-CoV-2 infection among workers, and the resulting outbreaks did cause disruptions to food supply chains (404, 405).

However, one species of domesticated agricultural animal severely affected by SARS-CoV-2 was the mink (Neovison vison). While fur farming has declined significantly since the twentieth century, mink farming is still common in China and some European countries, and mink farms continue to exist in the United States. Mink belong to the family Mustelidae within Carnivora. SARS-CoV-2 was first reported on mink farms in the Netherlands and Denmark in 2020 (406, 407). Mink were observed to show symptoms of respiratory infection, with severity varying among individuals (406). Dissection revealed lung pathology consistent with interstitial pneumonia (406). An analysis of five farms in the United States reported mortality rates of 35 to 55% among adult mink (408). Subsequently, mink farms worldwide reported outbreaks of SARS-CoV-2. Concerns were amplified when novel variants of SARS-CoV-2 were identified as having emerged on Danish mink farms and spread into the human population (407, 409–411). The fact that these variants appeared in mink populations before being observed in humans suggests that mink can indeed serve as a viral reservoir (407). Concerns about mink-to-human transmission led to the mass culling of domesticated mink populations in Europe (412, 413). The escape of farmed mink into wild populations (i.e., feralization) may also have resulted in the spread of SARS-CoV-2 into wild mink (414, 415). Therefore, while the specific zoonotic origin of SARS-CoV-2 may still not be clear, the potential for the virus to take hold in species other than humans has been clearly demonstrated by the mink outbreaks.

Finally, some species of zoo animals were also monitored to determine whether they were at risk. Several species closely related to humans (i.e., the Great Apes) are threatened with extinction and had been identified through in silico studies as likely to be susceptible to SARS-CoV-2 (386); the potential for the virus to infect these close relatives therefore presented a major concern. In early 2021, three gorillas (Gorilla beringei beringei) at the San Diego Zoo Safari Park developed respiratory symptoms that were confirmed to be associated with SARS-CoV-2 (416). Gorillas at other zoos have also been infected (417–419). Additionally, given the susceptibility of house cats, it is not surprising that other felids are also susceptible to SARS-CoV-2. Infections of several “big cats,” including Malayan tigers (Panthera tigris jacksoni), Amur tigers (Panthera tigris altaica), and African lions (Panthera leo krugeri), were reported at New York City’s Bronx Zoo in March 2020 (420). In late 2020, four lions (Panthera leo bleyenberghi) at the Barcelona Zoo also developed respiratory symptoms that were found to be caused by SARS-CoV-2 (411). Several captive snow leopards (Panthera uncia) in the United States have died from COVID-19 (421, 422).

While discussions of zoonoses often focus on the risk that animal diseases carry for human populations, the COVID-19 pandemic has also underscored the risks that human diseases pose for animals. COVID-19 precautions may have reduced the spread of other respiratory illnesses to wild mountain gorilla (Gorilla beringei beringei) populations (423), reducing one of the most significant threats to this endangered species (424). In the case of gorillas, the potential for cross-species application of pharmaceutical advances has also become clear: captive gorillas with COVID-19 have received monoclonal antibodies (425). Additionally, several companies are developing veterinary vaccines against SARS-CoV-2. The most visible has been Zoetis, a veterinary pharmaceutical company, which has developed vaccines that have been administered to several species, including felids in zoos, mink, and gorillas (426–429). Russian researchers have also developed a COVID-19 vaccine for carnivores (430).

Therefore, the host range of SARS-CoV-2 is broad, including primates, bats, and carnivores. In the United States, as of late 2021, dogs and cats made up the majority of non-human SARS-CoV-2 infections (431), but the most severe infections have been observed in felids and mustelids, in addition to humans (426). Interestingly, comparisons of ACE2 binding activity across species (432, 433) revealed that binding did not always align with which species are known to be susceptible to SARS-CoV-2 infection, suggesting that other binding sites might also be important. While the specific zoonotic origins of SARS-CoV-2 remain unknown, pharmaceutical developments in the treatment of COVID-19 have included non-human species. The complex relationship between animals, humans, and disease highlights the importance of a broad perspective on health that extends beyond a single species.

2.7 Question 4: Do Genes Influence Who is Affected?

Throughout the pandemic, many hypotheses have been raised about factors that might influence individuals’ susceptibility to COVID-19 or to severe disease. Many risk factors, such as underlying health conditions, are related to the body’s inflammatory response, as we review elsewhere (339). Here, we focus narrowly on genetic bases of differences in susceptibility or outcomes. Historically, the identification of genetic risk factors for a disease typically utilized a candidate gene approach, in which a gene of interest was evaluated to identify variants showing an association with the outcome of interest. While economical in terms of sequencing, this approach is prone to spurious results when applied to complex traits (434). Today, in the age of next-generation sequencing (NGS), alternative approaches have emerged. NGS makes it possible to conduct genome-wide scans in which a large number of single-nucleotide polymorphisms (SNPs) or variants are evaluated to identify regions of the genome associated with variation in a phenotype. Genome-wide association studies (GWAS) in particular are a popular approach that employs this strategy. During COVID-19, both of these paradigms have been applied to the problem of identifying genetic correlates of disease severity.

2.7.1 Candidate-Gene Approaches

Many candidate genes have been investigated throughout the pandemic. Here, we review three examples of candidate gene studies in COVID-19. First, an early study (published in April 2020) investigated a known variant in interferon-induced transmembrane protein 3 (IFITM3) among hospitalized patients in Beijing (435). This gene and variant were selected because of a prior candidate gene study by some of the same authors that found an association with influenza severity among Chinese patients during the 2009 influenza A H1N1/09 pandemic (436). Here, they evaluated a small number (n = 80) of hospitalized COVID-19 patients to determine whether homozygosity for the previously identified risk allele was associated with mild versus severe disease (435). They reported an association between homozygosity for the SNP of interest and the severity of COVID-19. A follow-up study demonstrated worldwide variation in the frequency of these SNPs (437), and subsequent studies claimed to support this result by comparing the frequency of the SNP in different groups to the COVID-19 case fatality rate in those groups; they examined SNPs in several candidate genes and identified an association with another SNP in IFITM3 (438). However, in the original study, the population-level frequency of the risk allele was consistent with its frequency in the mild population (436). A similar analysis examined both SNPs in Britons of different ancestral backgrounds and also reported a correlation (439). While this gene has been investigated for functions potentially relevant to COVID-19 pathogenesis by other groups as well (e.g., (440, 441)), a follow-up analysis in Germany evaluated the effect of this variant in 239 cases and 252 controls and reported non-significant effects (439). The narrative surrounding IFITM3 therefore reflects a broad methodological critique of candidate gene studies, whose results often fail to replicate (442). The region associated with this gene was not identified in the large-scale GWAS conducted by the COVID-19 Host Genetics Initiative (COVID-19 HGI) (443), which is described in more detail below.

A second source of genetic variability hypothesized to affect COVID-19 outcomes was the human leukocyte antigen (HLA) loci of the major histocompatibility complex (MHC). MHC classes I and II play a critical role in both the innate and adaptive immune systems because they are pivotal components of antigen presentation. HLA class I and II loci are also the most polymorphic in the human genome (444). Additionally, because HLA polymorphisms are associated with geographic ancestry, study location and participant background offer important context (445). Given the important role of the HLA complex in the immune response and the standing variation in the human population, HLA variation has been investigated for potential associations with COVID-19 outcomes.

Several approaches have been taken to evaluate a potential role of HLA in COVID-19. In silico analysis suggested one particular HLA locus that could affect binding of SARS-CoV-2 peptides to MHC class II (446). Other studies evaluated outcomes using retrospective cohort analyses. An analysis of 95 South Asian COVID-19 patients found that HLA genotype was not significant in differentiating case severity once the necessary statistical corrections were applied (447). Another study in a European population (n = 147) did identify HLA alleles associated with severity (448). In St. Louis, MO (USA), another study enrolled 234 COVID-19 cases, who were genotyped for HLA alleles and compared to a control population of 20,000 individuals from the National Marrow Donor Program (449). They compared cases and controls on the basis of four “race/ethnic” populations and reported alleles showing a statistical association within each group (449). However, because of this stratification, two of the demographic categories had fewer than ten cases. Across all of these studies, there was minimal overlap in the risk alleles identified, and the small sample sizes raise concerns about the possibility of spurious hits. The hypervariability of this region means that statistical power will necessarily be reduced, with much higher recruitment needed than for studies of biallelic loci. A much larger analysis of 72,912 Israelis, 8.8% of whom tested positive for COVID-19, found no association between HLA genotype and infection or hospitalization (450). Therefore, while MHC is functionally important to the immune response to COVID-19, it is not clear whether HLA genotypes are predictive of COVID-19 severity, and such studies certainly face exacerbated versions of the typical challenges of candidate gene studies. Because of the challenges associated with analyzing such a variable region, it was excluded from the large-scale COVID-19 HGI GWAS analysis (443).

Finally, significant attention has been paid to the question of whether ABO blood type is associated with COVID-19 outcomes. ABO blood type has been found to modulate susceptibility to other pathogens (451). While ABO blood type is a genetic trait, it is more easily evaluated than the genetic regions discussed above because of the simple relationship between genetic variants and phenotype. The possibility of an association between blood type and COVID-19 infection was raised early in the pandemic in a preprint that reported associations in 2,173 patients in Wuhan and Shenzhen, China (452). The protective effect of type O and the increased risk associated with type A blood that they reported were subsequently investigated by many studies that returned varied results (e.g., (453–456); see (457) for a literature review). The observations of higher risk of SARS-CoV-2 infection with type A blood and lower risk with type O were supported by a meta-analysis (458). While the support for the association was independent of a mechanism, a possible relationship between ACE activity and blood type has been proposed (459), as has an effect on carbohydrate-carbohydrate interactions relevant to ACE2 binding (460). This is the only candidate gene described here that has received additional support from GWAS, as discussed below.

The COVID-19 literature related to candidate gene investigations demonstrates relatively low inter-study consistency in findings. In particular, sample size is a major challenge in designing these studies. However, for many traits, the relationships between genes and phenotypes are complex, and selecting which variants to sequence is not always straightforward. As a result, in the age of next-generation sequencing, discovery-driven studies have emerged as an alternative approach.

2.7.2 Genome-Wide Association Studies

Genome-wide association studies (GWAS) offer a discovery-driven approach that provides a different perspective than candidate gene studies. Instead of selecting a gene or variant a priori, in GWAS, a large number of SNPs (usually several million) are evaluated at once to identify those most likely to vary in correlation with a trait of interest. Because of the large number of statistical tests, statistical power and multiple hypothesis testing are both very important considerations in executing GWAS, which have also struggled with issues related to replicability (461). In cases such as COVID-19 where outcomes can differ among ancestry groups (likely for non-genetic reasons, as reviewed in (339)), it is especially important that GWAS samples be selected with attention paid to ancestry, as incorrect or misleading associations can otherwise be identified with neutral markers indicative of ancestry itself (462).
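
To illustrate why multiple testing dominates GWAS design, the sketch below applies a Bonferroni-style correction for roughly one million effectively independent common variants, the usual justification for the conventional genome-wide significance threshold of 5×10⁻⁸. The numbers are illustrative assumptions rather than values from any specific COVID-19 study.

```python
# Why GWAS needs a stringent per-SNP threshold: with ~1 million effectively
# independent common variants, testing each at alpha = 0.05 essentially
# guarantees false positives, so a Bonferroni-style correction is applied.
n_independent_tests = 1_000_000   # rough convention for common variants
alpha = 0.05

# Probability of at least one false positive if every SNP were tested at 0.05
p_any_false_positive = 1 - (1 - alpha) ** n_independent_tests
print(f"P(>=1 false positive) at per-SNP alpha of 0.05: {p_any_false_positive:.3f}")

# Bonferroni-style genome-wide significance threshold
print(f"Genome-wide threshold: {alpha / n_independent_tests:.0e}")   # 5e-08
```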

Over the past two years, many GWAS have been undertaken with the aim of identifying variants associated with COVID-19 outcomes. In some cases, the results have been consistent with hypothesized genetic correlates of susceptibility to COVID-19. One study conducted a GWAS on a total of 435 COVID-19 patients from four countries and identified another HLA allele associated with an increased risk of intubation (463). Other GWAS have identified an association with the ABO blood group locus. One study conducted a case/control GWAS in two populations, Italians and Spaniards, with 1,980 cases and 2,205 controls. They reported two loci that met the genome-wide significance threshold, one on chromosome 3 and one on chromosome 9 (464). The hit on chromosome 9 fell on the ABO locus, and the alleles identified suggested a protective association with blood group O and a risk association with blood group A (464).

As the pandemic has progressed, large-scale efforts have been assembled to conduct GWAS on massive scales. In March 2020, the COVID-19 HGI was established as a worldwide consortium that combines data to conduct meta-analyses (465). One year later, the COVID-19 HGI released a meta-analysis of data from 46 studies, comprising 49,562 cases and 1,770,206 controls (443). They identified 13 loci associated with one or more phenotypes related to COVID-19 infection or severity, seven of which were significant at the genome-wide level when all available data were considered. Notably, strong signals were identified for both of the loci suggested by previous medium-scale GWAS in association with COVID-19 infection (464). Additionally, several other loci could be mapped onto hypotheses about genetic contributors to immune function and to lung function and disease. This worldwide GWAS study made an effort towards strategic incorporation of genetic information from different ancestral groups. Interestingly, the risk variant on chromosome 3 is likely to have been inherited through Neanderthal introgression, meaning it is likely to be more prevalent in certain populations, especially non-African populations (466, 467). The potential functional relationship between this region of the genome and COVID-19 is unknown, but a phenome-wide association study has suggested blood cell traits as potentially regulated by this region (468).
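
To illustrate how a consortium can combine per-cohort results, the sketch below performs a simple fixed-effect, inverse-variance-weighted meta-analysis of log odds ratios. The effect sizes and standard errors are invented for illustration; the COVID-19 HGI analyses involve considerably more sophisticated models and careful handling of ancestry.

```python
import math

# Fixed-effect (inverse-variance) meta-analysis of per-cohort log odds ratios,
# a general strategy for combining association results across studies.
# The effect sizes and standard errors below are invented for illustration.
studies = [
    {"beta": 0.35, "se": 0.10},   # hypothetical cohort 1
    {"beta": 0.22, "se": 0.08},   # hypothetical cohort 2
    {"beta": 0.41, "se": 0.15},   # hypothetical cohort 3
]

weights = [1 / s["se"] ** 2 for s in studies]
pooled_beta = sum(w * s["beta"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"pooled log-OR = {pooled_beta:.3f} +/- {pooled_se:.3f} "
      f"(z = {pooled_beta / pooled_se:.2f})")
```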

Identifying genetic variants associated with a complex disease is always complicated. In COVID-19 studies, the results of candidate gene analyses have in general been difficult to replicate. However, large-scale collaboration on GWAS has made it possible to detect at least two loci that do appear to replicate across studies and potentially even across ancestral backgrounds.

2.8 Question 5: How is it Changing?

Evolution in SARS-CoV-2 has also been observed over a short timescale. After zoonotic transfer, SARS-CoV-2 continued evolving in the human population (210). The SARS-CoV-2 mutation rate is moderate compared to other RNA viruses (212), which likely restricts the pace of evolution in SARS-CoV-2. Nevertheless, genomic analyses have yielded statistical evidence of ongoing evolution. Initially, two variants of the spike protein that differ by a single amino acid at position 614 (D614 and G614) were observed, and there is evidence that G614 had become more prevalent than D614 by June 2020 (222). While it has been hypothesized that this genomic change increased SARS-CoV-2 infectivity and virulence, this hypothesis has not yet been tested due to a lack of data (469). Another study (212) identified 198 recurrent mutations in a dataset of 7,666 curated sequences, all of which defined non-synonymous protein-level changes. This pattern of convergent evolution at some sites could indicate that certain mutations confer an adaptive advantage. While it is evident that SARS-CoV-2 exhibits moderate potential for ongoing and future evolution, the relationship between mutations and pathogenicity is not yet known. Additional data are needed in order to understand patterns of evolutionary change and the mechanisms by which they might affect virulence.

Several factors could promote the evolution of SARS-CoV-2, including host immunodeficiency and transient exposure to antibodies directed against SARS-CoV-2 proteins. A case study of SARS-CoV-2 infection in an immunocompromised female patient with chronic lymphocytic leukemia and hypogammaglobulinemia (470) suggested that accelerated evolution of the virus can occur under conditions of immunodeficiency. A first administration of convalescent plasma did not clear the virus, and an ensuing increase in the genomic diversity of the samples was observed, suggesting accelerated evolution due to selection pressure. A second administration of convalescent plasma cleared the virus from the host 105 days after the initial diagnosis. Throughout the duration of infection, however, the patient was asymptomatic but contagious. A second case study, in a 45-year-old male with antiphospholipid syndrome (471), confirmed the earlier results, providing evidence of persistent COVID-19 symptoms in an immunocompromised patient for 154 days following diagnosis, ultimately leading to the death of the patient. The treatments administered included remdesivir and the Regeneron anti-spike protein antibody cocktail. Genomic analyses of the patient’s nasopharyngeal swabs confirmed accelerated evolution of the virus through mutations in the spike gene and the receptor-binding domain. In summary, these two case studies suggested accelerated evolution and persistent shedding of the virus under conditions of immunodeficiency. In particular, the first case highlighted the role of convalescent plasma in creating escape variants. Indeed, one study (472) repeatedly exposed the SARS-CoV-2 virus to convalescent plasma in vitro to determine how much plasma was required to neutralize the virus. The results of the first six exposures were similar, but after the seventh exposure (on day 45), the amount of plasma required began to increase. In analyzing the viral variants present, the authors found that this viral escape was promoted by the sudden accumulation of mutations, especially in the receptor-binding domain (RBD) and N-terminal domain (NTD), that quickly rose in frequency. By the thirteenth exposure (day 85), the virus had evolved three mutations and could no longer be neutralized by the plasma used, even though the plasma comprised polyclonal serum targeting a variety of epitopes. Taken together, these observations suggest that evolutionary analyses of SARS-CoV-2 can provide crucial information about the conditions that promote resistance and the kinetics with which resistance develops, information that will be important for the design of vaccine regimens and for determining whether and when next-generation vaccines will be needed.

When new variants occur, they can rise in frequency by chance or through an adaptive process that confers a competitive advantage to the virus. Variants carrying the D614G mutation in the spike glycoprotein seemed to spread faster, but it has been suggested that this mutation rose in frequency due to early chance events rather than adaptive ones (473). Another mutation, Y453F, in the receptor-binding domain of S, was first detected in mink, and its transmission to humans has since been established. In mink, this mutation conferred an advantage by increasing affinity for ACE2 (474). Similarly, the N501Y mutation increases affinity for human ACE2 and has been implicated in the dominance of B.1.1.7, which outcompeted other variants (475). Therefore, genomic surveillance is essential to detect the emergence of super-spreading variants (476).

Emerging methods are being applied to this problem in an effort to understand which mutations are most likely to be of significant concern. Novel machine learning methods have been developed to predict the mutations in a viral sequence that promote immune escape. Escape mutations change the virus’s sequence so that it evades detection by the immune system while preserving its pathogenicity. Using tools from natural language processing (NLP), viral escape has been modeled as an NLP problem (477), analogous to a modification that leaves a sentence grammatically correct but changes its meaning. Language models of viruses can therefore be used to predict mutations that change the presentation of the virus to the immune system but preserve its infectivity.
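
The toy sketch below is a stand-in for this framing rather than the published method: it scores "grammaticality" with a simple bigram model fit to made-up sequences and proxies "semantic change" by distance from a consensus, whereas the actual work relies on deep protein language models and embedding distances. All sequences and scoring choices here are invented for illustration.

```python
from collections import Counter
import math

# Toy stand-in for the language-model framing of viral escape (not the
# published method): "grammaticality" is a smoothed log-likelihood under a
# bigram model fit to made-up wild-type-like sequences, and "semantic change"
# is crudely proxied by disagreement with the training consensus.
training = ["MFVFLVLLPLVS", "MFVFLVLLPLVS", "MFVFFVLLPLVS", "MFVFLVLLPLVA"]
bigrams = Counter(seq[i:i + 2] for seq in training for i in range(len(seq) - 1))
total = sum(bigrams.values())
consensus = "".join(Counter(col).most_common(1)[0][0] for col in zip(*training))

def grammaticality(seq: str) -> float:
    """Log-likelihood of the sequence under the toy bigram model (add-one smoothing)."""
    return sum(math.log((bigrams[seq[i:i + 2]] + 1) / (total + 400))
               for i in range(len(seq) - 1))

def semantic_change(seq: str) -> int:
    """Crude proxy for embedding distance: mismatches against the consensus."""
    return sum(a != b for a, b in zip(seq, consensus))

# Escape candidates should score well on BOTH axes: still plausible
# ("grammatical") yet altered in how the immune system "reads" them.
for mutant in ["MFVFLVLLPLVS", "MFVFWVLLPLVS", "MWVWWVWWPWVS"]:
    print(mutant, round(grammaticality(mutant), 2), semantic_change(mutant))
```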

2.8.1 Variants of Concern and Variants under Surveillance

Viral replication naturally leads to the occurrence of mutations, and thus to genetic variation (478). However, due to an intrinsic RNA proofreading process in SARS-CoV-2, the pace of evolution of SARS-CoV-2 is moderate in comparison to other RNA viruses (479). The declaration of the first SARS-CoV-2 variant of concern (VOC), B.1.1.7, in December 2020 attracted significant media attention. While the B.1.1.7 lineage garnered attention in November 2020, the earliest B.1.1.7 genome was detected on September 20th, 2020, in routine genomic data sampled in Kent (U.K.) by the COVID-19 Genomics UK Consortium (COG-UK), and a second B.1.1.7 genome was reported the following day in Greater London (234, 473, 480, 481). Since then, B.1.1.7 has spread across the UK and internationally, and it has now been detected in at least 62 countries (482), despite several countries imposing travel restrictions on travelers from the UK. Of the twenty-three mutations that distinguish B.1.1.7 from the original strain isolated in Wuhan (lineage A), fourteen are lineage-specific and three appear to be biologically consequential mutations in the spike protein, namely N501Y, P681H, and 69-70del (480, 481). The latter is a 6-bp deletion that leads to the loss of two amino acids and has consequences for immune recognition; it may, in conjunction with N501Y, be responsible for the increased transmissibility of the B.1.1.7 VOC due to changes in the RBD that increase binding affinity with ACE2 (230, 480). B.1.1.7 is estimated to be up to 56% more transmissible, corresponding to an R0 of approximately 1.4. Additionally, this VOC has been shown to be associated with increased disease severity and increased mortality (483). Other variants also carry the 69-70del mutation (484, 485), and public health officials in the United States and the UK have been able to use RT-PCR-based assays (the ThermoFisher TaqPath COVID-19 assay) to identify sequences with this deletion because it occurs where the qPCR probe binds (234). In the UK, B.1.1.7 is present in more than 97% of diagnostic tests that return negative for the S-gene target and positive for the other targets; thus, the frequency of S-gene target failure can be used as a proxy for the detection of B.1.1.7 (480, 486). The FDA has highlighted that the performance of three diagnostic tests may be affected by the B.1.1.7 lineage because it could cause false negative results (487).

While B.1.1.7 is currently the main VOC, other genetic variants currently designated as VOCs have also been detected, including B.1.351 and P.1, both of which emerged independently (488, 489). B.1.351 was first detected in October 2020 in South Africa, was later detected in the EU on December 28th, 2020, and has now spread to at least 26 countries (231, 490, 491). B.1.351 contains several mutations in the RBD, including K417N, E484K, and N501Y. While the biological significance of these mutations is still under investigation, it does appear that this lineage may be associated with increased transmissibility (492) due to the N501Y mutation (230, 481). Additionally, an analysis of a pseudovirus expressing the 501Y.V2 spike protein (B.1.351) showed that this variant demonstrates increased resistance to neutralization by convalescent plasma, even though total binding activity remained mostly intact (493). Further, using a live virus neutralization assay (LVNA), it was shown that 501Y.V2 (B.1.351) is poorly neutralized by convalescent plasma obtained from individuals who responded to non-501Y.V2 variants (494). However, 501Y.V2 infection-elicited plasma was able to cross-neutralize earlier non-501Y.V2 variants, suggesting that vaccines targeting VOCs may be effective against other mutant lineages (494).

The P.1 variant is a sublineage of the B.1.1.28 lineage that was first detected in Japan in samples obtained from four travelers from Brazil during a screening at a Tokyo airport on January 10, 2021 (495). Shortly thereafter, it was established that there was a concentration of cases of the P.1 variant in Manaus, Brazil. In a small number of samples (n=31) sequenced in Manaus, 42% were identified as the P.1 variant as early as mid-December, but the variant seemed to be absent in genome surveillance testing prior to December (496). To date, at least eight countries have detected the P.1 lineage (497). While the majority of P.1 cases detected internationally have been linked to travel originating from Brazil, the UK has also reported evidence of community transmission detected via routine community sequencing (497, 498).
P.1 has eight lineage-specific mutations along with three concerning spike protein mutations in the RBD, including K417T, E484K, and N501Y (492).

Multiple SARS-CoV-2 lineages have been detected that have mostly been of no more clinical concern than the original devastating lineage originating in Wuhan (499). However, the increase in cases associated with B.1.1.7 in particular has cast a spotlight on other variants of unknown clinical relevance.
Although early in its ascendency, B.1.427/429, also known as CAL.20C, are closely related SARS-CoV-2 variants detected in California, USA (500). The lineage was first detected in July 2020 but was not detected again until October 2020. In December 2020, B.1.427/429 accounted for ~24% of the total cases in Southern California and ~36% of total cases in the Los Angeles area. B.1.427/429 have now been detected in several U.S. states and at least 38 countries worldwide (500, 501). This variant is characterized by five key lineage-specific mutations (ORF1a: I4205V; ORF1b: D1183Y; S: S13I, W152C, L452R). The latter spike mutation, L452R, falls in a region of the RBD known to confer resistance to monoclonal antibodies against the spike protein (502), and it is hypothesized that this mutation may confer resistance to polyclonal sera from convalescent patients or vaccinated individuals (500, 503). B.1.427/429 are now designated VOCs (489); however, further research is still required to determine the implications of the mutations encoded in this genetic variant.
Another notable variant was recently discovered in 35 patients at a Bavarian hospital in Germany; however, the sequencing data have not yet been published, and it remains to be determined whether this variant is of any further concern (504).

Several mutations and deletions are shared between the three lineages P.1, B.1.1.7, and B.1.351, as well as other variants of SARS-CoV-2 that are under investigation (496). For example, N501Y appears to have occurred independently in each of the three lineages. E484K is present in both B.1.351 and P.1 (505). The mutations N501Y and E484K are found in the RBD within the receptor-binding motif responsible for forming an interface with the ACE2 receptor, which appears to be consequential for ACE2 binding affinity (506). Indeed, N501Y is associated with increased virulence and infectivity in mouse models (507). E484K has also been associated with evasion of neutralizing antibodies (472, 503, 508). The del69-70 deletion (del:11288:9) is also shared between P.1 and B.1.1.7 and is a common deletion in the N-terminal domain of the spike protein. This deletion has also been associated with several RBD mutations (230, 481, 509). There is concern that mutations in the spike protein of variants may have clinical consequences for transmissibility, disease severity, re-infection, therapeutics, and vaccination (472, 503, 510–514).

Vaccine producers are working to determine whether their vaccines remain effective against the novel genetic variants. Moderna recently published data for its mRNA-1273 vaccine showing no significant impact on neutralization of the B.1.1.7 variant following vaccination in humans and non-human primates, but reported reduced, though still significant, neutralization of the B.1.351 variant (515). Similarly, Pfizer–BioNTech reported that sera from twenty participants vaccinated with the BNT162b COVID-19 vaccine in previous clinical trials (516, 517) elicited equivalent neutralizing titers in vitro against an isogenic SARS-CoV-2 carrying the N501Y substitution (Y501) (518). Another study reported that plasma neutralizing activity against SARS-CoV-2 variants encoding the combination K417N:E484K:N501Y, or E484K or N501Y alone, was variably and significantly reduced in the sera of twenty participants who received either the Pfizer–BioNTech BNT162b vaccine (n = 6) or Moderna’s mRNA-1273 vaccine (n = 14) (519). In a study of serum samples from a combination of convalescent individuals, mRNA-1273 recipients, and Novavax recipients, the B.1.419 variant was 2-3 times less sensitive to neutralization than the D614G variant, while the B.1.351 variant was 9-14 times less sensitive (520). Indeed, the E484K substitution seen in the P.1 and B.1.351 lineages is broadly reported to substantially reduce the efficacy of mRNA-based vaccines (520–522). For now, the consensus appears to be that the FDA-approved vaccines remain generally effective against the genetic variants of SARS-CoV-2 and their accompanying mutations, albeit with lower neutralizing capacity (515, 518, 519, 523), though select VOCs may present challenges. Further research is required to discern the clinical, prophylactic, and therapeutic consequences of these genetic SARS-CoV-2 variants as the pandemic evolves.

2.9 SARS-CoV-2 Evolution and Vaccine Efficacy

With these vaccines in place, one concern is how the virus’s continued evolution will affect their efficacy. Since the start of the pandemic, multiple variants have already emerged: B.1.1.7 in the UK, B.1.351 in South Africa, and P.1 in Brazil.

Viruses evolve or mutate at different rates. Mutation rate is measured as the number of substitutions per nucleotide per cell infected (μs/n/c) (524). RNA viruses tend to have mutation rates between 10⁻⁶ and 10⁻⁴ (524). As a reference, influenza A virus has a mutation rate of approximately 10⁻⁵, whereas the mutation rate of SARS-CoV-2 is lower, estimated at approximately 10⁻⁶ (525). The accumulation of mutations allows the virus to escape recognition by the immune system (526).
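
A back-of-the-envelope comparison, using the per-site rates cited above and approximate genome lengths (the lengths are rough assumptions for illustration, not values from the cited studies), shows what these rates imply per replicated genome:

```python
# Back-of-the-envelope comparison using the per-site rates cited above and
# approximate genome lengths (assumed values, for illustration only):
# expected substitutions per replicated genome = per-site rate x genome length.
viruses = {
    "influenza A": {"rate": 1e-5, "genome_nt": 13_500},   # ~13.5 kb genome
    "SARS-CoV-2":  {"rate": 1e-6, "genome_nt": 30_000},   # ~30 kb genome
}
for name, v in viruses.items():
    per_genome = v["rate"] * v["genome_nt"]
    print(f"{name}: ~{per_genome:.3f} substitutions per genome per replication")
```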

The efficacy of vaccines depends on their ability to train the immune system to recognize the virus, so viruses can develop resistance to vaccines through the accumulation of mutations that affect recognition. The lower mutation rate of SARS-CoV-2 suggests that SARS-CoV-2 vaccines may provide longer-lasting protection than vaccines targeting the influenza A virus.

2.9.1 Alpha and Beta Variants

The current SARS-CoV-2 vaccines in distribution have been reported to provide similar efficacy against the B.1.1.7 variant compared to the variants common at the time they were developed but reduced efficacy against the B.1.351 variant (527). Pfizer and Moderna announced that they are working on developing a booster shot to improve efficacy against the B.1.351 variant (528). The WHO continues to monitor the emergence of variants and their impact on vaccine efficacy (529). Previous research in the computational prediction of the efficacy of vaccines targeting the influenza A virus might complement efforts to monitor these types of viral outbreaks (530). To adapt, future vaccines may need to account for multiple variants and strains of SARS-CoV-2, and booster shots may be required (531).

2.9.2 Delta Variant and Ct

One preprint (334) analyzed a retrospective cohort of patients in Singapore who contracted COVID-19 from April to June of 2021. This study focused on those who were confirmed or inferred to have been infected by the Delta variant of concern, and its aim was to analyze virological kinetics. The authors identified 218 cases, 71 (33%) of whom were fully vaccinated with either the Pfizer/BioNTech or Moderna mRNA vaccines, 13 (6%) of whom had received only one dose or had received the second dose less than two weeks prior to infection, and four (2%) of whom had received a vaccine developed with another technology. Unvaccinated patients were more likely to be symptomatic or to progress to severe COVID-19 and showed more symptoms than vaccinated patients, despite the higher age of the vaccinated cohort. Ct was assessed over the disease course, although the specific procedures for when additional RT-PCR was conducted are not clear; the authors state that the data were smoothed based on day of illness. There was no significant difference in median Ct in the initial samples taken from fully vaccinated and unvaccinated patients, but Ct increased (signifying reduced viral load) more rapidly in fully vaccinated patients. Like most analyses of Ct (1), this study does not provide the data needed to draw conclusions about contagiousness, as the samples were not cultured. All the same, these findings do suggest that vaccinated individuals may be able to clear the infection more quickly.

A second analysis was based in Dane County, Wisconsin, USA during summer 2021, when the Delta variant was known to be the dominant variant in the region (335). According to Our World in Data (532), at the beginning of the study, 49.3% of residents of Dane County were fully vaccinated, with this number rising to 51.4% by the end of the study, although an earlier version of the preprint reported the vaccination rate in Dane County as 67.4%. The authors identified no significant differences in Ct between fully vaccinated and unvaccinated cases. The Ct values reported were consistent with contagiousness as evaluated in other studies, and in the present study, SARS-CoV-2 could be cultured from 51 of 55 samples with Ct less than 25. This study was not longitudinal, but the timing of testing relative to symptom onset was comparable between symptomatic vaccinated and unvaccinated patients. The findings of this study are therefore consistent with the idea that vaccinated people are less likely to contract symptomatic or severe COVID-19, but in cases of breakthrough infection, are still likely to be able to transmit SARS-CoV-2 to others.

2.10 Question 6: What is Next?

The SARS-CoV-2 pandemic has presented many unprecedented scientific opportunities. The rapid identification of the genomic sequence of the virus allowed for early contextualization of SARS-CoV-2 among other known respiratory viruses, and the scientific community has continued to collect, analyze, and disseminate information about the SARS-CoV-2 virus and the associated illness, COVID-19, at previously unimaginable rates (4). The accessibility of genome sequencing technology has allowed for deep sequencing of the virus to establish a level of viral surveillance that had never before been achieved (214, 533, 534). The information obtained from genetic, bioinformatic, and evolutionary analysis has played a significant role in shaping the global pandemic response (533, 535, 536). For example, wastewater surveillance has emerged as a potential epidemiological tool to monitor SARS-CoV-2 spread over large regions, complementing clinical surveillance (537–539). Humans shed SARS-CoV-2 viral RNA in feces (540), which can be detected in wastewater. Protocols have been developed to safely and reproducibly isolate and quantify SARS-CoV-2 in samples obtained from wastewater processing plants (539, 541). To date, studies show that wastewater surveillance is an effective tool to monitor SARS-CoV-2 spread over large sewersheds (537–539, 542). Indeed, data from a study in New York City indicated that wastewater SARS-CoV-2 detection correlated with clinical detection of infection (542). Similar studies have been conducted in Nevada (543) and Boston (544). Studies have also shown that factors such as temperature, the travel time of wastewater, and diurnal variability may affect detection of SARS-CoV-2 (537, 543). Additionally, wastewater surveillance provides a tool to monitor fluctuations in the viral strains present in a community (545, 546). Due to its demonstrated utility, the United States CDC established the National Wastewater Surveillance System (NWSS), which has emerged as an important tool for tracking SARS-CoV-2 spread (547).

Knowledge of the evolution of SARS-CoV-2 is imperative to managing it moving forward (533, 548).

The evolutionary questions highlighted here all point back to the fact that efforts to prevent future epidemics and pandemics will benefit greatly from long-term, sustainable efforts to monitor disease. Beyond understanding the status and evolution of known pathogens via genomic surveillance, greater preparedness for novel viral threats would also result from monitoring zoonotic disease. If not addressed, economic and environmental stressors are likely to drive future zoonotic transfer of diseases (549). The COVID-19 pandemic has highlighted the incredible insights available with modern evolutionary and genomic methodologies, but it has also revealed the reluctance of political actors to commit resources to these efforts outside of periods of acute need. The One Health framework has emerged from collaborations by many prominent organizations, such as the World Health Organization, to promote scientific goals supportive of pandemic preparedness (534). Genomic surveillance of human pathogens and of pathogens at the human-wildlife interface is an important component needed to meet the goals of One Health (534). These efforts are especially important as anthropogenic alterations to the landscape, such as climate change and urbanization, increase the risk of zoonotic disease transmission (550, 551). With the COVID-19 pandemic serving as a clear illustration of why this surveillance is imperative and of its feasibility, wider awareness and adoption of the One Health paradigm is the last piece needed to develop practices that will prevent the next pandemic.

3 Molecular and Serologic Diagnostic Technologies for SARS-CoV-2

3.1 Abstract

The COVID-19 pandemic has presented many challenges that have spurred biotechnological research to address specific problems. Diagnostics is one area where biotechnology has been critical. Diagnostic tests play a vital role in managing a viral threat by facilitating the detection of infected and/or recovered individuals. From the perspective of what information is provided, these tests fall into two major categories, molecular and serological. Molecular diagnostic techniques assay whether a virus is present in a biological sample, thus making it possible to identify individuals who are currently infected. Additionally, when the immune system is exposed to a virus, it responds by producing antibodies specific to the virus. Serological tests make it possible to identify individuals who have mounted an immune response to a virus of interest and therefore facilitate the identification of individuals who have previously encountered the virus. These two categories of tests provide different perspectives valuable to understanding the spread of SARS-CoV-2. Within these categories, different biotechnological approaches offer specific advantages and disadvantages. Here we review the categories of tests developed for the detection of the SARS-CoV-2 virus or antibodies against SARS-CoV-2 and discuss the role of diagnostics in the COVID-19 pandemic.

3.2 Importance

Testing is critical to pandemic management. Messaging about molecular testing strategies has varied widely between countries, with the United States in particular emphasizing the higher sensitivity of polymerase chain reaction tests over immunoassays. However, these tests offer different advantages, and a holistic view of the testing landscape is needed to identify the information provided by each test and its relevance to addressing different questions. Another important consideration is the ease of use and scalability of each test, which determines how widely it can be deployed. Here we describe the different diagnostic technologies available as well as the information they provide about SARS-CoV-2 and COVID-19.

3.3 Introduction

Since the emergence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in late 2019, significant international efforts have focused on managing the spread of the virus. Identifying individuals who have contracted coronavirus disease 2019 (COVID-19) and may be contagious is crucial to reducing the spread of the virus. Given the high transmissibility of SARS-CoV-2 and the potential for asymptomatic or presymptomatic individuals to be contagious (1), the development of rapid, reliable, and affordable methods to detect SARS-CoV-2 infection was, and remains, vitally important for understanding and controlling spread. For instance, test-trace-isolate procedures were an early cornerstone of many nations’ efforts to control the outbreak (552–554). Such efforts depend on diagnostic testing.

The genetic sequence of the SARS-CoV-2 virus was first released by Chinese officials on January 10, 2020 (555), and the first test to detect the virus was released about 13 days later (556). The genomic information was critical to the development of diagnostic approaches. There are two main classes of diagnostic tests: molecular tests, which can diagnose an active infection by identifying the presence of SARS-CoV-2, and serological tests, which can assess whether an individual was infected in the past via the presence or absence of antibodies against SARS-CoV-2. Over the course of the COVID-19 pandemic, a variety of tests have emerged within these two categories.

Molecular tests detect either viral RNA or protein in a patient sample. They are essential to identifying infected individuals, which can be important for determining courses of action related to treatment, quarantine, and contact tracing. Tests for viral RNA are done by reverse transcription (RT) of viral RNA to DNA followed by DNA amplification, usually with polymerase chain reaction (PCR) (557). Tests for viral proteins typically use an antibody pair for detection as implemented in techniques such as lateral flow tests (LFTs) and enzyme-linked immunosorbent assays (ELISAs) (558, 559). Molecular tests require the viral genome sequence in order to develop DNA primers for viral RNA detection or to express a viral protein for use as an antigen in antibody production.

Serological tests, on the other hand, detect the presence of antibodies in blood plasma or other biological samples, providing insight into whether an individual has mounted an immune response against SARS-CoV-2. Assays that can detect antibodies in blood plasma samples include ELISA, lateral flow immunoassay, and chemiluminescence immunoassay (CLIA) (560). To distinguish past infection from vaccination with spike-based vaccines, serological tests can detect antibodies that bind the nucleocapsid protein of the SARS-CoV-2 virus (561). Serological tests are useful for collecting population-level information for epidemiological analysis, as they can be used to estimate the extent of infection in a given area. Thus, serological tests may be useful for addressing population-level questions, such as the percentage of cases that manifest as severe versus mild, and for guiding public health and economic decisions regarding resource allocation and counter-disease measures.

Molecular and serological tests therefore offer distinct, complementary perspectives on COVID-19 infections. Some of the same technologies are useful to both strategies, and different technologies have been employed to varying extents throughout the world since the start of the COVID-19 pandemic. Two of the primary metrics used to evaluate these tests are sensitivity and specificity. Sensitivity refers to a test’s ability to correctly identify a true positive; for example, a test with 50% sensitivity would identify SARS-CoV-2 in only one of every two positive samples. On the other hand, specificity refers to how well a test is able to identify a negative sample as negative. This metric can be relevant both in terms of understanding the risk of false positives and in discussing whether a test is susceptible to identifying other coronaviruses. Here, we review the different types of tests within each category that have been developed and provide perspective on their applications.
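
These two metrics follow directly from the counts of true and false calls; the short sketch below computes them for hypothetical counts chosen to mirror the worked example of 50% sensitivity above.

```python
# Sensitivity and specificity from true/false positive and negative counts.
# The counts below are hypothetical, chosen to mirror the 50% sensitivity example.
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: fraction of infected samples the test flags as positive."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: fraction of uninfected samples the test calls negative."""
    return tn / (tn + fp)

print(sensitivity(tp=50, fn=50))   # 0.5 -> one of every two positive samples detected
print(specificity(tn=98, fp=2))    # 0.98 -> 2% of negative samples flagged in error
```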

3.4 Molecular Tests to Identify SARS-CoV-2

Molecular tests identify either distinct subsequences of the viral genome or viral proteins in a patient sample, and they thus can be used to diagnose an active viral infection. An important first step is identifying which biospecimens are likely to contain the virus in infected individuals and then acquiring these samples from the patient(s) to be tested. Common sampling sources for molecular tests include nasopharyngeal cavity samples, such as throat washes, throat swabs, and saliva (562), and stool samples (563). Once a sample is acquired from a patient, molecular tests detect SARS-CoV-2 based on the presence of either viral nucleic acids or viral proteins.

3.4.1 PCR-Based Tests

When testing for RNA from viruses like SARS-CoV-2, the first step involves pre-processing in order to create complementary DNA (cDNA) from the RNA sample using RT. The second step involves the amplification of a region of interest in the cDNA using successive cycles of heating and cooling. Depending on the application, this amplification is achieved using variations of PCR. Reverse transcription polymerase chain reaction (RT-PCR) tests determine whether a target is present by amplifying a region of interest of cDNA (564). Some tests use the results of the PCR itself (e.g., a band on a gel) to determine whether the pathogen is present. However, this approach has not been employed widely in diagnostic testing, and instead most PCR-based tests are quantitative.

3.4.1.1 Quantitative Real-Time PCR

In contrast to conventional RT-PCR, quantitative real-time PCR uses fluorescent dyes that bind to the amplified DNA, thereby allowing a real-time assessment of the amplification process (564). (In this manuscript we refer to quantitative real-time PCR as qPCR, following the Minimum Information for Publication of Quantitative Real-Time PCR Experiments guidelines (565); when combined with a reverse transcription step, as is required for the evaluation of RNA, it is referred to as RT-qPCR.) The time resolution provided by qPCR and RT-qPCR is useful because the amount of fluorescence emitted is proportional to the amount of DNA amplified, and therefore the amount of virus present in a sample can be indirectly measured using the cycle threshold (Ct) determined by qPCR.
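
Because amplification is approximately exponential, a difference in Ct between two samples corresponds roughly to a fold difference in starting template. The sketch below makes this relationship explicit under the idealized assumption of perfect doubling each cycle, which real assays only approximate.

```python
# Idealized relationship between Ct and starting template: if the target
# doubles every cycle, a sample crossing the threshold delta cycles earlier
# started with roughly 2**delta times more template. Real assays fall short
# of perfect efficiency, so this is an approximation, not a viral load.
def fold_difference(ct_a: float, ct_b: float, efficiency: float = 2.0) -> float:
    """Approximate ratio of starting template in sample A relative to sample B."""
    return efficiency ** (ct_b - ct_a)

print(fold_difference(ct_a=22.0, ct_b=25.0))   # 8.0: ~8x more target in sample A
```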

The first test developed and validated for the detection of SARS-CoV-2 used RT-qPCR to detect several regions of the viral genome: the RNA-dependent RNA polymerase (RdRP) gene within ORF1b, the envelope protein gene (E), and the nucleocapsid protein gene (N) (556). The publication reporting this test was released on January 23, 2020, less than two weeks after the sequence of the virus was first reported (556). In collaboration with several other labs in Europe and in China, the researchers confirmed the specificity of this test with respect to other coronaviruses using specimens from 297 patients infected with a broad range of respiratory agents. Specifically, this test uses two probes against RdRP, one of which is specific to SARS-CoV-2 (556). Importantly, this assay was not found to return false positive results.

In January 2020, Chinese researchers developed a test that used RT-qPCR to identify two regions of the viral genome, ORF1b and N (566). This assay was tested on samples from two COVID-19 patients and a panel of positive and negative controls consisting of RNA extracted from several cultured viruses. The assay uses the N gene to screen patients, while the ORF1b gene region is used to confirm the infection (566). The test was designed to detect sequences conserved across sarbecoviruses, or viruses within the same subgenus as SARS-CoV-2. Considering that Severe acute respiratory syndrome-related coronavirus 1 (SARS-CoV-1) and SARS-CoV-2 are the only sarbecoviruses currently known to infect humans, a positive test can be assumed to indicate that the patient is infected with SARS-CoV-2, although this test cannot distinguish between different viruses within the sarbecovirus clade. The fact that the targets are so conserved offers the advantage of reduced concern about sensitivity in light of the evolution of SARS-CoV-2.
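
A minimal sketch of this screen-then-confirm logic is shown below; the Ct cutoff and the exact decision rules are assumptions for illustration, not the published assay's validated interpretation criteria.

```python
# Minimal sketch of a screen-then-confirm readout for a two-target panel
# (N screens, ORF1b confirms). The Ct cutoff and decision rules are assumed
# for illustration and are not the published assay's validated criteria.
def interpret(n_ct, orf1b_ct, cutoff=40.0):
    """Qualitative call from N and ORF1b Ct values (None = no amplification)."""
    n_pos = n_ct is not None and n_ct < cutoff
    orf1b_pos = orf1b_ct is not None and orf1b_ct < cutoff
    if n_pos and orf1b_pos:
        return "positive (sarbecovirus RNA detected)"
    if n_pos:
        return "indeterminate: N detected, confirmatory ORF1b not detected"
    return "negative"

print(interpret(n_ct=28.4, orf1b_ct=30.1))
print(interpret(n_ct=37.2, orf1b_ct=None))
```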

qPCR tests have played an important role in diagnostics during the COVID-19 pandemic. For SARS-CoV-2, studies have typically considered a patient to be infectious if the Ct is below 33 or sometimes 35 (1, 567, 568). A lower Ct corresponds to fewer qPCR cycles needed to reach a detectable level, indicating that higher amounts of virus were present in the initial reaction. Interpretations of the Ct values obtained from these tests have raised some interesting questions related to viral load and contagiousness. Lower Ct values correspond to a higher probability of a positive viral culture, but no threshold could discriminate all positive from all negative cultures (240). Additionally, because of the variability introduced by sample collection and clinical components of testing, Ct is not a proxy for viral load (569). Positive PCR results have also been reported for extended periods of time from symptom onset and/or the first positive PCR test (275), meaning that in some cases, a positive PCR may not indicate that someone is contagious (1).

In addition to the nuance required to interpret PCR results, several factors influence their accuracy. The specificity of these tests is very high (570), meaning that a positive RT-PCR result is very likely to indicate SARS-CoV-2 infection. The degree to which these tests are treated as definitive indicators of SARS-CoV-2 infection, regardless of other clinical considerations, is unusual for a diagnostic assay (571). In fact, while the analytical specificity of the assay is extremely high, the challenges of implementing testing can introduce variability that results in a lower clinical specificity (571). Several factors may influence sensitivity and specificity, with sample collection being critically important to the reliability of RT-PCR results. The most reliable results were found to come from nasopharyngeal swabs and from pooled nasal and throat swabs, with lower accuracies produced by saliva or by throat or nasal swabs alone (570, 572). Differences in experimental parameters, such as the use of primers more specific to SARS-CoV-2, have also been found to improve sensitivity in these specimens (573). Additionally, the impact of viral evolution on RT-PCR sensitivity is a concern (574, 575); using a panel that includes multiple targets can mitigate these effects (576). A test designed to incorporate genomic differences from SARS-CoV-1 was likewise found to offer improved sensitivity and specificity (573). Thus, while various factors can influence the exact parameters of testing accuracy, RT-PCR is known to have very high specificity and lower, but still high, sensitivity.

3.4.1.2 Digital PCR

Digital PCR (dPCR) is a new generation of PCR technologies offering an alternative to traditional qPCR (577). In dPCR, a sample is partitioned into thousands of compartments, such as nanodroplets (droplet dPCR or ddPCR) or nanowells, and a PCR reaction takes place in each compartment. This design allows for a digital read-out where each partition is either positive or negative for the nucleic acid sequence being tested for, allowing for absolute target quantification through Poisson statistics. While dPCR equipment is not yet as common as that for qPCR, dPCR for DNA targets generally achieves higher sensitivity than other PCR technologies while maintaining high specificity, though sensitivity is slightly lower for RNA targets (578).
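As a sketch of how this Poisson-based quantification works (a generic illustration, not the workflow of any specific instrument): if a fraction p of partitions scores positive, the mean number of target copies per partition is estimated as λ = -ln(1 - p), and the absolute concentration follows from the partition volume.

```python
import math

def dpcr_concentration(positive: int, total: int, partition_volume_ul: float) -> float:
    """Estimate target concentration (copies/µL) from digital PCR partition counts.

    Uses the Poisson correction lambda = -ln(1 - p), where p is the fraction of
    positive partitions and lambda is the mean number of copies per partition.
    """
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_ul  # copies per microliter of reaction

# Hypothetical example: 4,000 of 20,000 nanodroplets positive, ~0.85 nL (0.00085 µL) each
print(dpcr_concentration(4000, 20000, 0.00085))  # ~262 copies/µL
```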

High sensitivity is particularly relevant for SARS-CoV-2 detection, since low viral load in clinical samples can lead to false negatives. In one study, Suo et al. (579) performed a double-blind evaluation of ddPCR for SARS-CoV-2 detection using 63 samples collected from suspected positive outpatients and 14 from presumed convalescent patients. Of the 63 outpatients, only 21 (33%) were identified as positive for SARS-CoV-2 with qPCR. However, ddPCR identified 49 (78%) as positive, 10 (16%) as negative, and 4 (6%) as suspected/borderline for SARS-CoV-2 infection. While both qPCR and ddPCR were found to have very high specificity (100%), this analysis reported that the sensitivity was 40% with qPCR compared to 94% with ddPCR. Analysis of serial dilutions of a linear DNA standard suggested that ddPCR was approximately 500 times more sensitive than qPCR (579). Thus, this study suggests that ddPCR provides an extremely sensitive molecular test that is able to detect SARS-CoV-2 even at very low viral loads.

A second study (580) confirmed that RT-ddPCR is able to detect SARS-CoV-2 at a lower viral load threshold than RT-qPCR. This study analyzed 196 samples, including 103 samples from suspected patients, 77 from contacts and close contacts, and 16 from suspected convalescents, using both RT-qPCR and RT-ddPCR. First, the authors evaluated samples from the 103 suspected cases. Using RT-qPCR, 29 (28%) were identified as positive, 25 (24%) as negative, and 49 (48%) as borderline, i.e., the Ct value was higher than the positive threshold of 35 but lower than the negative threshold of 40. When the negative and borderline samples were reanalyzed with ddPCR, 61 were identified as positive: 19 (31%) that had originally tested negative and 42 (69%) that had been borderline. All of the suspected cases were later confirmed to be COVID-19 through a combination of symptom development and RT-qPCR resampling, indicating that ddPCR improved the overall detection rate compared to RT-qPCR from 28.2% to 87.4%.
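For clarity, the overall detection rates quoted above can be recomputed directly from the reported counts; the snippet below simply restates that arithmetic and introduces no data beyond the figures in the text.

```python
total_suspected = 103
rt_qpcr_positive = 29          # positives identified by RT-qPCR alone
ddpcr_additional = 19 + 42     # negatives and borderlines reclassified as positive by ddPCR

print(f"RT-qPCR detection rate: {rt_qpcr_positive / total_suspected:.1%}")                        # 28.2%
print(f"Combined detection rate: {(rt_qpcr_positive + ddpcr_additional) / total_suspected:.1%}")  # 87.4%
```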

They repeated this analysis in patient samples from contacts and close contacts. Patients who tested negative with both methods (n = 48) were observed to remain healthy over a period of 14 days. Among the remaining 29 samples from contacts, RT-qPCR identified 12 as positive, 1 as negative, and 16 as borderline. All of the samples that tested positive using RT-qPCR also tested positive using ddPCR. In contrast, the negative result and all but one of the borderline results were identified as positive by RT-ddPCR, and these patients were later determined to be SARS-CoV-2 positive based on clinical evaluation and repeated molecular sampling. Similarly, in the final group of 16 convalescent patients, RT-qPCR identified 12 as positive, three as suspect, and one as negative, but RT-ddPCR identified all as positive. The evidence from this study therefore supports a lower limit of detection with ddPCR. Overall, these studies suggest that ddPCR is a promising tool for overcoming the problem of false negatives in SARS-CoV-2 RNA testing, but the limited availability of dPCR equipment means this method is unlikely to be widely deployed during the current pandemic.

3.4.1.3 Sequencing

In some cases, the DNA amplified with PCR is sequenced. Sequencing requires an additional sample pre-processing step called library preparation, in which the sample is prepared for sequencing, typically by fragmenting the sequences and adding adapters (581). In some cases, library preparation can involve other modifications of the sample, such as adding barcodes that identify a particular sample within the sequence data; barcoding can therefore be used to pool samples from multiple sources. Different library preparation reagents are available that target one or more specific regions for amplification with PCR (582). Sequence pattern matching is then used to identify subsequences unique to the virus, and if sufficient subsequences are found, the test is considered positive. Therefore, tests that use sequencing require a number of additional molecular and analytical steps relative to tests that use PCR alone.
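The pattern-matching step can be illustrated with a simple k-mer lookup: reads are scanned for exact subsequences unique to the target genome, and the sample is called positive once enough distinct marker subsequences are observed. This is a conceptual sketch only; real pipelines use optimized aligners or classifiers and carefully curated marker sets, and the function names and thresholds here are hypothetical.

```python
def count_marker_hits(reads, markers, k=21):
    """Count how many distinct marker k-mers (length-k marker sequences) appear in the reads."""
    marker_set = set(markers)
    hits = set()
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            if kmer in marker_set:
                hits.add(kmer)
    return len(hits)

def is_positive(reads, markers, k=21, min_hits=5):
    """Call a sample positive if enough distinct marker subsequences are found."""
    return count_marker_hits(reads, markers, k) >= min_hits
```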

Sequencing has been an important strategy for discovery of SARS-CoV-2 variants (e.g., see (500)). Sequencing elucidates any genetic variants located between the PCR primers. For this reason, it is critical to genomic surveillance efforts. Genomic surveillance is an important complement to epidemiological surveillance efforts (583), as described below. Through genomic surveillance, it has become possible to monitor the emergence of variants of interest and variants of concern (VOC) that may pose additional threats due to increased contagiousness, virulence, or immune escape (583, 584). Sequencing also allows for analysis of the dominant strains in an area at a given time. Worldwide, the extent of genomic surveillance varies widely, with higher-income countries typically able to sequence a higher percentage of cases (585). Sequencing efforts are important for identifying variants containing mutations that might affect the reliability of molecular diagnostic tests, as well as mitigation measures such as therapeutics and prophylactics (574, 575). Therefore, sequencing is an important component of diagnostics: while it is not necessary for diagnosing an individual case, it is critical to monitoring trends in the variants affecting a population and to staying aware of emerging variants that may pose additional challenges.

3.4.1.4 Pooled and Automated PCR Testing

Due to limited supplies and the need for more tests, several labs have found ways to pool or otherwise strategically design tests to increase throughput. The first such result came from Yelin et al. (586), who reported that they could pool up to 32 samples in a single qPCR run. This was followed by larger-scale pooling with slightly different methods (587). Although these approaches are also PCR based, they allow for more rapid scaling and higher efficiency than the initial PCR-based protocols. Conceptually, pooling can be employed in any RT-qPCR-based analysis (588), and this strategy has been evaluated in settings such as schools (589) and hospitals (590).
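The efficiency gain from pooling can be illustrated with the classic two-stage (Dorfman) scheme, in which pools that test positive are retested sample by sample. Assuming independent infections at prevalence p, the expected number of tests per sample is 1/n + 1 - (1 - p)^n for pools of size n. The sketch below is a generic illustration of this trade-off, not the protocol of the cited studies.

```python
def expected_tests_per_sample(prevalence: float, pool_size: int) -> float:
    """Expected tests per sample for two-stage Dorfman pooling.

    One pooled test is shared by pool_size samples; if the pool is positive
    (probability 1 - (1 - prevalence)**pool_size), every member is retested.
    """
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

for n in (4, 8, 16, 32):
    print(n, round(expected_tests_per_sample(0.01, n), 3))
# At 1% prevalence, pools of 8-16 require roughly 0.2 tests per sample,
# but the savings shrink as prevalence rises.
```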

3.4.2 RT-LAMP

RT-PCR remains the gold standard for detection of SARS-CoV-2 RNA from infected patients, but the traditional method requires specialized equipment and reagents, including a thermocycler. Loop-mediated isothermal amplification (LAMP) is an alternative to PCR that does not require this specialized equipment (591). In this method, nucleic acids are amplified in a small (25 μL) reaction held at a constant temperature, using primers designed to facilitate auto-cycling strand displacement DNA synthesis (591). LAMP can be combined with reverse transcription (RT-LAMP) to enable the detection of RNA.

One study showed that RT-LAMP is effective for detection of SARS-CoV-2 with excellent specificity and sensitivity and that this method can be applied to unprocessed saliva samples (592). This method was benchmarked against RT-PCR using 177 human nasopharyngeal RNA samples, of which 126 were COVID-19 positive. The authors broke down the sensitivity of their test according to the Ct value from RT-PCR of the same samples; RT-LAMP performed at 100% sensitivity for samples with an RT-PCR Ct of 32 or less. The performance was worse when considering all RT-PCR-positive samples (including those with Ct values between 32 and 40). However, there is some evidence suggesting that samples obtained from individuals with RT-PCR Ct values >30 tend to be less infective than those with Ct values <30 (593–595), so RT-LAMP is still a useful diagnostic tool. Various combinations of reagents are available, but one example is the WarmStart Colorimetric LAMP 2X Master Mix with a set of six primers developed previously by Zhang et al. (596). To determine assay sensitivity, serial tenfold dilutions of an in vitro transcribed N-gene RNA standard were tested, spanning quantities from 10⁵ copies down to 10 copies. The assay readout is a color change of the dye from pink to yellow as DNA amplification acidifies the reaction over 30 minutes. The RT-LAMP assay was then applied to clinical nasopharyngeal samples. For viral loads above 100 copies of genomic RNA, the RT-LAMP assay had a sensitivity of 100% and a specificity of 96.1% from purified RNA. The sensitivity of the direct assay of saliva by RT-LAMP was 85%. Sensitivity and specificity metrics were obtained by comparison with results from RT-PCR. RT-LAMP pilot studies for detection of SARS-CoV-2 were reviewed in a meta-analysis (597). Across all 2,112 samples in the meta-analysis, the cumulative sensitivity of RT-LAMP was calculated at 95.5%, and the cumulative specificity was 99.5%.

This test aims to bring the sensitivity of nucleic acid detection to point-of-care or home testing settings. It could be applied for screening, diagnostics, or as a confirmatory test for people who test positive by LFTs (see below). The estimated cost per test is about 2 euros when RNA extraction is included. The main strength of this test over RT-PCR is that it can be run isothermally, while the main drawback is that it is about 10-fold less sensitive than RT-PCR. The low cost, good sensitivity and specificity, and quick readout of RT-LAMP make it an attractive alternative to RT-PCR, and such alternatives are needed to bring widespread testing away from the lab and into under-resourced areas.

3.4.3 CRISPR-based Detection

Technology based on CRISPR (clustered regularly interspaced short palindromic repeats) (598) has also been instrumental in scaling up testing protocols. Two CRISPR-associated nucleases, Cas12 and Cas13, have been used for nucleic acid detection. Multiple assays exploiting these nucleases have emerged as potential diagnostic tools for the rapid detection of SARS-CoV-2 genetic material and therefore SARS-CoV-2 infection. The SHERLOCK method (Specific High-sensitivity Enzymatic Reporter unLOCKing) from Sherlock Biosciences relies on Cas13a to discriminate between inputs that differ by a single nucleotide at very low concentrations (599). The target RNA is amplified by reverse transcription recombinase polymerase amplification (RT-RPA) and T7 transcription, and the amplified product activates Cas13a. The nuclease then cleaves a reporter RNA, which liberates a fluorescent dye from a quencher. Several groups have used the SHERLOCK method to detect SARS-CoV-2 viral RNA. An early study reported that the method could detect 7.5 copies of viral RNA in all 10 replicates, 2.5 copies in 6 out of 10, and 1.25 copies in 2 out of 10 runs (600). It also reported 100% specificity and sensitivity on 114 RNA samples from clinical respiratory samples (61 suspected cases, among which 52 were confirmed and nine were ruled out by metagenomic next-generation sequencing; 17 SARS-CoV-2-negative but human coronavirus (HCoV)-positive cases; and 36 samples from healthy subjects) and a reaction turnaround time of 40 minutes. A separate study screened four SHERLOCK designs and extensively tested the best-performing assay, determining the limit of detection to be 10 copies/μl using both fluorescent and lateral flow detection (601).
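The dilution-series behavior reported above (all replicates positive at 7.5 copies but only a fraction at lower inputs) is consistent with simple sampling statistics: when the nominal input is only a few copies, some replicate reactions receive no template at all. Under a Poisson model, the probability that a reaction contains at least one copy is 1 - e^(-λ). The sketch below illustrates this general point and is not an analysis from the cited study.

```python
import math

def p_at_least_one_copy(mean_copies: float) -> float:
    """Probability that a reaction receives at least one template copy (Poisson sampling)."""
    return 1.0 - math.exp(-mean_copies)

for copies in (7.5, 2.5, 1.25):
    print(copies, round(p_at_least_one_copy(copies), 2))
# 7.5 -> ~1.0, 2.5 -> ~0.92, 1.25 -> ~0.71: an upper bound on the expected
# hit rate, before any losses in the detection chemistry itself.
```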

LFT strips are simple to use and read, but there are limitations in terms of availability and cost per test. Another group therefore proposed the CREST (Cas13-based, Rugged, Equitable, Scalable Testing) protocol, which uses a P51 cardboard fluorescence visualizer, powered by a 9-volt battery, for the detection of Cas13 activity instead of immunochromatography (602). CREST can be run, from RNA sample to result, with no need for AC power or a dedicated facility, with minimal handling in approximately 2 hours. Testing was performed on 14 nasopharyngeal swabs. CREST picked up the same positives as the CDC-recommended TaqMan assay with the exception of one borderline sample that displayed low-quality RNA. This approach may therefore represent a rapid, accurate, and affordable procedure for detecting SARS-CoV-2.

The DETECTR (DNA Endonuclease-Targeted CRISPR Trans Reporter) method from Mammoth Biosciences involves purification of RNA extracted from patient specimens, amplification of extracted RNAs by loop-mediated amplification, and application of their Cas12-based technology. In this assay, guide RNAs (gRNAs) were designed to recognize portions of sequences corresponding to the SARS-CoV-2 genome, specifically the N2 and E regions (603). In the presence of SARS-CoV-2 genetic material, sequence recognition by the gRNAs results in double-stranded DNA cleavage by Cas12, as well as cleavage of a single-stranded DNA molecular beacon. The cleavage of this molecular beacon acts as a colorimetric reporter that is subsequently read out in a lateral flow assay and indicates the presence of SARS-CoV-2 genetic material and therefore SARS-CoV-2 infection. The 40-minute assay is considered positive if there is detection of both the E and N genes or presumptive positive if there is detection of either of them. The assay had 95% positive predictive agreement and 100% negative predictive agreement with the US Centers for Disease Control and Prevention SARS-CoV-2 RT-qPCR assay. The estimated limit of detection was 10 copies per μl reaction, versus 1 copy per μl reaction for the CDC assay.

These results have been confirmed by other DETECTR approaches. Using RT-RPA for amplification, another group detected 10 copies of synthetic SARS-CoV-2 RNA per μl of input within 60 minutes of RNA sample preparation in a proof-of-principle evaluation (604). Through a similar approach, another group reported detection at 1 copy per μl (605). The DETECTR protocol was improved by combining RT-RPA and CRISPR-based detection in a one-pot reaction that incubates at a single temperature and by using dual CRISPR RNAs, which increases sensitivity. This new assay, known as All-In-One Dual CRISPR-Cas12a, detected 4.6 copies of SARS-CoV-2 RNA per μl of input in 40 minutes (606). Another single-tube, constant-temperature approach using Cas12b instead of Cas12a achieved a detection limit of 5 copies/μl in 40-60 minutes (607).

It was also reported that electric field gradients can be used to control and accelerate CRISPR assays by co-focusing Cas12-gRNA, reporters, and target (608). The authors generated an appropriate electric field gradient using a selective ionic focusing technique known as isotachophoresis (ITP) implemented on a microfluidic chip. They also used ITP for automated purification of target RNA from raw nasopharyngeal swab samples. Combining this ITP purification with loop-mediated isothermal amplification, their ITP-enhanced assay achieved detection of SARS-CoV-2 RNA (from raw sample to result) in 30 minutes.

All these methods require upstream nucleic acid amplification prior to CRISPR-based detection, and they rely on type V (Cas12-based) and type VI (Cas13-based) CRISPR systems. In contrast, type III CRISPR systems have the unique property of initiating a signaling cascade, which could boost the sensitivity of direct RNA detection. In type III CRISPR systems, guide CRISPR RNAs (crRNAs) are bound by several Cas proteins (609) and can target both DNA and RNA molecules (610, 611). A study tested this hypothesis using the type III-A crRNA-guided surveillance complex from Thermus thermophilus (612). The authors showed that activation of the Cas10 polymerase generates three products (cyclic nucleotides, protons, and pyrophosphates) that can all be used to detect SARS-CoV-2 RNA. Detection of viral RNA in patient samples still required an initial nucleic acid amplification step, but future improvements may remove that requirement.

This goal of amplification-free detection was later achieved for a Cas13a-based system (613). This approach combined multiple CRISPR RNAs to increase Cas13a activation, which is detected by a fluorescent reporter. Importantly, because the viral RNA is detected directly, the test yields a quantitative measurement rather than a binary result. The study also showed that fluorescence can be measured with a mobile phone camera in a custom-made dark box using low-cost laser illumination and collection optics, making this a truly portable assay for point-of-care diagnostics. The authors achieved detection of 100 copies/μl of pre-isolated RNA in 30 minutes and correctly identified all SARS-CoV-2-positive patient RNA samples tested (n = 20) within 5 minutes.

There is an increasing body of evidence that CRISPR-based assays offer a practical solution for rapid, low-barrier testing in areas that are at greater risk of infection, such as airports and local community hospitals. In the largest study to date, DETECTR was compared to RT-qPCR on 378 patient samples (614). The authors reported 95% reproducibility. Both techniques were equally sensitive in detecting SARS-CoV-2. Lateral flow strips showed 100% correlation to the high-throughput DETECTR assay. Importantly, DETECTR was 100% specific for SARS-CoV-2 and did not detect other human coronaviruses. A method based on a Cas9 ortholog from Francisella novicida known as FnCas9 achieved 100% sensitivity and 97% specificity in clinical samples, and the diagnostic kit is reported to have completed regulatory validation in India (615).

3.4.4 Immunoassays for the Detection of Antigens

Immunoassays can detect molecular indicators of SARS-CoV-2 infection, such as the viral proteins that act as antigens. They offer the advantage of generally being faster and requiring less specialized equipment than other molecular tests, especially those involving PCR. As a result, immunoassays hold particular interest for implementation at home and in situations where resources for PCR testing are limited. The trade-off is that these tests typically have a lower sensitivity, and sometimes a lower specificity, than other molecular tests. However, these tests tend to return positive results for only five to 12 days after symptom onset, which may therefore correlate more closely with the timeframe during which viral replication occurs (616). Immunoassays for the detection of SARS-CoV-2 antigen include LFTs and ELISA, as discussed here, as well as CLIA and chromatographic immunoassays (617), as described in the serological testing section below.

3.4.4.1 Lateral Flow Tests

LFTs provide distinct value relative to PCR tests. They can return results within 30 minutes, require no specialized equipment or training to operate, and are cheap to produce. Thus, they can be distributed widely to affected populations, making them an important public health measure to curb pandemic spread. LFTs rely on the detection of viral protein with an antibody, often in an antibody sandwich format, where one antibody conjugated to a dye binds at one site on the antigen and an immobilized antibody on the strip binds at another site (558). This design allows the dye to accumulate to form a characteristic positive test line on the strip (558). Outside of COVID-19 diagnostics, the applications of LFTs are broad; they are routinely used for home pregnancy tests, disease detection, and even detection of drugs of abuse in urine (618).

A recent review surveyed the performance of LFTs for detection of current SARS-CoV-2 infection (619). This review covered 24 studies that included more than 26,000 total LFTs. They reported significant heterogeneity in test sensitivities, with estimates ranging from 37.7% to 99.2%. The estimated specificities of these tests were more homogeneous, spanning 92.4% to 100.0%.

Despite having lower sensitivity than PCR tests, LFTs occupy an important niche in the management of SARS-CoV-2. Current infection detection by LFTs enables the scale and speed of testing that is beneficial to managing viral spread. LFTs were available freely to citizens in the United Kingdom until April 1, 2022 (620) and to citizens of the United States in early 2022 (621). These tests are particularly useful for ruling out SARS-CoV-2 infection in cases where the likelihood of infection is low (e.g., asymptomatic individuals) and positives (including false positives) can be validated with testing by alternate means (622).
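The reasoning behind using LFTs to rule out infection in low-prevalence settings while confirming positives by other means follows directly from Bayes' rule: at low pretest probability, even a moderately sensitive test yields a high negative predictive value, whereas the positive predictive value can be low. The sketch below works through this with illustrative numbers drawn from the ranges reported above; it is not an analysis from the cited review.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Return (positive predictive value, negative predictive value)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    true_neg = specificity * (1 - prevalence)
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# Illustrative LFT with 75% sensitivity and 99% specificity at 0.5% prevalence
ppv, npv = predictive_values(0.75, 0.99, 0.005)
print(f"PPV: {ppv:.1%}, NPV: {npv:.2%}")  # PPV ~27%, NPV ~99.87%
```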

3.4.4.2 Enzyme-Linked Immunosorbent Assay

ELISA is a very sensitive immunoassay that can be considered a gold standard for the detection of biological targets, including antibodies and antigenic proteins (559). It can be used to generate either quantitative or qualitative results that can be returned within a few hours (623). ELISA builds on the idea that antibodies and antigens bind together to form complexes (559) and utilizes an enzyme covalently linked to an antibody against the antigen to produce the assay signal, usually a color change (624). The main advantage of ELISA is that it enables signal amplification through the enzyme’s activity, which increases sensitivity. With sandwich ELISA, antibodies are immobilized on a surface such as a plate, and viral protein antigens in the sample bind and are retained (625). A second antibody, covalently linked to an enzyme, is then added and binds to another site on the antigen. A substrate for that enzyme is then added to produce the signal, usually light or a color change. The exact strategy for tagging with a reporter enzyme varies among different types of ELISA (559, 625). For COVID-19 diagnostics, ELISAs have been designed to detect the antigenic Spike protein (626).

One of these assays uses two monoclonal antibodies specific to the nucleocapsid of SARS-CoV-2 to evaluate the effect of (estimated) viral load on the ability of the assay to detect the SARS-CoV-2 antigen (627). This study analyzed 339 naso-oropharyngeal samples that were also analyzed with RT-qPCR as a gold standard. RT-qPCR identified 147 samples as positive and 192 as negative. The authors estimated the overall sensitivity and specificity to be 61.9% and 99.0%, respectively, with sensitivity increasing at lower Ct values (i.e., higher estimated viral loads). This study also assessed the performance of the ELISA test under different conditions in order to evaluate how robust it would be to the challenges of testing in real-world settings globally. Higher sensitivity was achieved for samples that were stored under ideal conditions (immediate placement at -80°C). Therefore, while immediate access to laboratory equipment is an advantage, it is not strictly necessary for ELISA to detect the antigen.

3.4.5 Limitations of Molecular Tests

Tests that identify SARS-CoV-2 using molecular technologies will identify only individuals with current infections and are not appropriate for identifying individuals who have recovered from a previous infection. Among molecular tests, different technologies have different sensitivities and specificities. In general, specificity is high, and the public health repercussions of a false positive can usually be mitigated with follow-up testing. On the other hand, a test’s sensitivity, which indicates the risk of a false-negative result, can pose a significant challenge to large-scale testing. False negatives are a significant concern for several reasons. Importantly, clinical reports indicate that caution is needed when interpreting the results of molecular tests for SARS-CoV-2 because negative results do not necessarily mean a patient is virus-free (628). To reduce the occurrence of false negatives, correct execution of the analysis is crucial (629). Additionally, PCR-based tests can remain positive for much longer than the virus is likely to be actively replicating (616), raising concerns about their informativeness after the acute phase of the disease. Hence, the CDC has advised individuals who suspect they have been re-infected with SARS-CoV-2 to avoid using diagnostic tests within 90 days of a previous positive test (630).

Additionally, the emerging nature of the COVID-19 pandemic has introduced challenges related to uncertainty surrounding interactions between SARS-CoV-2 and its human hosts. For example, viral shedding kinetics are still not well understood, meaning that the timing of sample collection is expected to have a significant effect on test results (629). Similarly, the type of specimen can also influence outcomes, as success in viral detection varies among clinical sample types (570, 572, 629). With CRISPR-based testing strategies, the gRNA can recognize off-target interspersed sequences in the viral genome (631), potentially resulting in false positives and a loss of specificity.

There are also significant practical and logistical concerns related to the widespread deployment of molecular tests. Much of the technology used for molecular tests is expensive, and while it might be available in major hospitals and/or diagnostic centers, it is often not available to smaller facilities (632). At times during the pandemic, the availability of supplies for testing, including swabs and testing media, has also been limited (633). Similarly, processing times can be long, and tests might take up to 4 days to return results (632), especially during times of high demand, such as spikes in case numbers (634). Molecular testing strategies used to reduce viral transmission have also varied widely among countries, even among high-income countries (635). Nonetheless, the rapid development of molecular tests has provided a valuable, albeit imperfect, tool to identify active SARS-CoV-2 infections.

3.5 Serological Tests to Identify Recovered Individuals

Although several molecular diagnostic tests to detect viral genetic material have high specificity and sensitivity, they provide information only about active infection, and therefore offer just a snapshot-in-time perspective on the spread of a disease. Most importantly, they would not work on a patient who has fully recovered from the virus at the time of sample collection. In such contexts, serological tests are informative.

Serological tests use many of the same technologies as the immunoassays used to detect the presence of an antigen but are instead used to evaluate the presence of antibodies against SARS-CoV-2 in a serum sample. These tests are particularly useful for insight into population-level dynamics and can also offer a glimpse into the development of antibodies by individual patients during the course of a disease. Immunoassays can detect antibodies produced by the adaptive immune system in response to viral threat. Understanding the acquisition and retention of antibodies is important both to the diagnosis of prior (inactive) infections and to the development of vaccines. The two immunoglobulin classes that are most pertinent to these goals are immunoglobulin M (IgM), which are the first antibodies produced in response to an infection, and immunoglobulin G (IgG), which are the most abundant antibodies (636, 637). Serological tests detect these antibodies, offering a mechanism through which prior infection can be identified. However, the complexity of the human immune response means that there are many facets to such analyses.

In general, SARS-CoV-2 infection will induce the immune system to produce antibodies fairly quickly. Prior research is available about the development of antibodies to SARS-CoV-1 during the course of the associated disease, severe acute respiratory syndrome (SARS). IgM and IgG antibodies were detected in the second week following SARS-CoV-1 infection. IgM titers peaked by the first month post-infection, and then declined to undetectable levels after day 180. IgG titers peaked by day 60 and persisted in all donors through the two-year duration of study (638). Such tests can also illuminate the progression of viral disease, as IgM are the first antibodies produced by the body and indicate that the infection is active. Once the body has responded to the infection, IgG are produced and gradually replace IgM, indicating that the body has developed immunogenic memory (639). Therefore, it was hoped that the development of assays to detect the presence of IgM and IgG antibodies against SARS-CoV-2 would allow the identification of cases from early in the infection course (via IgM) and for months or years afterwards (via IgG). Several technologies have been used to develop serological tests for COVID-19, including ELISA, lateral flow immunoassay, chemiluminescence immunoassay, and neutralizing antibody assays (640).

3.5.1 ELISA

The application of ELISA to serological testing is complementary to its use in molecular diagnostics (see above). Instead of using an enzyme-labeled antibody as a probe that binds to the target antigen, the probe is an antigen and the target is an antibody. The enzyme used for detection and signal amplification is on a secondary antibody raised generally against human IgG or IgM. In March 2020, the Krammer lab proposed an ELISA test that detects IgG and IgM that react against the receptor-binding domain (RBD) of the spike proteins (S) of the virus (641). A subsequent ELISA test developed to detect SARS-CoV-2 IgG based on the RBD reported a specificity of over 99% and a sensitivity of up to 88.24%, which was observed in samples collected 21 to 27 days after the onset of infection (approximated with symptom onset or positive PCR test) (642). Earlier in the disease course, sensitivity was lower: 53.33% between days 0 and 13 and 80.47% between days 14 and 20. This study reported that their laboratory ELISA outperformed two commercial kits that also used an ELISA design (642). Therefore, while analysis with ELISA requires laboratory support and equipment, these results do suggest that ELISA achieves relatively high sensitivity, especially in the weeks following infection. Efforts have been made to develop low-cost strategies for conducting these tests that will make them more accessible worldwide (643).

3.5.2 Chemiluminescence Immunoassay

Another early approach investigated for the detection of antibodies against SARS-CoV-2 was CLIA. Like ELISA, CLIA is a type of enzyme immunoassay (EIA) (644). While the technique varies somewhat, in one approach, a bead is coated with the antigen and then washed with the sample (645). If the antibody is present in the sample, it will bind to the bead. The bead is then exposed to a label, a luminescent molecule that binds to the antigen/antibody complex and can therefore be used as an indicator (645). One CLIA approach to identify COVID-19 used a synthetic peptide derived from the amino acid sequence of the SARS-CoV-2 S protein (646). It was highly specific to SARS-CoV-2 and detected IgM in 57.2% and IgG in 71.4% of serum samples from 276 COVID-19 cases confirmed with RT-qPCR. IgG could be detected within two days of the onset of fever, but IgM could not be detected any earlier (646), a pattern that has been supported by other analyses as well (647) and is consistent with observations in Middle East respiratory syndrome, which is also caused by an HCoV. In comparisons of different commercial immunoassays, the accuracy of CLIA tests was often roughly comparable to that of other EIAs (648), although one CLIA did not perform as well as several other EIAs (647, 649). The sensitivities and specificities reported vary among CLIA tests and between the detection of IgM and IgG, but sensitivities and specificities as high as 100% have been reported among various high-throughput tests (649–651). CLIA has previously been used to develop tests that can be used at point of care (e.g., (644)), which may allow this technique to become more widely accessible in the future.

3.5.3 Lateral Flow Immunoassay

The first serological test approved for emergency use in the United States was developed by Cellex (652). The Cellex qSARS-CoV-2 IgG/IgM Rapid Test is a chromatographic immunoassay, also known as a lateral flow immunoassay, designed to qualitatively detect IgM and IgG antibodies against SARS-CoV-2 in the plasma of patients suspected to have developed a SARS-CoV-2 infection (652). The Cellex test cassette contains a pad of SARS-CoV-2 antigens and a nitrocellulose strip with lines for each of IgG and IgM, as well as a control (goat IgG) (652). In a specimen that contains antibodies against the SARS-CoV-2 antigen, the antibodies will bind to the strip and be captured by the IgM and/or IgG line(s), resulting in a change of color (652). With this particular assay, results can be read within 15 to 20 minutes (652). Lateral flow immunoassays are often available at point of care but can have very low sensitivity (649).

3.5.4 Neutralizing Antibody Assays

Neutralizing antibody assays play a functional role in understanding immunity that distinguishes them from other serological tests. The tests described above are all binding antibody tests. In contrast, rather than simply binding an antibody to facilitate detection, neutralizing antibody assays determine whether an antibody response is present that would prevent infection (653, 654). Therefore, these tests serve the purpose of evaluating the extent to which a sample donor has acquired immunity that will reduce susceptibility to SARS-CoV-2. As a result, neutralizing antibody assays have been used widely to characterize the duration of immunity following infection, to assess vaccine candidates, and to establish correlates of protection against infection and disease (655–657). These tests are typically performed in a laboratory (653), and for SARS-CoV-2, the results of neutralizing antibody assays are often correlated with the results of binding antibody tests (653).

The gold standard for assessing the presence of neutralizing antibodies is the plaque reduction neutralization test (PRNT), but this approach does not scale well (654). An early high-throughput neutralizing antibody assay designed against SARS-CoV-2 used a fluorescently labeled reporter virus that was incubated with different dilutions of patient serum (654). The cells used for incubation would turn green if neutralizing antibodies were not present; essentially, this assay evaluates whether the virus is able to infect the cell in the presence of the serum. The specificity of this assay was 100%, and the correlation between its results and those of PRNT was 0.85, with the results suggesting that the sensitivity of the high-throughput approach was higher than that of PRNT (654). While this approach relied on cultured cells in a plate-based format, other methods, such as bead arrays, have also been developed (658).

3.5.5 Duration of Immune Indicators

While the adaptive immune system produces antibodies in response to SARS-CoV-2 viral challenge, these indicators of seroconversion are unlikely to remain in circulation permanently. Previously, a two-year longitudinal study following convalescent SARS patients with a mean age of 29 found that IgG antibodies were detectable in all 56 patients surveyed for at least 16 months and remained detectable in all but 4 patients (11.8%) through the full two-year study period (659). These results suggest that immunity to SARS-CoV-1 is sustained for at least a year. Circulating antibody titers to other coronaviruses have been reported to decline significantly after one year (660). Evidence to date suggests that immunity to the SARS-CoV-2 virus is sustained for a shorter period, but for at least 6 to 8 months after infection (661–664). However, this does not mean that all serological evidence of infection dissipates, but rather that the immune response becomes insufficient to neutralize the virus.

In order to study the persistence of SARS-CoV-2 antibodies, one study assessed sustained immunity using 254 blood samples from 188 COVID-19-positive patients (662). The samples were collected at various time points between 6 and 240 days post-symptom onset; some patients were assessed longitudinally. Of the samples, 43 were collected at least 6 months after symptom onset. After one month, 98% of patients were seropositive for IgG to S. Moreover, S IgG titers were relatively stable, though heterogeneous among patients, over a period of 6 to 8 months post-symptom onset, with 90% of subjects seropositive at 6 months. Similarly, at 6 to 8 months, 88% of patients were seropositive for RBD IgG and 90% were seropositive for SARS-CoV-2 neutralizing antibodies. Another study examined 119 samples from 88 donors who had recovered from mild to severe cases of COVID-19 (664). A relatively stable level of IgG and plasma neutralizing antibodies was identified up to 6 months post-diagnosis, and significantly lower but considerable levels of anti-SARS-CoV-2 IgG antibodies were still present in 80% of samples obtained 6 to 8 months post-symptom onset.

Titers of IgM and IgG antibodies against the RBD were found to decrease from 1.3 to 6.2 months post-infection in a study of 87 individuals (665). However, the decline of IgA activity (15%) was less pronounced than that of IgM (53%) or IgG (32%). Notably, higher levels of anti-RBD IgG and anti-N total antibodies were detected in individuals who reported persistent post-acute symptoms at both study visits. Moreover, plasma neutralizing activity decreased five-fold between 1.3 and 6.2 months in an assay of HIV-1 virus pseudotyped with SARS-CoV-2 S protein, and this neutralizing activity was directly correlated with IgG anti-RBD titers (665). These findings are in accordance with other studies showing that the majority of seroconverters have detectable, albeit decreasing, levels of neutralizing antibodies at least 3 to 6 months post-infection (666–668).

Determining the potency of anti-RBD antibodies early in the course of an infection may be important moving forward, as their neutralizing potency may be prognostic for disease severity and survival (669). The duration of immunity might also vary with age (670) or ABO blood type (671). Autopsies of lymph nodes and spleens from severe acute COVID-19 patients showed a loss of T follicular helper cells and germinal centers that may explain some of the impaired development of antibody responses (672). Therefore, serological testing may be time-limited in its ability to detect prior infection.

Other immune indicators of prior infection have also been evaluated to see how they persist over time. The proportion of patients with detectable SARS-CoV-2 memory CD8+ T cells declined modestly to approximately 50% by 6 months post-symptom onset. In this same cohort of COVID-19 patients, 93% of subjects had detectable levels of SARS-CoV-2 memory CD4+ T cells, of which 42% had more than 1% SARS-CoV-2-specific CD4+ T cells, and at 6 months, 92% of patients remained positive for SARS-CoV-2 memory CD4+ T cells. Indeed, the abundance of S-specific memory CD4+ T cells over time was similar to that of SARS-CoV-2-specific CD4+ T cells overall (662). T cell immunity to SARS-CoV-2 at 6 to 8 months following symptom onset has also been confirmed by other studies (664, 673, 674). In another study, T cell reactivity to SARS-CoV-2 epitopes was also detected in some individuals who had never been exposed to SARS-CoV-2. This finding suggests the potential for cross-reactive T cell recognition between SARS-CoV-2 and the pre-existing circulating HCoV responsible for the “common cold” (675), but further research is required. Therefore, whether T cells will provide a more stable measure through which to assess prior infection remains unknown. Notably, commercial entities have developed tests specifically for T cells, some of which have been authorized by the United States Food and Drug Administration (676, 677) to identify people with adaptive T cell immune responses to SARS-CoV-2, either from a previous or an ongoing infection.

3.5.6 Applications of Serological Tests

In addition to the limitations posed by the fact that antibodies are not permanent indicators of prior infection, serological immunoassays carry a number of limitations that influence their utility in different situations. Importantly, false positives can occur due to cross-reactivity with other antibodies, depending on the clinical condition of the patient (652). Because of long incubation times and delayed immune responses in infected patients, serological immunoassays are insufficiently sensitive for diagnosis in the early stages of an infection and must therefore be used in combination with RNA detection tests if intended for diagnostic purposes (678). False positives are particularly harmful if they are erroneously interpreted to mean that a population is more likely to have acquired immunity to a disease (679). Similarly, while serological tests may be of interest to individuals who wish to confirm they were infected with SARS-CoV-2 in the past, their potential for false positives means that they are not currently recommended for this use. However, in the wake of vaccines becoming widely available, accurate serological tests that could be administered at the point of care were investigated in the hope that they could help to prioritize vaccine recipients (680). Another concern with serological testing is the potential for viral evolution to reduce the sensitivity of assays, especially neutralizing antibody assays. Chen et al. performed a systematic re-analysis of published data examining the neutralizing effect of serum from vaccinated or recovered individuals on four VOC (681) and found reduced neutralizing titers against these variants relative to the lineages used for reference. These findings suggest that such techniques will need to be modified over time as SARS-CoV-2 evolves.

These limitations make serological tests far less useful for diagnostics and for test-and-trace strategies; however, serological testing is valuable for public health monitoring at the population level. Serosurveys provide a high-level perspective on the prevalence of a disease and can offer insight into the susceptibility of a population as well as variation in severity, e.g., between geographic regions (679). From a public health perspective, they can also provide insight into the effectiveness of mitigation efforts and into risk factors influencing susceptibility (682). EIA methods are high-throughput (683, 684), and, as with molecular tests, additional efforts have been made to scale up the throughput of serological tests (685). Therefore, serological tests can be useful for developing strategies to manage viral spread.

Early in the course of the pandemic, it was also hoped that serological tests would provide information relevant to advancing economic recovery. Some infectious agents can be controlled through “herd immunity”, which is when a critical mass within the population acquires immunity through vaccination and/or infection, preventing an infectious agent from spreading widely. It was hoped that people who had recovered and developed antibodies might be able to return to work (686, 687). This strategy would have relied on recovered individuals acquiring long-term immunity, which has not been borne out (688). Additionally, it was hoped that identifying seroconverters and specifically those who had mounted a strong immune response would reveal strong candidates for convalescent plasma donation (641); however, convalescent plasma has not been found to offer therapeutic benefit (reviewed in (3)). While these hopes have not been realized, serological tests have been useful for gaining a better understanding of the pandemic (682).

3.6 Possible Alternatives to Current Diagnostic Practices

One possible alternative or complement to molecular and serological testing would be diagnosing COVID-19 cases based on symptomatology. COVID-19 can present with symptoms similar to other types of pneumonia, and symptoms can vary widely among COVID-19 patients; therefore, clinical presentation is often insufficient as a sole diagnostic criterion. In addition, identifying and isolating mild or asymptomatic cases is critical to efforts to manage outbreaks. Even among mildly symptomatic patients, a predictive model based on clinical symptoms had a sensitivity of only 56% and a specificity of 91% (689). More problematic is that clinical symptom-based tests are only able to identify already symptomatic cases, not presymptomatic or asymptomatic cases. They may still be important for clinical practice and for reducing tests needed for patients deemed unlikely to have COVID-19.

In some cases, clinical signs may also provide information that can inform diagnosis. Using computed tomography of the chest in addition to RT-qPCR testing was found to provide higher sensitivity than either measure alone (690). X-ray diagnostics have been reported to have high sensitivity but low specificity in some studies (691), while other studies have shown that specificity varies between radiologists (692), with lower sensitivity than in the earlier report. While preliminary machine-learning results suggested that chest X-rays might provide high sensitivity and specificity and potentially facilitate the detection of asymptomatic and presymptomatic infections (e.g., (693)), further investigation suggested that such approaches are prone to bias and are unlikely to be clinically useful (694). Given the above, the widespread use of X-ray tests on otherwise healthy adults is likely inadvisable.

Finally, in addition to genomic and serological surveillance, other types of monitoring have proven useful in managing the pandemic (695). One that has received significant attention is wastewater surveillance. This approach can use several of the technologies described for molecular testing, such as qPCR and dPCR, as well as in vitro culturing (696) and can provide insight into trends in the prevalence of SARS-CoV-2 regionally.

3.7 Strategies and Considerations for Testing

Deciding whom to test, when to test, and which test to use has proven challenging as the COVID-19 pandemic has unfolded. Early in the COVID-19 pandemic, testing was typically limited to individuals considered at high risk for developing serious illness (697). This approach often limited testing to people with severe symptoms and to people with mild symptoms who had been in contact with a person who had tested positive. Individuals who were asymptomatic (i.e., potential spreaders) and individuals who were able to recover at home were thus often unaware of their status. This method of administering testing therefore misses a high proportion of infections and does not allow for test-and-trace methods to be used. For instance, a study from Imperial College estimated that in Italy, the true number of infections was around 5.9 million in a total population of ~60 million, compared to the 70,000 detected as of March 28, 2020 (314). Another analysis, which examined New York state, indicated that as of May 2020, approximately 300,000 cases had been reported in a total population of approximately 20 million (698). This corresponded to ~1.5% of the population, but ~12% of individuals sampled statewide were estimated to be positive through antibody tests (along with indications of spatial heterogeneity at higher resolution) (698). Technological advancements that facilitate widespread, rapid testing would therefore be valuable for accurately assessing the rate of infection and would aid in controlling the virus’s spread. Additionally, the trade-off among accessibility, sensitivity, and time to results has raised some complex questions around which tests are best suited to particular situations. Immunoassays, including serological tests, have much higher limits of detection than PCR tests do (699).
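The New York figures above imply an under-ascertainment factor that can be computed directly; the snippet below simply restates that arithmetic using the approximate figures quoted in the text and should be read as an illustration rather than a formal estimate.

```python
population = 20_000_000          # approximate New York state population
reported_cases = 300_000         # cases reported as of May 2020
seroprevalence = 0.12            # approximate fraction of sampled individuals antibody-positive

implied_infections = seroprevalence * population
print(f"Implied infections: {implied_infections:,.0f}")                            # ~2,400,000
print(f"Under-ascertainment factor: {implied_infections / reported_cases:.0f}x")   # ~8x
```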

Changes in public attitudes and the lifting of COVID-19 restrictions, driven in part by the desire to stimulate economic activity, required a shift in testing paradigms in 2022, despite warnings from public health officials against a hard exit from public health restrictions (700, 701). An important strategy for testing moving forward is to determine when someone becomes infectious or is no longer infectious following a positive test for COVID-19. Generally, patient specimens tend not to contain culturable virus past day 5 after symptom onset (702, 703). However, because of their sensitivity to post-infectious viral RNA in specimens, PCR-based methods may mislead individuals into believing that they are still infectious several days after symptom onset (678). Indeed, detection of viral RNA can occur days and weeks after an active infection due to the sensitivity of PCR-based methods (568, 704, 705).

In contrast, LFTs were initially thought to have poor sensitivity, and their value for identifying infections and managing the pandemic was questioned (706, 707). However, LFTs do reliably detect SARS-CoV-2 proteins when there is a high viral load, which appears to correlate with a person’s infectiousness (616, 708). Therefore, LFTs are an important diagnostic tool for determining infectiousness, offering fast turnaround times, ease of use, and accessibility to the general public (678, 709). One study has suggested that the sensitivity of LFTs appears to be less important than their accessibility, frequent testing, and fast reporting times for reducing the impact of viral spread (710). While PCR-based methods are important for COVID-19 surveillance, they are labor intensive and time consuming, and laboratories are often slow to report results, limiting their surveillance capacity (678).

These limitations are demonstrated by the estimated 10-fold under-reporting of cases in the United States in 2020 due to testing shortages, slow rollout of testing, and slow reporting of results (711). However, one strategy that may balance the strengths and weaknesses of both types of tests is to corroborate a positive LFT result using a PCR-based method. Indeed, in December 2021, surveillance and reduction of COVID-19 spread using this joint LFT-PCR strategy were demonstrated in Liverpool, U.K., where there was an estimated 21% reduction in cases (709, 712).

3.8 What Lies Ahead

Diagnostic tools have played an important role during the COVID-19 pandemic, and different tests offer different advantages (Figure 2). Specifically, the results of SARS-CoV-2 diagnostic tests (typically qPCR- or LFT-based tests) have been used to estimate the number of infections in the general population, thus informing public health strategies around the globe (574). During the surges caused by the different SARS-CoV-2 variants between 2020 and 2021, government-sponsored efforts to conduct mass testing and to provide free diagnostic tests to the population were common in many parts of the world (713–715). However, recent reports indicate that such public health policies began to change during 2022. For example, the UK plans to dismantle its COVID-19 testing program and scale back its daily reporting requirements (716, 717). A similar approach can be seen in the US, where multiple state-run testing facilities are closing despite some groups advocating to keep them open (718, 719). These ongoing changes in testing policy are likely to have a direct effect on how the pandemic is managed moving forward. SARS-CoV-2 diagnostic tests can be used effectively to slow the spread of the disease only when 1) their results are shared in a timely manner so that they can reasonably be used to approximate the number of infections in the population, and 2) the tests are easily accessible to the general public.

Figure 2: Summary of Diagnostic Technologies used in COVID-19 Testing. The immune response to SARS-CoV-2 means that different diagnostic approaches offer different views of COVID-19. Early in the infection course, viral load is high, so PCR-based testing and EIA testing for antigens are likely to return positives (as indicated by the green bars at the bottom). As viral load decreases, EIA antigen tests become negative, but PCR-based tests can still detect even very low viral loads. From a serological perspective, IgM peaks in the first few weeks following infection and then decreases, while IgG peaks much later in the infection course. Therefore, serological tests are likely to return positives in the first few months following the acute infection course. Additional detail is available above and in several analyses and reviews (1, 678, 705, 720).

Children are one segment of the population that exemplifies the importance of the two aforementioned conditions. This group is particularly vulnerable given ongoing challenges with testing in schools, increased COVID-19 mortality rates, and COVID-19-associated orphanhood. Although there is evidence that routine diagnostic testing reduces the probability of having infectious students in the classroom (721, 722), as of March 2022 an increasing number of schools had stopped or planned to stop contact tracing efforts (723, 724), in line with a CDC announcement that it no longer recommended contact tracing as a strategy to contain the virus (725). An estimated 197 children died in the US from COVID-19 during the first three months of 2022 (726), compared to 735 deaths in the preceding 20 months of the pandemic (727), and millions of children have been orphaned as a consequence of parent or caregiver death due to COVID-19 (728). It is likely that reducing or eliminating testing capacity in schools will directly exacerbate these negative outcomes for the remainder of 2022.

The SARS-CoV-2 diagnostic tools presented in this paper are far less useful if they are difficult to obtain, or if their limited use results in biased data that would lead to ill-informed public health strategies. Under conditions of limited supply, different strategies for testing are needed (729). The pandemic is still an ongoing public health threat and it is worrying that active testing and tracing efforts are a low priority for public health authorities in many countries. If this trend continues, the lack of testing could result in increased morbidity and mortality and an overall failure to manage the pandemic.

4 Identification and Development of Therapeutics for COVID-19

4.1 Abstract

After emerging in China in late 2019, the novel coronavirus SARS-CoV-2 spread worldwide and as of mid-2021 remains a significant threat globally. Only a few coronaviruses are known to infect humans, and only two cause infections similar in severity to SARS-CoV-2: Severe acute respiratory syndrome-related coronavirus, a closely related species of SARS-CoV-2 that emerged in 2002, and Middle East respiratory syndrome-related coronavirus, which emerged in 2012. Unlike the current pandemic, previous epidemics were controlled rapidly through public health measures, but the body of research investigating severe acute respiratory syndrome and Middle East respiratory syndrome has proven valuable for identifying approaches to treating and preventing novel coronavirus disease 2019 (COVID-19). Building on this research, the medical and scientific communities have responded rapidly to the COVID-19 crisis to identify many candidate therapeutics. The approaches used to identify candidates fall into four main categories: adaptation of clinical approaches to diseases with related pathologies, adaptation based on virological properties, adaptation based on host response, and data-driven identification of candidates based on physical properties or on pharmacological compendia. To date, a small number of therapeutics have already been authorized by regulatory agencies such as the Food and Drug Administration (FDA), while most remain under investigation. The scale of the COVID-19 crisis offers a rare opportunity to collect data on the effects of candidate therapeutics. This information provides insight not only into the management of coronavirus diseases, but also into the relative success of different approaches to identifying candidate therapeutics against an emerging disease.

4.2 Importance

The COVID-19 pandemic is a rapidly evolving crisis. With the worldwide scientific community shifting focus onto the SARS-CoV-2 virus and COVID-19, a large number of possible pharmaceutical approaches for treatment and prevention have been proposed. What was known about each of these potential interventions evolved rapidly throughout 2020 and 2021. This fast-paced area of research provides important insight into how the ongoing pandemic can be managed and also demonstrates the power of interdisciplinary collaboration to rapidly understand a virus and match its characteristics with existing or novel pharmaceuticals. As illustrated by the continued threat of viral epidemics during the current millennium, a rapid and strategic response to emerging viral threats can save lives. In this review, we explore how different modes of identifying candidate therapeutics have fared during the COVID-19 pandemic.

4.3 Introduction

The novel coronavirus Severe acute respiratory syndrome-related coronavirus 2 (SARS-CoV-2) emerged in late 2019 and quickly precipitated the worldwide spread of novel coronavirus disease 2019 (COVID-19). COVID-19 presentations range from asymptomatic or mild to severe, and up to 2% of patients diagnosed with COVID-19 die from COVID-19-related complications such as acute respiratory distress syndrome (ARDS) (1). As a result, public health efforts have been critical to mitigating the spread of the virus. However, as of mid-2021, COVID-19 remains a significant worldwide concern (Figure 3), with 2021 cases in some regions surging far above the numbers reported during the initial outbreak in early 2020. While a number of vaccines have been developed and approved in different countries starting in late 2020 (5), vaccination efforts have not proceeded at the same pace throughout the world and are not yet close to ending the pandemic.

Due to the continued threat of the virus and the severity of the disease, the identification and development of therapeutic interventions have emerged as significant international priorities. Prior developments during other recent outbreaks of emerging diseases, especially those caused by human coronaviruses (HCoV), have guided biomedical research into the behavior and treatment of this novel coronavirus infection. However, previous emerging HCoV-related disease threats were controlled much more quickly than SARS-CoV-2 through public health efforts (Figure 3). The scale of the COVID-19 pandemic has made the repurposing and development of pharmaceuticals more urgent than in previous coronavirus epidemics.

4.3.1 Lessons from Prior HCoV Outbreaks

Figure 3: Cumulative global incidence of COVID-19 and SARS. As of March 9, 2023, 676,570,149 COVID-19 cases and 6,881,802 COVID-19 deaths had been reported worldwide since January 22, 2020. A total of 8,432 cases and 813 deaths were reported for SARS from March 17 to July 11, 2003. The SARS outbreak was officially declared contained on July 5, 2003, within 9 months of the virus’s appearance (730). In contrast, SARS-CoV-2 remains a significant global threat more than three years after its emergence. COVID-19 data are from the COVID-19 Data Repository by the Center for Systems Science and Engineering at Johns Hopkins University (731, 732). SARS data are from the WHO (733) and were obtained from a dataset on GitHub (734). See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily.
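As an illustration of how a curve like the one in Figure 3 can be regenerated, the minimal Python sketch below pulls the global confirmed-case time series from the JHU CSSE repository and plots the cumulative total. The URL and column layout reflect the repository’s historical structure and are assumptions that may no longer hold; the figure in this manuscript is produced by the project’s own daily-updated pipeline, not by this code.

```python
# Minimal sketch: plot cumulative global COVID-19 cases from the JHU CSSE
# time-series data. The URL and column layout are assumptions based on the
# repository's historical structure; the archived repository may differ.
import pandas as pd
import matplotlib.pyplot as plt

URL = (
    "https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
    "csse_covid_19_data/csse_covid_19_time_series/"
    "time_series_covid19_confirmed_global.csv"
)

df = pd.read_csv(URL)

# Date columns follow the metadata columns (Province/State, Country/Region, Lat, Long).
date_cols = df.columns[4:]
global_cases = df[date_cols].sum(axis=0)
global_cases.index = pd.to_datetime(global_cases.index)

global_cases.plot(title="Cumulative global COVID-19 cases (JHU CSSE)")
plt.ylabel("Reported cases")
plt.tight_layout()
plt.show()
```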

At first, SARS-CoV-2’s rapid shift from an unknown virus to a significant worldwide threat closely paralleled the emergence of Severe acute respiratory syndrome-related coronavirus (SARS-CoV-1), which was responsible for the 2002-03 SARS epidemic. The first documented case of COVID-19 was reported in Wuhan, China in November 2019, and the disease quickly spread worldwide in the early months of 2020. In comparison, the first case of SARS was reported in November 2002 in the Guangdong Province of China, and it spread within China and then into several countries across continents during the first half of 2003 (250, 340, 730). In fact, genome sequencing quickly revealed the virus causing COVID-19 to be a novel betacoronavirus closely related to SARS-CoV-1 (11).

While similarities between these two viruses are unsurprising given their close phylogenetic relationship, there are also some differences in how the viruses affect humans. SARS-CoV-1 infection is severe, with an estimated case fatality rate (CFR) for SARS of 9.5% (340), while estimates of the CFR associated with COVID-19 are much lower, at up to 2% (1). SARS-CoV-1 is highly contagious and spread primarily by droplet transmission, with a basic reproduction number (R0) of 4 (i.e., each infected person was estimated to infect four others) (340). There is still some controversy over whether SARS-CoV-2 is spread primarily by droplets or is primarily airborne (258, 259, 262, 735). Most estimates of its R0 fall between 2.5 and 3 (1). Based on these estimates, SARS is therefore thought to have been a deadlier and more transmissible disease than COVID-19.
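As a rough illustration of how these fatality estimates relate to the counts in Figure 3, a naive CFR can be computed as reported deaths divided by reported cases. The sketch below uses the cumulative totals cited above; it is only a crude approximation, since it ignores case underascertainment and the lag between diagnosis and death.

```python
# Naive case fatality rate (CFR) = reported deaths / reported cases.
# Figures are the cumulative totals cited in Figure 3; this crude estimate
# ignores case underascertainment and reporting delays.
def naive_cfr(deaths, cases):
    return deaths / cases

sars_cfr = naive_cfr(deaths=813, cases=8_432)               # ~9.6%, close to the 9.5% estimate cited above
covid_cfr = naive_cfr(deaths=6_881_802, cases=676_570_149)  # ~1.0%, within the "up to 2%" range cited above

print(f"SARS naive CFR:     {sars_cfr:.1%}")
print(f"COVID-19 naive CFR: {covid_cfr:.1%}")
```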

With the 17-year difference between these two outbreaks, there were major differences in the tools available for organizing international responses. At the time that SARS-CoV-1 emerged, no new HCoV had been identified in almost 40 years (250). The identity of the virus underlying the SARS disease remained unknown until April 2003, when the SARS-CoV-1 virus was characterized through a worldwide scientific effort spearheaded by the World Health Organization (WHO) (250). In contrast, the SARS-CoV-2 genomic sequence was released on January 3, 2020 (11), only days after the international community became aware of the novel pneumonia-like illness now known as COVID-19. While SARS-CoV-1 belonged to a distinct lineage from the two other HCoVs known at the time of its discovery (340), SARS-CoV-2 is closely related to SARS-CoV-1 and is a more distant relative of another HCoV characterized in 2012, Middle East respiratory syndrome-related coronavirus (19, 736). Significant efforts had already been dedicated to understanding SARS-CoV-1 and MERS-CoV and how they interact with human hosts. Therefore, SARS-CoV-2 emerged under very different circumstances than SARS-CoV-1 in terms of scientific knowledge about HCoVs and the tools available to characterize them.

Despite these apparent advantages for responding to SARS-CoV-2 infections, COVID-19 has caused several orders of magnitude more deaths than SARS did (Figure 3). The SARS outbreak was officially determined to be under control in July 2003, with the success credited to infection management practices such as mask wearing (250). Middle East respiratory syndrome-related coronavirus (MERS-CoV) is still circulating and remains a concern; although the fatality rate is very high at almost 35%, the disease is much less easily transmitted, as its R0 has been estimated to be 1 (340). The low R0, in combination with public health practices, allowed for its spread to be contained (340). Neither of these trajectories is comparable to that of SARS-CoV-2, which remains a serious threat worldwide over a year and a half after the first cases of COVID-19 emerged (Figure 3).

4.3.2 Potential Approaches to the Treatment of COVID-19

Therapeutic interventions can take two approaches: they can mitigate the harm an infection causes to the infected person, or they can hinder the spread of infection within a host by disrupting the viral life cycle. The goal of the former strategy is to reduce the severity and risks of an active infection, while the goal of the latter is to inhibit the replication of the virus once an individual is infected, potentially halting disease progression. Additionally, two major approaches can be used to identify interventions that might be relevant to managing an emerging disease or a novel virus: drug repurposing and drug development. Drug repurposing involves identifying an existing compound that may provide benefits in the context of interest (737). This strategy can focus on either approved or investigational drugs, for which applicable preclinical or safety information may already exist (737). Drug development, on the other hand, provides an opportunity to identify or develop a compound specifically relevant to a particular need, but it is often a lengthy and expensive process characterized by repeated failure (738). Drug repurposing therefore tends to be emphasized in a situation like the COVID-19 pandemic because of the potential for a more rapid response.

From the early months of the pandemic, studies began reporting results from analyses of approved and investigational drugs in the context of COVID-19. The rapid timescale of this response meant that, initially, most evidence came from observational studies, which compare groups of patients who did and did not receive a treatment to determine whether it may have had an effect. This type of study can be conducted rapidly but is subject to confounding. In contrast, randomized controlled trials (RCTs) are the gold-standard method for assessing the effects of an intervention. Here, patients are prospectively and randomly assigned to treatment or control conditions, allowing for much stronger interpretations to be drawn; however, data from these trials take much longer to collect. Both approaches have proven to be important sources of information in the development of a rapid response to the COVID-19 crisis, but as the pandemic draws on and more results become available from RCTs, more definitive answers are becoming available about proposed therapeutics. Interventional clinical trials are currently investigating or have investigated a large number of possible therapeutics and combinations of therapeutics for the treatment of COVID-19 (Figure 4).

Figure 4: COVID-19 clinical trials. Trials data are from the University of Oxford Evidence-Based Medicine Data Lab’s COVID-19 TrialsTracker (739). As of December 31, 2020, there were 6,987 COVID-19 clinical trials, of which 3,962 were interventional. Only study types used in at least five trials are shown. Only interventional trials are analyzed in the figures depicting status, phase, and intervention. Of the interventional trials, 98 had reported results as of December 31, 2020. Recruitment status and trial phase are shown only for interventional trials in which the status or phase is recorded. “Common interventions” refers to interventions used in at least ten trials. Combinations of interventions, such as hydroxychloroquine with azithromycin, are tallied separately from the individual interventions. See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily.
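The filtering logic summarized in this caption can be sketched in a few lines of Python. The input file and column names in the example below (“study_type”, “intervention”) are hypothetical placeholders rather than the actual TrialsTracker export schema.

```python
# Sketch of the filtering logic described in the Figure 4 caption, applied to
# a local export of COVID-19 TrialsTracker data. The file and column names
# are hypothetical; the real export's schema may differ.
import pandas as pd

trials = pd.read_csv("trialstracker_export.csv")  # hypothetical local export

# Keep only interventional trials for the status/phase/intervention panels.
interventional = trials[trials["study_type"] == "Interventional"]

# Tally interventions; combinations (e.g. "Hydroxychloroquine + Azithromycin")
# are distinct entries, counted separately from the individual drugs.
counts = interventional["intervention"].value_counts()

# "Common interventions" = interventions appearing in at least ten trials.
common = counts[counts >= 10]
print(common)
```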

The purpose of this review is to provide an evolving resource tracking the status of efforts to repurpose and develop drugs for the treatment of COVID-19. We highlight four strategies that provide different paradigms for the identification of potential pharmaceutical treatments. The WHO guidelines (81) and a systematic review (740) are complementary living documents that summarize COVID-19 therapeutics.

4.4 Repurposing Drugs for Symptom Management

A variety of symptom profiles with a range of severity are associated with COVID-19 (1). In many cases, COVID-19 is not life threatening. A study of COVID-19 patients in a hospital in Berlin, Germany reported that the highest risk of death was associated with infection-related complications such as sepsis, respiratory complications such as ARDS, and cardiovascular failure or pulmonary embolism (741). Similarly, an analysis in Wuhan, China reported that respiratory failure (associated with ARDS) and sepsis/multi-organ failure accounted for 69.5% and 28.0% of deaths, respectively, among 82 deceased patients (742). COVID-19 is characterized by two phases. The first is the acute response, in which an adaptive immune response to the virus is established and in many cases can mitigate viral damage to organs (743). The second phase characterizes more severe cases of COVID-19. Here, patients experience a cytokine storm, in which excessive cytokine production floods the circulation, leading to systemic inflammation, immune dysregulation, and multiorgan dysfunction that can progress to multiorgan failure and death if untreated (744). ARDS-associated respiratory failure can occur during this phase. Cytokine dysregulation was also identified in patients with SARS (745, 746).

In the early days of the COVID-19 pandemic, physicians sought to identify potential treatments that could benefit patients, and in some cases shared their experiences and advice with the medical community on social media sites such as Twitter (747). These on-the-ground treatment strategies could later be analyzed retrospectively in observational studies or investigated in an interventional paradigm through RCTs. Several notable cases involved the use of small-molecule drugs, which are synthesized compounds of low molecular weight, typically less than 1 kilodalton (kDa) (748). Small-molecule pharmaceutical agents have been a backbone of drug development since the discovery of penicillin in the early twentieth century (749). Penicillin and other antibiotics have long been among the best-known applications of small molecules to therapeutics, but biotechnological developments such as the prediction of protein-protein interactions (PPIs) have facilitated advances in the precise targeting of specific structures using small molecules (749). Small-molecule drugs today encompass a wide range of therapeutics beyond antibiotics, including antivirals, protein inhibitors, and many broad-spectrum pharmaceuticals.

Many treatments considered for COVID-19 have relied on a broad-spectrum approach. These treatments do not specifically target the virus or a particular host receptor, but rather induce broad shifts in host biology that are hypothesized to inhibit the virus. This approach relies on the fact that when a virus enters a host, the host becomes the virus’s environment; the state of the host can therefore influence the virus’s ability to replicate and spread. Administering and assessing broad-spectrum small-molecule drugs on a rapid time course was feasible because they are often already available in hospitals or widely prescribed to outpatients. Another advantage is that these well-established compounds, if found to be beneficial, are often widely available, in contrast to boutique experimental drugs.

In some cases, prior data were available from experiments examining the response of other HCoVs or HCoV infections to a candidate drug. In addition to non-pharmaceutical interventions such as encouraging non-intubated patients to adopt a prone position (750), knowledge about interactions between HCoVs and the human body, much of which emerged from SARS and MERS research over the past two decades, led to the suggestion that a number of common drugs might benefit COVID-19 patients. However, the short duration and low case numbers of prior outbreaks made large-scale clinical studies far less feasible than they are during the COVID-19 pandemic. As a result, COVID-19 has presented the first opportunity to robustly evaluate treatments that were common during prior HCoV outbreaks and determine their clinical efficacy. The first year of the COVID-19 pandemic demonstrated that there are several different trajectories that these clinically suggested, widely available candidates can follow when assessed against a widespread, novel viral threat.

One approach to identifying candidate small-molecule drugs was to look at the treatments used for SARS and MERS. Treatment of SARS and MERS patients prioritized supportive care and symptom management (340). Among the clinical treatments for SARS and MERS that were explored, there was generally a lack of evidence indicating whether they were effective. Most of the supportive treatments for SARS were found inconclusive in meta-analysis (751), and a 2004 review reported that not enough evidence was available to draw conclusions about most treatments (752). However, one strategy adopted from prior HCoV outbreaks is currently the best-known treatment for severe cases of COVID-19. Corticosteroids are broad-spectrum treatments and a well-known, widely available treatment for pneumonia (753–758) that have also been debated as a possible treatment for ARDS (759–764). Corticosteroids were also used and subsequently evaluated as possible supportive care for SARS and MERS. In general, studies and meta-analyses did not identify support for corticosteroids in preventing mortality in these HCoV infections (765–767); however, one found that the effects might be masked by variability in treatment protocols, such as dosage and timing (752). While the corticosteroids most often used to treat SARS were methylprednisolone and hydrocortisone, availability issues for these drugs at the time led to dexamethasone also being used in North America (768).

Dexamethasone (9α-fluoro-16α-methylprednisolone) is a synthetic corticosteroid that binds to glucocorticoid receptors (769, 770). It functions as an anti-inflammatory agent because it binds these receptors with higher affinity than endogenous cortisol (771). Dexamethasone and other steroids are widely available and affordable, and they are often used to treat community-acquired pneumonia (772) as well as chronic inflammatory conditions such as asthma, allergies, and rheumatoid arthritis (773–775). Immunosuppressive drugs such as steroids are typically contraindicated in the setting of infection (776), but because COVID-19 results in hyperinflammation that appears to contribute to mortality via lung damage, immunosuppression may be a helpful approach to treatment (158). A clinical trial that began in 2012 recently reported that dexamethasone may improve outcomes for patients with ARDS (759), but a meta-analysis of the small amount of available data about dexamethasone as a treatment for SARS suggested that it may, in fact, be associated with patient harm (777). However, the findings in SARS may have been biased by the fact that all of the studies examined were observational and a large number of inconclusive studies were not included (778). The question of whether and when to counter hyperinflammation with immunosuppression in the setting of COVID-19 (as in SARS (746)) was an area of intense debate, as the risks of inhibiting antiviral immunity needed to be weighed against the beneficial anti-inflammatory effects (779). As a result, guidelines early in the pandemic typically recommended against treating COVID-19 patients with corticosteroids such as dexamethasone (777).

Despite this initial concern, dexamethasone was evaluated as a potential treatment for COVID-19 (Appendix 1). Dexamethasone treatment comprised one arm of the multi-site Randomized Evaluation of COVID-19 Therapy (RECOVERY) trial in the United Kingdom (780). This study found that the 28-day mortality rate was lower in patients receiving dexamethasone than in those receiving standard of care (SOC). However, this finding was driven by differences in mortality among patients who were receiving mechanical ventilation or supplementary oxygen at the start of the study. The report indicated that dexamethasone reduced 28-day mortality relative to SOC in patients who were ventilated (29.3% versus 41.4%) and among those who were receiving oxygen supplementation (23.3% versus 26.2%) at randomization, but not in patients who were breathing independently (17.8% versus 14.0%). These findings also suggested that dexamethasone may have reduced progression to mechanical ventilation, especially among patients who were receiving oxygen support at randomization. Other analyses have supported the importance of disease course in determining the efficacy of dexamethasone: additional results suggest a greater benefit for patients who had experienced symptoms for at least seven days and for patients who were not breathing independently (781). A meta-analysis that evaluated the results of the RECOVERY trial alongside trials of other corticosteroids, such as hydrocortisone, similarly concluded that corticosteroids may be beneficial to patients with severe COVID-19 who are receiving oxygen supplementation (782). Thus, it seems likely that dexamethasone is useful for treating inflammation associated with immunopathology or cytokine release syndrome (CRS), a condition caused by detrimental overactivation of the immune system (1). In fact, corticosteroids such as dexamethasone are sometimes used to treat CRS (783). Guidelines were quickly updated to encourage the use of dexamethasone in severe cases (784), and this affordable and widely available treatment rapidly became a valuable tool against COVID-19 (785), with demand surging within days of the preprint’s release (786).
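For readers who want to translate the RECOVERY percentages quoted above into effect sizes, the following back-of-the-envelope sketch computes absolute and relative risk reductions for each oxygen-support subgroup from the point estimates alone; it ignores the trial’s age adjustment, group sizes, and confidence intervals, so the results are rough approximations of the published rate ratios.

```python
# Back-of-the-envelope risk reductions from the RECOVERY dexamethasone arm,
# using only the 28-day mortality point estimates quoted above.
subgroups = {
    # subgroup: (mortality with dexamethasone, mortality with standard of care)
    "mechanical ventilation": (0.293, 0.414),
    "oxygen supplementation": (0.233, 0.262),
    "no oxygen support":      (0.178, 0.140),
}

for name, (dex, soc) in subgroups.items():
    arr = soc - dex   # absolute risk reduction (negative = worse with dexamethasone)
    rrr = arr / soc   # relative risk reduction versus standard of care
    print(f"{name:>23}: ARR = {arr:+.1%}, RRR = {rrr:+.1%}")
```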

4.5 Approaches Targeting the Virus

Therapeutics that directly target the virus itself hold the potential to prevent people infected with SARS-CoV-2 from developing potentially damaging symptoms (Figure 5). Such drugs typically fall into the broad category of antivirals. Antiviral therapies hinder the spread of a virus within the host, rather than destroying existing copies of the virus, and these drugs can vary in their specificity to a narrow or broad range of viral targets. This process requires inhibiting the replication cycle of a virus by disrupting one of six fundamental steps (787). In the first of these steps, the virus attaches to and enters the host cell, for example through endocytosis or membrane fusion. Then the virus undergoes uncoating, which is classically defined as the release of viral contents into the host cell. Next, the viral genetic material is replicated during the biosynthesis stage; this occurs in the nucleus for most DNA viruses and in the cytoplasm for most RNA viruses, including coronaviruses. During the assembly stage, viral proteins are translated, allowing new viral particles to be assembled. In the final step, new viruses are released into the extracellular environment. Although antivirals are designed to target a virus, they can also impact other processes in the host and may have unintended effects. Therefore, these therapeutics must be evaluated for both efficacy and safety. As the technology to respond to emerging viral threats has evolved over the past two decades, a number of candidate treatments have been identified for prior viruses that may be relevant to the treatment of COVID-19.

Figure 5: Mechanisms of action for potential therapeutics. Potential therapeutics currently being studied can target the SARS-CoV-2 virus or modify the host environment through many different mechanisms. Here, the relationships between the virus, host cells, and several therapeutics are visualized. Drug names are color-coded according to the grade assigned to them by the Center for Cytokine Storm Treatment & Laboratory’s CORONA Project (788) (Green = A, Lime = B, Orange = C, and Red = D).

Many antiviral drugs are designed to inhibit the replication of viral genetic material during the biosynthesis step. Unlike DNA viruses, which can use host enzymes to propagate themselves, RNA viruses like SARS-CoV-2 depend on their own polymerase, the RNA-dependent RNA polymerase (RdRP), for replication (789, 790). RdRP is therefore a potential target for antivirals against RNA viruses. Disruption of RdRP is the proposed mechanism underlying the treatment of SARS and MERS with ribavirin (791). Ribavirin is an antiviral drug effective against other viral infections that was often used in combination with corticosteroids and sometimes interferon (IFN) medications to treat SARS and MERS (250). However, retrospective analyses of SARS patients and in vitro analyses of the SARS-CoV-1 virus have been inconclusive about its effects (250). While IFNs and ribavirin have shown promise in in vitro analyses of MERS, their clinical effectiveness remains unknown (250). The current COVID-19 pandemic has provided an opportunity to assess the clinical effects of these treatments. As one example, ribavirin was also used in the early days of COVID-19, but a retrospective cohort study comparing patients who did and did not receive ribavirin revealed no effect on the mortality rate (792).

Since nucleotides and nucleosides are the natural building blocks for RNA synthesis, an alternative approach has been to explore nucleoside and nucleotide analogs for their potential to inhibit viral replication. Analogs containing modifications to nucleotides or nucleosides can disrupt key processes including replication (793). A single incorporation does not influence RNA transcription; however, multiple incorporation events lead to the arrest of RNA synthesis (794). One candidate antiviral considered for the treatment of COVID-19 is favipiravir (Avigan), also known as T-705, which was discovered by Toyama Chemical Co., Ltd. (795). It was previously found to be effective at blocking viral amplification in several influenza subtypes as well as in other RNA virus families, such as Flaviviridae and Picornaviridae, through a reduction in plaque formation (796) and viral replication in Madin-Darby canine kidney cells (797). Favipiravir (6-fluoro-3-hydroxy-2-pyrazinecarboxamide) acts as a purine and purine nucleoside analogue that inhibits viral RNA polymerase in a dose-dependent manner across a range of RNA viruses, including influenza viruses (798–802). Biochemical experiments showed that favipiravir is recognized as a purine nucleoside analogue and incorporated into viral RNA. In 2014, the drug was approved in Japan for the treatment of influenza that was resistant to conventional treatments such as neuraminidase inhibitors (803). Though initial analyses of favipiravir in observational studies of its effects on COVID-19 patients were promising, results of two small RCTs suggest that it is unlikely to affect COVID-19 outcomes (Appendix 1).

In contrast, another nucleoside analog, remdesivir, is one of the few treatments against COVID-19 that has received FDA approval. Remdesivir (GS-5734) is an intravenous antiviral that was proposed by Gilead Sciences as a possible treatment for Ebola virus disease. It is metabolized to GS-441524, an adenosine analog that inhibits a broad range of polymerases and evades exonuclease repair, causing chain termination (804–806). Gilead received an emergency use authorization (EUA) for remdesivir from the FDA early in the pandemic (May 2020), and the drug was later found to reduce mortality and recovery time in a double-blind, placebo-controlled, phase III clinical trial performed at 60 trial sites, 45 of which were in the United States (807–810). Subsequently, the WHO Solidarity trial, a large-scale, open-label trial enrolling 11,330 adult in-patients at 405 hospitals in 30 countries around the world, reported no effect of remdesivir on in-hospital mortality, duration of hospitalization, or progression to mechanical ventilation (811). Therefore, additional clinical trials of remdesivir in different patient pools and in combination with other therapies may be needed to refine its use in the clinic and determine the forces driving these differing results. Remdesivir offers proof of principle that SARS-CoV-2 can be targeted at the level of viral replication, since remdesivir targets the viral RNA polymerase with high potency. Identification of such candidates depends on knowledge about the virological properties of a novel threat. However, the contrasting outcomes of remdesivir and favipiravir underscore the fact that drugs with similar mechanisms will not always produce similar results in clinical trials.

4.6 Disrupting Host-Virus Interactions

4.6.1 Interrupting Viral Colonization of Cells

Some of the most widely publicized examples of efforts to repurpose drugs for COVID-19 are broad-spectrum, small-molecule drugs whose mechanisms of action suggested that they might disrupt interactions between SARS-CoV-2 and human host cells (Figure 5). However, the exact outcomes of such treatments are difficult to predict a priori, and there are several examples where early enthusiasm was not borne out in subsequent trials. One of the most famous examples of an analysis of whether a well-known medication could provide benefits to COVID-19 patients came from the assessment of chloroquine (CQ) and hydroxychloroquine (HCQ), which are used for the treatment and prophylaxis of malaria as well as the treatment of lupus erythematosus and rheumatoid arthritis in adults (812). These drugs are lysosomotropic agents, meaning they are weak bases that can pass through the plasma membrane. It was thought that they might provide benefits against SARS-CoV-2 by interfering with the digestion of antigens within the lysosome and inhibiting CD4 T-cell stimulation while promoting the stimulation of CD8 T-cells (813). These compounds also have anti-inflammatory properties (813), can decrease the production of certain key cytokines involved in the immune response, including interleukin-6 (IL-6), and can inhibit the stimulation of Toll-like receptors (TLRs) and TLR signaling (813).

In vitro analyses reported that CQ inhibited cell entry of SARS-CoV-1 (814) and that both CQ and HCQ inhibited viral replication within cultured cells (815), leading to early hope that they might provide similar therapeutic or protective effects in patients. However, while the first publication on the clinical application of these compounds to the inpatient treatment of COVID-19 was very positive (816), it was quickly discredited (817). Over the following months, extensive evidence emerged demonstrating that CQ and HCQ offered no benefits for COVID-19 patients and, in fact, carried the risk of dangerous side effects (Appendix 1). The decisive evidence came when findings from the large-scale RECOVERY trial were released on October 8, 2020. This study enrolled 11,197 hospitalized patients whose physicians believed it would not harm them to participate and used a randomized, open-label design to study the effects of HCQ compared to SOC at 176 hospitals in the United Kingdom (818). Rates of COVID-19-related mortality did not differ between the control and HCQ arms, but patients receiving HCQ were slightly more likely to die due to cardiac events. Patients who received HCQ also had a longer duration of hospitalization than patients receiving usual care and were more likely to progress to mechanical ventilation or death (as a combined outcome). As a result, enrollment in the HCQ arm of the RECOVERY trial was terminated early (819). The story of CQ/HCQ therefore illustrates how initially promising in vitro analyses can fail to translate to clinical usefulness.

A similar story has arisen with the broad-spectrum, small-molecule anthelmintic ivermectin, which is a synthetic analog of avermectin, a bioactive compound produced by the soil bacterium known as Streptomyces avermectinius or Streptomyces avermitilis (820, 821). Avermectin disrupts the ability of parasites to evade the host immune response by preventing glutamate-gated chloride ion channels in their peripheral nervous system from closing, leading to hyperpolarization of neuronal membranes, disruption of neural transmission, and paralysis (820, 822, 823). Ivermectin has been used since the early 1980s to treat endo- and ecto-parasitic infections by helminths, insects, and arachnids in veterinary contexts (820, 824) and since the late 1980s to treat human parasitic infections as well (820, 822). More recent research has indicated that ivermectin might function as a broad-spectrum antiviral by disrupting the trafficking of viral proteins by both RNA and DNA viruses (823, 825, 826), although most of these studies have demonstrated this effect only in vitro (826). The potential for antiviral effects on SARS-CoV-2 was investigated in vitro, and ivermectin was found to inhibit viral replication in a cell line derived from Vero cells (Vero-hSLAM) (827). However, inhibition of viral replication was achieved only at concentrations much higher than those attainable under existing dosage guidelines (828, 829); such concentrations are likely to be associated with significant side effects due to the increased potential for the compound to cross the mammalian blood-brain barrier (830, 831).

Retrospective studies and small RCTs began investigating the effects of standard doses of this low-cost, widely available drug. One retrospective study reported that ivermectin reduced all-cause mortality (832), while another reported no difference in clinical outcomes or viral clearance (833). Small RCTs enrolling fewer than 50 patients per arm have also reported a wide array of positive (834–838) and negative results (839, 840). A slightly larger RCT enrolling 115 patients in two arms reported inconclusive results (841). Hope for the potential of ivermectin peaked with the release of a preprint reporting results of a multicenter, double-blind RCT in which a four-day course of ivermectin was associated with clinical improvement and earlier viral clearance in 400 symptomatic patients and 200 close contacts (842); however, concerns were raised about the integrity of both the data and the paper itself (843, 844), and the study was removed by the preprint server Research Square (845). A similarly sized RCT suggested no effect on the duration of symptoms among 400 patients split evenly across the intervention and control arms (846), and although meta-analyses have reported both null (847, 848) and beneficial (849–856) effects of ivermectin on COVID-19 outcomes, the certainty of this evidence is likely to be low (850). These findings are potentially biased by a small number of low-quality studies, including the preprint that has since been taken down (857), and the authors of one meta-analysis (858) have issued a notice (849) that they will revise their study with the withdrawn study removed. Thus, much like with HCQ/CQ, enthusiasm for research that either has not passed or should not have passed peer review has led to large numbers of patients worldwide receiving treatments that might not have any effect or could even be harmful. Additionally, comments on the now-removed preprint include inquiries into how best to self-administer veterinary ivermectin as a prophylactic (845), and the FDA has posted information explaining why veterinary ivermectin should not be taken by humans concerned about COVID-19 (859). Ivermectin is now one of several candidate therapeutics being investigated in the large-scale TOGETHER (860) and PRINCIPLE (861) clinical trials. The TOGETHER trial, which previously demonstrated no effect of HCQ and lopinavir-ritonavir (862), released preliminary results in early August 2021 suggesting that ivermectin also has no effect on COVID-19 outcomes (863).

While CQ/HCQ and ivermectin are well-known medications that have long been prescribed in certain contexts, investigation of another well-established type of pharmaceutical was facilitated by the fact that it was already being taken by a large number of COVID-19 patients. Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor blockers (ARBs) are among today’s most commonly prescribed medications, often being used to control blood pressure (864, 865). In the United States, for example, they are prescribed well over 100,000,000 times annually (866). Prior to the COVID-19 pandemic, the relationship between ACE2, ACEIs, and SARS had been considered as possible evidence that ACE2 could serve as a therapeutic target (867), and the connection had been explored through in vitro and molecular docking analysis (868) but ultimately was not pursued clinically (869). Data from some animal models suggest that several, but not all, ACEIs and several ARBs increase ACE2 expression in the cells of some organs (870), but clinical studies have not established whether plasma ACE2 expression is increased in humans treated with these medications (871). In this case, rather than introducing ARBs/ACEIs, a number of analyses have investigated whether discontinuing use affects COVID-19 outcomes. An initial observational study of the association of exposure to ACEIs or ARBs with outcomes in COVID-19 was retracted from the New England Journal of Medicine (872) due to concerns related to data availability (873). As RCTs have become available, they have demonstrated no effect of continuing versus discontinuing ARBs/ACEIs on patient outcomes (874, 875) (Appendix 1). Thus, once again, despite a potential mechanistic association with the pathology of SARS-CoV-2 infection, these medications were not found to influence the trajectory of COVID-19 illness.

For medications that are widely known and commonly used, clinical research into their efficacy against a novel threat can be organized very quickly. This feasibility can be a double-edged sword. For example, HCQ and CQ were incorporated into the SOC in many countries early in the pandemic and had to be discontinued once their potential to harm COVID-19 patients became apparent (876, 877). Dexamethasone remains the major success story from this category of repurposed drugs and is likely to have saved a large number of lives since summer 2020 (785).

4.6.2 Manipulating the Host Immune Response

Treatments based on understanding a virus and/or how a virus interacts with the human immune system can fall into two categories: they can interact with the innate immune response, which is likely to be a similar response across viruses, or they can be specifically designed to imitate the adaptive immune response to a particular virus. In the latter case, conservation of structure or behavior across viruses enables exploring whether drugs developed for one virus can treat another. During the COVID-19 pandemic, a number of candidate therapeutics have been explored in these categories, with varied success.

Knowledge gained from trying to understand SARS-CoV-1 and MERS-CoV from a fundamental biological perspective and characterize how they interact with the human immune system provides a theoretical basis for identifying candidate therapies. Biologics are a particularly important class of drugs for efforts to address HCoV through this paradigm. They are produced from components of living organisms or viruses, historically primarily from animal tissues, but have become increasingly feasible to produce as recombinant technologies have advanced (878).

There are many differences on the development side between biologics and synthesized pharmaceuticals such as small-molecule drugs. Typically, biologics are orders of magnitude larger than small-molecule drugs and are catabolized by the body into their amino acid components (879). They are often heat sensitive, and their toxicity can vary, as it is not directly associated with the primary effects of the drug; in general, their physicochemical properties are much less well understood than those of small molecules (879). Biologics include significant medical breakthroughs such as insulin for the management of diabetes and vaccines, as well as monoclonal antibodies (mAbs) and interferons (IFNs), which can be used to target the host immune response after infection.

mAbs have revolutionized the way we treat human diseases and have become some of the best-selling drugs in the pharmaceutical market in recent years (880). There are currently 79 FDA-approved mAbs on the market, including antibodies for viral infections (e.g., ibalizumab for Human immunodeficiency virus and palivizumab for Respiratory syncytial virus) (880, 881). Virus-specific neutralizing antibodies commonly target viral surface glycoproteins or host structures, thereby inhibiting viral entry through receptor binding interference (882, 883). This interference is predicted to reduce the viral load, mitigate disease, and reduce overall hospitalization. mAbs can be designed for a particular virus, and significant advances have been made in the speed at which new mAbs can be identified and produced. Interest in mAbs to reduce infection arose during the SARS and MERS epidemics but was never realized in an approved therapy (884, 885); however, that groundwork allowed mAbs to be quickly considered among the top candidates against COVID-19.

4.6.2.1 Biologics and the Innate Immune Response

Deaths from COVID-19 often occur when inflammation becomes dysregulated following an immune response to the SARS-CoV-2 virus. Therefore, one potential approach to reducing COVID-19 mortality rates is to manage the inflammatory response in severely ill patients. One candidate therapeutic that uses this mechanism is tocilizumab (TCZ). TCZ is a mAb that was developed to manage chronic inflammation caused by the continuous synthesis of the cytokine IL-6 (886). IL-6 is a pro-inflammatory cytokine belonging to the interleukin family, a group of immune system regulators that are primarily responsible for immune cell differentiation. Often used to treat chronic inflammatory conditions such as rheumatoid arthritis (886), TCZ has become a pharmaceutical of interest for the treatment of COVID-19 because of the role IL-6 plays in this disease. It has also been approved to treat CRS caused by CAR-T treatments (887). While the secretion of IL-6 can be associated with chronic conditions, IL-6 is a key player in the innate immune response and is secreted by macrophages in response to the detection of pathogen-associated molecular patterns and damage-associated molecular patterns (886). An analysis of 191 in-patients at two Wuhan hospitals revealed that blood concentrations of IL-6 differed between patients who did and did not recover from COVID-19. Patients who ultimately died had higher IL-6 levels at admission than those who recovered (83). Additionally, IL-6 levels remained higher throughout the course of hospitalization in the patients who ultimately died (83).

Currently, TCZ is being administered either as a monotherapy or in combination with other treatments in 73 interventional COVID-19 clinical trials (Figure 4). A number of retrospective studies have been conducted in several countries (888–893). In general, these studies have reported a positive effect of TCZ on reducing mortality in COVID-19 patients, although due to their retrospective designs, significant limitations are present in all of them (Appendix 1). It was not until February 11, 2021 that a preprint describing preliminary results of the first RCT of TCZ was released as part of the RECOVERY trial (894). TCZ was found to reduce 28-day mortality from 33% in patients receiving SOC alone to 29% in those receiving TCZ. Combined analysis of the RECOVERY trial data with data from smaller RCTs suggested a 13% reduction in 28-day mortality (894). While this initial report did not include the full results expected from the RECOVERY trial, this large-scale RCT provides strong evidence that TCZ may offer benefits for COVID-19 patients. The RECOVERY trial, along with results from several other RCTs (895–899), was cited as support for the EUA issued for TCZ in June 2021 (900). However, the fact that TCZ suppresses the immune response means that it does carry risks for patients, especially a potential risk of secondary infection (Appendix 1).

TCZ is just one example of a candidate drug targeting the host immune response, and specifically excessive inflammation. Interferons (IFNs), a family of cytokines critical to activating the innate immune response against viral infections, have also been investigated. Synairgen has been investigating a candidate drug, SNG001, an IFN-β-1a formulation delivered to the lungs via inhalation (901). The company reported that SNG001 reduced progression to ventilation in a double-blind, placebo-controlled, multi-center study of 101 patients with an average age in the late 50s (902, 903). However, these findings were not supported by the large-scale WHO Solidarity trial, which reported no significant effect of IFN-β-1a on patient survival during hospitalization (811), although differences in the designs of the two studies, and specifically in the severity of illness among enrolled patients, may have influenced their divergent outcomes (Appendix 1). Other biologics influencing inflammation are also being explored (Appendix 1). It is also important that studies focused on inflammation as a possible therapeutic target consider potential differences in baseline inflammation among patients from different backgrounds, which may be caused by differing life experiences (see (339)).

4.6.2.2 Biologics and the Adaptive Immune Response

While TCZ is an example of an mAb focused on managing the innate immune response, other treatments are more specific, targeting the adaptive immune response after an infection. In some cases, treatments can utilize biologics obtained directly from recovered individuals. From the very early days of the COVID-19 pandemic, polyclonal antibodies from convalescent plasma were investigated as a potential treatment for COVID-19 (904, 905). Convalescent plasma was used in prior epidemics, including SARS, Ebola virus disease, and even the 1918 Spanish influenza (904, 906). For more than a century, convalescent plasma transfusion (CPT) has been used with the aim of reducing symptoms and mortality in infected people (906), possibly by accelerating viral clearance (904). However, it seems unlikely that this classic treatment confers any benefit for COVID-19 patients. Several systematic reviews have investigated whether CPT reduced mortality in COVID-19 patients, and although findings from early in the pandemic (up to April 19, 2020) did support the use of CPT (906), the tide has shifted as the body of available literature has grown (907). While titer levels were suggested as a possible determining factor in the success of CPT against COVID-19 (908), the large-scale RECOVERY trial evaluated the effect of administering high-titer plasma specifically and found no effect on mortality or hospital discharge over a 28-day period (909). These results thus suggest that, despite initial optimism and an EUA from the FDA, CPT is unlikely to be an effective therapeutic for COVID-19.

A different narrative is shaping up around the use of mAbs specifically targeting SARS-CoV-2. During the first SARS epidemic in 2002, neutralizing antibodies (nAbs) were found in SARS-CoV-1-infected patients (910, 911). Several studies following up on these findings identified various S-glycoprotein epitopes as the major targets of nAbs against SARS-CoV-1 (912). Coronaviruses use trimeric spike (S) glycoproteins on their surface to bind to the host cell, allowing for cell entry (25, 33). Each S glycoprotein protomer is composed of an S1 domain, which contains the receptor-binding domain (RBD), and an S2 domain. The S1 domain binds to the host cell, while the S2 domain facilitates fusion between the viral envelope and the host cell membrane (912). The sequence identity between the RBDs of SARS-CoV-1 and SARS-CoV-2 is around 74% (913). Due to this high degree of similarity, preexisting antibodies against SARS-CoV-1 were initially considered candidates for neutralizing activity against SARS-CoV-2. While some antibodies developed against the SARS-CoV-1 spike protein showed cross-neutralization activity with SARS-CoV-2 (914, 915), others failed to bind to the SARS-CoV-2 spike protein at relevant concentrations (16). Cross-neutralizing activity depended on whether the epitope recognized by the antibody was conserved between SARS-CoV-1 and SARS-CoV-2 (914).

Technological advances in antibody drug design as well as in structural biology have massively accelerated the discovery of novel antibody candidates and the characterization of the mechanisms by which they interact with their target structures. Within just a year of the structure of the SARS-CoV-2 spike protein being published, an impressive pipeline of monoclonal antibodies targeting SARS-CoV-2 entered clinical trials, with hundreds more candidates in preclinical stages. The first human monoclonal neutralizing antibody specifically targeting the SARS-CoV-2 S glycoprotein was developed using hybridoma technology (916), in which antibody-producing B-cells from mice are fused with myeloma cells to produce a hybrid cell line (the hybridoma) that can be grown in culture. The 47D11 antibody clone was able to cross-neutralize SARS-CoV-1 and SARS-CoV-2. This antibody (now ABVV-47D11) has since entered clinical trials in collaboration with AbbVie. Additionally, an extensive monoclonal neutralizing antibody pipeline has been developed to combat the ongoing pandemic, with over 50 different antibodies in clinical trials (917). Thus far, the monotherapy sotrovimab and two antibody cocktails (bamlanivimab/etesevimab and casirivimab/imdevimab) have been granted EUAs by the FDA.

One of the studied antibody cocktails consists of bamlanivimab and etesevimab. Bamlanivimab (LY-CoV555) is a human mAb that was derived from convalescent plasma donated by a recovered COVID-19 patient, evaluated in research by the National Institute of Allergy and Infectious Diseases (NIAID), and subsequently developed by AbCellera and Eli Lilly. The neutralizing activity of bamlanivimab was initially demonstrated in vivo using a nonhuman primate model (918). Based on these positive preclinical data, Eli Lilly initiated the first human clinical trial of a monoclonal antibody against SARS-CoV-2. The phase 1 trial, which was conducted in hospitalized COVID-19 patients, was completed in August 2020 (919). Etesevimab (LY-CoV016 or JS-016) is also a monoclonal neutralizing antibody against the spike protein of SARS-CoV-2. It was initially developed by Junshi Biosciences and later licensed and developed by Eli Lilly. A phase 1 clinical trial to assess the safety of etesevimab was completed in October 2020 (920). Etesevimab was shown to bind a different epitope on the spike protein than bamlanivimab, suggesting that using the two antibodies as a combination therapy would enhance their clinical utility compared to either monotherapy (921). To assess the efficacy and safety of bamlanivimab alone or in combination with etesevimab for the treatment of COVID-19, a phase 2/3 trial (BLAZE-1) (922) was initiated. The interim analysis of the phase 2 portion suggested that bamlanivimab alone was able to accelerate the reduction in viral load (923). However, more recent data suggest that only the bamlanivimab/etesevimab combination therapy is able to reduce viral load in COVID-19 patients (921). Based on these data, the combination therapy received an EUA for COVID-19 from the FDA in February 2021 (924).

A second therapy, REGN-COV2, consists of casirivimab and imdevimab. Casirivimab (REGN10933) and imdevimab (REGN10987) are two monoclonal antibodies against the SARS-CoV-2 spike protein. They were both developed by Regeneron in a parallel high-throughput screening (HTS) effort to identify neutralizing antibodies from either humanized mice or patient-derived convalescent plasma (925). In these efforts, multiple antibodies were characterized for their ability to bind and neutralize the SARS-CoV-2 spike protein. The investigators hypothesized that an antibody cocktail, rather than each individual antibody, could increase the therapeutic efficacy while minimizing the risk of viral escape. Therefore, the authors tested pairs of individual antibodies for their ability to simultaneously bind the RBD of the spike protein. Based on these data, casirivimab and imdevimab were identified as the lead antibody pair, resulting in the initiation of two clinical trials (926, 927). Data from this phase 1–3 trial published in the New England Journal of Medicine show that the REGN-COV2 antibody cocktail reduced viral load, particularly in patients with high viral load or whose endogenous immune response had not yet been initiated (928). However, in patients who had already initiated an immune response, exogenous addition of REGN-COV2 did not improve the endogenous immune response. Both doses were well tolerated, with no serious adverse events related to the antibody cocktail. Based on these data, the FDA granted an EUA for REGN-COV2 in patients with mild to moderate COVID-19 who are at risk of developing severe disease (929). Ongoing efforts are evaluating the efficacy of REGN-COV2 for improving clinical outcomes in hospitalized patients (926).

Sotrovimab is the most recent mAb to receive an EUA. It was identified in the memory B cells of a survivor of the 2003 SARS epidemic (930) and was found to be cross-reactive with SARS-CoV-2 (915). This cross-reactivity is likely attributable to conservation within the epitope, with 17 out of 22 residues conserved between the two viruses, four conservatively substituted, and one semi-conservatively substituted (915). In fact, these residues are highly conserved among sarbecoviruses, a clade that includes SARS-CoV-1 and SARS-CoV-2 (915). This versatility has led to it being characterized as a “super-antibody” (931), a potent, broadly neutralizing antibody (932). Interim analysis of data from a clinical trial (933) reported high safety and efficacy of this mAb in 583 COVID-19 patients (934). Compared to placebo, sotrovimab reduced progression to the primary endpoint, defined as hospitalization for more than 24 hours or death within 29 days, by 85%. Additionally, rates of adverse events were comparable, and in some cases lower, among patients receiving sotrovimab compared to patients receiving a placebo. Sotrovimab therefore represents a mAb therapeutic that is effective against SARS-CoV-2 and may also be effective against other sarbecoviruses.

Several potential limitations remain in the application of mAbs to the treatment of COVID-19. One of the biggest challenges is identifying antibodies that not only bind their target but also prove beneficial for disease management. Currently, the use of mAbs is limited to people with mild to moderate disease who are not hospitalized, and it has yet to be determined whether they can be used as a successful treatment option for patients with severe COVID-19. While preventing people from developing severe illness provides significant benefits, patients with severe illness are at the greatest risk of death, and therefore therapeutics that provide benefits against severe illness are particularly desirable. It remains to be seen whether mAbs confer any benefits for patients in this category.

Another concern about therapeutics designed to amplify the response to a specific viral target is that they may need to be modified as the virus evolves. With the ongoing global spread of new SARS-CoV-2 variants, there is a growing concern that mutations in the SARS-CoV-2 spike protein could escape antibody neutralization, thereby reducing the efficacy of monoclonal antibody therapeutics and vaccines. A comprehensive mutagenesis screen recently identified several amino acid substitutions in the SARS-CoV-2 spike protein that can prevent antibody neutralization (935). While some mutations result in resistance to only one antibody, others confer broad resistance to multiple mAbs as well as polyclonal human sera, suggesting that some amino acids are “hotspots” for antibody resistance. However, whether the resistance mutations identified confer a fitness advantage was not investigated. Accordingly, an impact on neutralizing efficiency has been reported for the B.1.1.7 (Alpha) variant first identified in the UK and the B.1.351 (Beta) variant first identified in South Africa (936–938). As of June 25, 2021, the CDC recommended a pause in the use of bamlanivimab and etesevimab due to decreased efficacy against the P.1 (Gamma) and B.1.351 (Beta) variants of SARS-CoV-2 (939). While the reported impact on antibody neutralization needs to be confirmed in vivo, it suggests that some adjustments to therapeutic antibody treatments may be necessary to maintain the efficacy reported in previous clinical trials.

Several strategies have been employed to try to mitigate the risk of diminished antibody neutralization. Antibody cocktails such as those already holding an EUA may help overcome the risk of attenuated neutralizing activity of a single monoclonal antibody. These cocktails consist of antibodies that recognize different epitopes on the spike protein, decreasing the likelihood that a single amino acid change can cause resistance to all antibodies in the cocktail. However, neutralizing resistance can emerge even against an antibody cocktail if the individual antibodies target subdominant epitopes (937). Another strategy is to develop broadly neutralizing antibodies that target structures that are highly conserved, as these are less likely to mutate (940, 941), or that target epitopes that are insensitive to mutations (942). Sotrovimab, one such “super-antibody”, is thought to be somewhat robust to neutralization escape (943) and has been found to be effective against all variants assessed as of August 12, 2021 (944). Another antibody (ADG-2) targets a highly conserved epitope that overlaps the hACE2 binding site of all clade 1 sarbecoviruses (945). Prophylactic administration of ADG-2 in an immunocompetent mouse model of COVID-19 resulted in protection against viral replication in the lungs and reduced respiratory viral burden. Since the epitope targeted by ADG-2 represents an Achilles’ heel for clade 1 sarbecoviruses, this antibody, like sotrovimab, might be a promising candidate against all circulating variants as well as emerging SARS-related coronaviruses. To date, it has fared well against the Alpha, Beta, Gamma, and Delta variants (944).

The development of mAbs against SARS-CoV-2 has made it clear that this technology is rapidly adaptable and offers great potential for the response to emerging viral threats. However, additional investigation may be needed to adapt mAb treatments to SARS-CoV-2 as it evolves and potentially to pursue designs that confer benefits for patients at the greatest risk of death. While polyclonal antibodies from convalescent plasma have been evaluated as a treatment for COVID-19, these studies have suggested fewer potential benefits against SARS-CoV-2 than mAbs; convalescent plasma therapy has been thoroughly reviewed elsewhere (904, 905). Thus, advances in biologics for COVID-19 illustrate that an understanding of how the host and virus interact can guide therapeutic approaches. The FDA authorization of two combination mAb therapies, in particular, underscores the potential for this strategy to allow for a rapid response to a novel pathogen. Additionally, while TCZ is not yet as established, this therapy suggests that the strategy of using biologics to counteract the cytokine storm response may provide therapies for the highest-risk patients.

4.7 High-Throughput Screening for Drug Repurposing

The drug development process is slow and costly, and developing compounds specifically targeted to an emerging viral threat is not a practical short-term solution. Screening existing drug compounds for alternative indications is a popular alternative (946–949). HTS has been a goal of pharmaceutical development since at least the mid-1980s (950). Traditionally, phenotypic screens were used to test which compounds would induce a desired change in in vitro or in vivo models, focusing on empirical, function-oriented exploration naïve to molecular mechanism (951–953). In many cases, these screens utilize large libraries that encompass a diverse set of agents varying in many pharmacologically relevant properties (e.g., (954)). The compounds that induced a desired effect could then be followed up on. Around the turn of the millennium, advances in molecular biology allowed HTS to shift towards screening for compounds that interact with a specific molecular target, under the hypothesis that modulating that target would have a desired effect. Both approaches offer pros and cons, and today a popular view is that they are most effective in combination (951, 953, 955).

Today, some efforts to screen compounds for potential repurposing opportunities are experimental, but others use computational HTS approaches (946, 956). Computational drug repurposing screens can take advantage of big data in biology (737) and as a result are much more feasible today than during the height of the SARS and MERS outbreaks in the early 2000s and early 2010s, respectively. Advancements in robotics also facilitate the experimental component of HTS (948). For viral diseases, the goal of drug repurposing is typically to identify existing drugs that have an antiviral effect likely to impede the virus of interest. While both small molecules and biologics can be candidates for repurposing, the significantly lower price of many small molecule drugs means that they are typically more appealing candidates (957).

Depending on the study design, screens vary in how closely they are tied to a hypothesis. As with the candidate therapeutics described above, high-throughput experimental or computational screens can proceed based on a hypothesis. Just as remdesivir was selected as a candidate antiviral because it is a nucleoside analog (958), so too can high-throughput screens select libraries of compounds based on a molecular hypothesis. Conversely, when the library of drugs is selected without regard to a potential mechanism, a screen can be considered hypothesis free (958). Today, both types of analysis are common, whether conducted experimentally or computationally, and both strategies have been applied to identifying candidate therapeutics against SARS-CoV-2.

4.7.1 Hypothesis-Driven Screening

Hypothesis-driven screens often select drugs likely to interact with specific viral or host targets or drugs with desired clinical effects, such as immunosuppressants. Several properties might identify a compound as a candidate for an emerging viral disease. Drugs that interact with a target shared between pathogens (e.g., a viral protease or polymerase) or between a viral pathogen and another illness (e.g., a cancer drug with antiviral potential) are potential candidates, as are drugs thought to interact with additional molecular targets beyond those they were developed for (956). Such research can be driven by in vitro or in silico experimentation. Computational analyses depend on identifying compounds that modulate pre-selected proteins in the virus or host. As a result, they build on experimental research characterizing the molecular features of the virus, host, and candidate compounds (949).

One example of the application of this approach to COVID-19 research comes from work on protease inhibitors. Studies have shown that viral proteases play an important role in the life cycle of viruses, including coronaviruses, by modulating the cleavage of viral polyprotein precursors (959). Several FDA-approved drugs target proteases, such as lopinavir and ritonavir for HIV infection and simeprevir for hepatitis C virus infection. Serine protease inhibitors were previously suggested as possible treatments for SARS and MERS (960). One early study (33) suggested that camostat mesylate, a protease inhibitor, could block the entry of SARS-CoV-2 into lung cells in vitro. Two polyproteins encoded by the SARS-CoV-2 replicase gene, pp1a and pp1ab, are critical for viral replication and transcription (961). These polyproteins must undergo proteolytic processing, which is usually conducted by main protease (MPro), a 33.8-kDa SARS-CoV-2 protease that is therefore fundamental to viral replication and transcription. Therefore, it was hypothesized that compounds targeting MPro could be used to prevent or slow the replication of the SARS-CoV-2 virus.

Both computational and experimental approaches facilitated the identification of compounds that might inhibit SARS-CoV-2 MPro. In 2005, computer-aided design enabled the development of a Michael acceptor inhibitor, now known as N3, to target MPro of SARS-like coronaviruses (962). N3 binds in the substrate binding pocket of MPro in several HCoV (962–965). The structure of N3-bound SARS-CoV-2 MPro has been solved, confirming the computational prediction that N3 would similarly bind in the substrate binding pocket of SARS-CoV-2 (961). N3 was tested in vitro on SARS-CoV-2-infected Vero cells, which belong to a line of cells established from the kidney epithelial cells of an African green monkey, and was found to inhibit SARS-CoV-2 (961). A library of approximately 10,000 compounds was screened in a fluorescence resonance energy transfer assay constructed using SARS-CoV-2 MPro expressed in Escherichia coli (961).

Six leads were identified in this hypothesis-driven screen. In vitro analysis revealed that ebselen had the strongest potency in reducing the viral load in SARS-CoV-2-infected Vero cells (961). Ebselen is an organoselenium compound with anti-inflammatory and antioxidant properties (966). Molecular dynamics analysis further demonstrated the potential for ebselen to bind to MPro and disrupt the protease’s enzymatic functions (967). However, ebselen is likely to be a promiscuous binder, which could diminish its therapeutic potential (961, 968), and compounds with higher specificity may be needed to translate this mechanism effectively to clinical trials. In July 2020, phase II clinical trials commenced to assess the effects of SPI-1005, an investigational drug from Sound Pharmaceuticals that contains ebselen (969), in two cohorts of 60 adults each presenting with moderate (970) or severe (971) COVID-19. Other MPro inhibitors are also being evaluated in clinical trials (972, 973). Pending the results of clinical trials, N3 remains an interesting compound supported by both computational and experimental data, but whether these potential effects will translate to the clinic remains unknown.
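To make the hit-selection step concrete, the sketch below shows the standard arithmetic applied to a fluorescence-based enzymatic screen of this kind: raw well signals are converted to percent inhibition relative to uninhibited and fully inhibited controls, assay quality is summarized with a Z′-factor, and compounds above an inhibition cutoff are flagged for follow-up. The signals, compound names, and the 50% cutoff are hypothetical placeholders, not values from the cited screen.

```python
# Minimal sketch of hit calling in a protease-inhibition screen, assuming a
# plate layout with fully inhibited and uninhibited (DMSO-only) control wells.
# All compound names, signals, and the 50% inhibition cutoff are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

neg_ctrl = rng.normal(1000.0, 30.0, 32)   # uninhibited MPro activity (full signal)
pos_ctrl = rng.normal(100.0, 15.0, 32)    # fully inhibited wells

signals = {"compound_A": 950.0, "compound_B": 310.0, "compound_C": 120.0}

mu_neg, mu_pos = neg_ctrl.mean(), pos_ctrl.mean()

def percent_inhibition(signal):
    """Scale a raw well signal to 0% (uninhibited) - 100% (fully inhibited)."""
    return 100.0 * (mu_neg - signal) / (mu_neg - mu_pos)

# Z'-factor: a standard measure of assay quality (>0.5 is generally considered robust).
z_prime = 1 - 3 * (neg_ctrl.std() + pos_ctrl.std()) / abs(mu_neg - mu_pos)
print(f"Z' = {z_prime:.2f}")

hits = {name: percent_inhibition(s) for name, s in signals.items()
        if percent_inhibition(s) > 50}
print("hits (>50% inhibition):", {k: round(v, 1) for k, v in hits.items()})
```

In an actual campaign, flagged hits would then be confirmed in dose-response experiments, as was done for ebselen and the other leads described above.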

4.7.2 Hypothesis-Free Screening

Hypothesis-free screens use a discovery-driven approach, in which screens are not targeted to specific viral proteins, host proteins, or a desired clinical modulation. Hypothesis-free drug screening began roughly two decades ago with the experimental testing of drug libraries. Today, as in many other areas of biology, in silico analyses have become increasingly popular and feasible thanks to advances in biological big data (958, 974). Many efforts have collected data about interactions between drugs and SARS-CoV-2 and about the host genomic response to SARS-CoV-2 exposure, allowing for hypothesis-free computational screens that seek to identify new candidate therapeutics. Such screens utilize a systems biology paradigm to extrapolate the effect of a drug against a virus based on the host’s interactions with both the virus and the drug (949).

Resources such as the COVID-19 Drug and Gene Set Library, which at the time of its publication contained 1,620 drugs sourced from 173 experimental and computational drug sets and 18,676 human genes sourced from 444 gene sets (975), facilitate such discovery-driven approaches. Analysis of these databases indicated that some drugs had been identified as candidates across multiple independent analyses, including high-profile candidates such as CQ/HCQ and remdesivir (975). Computational screening efforts can then mine databases and other resources to identify potential PPIs among the host, the virus, and established and/or experimental drugs (976). Subject matter expertise from human users may be integrated to varying extents depending on the platform (e.g., (976, 977)). These resources have allowed studies to identify potential therapeutics for COVID-19 without an a priori reason for selecting them.
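A minimal sketch of the consensus logic such a resource enables is shown below: each independent analysis contributes a set of nominated drugs, and drugs are ranked by how many analyses nominate them. The set names and contents are illustrative stand-ins rather than records from the COVID-19 Drug and Gene Set Library.

```python
# Minimal sketch of a consensus analysis over independent candidate-drug sets,
# in the spirit of the COVID-19 Drug and Gene Set Library; the set contents
# below are illustrative placeholders, not the library's actual records.
from collections import Counter

drug_sets = {
    "in_vitro_screen_1": {"remdesivir", "hydroxychloroquine", "lopinavir"},
    "proteomics_screen": {"remdesivir", "haloperidol"},
    "literature_mining": {"hydroxychloroquine", "remdesivir", "favipiravir"},
}

# Count how many independent analyses nominated each drug.
consensus = Counter(drug for s in drug_sets.values() for drug in s)
for drug, n in consensus.most_common():
    print(f"{drug}: nominated by {n}/{len(drug_sets)} sets")
```

Recurrence across independent sets is, of course, only a starting point; as discussed below, shared nominations can reflect shared confounds as easily as shared mechanisms.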

One example of a hypothesis-free screen for COVID-19 drugs comes from a PPI-network-based analysis that was published early in the pandemic (193). Here, researchers cloned the proteins expressed by SARS-CoV-2 in vitro and quantified 332 viral-host PPIs using affinity purification mass spectrometry (193). They identified two SARS-CoV-2 proteins (Nsp6 and Orf9c) that interacted with the host Sigma-1 and Sigma-2 receptors. Sigma receptors are located in the endoplasmic reticulum of many cell types, and type 1 and type 2 Sigma receptors have overlapping but distinct affinities for a variety of ligands (978). Molecules interacting with the Sigma receptors were then analyzed and found to have an effect on viral infectivity in vitro (193). A follow-up study evaluated the effect of perturbing these 332 proteins in two cell lines, A549 and Caco-2, using knockdown and knockout methods, respectively, and found that the replication of SARS-CoV-2 in both lines depended on the expression of SIGMAR1, the gene that encodes the Sigma-1 receptor (979). Following these results, drugs interacting with Sigma receptors were suggested as candidates for repurposing for COVID-19 (e.g., (980)). Because many well-known and affordable drugs interact with the Sigma receptors (193, 981), they became a major focus of drug repurposing efforts. Some of the drugs suggested by the apparent promise of Sigma receptor-targeting drugs were already being investigated at the time. HCQ, for example, is a ligand of both Sigma-1 and Sigma-2 receptors and was already being explored as a candidate therapeutic for COVID-19 (193). Thus, this computational approach yielded interest in drugs whose antiviral activity was supported by initial in vitro analyses.
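The two-hop reasoning behind this kind of PPI-based screen can be sketched in a few lines: map each viral protein to the host proteins it binds, then map those host proteins to drugs known to target them. The interaction and drug-target tables below are tiny illustrative examples (Nsp6, Orf9c, SIGMAR1, and TMEM97 appear in the cited studies, but the specific pairings and drug-target assignments shown here are simplified assumptions).

```python
# Minimal sketch of the two-hop logic behind a PPI-based repurposing screen:
# viral protein -> interacting host protein -> drug known to target that host
# protein. The maps below are small illustrative examples, not the published
# interactome or a curated drug-target database.
virus_host_ppi = {
    "Nsp6": ["SIGMAR1"],    # Sigma-1 receptor
    "Orf9c": ["TMEM97"],    # Sigma-2 receptor
}

drug_targets = {
    "haloperidol": ["SIGMAR1", "TMEM97"],
    "fluvoxamine": ["SIGMAR1"],
    "dexamethasone": ["NR3C1"],
}

# Invert the drug-target map, then walk viral protein -> host target -> drug.
target_to_drugs = {}
for drug, targets in drug_targets.items():
    for t in targets:
        target_to_drugs.setdefault(t, set()).add(drug)

candidates = {
    viral: sorted(set().union(*(target_to_drugs.get(h, set()) for h in hosts)))
    for viral, hosts in virus_host_ppi.items()
}
print(candidates)  # {'Nsp6': ['fluvoxamine', 'haloperidol'], 'Orf9c': ['haloperidol']}
```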

Follow-up research, however, called into question whether the emphasis on drugs interacting with Sigma receptors might be based on a spurious association (982). This study built on the prior work by examining whether antiviral activity among compounds correlated with their affinity for the Sigma receptors and found that it did not. The study further demonstrated that cationic amphiphilicity was a shared property among many of the candidate drugs identified through both computational and phenotypic screens and that it was likely to be the source of many compounds’ proposed antiviral activity (982). Cationic amphiphilicity is associated with the induction of phospholipidosis, a condition in which phospholipids accumulate in the lysosome (983). Phospholipidosis can disrupt viral replication by inhibiting lipid processing (984) (see the discussion of HCQ in Appendix 1). However, phospholipidosis is known to translate poorly from in vitro models to in vivo models or clinical applications. Thus, this finding suggested that these screens were identifying compounds that shared a physicochemical property rather than a specific target (982). The authors further demonstrated that antiviral activity against SARS-CoV-2 in vitro was correlated with the induction of phospholipidosis for drugs both with and without cationic amphiphilicity, supporting the idea that cationic amphiphilicity was being detected as a proxy for the shared effect of phospholipidosis (982). They also demonstrated that phospholipidosis-inducing drugs were not effective at preventing viral propagation in vivo in a murine model of COVID-19 (982). Therefore, removing hits that induce phospholipidosis from computational and in vitro experimental repurposing screens (e.g., (985)) may help emphasize those that are more likely to provide clinical benefits. This work illustrates the importance of considering confounding variables in computational screens, a principle that has been incorporated into more traditional approaches to drug development (986).
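The confounder check at the heart of this analysis amounts to asking which property antiviral potency actually tracks. A minimal sketch is shown below using synthetic data in which activity is driven by phospholipidosis rather than Sigma-receptor affinity; the real analysis used measured compound data rather than simulated values.

```python
# Minimal sketch of the confounder check described above: test whether
# antiviral potency tracks Sigma-receptor affinity or phospholipidosis
# induction across a compound panel. All values are synthetic placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 20
phospholipidosis = rng.uniform(0, 100, n)    # % cells with lipid accumulation
sigma_affinity = rng.uniform(0, 100, n)      # arbitrary binding-affinity scale
# Simulated antiviral activity driven by phospholipidosis, not Sigma affinity.
antiviral_activity = 0.8 * phospholipidosis + rng.normal(0, 10, n)

rho_pl, p_pl = spearmanr(antiviral_activity, phospholipidosis)
rho_sig, p_sig = spearmanr(antiviral_activity, sigma_affinity)
print(f"activity vs phospholipidosis: rho={rho_pl:.2f} (p={p_pl:.3g})")
print(f"activity vs Sigma affinity:   rho={rho_sig:.2f} (p={p_sig:.3g})")
```

A strong correlation with the confounding property and a weak correlation with the nominal target, as in the cited study, is the signature of a spurious mechanistic interpretation.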

One drug that acts on Sigma receptors does, however, remain a candidate for the treatment of COVID-19. Several psychotropic drugs target Sigma receptors in the central nervous system and thus attracted interest as potential COVID-19 therapeutics following the findings of two host-virus PPI studies (987). For several of these drugs, the in vitro antiviral activity (979) was not correlated with their affinity for the Sigma-1 receptor (982, 987) but was correlated with phospholipidosis (982). However, fluvoxamine, a selective serotonin reuptake inhibitor that is a particularly potent Sigma-1 receptor agonist (987), has shown promise for preventing severe COVID-19 in a preliminary analysis of data from the large-scale TOGETHER trial (863). As of August 6, 2021, this trial had collected data from over 1,400 patients in the fluvoxamine comparison, approximately half of whom received a placebo (863). Only 74 patients in the fluvoxamine group had progressed to hospitalization for COVID-19, compared to 107 in the placebo group, corresponding to a relative risk of 0.69; additionally, the relative risk of mortality between the two groups was calculated at 0.71. These findings support the results of small clinical trials that found fluvoxamine reduced clinical deterioration relative to a placebo (988, 989). However, the ongoing therapeutic potential of fluvoxamine does not contradict the finding that hypothesis-free screening hits can be driven by confounding factors. The authors point out that fluvoxamine’s relevance may not be limited to antiviral activity, as it has a potential immunomodulatory mechanism (988): it has been found to be protective against septic shock in an in vivo mouse model (990), and it may also exert an antiviral effect (991). Thus, Sigma-1 receptor activity may contribute to fluvoxamine’s potential effects in treating COVID-19, but it is not the only mechanism by which this drug could interfere with disease progression.
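As a quick arithmetic check of the reported interim result, the sketch below recomputes the relative risk of hospitalization assuming roughly equal arm sizes of about 700 patients each; the exact denominators are not given above, so they are approximations.

```python
# Back-of-the-envelope check of the reported TOGETHER interim relative risk,
# assuming roughly equal arm sizes (~700 per arm); the exact denominators are
# not stated in the text, so these are approximations.
def relative_risk(events_treat, n_treat, events_ctrl, n_ctrl):
    return (events_treat / n_treat) / (events_ctrl / n_ctrl)

n_per_arm = 700  # approximate; "over 1,400 patients" split roughly in half
rr_hosp = relative_risk(74, n_per_arm, 107, n_per_arm)
print(f"relative risk of hospitalization ~ {rr_hosp:.2f}")  # ~0.69, matching the report
```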

4.7.3 Potential and Limitations of High-Throughput Analyses

Computational screening allows for a large number of compounds to be evaluated to identify those most likely to display a desired behavior or function. This approach can be guided by a hypothesis or can aim to discover underlying characteristics that produce new hypotheses about the relationship between a host, a virus, and candidate pharmaceuticals. The examples outlined above illustrate that HTS-based evaluations of drug repurposing can potentially provide valuable insights. Computational techniques were used to design compounds targeting MPro based on an understanding of how this protease aids viral replication, and MPro inhibitors remain promising candidates (948), although clinical trial data are not yet available. Similarly, computational analysis correctly identified the Sigma-1 receptor as a protein of interest. Although the process of identifying which drugs might modulate this interaction led to an emphasis on candidates that ultimately have not been supported, fluvoxamine remains an appealing candidate. The difference between the preliminary evidence for fluvoxamine and that for other drugs interacting with Sigma receptors underscores a major critique of hypothesis-free HTS in particular: while these approaches allow for brute-force comparison of a large number of compounds against a virus of interest, they lose the element of expertise that is associated with most successes in drug repurposing (958).

There are also practical limitations to these methods. One concern is that computational analyses inherently depend on the quality of the data being evaluated. The urgency of the COVID-19 pandemic led many research groups to pivot towards computational HTS research without familiarity with best practices in this area (948). As a result, there is an excessive amount of information available from computational studies (992), but not all of it is high-quality. Additionally, the literature used to identify and validate targets can be difficult to reproduce (993), which may pose challenges to target-based experimental screening and to in silico screens. Some efforts to repurpose antivirals have focused on host, rather than viral, proteins (949), which might be expected to translate poorly in vivo if the targeted proteins serve essential functions in the host. Concerns about the practicality of hypothesis-free screens to gain novel insights are underscored by the fact that very few or possibly no success stories have emerged from hypothesis-free screens over the past twenty years (958). These findings suggest that data-driven research can be an important component of the drug repurposing ecosystem, but that drug repurposing efforts that proceed without a hypothesis, an emphasis on biological mechanisms, or an understanding of confounding effects may not produce viable candidates.

4.8 Considerations in Balancing Different Approaches

The approaches described here offer a variety of advantages and limitations in responding to a novel viral threat and building on existing bodies of knowledge in different ways. Medicine, pharmacology, basic science (especially virology and immunology), and biological data science can all provide different insights and perspectives for addressing the challenging question of which existing drugs might provide benefits against an emerging viral threat. A symptom management-driven approach allows clinicians to apply experience with related diseases or related symptoms to organize a rapid response aimed at saving the lives of patients already infected with a new disease. Oftentimes, the pharmaceutical agents that are applied are small-molecule, broad-spectrum pharmaceuticals that are widely available and affordable to produce, and they may already be available for other purposes, allowing clinicians to administer them to patients quickly either with an EUA or off-label. In this vein, dexamethasone has emerged as the strongest treatment against severe COVID-19 (Table 1).

Alternatively, many efforts to repurpose drugs for COVID-19 have built on information gained through basic scientific research of HCoV. Understanding how related viruses function has allowed researchers to identify possible pharmacological strategies to disrupt pathogenesis (Figure 5). Some of the compounds identified through these methods include small-molecule antivirals, which can be boutique and experimental medications like remdesivir (Table 1). Other candidate drugs that intercept host-pathogen interactions include biologics, which imitate the function of endogenous host compounds. Most notably, several mAbs that have been developed (casirivimab, imdevimab, bamlanivimab and etesevimab) or repurposed (sotrovimab, tocilizumab) have now been granted EUAs (Table 1). Although not discussed here, several vaccine development programs have also met huge success using a range of strategies (5, 6).

Table 1: Summary table of candidate therapeutics examined in this manuscript. “Grade” is the rating given to each treatment by the Systematic Tracker of Off-label/Repurposed Medicines Grades (STORM) maintained by the Center for Cytokine Storm Treatment & Laboratory (CSTL) at the University of Pennsylvania (788). A grade of A indicates that a treatment is considered effective, B that all or most RCTs have shown positive results, C that RCT data are not yet available, and D that multiple RCTs have produced negative results. Treatments not in the STORM database are indicated as N/A. FDA status is also provided where available. The evidence available is based on the progression of the therapeutic through the pharmaceutical development pipeline, with RCTs as the most informative source of evidence. The effectiveness is summarized based on the current available evidence; large trials such as RECOVERY and Solidarity are weighted heavily in this summary. This table was last updated on August 20, 2021.
| Treatment | Grade | Category | FDA Status | Evidence Available | Suggested Effectiveness |
| --- | --- | --- | --- | --- | --- |
| Dexamethasone | A | Small molecule, broad spectrum | Used off-label | RCT | Supported: RCT shows improved outcomes over SOC, especially in severe cases such as CRS |
| Remdesivir | A | Small molecule, antiviral, adenosine analog | Approved for COVID-19 (and EUA for combination with baricitinib) | RCT | Mixed: Conflicting evidence from large WHO-led Solidarity trial vs US-focused RCT and other studies |
| Tocilizumab | A | Biologic, monoclonal antibody | EUA | RCT | Mixed: It appears that TCZ may work well in combination with dexamethasone in severe cases, but not as monotherapy |
| Sotrovimab | N/A | Biologic, monoclonal antibody | EUA | RCT | Supported: Phase 2/3 clinical trial showed reduced hospitalization/death |
| Bamlanivimab and etesevimab | B & N/A | Biologic, monoclonal antibodies | EUA | RCT | Supported: Phase 2 clinical trial showed reduction in viral load, but FDA pause recommended because may be less effective against Delta variant |
| Casirivimab and imdevimab | N/A | Biologic, monoclonal antibodies | EUA | RCT | Supported: Reduced viral load at interim analysis |
| Fluvoxamine | B | Small molecule, Sigma-1 receptor agonist | N/A | RCT | Supported: Support from two small RCTs and preliminary support from interim analysis of TOGETHER |
| SNG001 | B | Biologic, interferon | None | RCT | Mixed: Support from initial RCT but no effect found in WHO’s Solidarity trial |
| MPro Protease Inhibitors | N/A | Small molecule, protease inhibitor | None | Computational prediction, in vitro studies | Unknown |
| ARBs & ACEIs | C | Small molecule, broad spectrum | None | Observational studies and some RCTs | Not supported: Observational study retracted, RCTs suggest no association |
| Favipiravir | D | Small molecule, antiviral, nucleoside analog | None | RCT | Not supported: RCTs do not show significant improvements for individuals taking this treatment; good safety profile |
| HCQ/CQ | D | Small molecule, broad spectrum | None | RCT | Not supported, possibly harmful: Non-blinded RCTs showed no improvement over SOC; safety profile may be problematic |
| Convalescent plasma transfusion | D | Biologic, polyclonal antibodies | EUA | RCT | Mixed: Supported in small trials but not in large-scale RECOVERY trial |
| Ivermectin | D | Small molecule, broad spectrum | None | RCT | Mixed: Mixed results from small RCTs, major supporting RCT now withdrawn, preliminary results of large RCT (TOGETHER) suggest no effect on emergency room visits or hospitalization for COVID-19 |

All of the small-molecule drugs evaluated and most of the biologics are repurposed, and thus hinge on a theoretical understanding of how the virus interacts with a human host and how pharmaceuticals can be used to modify those interactions, rather than being designed specifically against SARS-CoV-2 or COVID-19. As a result, significant attention has been paid to computational approaches that automate the identification of potentially desirable interactions. However, work in COVID-19 has made it clear that relevant compounds can be masked by confounds, and spurious associations can drive investment in candidate therapeutics that are unlikely to translate to the clinic. Such spurious hits are especially likely to impact hypothesis-free screens. However, hypothesis-free screens may still be able to contribute to the drug discovery or repurposing ecosystem, assuming the computational arm of HTS follows the same trends seen in its experimental arm. In 2011, a landmark study in drug discovery demonstrated that although more new drugs were discovered using target-based rather than phenotypic approaches, the majority of drugs with a novel molecular mechanism of action (MMOA) were identified in phenotypic screens (994). This pattern applied only to first-in-class drugs, with most follower drugs produced by target-based screening (952). These findings suggest that target-based drug discovery is more successful when building on a known MMOA, and that modulating a target is most valuable when the target is part of a valuable MMOA (953). Building on this, many within the field have suggested that mechanism-informed phenotypic investigations may be the most useful approach to drug discovery (951, 953, 955). As it stands, data-driven efforts to identify patterns in the results of computational screens allowed researchers to notice the shared property of cationic amphiphilicity among many of the hits from computational screening analyses (982). While easier said than done, efforts to fill in the black box underlying computational HTS and to recognize patterns among the identified compounds would help move data-oriented drug repurposing efforts in this direction.

The unpredictable nature of success and failure in drug repurposing for COVID-19 thus highlights one of the tenets of phenotypic screening: there are many “unknown unknowns”, and a promising mechanism at the level of an MMOA will not necessarily propagate up to the pathway, cellular, or organismal level (951). Even when apparently mechanistically relevant drugs exist, identifying effective treatments for a new viral disease is extremely challenging. Targets of repurposed drugs are often non-specific, meaning that the MMOA can appear relevant to COVID-19 without a therapeutic or prophylactic effect being observed in clinical trials. The difference in the current status of remdesivir and favipiravir as treatments for COVID-19 (Table 1) underscores how difficult it is to predict whether a specific compound will produce a desired effect, even when the mechanisms are similar. Furthermore, the fact that many candidate COVID-19 therapeutics were ultimately identified because of their shared propensity to induce phospholipidosis underscores how challenging it can be to identify a mechanism in silico or in vitro that will translate to a successful treatment. While significant progress has been made thus far in the pandemic, the therapeutic landscape is likely to continue to evolve as more results become available from clinical trials and as efforts to develop novel therapeutics for COVID-19 progress.

4.9 Towards the Next HCoV Threat

Only very limited testing of candidate therapies was feasible during the SARS and MERS epidemics, and as a result, few treatments were available at the outset of the COVID-19 pandemic. Even corticosteroids, which were used to treat SARS patients, were a controversial therapeutic prior to the release of the results of the large RECOVERY trial. The scale and duration of the COVID-19 pandemic have made it possible to conduct large, rigorous RCTs such as RECOVERY, Solidarity, TOGETHER, and others. As results from these trials have continued to emerge, it has become clear that small clinical trials often produce spurious results. In the case of HCQ/CQ, the therapeutic had already attracted so much attention based on small, preliminary (and in some cases, methodologically concerning) studies that it took the results of multiple large studies before attention began to be redirected to more promising candidates (995). In fact, most COVID-19 clinical trials lack the statistical power to reliably test their hypotheses (996, 997). In the face of an urgent crisis like COVID-19, the desire to act quickly is understandable, but it is imperative that studies maintain strict standards of scientific rigor (948, 986), especially given the potential dangers of politicization, as illustrated by HCQ/CQ (998). Potential innovations in clinical trial structure, such as adaptive clinical trials with master protocols (999) or the sharing of data among small clinical trials (997), may help to address future crises and to bolster the results from smaller studies, respectively.
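To illustrate why statistical power is such a persistent problem, the sketch below estimates the sample size needed to detect a modest absolute reduction in mortality (25% vs. 20%) at conventional thresholds. The proportions are illustrative rather than drawn from any specific trial, and the statsmodels package is assumed to be available.

```python
# Rough illustration of why small COVID-19 trials are underpowered: the sample
# size needed per arm to detect a 25% -> 20% drop in mortality at alpha = 0.05
# and 80% power. The proportions are illustrative placeholders.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.25, 0.20)   # Cohen's h for the two proportions
n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05, power=0.80)
print(f"~{n_per_arm:.0f} patients per arm (~{2 * n_per_arm:.0f} total)")  # roughly 550 per arm
```

Trials enrolling a few dozen patients, as many early HCQ/CQ studies did, therefore had little chance of detecting effects of this magnitude.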

In the long term, new drugs specific to the treatment of COVID-19 may also enter development. Development of novel drugs is likely to be guided by what is known about the pathogenesis and molecular structure of SARS-CoV-2. For example, understanding the various structural components of SARS-CoV-2 may allow for the development of small molecule inhibitors of those components. Crystal structures of the SARS-CoV-2 main protease have been solved (961, 1000). Much work remains to be done to determine crystal structures of other viral components, understand the relative utility of targeting different viral components, perform additional small molecule inhibitor screens, and determine the safety and efficacy of potential inhibitors. While still nascent, work in this area is promising. Over the longer term, this approach and others may lead to the development of novel therapeutics specifically for COVID-19 and SARS-CoV-2. Such efforts are likely to prove valuable in managing future emergent HCoV, just as research from the SARS and MERS epidemics has provided a basis for the COVID-19 response.

5 Appendix: Identification and Development of Therapeutics for COVID-19

5.1 Dexamethasone

In order to understand how dexamethasone reduces inflammation, it is necessary to consider the stress response broadly. In response to stress, corticotropin‐releasing hormone stimulates the release of neurotransmitters known as catecholamines, such as epinephrine, and steroid hormones known as glucocorticoids, such as cortisol (1001, 1002). While catecholamines are often associated with the fight-or-flight response, the specific role that glucocorticoids play is less clear, although they are thought to be important to restoring homeostasis (1003). Immune challenge is a stressor that is known to interact closely with the stress response. The immune system can therefore interact with the central nervous system; for example, macrophages can both respond to and produce catecholamines (1001). Additionally, the production of both catecholamines and glucocorticoids is associated with inhibition of proinflammatory cytokines such as IL-6, IL-12, and tumor necrosis factor-α (TNF‐α) and the stimulation of anti-inflammatory cytokines such as IL-10, meaning that the stress response can regulate inflammatory immune activity (1002). Administration of dexamethasone has been found to correspond to dose-dependent inhibition of IL-12 production, but not to affect IL-10 (1004); the fact that this relationship could be disrupted by administration of a glucocorticoid-receptor antagonist suggests that it is regulated by the receptor itself (1004). Thus, the administration of dexamethasone for COVID-19 is likely to simulate the release of glucocorticoids endogenously during stress, resulting in binding of the synthetic steroid to the glucocorticoid receptor and the associated inhibition of the production of proinflammatory cytokines. In this model, dexamethasone reduces inflammation by stimulating the biological mechanism that reduces inflammation following a threat such as immune challenge.

Initial support for dexamethasone as a treatment for COVID-19 came from the United Kingdom’s RECOVERY trial (780), which assigned over 6,000 hospitalized COVID-19 patients to the standard of care (SOC) or treatment (dexamethasone) arms of the trial at a 2:1 ratio. At the time of randomization, some patients were ventilated (16%), others were on non-invasive oxygen (60%), and others were breathing independently (24%). Patients in the treatment arm were administered dexamethasone either orally or intravenously at 6 mg per day for up to 10 days. The primary endpoint was the patient’s status at 28 days post-randomization (mortality, discharge, or continued hospitalization), and secondary outcomes analyzed included the progression to invasive mechanical ventilation over the same period. The 28-day mortality rate was lower in the treatment group than in the SOC group (21.6% vs. 24.6%, p < 0.001). However, the effect was driven by improvements in patients receiving mechanical ventilation or supplementary oxygen. One possible confounder is that patients receiving mechanical ventilation tended to be younger than patients who were not receiving respiratory support (by 10 years on average) and to have had symptoms for a longer period. However, adjusting for age did not change the conclusions, although the duration of symptoms was found to be significantly associated with the effect of dexamethasone administration. Thus, this large, randomized, multi-site, albeit not placebo-controlled, study suggests that administration of dexamethasone to patients who are unable to breathe independently may significantly improve survival outcomes. Additionally, dexamethasone is a widely available and affordable medication, raising the hope that it could be made available to COVID-19 patients globally.
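For context, the sketch below converts the reported overall 28-day mortality proportions into an absolute risk reduction, an unadjusted risk ratio, and a number needed to treat; note that the trial’s published, age-adjusted rate ratios differ somewhat from this unadjusted calculation.

```python
# Simple effect-size arithmetic for the overall RECOVERY dexamethasone result
# (28-day mortality of 21.6% vs 24.6%); this is an unadjusted illustration,
# not the trial's age-adjusted analysis.
p_dex, p_soc = 0.216, 0.246

arr = p_soc - p_dex     # absolute risk reduction
rr = p_dex / p_soc      # unadjusted risk ratio
nnt = 1 / arr           # number needed to treat
print(f"ARR = {arr:.3f}, unadjusted RR = {rr:.2f}, NNT ~ {nnt:.0f}")
# ARR = 0.030, unadjusted RR = 0.88, NNT ~ 33
```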

It is not surprising that administration of an immunosuppressant would be most beneficial in severe cases where the immune system was dysregulated towards inflammation. However, it is also unsurprising that care must be taken in administering an immunosuppressant to patients fighting a viral infection. In particular, the concern has been raised that treatment with dexamethasone might increase patient susceptibility to concurrent (e.g., nosocomial) infections (1005). Additionally, the drug could potentially slow viral clearance and inhibit patients’ ability to develop antibodies to SARS-CoV-2 (777, 1005), with the lack of data about viral clearance being put forward as a major limitation of the RECOVERY trial (1006). Furthermore, dexamethasone has been associated with side effects that include psychosis, glucocorticoid-induced diabetes, and avascular necrosis (777), and the RECOVERY trial did not report outcomes with enough detail to be able to determine whether they observed similar complications. The effects of dexamethasone have also been found to differ among populations, especially in high-income versus middle- or low-income countries (1007). However, since the RECOVERY trial’s results were released, strategies have been proposed for administering dexamethasone alongside more targeted treatments to minimize the likelihood of negative side effects (1005). Given the available evidence, dexamethasone is currently the most promising treatment for severe COVID-19.

5.2 Favipiravir

The effectiveness of favipiravir for treating patients with COVID-19 is currently under investigation. Evidence that the drug inhibits viral RNA polymerase is based on time-of-drug-addition studies that found that viral loads were reduced when favipiravir was added early post-infection (798, 801, 802). An open-label, nonrandomized, before-after controlled study of favipiravir for COVID-19 included 80 COVID-19 patients (35 treated with favipiravir, 45 controls) from the isolation ward of the National Clinical Research Center for Infectious Diseases (The Third People’s Hospital of Shenzhen), Shenzhen, China (1008). The patients in the control group were treated with other antivirals, such as lopinavir and ritonavir. It should be noted that although the control patients received antivirals, two subsequent large-scale analyses, the WHO Solidarity trial and the Randomized Evaluation of COVID-19 Therapy (RECOVERY) trial, identified no effect of lopinavir or of a lopinavir-ritonavir combination, respectively, on the metrics of COVID-19-related mortality that each assessed (811, 1009, 1010). Treatment was applied on days 2-14 and stopped either when viral clearance was confirmed or at day 14. The efficacy of the treatment was measured by, first, the time until viral clearance using Kaplan-Meier survival curves and, second, the improvement rate of chest computed tomography (CT) scans on day 14 after treatment. The study found that favipiravir increased the speed of recovery, measured as viral clearance from the patient by RT-PCR, with patients receiving favipiravir recovering in four days compared to 11 days for patients receiving antivirals such as lopinavir and ritonavir. Additionally, the lung CT scans of patients treated with favipiravir showed significantly higher improvement rates (91%) on day 14 compared to control patients (62%, p = 0.004). However, there were adverse side effects in 4 (11%) favipiravir-treated patients and 25 (56%) control patients. The adverse side effects included diarrhea, vomiting, nausea, rash, and liver and kidney injury. Despite the study reporting clinical improvement in favipiravir-treated patients, several study design issues are problematic and lower confidence in the overall conclusions. For example, the study was neither randomized nor blinded. Moreover, the selection of patients did not take into consideration important factors such as previous clinical conditions or sex, and there was no age categorization. Additionally, it should be noted that this study was temporarily retracted and then restored without an explanation (1011).

In late 2020 and early 2021, the first randomized controlled trials of favipiravir for the treatment of COVID-19 released results (1012–1014). One study (1013) was retracted in November 2021 due to concerns about the data. Of the two remaining, the first (1012) used a randomized, controlled, open-label design to compare two drugs, favipiravir and baloxavir marboxil, to SOC alone. Here, SOC included antivirals such as lopinavir/ritonavir and was administered to all patients. The primary endpoint analyzed was viral clearance at day 14. The sample size for this study was very small, with 29 total patients enrolled, and no significant effect of the treatments was found for the primary or any of the secondary outcomes analyzed, which included mortality. The second trial examined 60 patients and reported a significant effect of favipiravir on viral clearance at four days (a secondary endpoint), but not at 10 days (the primary endpoint) (1014). This study, as well as a prior study of favipiravir (1015), also reported that the drug was generally well-tolerated. Thus, in combination, these small studies suggest that the effects of favipiravir as a treatment for COVID-19 cannot be determined based on the available evidence, but additionally, none raise major concerns about the safety profile of the drug.

5.3 Remdesivir

At the outset of the COVID-19 pandemic, remdesivir did not have any FDA-approved use. A clinical trial in the Democratic Republic of Congo found some evidence of effectiveness against Ebola virus disease (EVD), but two antibody preparations were found to be more effective, and remdesivir was not pursued (1016). Remdesivir also inhibits the polymerases and replication of the coronaviruses MERS-CoV and SARS-CoV-1 in cell culture assays, with submicromolar IC50 values (1017). It has also been found to inhibit SARS-CoV-2, showing synergy with CQ in vitro (806).

Remdesivir was first used on some COVID-19 patients under compassionate use guidelines (1020). All were in late stages of COVID-19 infection, and initial reports were inconclusive about the drug’s efficacy. Gilead Sciences, the maker of remdesivir, led a recent publication that reported outcomes for compassionate use of the drug in 61 patients hospitalized with confirmed COVID-19. Here, 200 mg of remdesivir was administered intravenously on day 1, followed by a further 100 mg/day for 9 days (810). There were significant issues with the study design, or lack thereof. There was no randomized control group. The inclusion criteria were variable: some patients only required low doses of oxygen, while others required ventilation. The study included many sites, potentially with variable inclusion criteria and treatment protocols. The patients analyzed had mixed demographics. There was a short follow-up period of investigation. Eight patients were excluded from the analysis mainly due to missing post-baseline information; thus, their health was unaccounted for. Therefore, even though the study reported clinical improvement in 68% of the 53 patients ultimately evaluated, due to the significant issues with study design, it could not be determined whether treatment with remdesivir had an effect or whether these patients would have recovered regardless of treatment. Another study comparing 5- and 10-day treatment regimens reported similar results but was also limited because of the lack of a placebo control (1021). These studies did not alter the understanding of the efficacy of remdesivir in treating COVID-19, but the encouraging results provided motivation for placebo-controlled studies.

The double-blind, placebo-controlled ACTT-1 trial (807, 808) recruited 1,062 patients and randomly assigned them to placebo treatment or treatment with remdesivir. Patients were stratified for randomization based on site and the severity of disease presentation at baseline (807). The treatment was 200 mg on day 1, followed by 100 mg on days 2 through 10. Data were analyzed from a total of 1,059 patients who completed the 29-day course of the trial, with 517 assigned to remdesivir and 508 to placebo (807). The two groups were well matched demographically and clinically at baseline. Those who received remdesivir had a median recovery time of 10 days, as compared with 15 days in those who received placebo (rate ratio for recovery, 1.29; 95% confidence interval (CI), 1.12 to 1.49; p < 0.001). The Kaplan-Meier estimates of mortality by 14 days were 6.7% with remdesivir and 11.9% with placebo (hazard ratio (HR) for death, 0.55; 95% CI, 0.36 to 0.83); by day 29, the corresponding estimates were 11.4% with remdesivir and 15.2% with placebo (HR: 0.73; 95% CI, 0.52 to 1.03). Serious adverse events were reported in 131 of the 532 patients who received remdesivir (24.6%) and in 163 of the 516 patients in the placebo group (31.6%). This study also reported an association between remdesivir administration and both clinical improvement and a lack of progression to more invasive respiratory intervention in patients receiving non-invasive and invasive ventilation at randomization (807). Largely on the basis of the results of this trial, the FDA reissued and expanded the EUA for remdesivir for the treatment of hospitalized COVID-19 patients ages twelve and older (1022). Additional clinical trials (806, 1023–1026) are currently underway to evaluate the use of remdesivir to treat COVID-19 patients at both early and late stages of infection and in combination with other drugs (Figure 4). On October 22, 2020, remdesivir received FDA approval based on three clinical trials (1027).
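The headline result of ACTT-1 is a comparison of median recovery times estimated from Kaplan-Meier curves. The sketch below illustrates that estimation step on simulated recovery times centered roughly on the reported medians; it assumes the lifelines package is installed and is not a reanalysis of the trial data.

```python
# Minimal sketch of reading a median time-to-recovery off a Kaplan-Meier curve,
# as in ACTT-1; the recovery times below are simulated, and lifelines is used
# purely for illustration (`pip install lifelines`).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
# Simulated days-to-recovery roughly centered on the reported medians (10 vs 15).
t_rem = rng.exponential(scale=10 / np.log(2), size=500)   # remdesivir arm
t_pbo = rng.exponential(scale=15 / np.log(2), size=500)   # placebo arm
observed = np.ones(500)   # treat every recovery as observed (no censoring) for simplicity

kmf = KaplanMeierFitter()
kmf.fit(t_rem, event_observed=observed, label="remdesivir")
print("median recovery (remdesivir):", round(kmf.median_survival_time_, 1))
kmf.fit(t_pbo, event_observed=observed, label="placebo")
print("median recovery (placebo):", round(kmf.median_survival_time_, 1))
```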

However, results suggesting no effect of remdesivir on survival were reported by the WHO Solidarity trial (811). Patients were randomized in equal proportions into four experimental conditions and a control condition, corresponding to four candidate treatments for COVID-19 and SOC, respectively; no placebo was administered. The 2,750 patients in the remdesivir group were administered 200 mg intravenously on the first day and 100 mg on each subsequent day until day 10 and assessed for in-hospital death (primary endpoint), duration of hospitalization, and progression to mechanical ventilation. There were also 2,708 control patients who would have been eligible and able to receive remdesivir had they not been assigned to the control group. A total of 604 patients among these two cohorts died during initial hospitalization, with 301 in the remdesivir group and 303 in the control group. The rate ratio for death between these two groups was not statistically significant (0.95, p = 0.50), suggesting that the administration of remdesivir did not affect survival. The two secondary analyses similarly did not find any effect of remdesivir. Additionally, the authors compared data from their study with data from three other studies of remdesivir (including (807)) stratified by supplemental oxygen status. A meta-analysis of the four studies yielded an overall rate ratio for death of 0.91 (p = 0.20). These results thus do not support the previous findings that remdesivir reduced median recovery time and mortality risk in COVID-19 patients.

In response to the results of the Solidarity trial, Gilead, which manufactures remdesivir, released a statement pointing out that the Solidarity trial was not placebo-controlled or double-blind and that, at the time of release, the trial’s results had not yet been peer reviewed (1028); these sentiments have been echoed elsewhere (1029). Other critiques of this study have noted that antivirals are not typically targeted at patients with severe illness, and therefore remdesivir could be more beneficial for patients with mild rather than severe cases (1010, 1030). However, the publication associated with the trial sponsored by Gilead did report an effect of remdesivir in patients with severe disease, identifying an 11-day versus 18-day recovery period (rate ratio for recovery: 1.31, 95% CI 1.12 to 1.52) (807). Additionally, a smaller analysis of 598 patients, of whom two-thirds were randomized to receive remdesivir for either 5 or 10 days, reported a small effect of treatment with remdesivir for five days relative to standard of care in patients with moderate COVID-19 (1031). These results suggest that remdesivir could improve outcomes for patients with moderate COVID-19, but that additional information would be needed to understand the effects of different durations of treatment. Therefore, the Solidarity trial may point to limitations in the generalizability of other research on remdesivir, especially since the broad international nature of the Solidarity clinical trial, which included countries with a wide range of economic profiles and a variety of healthcare systems, provides a much-needed global perspective in a pandemic (1010). On the other hand, only 62% of patients in the Solidarity trial were randomized on the day of admission or one day afterwards (811), and concerns have been raised that differences in disease progression could influence the effectiveness of remdesivir (1010). Despite the findings of the Solidarity trial, remdesivir remains available for the treatment of COVID-19 in many places. Remdesivir has also been investigated in combination with other drugs, such as baricitinib, an inhibitor of Janus kinase 1 and 2 (1032); the FDA has issued an EUA for the combination of remdesivir and baricitinib in adult and pediatric patients (1033). Follow-up studies are needed and, in many cases, are underway to further investigate remdesivir-related outcomes.

Similarly, the extent to which the remdesivir dosing regimen could influence outcomes continues to be under consideration. A randomized, open-label trial compared the effect of remdesivir on 397 patients with severe COVID-19 over 5 versus 10 days (809, 1021), complementing the study that found that a 5-day course of remdesivir improved outcomes for patients with moderate COVID-19 but a 10-day course did not (1031). Patients in the two groups were administered 200 mg of remdesivir intravenously on the first day, followed by 100 mg on the subsequent four or nine days, respectively. The two groups differed significantly in their clinical status, with patients assigned to the 10-day group having more severe illness. This study also differed from most because it included not only adults, but also pediatric patients as young as 12 years old. It reported no significant differences across several outcomes for patients receiving a 5-day or 10-day course, when correcting for baseline clinical status. The data did suggest that the 10-day course might reduce mortality in the most severe patients at day 14, but the representation of this group in the study population was too low to justify any conclusions (1021). Thus, additional research is also required to determine whether the dosage and duration of remdesivir administration influences outcomes.

In summary, remdesivir is the first FDA-approved antiviral against SARS-CoV-2 as well as the first FDA-approved COVID-19 treatment. Early investigations of this drug established proof of principle that drugs targeting the virus can benefit COVID-19 patients. Moreover, one of the most successful strategies for developing therapeutics for viral diseases is to target the viral replication machinery, which typically consists of virally encoded polymerases. Small-molecule drugs targeting viral polymerases are the backbones of treatments for other viral diseases, including human immunodeficiency virus (HIV) and herpes. Notably, the HIV and herpes polymerases are a reverse transcriptase and a DNA polymerase, respectively, whereas SARS-CoV-2 encodes an RdRP, so most of the commonly used polymerase inhibitors are unlikely to be active against SARS-CoV-2. In clinical use, polymerase inhibitors show short-term benefits for HIV patients, but for long-term benefits they must be part of combination regimens, typically with protease inhibitors, integrase inhibitors, and even other polymerase inhibitors. Remdesivir provides evidence that a related approach may be beneficial for the treatment of COVID-19.

5.4 Hydroxychloroquine and Chloroquine

CQ and hydroxychloroquine (HCQ) increase cellular pH by accumulating in their protonated form inside lysosomes (813, 1034). This shift in pH inhibits the breakdown of proteins and peptides by the lysosomes during the process of proteolysis (813). Interest in CQ and HCQ for treating COVID-19 was catalyzed by a mechanism observed in in vitro studies of both SARS-CoV-1 and SARS-CoV-2. In one study, CQ inhibited viral entry of SARS-CoV-1 into Vero E6 cells, a cell line derived from Vero cells in 1968, through the elevation of endosomal pH and the terminal glycosylation of ACE2 (814). Increased pH within the cell, as discussed above, inhibits proteolysis, and terminal glycosylation of ACE2 is thought to interfere with virus-receptor binding. An in vitro study of SARS-CoV-2 infection of Vero cells found both HCQ and CQ to be effective in inhibiting viral replication, with HCQ being more potent (815). Additionally, an early case study of three COVID-19 patients reported the presence of antiphospholipid antibodies in all three patients (100). Antiphospholipid antibodies are central to the diagnosis of antiphospholipid syndrome, a disorder that HCQ has often been used to treat (1035–1037). Because the 90% effective concentration (EC90) of CQ in Vero E6 cells (6.90 μM) can be achieved in and tolerated by rheumatoid arthritis (RA) patients, it was hypothesized that it might also be possible to achieve the effective concentration in COVID-19 patients (1038). Additionally, clinical trials have reported HCQ to be effective in treating HIV (1039) and chronic hepatitis C (1040). Together, these studies triggered initial enthusiasm about the therapeutic potential of HCQ and CQ against COVID-19. HCQ/CQ has been proposed both as a treatment for COVID-19 and as a prophylaxis against SARS-CoV-2 exposure, and trials often investigated these drugs in combination with azithromycin (AZ) and/or zinc supplementation. However, as more evidence has emerged, it has become clear that HCQ/CQ offers no benefits against SARS-CoV-2 or COVID-19.
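The EC90 cited above is the kind of summary statistic obtained by fitting a dose-response (Hill) curve to in vitro inhibition data. The sketch below shows that fit on synthetic data; the concentrations and responses are placeholders, not measurements from the cited Vero E6 experiments.

```python
# Minimal sketch of how an EC50/EC90, such as the 6.90 uM value quoted for CQ,
# is derived from a dose-response curve; the concentrations and responses
# below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, ec50, h):
    """Percent inhibition of viral replication at concentration c (Hill equation)."""
    return 100.0 * c**h / (ec50**h + c**h)

conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100])            # uM, synthetic
resp = np.array([2, 8, 28, 58, 85, 95, 99], dtype=float)  # % inhibition, synthetic

(ec50, h), _ = curve_fit(hill, conc, resp, p0=[1.0, 1.0])
ec90 = ec50 * 9 ** (1 / h)   # concentration giving 90% of maximal inhibition
print(f"EC50 = {ec50:.2f} uM, Hill slope = {h:.2f}, EC90 = {ec90:.2f} uM")
```

As the text notes, achieving such in vitro concentrations in patients, and having them translate into clinical benefit, are separate questions entirely.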

5.4.1 Trials Assessing Therapeutic Administration of HCQ/CQ

The initial study evaluating HCQ as a treatment for COVID-19 patients was published on March 20, 2020 by Gautret et al. (816). This non-randomized, non-blinded, non-placebo clinical trial compared HCQ to SOC in 42 hospitalized patients in southern France. It reported that patients who received HCQ showed higher rates of virological clearance by nasopharyngeal swab on days 3-6 compared to SOC. This study also treated six patients with both HCQ + AZ and found this combination therapy to be more effective than HCQ alone. However, the design and analyses used showed weaknesses that severely limit the interpretability of the results, including the small sample size and the lack of randomization, blinding, a placebo (no “placebo pill” given to the SOC group), intention-to-treat analysis, correction for sequential multiple comparisons, and trial pre-registration. Furthermore, the trial arms were entirely confounded by hospital site, and there were false negative outcome measurements (see (1041)). Two of these weaknesses are due to inappropriate data analysis and can therefore be corrected post hoc by recalculating the p-values (the lack of intention-to-treat analysis and of correction for multiple comparisons). However, all other weaknesses are fundamental design flaws and cannot be corrected for. Thus, the conclusions cannot be generalized outside of the study. The International Society of Antimicrobial Chemotherapy, the scientific organization that publishes the journal where the article appeared, subsequently announced that the article did not meet its expected standard for publications (817), although it has not been officially retracted.

Because of the preliminary data presented in this study, HCQ treatment was subsequently explored by other researchers. About one week later, a follow-up case study reported on 11 consecutive patients treated with HCQ + AZ using the same dosing regimen (1042). One patient died, two were transferred to the intensive care unit (ICU), and one developed a prolonged QT interval, leading to discontinuation of HCQ + AZ administration. As in the Gautret et al. study, the outcome assessed was virological clearance at day 6 post-treatment, as measured from nasopharyngeal swabs. Of the ten patients still living on day 6, eight remained positive for SARS-CoV-2 RNA. As in the original study, interpretability was severely limited by the lack of a comparison group and the small sample size. However, these results stand in contrast to the claims by Gautret et al. that all six patients treated with HCQ + AZ tested negative for SARS-CoV-2 RNA by day 6 post-treatment. This case study illustrated the need for further investigation using robust study designs to evaluate the efficacy of HCQ and/or CQ.

On April 10, 2020, a randomized, non-placebo trial of 62 COVID-19 patients at the Renmin Hospital of Wuhan University was released (1043). This study investigated whether HCQ decreased the time to fever break or the time to cough relief compared to SOC (1043). The trial found that HCQ decreased both the average time to fever break and the average time to cough relief, defined as mild or no cough. While this study improved on some of the methodological flaws in Gautret et al. by randomizing patients, it also had several flaws in trial design and data analysis that prevent generalization of the results. These weaknesses include the lack of a placebo, the lack of correction for multiple primary outcomes, an inappropriate choice of outcomes, insufficient detail to understand the analysis, drastic disparities between the pre-registration (1044) and the published protocol (including differences in the inclusion and exclusion criteria, the number of experimental groups, the number of patients enrolled, and the outcomes analyzed), and a small sample size. The choice of outcomes may be inappropriate because both fever and cough may break periodically without resolution of illness. Additionally, the authors reported that 23 of 62 patients did not have a fever and 25 of 62 patients did not have a cough at the start of the study, but they failed to describe how these patients were included in a study assessing time to fever break and time to cough relief. It is also important to note that the authors claimed that “neither the research performers nor the patients were aware of the treatment assignments.” This blinding seems impossible in a non-placebo trial because, at the very least, providers would know whether they were administering a medication, and this knowledge could lead to systematic differences in the administration of care. Correction for multiple primary outcomes can be adjusted post hoc by recalculating the p-values, but all of the other issues were design and statistical weaknesses that cannot be corrected for. Additionally, disparities between the pre-registered and published protocols raise concerns about experimental design. The design limitations mean that the conclusions cannot be generalized outside of the study.
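The post hoc correction mentioned above is straightforward to perform once the raw p-values are available. The sketch below applies a Holm adjustment to a set of placeholder p-values using statsmodels; the values are illustrative, not numbers reported by the trial.

```python
# Minimal sketch of the post hoc correction described above: adjusting a set
# of p-values for multiple primary outcomes. The p-values are placeholders,
# not values reported by the trial.
from statsmodels.stats.multitest import multipletests

raw_p = [0.012, 0.030, 0.210]   # e.g., fever break, cough relief, a third endpoint
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="holm")

for p, q, r in zip(raw_p, adj_p, reject):
    print(f"raw p = {p:.3f} -> Holm-adjusted p = {q:.3f}, significant: {r}")
```

Such a correction addresses only the multiplicity issue; as noted above, the design flaws of the trial cannot be repaired by reanalysis.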

A second randomized trial, conducted by the Shanghai Public Health Clinical Center, analyzed whether HCQ increased rates of virological clearance at day 7 in respiratory pharyngeal swabs compared to SOC (1045). This trial was published in Chinese along with an abstract in English, and only the English abstract was read and interpreted for this review. The trial found comparable outcomes in virological clearance rate, time to virological clearance, and time to body temperature normalization between the treatment and control groups. The small sample size is one weakness, with only 30 patients enrolled and 15 in each arm. This problem suggests the study is underpowered to detect potentially useful differences and precludes interpretation of results. Additionally, because only the abstract could be read, other design and analysis issues could be present. Thus, though these studies added randomization to their assessment of HCQ, their conclusions should be interpreted very cautiously. These two studies assessed different outcomes and reached differing conclusions about the efficacy of HCQ for treating COVID-19; the designs of both studies, especially with respect to sample size, meant that no general conclusions can be made about the efficacy of the drug.
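
To illustrate why a 15-patient-per-arm comparison is likely underpowered, the sketch below estimates the power of a two-proportion comparison at that sample size; the clearance proportions are hypothetical assumptions chosen for demonstration, not estimates from the trial.

```python
# Minimal sketch: approximate power of a two-arm comparison with 15 patients
# per arm to detect a difference in virological clearance rates. The
# proportions below are hypothetical and purely illustrative.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_control, p_treatment = 0.70, 0.90                           # hypothetical clearance rates
effect_size = proportion_effectsize(p_treatment, p_control)   # Cohen's h

power = NormalIndPower().power(effect_size=effect_size, nobs1=15,
                               alpha=0.05, ratio=1.0, alternative="two-sided")
print(f"Approximate power with 15 patients per arm: {power:.2f}")
```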

Several widely reported studies on HCQ also have issues with data integrity and/or provenance. A Letter to the Editor published in BioScience Trends on March 16, 2020 claimed that numerous clinical trials had shown that HCQ is superior to control treatment in inhibiting the exacerbation of COVID-19 pneumonia (1046). This letter has been cited widely in the primary literature, in review articles, and in the media (1047, 1048). However, the letter referred to 15 pre-registration identifiers from the Chinese Clinical Trial Registry. When these identifiers are followed back to the registry, most of the trials are listed as not yet recruiting or still recruiting patients. For all 15 identifiers, no data uploads or links to publications could be located on the pre-registrations. At the very least, the lack of availability of the primary data means the claim that HCQ is efficacious against COVID-19 pneumonia cannot be verified. Similarly, a multinational registry analysis (1049) assessed the efficacy of CQ and HCQ with and without a macrolide, a class of antibiotics that includes azithromycin, for the treatment of COVID-19. The study observed 96,032 patients split into a control and four treatment conditions (CQ with and without a macrolide; HCQ with and without a macrolide). The authors concluded that treatment with CQ or HCQ was associated with an increased risk of de novo ventricular arrhythmia during hospitalization. However, this study has since been retracted by The Lancet due to an inability to validate the data used (1050). These studies demonstrate that increased skepticism in evaluating the HCQ/CQ and COVID-19 literature may be warranted, possibly because of the significant attention HCQ and CQ have received as possible treatments for COVID-19 and the politicization of these drugs.

Although the study suggesting that CQ/HCQ increased the risk of ventricular arrhythmia in COVID-19 patients has been retracted, other studies have identified risks associated with HCQ/CQ. A patient with systemic lupus erythematosus developed a prolonged QT interval that was likely exacerbated by use of HCQ in combination with renal failure (1051). A prolonged QT interval is associated with ventricular arrhythmia (1052). Furthermore, a separate study (1053) investigated the safety associated with the use of HCQ with and without macrolides between 2000 and 2020. The study involved 900,000 cases treated with HCQ and 300,000 cases treated with HCQ + AZ. The results indicated that short-term use of HCQ was not associated with additional risk, but that HCQ + AZ was associated with an enhanced risk of cardiovascular complications (such as a 15% increased risk of chest pain, calibrated HR = 1.15; 95% CI, 1.05 to 1.26) and a two-fold increased 30-day risk of cardiovascular mortality (calibrated HR = 2.19; 95% CI, 1.22 to 3.94). Therefore, whether studies utilize HCQ alone or HCQ in combination with a macrolide may be an important consideration in assessing risk. As results from initial investigations of these drug combinations have emerged, concerns about the efficacy and risks of treating COVID-19 with HCQ and CQ have led to the removal of CQ/HCQ from SOC practices in several countries (1054, 1055). As of May 25, 2020, the WHO had suspended administration of HCQ as part of the worldwide Solidarity trial (1056). The final results of this large-scale trial, which compared 947 patients administered HCQ to 906 controls, revealed no effect on the primary outcome of in-hospital mortality (rate ratio: 1.19; p = 0.23).
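
For reported effect estimates like those above, an approximate p-value can be recovered from a hazard ratio and its confidence interval using a standard back-calculation on the log scale; the sketch below applies this arithmetic to the reported 30-day cardiovascular mortality estimate and is intended only as an illustration, not a re-analysis.

```python
# Minimal sketch: recovering an approximate standard error and two-sided
# p-value from a reported hazard ratio and its 95% confidence interval.
# Values correspond to the reported HR of 2.19 (95% CI, 1.22 to 3.94).
from math import log
from scipy.stats import norm

hr, ci_low, ci_high = 2.19, 1.22, 3.94

log_hr = log(hr)
se_log_hr = (log(ci_high) - log(ci_low)) / (2 * 1.96)  # CI width on the log scale
z = log_hr / se_log_hr
p_value = 2 * (1 - norm.cdf(abs(z)))

print(f"log(HR) = {log_hr:.3f}, SE = {se_log_hr:.3f}, z = {z:.2f}, p ~ {p_value:.4f}")
```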

Additional research has emerged largely identifying HCQ/CQ to be ineffective against COVID-19 while simultaneously revealing a number of significant side effects. A randomized, open-label, non-placebo trial of 150 COVID-19 patients was conducted in parallel at 16 government-designated COVID-19 centers in China to assess the safety and efficacy of HCQ (1057). The trial compared treatment with HCQ in conjunction with SOC to SOC alone, with patients assigned randomly to the two groups (75 per group). The primary endpoint of the study was the negative conversion rate of SARS-CoV-2 within 28 days, and the investigators found no difference in this parameter between the groups (estimated difference between SOC plus HCQ and SOC 4.1%; 95% CI, –10.3% to 18.5%). The secondary endpoints were amelioration of the symptoms of the disease, such as axillary temperature ≤36.6°C, SpO2 >94% on room air, and disappearance of symptoms like shortness of breath, cough, and sore throat. The median time to symptom alleviation was similar across conditions (19 days in HCQ + SOC versus 21 days in SOC, p = 0.97). Additionally, 30% of the patients receiving SOC + HCQ reported adverse outcomes compared to 8.8% of patients receiving only SOC, with the most common adverse outcome in the SOC + HCQ group being diarrhea (10% versus 0% in the SOC group, p = 0.004). However, several factors limit the interpretability of this study. Most of the enrolled patients had mild-to-moderate symptoms (98%), and the average age was 46. SOC in this study included the use of antivirals (Lopinavir-Ritonavir, Arbidol, Oseltamivir, Virazole, Entecavir, Ganciclovir, and Interferon alfa), which the authors note could influence the results. Thus, they note that an ideal SOC would need to exclude the use of antivirals, but that ceasing antiviral treatment raised ethical concerns at the time that the study was conducted. In this trial, the samples used to test for the presence of the SARS-CoV-2 virus were collected from the upper respiratory tract, and the authors indicated that the use of upper respiratory samples may have introduced false negatives (e.g., (71)). Another limitation the authors acknowledge is that HCQ treatment began, on average, 16 days after symptom onset. The fact that this study was open-label and lacked a placebo also limits interpretation, and additional analysis would be required to determine whether HCQ reduces the inflammatory response. Therefore, despite some potential areas of investigation identified in post hoc analysis, this study cannot be interpreted as providing support for HCQ as a therapeutic against COVID-19: there was no difference between the two groups in either negative conversion at 28 days or symptom alleviation, and more adverse outcomes were reported in the group receiving HCQ.

Additional evidence comes from a retrospective analysis (1058) that examined data from 368 COVID-19 patients across all United States Veterans Health Administration medical centers. The study retrospectively investigated the effect of the administration of HCQ (n=97), HCQ + AZ (n=113), or no HCQ (n=158). The primary outcomes assessed were death and the need for mechanical ventilation. Standard supportive care was rendered to all patients. Due to the low representation of women (n=17) in the available data, the analysis included only men, and the median age was 65 years. The rate of death was 27.8% in the HCQ-only treatment group, 22.1% in the HCQ + AZ treatment group, and 14.1% in the no-HCQ group. These data indicated a statistically significant elevation in the risk of death for the HCQ-only group compared to the no-HCQ group (adjusted HR: 2.61, p = 0.03), but not for the HCQ + AZ group compared to the no-HCQ group (adjusted HR: 1.14, p = 0.72). Further, the risk of ventilation was similar across all three groups (adjusted HR of 1.43, p = 0.48 for HCQ and 0.43, p = 0.09 for HCQ + AZ, each compared to no HCQ). The study thus showed evidence of an association between increased mortality and HCQ in this cohort of COVID-19 patients but no change in rates of mechanical ventilation among the treatment conditions. The study had a few limitations: it was not randomized, and the baseline vital signs, laboratory tests, and prescription drug use were significantly different among the three groups. All of these factors could potentially influence treatment outcome. Furthermore, the authors acknowledge that the effect of the drugs might be different in females and pediatric subjects, since these subjects were not part of the study. The apparent finding that HCQ + AZ carried less risk than HCQ alone contradicts the previous large-scale analysis of twenty years of records that found HCQ + AZ to be more frequently associated with cardiac arrhythmia than HCQ alone (1053); whether this discrepancy is caused by the pathology of COVID-19, is influenced by age or sex, or is a statistical artifact is not presently known.
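
Adjusted hazard ratios of the kind reported in this retrospective analysis are typically estimated with a Cox proportional hazards model that includes the covariates of concern; the sketch below shows that workflow on synthetic data (using the lifelines package) and makes no attempt to reproduce the study's actual model or data.

```python
# Minimal sketch: estimating a covariate-adjusted hazard ratio with a Cox
# proportional hazards model. The data below are synthetic and purely
# illustrative; they are not the Veterans Health Administration data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "time_to_event": rng.exponential(scale=30, size=n),  # days of follow-up
    "died": rng.integers(0, 2, size=n),                  # 1 = death observed
    "hcq": rng.integers(0, 2, size=n),                   # 1 = received HCQ
    "age": rng.normal(65, 10, size=n),                   # covariate adjusted for
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_event", event_col="died")
# exp(coef) for the "hcq" term is the age-adjusted hazard ratio for HCQ exposure.
cph.print_summary()
```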

Finally, findings from the RECOVERY trial were released on October 8, 2020. This study used a randomized, open-label design to study the effects of HCQ compared to SOC in 11,197 patients at 176 hospitals in the United Kingdom (818). Patients were randomized into either the control group or one of the treatment arms, with twice as many patients enrolled in the control group as any treatment group. Of the patients eligible to receive HCQ, 1,561 were randomized into the HCQ arm, and 3,155 were randomized into the control arm. The demographics of the HCQ and control groups were similar in terms of average age (65 years), proportion female (approximately 38%), ethnic make-up (73% versus 76% white), and prevalence of pre-existing conditions (56% versus 57% overall). In the HCQ arm of the study, patients received 800 mg at baseline and again after 6 hours, then 400 mg at 12 hours and every subsequent 12 hours. The primary outcome analyzed was all-cause mortality, and patient vital statistics were reported by physicians upon discharge or death, or else at 28 days following HCQ administration if they remained hospitalized. The secondary outcome assessed was the combined risk of progression to invasive mechanical ventilation or death within 28 days. On the advice of an external data monitoring committee, the HCQ arm of the study was reviewed early and closed due to a lack of support for HCQ as a treatment for COVID-19. COVID-19-related mortality was not affected by HCQ in the RECOVERY trial (rate ratio, 1.09; 95% CI, 0.97 to 1.23; p = 0.15), but cardiac events were increased in the HCQ arm (0.4 percentage points), as was the duration of hospitalization (rate ratio for discharge alive within 28 days: 0.90; 95% CI, 0.83 to 0.98) and likelihood of progression to mechanical ventilation or death (risk ratio 1.14; 95% CI, 1.03 to 1.27). This large-scale study thus builds upon studies in the United States and China to suggest that HCQ is not an effective treatment, and in fact may negatively impact COVID-19 patients due to its side effects. Therefore, though none of the studies have been blinded, examining them together makes it clear that the available evidence points to significant dangers associated with the administration of HCQ to hospitalized COVID-19 patients, without providing any support for its efficacy.

5.4.2 HCQ for the Treatment of Mild Cases

An additional therapeutic application considered for HCQ was the treatment of mild COVID-19 cases in otherwise healthy individuals. This possibility was assessed in a randomized, open-label, multi-center analysis conducted in Catalonia (Spain) (1059). This analysis enrolled adults 18 and older who had been experiencing mild symptoms of COVID-19 for fewer than five days. Participants were randomized into an HCQ arm (N=136) and a control arm (N=157), and those in the treatment arm were administered 800 mg of HCQ on the first day of treatment followed by 400 mg on each of the subsequent six days. The primary outcome assessed was viral clearance at days 3 and 7 following the onset of treatment, and secondary outcomes were clinical progression and time to complete resolution of symptoms. No significant differences between the two groups were found: the difference in viral load between the HCQ and control groups was 0.01 (95% CI, -0.28 to 0.29) at day 3 and -0.07 (95% CI, -0.44 to 0.29) at day 7, the relative risk of hospitalization was 0.75 (95% CI, 0.32 to 1.77), and the difference in time to complete resolution of symptoms was -2 days (p = 0.38). This study thus suggests that HCQ does not improve recovery from COVID-19, even in otherwise healthy adult patients with mild symptoms.

5.4.3 Prophylactic Administration of HCQ

An initial study of the possible prophylactic application of HCQ utilized a randomized, double-blind, placebo-controlled design (1060). Asymptomatic adults in the United States and Canada who had been exposed to SARS-CoV-2 within the past four days were enrolled in an online study to evaluate whether administration of HCQ over five days influenced the probability of developing COVID-19 symptoms over a 14-day period. Of the participants, 414 received HCQ and 407 received a placebo. No significant difference in the rate of symptomatic illness was observed between the two groups (11.8% HCQ, 14.3% placebo, p = 0.35). The HCQ condition was associated with side effects, with 40.1% of patients reporting side effects compared to 16.8% in the control group (p < 0.001). However, likely due to the high enrollment of healthcare workers (66% of participants) and the well-known side effects associated with HCQ, a large number of participants were able to correctly identify whether they were receiving HCQ or a placebo (46.5% and 35.7%, respectively). Furthermore, due to a lack of availability of diagnostic testing, only 20 of the 107 cases were confirmed with a PCR-based test to be positive for SARS-CoV-2. The rest were categorized as “probable” or “possible” cases by a panel of four physicians who were blind to the treatment status. One possible confounder is that a patient presenting one or more symptoms, which included diarrhea, was defined as a “possible” case, but diarrhea is also a common side effect of HCQ. Additionally, four of the twenty PCR-confirmed cases did not develop symptoms until after the observation period had completed, suggesting that the 14-day trial period may not have been long enough or that some participants also encountered secondary exposure events. Finally, in addition to the young age of the participants in this study, which ranged from 32 to 51, there were possible impediments to generalization introduced by the selection process, as 2,237 patients who were eligible but had already developed symptoms by day 4 were enrolled in a separate study. It is therefore likely that asymptomatic cases were over-represented in this sample; such cases would not have been detected by the diagnostic criteria used. Therefore, while this study does represent the first effort to conduct a randomized, double-blind, placebo-controlled investigation of HCQ’s effect on COVID-19 prevention after SARS-CoV-2 exposure in a large sample, the lack of PCR tests and several other design flaws significantly impede interpretation of the results. However, in line with the results from therapeutic studies, once again no evidence was found suggesting an effect of HCQ against COVID-19.
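
As a rough check on the reported comparison, the sketch below runs a two-proportion test using event counts reconstructed from the published percentages (11.8% of 414 and 14.3% of 407); these counts are approximations, and the original trial may have used a different test.

```python
# Minimal sketch: comparing symptomatic-illness rates between the HCQ and
# placebo arms. Event counts are reconstructed from the reported percentages
# and are therefore approximate.
from statsmodels.stats.proportion import proportions_ztest

events = [round(0.118 * 414), round(0.143 * 407)]  # approx. symptomatic cases per arm
n_obs = [414, 407]                                 # HCQ arm, placebo arm

z_stat, p_value = proportions_ztest(count=events, nobs=n_obs)
print(f"z = {z_stat:.2f}, two-sided p ~ {p_value:.2f}")
```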

A second study (1061) examined the effect of administering HCQ to healthcare workers as a pre-exposure prophylactic. The primary outcome assessed was the conversion from SARS-CoV-2 negative to SARS-CoV-2 positive status over the 8-week study period. This study was also randomized, double-blind, and placebo-controlled, and it sought to address some of the limitations of the first prophylactic study. The goal was to enroll 200 healthcare workers, preferentially those working with COVID-19 patients, at two hospitals within the University of Pennsylvania hospital system in Philadelphia, PA. Participants were randomized 1:1 to receive either 600 mg of HCQ daily or a placebo, and their SARS-CoV-2 infection status and antibody status were assessed using RT-PCR and serological testing, respectively, at baseline, 4 weeks, and 8 weeks following the beginning of the treatment period. The statistical design of the study accounted for interim analyses at 50 and 100 participants in case efficacy or futility of HCQ for prophylaxis became clear earlier than completion of enrollment. The 139 individuals enrolled comprised a study population that was fairly young (average age 33) and made up largely of people who were white, female, and without pre-existing conditions. At the second interim analysis, more individuals in the treatment group than the control group had contracted COVID-19 (4 versus 3), causing the estimated z-score to fall below the pre-established threshold for futility. As a result, the trial was terminated early, offering additional evidence against the use of HCQ for prophylaxis.

5.4.4 Summary of HCQ/CQ Research Findings

Early in vitro evidence indicated that HCQ could be an effective therapeutic against SARS-CoV-2 and COVID-19, leading to significant media attention and public interest in its potential as both a therapeutic and prophylactic. Initially it was hypothesized that CQ/HCQ might be effective against SARS-CoV-2 in part because CQ and HCQ have both been found to inhibit the expression of CD154 in T-cells and to reduce TLR signaling that leads to the production of pro-inflammatory cytokines (1062). Clinical trials for COVID-19 have more often used HCQ rather than CQ because it offers the advantages of being cheaper and having fewer side effects than CQ. However, research has not found support for a positive effect of HCQ on COVID-19 patients. Multiple clinical studies have already been carried out to assess HCQ as a therapeutic agent for COVID-19, and many more are in progress. To date, none of these studies have used randomized, double-blind, placebo-controlled designs with a large sample size, which would be the gold standard. Despite the design limitations (which would be more likely to produce false positives than false negatives), initial optimism about HCQ has largely dissipated. The most methodologically rigorous analysis of HCQ as a prophylactic (1060) found no significant differences between the treatment and control groups, and the WHO’s global Solidarity trial similarly reported no effect of HCQ on mortality (811). Thus, HCQ/CQ are not likely to be effective therapeutic or prophylactic agents against COVID-19. One case study identified drug-induced phospholipidosis as the cause of death for a COVID-19 patient treated with HCQ (984), suggesting that in some cases, the proposed mechanism of action may ultimately be harmful. Additionally, one study identified an increased risk of mortality in older men receiving HCQ, and administration of HCQ and HCQ + AZ did not decrease the use of mechanical ventilation in these patients (1058). HCQ use for COVID-19 could also lead to shortages for anti-malarial or anti-rheumatic use, where it has documented efficacy. Despite significant early attention, these drugs appear to be ineffective against COVID-19. Several countries have now removed CQ/HCQ from their SOC for COVID-19 due to the lack of evidence of efficacy and the frequency of adverse effects.

5.5 ACE Inhibitors and Angiotensin II Receptor Blockers

Several clinical trials testing the effects of ACEIs or ARBs on COVID-19 outcomes are ongoing (1063–1069). Clinical trials are needed because the findings of the various observational studies bearing on this topic cannot be interpreted as indicating a protective effect of these drugs (1070, 1071). Two analyses (1063, 1069) have reported no effect of continuing or discontinuing ARBs and ACEIs on patients admitted to the hospital for COVID-19. The first, known as REPLACE COVID (874), was a randomized, open-label study that enrolled patients who were admitted to the hospital for COVID-19 and were taking an ACEI at the time of admission. They enrolled 152 patients at 20 hospitals across seven countries and randomized them into two arms, continuation (n=75) and discontinuation (n=77). The primary outcome evaluated was a global rank score that integrated several dimensions of illness. The components of this global rank score, such as time to death and length of mechanical ventilation, were evaluated as secondary endpoints. This analysis reported no differences between the two groups in the primary or any of the secondary outcomes.

Similarly, a second study (875) used a randomized, open-label design to examine the effects of continuing versus discontinuing ARBs and ACEIs on patients hospitalized for mild to moderate COVID-19 at 29 hospitals in Brazil. This study enrolled 740 patients but had to exclude one trial site from all analyses due to the discovery of violations of Good Clinical Practice and data falsification. After this exclusion, 659 patients remained, with 334 randomized to discontinuation and 325 to continuation. In this study, the primary endpoint analyzed was the number of days that patients were alive and not hospitalized within 30 days of enrollment. The secondary outcomes included death (including in-hospital death separately), number of days hospitalized, and specific clinical outcomes such as heart failure or stroke. Once again, no significant differences were found between the two groups. Initial studies of randomized interventions therefore suggest that ACEIs and ARBs are unlikely to affect COVID-19 outcomes. These results are also consistent with findings from observational studies (summarized in (874)). Additional information about ACE2, observational studies of ACEIs and ARBs in COVID-19, and clinical trials on this topic have been summarized (1072). Therefore, despite the promising potential mechanism, initial results have not provided support for ACEIs and ARBs as therapies for COVID-19.

5.6 Tocilizumab

Human IL-6 is a 26-kDa glycoprotein that consists of 184 amino acids and contains two potential N-glycosylation sites and four cysteine residues. It binds to a type I cytokine receptor (IL-6Rα or glycoprotein 80) that exists in both membrane-bound (IL-6Rα) and soluble (sIL-6Rα) forms (1073). It is not the binding of IL-6 to the receptor that initiates pro- and/or anti-inflammatory signaling, but rather the binding of the complex to another subunit, known as IL-6Rβ or glycoprotein 130 (gp130) (1073, 1074). Unlike membrane-bound IL-6Rα, which is only found on hepatocytes and some types of leukocytes, gp130 is found on most cells (1075). When IL-6 binds to sIL-6Rα, the complex can then bind to a gp130 protein on any cell (1075). The binding of IL-6 to IL-6Rα is termed classical signaling, while its binding to sIL-6Rα is termed trans-signaling (1075–1077). These two signaling processes are thought to play different roles in health and illness. For example, trans-signaling may play a role in the proliferation of mucosal T-helper TH2 cells associated with asthma, while an earlier step in this proliferation process may be regulated by classical signaling (1075). Similarly, IL-6 is known to play a role in Crohn’s Disease via trans-, but not classical, signaling (1075). Both classical and trans-signaling can occur through three independent pathways: the Janus-activated kinase-STAT3 pathway, the Ras/Mitogen-Activated Protein Kinase pathway, and the Phosphoinositide 3-Kinase/Akt pathway (1073). These signaling pathways are involved in a variety of different functions, including cell type differentiation, immunoglobulin synthesis, and cellular survival signaling pathways, respectively (1073). The ultimate result of the IL-6 cascade is to direct transcriptional activity of various promoters of pro-inflammatory cytokines, such as IL-1, TNF, and even IL-6 itself, through the activity of NF-κB (1073). IL-6 synthesis is tightly regulated both transcriptionally and post-transcriptionally, and it has been shown that viral proteins can enhance transcription of the IL-6 gene by strengthening the DNA-binding activity between several transcription factors and IL-6 gene-cis-regulatory elements (1078). Therefore, drugs inhibiting the binding of IL-6 to IL-6Rα or sIL-6Rα are of interest for combating the hyperactive inflammatory response characteristic of cytokine release syndrome (CRS) and cytokine storm syndrome (CSS). TCZ is a humanized monoclonal antibody that binds to both the membrane-bound and soluble forms of the IL-6 receptor, providing de facto inhibition of the IL-6 immune cascade. Interest in TCZ as a possible treatment for COVID-19 was piqued by early evidence indicating that COVID-19 deaths may be induced by the hyperactive immune response, often referred to as CRS or CSS (83), as IL-6 plays a key role in this response (150). The observation of elevated IL-6 in patients who died relative to those who recovered (83) could reflect an over-production of proinflammatory interleukins, suggesting that TCZ could potentially palliate some of the most severe symptoms of COVID-19 associated with increased cytokine production.

This early interest in TCZ as a possible treatment for COVID-19 was bolstered by a very small retrospective study in China that examined 20 patients with severe symptoms in early February 2020 and reported rapid improvement in symptoms following treatment with TCZ (893). Subsequently, a number of retrospective studies have been conducted in several countries. Many studies use a retrospective, observational design, where they compare outcomes for COVID-19 patients who received TCZ to those who did not over a set period of time. For example, one of the largest retrospective, observational analyses released to date (888), consisting of 1,351 patients admitted to several care centers in Italy, compared the rates at which patients who received TCZ versus only SOC died or progressed to invasive mechanical ventilation over a 14-day period. Under this definition, SOC could include other drugs such as HCQ, azithromycin, lopinavir-ritonavir or darunavir-cobicistat, or heparin. While this study was not randomized, a subset of patients who were eligible to receive TCZ were unable to obtain it due to shortages; however, these groups were not directly compared in the analysis. After adjusting for variables such as age, sex, and SOFA (sequential organ failure assessment) score, they found that patients treated with TCZ were less likely to progress to invasive mechanical ventilation and/or death (adjusted HR = 0.61, CI 0.40-0.92, p = 0.020); analysis of death and ventilation separately suggests that this effect may have been driven by differences in the death rate (20% of control versus 7% of TCZ-treated patients). The study reported particular benefits for patients whose PaO2/FiO2 ratio, also known as the Horowitz Index for Lung Function, fell below a 150 mm Hg threshold. They found no differences between groups administered subcutaneous versus intravenous TCZ.

Another retrospective observational analysis of interest examined the charts of patients at a hospital in Connecticut, USA where 64% of all 239 COVID-19 patients in the study period were administered TCZ based on assignment by a standardized algorithm (889). They found that TCZ administration was associated with more similar rates of survivorship in patients with severe versus nonsevere COVID-19 at intake, defined based on the amount of supplemental oxygen needed. They therefore proposed that their algorithm was able to identify patients presenting with or likely to develop CRS as good candidates for TCZ. This study also reported higher survivorship in Black and Hispanic patients compared to white patients when adjusted for age. The major limitation to interpretation of these studies is that there may be clinical characteristics that influenced medical practitioners’ decisions to administer TCZ to some patients and not others. One interesting example therefore comes from an analysis of patients at a single hospital in Brescia, Italy, where TCZ was not available for a period of time (890). This study compared COVID-19 patients admitted to the hospital before and after March 13, 2020, when the hospital received TCZ. Therefore, patients who would have been eligible for TCZ prior to this arbitrary date did not receive it as treatment, making this retrospective analysis something of a natural experiment. Despite this design, demographic factors did not appear to be consistent between the two groups, and the average age of the control group was higher than that of the TCZ group. The control group also had a higher percentage of males and a higher incidence of comorbidities such as diabetes and heart disease. All the same, the study reported improved survival after the addition of TCZ to the SOC regimen: 11 of 23 patients (47.8%) admitted prior to March 13 died, compared to 2 of 62 (3.2%) admitted afterwards, and the multivariate HR, which adjusted for these clinical and demographic factors, indicated a significant survival difference between the two groups (HR = 0.035; 95% CI, 0.004 to 0.347; p = 0.004). They also reported a reduced progression to mechanical ventilation in the TCZ group. However, this study also has a significant limitation: the time delay between the two groups means that knowledge about how to treat the disease likely improved over this timeframe as well. All the same, the results of these observational retrospective studies provide support for TCZ as a pharmaceutical of interest for follow-up in clinical trials.

Other retrospective analyses have utilized a case-control design to match pairs of patients with similar baseline characteristics, only one of whom received TCZ for COVID-19. In one such study, TCZ was significantly associated with a reduced risk of progression to intensive care unit (ICU) admission or death (891). This study examined only 20 patients treated with TCZ (all but one of the patients treated with TCZ in the hospital during the study period) and compared them to 25 patients receiving SOC. For the combined primary endpoint of death and/or ICU admission, only 25% of patients receiving TCZ progressed to an endpoint compared to 72% in the SOC group (p = 0.002, presumably from a chi-square test, based on the information provided in the text). When the two endpoints were examined separately, the difference remained significant for progression to invasive mechanical ventilation (32% SOC versus 0% TCZ, p = 0.006) but not for mortality (48% SOC versus 25% TCZ, p = 0.066). In contrast, a study in New York City that compared 96 patients treated with TCZ to 97 patients treated with SOC alone found that mortality did not differ between the two groups, although the difference did become significant when intubated patients were excluded from the analysis (892). Taken together, these findings suggest that future clinical trials of TCZ may want to include intubation as an endpoint. However, these studies should be approached with caution, not only because of the small number of patients enrolled and the retrospective design, but also because they performed a large number of statistical tests and did not account for multiple hypothesis testing. In general, caution must be exercised when interpreting subgroup analyses after a primary combined endpoint analysis. These last findings highlight the need to balance impairing a harmful immune response, such as the one generated during CRS/CSS, against the risk of worsening the clinical picture of patients through potential new viral infections. Early meta-analyses and systematic reviews have investigated the available data about TCZ for COVID-19. One meta-analysis (1079) evaluated 19 studies published or released as preprints prior to July 1, 2020 and found that the overall trends were supportive of the frequent conclusion that TCZ does improve survivorship, with a significant HR of 0.41 (p < 0.001). This trend improved when they excluded studies that administered a steroid alongside TCZ, with a significant HR of 0.04 (p < 0.001). They also found some evidence for reduced invasive ventilation or ICU admission, but only when all studies were excluded except a small number whose estimates were adjusted for possible bias introduced during the enrollment process. A systematic analysis of sixteen case-control studies of TCZ estimated an odds ratio of mortality of 0.453 (95% CI 0.376–0.547, p < 0.001), suggesting possible benefits associated with TCZ treatment (1080). Although these estimates are similar, it is important to note that they draw from the same literature and are therefore likely to be affected by the same potential publication biases. A different systematic review of studies investigating TCZ treatment for COVID-19 analyzed 31 studies that had been published or released as pre-prints and reported that none carried a low risk of bias (1081). Therefore, the evidence available at that time was not sufficient to support conclusions about the efficacy of TCZ.
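
To show what the presumed chi-square comparison for the combined endpoint would look like, the sketch below reconstructs an approximate 2x2 table from the reported percentages (25% of 20 TCZ patients and 72% of 25 SOC patients); the counts are therefore approximations, and the exact test used by the study authors is not stated.

```python
# Minimal sketch: chi-square test on a 2x2 table reconstructed from the
# reported percentages for the combined endpoint (death and/or ICU admission).
from scipy.stats import chi2_contingency

#        endpoint reached, endpoint not reached
table = [[5, 15],   # TCZ group (25% of 20)
         [18, 7]]   # SOC group (72% of 25)

# Note: scipy applies Yates' continuity correction to 2x2 tables by default,
# which can yield a somewhat larger p-value than an uncorrected chi-square.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p ~ {p_value:.3f}")
```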

On February 11, 2021, a preprint describing the first randomized controlled trial of TCZ was released as part of the RECOVERY trial (894). Of the 21,550 patients enrolled in the RECOVERY trial at the time, 4,116 adults hospitalized with COVID-19 across 131 sites in the United Kingdom were assigned to the arm of the trial evaluating the effect of TCZ. Among them, 2,022 were randomized to receive TCZ and 2,094 were randomized to SOC, with 79% of patients in each group available for analysis at the time that the initial report was released. The primary outcome measured was 28-day mortality, and TCZ was found to reduce 28-day mortality from 33% of patients receiving SOC alone to 29% of those receiving TCZ, corresponding to a rate ratio of 0.86 (95% CI 0.77-0.96; p = 0.007). TCZ was also significantly associated with the probability of hospital discharge within 28 days for living patients, which was 47% in the SOC group and 54% in the TCZ group (rate ratio 1.22, 95% CI 1.12-1.34, p < 0.0001). A potential statistical interaction between TCZ and corticosteroids was observed, with the combination providing greater mortality benefits than TCZ alone, but the authors note that caution is advisable in light of the number of statistical tests conducted. Combining the RECOVERY trial data with data from seven smaller randomized controlled trials indicates that TCZ is associated with a 13% reduction in 28-day mortality (rate ratio 0.87, 95% CI 0.79-0.96, p = 0.005) (894).

There are possible risks associated with the administration of TCZ for COVID-19. TCZ has been used for over a decade to treat RA (1082), and one study found the drug to be safe for pregnant and breastfeeding women (1083). However, TCZ may increase the risk of developing infections (1082), and RA patients with chronic hepatitis B infections had a high risk of hepatitis B virus reactivation when TCZ was administered in combination with other RA drugs (1084). As a result, TCZ is contraindicated in patients with active infections such as tuberculosis (1085). Previous studies have investigated, with varying results, a possible increased risk of infection in RA patients administered TCZ (1086, 1087), and another study reported that the incidence of infections among RA patients treated with TCZ in clinical practice was higher than the rates reported in clinical trials (1088). In an investigation of 544 Italian COVID-19 patients, the group treated with TCZ was found to be more likely to develop secondary infections, with 24% compared to 4% in the control group (p < 0.0001) (888). Reactivation of hepatitis B and herpes simplex virus 1 was also reported in a small number of patients in this study, all of whom were receiving TCZ. A July 2020 case report described negative outcomes of two COVID-19 patients after receiving TCZ, including one death; however, both patients were intubated and had entered septic shock prior to receiving TCZ (1089), likely indicating a severe level of cytokine production. Additionally, D-dimer and sIL2R levels were reported by one study to increase in patients treated with TCZ, which raised concerns because of the potential association between elevated D-dimer levels and thrombosis and between sIL2R and diseases where T-cell regulation is compromised (889). An increased risk of bacterial infection was also identified in a systematic review of the literature, based on the unadjusted estimates reported (1079). In the RECOVERY trial, however, only three out of 2,022 participants in the group receiving TCZ developed adverse reactions determined to be associated with the intervention, and no excess deaths were reported (894). TCZ administration to COVID-19 patients is not without risks and may introduce additional risk of developing secondary infections; however, while caution may be prudent when treating patients who have latent viral infections, the results of the RECOVERY trial indicate that adverse reactions to TCZ are very rare among COVID-19 patients broadly.

In summary, approximately 33% of hospitalized COVID-19 patients develop ARDS (1090), which is caused by an excessive early response of the immune system that can be a component of CRS/CSS (889, 1085). IL-6 is a key mediator of this overwhelming inflammation. TCZ blocks the IL-6 receptor and therefore may dampen the inflammatory pathway that leads to the cytokine storm. This mechanism suggests that TCZ could be beneficial for the treatment of COVID-19 patients experiencing excessive immune activity, and the RECOVERY trial reported a reduction in 28-day mortality. Interest in TCZ as a treatment for COVID-19 was also supported by two meta-analyses (1079, 1091), but a third meta-analysis found that all of the available literature at that time carried a risk of bias (1081). Additionally, different studies used different dosages, numbers of doses, and methods of administration. Ongoing research may be needed to optimize administration of TCZ (1092), although similar results were reported by one study for intravenous and subcutaneous administration (888). Clinical trials that are in progress are likely to provide additional insight into the effectiveness of this drug for the treatment of COVID-19 along with how it should be administered.

5.7 Interferons

IFNs are a family of cytokines critical to activating the innate immune response against viral infections. Interferons are classified into three categories based on their receptor specificity: types I, II and III (150). Specifically, type I IFNs (IFN-𝛼 and IFN-𝛽) and type II IFN (IFN-𝛾) induce the expression of antiviral proteins (1093). Among these IFNs, IFN-𝛽 has already been found to strongly inhibit the replication of other coronaviruses, such as SARS-CoV-1, in cell culture, while IFN-𝛼 and 𝛾 were shown to be less effective in this context (1093). There is evidence that patients with higher susceptibility to ARDS indeed show a deficiency in IFN-𝛽. For instance, infection with other coronaviruses impairs IFN-𝛽 expression and synthesis, allowing the virus to escape the innate immune response (1094). On March 18, 2020, Synairgen plc received approval to start a phase II trial for SNG001, an IFN-𝛽-1a formulation to be delivered to the lungs via inhalation (901). SNG001, which contains recombinant interferon beta-1a, was previously shown to be effective in reducing viral load in an in vivo model of swine flu and in vitro models of other coronavirus infections (1095). In July 2020, a press release from Synairgen stated that SNG001 reduced progression to ventilation in a double-blind, placebo-controlled, multi-center study of 101 patients with an average age in the late 50s (902). These results were subsequently published in November 2020 (903). The study reports that participants were assigned at a 1:1 ratio to receive either SNG001 or a placebo lacking the active compound, administered by inhalation for up to 14 days. The primary outcome they assessed was the change in patients’ score on the WHO Ordinal Scale for Clinical Improvement (OSCI) at trial day 15 or 16. SNG001 was associated with an odds ratio of improvement on the OSCI scale of 2.32 (95% CI 1.07 – 5.04, p = 0.033) in the intention-to-treat analysis and 2.80 (95% CI 1.21 – 6.52, p = 0.017) in the per-protocol analysis, corresponding to significant improvement in the SNG001 group on the OSCI at day 15/16. Some of the secondary endpoints analyzed also showed differences: at day 28, the OR for clinical improvement on the OSCI was 3.15 (95% CI 1.39 – 7.14, p = 0.006), and the differences in the odds of recovery at day 15/16 and at day 28 were also significant between the two groups. Thus, this study suggested that IFN-𝛽-1a administered via SNG001 may improve clinical outcomes.

In contrast, the WHO Solidarity trial reported no significant effect of IFN-β-1a on patient survival during hospitalization (811). Here, the primary outcome analyzed was in-hospital mortality; IFN-β-1a was administered to 2,050 patients, and their outcomes were compared to those of 2,050 controls, yielding a rate ratio of 1.16 (95% CI, 0.96 to 1.39; p = 0.11). However, there are a few reasons that the different findings of the two trials might not speak to the underlying efficacy of this treatment strategy. One important consideration is the stage of COVID-19 infection analyzed in each study. The Synairgen trial enrolled only patients who were not receiving invasive ventilation, corresponding to a less severe stage of disease than many patients enrolled in the Solidarity trial, as well as a lower overall rate of mortality (1096). Additionally, the methods of administration differed between the two trials, with the Solidarity trial administering IFN-β-1a subcutaneously (1096). The differences in findings between the studies suggest that the method of administration might be relevant to outcomes, with nebulized IFN-β-1a more directly targeting receptors in the lungs. A trial that analyzed the effect of subcutaneously administered IFN-β-1a on patients with ARDS between 2015 and 2017 had also reported no effect on 28-day mortality (1097), while a smaller study analyzing the effect of subcutaneous IFN administration did find a significant improvement in 28-day mortality for COVID-19 (1098). At present, several ongoing clinical trials are investigating the potential effects of IFN-β-1a, including in combination with therapeutics such as remdesivir (1099) and administered via inhalation (901). Thus, as additional information becomes available, a more detailed understanding of whether and under which circumstances IFN-β-1a is beneficial to COVID-19 patients should develop.

5.8 Potential Avenues of Interest for Therapeutic Development

Given what is currently known about these therapeutics for COVID-19, a number of related therapies beyond those explored above may also prove to be of interest. For example, the demonstrated benefit of dexamethasone and the ongoing potential of tocilizumab for treatment of COVID-19 suggest that other anti-inflammatory agents might also hold value for the treatment of COVID-19. Current evidence supporting the treatment of severe COVID-19 with dexamethasone suggests that the need to curtail the cytokine storm inflammatory response outweighs the risks of immunosuppression, and other anti-inflammatory agents may therefore benefit patients in this phase of the disease. While dexamethasone is considered widely available and generally affordable, the high costs of biologics such as tocilizumab therapy may present obstacles to wide-scale distribution of this drug if it proves of value. At the doses used for RA patients, the cost for tocilizumab ranges from $179.20 to $896 per dose for the IV form and $355 for the pre-filled syringe (1100). Several other anti-inflammatory agents used for the treatment of autoimmune diseases may also be able to counter the effects of the cytokine storm induced by the virus, and some of these, such as cyclosporine, are likely to be more cost-effective and readily available than biologics (1101). While tocilizumab targets IL-6, several other inflammatory markers could be potential targets, including TNF-α. Inhibition of TNF-α by a compound such as etanercept was previously suggested for treatment of SARS-CoV-1 (1102) and may be relevant for SARS-CoV-2 as well. Another antibody targeting IL-6 signaling, sarilumab, is also being investigated (1103, 1104). Baricitinib and other small-molecule inhibitors of the Janus-activated kinase pathway also curtail the inflammatory response and have been suggested as potential options for SARS-CoV-2 infections (1105). Baricitinib, in particular, may be able to reduce the ability of SARS-CoV-2 to infect lung cells (1106). Clinical trials studying baricitinib in COVID-19 have already begun in the US and in Italy (1107, 1108). Identification and targeting of further inflammatory markers that are relevant in SARS-CoV-2 infection may be of value for curtailing the inflammatory response and lung damage.

In addition to immunosuppressive treatments, which are most beneficial late in disease progression, much research is focused on identifying therapeutics for early-stage patients. For example, although studies of HCQ have not supported the early theory-driven interest in this antiviral treatment, alternative compounds with related mechanisms may still have potential. Hydroxyferroquine derivatives of HCQ have been described as a class of bioorganometallic compounds that exert antiviral effects with some selectivity for SARS-CoV-1 in vitro (1109). Future work could explore whether such compounds exert antiviral effects against SARS-CoV-2 and whether they would be safer for use in COVID-19.

Another potential approach is the development of antivirals, which could be broad-spectrum, specific to coronaviruses, or targeted to SARS-CoV-2. Development of new antivirals was complicated by the fact that, at the start of the pandemic, none had been approved for human coronaviruses. Intriguing new options are emerging, however. Beta-D-N4-hydroxycytidine is an orally bioavailable ribonucleoside analog showing broad-spectrum activity against RNA viruses, which may inhibit SARS-CoV-2 replication in vitro and in vivo in mouse models of HCoVs (1110). A range of other antivirals are also in development. Development of antivirals will be further facilitated as research reveals more information about the interaction of SARS-CoV-2 with the host cell and host cell genome, mechanisms of viral replication, mechanisms of viral assembly, and mechanisms of viral release to other cells; this can allow researchers to target specific stages and structures of the viral life cycle. Finally, antibodies against viruses, also known as antiviral monoclonal antibodies, could be an alternative as well and are described in detail in a previous section. The goal of antiviral antibodies is to neutralize viruses through either cell-killing activity or blocking of viral replication (1111). They may also engage the host immune response, encouraging the immune system to home in on the virus. Given the cytokine storm that results from immune system activation in response to the virus, which has been implicated in worsening of the disease, a neutralizing antibody (nAb) may be preferable. Upcoming work may explore the specificity of nAbs for their target, mechanisms by which the nAbs impede the virus, and improvements to antibody structure that may enhance the ability of the antibody to block viral activity.

Some research is also investigating potential therapeutics and prophylactics that would interact with components of the innate immune response. For example, TLRs are pattern recognition receptors that recognize pathogen- and damage-associated molecular patterns and contribute to innate immune recognition and, more generally, promotion of both the innate and adaptive immune responses (147). In mouse models, poly(I:C) and CpG, which are agonists of Toll-like receptors TLR3 and TLR9, respectively, showed protective effects when administered prior to SARS-CoV-1 infection (1112). Therefore, TLR agonists hold some potential for broad-spectrum prophylaxis.

6 Application of Traditional Vaccine Development Strategies to SARS-CoV-2

6.1 Abstract

Over the past 150 years, vaccines have revolutionized the relationship between people and disease. During the COVID-19 pandemic, technologies such as mRNA vaccines have received attention due to their novelty and successes. However, more traditional vaccine development platforms have also yielded important tools in the worldwide fight against the SARS-CoV-2 virus.

A variety of approaches have been used to develop COVID-19 vaccines that are now authorized for use in countries around the world. In this review, we highlight strategies that focus on the viral capsid and outwards, rather than on the nucleic acids inside. These approaches fall into two broad categories: whole-virus vaccines and subunit vaccines. Whole-virus vaccines use the virus itself, either in an inactivated or attenuated state. Subunit vaccines contain instead an isolated, immunogenic component of the virus. Here, we highlight vaccine candidates that apply these approaches against SARS-CoV-2 in different ways. In a companion manuscript, we review the more recent and novel development of nucleic-acid based vaccine technologies.

We further consider the role that these COVID-19 vaccine development programs have played in prophylaxis at the global scale. Well-established vaccine technologies have proved especially important to making vaccines accessible in low- and middle-income countries. Vaccine development programs that use established platforms have been undertaken in a much wider range of countries than those using nucleic-acid-based technologies, which have been led by wealthy Western countries. Therefore, these vaccine platforms, though less novel from a biotechnological standpoint, have proven to be extremely important to the management of SARS-CoV-2.

6.2 Importance

As of March 9, 2023, there have been at least 676,570,149 SARS-CoV-2 cases, and the virus has taken the lives of at least 6,881,802 people globally (732). The development, production, and distribution of vaccines are imperative to saving lives, preventing illness, and reducing the economic and social burdens caused by the COVID-19 pandemic. Vaccines that use cutting-edge biotechnology have played an important role in mitigating the effects of SARS-CoV-2. However, more traditional methods of vaccine development that were refined throughout the twentieth century have been especially critical to increasing vaccine access worldwide. Effective deployment is necessary for reducing the susceptibility of the world’s population, which is especially important in light of emerging variants. In this review, we discuss the safety, immunogenicity, and distribution of vaccines developed using established technologies. In a separate review, we describe the vaccines developed using nucleic acid-based vaccine platforms. From the current literature, it is clear that the well-established vaccine technologies are also highly effective against SARS-CoV-2 and are being used to address the challenges of COVID-19 globally, including in low- and middle-income countries. This worldwide approach is critical for reducing the devastating impact of SARS-CoV-2.

6.3 Introduction

The development of vaccines is widely considered one of the most important medical advances in human history. Over the past 150 years, several approaches to vaccination have been developed and refined (1113). The COVID-19 pandemic has produced unusual circumstances compared to past health crises, leading to differences in vaccine development strategies. One way in which the COVID-19 pandemic differs from prior global health crises is that the SARS-CoV-2 viral genome was sequenced, assembled, and released very early in the course of the pandemic (January 2020). This genomic information has informed the biomedical response to this novel pathogen across several dimensions (1, 3). All the same, vaccines have been developed since long before the concept of a virus or a viral genome was known, and as early as September 2020, there were over 180 vaccine candidates against SARS-CoV-2 in development, many of which employed more traditional vaccine technologies (1114). However, public attention in the United States and elsewhere has largely focused on vaccine development platforms that use new technologies, especially mRNA vaccines. We review vaccine technologies used for SARS-CoV-2 in two parts: here, the application of established vaccine development platforms to SARS-CoV-2, and separately, novel nucleic acid-based approaches (6).

Understanding vaccine development programs that use well-established technologies is important for a global perspective on COVID-19. As of May 3, 2023, 50 SARS-CoV-2 vaccines have been approved for use in at least one country (1115). A resource tracking the distribution of 28 vaccines indicates that, as of May 31, 2023, 13.0 billion doses have been administered across 223 countries (1116). Many of these vaccines use platforms that do not require information about the viral genome, with 20 developed using subunit and 11 using whole-virus approaches (1115). The types of vaccines available vary widely throughout the world, as the process of developing and deploying a vaccine is complex and often requires coordination between government, industry, academia, and philanthropic entities (1117).

Another difference between prior global health crises and the COVID-19 pandemic is the way that vaccines are evaluated. A vaccine’s success is often discussed in terms of vaccine efficacy (VE), which describes the protection conferred during clinical trials (1118). The real-world protection offered by a vaccine is referred to as its effectiveness (1118). Additionally, protection can mean different things in different contexts. In general, the goal of a vaccine is to prevent disease, especially severe disease, rather than infection itself. As a proxy for VE, vaccine developers often test their candidates for serum neutralizing activity, which has been proposed as a biomarker for adaptive immunity in other respiratory illnesses (1119). The duration and intensity of the COVID-19 pandemic have made it possible to test multiple vaccines in phase III trials, where the effect of the vaccines on a cohort’s likelihood of contracting SARS-CoV-2 can be evaluated, whereas this has not always been feasible for other infectious diseases. In some cases (e.g., SARS), the pathogen was controlled before vaccine candidates were available, while in others (e.g., MERS), the scale of the epidemic was smaller. Vaccine development is traditionally a slow process, but the urgency of the COVID-19 pandemic created an atypical vaccine development ecosystem where fast development and production were prioritized. Estimates of VE have been released for many vaccine candidates across a number of technology types based on phase III trial data.
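
As a point of reference, VE is conventionally computed as one minus the relative risk of disease in the vaccinated group compared with the placebo group; the sketch below applies this definition to hypothetical case counts for illustration only.

```python
# Minimal sketch: vaccine efficacy (VE) as one minus the relative risk of
# disease in the vaccinated versus the placebo group. The counts below are
# hypothetical and used only to illustrate the calculation.
cases_vaccinated, n_vaccinated = 10, 15_000
cases_placebo, n_placebo = 100, 15_000

attack_rate_vaccinated = cases_vaccinated / n_vaccinated
attack_rate_placebo = cases_placebo / n_placebo

ve = 1 - attack_rate_vaccinated / attack_rate_placebo
print(f"Vaccine efficacy: {ve:.0%}")  # 90% for these hypothetical counts
```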

However, efficacy is not a static value, and both trial efficacy and real-world effectiveness can vary across location and over time. Shifts in effectiveness in particular have been a heightened concern since late 2021, given the potential for variants of SARS-CoV-2 to influence VE. Due to viral evolution, vaccine developers are in an arms race with a pathogen that benefits from mutations that reduce its susceptibility to adaptive immunity. The evolution of several variants of concern (VOC) presents significant challenges for vaccines developed based on the index strain identified in Wuhan in late 2019. We discuss these variants in depth elsewhere (341). To date, the most significant VOC identified are Alpha (2020), Beta (2020), Gamma (2020), Delta (2021), and Omicron (2021), with various subvariants of Omicron being the most recently identified (2022). The timing of each study relative to the dominant VOC in the region where participants were recruited is important context for a complete picture of efficacy. Therefore, the efficacy and/or effectiveness of vaccines in the context of these variants is discussed where information is available.

Beyond the variability introduced by time and geography, efficacy within a trial and effectiveness in the real-world setting can also differ due to cohort differences. Patients participating in a clinical trial are likely to receive more medical oversight, resulting in better follow-up, adherence, and patient engagement (1120). Additionally, the criteria for participant inclusion in a trial often bias trials towards selection of younger, healthier individuals (1121). The ability of an RCT to accurately assess safety is also limited, as rare adverse events (AEs) may only become apparent at a larger scale (1121). Therefore, while clinical trials are the gold standard for evaluating vaccines for COVID-19, the results of these trials must be considered in a broader context when real-world data is available.

While the relationship between a vaccine and a pathogen is not static, the data clearly demonstrates that a variety of efficacious vaccines have been developed against SARS-CoV-2. Here we discuss a selection of programs that use well-established vaccine biotechnologies. These programs have been undertaken worldwide, in complement to the more cutting-edge approaches developed and distributed in the United States (U.S.), the European Union (E.U.), the United Kingdom (U.K.), and Russia (6). In this review, we discuss vaccine development using two well-established technologies, whole-virus vaccination and subunit vaccination, and the role these technologies have played in the global response to the COVID-19 pandemic.

6.4 Whole-Virus Vaccines

Whole-virus vaccines have the longest history among vaccine development approaches. Variolation, which is widely considered the first vaccination strategy in human history, is one example (1122, 1123). Famously, protection against smallpox was induced by exposing healthy individuals to material from the pustules of infected individuals (variolation) or, in the earliest vaccinations, to what was believed to be either cowpox or horsepox (1122–1125). This approach worked by inducing a mild case of a disease. Therefore, while whole-virus vaccines confer adaptive immunity, they also raise safety concerns (1124, 1126, 1127). As of 2005, most vaccines still used whole-virus platforms (1128), and these technologies remain valuable tools in vaccine development today (1113). Whole virus vaccine candidates have been developed for COVID-19 using both live attenuated viruses and inactivated whole viruses.

6.4.1 Live-Attenuated Virus Vaccines

Live-attenuated virus vaccines (LAV), also known as replication-competent vaccines, use a weakened, living version of a disease-causing virus or a version of a virus that is modified to induce an immune response (1114). Whether variolation is the first example of a LAV being used to induce immunity is debated (1113, 1126). The first deliberate (albeit pathogen-naïve) attempt to develop an attenuated viral vaccine dates back to Louis Pasteur’s efforts in 1885 to inoculate a child against rabies (1129). The next intentional LAVs were developed against the yellow fever virus in 1935 and influenza in 1936 (1130).

Early efforts in LAV development relied on either the identification of a related virus that was less virulent in humans (e.g., cowpox/horsepox or rotavirus vaccines) or the culturing of a virus in vitro (1113, 1124). Today, a virus can be attenuated by passaging it in a foreign host until, due to selection pressure, the virus loses its virulence in the original host. Alternatively, selective gene deletion or codon deoptimization can be utilized to attenuate the virus (1114), or foreign antigens can be integrated into an attenuated viral vector (1131). Replication of LAVs tends to be restricted to the tissues around the site of inoculation (1130), and some can be administered intranasally (1114).

Today, LAVs are used globally to prevent diseases caused by viruses such as measles, mumps, rubella, polio, influenza, varicella zoster, and the yellow fever virus (1132). There were attempts to develop LAVs against both SARS-CoV-1 and MERS-CoV (1133), but no vaccines were approved. It is generally recognized that LAVs induce an immune response similar to natural infection, and they are favored because they induce long-lasting and robust immunity that can protect from disease. This strong protective effect is induced in part by the immune response to the range of viral antigens available from LAV, which tend to be more immunogenic than those from non-replicating vaccines (1126, 1133, 1134).

6.4.2 LAV Vaccines and COVID-19

To date, LAVs have not been widely deployed against SARS-CoV-2 and COVID-19. All the same, there are several COVID-19 LAV candidates in the early (preclinical/phase I) stages of investigation. These candidates utilize different approaches. Interestingly, several candidates (Meissa Vaccines’ MV-014-212 and Codagenix’s COVI-VAC, specifically) are administered intranasally, potentially improving accessibility by eliminating the need for sterile needles, reducing manufacturing costs, conferring mucosal immunity, and mitigating some issues related to vaccine hesitancy (1135, 1136). Additionally, although no phase III trial data is available for LAV vaccine candidates, some manufacturers have proactively sought to respond to the emergence of VOC. Therefore, the original approach to vaccination may still prove extremely advantageous in the high-tech landscape of COVID-19 vaccine development.

6.4.2.1 YF-S0

One candidate in the preclinical stage is YF-S0, a single-dose LAV developed at Belgium’s Katholieke Universiteit Leuven that uses live-attenuated yellow fever 17D (YF17D) as a vector for a noncleavable prefusion conformation of the spike antigen of SARS-CoV-2 (1131). YF-S0 induced a robust immune response in three animal models and prevented SARS-CoV-2 infection in macaques and hamsters (1131). Additionally, the protective effect of YF-S0 against VOC has been investigated in hamsters (1137). Even for a small number of hamsters that developed breakthrough infections after exposure to the index strain or the Alpha variant, viral loads were very low (1137). However, much higher rates of breakthrough infection and higher viral loads were observed when the hamsters were exposed to the Beta variant (1137). Reduced seroconversion and nAb titers were also observed against the Beta and Gamma variants (1137). As a result, a modified version of YF-S0, called YF-S0*, was developed to include a modified spike protein intended to increase immunogenicity by including the full spectrum of amino acids found in the Gamma VOC as well as stabilizing the S protein’s conformation (1137). The updated vaccine was again tested in Syrian golden hamsters (1137). No breakthrough infections were observed following vaccination with YF-S0* and exposure to the index strain and the Alpha, Beta, Gamma, and Delta variants (1137). YF-S0* also reduced the infectious viral load of several VOC (Alpha, Beta, Gamma, and Delta) in the lungs relative to a sham comparison (1137), and the likelihood of the Delta variant spreading to unvaccinated co-housed hamsters was significantly reduced by YF-S0* (1137). The updated vaccine was also associated with the increased production of nAbs against the Omicron variant compared to YF-S0 (1137).

6.4.2.2 COVI-VAC

Other programs are developing codon-deoptimized LAV candidates (1138–1140). This approach follows the synthetic attenuated virus engineering (SAVE) strategy to select codon substitutions that are suboptimal for the virus (1140, 1141). New York-based Codagenix and the Serum Institute of India reported a successful preclinical investigation (1140) of an intranasally administered deoptimized SARS-CoV-2 LAV known as COVI-VAC, and COVI-VAC entered phase I human trials and dosed its first participants in January 2021 (1139, 1142). This vaccine is attenuated through the removal of the furin cleavage site (see (1) for a discussion of this site’s importance) and deoptimization of 283 codons (1143). The results of the COVI-VAC phase I human trials are expected soon (1142).
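
The principle behind codon deoptimization can be illustrated with a toy example: synonymous codon substitutions leave the encoded protein unchanged while shifting the genome toward codons the host translates inefficiently. The Python sketch below is a minimal illustration of this idea using a made-up, two-amino-acid codon-usage table; it is not the SAVE pipeline used for COVI-VAC, which operates over hundreds of codons using empirically derived codon and codon-pair statistics.

```python
from typing import Dict

# Toy codon-usage weights (fraction of use per amino acid) for two residues;
# a real deoptimization pipeline would use a full 61-codon human usage table.
CODON_USAGE: Dict[str, Dict[str, float]] = {
    "L": {"CTG": 0.40, "CTC": 0.20, "TTA": 0.07},
    "R": {"CGC": 0.19, "AGA": 0.20, "CGT": 0.08},
}

def deoptimize(protein: str) -> str:
    """Recode a protein sequence using the rarest synonymous codon for each residue.

    The amino acid sequence is preserved, but translation of the recoded gene
    is expected to be slower, which is the basis of codon-deoptimized LAVs.
    """
    return "".join(min(CODON_USAGE[aa], key=CODON_USAGE[aa].get) for aa in protein)

print(deoptimize("LR"))  # prints "TTACGT"
```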

Other results suggest both potential benefits and risks to the COVI-VAC vaccine candidate. Preclinical results suggest that the vaccine candidate may confer some protection against VOC even though it was designed based on the index strain: a poster reported that Syrian golden hamsters that received COVI-VAC were significantly less likely to lose weight following viral challenge with the Beta VOC (1143). On the other hand, some concerns have arisen about the possibility of spillover from LAV vaccines. A December 2022 study analyzed SARS-CoV-2 samples isolated from COVID-19 patients in India and identified two extremely similar sequences collected on June 30, 2020 that showed a high level of recombination relative to the dominant strains at the time (1144). Comparing these samples to a database of SARS-CoV-2 sequences revealed they were most similar to the sequence used for COVI-VAC (1144). Based on phylogenetic reconstruction, the authors argued that these SARS-CoV-2 isolates were most likely to have spilled over from COVI-VAC trials (1144). If this were a case of spillover, the effect seems to have been limited, as these sequences were just two among over 1,600 analyzed. However, these concerns may be one consideration in the development of LAV vaccines for COVID-19.

6.4.2.3 Meissa Vaccines MV-014-212

Another company, Meissa Vaccines in Kansas, U.S., which also develops vaccines for respiratory syncytial virus (RSV), has developed an intranasal live-attenuated chimeric vaccine MV-014-212 (1145). Chimeric vaccines integrate genomic content from multiple viruses to create a more stable LAV (1146). To develop a SARS-CoV-2 vaccine candidate, Meissa Vaccines built on their prior work developing RSV vaccines (1145). A live attenuated recombinant strain of RSV previously investigated as a vaccine candidate was modified to replace two surface glycoproteins with a chimeric protein containing components of the SARS-CoV-2 Spike protein as well as the RSV fusion (F) protein (1145). Preclinical results describing the intranasal administration of MV-014-212 to African green monkeys and mice indicated that the vaccine candidate produced neutralizing antibodies (nAb) as well as a cellular immune response to SARS-CoV-2 challenge, including the Alpha, Beta, and Delta VOC (1145). Enrollment for phase I human trials began in March 2021 and recruitment is ongoing (1139, 1147).

6.4.2.4 Bacillus Calmette-Guerin Vaccines

Finally, the Bacillus Calmette-Guérin (BCG) vaccine, a live-attenuated vaccine based on an attenuated bacterium rather than a virus, is being investigated for the prophylaxis of COVID-19 (see online Appendix (1148)). The purpose of the BCG vaccine is to prevent tuberculosis, but non-specific effects against other respiratory illnesses have suggested a possible benefit against COVID-19 (1149). However, a multicenter trial that randomly assigned participants 60 years and older to vaccination with BCG (n = 1,008) or placebo (n = 1,006) found that BCG vaccination had no effect on the incidence of SARS-CoV-2 infection or other respiratory infections over the course of 12 months (1150). Despite these findings, BCG vaccination was associated with a stronger cytokine (specifically, IL-6) response following ex vivo stimulation of peripheral blood mononuclear cells in patients with no known history of COVID-19 (1150). Additionally, SARS-CoV-2-positive individuals who had received the BCG vaccine one year prior showed increased immunoglobulin (Ig) responses to the SARS-CoV-2 spike protein and receptor binding domain (RBD) relative to individuals who had received a placebo vaccine (1150). Currently, investigations of BCG vaccines against COVID-19 are being sponsored by institutes in Australia in collaboration with the Bill and Melinda Gates Foundation (1151) and by Texas A&M University in collaboration with numerous other U.S. institutions (1152).

6.4.2.5 Summary of LAV Vaccine Development

LAV vaccines for COVID-19 have not advanced as far in development as vaccines developed using other technologies. As of December 2022, COVI-VAC was the only LAV vaccine candidate in phase III clinical trials (1153). As a result, comprehensive human safety data for these candidates are not yet available. In general, though, safety concerns previously associated with LAV have been largely mitigated in the modern manufacturing process. Manufacturers use safe and reliable methods to produce large quantities of vaccines once they have undergone rigorous preclinical studies and clinical trials to evaluate their safety and efficacy. However, one remaining safety concern may be contributing to the relatively slow emergence of LAV candidates against COVID-19: they still may present risk to individuals who are immunocompromised (1154), which is an even greater concern when dealing with a novel virus and disease. Additional data are needed to ascertain how this technology performs in the case of SARS-CoV-2 and whether rare cases of spillover have indeed occurred. Additionally, modifications to the design of individual vaccine candidates may make this protection more robust as SARS-CoV-2 evolves, as the limited data about LAV performance against VOC suggest. Despite the long and trusted history of LAV development, this vaccine strategy has not been favored against COVID-19, as other technologies offer greater expediency, and fewer safety concerns, than the time-consuming process of attenuating a novel virus.

6.5 Inactivated Whole-Virus Vaccines

Inactivated whole-virus (IWV) vaccines are another well-established vaccine platform. This platform uses full virus particles generally produced via cell culture that have been rendered non-infectious by chemical (e.g., formaldehyde or β-propiolactone (1155)) or physical (e.g., heat or ultraviolet radiation) means. In general, these vaccines mimic the key properties of the virus that stimulate a robust immune response, but the risk of adverse reactions is reduced because the virus is inactivated and thus unable to replicate. Though these viral particles are inactivated, they retain the capacity to prime the immune system. The size of the viral particle makes it ideal for uptake by antigen-presenting cells, which leads to the stimulation of helper T-cells (1156). Additionally, the array of epitopes on the surface of the virus increases antibody binding efficiency (1156). The native conformation of the surface proteins, which is also important for eliciting an immune response, is preserved using these techniques (1157). Immune responses to membrane proteins, which support B-cell responses to surface proteins, are also induced by this method (1158).

IWV vaccines have been a valuable tool in efforts to control many viruses. Some targets of IWV vaccines have included influenza viruses, poliovirus, and hepatitis A virus. Inactivated vaccines can generally be generated relatively quickly once the pathogenic virus has been isolated and can be passaged in cell culture (1133, 1159). During the COVID-19 pandemic, IWV vaccine development was also fast, although the World Health Organization (WHO) has been slower to approve IWV vaccine candidates than those developed with nucleic acid-based technologies. In China, the first emergency use authorization (EUA) was granted to an IWV vaccine in July 2020, with full approval following that December (1160, 1161). The fact that these vaccines have not received as much public attention (at least in Western media) as nucleic acid vaccines for SARS-CoV-2 may be due at least in part to the novelty of nucleic acid vaccine technologies (1162), which are more modular and immunogenic (6).

Past applications to human coronaviruses (HCoV) have focused predominantly on SARS-CoV-1. Preclinical studies have demonstrated that IWV SARS-CoV-1 vaccine candidates elicited immune responses in vivo. These vaccines generated nAb titers at concentrations similar to those evoked by recombinant protein vaccines (1157, 1163). Studies in ferrets and non-human primates demonstrated that IWV vaccines can offer protection against infection due to nAb and SARS-CoV-1-specific T cell responses (1164). However, several attempts to develop IWV vaccines against both SARS-CoV-1 and MERS-CoV were hindered by incidences of vaccine-associated disease enhancement (VADE) in preclinical studies (1165). In one example of a study in macaques, an inactivated SARS-CoV-1 vaccine induced even more severe lung damage than the virus due to an enhanced immune reaction (1166). Independent studies in mice also demonstrated evidence of lung immunopathology due to VADE in response to MERS-CoV IWV vaccination (1167, 1168). The exact mechanisms responsible for VADE remain elusive due to the specificity of the virus-host interactions involved, but VADE is the subject of investigation in preclinical SARS-CoV-2 vaccine studies to ensure the safety of any potential vaccines that may reach phase I trials and beyond (1165).

6.5.1 Application to COVID-19

Table 2: Inactivated whole-virus vaccines approved in at least one country (1169) as of May 3, 2023 (1115).
Vaccine | Company
Covaxin | Bharat Biotech
KoviVac | Chumakov Center
Turkovac | Health Institutes of Turkey
FAKHRAVAC (MIVAC) | Organization of Defensive Innovation and Research
QazVac | Research Institute for Biological Safety Problems (RIBSP)
KCONVAC | Shenzhen Kangtai Biological Products Co
COVIran Barekat | Shifa Pharmed Industrial Co
Covilo | Sinopharm (Beijing)
Inactivated (Vero Cells) | Sinopharm (Wuhan)
CoronaVac | Sinovac
VLA2001 | Valneva

Several whole-virus vaccines have been developed against COVID-19 and are available in countries around the world (Table 2). As of May 31, 2023, 10 vaccines developed with IWV technology are being distributed in 120 countries (Figure 6). Evidence about the value of these vaccines to combat SARS-CoV-2 is available not only from clinical trials, but also from their roll-out following approval. Here, a major consideration has been that vaccines often lose efficacy as mutations accumulate in the epitopes of the circulating virus; IWV vaccines may be particularly affected in such cases (1127). This loss of specificity over time is likely to be influenced by the evolution of the virus, and specifically by the rate of evolution in the region of the genome that codes for the antigenic spike protein. Here we review three vaccine development programs and their successes in a real-world setting.

Figure 6: Worldwide availability of vaccines developed using inactivated whole viruses. This figure reflects the number of vaccines based on whole inactivated virus technology that were available in each country as of May 31, 2023. These data are retrieved from Our World in Data (532) and plotted using geopandas (1170). The color scale is based on the number of vaccines of this type included in the OWID dataset as a whole, not the maximum observed in a single country. See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily.
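
For readers interested in reproducing a map like Figure 6, a choropleth of this kind can be drawn with a few lines of geopandas. The sketch below is a minimal illustration rather than the exact plotting code used for the figure: the OWID-derived column names and example counts are placeholders, and geopandas 1.0+ no longer bundles the Natural Earth boundaries, so a locally downloaded shapefile may need to be substituted.

```python
import geopandas as gpd
import matplotlib.pyplot as plt
import pandas as pd

# Placeholder input: one row per country with an ISO3 code and the number of
# inactivated whole-virus vaccines reported for that country.
counts = pd.DataFrame(
    {"iso_code": ["CHN", "BRA", "TUR"], "n_iwv_vaccines": [3, 2, 1]}
)

# World boundaries; replace with a local Natural Earth shapefile if the
# bundled dataset is unavailable in your geopandas version.
world = gpd.read_file(gpd.datasets.get_path("naturalearth_lowres"))

# Join the counts onto country geometries and draw the choropleth, fixing the
# upper bound of the color scale to the total number of vaccines tracked
# rather than the maximum observed in any single country.
merged = world.merge(counts, how="left", left_on="iso_a3", right_on="iso_code")
ax = merged.plot(
    column="n_iwv_vaccines",
    cmap="Blues",
    vmin=0,
    vmax=10,  # total IWV vaccines in the dataset, per the caption's color-scale note
    legend=True,
    missing_kwds={"color": "lightgrey"},
    figsize=(12, 6),
)
ax.set_axis_off()
plt.savefig("iwv_vaccine_availability.png", dpi=300, bbox_inches="tight")
```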

6.5.1.1 Sinovac’s CoronaVac

One IWV vaccine, CoronaVac, was developed by Beijing-based biopharmaceutical company Sinovac. The developers propagated a SARS-CoV-2 strain collected in China using Vero cells and inactivated it with β-propiolactone (1133). The vaccine is coupled with an aluminum adjuvant to increase immunogenicity (1133). Administration follows a prime-boost regimen using a 0.5 ml dose containing 3 μg of inactivated SARS-CoV-2 virus per dose (1171). In preclinical studies and phase I and II clinical trials, CoronaVac elicited a strong immunogenic response in animal models and induced nAbs in human participants (1172–1174). The phase I/II clinical trials were conducted in adults 18-59 years old (1174) and adults over 60 years old (1172) in China. Safety analysis of the CoronaVac vaccine during the phase II trial revealed that most adverse reactions were either mild (grade 1) or moderate (grade 2) in severity (1174). In adults aged 18 to 59 years receiving a variety of dosage schedules, injection site pain was consistently the most common symptom reported (1174). In older adults, the most common local and systemic reactions were pain at the injection site (9%) and fever (3%), respectively (1172).

As of December 2022, a total of 17 CoronaVac trials had been registered in a variety of locations including the Philippines and Hong Kong (1175). Two of the earliest phase III trials to produce results examined a two-dose regimen of CoronaVac following a 14-day prime-boost schedule (1176, 1177). These trials were conducted in Turkey (1177) and Chile (1176) and enrolled participants over an identical period between September 2020 and January 2021. The Chilean trial, which reported interim results regarding safety and immunogenicity, found that immunization with CoronaVac induced specific IgG nAbs against the S1 RBD and a robust IFN-γ-secreting T cell response (1176). In the Turkish trial, VE was estimated to be 83.5% against symptomatic COVID-19 (1177). In the safety and immunogenicity study, minimal AEs were reported (1177), and 18.9% of participants in the vaccine arm of the Turkish trial reported AEs compared to 16.9% of participants in the placebo group (1177). However, 2% (n=7) of Turkish participants aged 18 to 59 reported severe AEs (1172), causing the trial to be halted for investigation (1178). The investigation determined that these events were unrelated to the vaccine (1172, 1178).

An additional phase III trial was conducted in Brazil between July and December 2020 following a randomized, multicenter, endpoint-driven, double-blind, placebo-controlled design and enrolling nearly 10,000 healthcare workers (1179, 1180). The preprint reporting the results of this study (1180) described an efficacy of 50.7% against symptomatic COVID-19 and 100% against moderate to severe cases. A large percentage of participants, 77.1% in the vaccine group and 66.4% in the placebo group, reported AEs, including two deaths, but all of the serious AEs were determined to be unrelated to the vaccine (1180). CoronaVac also appears to be suitable for use in immunocompromised patients such as those with autoimmune rheumatic diseases according to phase IV trials (1181), and the vaccine was also well tolerated and induced humoral responses in phase I trials in children aged 3 to 17 years, which will now be examined in phase II and III clinical trials (1182).

Estimates of CoronaVac’s VE have varied across trials. The 50.7% VE reported from the Brazilian trial was contested by Turkish officials, as at the time the efficacy in the Turkish trial appeared to be 91.25% (1183, 1184). Ultimately, after multiple announcements, the efficacy debate was settled at over 50% (1183, 1184). Subsequently, the VE for the Turkish trial was finalized at 83.5% (1177), and a prospective national cohort study in Chile reported an adjusted estimated effectiveness of 66% for the prevention of COVID-19 with an estimated 90% and 87% prevention of hospitalization and death, respectively (1185).

Based on these results, CoronaVac was approved in China, and has now been distributed in 66 countries across Africa, Asia, Europe, North America, and South America, including Brazil, Cambodia, Chile, Colombia, Laos, Malaysia, Mexico, Turkey, Ukraine, and Uruguay (1116, 1186). As of August 2021, Sinovac had reportedly produced over a billion doses of CoronaVac (1186). Outside of trials, rare cases of VADE have been reported in association with the CoronaVac vaccine (1187). In one case study, two male patients both presented with COVID-19 pneumonia following vaccination with CoronaVac (1187). This study identified the timeline of disease presentation, vaccination, and known COVID-19 exposure in the two patients and suggests that the inflammatory response induced by the vaccine could have caused an asymptomatic case of COVID-19 to present with symptoms (1187). However, no causal relationship between CoronaVac and COVID-19 symptom onset was evaluated, and the reports are extremely rare.

CoronaVac’s effectiveness has also been questioned based on real-world data, such as when concerns were raised following reports that over 350 doctors in Indonesia became ill with COVID-19 despite being immunized with CoronaVac (1188). One possible explanation for such outbreaks was the evolution of the virus. Sera from individuals vaccinated with CoronaVac were found to show reduced neutralizing activity against the Alpha, Beta, and Delta VOC relative to the index strain (1189). Similarly, a second study of 25 patients in Hong Kong in late 2021 evaluated serum neutralizing activity against the index strain and the Beta VOC, Delta VOC, and two Omicron isolates (1190). They reported that all individuals were seropositive for nAbs against the index strain, 68% against the Delta variant, and 0% against the Beta VOC and Omicron isolates (1190). The Beta variant appears to be more resistant to nAbs in sera from individuals immunized with CoronaVac than the Alpha variant or wildtype virus, indicating that emerging variants may be of concern (1191). Finally, a fourth study examined sera from 180 Thai healthcare workers vaccinated with CoronaVac and reported that neutralizing activity was significantly reduced against the Alpha, Delta, and Beta variants relative to the index strain (1192). Together, these results suggest that viral evolution is likely to pose a significant challenge to immunity acquired from the CoronaVac vaccine.

Therefore, studies have also evaluated whether booster doses would provide additional protection to individuals vaccinated with CoronaVac. This strategy is supported by the fact that the antibody response elicited by CoronaVac has been found to wane following the second dose, though it was still detected six months out (1193). A phase I/II clinical trial of CoronaVac in an elderly cohort (adults 60 years and older) in China determined that by 6 to 8 months following the second dose, nAb titers were detected below the seropositive cutoff (1194). Data from two phase II trials indicated that the nAb response had declined 6 months after the second dose of the primary series, but a booster dose of CoronaVac administered 8 months after the second dose markedly increased geometric mean titers of SARS-CoV-2 nAbs (1195). Furthermore, Chinese (1196) and Chilean (1197) researchers have investigated administering different vaccines (e.g., an mRNA vaccine dose) as a booster to individuals who have already received two doses of the IWV vaccine CoronaVac. Another study determined that using a viral-vectored vaccine (CanSino’s Convidecia) or an mRNA vaccine (Pfizer/BioNTech’s BNT162b2) instead of CoronaVac as the booster in a prime-boost vaccination regimen could induce a more robust immune response (1198, 1199). The WHO now suggests that a booster dose, either homologous or heterologous, can be considered 4 to 6 months after the primary series, especially for high-risk groups (1200).

6.5.1.2 Sinopharm’s Covilo

Two additional IWV vaccine candidates were developed following a similar approach by the state-owned China National Pharmaceutical Group Co., Ltd., more commonly known as Sinopharm CNBG. One, known as BBIBP-CorV or Covilo, was developed in Beijing using the HB02 variant of SARS-CoV-2. The other was developed at Sinopharm CNBG’s Wuhan Institute using the WIV04 variant of SARS-CoV-2 (1201). The viruses were propagated using Vero cells, inactivated using β-propiolactone, and purified (1201, 1202). Both vaccines are adjuvanted with aluminum hydroxide (1201, 1202). Here, we focus on Covilo.

Preclinical studies indicated that Covilo induced sufficient nAb titers in mice, and a prime-boost immunization scheme of 2 μg/dose was sufficient to protect rhesus macaques from disease (1202). In phase II trials, the Covilo vaccine appeared well-tolerated, with 23% of participants in the vaccine condition (482 total participants, 3:1, vaccine:placebo) reporting at least one adverse reaction characterized as mild to moderate (1203). No evidence of VADE was detected using this vaccine in phase II data (1204). In phase III trials conducted between July and December 2020, Covilo achieved an efficacy of 72.8% and was well tolerated (1205). However, questions were raised about efficacy when Sinopharm affiliates in the UAE claimed in early December 2020 that the vaccine had 86% efficacy, at odds with the 79.34% efficacy reported by a Sinopharm Beijing affiliate later that same month (1206).

Studies have also investigated expected differences in real-world effectiveness of Covilo given the continuing evolution of SARS-CoV-2. The antibody response elicited by Covilo was found to wane, but still to be detectable, by six months following the second dose (1193). One study showed that the Alpha variant exhibited very little resistance to neutralization by sera of those immunized with Covilo, but the Beta variant was more resistant to neutralization by almost a factor of 3 (1191). Another study examined sera from 282 participants and used a surrogate neutralizing assay, a test that generally correlates with nAbs, to determine that Covilo appears to induce nAbs against the binding of the RBD of wild type SARS-CoV-2 and the Alpha, Beta, and Delta variants to ACE2 (1207). Notably, a preprint reported that antisera (i.e., the antibody-containing component of the sera) from 12 people immunized with Covilo exhibited nAb capacity against the Beta variant (B.1.351), wild type SARS-CoV-2 (NB02), and one of the original variants of SARS-CoV-2 (D614G) (1208). As with many other vaccines, booster doses are being evaluated to mitigate some of the issues arising from viral evolution. A study of healthcare workers in China has since indicated that a booster shot of Covilo elevates B cell and T cell responses and increases nAb titers (1209). In May 2021, the UAE announced it would consider booster shots for all citizens who had been immunized with Covilo, which was shortly followed by a similar announcement in Bahrain, and by August 29, 2021, the UAE mandated booster shots for all residents who had received Covilo (1186).

6.5.1.3 Bharat Biotech’s Covaxin

Another IWV vaccine candidate was developed by Bharat Biotech International Ltd., one of the largest vaccine producers globally, in collaboration with the Indian Council of Medical Research (ICMR) National Institute of Virology (NIV). This candidate is known as Covaxin or BBV152. Preclinical studies of Covaxin in hamsters (1210) and macaques (1211) indicated that the vaccine induced protective responses deemed sufficient to move forward to human trials. Phase I (July 2020) and phase I/II (September to October 2020) studies indicated that Covaxin adjuvanted with alum and a Toll-like receptor 7/8 (TLR7/8) agonist was safe and immunogenic (1212, 1213). These two studies demonstrated that the vaccine induced significant humoral and cell-mediated responses, as assessed by measuring binding (1212) and neutralizing (1212, 1213) antibodies, cytokines (1212, 1213), and CD3+, CD4+, and CD8+ T-cells (1212), with some formulations also eliciting Th1-skewed memory T-cell responses (1213). Only mild to moderate side effects were reported upon immunization (1212, 1213).

In India, the Covaxin vaccine received emergency authorization on January 3, 2021, but the phase III data was not released until March 3, 2021, and even then it was communicated via press release (1214). This press release reported 80.6% efficacy in 25,800 participants (1214, 1215), spurring Zimbabwe to follow suit and authorize the use of Covaxin (1216). A detailed preprint describing the double-blind, randomized, controlled phase III trial that enrolled between November 2020 and January 2021 became available in July 2021 (1217), and the results collected as of May 17, 2021 were published in December 2021 (1218). Based on a final enrollment of 25,798 people (~1:1 vaccine:placebo), overall VE against symptomatic COVID-19 was estimated at 77.8% and against severe disease and asymptomatic infection was reported as 93.4% and 63.6%, respectively (1218). The vaccine was also reported to be well tolerated, with fewer severe events occurring in the Covaxin group (0.3%) than in the placebo group (0.5%) (1218). One case of a serious AE potentially related to the vaccine, immune thrombocytopenic purpura, was reported, although this patient was seropositive for SARS-CoV-2 at the baseline observation point (1218). As of June 1, 2023, Covaxin was approved for emergency use in 31 countries across Africa, Asia, Europe, and South America, including Guyana, India, Iran, Zimbabwe, Nepal, Mauritius, Mexico, Paraguay, and the Philippines (1219).

As with all vaccines, the continued evolution of SARS-CoV-2 poses a challenge to the effectiveness of Covaxin. In this case, the phase III clinical trial did evaluate the efficacy of Covaxin in response to variants circulating in mid-to-late 2020 (1218). In agreement with previous studies demonstrating that sera from individuals vaccinated with Covaxin efficiently neutralized the Alpha variant (B.1.1.7) and the Delta variant (B.1.617.2) (1220–1222), the phase III trial reported a 65.2% efficacy against the Delta variant (B.1.617.2) (1218). Another study reported that sera from individuals immunized with Covaxin produced effective nAbs against the Delta variant and the so-called Delta plus variant (AY.1) (1223). Indeed, sera obtained from Covaxin-boosted individuals (n=13) (1224) and from those who were vaccinated with Covaxin and recovered from a breakthrough infection (n=31) also neutralized the Omicron variant (1225). Therefore, the data suggest that the vaccine does continue to confer protection against VOC.

The authorization of Covaxin has also offered opportunities to monitor how well the clinical trial results translate into a real-world setting. For example, an effort to monitor AEs and COVID-19 cases following vaccine roll-out reported that most side effects were mild and that cases were rare, even though this data would seem to have been collected during the severe wave of COVID-19 brought on by the Delta VOC in India in early 2021; at the same time, the sample sizes were extremely small (1226). Similarly, larger studies of adults (June to September 2021) (1227) and adolescents (beginning in January 2022) (1228) who received the vaccine outside of a trial setting reported that safety was similar across age groups: no severe AEs were reported in adults, and no serious AEs were reported in adolescents, although 0.9% of adolescents (6 individuals) reported severe AEs. However, a much lower effectiveness (22-29%) was estimated in a real-world setting during an analysis of cases in healthcare workers from April to May 2021 (1229). All the same, monitoring of hospitalized COVID-19 patients between April and June 2021 indicated that the vaccines were highly effective at preventing severe illness (1230).

It is not yet clear what level of protection Covaxin offers beyond 6 to 8 months after the second dose; consequently, the potential requirement of a booster immunization is being explored (1231). Furthermore, Bharat Biotech is considering other vaccine regimens, such as providing one initial immunization with Covaxin followed by two immunizations with its intranasal vaccine (BBV154) (1232). U.S.-based Ocugen Inc., a co-development partner of Bharat Biotech, is leading the application for an Emergency Use Authorization (EUA) for Covaxin intended for the U.S. market. It has been reported that Bharat Biotech will soon release its phase II and III pediatric trial results (1233).

However, WHO approval of Covaxin has been delayed (1234), and in April 2022, the WHO suspended procurement of Covaxin due to concerns about deviation from good manufacturing practice in its production facilities (1235, 1236). All the same, no safety issues had been reported in association with this vaccine, and the suspension was unlikely to affect distribution given that Bharat Biotech had not been supplying doses through this mechanism (1237). Clinical trials had recommenced in the United States as of May 2022 (1237).

6.5.2 Summary of IWV Vaccine Development

In the past, problems that arose during the manufacturing of IWV vaccines could present safety issues, but oversight of the manufacturing process has helped to improve IWV vaccine safety (1238). Nevertheless, the departure from norms necessitated by the COVID-19 crisis raised concerns about whether oversight would occur at pre-pandemic standards (1238). In general, the IWV COVID-19 vaccines have reported very few issues with safety. Additionally, safety audits have proactively identified concerns, as demonstrated with the WHO’s suspension of Covaxin.

More concern has arisen around the issue of effectiveness due to the reduced neutralizing activity of IWV vaccines against VOC relative to the index strain. In several cases, estimates of VE have varied widely across different trials of a single vaccine. Such issues are likely to be exacerbated by spatiotemporal differences in viral evolution, though in the case of the very high estimate generated by the Turkish trial of CoronaVac (1177), the design of the study may have inflated the VE estimate (1239). Regardless, the authors of the original trial argued that all of the trials suggest a very high efficacy against severe disease (1240), as is the case for all of the IWV vaccines discussed here. In addition to issues related to the evolution of SARS-CoV-2, it is important to consider the duration of immunity over time. With IWV vaccines, heterologous vaccine boosters are being considered in many cases. Today, the WHO has developed recommendations for booster immunization for several whole-virus vaccines. In some cases (Valneva-VLA2001 (1241), Covaxin (1242), Covilo (1243), Sinopharm-WIBP Inactivated (Vero Cell) (1244)), boosters are recommended only for high-risk and/or high-priority groups (e.g., the immunocompromised and medical professionals, respectively), while for Sinovac’s CoronaVac (1200), they are recommended more broadly. Studies are also investigating the effects of booster doses in other vaccines (1245–1247), though some are being investigated or deployed primarily as heterologous boosters in populations vaccinated with a different primary series (1246).

As new vaccines are approved by the WHO, as more time elapses since many people received their primary series, and as new variants emerge, booster recommendations are likely to expand. Therefore, IWV vaccines have played an important role in vaccine access during the initial phase of vaccination against COVID-19, but many IWV vaccinees may receive booster doses developed with emergent vaccine technologies like DNA and mRNA. In head-to-head comparisons, these types of vaccines were typically found to outperform IWV vaccines (e.g., (1190, 1192, 1250)). At the same time, IWV vaccines are among the easiest to store and transport, requiring only standard refrigeration at 2 to 8°C and remaining stable for years at a time (1205). Therefore, these vaccines are likely to continue to play an important role in vaccine equity and accessibility.

6.6 Subunit Vaccines

Efforts to overcome the limitations of live-virus vaccines led first to approaches for inactivating viruses (circa 1900), yielding IWV vaccines, and then to the purification of proteins from viruses cultured in eggs (circa 1920) (1113, 1251). The purification of proteins then set the stage for the development of subunit vaccines based on the principle that the immune system can be stimulated by introducing one or more proteins or peptides isolated from the virus. Today, such approaches often use antigens isolated from the surface of the viral particle that are key targets of the immune system (protein subunit vaccines). Advances in biological engineering have also facilitated the development of approaches like virus-like particle (VLP) vaccines using nanotechnology (1252). VLPs share the conformation of a virus’s capsid, thereby acting as an antigen, but lack the replication machinery (1253).

Unlike whole-virus vaccines, which introduce the whole virus, subunit vaccines stimulate the immune system by introducing one or more proteins or peptides of the virus that have been isolated. The main advantage of this platform is that subunit vaccines are considered very safe, as the antigen alone cannot cause an infection (1254). Both protein subunit and VLP vaccines thus mimic the principle of whole virus vaccines but lack the genetic material required for replication, removing the risk of infection (1255). Protein subunit vaccines can stimulate antibodies and CD4+ T-cell responses (1253, 1256).

The subunit approach is also favored for its consistency in production. The components can be designed for a highly targeted immune response to a specific pathogen using synthetic immunogenic particles, allowing the vaccine to be engineered to avoid allergenic and reactogenic sequences (1257). One limitation is that, in the case of protein subunit vaccines, adjuvants are usually required to boost the immune response (1258) (see online Appendix (1148)). Adjuvants, which are compounds that enhance the immune response to an antigen, include alum (aluminum hydroxide), squalene- or saponin-based adjuvants, and Freund’s incomplete/complete adjuvants, although the latter is avoided in human and veterinary medicine due to high toxicity (1257, 1259, 1260).

Protein subunit vaccine development efforts for both SARS-CoV-1 and MERS-CoV explored a variety of immunogens as potential targets. The search for a potential SARS-CoV-1 vaccine included the development of vaccines based on the full-length or trimeric S protein (1261–1263), those focused on the RBD protein only (1264–1267) or non-RBD S protein fragments (1262, 1268), and those targeting the N and M proteins (1269–1271). These efforts have been thoroughly reviewed elsewhere (1272). There have been examples of successful preclinical research, including candidate RBD219N-1, a 218-amino-acid fragment of the SARS-CoV-1 RBD that, when adjuvanted with aluminum hydroxide, was capable of eliciting a strong antibody response, including nAbs and RBD-specific antibodies, in immunized mice challenged with pseudovirus and live virus (1273).

Similar to the SARS-CoV-1 vaccine candidates, the MERS-CoV protein subunit vaccine candidates generally target the RBD (1264, 1272, 1274–1277), with some targeting the full-length S protein (1278), non-RBD protein fragments such as the SP3 peptide (1279), and the recombinant N-terminal domain (rNTD) (1280). The potential use of full-length S DNA has also been investigated in mice and rhesus macaques, where it elicited immune responses (1281), but these responses were not as effective as those induced by the combination of S DNA and the S1 subunit protein (1281, 1282). No protein subunit vaccine for MERS-CoV has progressed beyond preclinical research to date. VLPs have been investigated for development of vaccines against MERS and SARS (1283, 1284), including testing in animal models (1285, 1286), but once again, only preclinical data against HCoV have been collected (1287). However, protein subunit vaccines do play a role in public health and have contributed to vaccination against hepatitis B (1288) and pertussis (1289, 1290) since the 1980s and human papillomavirus since 2006 (1291). They are likely to continue to contribute to public health for the foreseeable future due to ongoing research into vaccines against influenza, SARS-CoV-2, Epstein-Barr virus, dengue virus, and human papillomavirus, among others (1292–1294).

6.6.1 Application to COVID-19

Table 3: Subunit vaccines approved for use in at least one country (1169) as of May 3, 2023 (1115).
Vaccine | Company | Platform
Zifivax | Anhui Zhifei Longcom | protein subunit
Noora vaccine | Bagheiat-allah University of Medical Sciences | protein subunit
Corbevax | Biological E Limited | protein subunit
Abdala | Center for Genetic Engineering and Biotechnology (CIGB) | protein subunit
Soberana 02 | Instituto Finlay de Vacunas Cuba | protein subunit
Soberana Plus | Instituto Finlay de Vacunas Cuba | protein subunit
V-01 | Livzon Mabpharm Inc | protein subunit
Covifenz | Medicago | VLP
MVC-COV1901 | Medigen | protein subunit
Recombinant SARS-CoV-2 Vaccine (CHO Cell) | National Vaccine and Serum Institute | protein subunit
Nuvaxovid | Novavax | protein subunit
IndoVac | PT Bio Farma | protein subunit
Razi Cov Pars | Razi Vaccine and Serum Research Institute | protein subunit
VidPrevtyn Beta | Sanofi/GSK | protein subunit
COVOVAX (Novavax formulation) | Serum Institute of India | protein subunit
SKYCovione | SK Bioscience Co Ltd | protein subunit
TAK-019 (Novavax formulation) | Takeda | protein subunit
SpikoGen | Vaxine/CinnaGen Co. | protein subunit
Aurora-CoV | Vector State Research Center of Virology and Biotechnology | protein subunit
EpiVacCorona | Vector State Research Center of Virology and Biotechnology | protein subunit

The development of subunit vaccines against SARS-CoV-2 is a remarkable achievement given the short period of time since the emergence of SARS-CoV-2 in late 2019, particularly considering these types of vaccines have not played a major role in previous pandemics compared to LAV and IWV vaccines. More than 20 protein subunit vaccines from companies such as Sanofi/GlaxoSmithKline, Nanogen, and the Serum Institute of India have entered clinical trials for COVID-19 since the beginning of the pandemic (1293), 20 have been approved, and at least 9 are being administered worldwide (1115, 1116) (Table 3). As of May 31, 2023, protein subunit vaccines are being distributed in at least 42 countries (Figure 7).

Figure 7: Worldwide availability of vaccines developed using protein subunit. This figure reflects the number of vaccines based on protein subunit technology that were available in each country as of May 31, 2023. These data are retrieved from Our World in Data (532, 1116) and plotted using geopandas (1170). The color scale is based on the number of vaccines of this type included in the OWID dataset as a whole, not the maximum observed in a single country. See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily.

VLP vaccines have not progressed as rapidly. Programs seeking to develop VLP vaccines have used either the full-length S protein or the RBD of the S protein specifically as an antigen, although some use several different SARS-CoV-2 proteins (1254). As of May 31, 2023, only one VLP was available in one country (Canada) (1116).

Figure 8: Worldwide availability of vaccines developed with VLPs. This figure reflects the number of vaccines based on VLP technology that were available in each country as of May 31, 2023. These data are retrieved from Our World in Data (532) and plotted using geopandas (1170). The color scale is based on the number of vaccines of this type included in the OWID dataset as a whole, not the maximum observed in a single country. See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily.

6.6.2 Novavax’s Nuvaxovid

NVX-CoV2373, also known as Nuvaxovid or Covovax (1295), is a protein subunit vaccine for SARS-CoV-2 produced by U.S. company Novavax and partners. Nuvaxovid is a protein nanoparticle vaccine constructed from a mutated prefusion SARS-CoV-2 spike protein in combination with a specialized saponin-based adjuvant to elicit an immune response against SARS-CoV-2 (1296). The spike protein is recombinantly expressed in Sf9 insect cells (1297), which have previously been used for several other FDA-approved protein therapeutics (1298), and contains mutations in the furin cleavage site (682-RRAR-685 to 682-QQAQ-685) along with two proline substitutions (K986P and V987P) that stabilize the protein (1299), including improving thermostability (1297).

In preclinical mouse models, Nuvaxovid elicited high anti-spike IgG titers 21 to 28 days post-vaccination that could neutralize the SARS-CoV-2 virus and protect the animals against viral challenge, with particularly strong effects when administered with the proprietary adjuvant Matrix-M (1297). In a phase I/II trial, a two-dose regimen of Nuvaxovid was found to induce anti-spike IgG levels and nAb titers exceeding those observed in convalescent plasma donated by symptomatic patients (1296). In line with the preclinical studies, the use of the Matrix-M adjuvant further increased anti-spike immunoglobulin levels and induced a Th1 response.

In a phase III randomized, observer-blinded, placebo-controlled clinical trial in the U.K., 14,039 participants received two 5-μg doses of Nuvaxovid or placebo administered 21 days apart in a 1:1 ratio from late September to late November 2020 (1300). The primary endpoint of the trial was the occurrence or absence of PCR-confirmed, symptomatic mild, moderate, or severe COVID-19 from 7 days after the second dose onward (1300). The VE was reported to be 89.7%, with a total of 10 patients developing COVID-19 in the vaccine group versus 96 in the placebo group (1300). No hospitalizations or deaths were reported in the vaccine group (1300). An additional phase III randomized, observer-blinded, placebo-controlled trial was conducted in the U.S. and Mexico, enrolling 29,949 participants and administering at least one dose in a 2:1 (vaccine:placebo) ratio from late December 2020 to late February 2021 (1301). This trial (1301) used the same primary endpoints as the initial phase III trial conducted in the U.K. (1300). A vaccine efficacy of 90.4% was reported based on 77 cases total, 63 of which occurred in the placebo group (1301). All moderate to severe cases of COVID-19 occurred in the placebo group (1301). Hospitalization and death were not evaluated as individual secondary endpoints, but were instead included in the definition of severe COVID-19; all-cause mortality was comparable between the placebo and treatment conditions (1301).

In both trials, the vaccine was found to be well-tolerated (1300, 1302). Analysis of 2,310 participants in the U.K. trial revealed that solicited AEs were much more common in the vaccine group than the placebo group across both doses, but the rate of unsolicited events was, while still higher in the vaccine group, much more similar (1300). A small number of severe AEs were reported by vaccine recipients, including one case of myocarditis; however, the myocarditis was determined to be viral myocarditis (1300). Common AEs were generally considered mild, with low incidences of headache, muscle pain, and fatigue (1301). In the trial conducted in the U.S. and Mexico, once again, the most common symptoms included headache, fatigue, and pain, as well as malaise (1301). Here, severe AEs were balanced across the vaccine and placebo groups (1301). Thus, both trials suggested that the Nuvaxovid vaccine is safe and effective against COVID-19.

However, Novavax experienced significant challenges in preparing Nuvaxovid for distribution. Prior to the pandemic’s onset, Novavax had sold their manufacturing facilities and reduced their staff dramatically (1303). As a result, once they began producing Nuvaxovid, they struggled to establish a stable relationship with contractors who could produce the vaccine (1304), especially given the challenge of producing vaccines at scale (1305). Additionally, Novavax was not able to meet the purity standards laid out by the FDA (1306). Eventually, the manufacturing issues were resolved (1307), and Nuvaxovid has since been authorized by the WHO (1308) and by political entities, including the United Kingdom (1309), the E.U. (1310), and the U.S. (1311). These delays obstructed some of the goals of the vaccine development program, which was undertaken with significant investment from the U.S. government through Operation Warp Speed (1306). Novavax was supposed to provide over a billion doses of Nuvaxovid to countries around the world through the COVID-19 Vaccines Global Access (COVAX) Facility (1306). However, following the delays, Gavi (which oversees COVAX) terminated the agreement, leading to ongoing legal disagreements between the nonprofit and Novavax as of late 2022 (1307, 1312).

As with other vaccines, the question of how well Nuvaxovid continues to provide protection as SARS-CoV-2 evolves has been raised. Post hoc analysis in the phase III trial indicated a VE of 86.3% against the Alpha variant (identified based on the presence of the 69–70del polymorphism) and 96.4% against viral specimens lacking the 69-70del polymorphism (1300). In the second phase III trial (1302), whole-genome sequencing was obtained from 61 of the 77 observed cases, and 79% of infections were identified as a VOC or variant of interest (VOI) known at the time of the study. Vaccine efficacy against cases caused by VOC, among which the Alpha variant was predominant (88.6%), was reported to be 92.6% (1302). In late 2020, an analysis of efficacy in South African adults revealed an overall efficacy of 60.1% and a slightly lower efficacy of 50.1% against the Beta variant (B.1.351) in particular (1313).

The company also initiated the development of new constructs to select candidates that could be used as a booster against new strains and planned to initiate clinical trials for these new constructs in the second quarter of 2021. An analysis of a booster dose of Nuvaxovid administered six months after the primary series revealed a significant increase in neutralizing activity against VOC including Delta and Omicron (1314). This trial was conducted at 18 sites across the United States and Australia (1315). Novavax has also initiated booster trials in the U.K. (1186). Boosters may be especially important given that Omicron and related variants, in particular, may be associated with significantly reduced efficacy of Nuvaxovid (1316).

Given the apparent need for boosters, interest has also emerged in whether booster doses of Nuvaxovid can be safely administered along with annual flu vaccines. In a subgroup of approximately 400 patients enrolled from the U.K. phase III trial who received either Nuvaxovid or a placebo at a ratio of 1:1, a concomitant dose of adjuvanted seasonal influenza vaccines (either a trivalent vaccine or a quadrivalent vaccine) was administered (1317). This study demonstrated that the vaccines could be safely administered together (1317). While no change to the immune response was noted for the influenza vaccine, a notable reduction of the antibody response elicited by Nuvaxovid was reported, but efficacy was still high at 87.5% (1317). Novavax has since started phase I/II trials to investigate the administration of its own influenza vaccine, NanoFlu, concomitantly with Nuvaxovid (1318). The combination appeared to be safe and effective in preclinical studies (1319).

6.6.3 The Cuban Center for Genetic Engineering and Biotechnology’s Abdala Vaccine

Another notable protein subunit vaccine development program came out of Cuba. Concerned about their ability to access vaccines, especially given the U.S. embargo (1320), health officials in this developing country made the decision in March 2020 to undertake vaccine development on their own (1321). Today, three Cuban protein subunit vaccines have been approved for use: Abdala, which was developed at the Cuban Genetic Engineering and Biotechnology Center, and SOBERANA 02 and SOBERANA Plus, which were developed at Cuba’s Finlay Vaccine Institute (Instituto Finlay de Vacunas Cuba) (1321). Here, we focus on the development of the Abdala vaccine, but the SOBERANA 01/02/Plus vaccine development program has also achieved great success and reported VEs of over 90% for the three-dose regimen (1322, 1323).

Abdala, also known as CIGB-66, was developed using yeast as a low-cost alternative to mammalian cell expression systems (e.g., human embryonic kidney cells) to cultivate the recombinant proteins that form the basis of this protein subunit vaccine (1324). A sequence corresponding to the RBD of the Spike protein in the index strain of SARS-CoV-2 was codon optimized for expression in yeast, and the RBD proteins were then purified and used to inoculate mice, rats, and African green monkeys (1324). In addition to the proteins, the vaccine candidate included an adjuvant, aluminum hydroxide gel (1324). Comparing the immunogenicity of the yeast-cultivated proteins to those cultivated in human embryonic kidney cells revealed no significant difference in the immune response (1324).

Based on promising results in laboratory animal testing, Abdala moved to phase I/II trials in human subjects ages 19 to 80, recruiting participants between December 2020 and February 2021 (1325). The three-dose vaccine elicited no serious AEs across either phase I or II, and the vaccine was found to produce a strong immune response (1325). In March 2021, phase III trials began (1326), and by June, officials were reporting the VE to be 92.28% (1327, 1328). This high efficacy estimate, along with the short timeline of data collection, initially elicited skepticism, especially given that the data were not made public (1329). However, the trials were designed to enroll a large number of participants and were carried out during a wave of infections due to the arrival of variants carrying the D614G mutation in Cuba, which would be expected to allow an expedited timeline for interim analysis (1329). Based on the reported results, Abdala gained emergency use authorization in Cuba in July 2021 (1330), and by December 2021, cases in Cuba had dropped dramatically (1331). The results of the phase III trial were posted to medRxiv in September 2022, describing the results of a randomized, placebo-controlled, multicenter, double-blind investigation of the Abdala vaccine candidate in 48,000 participants between March 22 and April 3, 2021 (1332). The final results evaluated 42 symptomatic cases of COVID-19 among participants in the placebo condition compared to only 11 cases among participants who received the vaccine, yielding the reported VE of 92.28% (1332). In terms of secondary endpoints, the VE was 91.96% against mild/moderate COVID-19, 94.46% against severe COVID-19, and 100% against critical illness and death (1332). The vaccine was also found to be very safe, with the overall incidence of AEs only 2.5% in vaccine recipients compared to 1.9% in the placebo recipients (1332). Therefore, the phase III trial suggests that this vaccine is highly effective and safe.

Evidence from the deployment of the vaccine also suggests it is highly effective. A retrospective cohort study conducted between May and August 2021 evaluated public health data from over a million people in the city of Havana and found that the real-world effectiveness of the vaccine met or exceeded estimates of VE during the trial, with 98.2% effectiveness against severe disease and 98.7% effectiveness against death observed in fully vaccinated subjects (1333). Notably, Cuba has vaccinated a high percentage of its population, with 87% of the population vaccinated by January 2022 and 90.3% by the end of December 2022 (1334, 1335). Therefore, one consideration in interpreting retrospective cohort studies is that the vaccination rate in Cuba is so high that the two cohorts might not be directly comparable. All the same, the fact that the efficacy and effectiveness of the Abdala vaccine have both been estimated to be over 90% against severe illness suggests that this vaccine is highly effective for mitigating the risk of COVID-19. As of December 2022, the vaccine had been authorized for distribution in five additional countries, including Mexico and Vietnam, although its evaluation for WHO approval was ongoing (1336, 1337).

However, limited data are available about the Abdala vaccine’s robustness to evolutionary changes in SARS-CoV-2. An in silico analysis identified several potential changes in the epitopes of the Omicron VOC relative to the sequence used in the development of Abdala (1338). In response, Cuban health officials have prioritized boosters. A representative of the Cuban state business group reportedly stated that immunity remains high at six months after the primary course but that some people may be prone to infection (1339), suggesting waning immunity. The Cuban government authorized boosters in January 2022 in an effort to mitigate the effects of the Omicron variant (1339–1341). Additional support for the efficacy of Abdala and other Cuban vaccines comes from the fact that Cuba’s COVID-19 death rate has remained very low since fall 2021, with fewer than 250 deaths reported during the entire year of 2022 in a population of 11.3 million (1342). Therefore, in addition to developing a vaccine with an estimated VE paralleling that of vaccines developed using cutting-edge nucleic acid technologies (6), Cuba’s vaccine rollout has also been more successful than those of nearly all similarly sized countries. This remarkable vaccine program underscores the continued importance of established, cost-effective vaccine development strategies (1340) that make it possible for countries that have not traditionally been leaders in biotechnological innovation but have developed solid vaccine production sectors (1343) to develop and produce vaccines that serve their own populations’ needs. Additionally, Cuba’s vaccines are uniquely accessible to many countries around the world (1340).

6.6.4 Medicago’s Covifenz

The leading example of a VLP approach applied to COVID-19 comes from Covifenz, developed by Canadian company Medicago (1344). This vaccine was developed using plant-based VLP technology (1345) that the company had been investigating in order to develop a high-throughput quadrivalent VLP platform to provide protection against influenza (1346). The approach utilizes Nicotiana benthamiana, an Australian relative of the tobacco plant, as an upstream bioreactor (1346, 1347). Specifically, the S gene from SARS-CoV-2 in its prefusion conformation is inserted into a bacterial vector (Agrobacterium tumefaciens) that then infects the plant cells (1346, 1347). Expression of the S glycoprotein causes the production of VLPs composed of S trimers anchored in a lipid envelope that accumulate between the plasma membrane and the cell wall of the plant cell (1347). Because these VLPs do not contain the SARS-CoV-2 genome, they offer similar advantages to whole-virus vaccines while mitigating the risks (1346, 1347).

In the phase I study, 180 Canadian adults ages 18 to 55 years old were administered Covifenz as two doses, 21 days apart, with three different dosages evaluated (1347). This study reported that when the VLPs were administered with AS03, an oil-in-water emulsion containing α-tocopherol and squalene (1348), as an adjuvant, the vaccine elicited an nAb response that was significantly (approximately 10 times) higher than that in convalescent sera (1347). The phase III trial examined 24,141 adults assigned to the treatment and control conditions at a 1:1 ratio between March and September of 2021 (1349).

Covifenz was reported to be 71% effective in preventing COVID-19 in the per-protocol analysis (1349). Efficacy was only slightly lower in the intention-to-treat group at 69%, with the VE for the prevention of moderate-to-severe disease in this group estimated at 78.8% (1349). Over 24,000 participants were included in the safety analysis, which reported that 92.3% of vaccine recipients reported local AEs compared to 45.5% of placebo recipients, with rates for systemic AEs at 87.3% and 65.0%, respectively (1349). The AEs reported were generally mild to moderate, with the most common being injection site pain, headache, myalgia, fatigue, and general discomfort (1349). Only three patients (two in the vaccine group) reported grade 4 events, all after the second dose (1349). The vaccine was approved for use in adults ages 18 to 65 by Health Canada in February 2022 (1350).

Plant-based expression systems such as the one used in Covifenz are relatively new (1347) but are likely to offer unparalleled feasibility at scale given the speed and low cost associated with the platform (1351). Additionally, the Covifenz vaccine offers the advantage of storage at 2 to 8°C. However, the worldwide footprint of Covifenz, and of VLP-based technologies against SARS-CoV-2 broadly, remains small, with only one VLP vaccine approved for distribution in one country (Figure 8). Approval and administration of Covifenz in countries outside of Canada has been limited by concerns at the WHO about ties between Medicago and the tobacco industry (1344, 1352). While other species of plants have been explored as the upstream bioreactors for plant-derived VLPs, the specific species of tobacco used increased yield dramatically (1353). In December 2022, tobacco industry investors in Medicago divested, opening new possibilities for the distribution of the vaccine (1354).

As a result of this limited roll-out, and given that the phase III results were published only in May 2022, little is known about the real-world performance of Covifenz. However, it should be noted that the Covifenz trials were conducted in 2021, at a time during which the B.1.617.2 (Delta) and P.1 (Gamma) variants were predominant (1349). Genomic analysis of 122 out of 176 cases (165 in the per-protocol population) revealed that none of the COVID-19 cases reported were caused by the original Wuhan strain (1349). Instead, 45.9% of cases were identified as the Delta variant, 43.4% as Gamma, 4.9% as Alpha, and 5.8% as VOIs (1349). Therefore, Covifenz and Nuvaxovid, despite both being designed based on the index strain, were tested under circumstances where different VOC were dominant, and differences in the Spike proteins of different VOC relative to the index strain could affect vaccine efficacy. As of late 2022, Covifenz had not been authorized as a booster in Canada (1355), and no studies on booster doses had been released by Medicago (1356).

6.6.5 Subunit Vaccine Summary

Subunit vaccine technology is one of the best-represented platforms among COVID-19 vaccine candidates. Development programs are underway in many countries around the world, including low- and middle-income countries (1293). To date, data about the effect of viral evolution on the effectiveness of subunit vaccines have been limited. Because these vaccines were developed using the Spike protein from the index strain (1297, 1349), a potential concern has been that they could lose effectiveness against SARS-CoV-2 containing mutations in the Spike protein. Comparison of studies across vaccines suggests that some VOC, such as Alpha, may have minimal impact on vaccine efficacy/effectiveness (1357). Additionally, to the extent that data are available, such as from the vaccine rollout in Cuba, they suggest that real-world effectiveness remains strong against severe illness and death.

Subunit platforms offer some unique advantages. Cuba’s successful vaccine development programs highlight the fact that protein subunit vaccines can be developed using low-cost technologies. Additionally, these vaccines are more feasible to store and transport (1358). Hoping to build on Cuba’s success and address the continued lack of vaccine access in many countries, several Latin American nations have begun developing protein subunit vaccines (1359).

The efficacy and effectiveness of these vaccines are also very high, especially for Nuvaxovid, Abdala, and SOBERANA 01/02/Plus, where estimates exceeded 90%. Unfortunately, there seem to be limited studies directly comparing the immunogenicity of subunit vaccines to that of nucleic acid vaccines, and comparing efficacies across trials is subject to bias (1360). All the same, the evidence suggests that some protein subunit vaccines are able to provide extremely strong protection. Coupled with the reduced barriers to development and transportation relative to most nucleic acid vaccines, it is clear that subunit technologies are important to vaccine access.

6.7 Global Vaccine Status and Distribution

The unprecedented deployment of COVID-19 vaccines within a year of the identification of SARS-CoV-2 led to a new challenge: the formation of rapid global vaccine production and distribution plans. The development of vaccines is costly and complicated, but vaccine distribution can be just as challenging. Logistical considerations such as transport, storage, equipment (e.g., syringes), the workforce to administer the vaccines, and a continual supply from the manufacturers to meet global demands all must be accounted for and vary globally due to economic, geographic, and sociopolitical reasons (1361–1363). As of May 25, 2023, at least 13.0 billion vaccine doses had been administered in at least 223 countries worldwide using 28 different vaccines (532). The daily global vaccination rate at this time was 8.0 doses per million people.

However, the distribution of these doses is not uniform around the globe. Latin America leads world vaccination rates, with at least 82% of individuals in this region having received one vaccine dose, followed by the U.S. and Canada (81%), Asia-Pacific (81%), Europe (70%), the Middle East (58%), and Africa with only 33% as of November 2022 (1364). It is estimated that only ~25% of individuals in low- and middle-income countries have received one vaccine dose (1116, 1365). Vaccine production and distribution varies from region to region and seems to depend on the availability of the vaccines and potentially a country’s resources and wealth (1366).

One effort to reduce these disparities is COVAX, a multilateral initiative within the Access to COVID-19 Tools (ACT) Accelerator coordinated by the WHO; Gavi, the Vaccine Alliance; and the Coalition for Epidemic Preparedness Innovations (CEPI), the latter two of which are supported by the Bill and Melinda Gates Foundation. The initiative’s aim is to accelerate the development of COVID-19 vaccines, diagnostics, and therapeutics and to ensure the equitable distribution of vaccines to low- and middle-income countries (1367, 1368). COVAX invested in several vaccine programs to ensure it would have access to successful vaccine candidates (1369). However, the initiative has been less successful than was initially hoped due to less participation from high-income countries than was required for COVAX to meet its goals (1370).

Additionally, the vaccine technologies available differ widely around the globe. As we review elsewhere (6), wealthier nations have invested heavily in mRNA and DNA vaccines. In contrast, as we describe above, many countries outside of Europe and North America have developed highly effective vaccines using more traditional approaches. There is a clear relationship between a country’s gross domestic product (GDP) and its access to these cutting-edge types of vaccines (Figure 9). Whole-virus and subunit vaccine development programs are responsible for a much higher percentage of the vaccinated populace in lower-income countries. Therefore, vaccine development programs that utilized established vaccine technologies have played a critical role in providing protection against SARS-CoV-2 at the global level.

Figure 9: Vaccine Distribution across Platform Types as a Function of GDP. The total number of doses of the original formulation of each vaccine that were distributed within each country as of May 31, 2023, by platform type, is shown as a function of GDP. These data are retrieved from Our World in Data (532, 1116) and plotted using the Python package plotnine (1371). Lines show a general trend in the data and are drawn using geom_smooth (1372). The list of countries included in the dataset is available from OWID (1373). See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily. Axes are not scaled per capita because both variables are modulated by population size.
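
As the caption notes, the figure is generated with the Python package plotnine, with geom_smooth supplying the trend lines. The snippet below is an illustrative sketch of that kind of plotting call rather than the authors’ exact pipeline; the input file and column names (gdp, total_doses, platform) are hypothetical stand-ins for the Our World in Data-derived table.

```python
# Illustrative sketch (not the authors' exact pipeline) of plotting doses by
# platform against GDP with plotnine. The input file and its columns
# (gdp, total_doses, platform) are hypothetical stand-ins for the OWID-derived table.
import pandas as pd
from plotnine import (ggplot, aes, geom_point, geom_smooth,
                      scale_x_log10, scale_y_log10, labs)

doses = pd.read_csv("doses_by_country_and_platform.csv")  # hypothetical file

plot = (
    ggplot(doses, aes(x="gdp", y="total_doses", color="platform"))
    + geom_point(alpha=0.5)
    + geom_smooth(se=False)          # general trend lines, as in the caption
    + scale_x_log10()
    + scale_y_log10()
    + labs(x="GDP (USD)", y="Doses administered", color="Vaccine platform")
)
plot.save("figure9_sketch.png", width=8, height=5)
```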

When vaccines first became available, the wealthy nations of North America and Europe secured most of the limited COVID-19 vaccine stocks (1374). Throughout 2021, low- and middle-income countries faced steep competition with high-income countries for vaccines, and the rates of vaccination reflected this unequal distribution (1375). While the wealthiest countries in these regions could compete with each other for vaccines independent of programs such as COVAX (1375), other countries in these regions have faced challenges in acquiring vaccines developed by the world’s wealthiest nations. Fortunately, while mRNA and DNA vaccine development programs are not widespread, vaccine development using whole-virus and subunit technologies has been undertaken worldwide. China and India, in particular, have developed several vaccines that are now widely available in these densely populated countries (see online Appendix (1148)).

Still, many nations, especially in Africa, are reliant on the COVAX Facility, which has promised 600 million doses to the continent (1376). The COVAX plan seeks to ensure that all participating countries are allocated vaccines in proportion to their population sizes. Once each country has received vaccine doses to account for 20% of its population, the country’s risk profile will determine its place in subsequent phases of vaccine distribution. However, several limitations of this framework exist, including that the COVAX scheme seems to go against the WHO’s own ethical principles of human well-being, equal respect, and global equity and that other frameworks might have been more suitable (1377). Furthermore, COVAX is supposed to allow poorer countries access to affordable vaccines, but vaccine supply is driven by publicly traded companies that are required to make a profit (1366).

In any case, COVAX provides access to COVID-19 vaccines that may otherwise have been difficult for some countries to obtain. COVAX aimed to distribute 2 billion vaccine doses globally by the end of 2021 (1378). According to Gavi, as of January 2022, COVAX had distributed over 1 billion vaccines to 144 participants of the program (1379), short of its target but still a major global achievement. It is envisaged that COVAX may also receive additional donations of doses from Western nations that purchased surplus vaccines in the race to vaccinate their populations, which would be a welcome boost to the vaccination programs of low- and middle-income countries (1380).

In general, deciding on the prioritization and allocation of the COVID-19 vaccines is also a challenging task due to ethical and operational considerations. Various frameworks, models, and methods have been proposed to tackle these issues with many countries, regions, or U.S. states devising their own distribution and administration plans (1381–1385). The majority of the distribution plans prioritized offering vaccines to key workers such as health care workers and those who are clinically vulnerable, such as the elderly, the immunocompromised, and individuals with comorbidities, before targeting the rest of the population, who are less likely to experience severe outcomes from COVID-19 (1386). The availability of vaccines developed in a variety of countries using a variety of platforms is likely to work in favor of worldwide vaccine access. The initiative by Texas Children’s Hospital and Baylor College of Medicine to develop Corbevax, a patent-free COVID-19 protein subunit vaccine, is an important step towards vaccine equity because the manufacturing specifications can be shared globally. Corbevax can be produced at low cost using existing technology and is now licensed to Biological E. Limited (BioE), an Indian company specializing in low-cost vaccine production (1387). The vaccine has been approved for distribution in India and Botswana (1388).

Logistical challenges and geographical barriers also dictate the availability of certain vaccines. Many countries have had poor availability of ultra-low temperature freezers, leading to challenges of distribution for vaccines such as mRNA vaccines that require storage at very low temperatures (1389–1391). Furthermore, ancillary supplies such as vaccine containers, diluents for frozen or lyophilized vaccines, disinfecting wipes, bandages, needles, syringes, sharps and biological waste disposal containers are also required, which may not be readily available in geographically isolated locations and can be bulky and expensive to ship (1389). While some of these challenges in vaccine rollout in low- and middle-income countries are being addressed through COVAX (1392), many issues persist worldwide (1393–1395). COVAX also failed to distribute its promised two billion vaccine doses on time due to multiple complications (1396).

Another major challenge to global vaccine distribution is vaccine hesitancy, which the WHO has designated as a leading global health threat (1397). Polling in the U.S. in January 2021 suggested that 20% of individuals were reluctant to receive a vaccine at that time, with a further 31% expressing some hesitancy to a lesser extent (1398, 1399). A survey of 8,243 long-term healthcare workers in November 2020 (Indiana, USA) reported that only 69% of respondents would ever consider receiving an FDA-approved vaccine; reasons for hesitancy included perceived risk of side effects (70%), health concerns (34%), doubts about efficacy (20%), and religious beliefs (12%) (1400). Notably, almost a third of parents surveyed in the United States in March 2021 expressed concerns about vaccinating their children against COVID-19 (1401). Indeed, vaccine hesitancy has been reported as a significant barrier to vaccine distribution in countries in North and South America, Europe, Asia, and Africa (1402–1406). Various factors have been associated with increased vaccine hesitancy, including access to compelling misinformation via social media (1407, 1408), religious and conservative political beliefs (1409–1412), and safety and efficacy concerns (1401), to highlight a few. Many of the concerns regarding safety and efficacy have focused on the novel mRNA technologies due to the perceived speed of their development and expedited clinical trial process (1413); however, general vaccine hesitancy relating to traditional vaccine platforms existed long before the pandemic and the distribution of the novel mRNA vaccines (1414, 1415). In the United States, it was hoped that Novavax’s Nuvaxovid would appeal to the vaccine hesitant (1416, 1417), but this protein subunit vaccine has not produced the uptake hoped for (1418, 1419).

Overall, the vaccine landscape remains heterogeneous even as the pandemic nears its third year, with certain vaccines much more accessible in high-income countries than in low- and middle-income countries. The vaccines described in this manuscript, which were developed using well-established technologies, have played a crucial role in improving the feasibility and accessibility of vaccination programs worldwide. While the novel technologies have received the bulk of public attention in countries like the U.S., these more traditional vaccine platforms also provide safe and highly effective protection against SARS-CoV-2. Although companies developing cutting-edge technologies, namely Moderna and Pfizer/BioNTech, reported very high efficacies, greater than 90%, in their clinical trials (6), several whole-virus and subunit trials have also reported high efficacies. While the clinical trial efficacy estimates for IWV and subunit vaccines have generally been lower, some of these trials have reported efficacies over 80% (e.g., Novavax’s Nuvaxovid with 89.7% (1300) and Sinovac’s CoronaVac with 83.5% (1177)). Variation among studies investigating the efficacy of these vaccines, especially CoronaVac, clearly indicates that clinical trials of the same vaccine might not identify the same efficacy, depending on conditions such as the specific variants circulating in a clinical trial population during the trial period. Additionally, there are many cohort- and population-level characteristics that can introduce bias within and between clinical trials (1360, 1420), and the extent to which these different factors are present may influence trial outcomes. While head-to-head comparisons of VE across different studies may therefore not be appropriate, the results make it clear that effective vaccines have been developed with a wide variety of technologies. The vaccines discussed here, which took advantage of well-established approaches, have proven to be especially valuable in pursuing vaccine equity.

6.8 Conclusions

Much attention has focused on the most novel vaccine technologies that have been deployed against SARS-CoV-2, but the established vaccine platforms discussed here all made a significant impact on human health during the twentieth century and in some cases even earlier. The COVID-19 pandemic has demonstrated new potential in these established technologies. In the early 2000s, these technologies were explored for managing SARS-CoV-1 (1421, 1422), but the epidemic was controlled before those efforts came to fruition (1423). Similarly, these technologies were explored for MERS-CoV, but outbreaks were sporadic and difficult to predict, making vaccine testing and the development of a vaccination strategy difficult (1424). However, in the COVID-19 pandemic, most of these technologies have been used to accelerate vaccine development programs worldwide. Therefore, they now also offer the opportunity to respond quickly to an emergent pathogen.

While these tried-and-true technologies do not always produce vaccines with the highly desirable VE reported in mRNA clinical trials (which exceeded 90%), the efficacies are still very high, and these vaccines are extremely effective at preventing severe illness and death. Furthermore, some vaccine development programs using established technologies, especially protein subunit vaccines, have seen remarkably high VE and vaccine effectiveness. Some protein subunit vaccine phase III trials generated VE estimates of over 90%, comparable to those in the mRNA vaccine trials. Additionally, in some cases, such as Cuba’s highly successful vaccine development program, these vaccines have been developed by and for low- and middle-income nations. As a result, the greater accessibility and stability of these vaccines make them extremely valuable for the global effort to mitigate the loss of life from SARS-CoV-2. The outcomes of the response to COVID-19 suggest that these established vaccine technologies may continue to play an important role in tackling future viral threats.

7 Appendix: Additional Information about Established Vaccine Platforms for COVID-19

7.1 Sinovac’s CoronaVac

The CoronaVac vaccine was developed by Sinovac, a Beijing-based biopharmaceutical company. The vaccine uses an inactivated whole virus with the addition of an aluminum adjuvant (1425). Pre-clinical trials were performed using BALB/c mice and rhesus macaques (1173). The SARS-CoV-2 strains used in these trials were isolated from 11 hospitalized patients (5 from China, 3 from Italy, 1 from the United Kingdom (U.K.), 1 from Spain, and 1 from Switzerland). A phylogenetic analysis demonstrated that the strains were representative of the variants circulating at the time. One of the strains from China, CN2, was used as the inactivated and purified virus, while the other 10 strains were used for challenge experiments. CN2 was grown in Vero cells. The immunogenicity of the vaccine candidate was evaluated by ELISA. Ten mice were injected with the vaccine on day 0 and day 7 at varying doses (0, 1.5, 3, or 6 μg), and 10 mice were treated with physiological saline as the control. IgG developed in the serum of all vaccinated mice.

Using the same setup, immunogenicity was also assessed in macaques. Four macaques were assigned to each of four groups: treatment with 3 μg at days 0, 7, and 14; treatment with a high dose of 6 μg at days 0, 7, and 14; administration of a placebo vaccine; and administration of only the adjuvant. All vaccinated macaques developed IgG and neutralizing antibody responses. After challenge with SARS-CoV-2 strain CN1, vaccinated macaques were protected compared to control macaques (placebo or adjuvant only) based on histology and viral loads collected from different regions of the lung.

A single-center, randomized, double-blind, placebo-controlled phase I/II trial was conducted in April 2020 in adults 18-59 years old. Patients in this study were recruited from the community in Suining County of Jiangsu province, China. For the phase I trial, 144 (of 185 screened) participants were enrolled, with 72 enrolled in the 14-day interval cohort (i.e., treated on day 0 and day 14) and 72 in the 28-day interval cohort. Each cohort of 72 participants was split into two blocks to receive either a low-dose (3 μg) or a high-dose (6 μg) vaccine. Within each block, participants were randomly assigned vaccination with CoronaVac or placebo (aluminum diluent without the virus) at a 2:1 ratio. Both the vaccine and placebo were prepared in a Good Manufacturing Practice-accredited facility of Sinovac Life Sciences (Beijing, China).
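
The 2:1 allocation within blocks described above can be sketched as a simple permuted-block randomization. The block size, random seed, and participant count in the snippet are illustrative choices, not details reported for this trial.

```python
# Minimal sketch of 2:1 permuted-block randomization, the allocation scheme
# described above. The block size, seed, and participant count are illustrative
# choices, not details reported for this trial.
import random

def blocked_assignments(n_participants, block_size=6, ratio=(2, 1), seed=0):
    """Assign participants to 'vaccine' or 'placebo' in shuffled blocks at the given ratio."""
    assert block_size % sum(ratio) == 0, "block size must be a multiple of the ratio total"
    n_vaccine = block_size * ratio[0] // sum(ratio)
    n_placebo = block_size - n_vaccine
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = ["vaccine"] * n_vaccine + ["placebo"] * n_placebo
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

arms = blocked_assignments(72)
print(arms.count("vaccine"), arms.count("placebo"))  # 48 24
```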

The phase II trial followed the same organization of participants, this time with 300 participants enrolled in the 14-day cohort and another 300 in the 28-day cohort. One change of note was that the vaccine was produced using a highly automated bioreactor (ReadyToProcess WAVE 25, GE, Umeå, Sweden) to increase vaccine production capacity. This change resulted in a higher intact spike protein content. The authors of this study were not aware of this antigen-level difference between the phase I and phase II vaccine batches at the time the trials received ethical approval.

To assess adverse responses, participants were asked to record any events up to 7 days post-treatment. The reported adverse events were graded according to the China National Medical Products Administration guidelines. In the phase I trial, the overall incidence of adverse reactions was 29-38% of patients in the 0 to 14 day group and 13-17% in the 0 to 28 day vaccination group. The most common symptom was pain at the injection site, which was reported by 17-21% of patients in the 0 to 14 day cohort and 13% in the 0 to 28 day cohort. Most adverse reactions were mild (grade 1), with patients recovering within 48 hours. A single case of acute hypersensitivity manifesting as urticaria 48 hours following the first dose of study drug was reported in the 6 μg group. Both the 14-day and 28-day cohorts had a strong neutralizing antibody (Ab) response. The neutralizing Ab response was measured using a micro cytopathogenic effect assay, which assesses the minimum dilution of neutralizing Ab that is 50% protective against structural changes in host cells in response to viral infection (1426). Additionally, IgG antibody titers against the receptor-binding domain were measured using ELISA.

Another phase I/II study, a single-center, randomized, double-blind, placebo-controlled trial, was performed with older participants (over 60 years of age) (1172). The phase I trial evaluated dose escalation using three dosages: 1.5, 3, and 6 μg. The mean age of participants was 65.8 years (standard deviation = 4.8). Of 95 screened participants, 72 were enrolled and split into low-dose (3 μg) and high-dose (6 μg) groups. Within each group, 24 participants received the treatment and 12 the placebo. A neutralizing antibody response against live SARS-CoV-2 was detected relative to baseline using the same micro cytopathogenic effect assay. This response was similar across the two dose concentrations. Additionally, the authors did not observe a difference in response among age groups (60–64 years, 65–69 years, and ≥70 years).

In phase II, the mean age was 66.6 years (standard deviation = 4.7). Of 499 participants screened, 350 were enrolled; 300 were evenly split into 1.5, 3, and 6 μg dose groups, and the remaining 50 were assigned to the placebo group. Again, a neutralizing antibody response was observed in phase II. There was no significant difference between the responses to 3 μg and 6 μg, but the response in both of these conditions was higher than in the 1.5 μg condition.

Participants were required to record adverse reaction events within the first 7 days after each dose. The safety results were combined across phases I and II. All adverse reactions were either mild (grade 1) or moderate (grade 2) in severity. The most common symptoms were pain at the injection site (9%) and fever (3%). Seven participants (2%) reported severe adverse events (four from the 1.5 μg group, one from the 3 μg group, and two from the 6 μg group), though these were found to be unrelated to the vaccine.

Overall, the results from the pre-clinical and phase I/II clinical trials were promising, especially the fact that the immune response was consistent in older adults (> 60 years).

7.2 Sinopharm’s Clinical Trials of Two Vaccine Candidates

The Sinopharm Wuhan Institute developed its SARS-CoV-2 inactivated vaccine using the WIV04 strain isolated from a patient at the Jinyintan Hospital in Wuhan, China (1201). This vaccine is administered intramuscularly at 5 μg of virus per dose. Preclinical data supporting the use of this vaccine are not publicly available; nevertheless, the Sinopharm Wuhan Institute initiated phase I/II trials, which reported on varying dosing and prime-boost regimens.

A combined phase I/II RCT of Sinopharm’s BBIBP-CorV, also known as Covilo, followed (1203). In phase I, 192 participants were randomized to receive 2 μg, 4 μg, or 8 μg per dose or a placebo, followed by an identical second dose 28 days later. Approximately 29% of participants reported at least one adverse event, most commonly fever, and neutralizing antibody titers were reported for all doses. In the phase II trial, 482 participants were enrolled (3:1, vaccine:placebo). Participants in the vaccine condition received either a single 8 μg dose or two 4 μg doses, with the second administered 14, 21, or 28 days after the prime dose. Participants in the placebo condition received the placebo on one of the same four schedules. The vaccine appeared well tolerated, with 23% of participants reporting at least one adverse reaction, characterized as mild to moderate. All participants had a humoral immune response to the vaccine by day 42, but the double-immunization regimen of 4 μg per dose achieved higher neutralizing antibody titers than a single 8 μg dose, and the highest response was seen when at least 21 days separated the two doses (1203). Similar findings were reported in another phase I/II trial published by the same authors (1205). For the Wuhan Institute vaccine, nAbs were detected in all groups 14 days after the final dose in the phase I portion of the trial (1204), with similar findings reported in interim phase II data (1204).

7.3 Novavax’s Nuvaxovid (NVX-CoV2373)

Novavax’s Nuvaxovid is a particularly appealing candidate because the improved stability conferred by the proline substitutions facilitates global distribution, particularly to regions where local refrigerator/freezer capacity is limited. Importantly, these amino acid substitutions did not affect the ability of the spike protein to bind the hACE2 receptor (the target receptor of the SARS-CoV-2 spike protein). The Novavax-CoV2373 vaccine candidate uses a proprietary, saponin-based Matrix-M adjuvant that contains two different 40 nm-sized particles formed by formulating purified saponin with cholesterol and phospholipids (1427). In preclinical models, the use of the Matrix-M adjuvant potentiated the cellular and humoral immune responses to influenza vaccines (1427–1430). Importantly, Matrix-M adjuvant-containing vaccines have shown acceptable safety profiles in human clinical trials (1431).

In preclinical studies, Novavax-CoV2373 induced multifunctional CD4+/CD8+ T-cell responses and generated high frequencies of follicular helper T cells and B-cell germinal centers after vaccination. These findings were subsequently evaluated in a baboon model, in which Novavax-CoV2373 also elicited high antibody titers against the SARS-CoV-2 spike protein as well as an antigen-specific T-cell response. Based on these data, Novavax initiated a phase I/II clinical trial to evaluate the safety and immunogenicity of Novavax-CoV2373 with Matrix-M (1296, 1315).

The phase I/II trial was a randomized, placebo-controlled study with 131 healthy adult participants in five treatment arms (1296). Participants who received the recombinant SARS-CoV-2 vaccine with or without the Matrix-M adjuvant received two injections, 21 days apart. Primary outcomes assessed included reactogenicity, laboratory values (serum chemistry and hematology), and anti-spike IgG levels. Secondary outcomes included virus neutralization, T-cell responses, and unsolicited adverse events. The authors reported that no serious treatment-related adverse events occurred in any of the treatment arms. Reactogenicity was mostly absent and, when present, of short duration. The two-dose vaccine regimen induced anti-spike IgG levels and neutralizing antibody titers exceeding those in the convalescent plasma of symptomatic patients. The outcomes of this trial suggest that Novavax-CoV2373 has an acceptable safety profile and is able to induce a strong immune response with high neutralizing antibody titers.

The phase II component of the trial was designed to identify the dose regimen for the next clinical trial stage (1432). Both younger (18-59 years) and older patients (60-84 years) were randomly assigned to receive either 5 μg or 25 μg of Novavax-CoV2373 or a sodium chloride placebo in two doses, 21 days apart. In line with the phase I data, reactogenicity remained mild to moderate and of short duration, with no more than 1% of participants in any group reporting grade 3 AEs. Both dose levels induced high anti-spike IgG titers as well as neutralizing antibody responses after the second dose. Based on both safety and efficacy data, the 5 μg dosing regimen was selected as the optimal dose regimen for the phase III trial.

Novavax announced an efficacy of 89.3% based on their phase III trial in the U.K. and South Africa (1300, 1433, 1434). This trial included over 15,000 participants in the U.K. and 4,000 participants in South Africa. The primary endpoint of the trial was the occurrence or absence of PCR-confirmed, symptomatic mild, moderate, or severe COVID-19 from 7 days after the second dose onward. In the first interim analysis (U.K.), 56 cases of COVID-19 were observed in the placebo group compared to 6 cases in the treatment group. Importantly, the vaccine candidate also showed significant clinical efficacy against the then-prevalent U.K. and South African variants.

7.4 Protein Subunit Vaccine Development Programs Prior to SARS-CoV-2

Earlier studies examined the immunogenicity of a SARS-CoV-1 RBD fused with IgG1 Fc. This recombinant fusion protein could induce a robust, long-lasting neutralizing antibody and cellular immune response that protected mice from SARS-CoV-1 (1133, 1264, 1267). While other potential protein subunit vaccines for SARS-CoV-1 were investigated in vivo (1133, 1272), none of these candidates successfully completed clinical trials, likely because the SARS-CoV-1 epidemic had largely ended by May 2004, reducing the demand for and funding of SARS-CoV-1 vaccine research.

Similar vaccine candidates have emerged that target the RBD found in the S1 subunit of the trimeric MERS-CoV S protein, which binds to dipeptidyl-peptidase 4 (DPP4, also known as hCD26), the entry point through which MERS-CoV infects cells (1435–1437). After initially determining that an RBD subunit candidate (S377-588-Fc) could elicit neutralizing antibodies (1438), a study in mice determined that the administration of three sequential doses of RBD-Fc vaccine coupled with MF59, a squalene immunogenic adjuvant, induced humoral and systemic immunity (1439). Mice that had been transduced with Ad5-hCD26 and subsequently challenged with MERS-CoV five days later did not show evidence of viral infection in the lungs compared to control mice at ten days post vaccination (1439). Other variations of this vaccine approach include a stable S trimer vaccine whereby proline-substituted variants of S2 can maintain a stable prefusion conformation of the S2 domain (1133). This approach leads to broad and potent neutralizing antibodies (1133).

7.5 Complementary Approaches to Vaccine Development

A complementary approach to other vaccine development programs explores the potential for vaccines that are not made from the SARS-CoV-2 virus to confer what has been termed trained immunity. In a recent review (1440), trained immunity was defined as forms of memory that are temporary (e.g., months or years) and reversible. It is induced by exposure to whole-microorganism vaccines or other microbial stimuli that generate heterologous protective effects. Trained immunity can be displayed by innate immune cells or innate immune features of other cells, and it is characterized by alterations to immune responsiveness to future immune challenges due to epigenetic and metabolic mechanisms. These alterations can take the form of either an increased or decreased response to immune challenge by a pathogen. Trained immunity elicited by non-SARS-CoV-2 whole-microorganism vaccines could therefore potentially reduce SARS-CoV-2 susceptibility or disease severity (1441).

One type of stimulus that research indicates can induce trained immunity is bacillus Calmette-Guerin (BCG) vaccination. BCG is an attenuated form of the bacterium Mycobacterium bovis. The vaccine is most commonly administered for the prevention of tuberculosis in humans. Clinical trials in non-SARS-CoV-2-infected adults have been designed to assess whether BCG vaccination could have prophylactic effects against SARS-CoV-2 by reducing susceptibility, preventing infection, or reducing disease severity. A number of trials are now evaluating the effects of the BCG vaccine or the related vaccine VPM1002 (1151, 1152, 1441–1453).

The ongoing trials are using a number of different approaches. Some trials enroll healthcare workers; others enroll hospitalized elderly adults without immunosuppression, who are vaccinated with placebo or BCG at hospital discharge; and yet another set enrolls older adults (>50 years) under chronic care for conditions like hypertension and diabetes. One set of trials, for example, uses time until first infection as the primary study endpoint; more generally, outcomes measured in these trials relate to the incidence of disease and to disease severity or symptoms. Some analyses have suggested a possible correlation at the country level between the frequency of BCG vaccination (or BCG vaccination policies) and the severity of COVID-19 (1441). Currently, it is unclear whether this correlation has any connection to trained immunity. Many possible confounding factors are also likely to vary among countries, such as age distribution, detection efficiency, stochastic epidemic dynamics, and differences in healthcare capacity over time in relation to epidemic dynamics, and these have not been adequately accounted for in current analyses. It is also unclear whether the timing of BCG vaccination, both during an individual’s lifetime and relative to the COVID-19 pandemic, has an effect. Additionally, given that severe SARS-CoV-2 infection may be associated with a dysregulated immune response, it is unclear what alterations to the immune response would be most likely to be protective versus pathogenic (e.g., (149, 1441, 1454, 1455)). One review (1441) proposes that trained immunity might lead to an earlier and stronger response, which could in turn reduce viremia and the risk of later, detrimental immunopathology. While trained immunity is an interesting possible avenue to complement vaccine development efforts through the use of an existing vaccine, additional research is required to assess whether the BCG vaccine is likely to confer trained immunity in the case of SARS-CoV-2.
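
For trials that use time until first infection as an endpoint, the analysis typically summarizes right-censored follow-up data, for example with a Kaplan-Meier estimator. The sketch below uses invented follow-up times purely to illustrate the calculation; real analyses would use trial data and formal hypothesis tests.

```python
# Hedged sketch of a Kaplan-Meier summary for a "time until first infection"
# endpoint. The follow-up times and infection indicators below are invented
# for illustration; real analyses would use trial data and formal tests.
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve for right-censored data.

    times: follow-up time per participant; events: 1 = infection observed, 0 = censored.
    Returns a list of (event_time, estimated probability of remaining infection-free).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, curve = 1.0, []
    for t in np.unique(times[events == 1]):          # distinct infection times
        at_risk = np.sum(times >= t)                 # still under follow-up at t
        d = np.sum((times == t) & (events == 1))     # infections occurring at t
        surv *= 1 - d / at_risk
        curve.append((float(t), surv))
    return curve

# Invented follow-up times (days) and infection indicators for two illustrative arms
bcg_times, bcg_events = [30, 45, 60, 90, 90, 120], [0, 1, 0, 1, 0, 0]
placebo_times, placebo_events = [20, 35, 50, 70, 90, 110], [1, 1, 0, 1, 1, 0]

print("BCG arm:", kaplan_meier(bcg_times, bcg_events))
print("Comparator arm:", kaplan_meier(placebo_times, placebo_events))
```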

7.6 India and China’s Roles in Vaccine Innovation and Development

The nations of China and India have played major roles as COVID-19 vaccine developers and providers. Considering that India produced approximately 60% of the world’s vaccines prior to the pandemic, it is no surprise that the nation has developed and is developing several COVID-19 vaccine candidates. In addition to Covaxin, the subunit vaccine CORBEVAX is being produced by Biological E in collaboration with U.S.-based Dynavax and the Baylor College of Medicine (1456). These two home-grown vaccines are now approved for adults and children as young as five (CORBEVAX) and six (Covaxin) (1457).

Other vaccines licensed by India were developed elsewhere but produced in India. For example, Novavax (developed in the United States) has signed an agreement with the Serum Institute of India allowing it to produce up to 2 billion doses a year (1458). Similarly, many people within India have been vaccinated with the AstraZeneca-University of Oxford vaccine, known as Covishield in India, which is also produced by the Serum Institute of India (1456). India is also developing vaccines using cutting-edge nucleic acid-based platforms. These include ZyCov-D, a DNA vaccine produced by Zydus Cadila, and HGCO19, India’s first mRNA vaccine, produced by Genova and HDT Biotech Corporation (of the U.S.) (1456). Additionally, in February 2021, Bharat Biotech received approval from Indian officials to commence a phase I study of an intranasal chimpanzee adenovirus (ChAd)-vectored SARS-CoV-2 S vaccine called BBV154 (1459).

In China, the Sinopharm-Beijing Institute vaccine, the Sinopharm-Wuhan Institute of Biological Products vaccine, the Sinovac Biotech (CoronaVac) vaccine, and the CanSino Biologics vaccine are the main vaccines being distributed. Sinovac and Sinopharm aimed to produce 2 billion doses by the end of 2021, and they have distributed vaccines as aid to the Philippines and Pakistan (1460). In contrast, the Sinopharm-Wuhan vaccine, which has been approved for use in China since February 25, 2021, has been distributed almost exclusively within China, with limited supplies distributed to the United Arab Emirates (1186). On the same date, the CanSino vaccine was approved for use in China, and it has since been granted emergency use authorization in several other countries (1186).

However, the vaccine approval and distribution processes in China have come under increased scrutiny from other nations. China was criticized for administering vaccines to thousands of government officials and state-owned business employees in September 2020, prior to the completion of phase III clinical trials (1461). The behavior of Chinese officials has also come into question due to misinformation campaigns questioning the safety of Western vaccine candidates such as those from Moderna and Pfizer-BioNTech in a way intended to highlight the benefits of China’s own vaccine candidates (1460). China in particular took aim at mRNA technologies, but Chinese companies have since developed their own mRNA vaccines targeting the Omicron variant, one of which is due to begin trials soon in the UAE (1462). Furthermore, delays in vaccine distribution have also caused issues, particularly in Turkey, where 10 million doses of Sinovac were due to arrive by December 2020 but only 3 million were delivered in early January (1460). Similar delays and shortages of promised doses were reported by officials in the Philippines, Egypt, Morocco, and the United Arab Emirates (1463, 1464). All the same, Sinovac’s vaccine has since been approved for use in countries around the world (1186).

8 The Coming of Age of Nucleic Acid Vaccines during COVID-19

8.1 Abstract

In the 21st century, several emergent viruses have posed a global threat. Each pathogen has emphasized the value of rapid and scalable vaccine development programs. The ongoing SARS-CoV-2 pandemic has made the importance of such efforts especially clear.

New biotechnological advances in vaccinology allow for vaccines that provide only the nucleic acid building blocks of an antigen, eliminating many safety concerns. During the COVID-19 pandemic, these DNA and RNA vaccines have facilitated the development and deployment of vaccines at an unprecedented pace. This success was attributable at least in part to broader shifts in scientific research relative to prior epidemics; the genome of SARS-CoV-2 was available as early as January 2020, facilitating global efforts in the development of DNA and RNA vaccines within two weeks of the international community becoming aware of the new viral threat. Additionally, these technologies, which were previously only theoretical, have proven to be not only safe but also highly efficacious.

Although vaccine development has historically been a slow process, the rapid development of vaccines during the COVID-19 crisis reveals a major shift in vaccine technologies. Here, we provide historical context for the emergence of these paradigm-shifting vaccines. We describe several DNA and RNA vaccines in terms of their efficacy, safety, and approval status. We also discuss patterns in worldwide distribution. The advances made since early 2020 provide an exceptional illustration of how rapidly vaccine development technology has advanced in the last two decades in particular and suggest a new era in vaccines against emerging pathogens.

8.2 Importance

The SARS-CoV-2 pandemic has caused untold damage globally, presenting unusual demands on but also unique opportunities for vaccine development. The development, production, and distribution of vaccines is imperative to saving lives, preventing severe illness, and reducing the economic and social burdens caused by the COVID-19 pandemic. Although vaccine technologies that provide the DNA or RNA sequence of an antigen had never previously been approved for use in humans, they have played a major role in the management of SARS-CoV-2. In this review we discuss the history of these vaccines and how they have been applied to SARS-CoV-2. Additionally, given that the evolution of new SARS-CoV-2 variants continues to present a significant challenge in 2022, these vaccines remain an important and evolving tool in the biomedical response to the pandemic.

8.3 Introduction

The SARS-CoV-2 virus emerged at the end of 2019 and soon spread around the world. In response, the Coalition for Epidemic Preparedness Innovations quickly began coordinating global health agencies and pharmaceutical companies to develop vaccines, as vaccination is one of the primary approaches available to combat the effects of a virus. Vaccines can bolster the immune response to a virus at both the individual and population levels, thereby reducing fatalities and severe illness and potentially driving a lower rate of infection even for a highly infectious virus like SARS-CoV-2. However, vaccines have historically required a lengthy development process due to both the experimental and regulatory demands.

As we review in a companion manuscript (5), vaccine technologies prior to the COVID-19 pandemic were largely based on triggering an immune response by introducing a virus or one of its components. Such vaccines are designed to induce an adaptive immune response without causing the associated viral illness. Each time a virus emerges that poses a significant global threat, as has happened several times over the past 20 years, the value of a rapid vaccine response is underscored. With progressive biotechnological developments, this objective has become increasingly tangible.

In the current century, significant advances in vaccine development have largely been built on genomics, as is somewhat unsurprising given the impact of the Genomic Revolution across all biology. This shift towards nucleic acid-based technologies opens a new frontier in vaccinology, where just the sequence encoding an antigen can be introduced to induce an immune response. While other platforms can carry some risks related to introducing all or part of a virus (5), nucleic acid-based platforms eliminate these risks entirely. Additionally, vaccine technologies that could be adjusted for novel viral threats are appealing because this modular approach would mean they could enter trials quickly in response to a new pathogen of concern.

8.4 Honing a 21st Century Response to Emergent Viral Threats

Recently, vaccine technologies have been developed and refined in response to several epidemics that did not reach the level of destruction caused by COVID-19. Emergent viral threats of the 21st century include severe acute respiratory syndrome (SARS), the H1N1 influenza strain known as swine flu, Middle East respiratory syndrome (MERS), Ebola virus disease, COVID-19, and, most recently, monkeypox, all of which have underscored the importance of a rapid global response to a new infectious virus. Because the vaccine development process has historically been slow, the use of vaccines to control most of these epidemics was limited.

One of the more successful recent vaccine development programs was for H1N1 influenza. This program benefited from the strong existing infrastructure for influenza vaccines along with the fact that regulatory agencies had determined that vaccines produced using egg- and cell-based platforms could be licensed under the regulations used for a strain change (1423). Although a monovalent H1N1 vaccine was not available before the pandemic peaked in the United States of America (U.S.A.) and Europe, it became available soon afterward as a stand-alone vaccine that was eventually incorporated into commercially available seasonal influenza vaccines (1423). Critiques of the production and distribution of the H1N1 vaccine have stressed the need for alternative development-and-manufacturing platforms that can be readily adapted to new pathogens.

Efforts to develop such approaches had been undertaken prior to the COVID-19 pandemic. DNA vaccine development efforts began for SARS-CoV-1 but did not proceed past animal testing (1422). Likewise, the development of viral-vectored Ebola virus vaccines was undertaken, but the pace of vaccine development was behind the spread of the virus from early on (1465). Although a candidate Ebola vaccine V920 showed promise in preclinical and clinical testing, it did not receive breakthrough therapy designation until the summer of 2016, by which time the Ebola outbreak was winding down (1466). Therefore, the COVID-19 pandemic has been the first case where vaccines have been available early enough to significantly influence outcomes at the global scale.

The pandemic caused by SARS-CoV-2 has highlighted a confluence of circumstances that positioned vaccine development as a key player in efforts to control the virus and mitigate its damage. This virus did not follow the same trajectory as other emergent viruses of recent note, such as SARS-CoV-1, MERS-CoV, and Ebola virus, none of which presented a global threat for such a sustained duration (see visualization in (3)). Spread of the SARS-CoV-2 virus has remained out of control in many parts of the world into 2022, especially with the emergence of novel variants exhibiting increased rates of transmission (1). While, for a variety of reasons, SARS-CoV-2 was not controlled as rapidly as the viruses underlying prior 21st century epidemics, vaccine development technology had also progressed based on these and other prior viral threats to the point that a rapid international vaccine development response was possible.

8.5 Development of COVID-19 Vaccines using DNA and RNA Platforms

Vaccine development programs for COVID-19 emerged very quickly. The first administration of a dose of a COVID-19 vaccine to a human trial participant occurred on March 16, 2020 (1467, 1468), marking an extremely rapid response to the emergence of SARS-CoV-2. Within a few weeks of this first trial launching, at least 78 vaccine development programs were active (1468), and by September 2020, there were over 180 vaccine candidates against SARS-CoV-2 in development (1114). As of May 3, 2023, 50 SARS-CoV-2 vaccines have been approved worldwide and 28 are being administered throughout the world, with 13.0 billion doses administered across 223 countries. The first critical step towards developing a vaccine against SARS-CoV-2 was characterizing the viral target, which happened extremely early in the COVID-19 outbreak with the sequencing and dissemination of the viral genome in early January 2020 (555) (Figure 10). This genomic information allowed for an early identification of the sequence of the Spike (S) protein (Figure 10), which is the antigen that induces an immune response (1469, 1470).

Figure 10: Structure of the SARS-CoV-2 virus. The development of vaccines depends on the immune system recognizing the virus. Here, the structure of SARS-CoV-2 is represented both in the abstract and against a visualization of the virion. The abstracted visualization was made using BioRender (1471) using the template “Human Coronavirus Structure” by BioRender (August 2020) (1472). The microscopy was conducted by the National Institute of Allergy and Infectious Diseases (1473).

During the development process, one measure used to assess whether a vaccine candidate is likely to provide protection is serum neutralizing activity (1119). This assay evaluates the presence of antibodies that can neutralize, or prevent infection by, the virus in question. Often, titration is used to determine the extent of neutralization activity. However, unlike in efforts to develop vaccines for prior viral threats, the duration of the COVID-19 pandemic has made it possible to also test vaccines in phase III trials where the effect of the vaccines on a cohort’s likelihood of contracting SARS-CoV-2 was evaluated.
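
As a concrete illustration of such a titration, the sketch below reads a 50% neutralization titer (NT50) off a serum dilution series by log-linear interpolation. The dilution factors and percent-neutralization values are invented, and real assays involve replicates and curve fitting rather than simple interpolation.

```python
# Hedged sketch: read a 50% neutralization titer (NT50) off a serum dilution
# series by log-linear interpolation. The dilutions and percent-neutralization
# values are invented; real assays use replicates and curve fitting.
import numpy as np

def nt50(dilutions, pct_neutralization):
    """Interpolate (on a log-dilution scale) the dilution giving 50% neutralization."""
    d = np.log10(np.asarray(dilutions, dtype=float))
    y = np.asarray(pct_neutralization, dtype=float)
    for i in range(len(y) - 1):
        if y[i] >= 50 >= y[i + 1]:                       # bracket the 50% crossing
            frac = (y[i] - 50) / (y[i] - y[i + 1])
            return 10 ** (d[i] + frac * (d[i + 1] - d[i]))
    return None  # neutralization never crosses 50% in the tested range

dilutions = [20, 40, 80, 160, 320, 640]       # reciprocal serum dilutions
neutralization = [98, 95, 84, 62, 35, 12]     # percent neutralization at each dilution
print(f"NT50 ≈ 1:{nt50(dilutions, neutralization):.0f}")  # roughly 1:218
```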

8.6 Theory and Implementation of Nucleic Acid Vaccines

Biomedical research in the 21st century has been significantly influenced by the genomic revolution, and vaccine development is no exception, even though traditional methods such as inactivated whole viruses are still used today (5). The shift towards omics-based approaches to vaccine development began to take hold with the meningococcal type B vaccine, which was developed using reverse vaccinology in the early 2010s (1474, 1475). Under this approach, the genome of a pathogen is screened to identify potential vaccine targets (1475), and candidate antigens are then expressed in vitro and tested in animal models to determine their immunogenicity (1475). In this way, the genomic revolution catalyzed a fundamental shift in the development of vaccines. Such technologies could revolutionize the role of vaccines given their potential to address one of the major limitations of vaccines today and facilitate the design of therapeutic, rather than just prophylactic, vaccines (1476).
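
The screening step of reverse vaccinology can be caricatured as filtering predicted proteins on features that suggest surface exposure and tractable expression. The record structure, feature names, and thresholds below are hypothetical; real pipelines rely on dedicated predictors (e.g., for signal peptides and transmembrane helices) and on experimental validation.

```python
# Hedged caricature of the reverse vaccinology screening step: filter predicted
# proteins on features suggesting surface exposure and tractable expression.
# The records, feature names, and thresholds are hypothetical; real pipelines
# use dedicated predictors and experimental validation.
from dataclasses import dataclass

@dataclass
class PredictedProtein:
    name: str
    has_signal_peptide: bool       # hints at secretion or surface localization
    transmembrane_helices: int     # many helices complicate recombinant expression
    similar_to_host_protein: bool  # avoid candidates resembling human proteins

def candidate_antigens(proteins, max_tm_helices=1):
    """Return names of proteins passing simple antigen-candidate filters."""
    return [
        p.name for p in proteins
        if p.has_signal_peptide
        and p.transmembrane_helices <= max_tm_helices
        and not p.similar_to_host_protein
    ]

predicted_proteome = [
    PredictedProtein("orf_001", True, 0, False),
    PredictedProtein("orf_002", False, 0, False),
    PredictedProtein("orf_003", True, 5, False),
    PredictedProtein("orf_004", True, 1, True),
]
print(candidate_antigens(predicted_proteome))  # ['orf_001']
```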

Nucleic acid-based approaches share an underlying principle: a vector delivers the information needed to produce an antigen, and when the host cells manufacture the antigen, it can then trigger an immune response. The fact that no part of the virus is introduced aside from the genetic code of the antigen means that these vaccines carry no risk of infection. Such approaches build on subunit vaccination strategies, where a component of a virus (e.g., an antigenic protein) is delivered by the vaccine. Platforms based on genomic sequencing began to be explored in the 1980s as genetic research became increasingly feasible. Advances in genetic engineering allowed for gene sequences of specific viral antigens to be grown in vitro (1113). Studies also demonstrated that model organisms could be induced to construct antigens that would trigger an immune response (1128, 1477, 1478). These two developments sparked interest in whether it could be possible to identify any or all of the antigens encoded by a virus’s genome and train the immune response to recognize them.

The delivery and presentation of antigens is fundamental to inducing immunity against a virus. Delivering a nucleic acid sequence to host cells allows the host to synthesize an antigen without exposure to a viral threat (1479). Host-synthesized antigens can activate both humoral and cellular immunity (1479), as they can be presented in complex with both major histocompatibility complex (MHC) class I and class II molecules, thereby activating both T and B cells (1479); in contrast, prior approaches presented antigen only via MHC class II (1478). Because these vaccines encode specific proteins, they provide many of the benefits of a protein subunit vaccine while carrying no risk of the vaccine material being live, replicating, or spreading, and their manufacturing process lends itself to scalability (1479). The opportunities here can be framed in terms of the central dogma of molecular biology: instead of directly providing proteins from the infectious agent, vaccine developers are exploring the potential for delivered DNA or RNA to induce the cell to produce viral proteins that in turn induce a host immune response.

8.7 Cross-Platform Considerations in Vaccine Development

Certain design decisions are relevant to vaccine development across multiple platforms. One concerns the antigen itself, which in the case of SARS-CoV-2 vaccines is the S protein. The prefusion conformation of the S protein, i.e., its structure before the virus fuses to the host cell membrane, is metastable (1480), and the release of energy during membrane fusion drives this process forward following destabilization (12, 1481). Due to the significant conformational changes that occur during membrane fusion (25, 1482, 1483), S protein immunogens that are stabilized in the prefusion conformation are of particular interest, especially because a prefusion-stabilized Middle East respiratory syndrome-related coronavirus (MERS-CoV) S antigen was found to elicit an improved antibody response (1278). Moreover, the prefusion conformation offers an opportunity to target S2, a region of the S protein that accumulates mutations at a slower rate (23, 24, 1278) (see also (1)). Vaccine developers can stabilize the prefusion conformer by selecting versions of the S protein containing mutations that lock it in this position (1484). The immune response to the S protein stabilized in the prefusion conformation is improved relative to other S structures (1485). Thus, vaccines that use the prefusion-stabilized conformation are expected not only to offer improved immunogenicity, but also to be more resilient to the accumulation of mutations in SARS-CoV-2.

Another cross-platform consideration is the use of adjuvants. Adjuvants include a variety of molecules or larger microbial-related products that affect the immune system broadly or an immune response of interest; they can consist of or contain immunostimulants or immunomodulators. Adjuvants are sometimes included within vaccines in order to enhance the immune response. Different adjuvants can regulate different types of immune responses, so the type or combination of adjuvants used in a vaccine will depend on both the type of vaccine and concerns related to efficacy and safety. A variety of possible mechanisms for adjuvants have been investigated (1486-1488).

Due to viral evolution, vaccine developers are in an arms race with a pathogen that benefits from mutations that reduce its susceptibility to adaptive immunity. The evolution of several variants of concern (VOC) presents significant challenges for vaccines developed based on the index strain identified in Wuhan in late 2019. We discuss these variants in depth elsewhere in the COVID-19 Review Consortium project (1148). To date, the most significant variants of concern identified are Alpha (2020), Beta (2020), Gamma (2020), Delta (2021), Omicron (2021), and related Omicron subvariants (2022). The efficacy or effectiveness (i.e., prevention in trials or in real-world settings, respectively) of vaccines in the context of these variants is discussed where information is available.

8.8 DNA Vaccine Platforms

DNA vaccine technologies have developed slowly over the past thirty years. These vaccines introduce a vector containing a DNA sequence that encodes antigen(s) selected to induce a specific immune response (1478). Early attempts revealed issues with low immunogenicity (1128, 1478, 1489). Additionally, initial skepticism about the approach suggested that DNA vaccines might integrate into the host genome or induce autoimmune disease (1479, 1490), but pre-clinical and clinical studies have consistently disproved these hypotheses and indicated DNA vaccines to be safe (1489). Another issue, the antibiotic resistance markers introduced during the plasmid selection process, did remain a concern during this initial phase of development (1479), but it was resolved through strategic vector design (1491, 1492). However, for many years, the immunogenicity of DNA vaccines failed to reach expectations (1479). Several developments during the 2010s led to greater efficacy of DNA vaccines (1479), yet no DNA vaccines had been approved for use in humans prior to the COVID-19 pandemic (1489, 1493). As of May 3, 2023, 10 DNA-based SARS-CoV-2 vaccines have been approved worldwide (Table 4). These vaccines fall into two categories: those vectored with a plasmid and those vectored with another virus.

Table 4: DNA vaccines approved in at least one country (1169) as of May 3, 2023.
| Vaccine | Company | Platform |
| --- | --- | --- |
| iNCOVACC | Bharat Biotech | non-replicating viral vector |
| Convidecia | CanSino | non-replicating viral vector |
| Convidecia Air | CanSino | non-replicating viral vector |
| Gam-COVID-Vac | Gamaleya | non-replicating viral vector |
| Sputnik Light | Gamaleya | non-replicating viral vector |
| Sputnik V | Gamaleya | non-replicating viral vector |
| Jcovden | Janssen (Johnson & Johnson) | non-replicating viral vector |
| Vaxzevria | Oxford/AstraZeneca | non-replicating viral vector |
| Covishield (Oxford/AstraZeneca formulation) | Serum Institute of India | non-replicating viral vector |
| ZyCoV-D | Zydus Cadila | plasmid vectored |

8.8.1 Plasmid-Vectored DNA Vaccines

Many DNA vaccines use a plasmid vector-based approach, in which the sequence encoding the antigen(s) against which an immune response is sought is carried in a plasmid and delivered directly to an appropriate tissue (1494). Plasmids can also be designed to act as adjuvants by targeting essential regulators of pathways such as the inflammasome, or specific cytokines (1490, 1495). The DNA itself may also stimulate the innate immune response (1478, 1492). Once the plasmid brings the DNA sequence to an antigen-presenting cell (APC), the host machinery can be used to construct antigen(s) from the transported genetic material, and the body can then synthesize antibodies in response (1479). The vectors are also edited to remove extraneous sequences (1492). These types of manufacturing advances have improved the safety and throughput of this platform (1492).

8.8.1.1 Prior Applications

In the 1990s and 2000s, DNA vaccines delivered via plasmids sparked significant scientific interest, leading to a large number of preclinical trials (1479). Early preclinical trials primarily focused on long-standing disease threats, including viral diseases such as rabies and parasitic diseases such as malaria, and promising results led to phase I testing of the application of this technology to human immunodeficiency virus (HIV), influenza, malaria, and other diseases of concern during this period (1479). Although they were well-tolerated, these early attempts to develop vaccines were generally not very successful in inducing immunity to the target pathogen, with either limited T-cell or limited neutralizing antibody responses observed (1479).

Early plasmid-vectored DNA vaccine trials targeted HIV and subsequently diseases of worldwide importance such as malaria and hepatitis B (1496). The concern with these early development projects was immunogenicity, not safety (1496). Around the turn of the millennium, a hepatitis B vaccine development program demonstrated that these vaccines can induce both antibody and cellular immune responses (1497). Prior to COVID-19, however, plasmid-vectored DNA vaccines had been approved for commercial use only in veterinary populations (1498-1500). Between 2005 and 2006, several DNA vaccines were developed for non-human animal populations, including vaccines against a rhabdovirus in fish (1501), porcine reproductive and respiratory syndrome virus (1502), and West Nile virus in horses (1503). Within the past five years, additional plasmid-vectored vaccines for immunization against viruses were developed against a herpesvirus (in mice) (1504) and an alphavirus (in fish) (1505).

8.8.1.2 Applications to COVID-19

Several plasmid-vectored DNA vaccines have been developed against COVID-19 (Table 4). In fact, the ZyCoV-D vaccine, developed by India’s Zydus Cadila, is the first plasmid-vectored DNA vaccine to receive approval or to be used in human medicine (1506-1508). Another plasmid-vectored DNA vaccine, INO-4800 (1509), developed by Inovio Pharmaceuticals, uses electroporation as an adjuvant. Electroporation was developed as a solution to the issue of limited immunogenicity: delivering electrical pulses increases the permeability of cell membranes (1510). It has been shown that electroporation can enhance vaccine efficacy by up to 100-fold, as measured by increases in antigen-specific antibody titers (1511). The temporary formation of pores through electroporation facilitates the transport of macromolecules into cells, allowing cells to robustly take up INO-4800 and produce an antibody response. For INO-4800, the plasmid-vectored vaccine is delivered through intradermal injection, which is then followed by electroporation with a device known as CELLECTRA® (1512). The safety of the CELLECTRA® device has been studied for over seven years, and these studies support the further development of electroporation as a safe vaccine delivery method (1510).

These vaccines therefore represent implementations of a new platform technology. In particular, they offer the advantage of temperature stability, facilitating worldwide administration (1513). Although an exciting development in DNA vaccines, the costs of vaccine manufacturing and electroporation may make it difficult to scale this technology for prophylactic use in the general public.

8.8.1.3 Trial Safety and Immunogenicity

The INO-4800 trials began with a phase I trial evaluating two different doses administered as a two-dose series (1512). This trial found the vaccine to be safe, with only six adverse events (AEs) reported among 39 participants, all grade 1, and immunogenic, with all but three of the 38 participants included in the analysis developing serum IgG binding titers to the SARS-CoV-2 S protein (1512). The phase II trial of 401 adults at high risk of exposure to SARS-CoV-2 similarly supported the safety and immunogenicity of INO-4800: only one grade 3 treatment-related AE was observed, and the vaccine was associated with a significant increase in neutralizing activity (1513). Results of phase III trials are not yet available (1514-1517).

Trials of ZyCoV-D have progressed further. This vaccine uses a plasmid to deliver the sequence encoding an expression-competent Spike protein together with an IgE signal peptide to the vaccinee (1518). During the phase I trial, vaccination with a needle versus a needle-free injection system was evaluated, and the vaccine can now be administered without a needle (1506, 1507). A phase III trial enrolling over 27,000 patients found no difference in AEs between the placebo and treatment groups and estimated the efficacy of ZyCoV-D to be 66.6% (1519). It was authorized for people ages 12 and older (1508). The highly portable design offers advantages over traditional vaccines (1518), especially as the emergence of variants continues to challenge the effectiveness of vaccines. As of August 2022, ZyCoV-D had been approved only in India (1520) and is not tracked by Our World in Data (1116).

8.8.1.4 Real-World Safety and Effectiveness

In terms of the ability of the immune response induced by plasmid-vectored vaccines to neutralize VOC, varying information is available. For ZyCoV-D, the phase III trial itself occurred during the Delta wave in India (1519). At present, no major press releases have addressed the vaccine’s ability to neutralize Omicron and related VOC, but reporting suggests that the manufacturer was optimistic about the vaccine in light of the Omicron variant as of late 2021 (1521).

As for INO-4800, studies have examined whether the induced immune response can neutralize existing VOC. These studies assessed neutralization of several VOC relative to the index strain and found no difference in neutralization between the index strain and the Gamma VOC (P.1) (1522). However, neutralization of the Alpha and Beta VOC was significantly lower (approximately two-fold and seven-fold, respectively) (1522). These findings are in line with the shifts in effectiveness reported for other vaccines (5). In addition to loss of neutralizing activity due to viral evolution, studies have also evaluated the decline over time of neutralizing antibodies (nAbs) induced by INO-4800. Levels of nAbs remained significantly elevated relative to the pre-vaccination baseline for six months (1523), and administration of a booster dose induced a significant increase in titers relative to their pre-booster levels (1523). Given the timing of this trial (enrollment between April and July 2020), it is unlikely that participants were exposed to VOC associated with decreased efficacy.

In light of the emergence of VOC against which many vaccines show lower effectiveness, Inovio Pharmaceuticals began to develop a new vaccine with the goal of improving robustness against known and future VOC (1524). Known as INO-4802, this vaccine was designed to express a pan-Spike immunogen (1525). Booster studies in rodents (1526) and non-human primates (1525) suggest that it may be more effective than INO-4800 in providing immunity to VOC such as Delta and Omicron when administered as part of a heterologous boost regimen, although boosting with INO-4800 was also very effective in increasing immunity in rhesus macaques (1525). Therefore, boosting is likely to be an important strategy for this vaccine, especially as the virus continues to evolve.

8.8.2 Viral-Vectored DNA Vaccines

Plasmids are not the only vectors that can be used to deliver sequences associated with viral antigens; genetic material from the target virus can also be delivered using a second virus as a vector. Viral vectors have emerged as a safe and efficient method to furnish the nucleotide sequence of an antigen to the immune system (1527). The genetic content of the vector virus is often altered to prevent it from replicating, although replication-competent viruses can be used under certain circumstances (1528). Once the viral vector brings the DNA sequence to an APC, the host machinery can be used to construct antigen(s) from the transported genetic material, and the host can then synthesize antibodies in response (1479).

One of the early viral vectors explored was adenovirus, with serotype 5 (Ad5) being particularly effective (1479). This technology rose in popularity during the 2000s because it proved more immunogenic in humans and non-human primates than plasmid-vectored DNA vaccines (1479). In the 2000s, interest also arose in utilizing simian adenoviruses as vectors because of the reduced risk that human vaccine recipients would have prior exposure resulting in adaptive immunity (1479, 1529), and chimpanzee adenoviruses were explored as a potential vector in the development of a vaccine against MERS-CoV (1530).

Today, various viral-vector platforms, including poxviruses (1531, 1532), adenoviruses (1533), and vesicular stomatitis viruses (1534, 1535), are being developed. Viral-vector vaccines are able to induce both antibody and cellular responses; however, the response can be limited by the immunogenicity of the viral vector used (1533, 1536). An important consideration in identifying potential vectors is therefore the immune response to the vector itself. Both the innate and adaptive immune responses can potentially respond to the vector, limiting the ability of the vaccine to transfer information to the immune system (1537). Different vectors are associated with different levels of reactogenicity; for example, adenoviruses elicit a much stronger innate immune response than replication-deficient adeno-associated viruses derived from parvoviruses (1537). Additionally, using a virus circulating widely in human populations as a vector presents additional challenges because vaccine recipients may already have developed an immune response to the vector (1538). Furthermore, repeated exposure to adenoviruses via viral-vectored DNA vaccines may increase reactivity to these vectors over time, presenting a challenge that will need to be considered in the long-term development of these vaccines (1539, 1540).

8.8.2.1 Prior Applications

Several viral vector vaccines are available for veterinary use (1479, 1541), but prior to the COVID-19 pandemic, only one viral vector vaccine had been approved by the United States Food and Drug Administration (FDA) for use in humans: a vaccine vectored with a recombinant vesicular stomatitis virus and targeted against the Ebola virus (1542). Additionally, several phase I and phase II clinical trials for other vaccines are ongoing (1527), and the technology is currently being explored for its potential against numerous infectious diseases including malaria (1543, 1544), Ebola (1545-1547), and HIV (1548, 1549).

The threat of MERS and SARS initiated interest in the application of viral vector vaccines to human coronaviruses (1530), but efforts to apply this technology to these pathogens had not yet produced a licensed vaccine. In the mid-to-late 2000s, adenoviral-vectored vaccines against SARS were found to induce SARS-CoV-specific IgA in the lungs of mice (1550) but were later found to offer incomplete protection in ferret models (1551). The Gamaleya National Center of Epidemiology and Microbiology in Moscow sought to use an adenovirus platform for the development of vaccines against MERS-CoV and Ebola virus, although neither of these earlier vaccines was internationally licensed (1552).

In 2017, results were published from an initial investigation of two vaccine candidates against MERS-CoV containing the MERS-CoV S gene vectored with Chimpanzee Adenovirus Oxford University #1 (ChAdOx1), a replication-deficient chimpanzee adenovirus (1553). This study reported that a candidate containing the complete S protein sequence induced a stronger neutralizing antibody response in mice than candidates vectored with modified vaccinia virus Ankara.

The candidate was pursued in additional research, and in the summer of 2020 results of two studies were published. The first reported that a single dose of ChAdOx1 MERS induced an immune response and inhibited viral replication in macaques (1554). The second reported promising results from a phase I trial that administered the vaccine to adults and measured safety, tolerability, and immune response (1555).

8.8.2.2 Application to COVID-19

Figure 11: Worldwide availability of vaccines developed using non-replicating viral vectors. This figure reflects the number of vaccines using non-replicating viral vectors that were available in each country as of May 31, 2023. These data are retrieved from Our World in Data (1116) and plotted using geopandas (1170). See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily. Note that this figure draws from a different data source than Table 4 and does not necessarily include data for every vaccine developed within this category.
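
The exact plotting code behind Figure 11 (and Figure 12 below) is not reproduced here; purely as an illustration of the general approach described in the caption, the sketch below aggregates a per-country count of available vaccines and renders it as a choropleth with geopandas. The input file names and column names ("vaccine_availability.csv", "iso_code", "vaccine", "iso_a3") are hypothetical placeholders rather than the actual Our World in Data schema or boundary layer used for the published figure.

```python
# Illustrative sketch only: count the vaccines available per country and plot
# the result as a choropleth with geopandas. File and column names are
# hypothetical placeholders, not the schema used for the published figures.
import geopandas as gpd
import pandas as pd

# Hypothetical long-format table: one row per (country, vaccine) pair.
availability = pd.read_csv("vaccine_availability.csv")
counts = (
    availability.groupby("iso_code")["vaccine"]
    .nunique()
    .rename("n_vaccines")
    .reset_index()
)

# Any country-boundary layer with an ISO-3 code column would work here.
world = gpd.read_file("natural_earth_countries.geojson")
world = world.merge(counts, left_on="iso_a3", right_on="iso_code", how="left")

ax = world.plot(
    column="n_vaccines",
    legend=True,
    missing_kwds={"color": "lightgrey"},  # countries with no reported data
)
ax.set_axis_off()
ax.set_title("Number of vaccines available per country (illustrative)")
ax.figure.savefig("vaccine_availability_map.png", dpi=300, bbox_inches="tight")
```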

While not all of the above results were available at the time that vaccine development programs against SARS-CoV-2 began, at least three viral vector vaccines have been developed against SARS-CoV-2 (Figure 11). First, a collaboration between AstraZeneca and researchers at the University of Oxford successfully applied a viral vector approach to the development of a vaccine against SARS-CoV-2 using the replication-deficient ChAdOx1 vector modified to encode the S protein of SARS-CoV-2 (1556). In preclinical studies, the immunogenic potential of vaccine candidate ChAdOx1 nCoV-19 was demonstrated through immune challenge in two animal models, mice and rhesus macaques (1556). In a phase I/II trial, patients receiving the ChAdOx1 nCoV-19 vaccine developed antibodies to the SARS-CoV-2 Spike protein that peaked by day 28, with these levels remaining stable until a second observation at day 56 (1557).

Second, a viral vector approach was applied by Russia’s Gamaleya Research Institute of Epidemiology and Microbiology to develop Sputnik V, a replication-deficient recombinant adenovirus (rAd) vaccine that combines two adenovirus vectors, rAd26-S and rAd5-S, each expressing the full-length SARS-CoV-2 Spike glycoprotein. The two vectors are administered intramuscularly as separate vaccines in a prime-boost regimen: rAd26-S is administered first, followed by rAd5-S 21 days later. Each dose delivers 10^11 viral particles. This two-vector approach is designed to overcome any potential pre-existing immunity to adenovirus in the population (1558), as some individuals may possess immunity to Ad5 (1559). Sputnik V is the only recombinant adenovirus vaccine to utilize two different vectors.

Third, Janssen Pharmaceuticals, Inc., a subsidiary of Johnson & Johnson, developed a viral vector vaccine in collaboration with and funded by the United States’ “Operation Warp Speed” (1560, 1561). The vaccine candidate JNJ-78436735, formerly known as Ad26.COV2-S, is a monovalent vaccine composed of a replication-deficient adenovirus serotype 26 (Ad26) vector expressing the stabilized prefusion S protein of SARS-CoV-2 (1485, 1562). Unlike the other two viral vector vaccines available to date, JNJ-78436735 requires only a single dose, a characteristic that was expected to aid in global deployment (1563). JNJ-78436735 was selected from among a number of initial candidate designs (1485) and tested in vivo in Syrian golden hamsters and rhesus macaques to assess safety and immunogenicity (1485, 1563-1565). The JNJ-78436735 candidate was selected for its favorable immunogenicity profile and ease of manufacturability (1485, 1563-1565) and was found to confer protection against SARS-CoV-2 in macaques even after six months (1566). The one- versus two-dose regimen was then tested in volunteers through a phase I/IIa trial (1562, 1567). A major difference between this vaccine and the other two in this category is that its S protein immunogen is stabilized in the prefusion conformation, whereas in the Sputnik V and AstraZeneca vaccines it is not.

As of May 31, 2023, data describing the distribution of 5 viral-vectored vaccines in 203 countries are available (Figure 11). ChAdOx1 nCoV-19 was first approved for emergency use on December 30, 2020 in the U.K. (1568). Sputnik V became available soon after; as early as January 2021, it had been administered to 1.5 million Russians (1569), and doses began to be distributed to other European countries such as Belarus, Bosnia-Herzegovina, Hungary, San Marino, Serbia, and Slovakia (1570-1572).

8.8.2.3 Trial Estimates of Safety and Efficacy

The first DNA viral-vectored vaccine for which efficacy estimates became available was AstraZeneca’s ChAdOx1 nCoV-19. In December 2020, preliminary results of the phase III trial were released detailing randomized controlled trials conducted in the United Kingdom (U.K.), Brazil, and South Africa between April and November 2020 (1469). These trials compared ChAdOx1 nCoV-19 to a control, but the design of each study varied; pooling data across studies indicated an overall efficacy of 70.4%. For Sputnik V, the phase III trial indicated an overall vaccine efficacy of 91.6% against symptomatic COVID-19 (1573). As for Janssen, the vaccine was well-tolerated, and across all regions studied, it was found to be 66.9% effective after 28 days for the prevention of moderate to severe COVID-19 and 81.7% effective for the prevention of laboratory-confirmed severe COVID-19 (1574). There were no COVID-19-associated deaths in the vaccine group. However, the emergence of the Beta variant in the South African trial population was associated with a slightly reduced efficacy (64% two weeks after receipt), and all of the COVID-19-associated deaths in the trial occurred in the South African placebo cohort (1574). In February 2021, the FDA issued an emergency use authorization (EUA) for the Janssen vaccine based on interim results from the phase III trial (1575, 1576).

Two of the three vaccines have faced a number of criticisms surrounding the implementation of their clinical trials. In the race to develop vaccines against SARS-CoV-2, President Vladimir Putin of Russia announced the approval of the Sputnik V vaccine on August 11, 2020 in the absence of clinical evidence (1577). A press release on November 11, 2020 indicated positive results from an interim analysis of the phase III Sputnik V trials, which reported 92% efficacy in 16,000 volunteers (1578). However, this release came only two days after Pfizer and BioNTech reported that their vaccine had an efficacy over 90%, which led to significant skepticism of the Russian findings for a variety of reasons, including the lack of a published protocol and the “reckless” approval of the vaccine in Russia months prior to the publication of the interim results of the phase III trial (1578, 1579). Consequently, many international scientific agencies and public health bodies expressed concern that due diligence to the clinical trial process was subverted for the sake of expediency, leading many to question the safety and efficacy of Sputnik V (1577, 1580, 1581). Despite regulatory, safety, and efficacy concerns, pre-orders for 1 billion doses of Sputnik V were reported within days of the vaccine’s approval in Russia (1577). Almost a month later, the phase I/II trial data were published (1582). It was not until February 2021, six months after its approval in Russia, that interim results of the phase III trial were released (1573). This publication reported a vaccine efficacy (VE) of 91% and a low rate of serious AEs; several serious AEs did occur but were determined by an independent data monitoring committee not to be associated with the vaccine, and little other information about them was released (1583).

AstraZeneca’s clinical trial also faced criticism. The trial was paused in September 2020 following a severe adverse event in one participant (1584). It was restarted soon after (1585), but the pause reportedly was not mentioned to the FDA during a call the morning before the story broke (1586). Additionally, individual sites within the trial employed somewhat different designs but were combined for analysis. For example, in South Africa, the trial was double-blinded, whereas in the U.K. and Brazil it was single-blinded, and one of the two trials carried out in the U.K. evaluated two dosing regimens (low dose or standard dose, both followed by a standard dose). Some of the trials used a meningococcal conjugate vaccine (MenACWY) as a control, while others used saline. Data were pooled across countries for analysis, a design decision that was approved by regulators but raised some questions when higher efficacy was reported in a subgroup of patients who received a low dose followed by a standard dose. This group came about because some participants in the U.K. were erroneously primed with a much lower dose, which turned out to have higher efficacy than the intended dose (1587). Combining the data then led to confusion surrounding the VE, as VE varied widely among conditions (e.g., 62% VE in the standard-dose group versus 90% in the group that received a low prime dose (1469)). Subsequent research, however, suggests that reducing the prime dose may, in fact, elicit a superior immune response in the long term despite a lower initial response (1588). Therefore, this error may serendipitously improve the efficacy of viral-vectored vaccines broadly.

8.8.2.4 Real-World Safety and Efficacy

Following the trials, additional concerns have been raised about some of these vaccines. Within a few days to a few weeks of their first dose of the AstraZeneca vaccine, three women developed extensive venous sinus thrombosis (1589). In March 2021, administration of the vaccine was paused in several European countries while a possible link to thrombotic events was investigated (1590), as these adverse events had not been observed in clinical trials, but the European Medicines Agency (EMA) soon determined that 25 such events were not related to the vaccine (1591). The following month, the United States paused administration of the Janssen vaccine for ten days due to 15 similar AEs (1592, 1593), but the EMA, the FDA, and the U.S. Centers for Disease Control and Prevention’s Advisory Committee on Immunization Practices again identified the events as very rare and judged the benefits of the vaccine likely to outweigh its risks (1594-1597). In Denmark and Norway, population-based estimates suggested AstraZeneca’s vaccine increased the incidence of venous thromboembolic events by 11 cases over baseline per 100,000 doses (1598). Estimates of the incidence in other western countries have also been low (1599). In the US, thromboembolic events following the Janssen vaccine have also been very rare (1595). Subsequently, a potential mechanism was identified: the adenovirus vector binding to platelet factor 4 (1600, 1601). Because this adverse event is so rare, the risk is likely still outweighed by the risks associated with contracting COVID-19 (1602), which is itself associated with thrombotic events (1593, 1603). Similarly, concerns about Guillain-Barré syndrome arose in connection with the Janssen vaccine, but these events have similarly been determined to be very rare and the benefits to outweigh the risks (1597).

Given that vaccines from multiple platforms are now widely available, people at increased risk of a specific severe AE may have options to pursue vaccination with a platform that does not carry such risks. For example, a woman in the U.S. with a history of thromboembolic concerns might feel more comfortable with an mRNA vaccine (described below), where such AEs have not been identified in association with COVID-19 vaccination. However, within the U.S.A., no clear framework has been established for advising patients on whether a specific vaccine may be preferable for their individual concerns now that vaccines based on three different technologies are widely available (see (5) for information about Novavax, which is a protein subunit vaccine).

8.9 mRNA Vaccines

Building on DNA vaccine technology, RNA vaccines are an even more recent advancement for vaccine development. Interest in messenger RNA (mRNA) vaccines emerged around 1990 following in vitro and animal model studies that demonstrated that mRNA could be transferred into cells (1604, 1605). mRNA contains the minimum information needed to create a protein (1605). RNA vaccines are therefore nucleic-acid based modalities that code for viral antigens against which the human body elicits a humoral and cellular immune response.

The strategy behind mRNA vaccines operates one step downstream of DNA in the central dogma: instead of directly furnishing the gene sequence associated with an antigen to the host, these vaccines provide the mRNA transcribed from that sequence. The mRNA is transcribed in vitro and delivered to cells via lipid nanoparticles (LNP) (1606). It is recognized by ribosomes in vivo and then translated into proteins, which are subsequently modified into their functional forms (1127). The resulting intracellular viral proteins are displayed on surface MHC proteins, provoking a strong CD8+ T cell response as well as CD4+ T cell and B cell-associated antibody responses (1127). mRNA is inherently unstable and can degrade quickly in the extracellular environment or the cytoplasm; the LNP coating protects the mRNA from enzymatic degradation outside of the cell (1607). Codon optimization to prevent secondary structure formation, along with modifications of the poly-A tail and the 5’ untranslated region to promote ribosomal complex binding, can increase mRNA expression in cells. Furthermore, purifying out double-stranded RNA and immature RNA with fast protein liquid chromatography (FPLC) and high-performance liquid chromatography (HPLC) improves translation of the mRNA in the cell (1127, 1608).
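
As a toy illustration of the codon optimization step mentioned above, and not of the proprietary sequence design pipelines actually used for the deployed vaccines, the sketch below recodes a short peptide using a hand-picked table of common human codons; real designs additionally balance GC content, secondary structure, the untranslated regions, and the poly-A tail.

```python
# Toy illustration of codon optimization: recode a peptide using a single
# preferred human codon per amino acid. Real mRNA vaccine design also tunes
# GC content, secondary structure, UTRs, and the poly-A tail; the codon table
# below is a small, hand-picked subset for demonstration only.
PREFERRED_HUMAN_CODONS = {
    "M": "ATG", "F": "TTC", "L": "CTG", "S": "AGC", "Y": "TAC",
    "C": "TGC", "W": "TGG", "P": "CCC", "H": "CAC", "Q": "CAG",
    "R": "AGA", "I": "ATC", "T": "ACC", "N": "AAC", "K": "AAG",
    "V": "GTG", "A": "GCC", "D": "GAC", "E": "GAG", "G": "GGC",
    "*": "TGA",
}

def codon_optimize(peptide: str) -> str:
    """Return a coding sequence using the preferred codon for each residue."""
    return "".join(PREFERRED_HUMAN_CODONS[aa] for aa in peptide.upper())

# Example: the first residues of the SARS-CoV-2 Spike protein (signal peptide
# region), used here only to show the recoding step.
print(codon_optimize("MFVFLVLLPLVSS"))
```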

There are three types of RNA vaccines: non-replicating, in vivo self-replicating, and in vitro dendritic cell non-replicating (1609). Non-replicating mRNA vaccines consist of a simple open reading frame for the viral antigen flanked by the 5’ UTR and 3’ poly-A tail. In vivo self-replicating vaccines encode a modified viral genome derived from single-stranded, positive sense RNA alphaviruses (1127, 1608). The RNA genome encodes the viral antigen along with proteins of the genome replication machinery, including an RNA polymerase. Structural proteins required for viral assembly are not included in the engineered genome (1127). Self-replicating vaccines produce more viral antigens over a longer period of time, thereby evoking a more robust immune response (1609). Finally, in vitro dendritic cell non-replicating RNA vaccines limit transfection to dendritic cells. Dendritic cells are potent antigen-presenting immune cells that easily take up mRNA and present fragments of the translated peptide on their MHC proteins, which can then interact with T cell receptors. Ultimately, primed T follicular helper cells can stimulate germinal center B cells that also present the viral antigen to produce antibodies against the virus (1610). These cells are isolated from the patient, then grown and transfected ex vivo (1611). They can then be reintroduced to the patient (1611).

In addition to the benefits of nucleic acid vaccines broadly, mRNA confers specific advantages compared to DNA vaccines and other platforms (1612). Some of these advantages fall within the domain of safety. Unlike DNA vaccines, mRNA technologies are naturally degradable and non-integrating, and they do not need to cross the nuclear membrane in addition to the plasma membrane for their effects to be seen (1127). Additionally, the half-life of the mRNA can be regulated by the contents of the 5’ and 3’ untranslated regions (1613). In comparison to vaccines that use live attenuated viruses, mRNA vaccines are non-infectious and can be synthetically produced in an egg-free, cell-free environment, thereby reducing the risk of a detrimental immune response in the host (1614). Furthermore, mRNA vaccines are easily, affordably, and rapidly scalable, although it took time to reach manufacturing capacity sufficient for the global population (1612).

8.9.0.1 Prior Applications

Although mRNA vaccines had been developed for therapeutic and prophylactic purposes, none had previously been licensed or commercialized. Challenges included the instability of mRNA molecules, the design requirements of an efficient delivery system, and the potential for mRNA either to elicit an excessive immune response or to stimulate the immune system in unintended, secondary ways (1476, 1615). As of the 2010s, mRNA was still considered a promising technology for future advances in vaccine development (1605), but prior to 2020, no mRNA vaccines had been approved for use in humans, despite significant advances in the development of this technology (1611). This approach showed promise in animal models and preliminary clinical trials for several indications, including rabies, coronavirus, influenza, and cytomegalovirus (1616). Preclinical data previously demonstrated effective antibody generation in response to mRNA encoding the full-length influenza hemagglutinin stalk in mice, rabbits, and ferrets (1617). Similar immunological responses to mRNA vaccines were observed in humans in phase I and II clinical trials operated by the pharmaceutical companies CureVac and Moderna for rabies, influenza, and Zika (1608). Positively charged bilayer LNPs carrying the mRNA are attracted to negatively charged cell membranes, are taken up by endocytosis (1607), and facilitate endosomal escape. LNPs can be coated with modalities recognized and engulfed by specific cell types, and LNPs that are 150 nm or smaller enter lymphatic vessels effectively (1607, 1618). Therefore, while these technologies elegantly capitalize on decades of research in vaccine development as well as the tools of the genomic revolution, it was largely unknown prior to the SARS-CoV-2 pandemic whether this potential could be realized in a real-world vaccination effort.

8.9.0.2 Application to COVID-19

Table 5: mRNA vaccines approved in at least one country (1169) as of May 3, 2023. As a note, this table includes licensing of existing mRNA technology, i.e., TAK-919 is used to describe Takeda’s manufacturing of Moderna’s formulation.
| Vaccine | Company |
| --- | --- |
| GEMCOVAC-19 | Gennova Biopharmaceuticals Limited |
| Spikevax | Moderna |
| Spikevax Bivalent Original/Omicron BA.1 | Moderna |
| Spikevax Bivalent Original/Omicron BA.4/BA.5 | Moderna |
| Comirnaty | Pfizer/BioNTech |
| Comirnaty Bivalent Original/Omicron BA.1 | Pfizer/BioNTech |
| Comirnaty Bivalent Original/Omicron BA.4/BA.5 | Pfizer/BioNTech |
| TAK-919 (Moderna formulation) | Takeda |
| AWcorna | Walvax |

Given the potential for mRNA technology to be quickly adapted for a new pathogen, it was favored as a potential vaccine against COVID-19, and fortunately, the prior work in mRNA vaccine development paid off, with 9 mRNA vaccines available in at least one country as of May 3, 2023 (Table 5). In the vaccines developed under this approach, the mRNA coding for a stabilized prefusion Spike protein, which is immunogenic (1619), is furnished to the immune system in order to train its response.

Two vaccine candidates in this category emerged with promising phase III results at the end of 2020. Both require two doses administered approximately one month apart. The first was Pfizer/BioNTech’s BNT162b2, which contains the full prefusion-stabilized, membrane-anchored SARS-CoV-2 Spike protein in a vaccine formulation based on modified mRNA (modRNA) technology (1620, 1621). The second mRNA vaccine, mRNA-1273, developed by ModernaTX, consists of a conventional LNP-encapsulated mRNA encoding a full-length prefusion-stabilized S protein of SARS-CoV-2 (1622). The vaccine candidates developed against SARS-CoV-2 using mRNA platforms utilize similar principles and technologies, although there are slight differences in implementation among candidates, such as the formulation of the platform and the specific components of the Spike protein encoded (e.g., the full Spike protein versus the RBD alone) (1623). As of May 31, 2023, 2 mRNA vaccines are available in 169 countries (Figure 12).

Figure 12: Worldwide availability of vaccines developed using mRNA. This figure reflects the number of vaccines based on mRNA technology that were available in each country as of May 31, 2023. These data are retrieved from Our World in Data (1116) and plotted using geopandas (1170). See https://greenelab.github.io/covid19-review/ for the most recent version of this figure, which is updated daily. Note that this figure draws from a different data source than Table 5 and does not necessarily include data for every vaccine developed within this category.

The rapid and simultaneous development of these vaccines was met with some controversy related to intellectual property (IP). First, the National Institutes of Health (NIH) and Moderna became involved in a patent dispute after researchers at the NIH argued that they were unfairly excluded from patents filed based on their IP, having generated the stabilized modRNA sequence used in the vaccine (1624). Ultimately, in late 2021, Moderna backed down on the patent application (1625). However, in August 2022, the company filed its own suit against Pfizer/BioNTech over IP related to the modRNA used in the latter’s COVID-19 vaccine (1625, 1626). The outcome of this suit remains to be seen.

8.9.0.3 Trial Safety and Immunogenicity

The VEs revealed by the Pfizer/BioNTech and Moderna clinical trials exceeded expectations. In a phase II/III multinational trial, Pfizer/BioNTech’s BNT162b2 vaccine was associated with 95% efficacy against laboratory-confirmed COVID-19 and with mild-to-moderate local and systemic effects but a low risk of serious AEs when the prime-boost doses were administered 21 days apart (516). The ModernaTX mRNA-1273 vaccine was the second mRNA vaccine to release phase III results, despite having been the first mRNA vaccine to enter phase I clinical trials, and interim results of its phase III trial were published a few months later. This study reported 94.5% vaccine efficacy in preventing symptomatic COVID-19 in adults who received the vaccine at 99 sites around the United States (1627). Similar to BNT162b2, the mRNA-1273 vaccine was associated with mild-to-moderate AEs but a low risk of serious AEs (1627). In late 2020, both vaccines received approval from the FDA under an EUA (1628, 1629), and these vaccines have been widely distributed, primarily in North America and the European Union (1186). As the first mRNA vaccines to make it to market, these two highly efficacious vaccines demonstrate the power of this emerging technology, which had previously attracted scientific interest because of its potential to be used to treat non-infectious as well as infectious diseases.

8.9.0.4 Real-World Safety and Effectiveness

As vaccines were rolled out, one study sought to monitor their effectiveness in a real-world setting. Between December 2020 and April 2021, this prospective cohort study obtained weekly nasal swabs from 3,975 individuals at high risk of SARS-CoV-2 exposure (health care workers, frontline workers, etc.) within the United States (332). Among these participants, 3,179 (80%) had received at least one dose of an mRNA vaccine, and of those, 2,686 (84%) were fully vaccinated, corresponding to 68% of trial participants overall. For each vaccinated participant (defined here as having received at least one dose more than 7 days prior) whose sample tested positive for SARS-CoV-2, the investigators categorized the viral lineage(s) present in the sample as well as in samples from 3-4 unvaccinated individuals matched by site and testing date. Overall effectiveness of full vaccination with mRNA vaccines was estimated at 91%, similar to the reports from the clinical trials. The occurrence of fevers was also lower in individuals who were partially or fully vaccinated, and the duration of symptoms was approximately 6 days shorter. Among the five cases in fully vaccinated and 11 cases in partially vaccinated participants, the proportion of infections caused by VOC was much higher than in the unvaccinated population (30% versus 10%), suggesting that the vaccines were less effective against these VOC than against the index strain.

The WHO continues to monitor the emergence of variants and their impact on vaccine efficacy (529). In general, mRNA vaccines remain highly effective against severe illness and death, but their effectiveness against infection has declined. A study monitoring infections in a Minnesota cohort from January to July 2021 estimated that the effectiveness of the Moderna vaccine fell to 86% and that of the Pfizer vaccine to 76%, although protection against hospitalization remained at 91% and 85%, respectively (1630). In July of that year, as the Delta variant became dominant in the U.S.A., these estimates all fell, to an effectiveness of 76% for Moderna and 42% for Pfizer, with effectiveness against hospitalization of 81% and 75%, respectively (1630).

With the emergence of the Omicron VOC, vaccine effectiveness has likely further declined. A study in a diverse cohort in Southern California, U.S.A. found the effectiveness of the Moderna vaccine in participants who had received only the primary course to be 44% (1631). A study in South Africa compared case and hospitalization records from a 4-week period where Omicron was dominant to a 2-month period where Delta was dominant and found that the effectiveness against hospitalization during the Omicron wave was approximately 70% compared to 93% during the Delta wave (1632). Similarly, a large study in England of 2.5 million individuals suggested that not only the variants circulating, but also the time since vaccination, played a large role in vaccine effectiveness (1633). Shortly after the BNT162b2 primary course, effectiveness against the Omicron VOC was as high as 65.5%, but this declined to below 10% by six months after the second dose. For mRNA-1273, the decline was from 75.1% to 14.9%. Therefore, it is unsurprising that in spite of vaccination programs, infection rates and hospitalization rates climbed in early 2022 in many Western countries including the United States (1634, 1635), especially given that many places simultaneously began to loosen public health restrictions designed to reduce viral spread.

On the side of safety, the only major concern that has been raised is a possible link between mRNA vaccination and myocarditis, especially in young men (1597). This concern began with case reports of a small number of cases of myocarditis following vaccination in several countries (1636, 1637). Following these reports, the Israeli Ministry of Health began surveillance to monitor the occurrence of myocarditis (1638). They identified 283 cases, almost exactly half of which occurred following vaccination with Pfizer’s BNT162b2. Close analysis of these cases determined that the vaccine did have a significant effect on the incidence of myocarditis; however, the rate of myocarditis remained low overall (1638). The identification of young men as a population at particular risk of this AE was supported, and the risk was found to be greater after the second dose than the first. Both this study and a study evaluating data collected from US population-based surveillance identified an increased risk with additional doses (1639). However, most findings suggest that this AE does not have long-term negative effects; a 2021 meta-analysis identified 69 cases, all of which resulted in full recovery (1640). Although these events are very rare, as with the possible thromboembolic AEs associated with viral-vectored DNA vaccines, these findings suggest that it may be prudent to offer a framework for decision making for patients particularly concerned about specific AEs in settings where multiple vaccines are available.

8.10 Booster Doses

Due to waning effectiveness of vaccines over time, especially in light of viral evolution, boosters have emerged as an important strategy for retaining the benefits of vaccination. Booster shots are now recommended in many places, and boosters that account for multiple variants and strains of SARS-CoV-2 are now available in some places (531). For example, in the U.S.A., the FDA recently recommended bivalent booster doses designed to account for the Omicron VOC (1641-1643). In this case, bivalent refers to the fact that doses deliver both the original formulation and an updated vaccine designed for the Omicron subvariants circulating in summer 2022. The fact that the FDA did not require additional clinical trials from manufacturers for Omicron subvariants BA.4 and BA.5 specifically suggests that the rapid authorization of strain changes in response to emerging VOC may be increasingly attainable (1644). Results suggest that this fourth dose offered at least a short-term increase in VE against Omicron subvariants and also provided additional protection against hospitalization (1645).

Homologous booster doses have been investigated for most vaccines. For example, over 14,000 adults were administered a booster (second) dose of the Janssen Ad26.COV2.S vaccine (1646). The booster dose was highly efficacious, with severe COVID-19 and hospitalization prevented almost completely in the vaccinated group. A booster dose was also found to improve the immune response in Sputnik V vaccinees (1647). For the AstraZeneca vaccine, a different approach was taken: in the interest of distributing first doses as widely as possible, the time between the first and second doses was extended in some places. One study assessed the immunogenicity and reactogenicity associated with delaying the second dose in the prime-boost series until up to 45 weeks after the first, reporting that an extended inter-dose period was associated with increased antibody titers 28 days after the second dose (1648). This analysis also revealed that a third dose provided an additional boost in neutralizing activity (1648).

Third and fourth doses have been introduced for at least some populations in many places in response to the Omicron variant. An early study in Israeli healthcare workers showed that the additional immunization was safe and immunogenic, with antibody titers restored to the peak levels observed after the third dose. No severe illness was reported in the cohort studied (274 vaccinees versus 426 age-matched controls), and vaccine efficacy against infection was reported at 30% for BNT162b2 and 11% for mRNA-1273 (1649). Other studies reported that a third dose of BNT162b2 raised vaccine effectiveness to 67.2% for approximately the first month, but that effectiveness subsequently dropped to 45.7% (1633). Reduced and even low efficacy against infection does not undermine the value of vaccination, considering that the vaccines are intended to prevent severe disease, hospitalization, and death rather than infection generally. However, these findings do suggest that boosters will likely be needed as the virus continues to evolve.

Many trials have also investigated heterologous boosting approaches. In particular, the mRNA vaccines are a popular choice for booster doses regardless of the primary series. In general, such approaches have been found to confer favorable immunogenicity relative to homologous boosters (e.g., (1650-1656) and many other studies). Due to remaining concerns about rare thromboembolic events, vaccinees who received AstraZeneca for their primary course are advised in some countries to seek a heterologous booster (1657), although such guidance is not supported by the evidence, which indicates that the first dose of AstraZeneca is the most likely to be linked to these rare events (1658). In general, heterologous boosting with mRNA vaccines elicits a strong immune response. For patients who received BNT162b2 as a heterologous booster following a ChAdOx1 primary series, vaccine effectiveness was estimated to be 62.4% initially, dropping to 39.6% after 10 weeks (1633). For a heterologous mRNA-1273 booster, effectiveness was estimated to be slightly higher (70.1% dropping to 60.9% following a ChAdOx1 primary series, and 73.9% dropping to 64.4% following BNT162b2) (1633). Therefore, subsequent booster doses may remain an ongoing component of strategies to combat SARS-CoV-2.

Although the vaccines developed based on the index strain remain highly effective at preventing severe illness and death, they are much less effective at preventing illness broadly than they were early in the pandemic. Therefore, many manufacturers are exploring potential reformulations based on VOC that have emerged since the beginning of the pandemic. In June 2022, Moderna released data describing the effect of their bivalent mRNA booster, mRNA-1273.214, designed to protect against the Omicron variant (1659). A 50 μg dose of mRNA-1273.214 was administered to 437 participants, and one month later, neutralizing geometric mean titer ratios were assessed against several variants of SARS-CoV-2, including Omicron. The immune response was higher against all variants assessed, including Omicron, than with boosting using the original formulation (mRNA-1273). Another formulation, mRNA-1273.211, developed based on the Beta variant, has been associated with durable protection as long as six months after dosing. The associated publications suggest that this novel formulation offers significant protection against Omicron and other VOC (1660, 1661). In August 2022, Pfizer also announced the successful development of a new formulation effective against Omicron (1662).

Modularity has been proposed as one of the advantages of DNA and mRNA vaccines, as this design would allow for faster adaptation to viral evolution. However, in the arms race against SARS-CoV-2, the vaccines are still lagging behind the virus. This lag may shrink as regulators become more familiar with these vaccines and as a critical mass of data accumulates. Given the apparent need for boosters, interest has also emerged in whether updated formulations of SARS-CoV-2 vaccines can be administered along with annual flu vaccines to improve immunity to novel variants.

8.11 Conclusions

COVID-19 has seen the coming-of-age of vaccine technologies that have been in development since the late 20th century but had never before been authorized for use. Vaccines that employ DNA and RNA eliminate all concerns about potential infection due to the vaccine components. The vaccines described above demonstrate the potential for these technologies to facilitate a quick response to an emerging pathogen. Additionally, their efficacy in trials far exceeded expectations, especially in the case of RNA vaccines. These technologies hold significant potential to drive improvements in human health over the coming years.

Traditional vaccine technologies were built on the principle of using either a weakened version of the virus or a fragment of the virus. COVID-19 has highlighted the fact that in recent years, the field has undergone a paradigm shift towards reverse vaccinology. Reverse vaccinology emphasizes a discovery-driven approach to vaccine development based on knowledge of the viral genome (1663). This strategy was explored during development of a DNA vaccine against the Zika virus (1664). Though the disease was controlled before the vaccine became available (1423), the response demonstrated the potential for modular technologies to facilitate a response to emerging viral threats (1664). The potential for such vaccines to benefit the field of oncology has encouraged vaccine developers to invest in next-generation approaches, which has spurred the diversification of vaccine development programs (1479, 1665). As a result, during the COVID-19 pandemic, these modular technologies have taken center stage in controlling a viral threat for the first time.

The safety and efficacy of vaccines that use these new technologies have exceeded expectations. While there were rare reports of severe AEs such as myocarditis (mRNA platforms) and thromboembolic events (viral-vectored DNA platforms), widespread availability of both types of vaccines would allow individuals to choose between them (particularly relevant in this case because myocarditis has primarily been reported in men and thromboembolic events primarily in women). Estimates of efficacy have varied widely, but in all cases they are high. Estimates of the efficacy of DNA vaccine platforms have typically fallen either in the range of approximately 67% (ZyCoV-D and Janssen) or 90% (Sputnik V). AstraZeneca’s trial produced estimates in both ranges, with the standard dosage producing an efficacy of 62% and the lower prime dose producing a VE of 90%. The estimates from the mRNA vaccine trials were somewhat higher, with VE estimated at approximately 95% in both the Moderna and Pfizer/BioNTech clinical trials. However, in all cases, the efficacy against severe illness and death was very high. Therefore, all of these vaccines are useful tools for combating COVID-19.

Furthermore, the fact that vaccine efficacy is not a static value has become particularly salient, as real-world effectiveness has changed with location and over time. COVID-19 vaccines have been challenged by the emergence of VOC. These VOC generally carry genetic mutations that code for an altered Spike protein (i.e., the antigen), so the antibodies resulting from immunization with vaccines developed from the index strain neutralize them less effectively (1666, 1667). Despite some reports of varying and reduced effectiveness or efficacy of the mRNA vaccines against the Alpha (B.1.1.7), Beta (B.1.351), and Delta (B.1.617.2) variants versus the original SARS-CoV-2 strain or the D614G variant (1668-1670), the greatest concern to date has been the Omicron variant (B.1.1.529), which was first identified in November 2021 (1667, 1671). As of March 2022, the Omicron variant accounted for 95% of all infections sequenced in the United States (122) and was linked to an increased risk of SARS-CoV-2 reinfection (1666) and further infection of those who have been vaccinated with the mRNA vaccines (1672).

One of the downsides of this leap in vaccine technologies, however, is that they have largely been developed by wealthy countries, including countries in the European Union, the United States, the U.K., and Russia. As a result, they are also largely available to residents of wealthy countries, primarily in Europe and North America. Although the VE of DNA vaccines tends to be lower than that of mRNA vaccines (1673), they still provide excellent protection against severe illness and are much easier to distribute due to less complex demands for storage. Efforts such as COVAX that aim to expand access to vaccines developed by wealthy countries have not been as successful as hoped (1674). Fortunately, vaccine development programs using more established technologies have been undertaken in many middle-income countries, and those vaccines have been more accessible globally (5). Additionally, efforts to develop new formulations of DNA vaccines in lower- and middle-income countries are increasingly being undertaken (1675).

The modular nature of nucleic acid-based vaccine platforms has opened a new frontier in responding to emerging viral illnesses. The RNA vaccines received EUAs in only a few months more than the time it took simply to identify the pathogen causing SARS in 2002. Given the variety of options available for preventing severe illness and death, it is possible that certain vaccines may be preferable for certain demographics (e.g., young women might choose an mRNA vaccine to entirely mitigate the very low risk of blood clots (1597)). However, this option is likely only available to people in high-income countries. In lower-income countries, access to vaccines broadly is a more critical issue. Different vaccines may confer advantages in different countries, and vaccine development in a variety of cultural contexts is therefore important (1676). Without widespread access to vaccines on the global scale, SARS-CoV-2 will continue evolving, presenting a threat to all nations.

9 Appendix: Additional Information about Novel Vaccine Platforms for COVID-19

9.1 Plasmid-Vectored DNA Vaccines

9.1.1 INO-4800

The phase I trial for INO-4800 began enrolling participants in April 2020 in Philadelphia, PA at the Perelman School of Medicine and at the Center for Pharmaceutical Research in Kansas City, MO. This trial examined two different doses administered in a two-dose regimen (1512). Among the 39 participants, only six AEs were reported, and all were grade 1 (1512). Immunogenicity was evaluated based on blood samples collected pre- and post-vaccination, and all but three of the 38 participants included in the analysis were found to have serum IgG binding titers to the spike protein after vaccination (1512).

Results from the phase II trial were released as a preprint in May 2021 and reported findings based on administering INO-4800 to 401 adult volunteers at high risk of exposure to SARS-CoV-2 (1513). The phase II results supported the safety of the vaccine, with 1,446 treatment-related AEs observed across 281 participants, all but one of which were grade 1 or grade 2. The single grade 3 event was joint stiffness (1513). The rates of AEs in the placebo group were not reported. To assess the immunogenicity of INO-4800, pre- and post-vaccination blood samples were collected and evaluated for a humoral immune response to the spike protein, and the treatment group was found to show significantly greater neutralizing activity than the placebo group (1513).

The phase II/III trials are ongoing in several countries, including the United States, Mexico, India, and Colombia (1514–1517). Therefore, vaccine efficacy data from a large study population is not yet available.

9.2 Viral-Vectored DNA Vaccines

9.2.1 ChAdOx1 nCoV-19 (AstraZeneca)

Prior analyses of viral vector vaccines against human coronaviruses (HCoVs) had indicated that this approach showed potential for inducing an immune response, but little information was available about the effect on real-world immunity. In the first phase of development, a candidate ChAdOx1 nCoV-19 was evaluated through the immune challenge of two animal models, mice and rhesus macaques (1556). Animals in the treatment condition were observed to develop neutralizing antibodies specific to SARS-CoV-2 (both macaques and mice) and to show reduced clinical scores when exposed to SARS-CoV-2 (macaques) (1556).

Next, a phase I/II trial was undertaken using a single-blind, randomized controlled design (1557). ChAdOx1 nCoV-19 and a control, the meningococcal conjugate vaccine MenACWY, were administered intramuscularly to adults ages 18 to 55 at five sites within the United Kingdom (U.K.) at a 1:1 ratio (n=543 and n=534, respectively). All but ten participants received a single dose; this small group received a booster 28 days after their first dose of ChAdOx1 nCoV-19. Commonly reported local adverse reactions included mild-to-moderate pain and tenderness at the injection site over the course of seven days, while the most common systemic adverse reactions were fatigue and headache; some patients reported severe adverse systemic effects. The study also reported that many common reactions could be reduced through the administration of paracetamol (acetaminophen), and paracetamol was not found to reduce immunogenicity. Patients receiving the ChAdOx1 nCoV-19 vaccine developed antibodies to the SARS-CoV-2 spike protein that peaked by day 28, with these levels remaining stable until a second observation at day 56, except in the ten patients who received a booster dose at day 28, in whom they increased by day 56. Analysis of serum indicated that participants developed antibodies to both S and the RBD, and that 100% of them achieved neutralizing titers by day 28. By day 35, the neutralization titers of vaccinated patients were comparable to those observed with plasma from convalescents. This initial study therefore suggested that the vaccine was likely to confer protection against SARS-CoV-2, although analysis of its efficacy in preventing COVID-19 was not reported.

In the subsequent efficacy analysis of the later-stage trials, the primary outcome assessed was symptomatic, laboratory-confirmed COVID-19. There were 131 cases observed among the 11,636 participants eligible for the primary efficacy analysis, corresponding to an overall efficacy of 70.4% (30 out of 5,807 in the vaccine arm and 101 out of 5,829 in the control arm); the 95.8% CI was reported as 54.8 to 80.6. However, a higher efficacy was reported in the subgroup of patients who received a low dose followed by a standard dose (90.0%, 95% CI 67.4 to 97.0). A total of ten cases of severe COVID-19 resulting in hospitalization were observed among trial participants, and all of these occurred in patients in the control arm of the study. In line with the previously reported safety profiling for this vaccine, serious adverse events were reported to be comparable across the two arms of the study, with only three events identified as potentially associated with the vaccine itself.
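As a rough illustration of how headline efficacy figures relate to the underlying case counts, the sketch below computes a simple attack-rate-based estimate of vaccine efficacy (VE = 1 − relative risk) from the numbers quoted above. This is a deliberate simplification on our part: the published estimate of 70.4% was derived with a formal statistical model rather than this crude ratio, so the two values agree only approximately.

# Minimal sketch (not the trial's actual statistical analysis): approximate
# vaccine efficacy from the case counts quoted above, using VE = 1 - relative risk.
def vaccine_efficacy(cases_vaccine, n_vaccine, cases_placebo, n_placebo):
    attack_rate_vaccine = cases_vaccine / n_vaccine
    attack_rate_placebo = cases_placebo / n_placebo
    return 1 - attack_rate_vaccine / attack_rate_placebo

ve = vaccine_efficacy(cases_vaccine=30, n_vaccine=5807,
                      cases_placebo=101, n_placebo=5829)
print(f"Approximate vaccine efficacy: {ve:.1%}")  # prints roughly 70.2%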

Additional data about the efficacy of this vaccine became available in a preprint released on March 2, 2021 (1677). This report provided data describing the efficacy of ChAdOx1 nCoV-19, along with Pfizer/BioNTech’s BNT162b2, in the U.K. between December 8, 2020 and February 19, 2021 and specifically sought to evaluate the efficacy of the vaccine in the presence of a potentially more contagious variant of concern, B.1.1.7. All participants in this study were age 70 or older, and the efficacy was estimated to increase from 60% at 28 days after vaccination to 73% at 35 days after vaccination, although the standard error also increased over this time. Therefore, preliminary results from several study populations suggest that this vaccine confers a high level of protection against SARS-CoV-2.

9.2.2 Sputnik V (Gam-COVID-Vac and Gam-COVID-Vac-Lyo)

The vaccine Gam-COVID-Vac, nicknamed Sputnik V in reference to the space race and “V for vaccine”, was developed by the Gamaleya National Center of Epidemiology and Microbiology in Moscow. The development of Sputnik V was financed by the Russian Direct Investment Fund (RDIF) (1577, 1678). The Sputnik V vaccines are available in both a lyophilized (Gam-COVID-Vac-Lyo) and frozen form (Gam-COVID-Vac), which are stored at 2-8°C and -18°C respectively (1582). The lyophilized vaccine is convenient for distribution and storage, particularly to remote or disadvantaged areas (1679).

In the phase I/II trial conducted between late June and early August 2020, 76 participants (18-60 years old) were enrolled non-randomly at two hospitals in Russia and split into two groups of 38 participants. In phase I, 9 patients received rAd26-S and 9 patients received rAd5-S to assess safety over 28 days. In phase II, which began at least 5 days after the completion of phase I, 20 patients received a prime-boost vaccination of rAd26-S on day 0 and rAd5-S on day 21, administered intramuscularly. The phase I/II trial reported that both vaccines were deemed safe and well tolerated. The most common adverse events reported were mild, such as pain at the injection site (58%), hyperthermia (50%), headaches (42%), fatigue (28%), and joint and muscle pain (24%). Seroconversion was observed in all participants three weeks after the second vaccination (day 42), and all participants produced antibodies to the SARS-CoV-2 glycoprotein. RBD-specific IgG titers were high for both the frozen and lyophilized versions of the vaccine (14,703 and 11,143 respectively), indicating a sufficient immune response to both. Three weeks after the second vaccination, the virus-neutralizing geometric mean antibody titers were 49.25 and 45.95 for the frozen and lyophilized vaccines, respectively. At 28 days, median cell proliferation of 1.3% CD4+ and 1.1% CD8+ was reported for the lyophilized vaccine and 2.5% CD4+ and 1.3% CD8+ for the vaccine stored frozen. These results indicated that both forms of Sputnik V appeared to be safe and induced a humoral and cellular response in human subjects (1582), which may be robust enough to persist and not wane rapidly (1558).
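Because many of the immunogenicity results in this appendix are summarized as geometric mean titers (GMTs), the short sketch below illustrates how a GMT is computed from individual reciprocal titers. The values used are hypothetical and are not drawn from the Sputnik V trial or any other study discussed here.

import math

# Minimal sketch: a geometric mean titer is the exponential of the mean log titer
# (equivalently, the n-th root of the product of the titers).
def geometric_mean_titer(titers):
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

hypothetical_titers = [20, 40, 40, 80, 160]  # hypothetical reciprocal dilutions
print(round(geometric_mean_titer(hypothetical_titers), 1))  # prints 52.8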

In February 2021, the interim results of the phase III randomized, double-blind, placebo-controlled trial were published in The Lancet (1573). The participants were randomly assigned to receive either a 0.5 mL/dose of vaccine or placebo, which consisted of the vaccine buffer composition, delivered intramuscularly using the same prime-boost regimen as in the phase I/II trials. From September 7 to November 24, 19,866 participants completed the trial. Of the 14,964 participants who received the vaccine, 16 (0.1%) were confirmed to have COVID-19, whereas 62 of the 4,902 participants (1.3%) in the placebo group were confirmed to have COVID-19. Of these participants, no moderate or severe cases of COVID-19 were reported in the vaccine group, compared with 20 in the placebo group. However, only symptomatic individuals were tested for SARS-CoV-2 infection in this trial. Therefore, asymptomatic infections were not detected, potentially inflating the efficacy estimate. Overall, a vaccine efficacy of 91.6% (95% CI 85.6-95.2) was reported, with an efficacy of 91.8% reported for those over 60 years old and 92.7% for those who were 51-60 years old. An efficacy of 87.6% was already apparent 14 days after the first dose, suggesting that the immunity required to prevent disease develops within 18 days of vaccination.

Based on these results, scientists are investigating the potential for a single-dose regimen of the rAd26-S Sputnik V vaccine (1680). By the end of the trial, 7,485 participants reported adverse events, of which 94% were grade 1. Of the 68 participants who experienced serious adverse events during the trial, 45 from the vaccine group and 23 from the placebo group, none were reported to be associated with the vaccination. Likewise, 4 deaths occurred during the trial period that were not related to the vaccine (1573). The interim findings of the phase III trial indicate that the Sputnik V vaccine regimen appears to be safe, with 91.6% efficacy. Gamaleya had intended to reach a total of 40,000 participants for the completion of their phase III trial. However, the trial has stopped enrolling participants and the numbers have been cut to 31,000 because many individuals in the placebo group dropped out of the study to obtain the vaccine (1681). Other trials involving Sputnik V are currently underway in Belarus, India, the United Arab Emirates, and Venezuela (1682).

Preliminary results from a trial of Argentinian healthcare workers in Buenos Aires who were vaccinated with the Sputnik V rAd26-S vector-based vaccine seem to support the short-term safety of the first vaccination (1683). Of the 707 vaccinated healthcare workers, 96.6% responded to the survey, and 71.3% of respondents reported at least one adverse event attributed to the vaccine. Of these individuals, 68% experienced joint and muscle pain, 54% had injection site pain, 11% reported redness and swelling, 40% had a fever, and 5% reported diarrhea. Only 5% of the vaccinated participants experienced serious adverse events that required medical attention, of which one was monitored as an inpatient.

Additionally, an independent assessment of Sputnik V in a phase II clinical trial in India found the vaccine to be effective, but the data are not yet publicly available (1684). On December 21, 2020, Gamaleya, AstraZeneca, R-Pharm, and the Russian Direct Investment Fund agreed to assess the safety and immunogenicity of the combined use of components of the AstraZeneca and University of Oxford AZD1222 (ChAdOx1) vaccine and the rAd26-S component of the Sputnik V vaccine in clinical trials (1685). This agreement aims to establish scientific and business relations between the entities with the goal of co-developing a vaccine providing long-term immunization. The trial, which will begin enrollment soon, will include 100 participants in a phase II open-label study and is expected to be complete within 6 months. Participants will first receive an intramuscular dose of AZD1222 on day 1, followed by a dose of rAd26-S on day 29, and will be monitored from day 1 for 180 days in total. The primary outcomes measured will include the incidence of serious adverse events from the first dose until the end of the study. Secondary outcome measures will include the incidence of local and systemic adverse events 7 days after each dose, a time course of antibody responses to the spike protein, and the presence of anti-SARS-CoV-2 neutralizing antibodies (1686).

Overall, there is hesitancy surrounding the management of the Sputnik V vaccine approval process and concern that the efficacy data may be inflated due to a lack of asymptomatic testing within the trial. However, the interim results of the phase III study were promising, and further trials are underway, which will likely shed light on the overall efficacy and safety of the Sputnik V vaccine regimen. There may be some advantages to the Sputnik V approach, including the favorable storage conditions afforded by the choice between a frozen and a lyophilized vaccine. Furthermore, the producers of Gam-COVID-Vac state that they can produce the vaccine at a cost of less than $10 per dose or less than $20 per patient (1687).

9.3 Janssen’s JNJ-78436735

The development of the Johnson & Johnson (J&J) vaccine by Janssen Pharmaceuticals, Inc., a subsidiary of J&J, was conducted in collaboration with and funded by “Operation Warp Speed” (1485, 1560–1562). The vaccine was developed using Janssen’s AdVac® and PER.C6 platforms that were previously utilized to develop the European Commission-approved Ebola vaccine (Ad26.ZEBOV and MVA-BN-Filo) and their Zika, respiratory syncytial virus, and human immunodeficiency virus investigational vaccine candidates (1688).

The development of a single-dose vaccine was desired by J&J from the outset, with global deployment being a key priority (1563). Thanks to their AdVac® technology, the vaccine can remain stable for up to two years between -15 and -25℃ and for at least three months at 2 to 8℃ (1688). This allows the vaccine to be distributed easily without the requirement for very low temperature storage, unlike many of the other COVID-19 vaccine candidates. J&J screened numerous potential vaccine candidates in vitro and in animal models using different designs of the S protein, heterologous signal peptides, and prefusion-stabilizing substitutions (1485). A select few candidates were further investigated as a single-dose regimen in Syrian golden hamsters, a single-dose regimen in rhesus macaques, and a single- and two-dose regimen in both adult and aged rhesus macaques (1485, 1563–1565). A SARS-CoV-2 challenge study in rhesus macaques showed that vaccine doses as low as 2 × 10⁹ viral particles/mL were sufficient to induce strong protection in bronchoalveolar lavage, but that doses higher than 1.125 × 10¹⁰ were required to achieve close to complete protection in nasal swabs (1689). Indeed, six months post-immunization, levels of S-binding and neutralizing antibodies in rhesus macaques indicated that the JNJ-78436735 vaccine conferred durable protection against SARS-CoV-2 (1566).

Following selection of the JNJ-78436735 vaccine, J&J began phase I/IIa trials. The interim phase I/IIa data were placed on the medRxiv preprint server on September 25th, 2020 (1690) and were later published in the New England Journal of Medicine on January 13th, 2021 (1562). The phase I/IIa multi-center, randomized, placebo-controlled trial enrolled 402 healthy participants between 18-55 years old and a further 403 healthy older participants ≥ 65 years old (1562). Patients were administered either a placebo, a low dose (5 × 10¹⁰ viral particles per mL), or a high dose (1 × 10¹¹ viral particles per mL) intramuscularly as part of either a single- or two-dose regimen. All patients received injections 56 days apart, with participants in the single-dose condition receiving a placebo dose at their second vaccination visit. The primary endpoints of the trial were the safety and reactogenicity of each dose. Fatigue, headache, myalgia, and pain at the injection site were the most frequent solicited adverse events reported by participants. The most frequent systemic adverse effect was fever, although it was less common overall, particularly among the elderly cohort and those on the low-dose regimen. Overall, immunization was well tolerated, particularly at the lower dose concentration. In terms of immunogenicity, over 90% of those who received either the low or high dose demonstrated seroconversion in a neutralization assay using wild-type SARS-CoV-2 by 29 days after immunization (1562). Neutralizing antibody geometric mean titers (GMTs) between 224 and 354 were detected regardless of age. By day 57, 100% of the 18-55 year old participants had neutralizing antibodies (GMTs 288-488), which remained stable until day 71. In the ≥ 65 years old cohort, the incidence of seroconversion for the low and high doses was 96% and 88%, respectively, by day 29.

GMTs for the low and high doses were slightly lower for participants ≥ 65 years old (196 and 127 respectively), potentially indicating slightly lower immunogenicity. Seroconversion for S antibodies was detected in 99% of individuals between 18-55 years old for the low and high doses (GMTs 528 and 695 respectively), with similar findings reported for the ≥ 65 years old cohort. Indeed, both dose concentrations also induced robust Th1 cytokine-producing S-specific CD4+ T cell and CD8+ T cell responses in both age groups. The findings of the phase I/IIa study supported further investigation of a single immunization using the low-dose vaccine. Therefore, 25 patients were enrolled in a second randomized, double-blind, placebo-controlled phase I clinical trial currently being conducted in Boston, Massachusetts over 2 years (1691). Participants received either a single dose followed by a placebo, or two doses of either the low-dose (5 × 10¹⁰ viral particles/mL) or high-dose (1 × 10¹¹ viral particles/mL) vaccine administered intramuscularly on days 1 and 57. Placebo-only recipients received a placebo dose on days 1 and 57. Interim analyses conducted on day 71 indicated that binding and neutralizing antibodies developed 8 days after administration in 90% and 25% of vaccine recipients, respectively. Binding and neutralizing antibodies were detected in 100% of vaccine recipients by day 57 after a single-dose immunization. Spike-specific antibodies were highly prevalent (GMT 2432 to 5729), as were neutralizing antibodies (GMT 242 to 449), in the vaccinated groups. Indeed, CD4+ and CD8+ T-cell responses were also induced, which may provide additional protection, particularly if antibody levels wane or respond poorly to infection (1692).

On September 23rd, 2020, J&J launched its phase III trial ENSEMBLE and released the study protocol to the public (1688, 1693). The trial intended to enroll 60,000 volunteers to assess the safety and efficacy of a single vaccine dose versus placebo, with primary endpoints assessed at 14 and 28 days post-immunization (1688). The trial was conducted in Argentina, Brazil, Chile, Colombia, Mexico, Peru, South Africa, and the U.S. The trial was paused briefly in October 2020 to investigate a “serious medical event”, but resumed shortly after (1694).

An interim analysis was reported via press release on January 29th, 2021 (1575, 1576). The interim data included 43,783 participants who accrued 468 symptomatic cases of COVID-19. It was reported that the JNJ-78436735 vaccine was 66% effective across all regions studied for the prevention of moderate to severe COVID-19 28 days post-vaccination in those aged 18 years and older. Notably, JNJ-78436735 was 85% effective for the prevention of laboratory-confirmed severe COVID-19 and provided 100% protection against COVID-19-related hospitalization and death 28 days post-vaccination across all study sites. Efficacy of the vaccine against severe COVID-19 increased over time, and there were no cases of COVID-19 reported in immunized participants after day 49. An independent Data and Safety Monitoring Board also concluded that the vaccine candidate has a favorable safety profile. The vaccine was well tolerated, consistent with previous vaccines produced using the AdVac® platform. Fever occurred in 9% of vaccine recipients, with grade 3 fever occurring in only 0.2% of recipients. Serious adverse events were reportedly higher in the placebo group than the vaccine group, and no anaphylaxis was reported (1576).

At the time the phase III trial was being conducted, several concerning variants, including B.1.1.7 (482) and B.1.351 (231), were spreading across the globe. In particular, B.1.351 was first identified in South Africa, which was one of the JNJ-78436735 vaccine trial sites. Therefore, the J&J investigators also analyzed the efficacy of the JNJ-78436735 vaccine by trial site to determine whether the novel variants posed a risk of reduced efficacy. It was determined that JNJ-78436735 was 72% effective in the U.S., 66% effective in Latin America, and 57% effective in South Africa 28 days post-vaccination. These findings underscore the importance of monitoring for the emergence of novel SARS-CoV-2 variants and determining their effects on vaccine efficacy.

Looking forward, Janssen is also running a phase III randomized, double-blind, placebo-controlled clinical trial, Ensemble 2, which aims to assess the efficacy, safety, and immunogenicity of a two-dose regimen of JNJ-78436735 administered 57 days apart. This trial will enroll 30,000 participants ≥ 18 years old from Belgium, Colombia, France, Germany, the Philippines, South Africa, Spain, the U.K., and the U.S. (1695). This trial will also include participants with and without comorbidities associated with an increased risk of COVID-19.

9.4 RNA Vaccines

RNA vaccines are nucleic acid-based modalities that code for viral antigens against which the human body elicits a humoral and cellular immune response. The resulting intracellular viral proteins are displayed on surface MHC proteins, provoking a strong CD8+ T cell response as well as CD4+ T cell and B cell-associated antibody responses (1127). Given the potential for this technology to be quickly adapted for a new pathogen, it has held significant interest for the treatment of COVID-19. The results of the interim analyses of two mRNA vaccine candidates became available at the end of 2020 and provided strong support for this emerging approach to vaccination. Below we describe in detail the results available as of February 2021 for two such candidates, mRNA-1273 produced by ModernaTX and BNT162b2 produced by Pfizer, Inc. and BioNTech. As of August 2022, the U.S. FDA has issued approvals or emergency use authorizations of versions of these vaccines for adults and for children 6 months and older (1696).

9.4.1 ModernaTX mRNA Vaccine

ModernaTX’s mRNA-1273 vaccine was the first COVID-19 vaccine to enter a phase I clinical trial in the United States. An initial report described the results of enrolling forty-five participants who were administered intramuscular injections of mRNA-1273 in their deltoid muscle on day 1 and day 29, with the goal of following patients for the next twelve months (1119). Healthy males and non-pregnant females aged 18-55 years were recruited for this study and divided into three groups receiving 25, 100, or 250 μg of mRNA-1273. IgG ELISA assays on patient serology samples were used to examine the immunogenicity of the vaccine (1622). Binding antibodies were observed at two weeks after the first dose at all concentrations. One week after the second dose was administered on day 29, neutralizing activity, assessed using a pseudotyped lentivirus reporter single-round-of-infection neutralization assay, reached a median level similar to that observed in convalescent plasma samples. Participants reported mild and moderate systemic adverse events after the day 1 injection, and one severe local event was observed in each of the two highest dose levels. The second injection led to severe systemic adverse events for three of the participants at the highest dose level, with one participant in the group being evaluated at an urgent care center on the day after the second dose. The reported localized adverse events from the second dose were similar to those from the first.

Several months later, a press release from ModernaTX described the results of the first interim analysis of the vaccine (1697). On November 16, 2020, a report was released describing the initial results from phase III testing, corresponding to the first 95 cases of COVID-19 in the 30,000 enrolled participants (1697), with additional data released to the FDA on December 17, 2020 (1698). These results were subsequently published in a peer-reviewed journal (The New England Journal of Medicine) on December 30, 2020 (1627). The first group of 30,420 study participants was randomized to receive the vaccine or a placebo at a ratio of 1:1 (1627). Administration occurred at 99 sites within the United States in two sessions, spaced 28 days apart (1627, 1699). Patients reporting COVID-19 symptoms upon follow-up were tested for SARS-CoV-2 using a nasopharyngeal swab that was evaluated with RT-PCR (1699). The initial preliminary analysis reported the results of the cases observed up until a cut-off date of November 11, 2020. Of these first 95 cases reported, 90 occurred in participants receiving the placebo compared to 5 cases in the group receiving the vaccine (1697). These results suggested the vaccine is 94.5% effective in preventing COVID-19. Additionally, eleven severe cases of COVID-19 were observed, and all eleven occurred in participants receiving the placebo. The publication reported the results through an extended cut-off date of November 21, 2020, corresponding to 196 cases (1627). Of these, 11 occurred in the vaccine group and 185 in the placebo group, corresponding to an efficacy of 94.1%. Once again, all of the severe cases of COVID-19 observed (n=30) occurred in the placebo group, including one death. Thus, as more cases are reported, the efficacy of the vaccine has remained above 90%, and no cases of severe COVID-19 have yet been reported in participants receiving the vaccine.

These findings suggest the possibility that the vaccine might bolster immune defenses even for subjects who do still develop a SARS-CoV-2 infection. The study was designed with an explicit goal of including individuals at high risk for COVID-19, including older adults, people with underlying health conditions, and people of color (1700). Adults over age 65 comprised approximately 25.3% of the phase III trial population in the initial report and 24.8% in the publication (1699). Among the cases reported by both interim analyses, 16-17% occurred in older adults (1627, 1697). Additionally, approximately 10% of participants identified a Black or African-American background and 20% identified Hispanic or Latino ethnicity (1627, 1699). Among the first 95 cases, 12.6% occurred in participants identifying a Hispanic or Latino background and 4% in participants reporting a Black or African-American background (1697); in the publication, it was indicated only that 41 of the cases reported in the placebo group and 1 case in the treatment group occurred in “communities of color”, corresponding to 21.4% of all cases (1627). While the sample size in both analyses is small relative to the study population of over 30,000, these results suggest that the vaccine is likely to be effective in people from a variety of backgrounds.

In-depth safety data was released by ModernaTX as part of their application for an EUA from the FDA and summarized in the associated publication (1627, 1699). Because the detail provided in the report is greater than that provided in the publication, here we emphasize the results observed at the time of the first analysis. Overall, a large percentage of participants reported adverse effects when solicited, and these reports were higher in the vaccine group than in the placebo group (94.5% versus 59.5%, respectively, at the time of the initial analysis) (1699). Some of these events met the criteria for grade 3 (local or systemic) or grade 4 (systemic only) toxicity (1699), but most were grade 1 or grade 2 and lasted 2-3 days (1627). The most common local adverse reaction was pain at the injection site, reported by 83.7% of participants receiving the first dose of the vaccine and 88.4% upon receiving the second dose, compared to 19.8% and 17.0%, respectively, of patients in the placebo condition (1699). Fewer than 5% of vaccine recipients reported grade 3 pain at either administration. Other frequent local reactions included erythema, swelling, and lymphadenopathy (1699). For systemic adverse reactions, fatigue was the most common (1699). Among participants receiving either dose of the vaccine, 68.5% reported fatigue compared to 36.1% participants receiving the placebo (1699). The level of fatigue experienced was usually fairly mild, with only 9.6% and 1.3% of participants in the vaccine and placebo conditions, respectively, reporting grade 3 fatigue (1699), which corresponds to significant interference with daily activity (1701). Based on the results of the report, an EUA was issued on December 18, 2020 to allow distribution of this vaccine in the United States (1629), and it was shortly followed by an Interim Order authorizing distribution of the vaccine in Canada (1702) and a conditional marketing authorization by the European Medicines Agency to facilitate distribution in the European Union (1703).

9.4.2 Pfizer/BioNTech BNT162b2

ModernaTX was, in fact, the second company to release news of a successful interim analysis of an mRNA vaccine and receive an EUA. The first report came from Pfizer and BioNTech’s mRNA vaccine BNT162b2 on November 9, 2020 (1704), and a preliminary report was published in the New England Journal of Medicine one month later (516). This vaccine candidate should not be confused with a similar candidate from Pfizer/BioNTech, BNT162b1, which delivered only the RBD of the spike protein (1705, 1706) and was not advanced to a phase III trial because BNT162b2 showed an improved reactogenicity/immunogenicity profile (517).

During the phase III trial of BNT162b2, 43,538 participants were enrolled and randomized 1:1 to receive either two 30-μg doses of the vaccine candidate or a placebo, administered 21 days apart (516). Of these enrolled participants, 21,720 received BNT162b2 and 21,728 received a placebo (516). Recruitment occurred at 135 sites across six countries: Argentina, Brazil, Germany, South Africa, Turkey, and the United States. An initial press release described the first 94 cases, which were consistent with 90% efficacy of the vaccine at 7 days following the second dose (1704). The release of the full trial information covered a longer period and analyzed the first 170 cases occurring at least 7 days after the second dose, 8 of which occurred in patients who had received BNT162b2. The press release characterized the study population as diverse, reporting that 42% of the participants worldwide came from non-white backgrounds, including 10% Black and 26% Hispanic or Latino (1707). Within the United States, 10% and 13% of participants, respectively, identified themselves as having Black or Hispanic/Latino backgrounds (1707). Additionally, 41% of participants worldwide were 56 years of age or older (1707), and the efficacy of the vaccine in adults over 65 was reported to be 94% (1708). The primary efficacy analysis of the phase III study was concluded on November 18, 2020 (1708), and the final results indicated 94.6% efficacy of the vaccine (516).

The safety profile of the vaccine was also assessed (516). A subset of patients was followed for reactogenicity using electronic diaries, with the data collected from these 8,183 participants comprising the solicited safety events analyzed. Much like those who received the ModernaTX vaccine candidate, a large proportion of participants reported experiencing injection site pain within 7 days of vaccination. While percentages are broken down by age group in the publication, these proportions correspond to approximately 78% and 73% of all participants after the first and second doses, respectively. Only a small percentage of these events (less than 1%) were rated as serious, with the rest being mild or moderate, and none reached grade 4. Some participants also reported redness or swelling, and the publication indicates that in most cases, such events resolved within 1 to 2 days. Participants also experienced systemic effects, including fever (in most cases lower than 38.9°C and more common after dose 2), fatigue (25-50% of participants depending on age group and dose), headache (25-50% of participants depending on age group and dose), chills, and muscle or joint pain; more rarely, patients could experience gastrointestinal effects such as vomiting or diarrhea. As with the local events, these events were almost always grade 1 or 2. While some events were reported by the placebo groups, these events were much rarer than in the treatment group even though compliance was similar. Based on the efficacy and safety information released, the vaccine was approved in early December by the United Kingdom’s Medicines and Healthcare Products Regulatory Agency, with administration outside of a clinical trial beginning on December 8, 2020 (1709, 1710). On December 11, 2020, the United States FDA authorized this vaccine for emergency use (1628), and in August 2021, it received full approval for ages 16 and older (1711).

9.4.3 Neutralization of VOC

Prior to studies examining the effectiveness of vaccines in real-world settings (summarized in (6)), several studies reported reduced efficacy of the mRNA vaccines based on the measurement of antibody titers. Plasma from individuals double-dosed with Pfizer/BioNTech’s BNT162b2 vaccine had up to a 16-fold reduction in neutralizing capacity against the Omicron variant (1712) and a reduced efficacy (70%) (1632). Estimates for the mRNA vaccines range from a 2-fold to over a 20-fold drop in neutralization titers (1713), hence the push for third and fourth doses of mRNA vaccines in many Western countries. A third mRNA vaccine dose does increase antibody titers, but these levels also wane with time (1714). Notably, immunocompromised individuals such as cancer patients seem to elicit a sufficient protective immune response against the Omicron variant when they have been boosted with a third dose of either mRNA vaccine, albeit a blunted one (1715). While antibody titers do correlate with protection (1716–1720), they are not the only mechanism of immune protection. For example, T cell and non-neutralizing antibody responses may be unaffected or less affected by the new VOC, and they warrant further investigation.

9.5 Global Vaccine Status and Distribution

In North America, the majority of vaccines distributed until March 2021 were produced by Pfizer-BioNTech and Moderna. In Canada, the vaccine approval process is conducted by Health Canada, which uses a fast-tracked process whereby vaccine producers can submit data as it becomes available to allow for rapid review. An approval may be granted following reviews of the available phase III clinical data. This is followed by a period of pharmacovigilance in the population using their post-market surveillance system, which will monitor the long-term safety and efficacy of any vaccines (1721, 1722). Health Canada has authorized the use of the Pfizer (December 9th, 2020), Moderna (December 23rd, 2020), Oxford-AstraZeneca (February 26th, 2021), and the Janssen (March 5th, 2021) vaccines, and the Novavax Inc vaccine is also under consideration (1723). While Canada initially projected that a vaccine would be available for all Canadian adults by the end of September 2021, it now predicts that this may be possible earlier as more vaccines have been approved and become available (1724).

In the U.S., vaccines are required to have demonstrated safety and efficacy in phase III trials before manufacturers apply for an emergency use authorization (EUA) from the FDA. If an EUA is granted, an additional evaluation of the safety and efficacy of the vaccines is conducted by the CDC’s Advisory Committee on Immunization Practices (ACIP), which also provides guidance on vaccine prioritization. On December 1st, 2020, ACIP provided an interim phase 1a recommendation that healthcare workers and long-term care facility residents should be the first to be offered any approved vaccine (1725). This was shortly followed by an EUA on December 11th, 2020 for the use of the Pfizer-BioNTech COVID vaccine (1726), which was distributed and administered to the first healthcare workers on December 14th, 2020 (1727). Shortly thereafter, an EUA for the Moderna vaccine was issued on December 18th, 2020 (1728). On December 20th, 2020, ACIP updated their initial recommendations to suggest that vaccinations should be offered to people aged 75 years and older and to non-healthcare frontline workers in phase 1b (1729). On the same date, it was recommended that phase 1c should include people aged 65-74 years old, individuals aged 16-74 years old at high risk due to health conditions, and essential workers ineligible in phase 1b (1729). On the following day, December 21st, 2020, the first Moderna vaccines used outside of clinical trials were administered to American healthcare workers, which was the same day that President-elect Biden and Dr. Biden received their first doses of the Pfizer-BioNTech vaccine live on television to instill confidence in the approval and vaccination process (1730).

On February 27th, 2021, the FDA issued an EUA for the Janssen COVID-19 Vaccine (1731). This was followed by an update on recommendations by ACIP for the use of the Janssen COVID-19 vaccine for those over 18 years old (1732). The Janssen vaccine was first distributed to healthcare facilities on March 1st, 2021. On March 12, 2021, the WHO added the Janssen vaccine to the list of safe and effective emergency tools for COVID-19 (1733). While the CDC’s ACIP can provide recommendations, it is up to the public health authorities of each state, territory, and tribe to interpret the guidance and determine who will be vaccinated first (1734). Prior to distribution of the Janssen vaccine, over 103 million doses of the Moderna and Pfizer-BioNTech vaccines were delivered across the U.S., with almost 79 million doses administered. By February 28th, 2021, 15.6% of the total population had received at least one dose and 7.9% had received a second dose of either the Moderna (~38.3 million doses) or the Pfizer-BioNTech (~40.2 million doses) vaccines (1735). President Biden’s administration has predicted that by the end of May 2021 there may be enough vaccine supply available for all adults in the U.S. (1736, 1737). However, vaccine production, approval, and distribution were not straightforward in the U.S., as information was initially sparse and the rollout of vaccines was complicated by poor planning and leadership due to political activities prior to the change of administration in January 2021 (1738). These political complications highlight the importance of the transparent vaccine approval process conducted by the FDA (1461).

Outside the U.S., the Moderna and Pfizer-BioNTech vaccines have been administered in 29 and 69 other countries, respectively, mainly in Europe and North America (1364). The Janssen vaccine has so far only been administered in South Africa and the U.S. (1364, 1739), but it has also been approved in Bahrain, the European Union (E.U.), Iceland, Liechtenstein, and Norway (1186). On March 11th, 2021, Johnson & Johnson received approval from the European Medicines Agency (EMA) for conditional marketing authorization of their vaccine (1740). Notably, on March 2nd, 2021, rivals Johnson & Johnson and Merck announced that they entered an agreement to increase production of the Janssen vaccine to meet global demand (1741).

The U.K. was the first country to approve use of the Pfizer-BioNTech vaccine on December 2nd, 2020 (1742), and it was later approved by EMA on December 21st, 2020 (1743). The U.K. was also the first to administer the Pfizer-BioNTech vaccine, making it the first COVID-19 vaccine supported by phase III data to be administered outside of clinical trials on December 8th, 2020. The Oxford-AstraZeneca vaccine was approved by the Medicines and Healthcare Products Regulatory Agency (MHRA) in the U.K. and by EMA in the E.U. on December 30th, 2020 (1744) and January 29th, 2021 (1602), respectively. The Oxford-AstraZeneca vaccine was first administered in the U.K. on January 4th, 2021 (1745), and it is now being used in 53 countries in total, including Brazil, India, Pakistan, and Mexico, and spanning most of Europe (1364). The Moderna vaccine was authorized for use in the E.U. by EMA on January 6th, 2021 (1746) and in the U.K. by MHRA on January 8th, 2021 (1747). As of March 5th, 2021, 22 million people in the U.K. had received at least one vaccine dose (1365).

While the Pfizer-BioNTech vaccine was the first to be distributed following phase III clinical trials, the first COVID-19 vaccine to be widely administered to people prior to the completion of phase III clinical trials was Sputnik V. Sputnik V was administered to as many as 1.5 million Russians by early January (1569) due to the establishment of mass vaccination clinics in December 2020, prior to which only approximately 100,000 Russians had been vaccinated (1748, 1749). Doses of Sputnik V have also been distributed to other parts of Europe (1570–1572). Hungary was the first E.U. member country to approve and distribute Sputnik V outside of Russia (1750), despite the EMA stating that they had neither approved nor received a request for approval of Sputnik V (1751). Hungary is also in talks with China to procure the Sinopharm vaccines, which have been approved by Hungarian health authorities but also have not received approval by EMA in the E.U. (1750). In Latin America, production facilities in both Brazil and Argentina will allow for increased production capacity of Sputnik V, and doses have been distributed to Mexico, Argentina, Bolivia, Nicaragua, Paraguay, and Venezuela (1752). Guinea was the first African nation to administer Sputnik V in December 2020, and the Central African Republic, Zimbabwe, and the Ivory Coast have all registered their interest in purchasing doses of the vaccine (1752). In the Middle East, Iran has received its first doses of Sputnik V and the United Arab Emirates is conducting phase III trials (1752). In Asia, while China’s vaccine candidates are favored, the Philippines, Nepal, and Uzbekistan have sought Sputnik V doses (1753). In total, the RDIF claims to have received orders totaling 1.2 billion doses from over 50 countries worldwide (1753), and at least 18 countries are currently administering Sputnik V around the globe (1364). Sputnik V has been an attractive vaccine for many countries due to its relatively low price, high efficacy, and favorable storage conditions. For some countries, Russia and China have also been more palatable politically than vaccine suppliers in the West (1752, 1754). For others, the delays in the distribution of the other, more-favored candidates have been a motivating factor for pursuing the Sputnik V and Chinese alternatives (1571, 1754). Additionally, Germany has stated that if Sputnik V were approved by EMA, it would be considered by the E.U. (1755). Russia is developing other vaccine candidates and has approved a third vaccine, CoviVac, an inactivated vaccine produced by the Chumakov Centre in Moscow, despite the fact that clinical trials have yet to begin (1756).

10 Dietary Supplements and Nutraceuticals Under Investigation for COVID-19 Prevention and Treatment

10.1 Abstract

Coronavirus disease 2019 (COVID-19) has caused global disruption and a significant loss of life. Existing treatments that can be repurposed as prophylactic and therapeutic agents could reduce the pandemic’s devastation. Emerging evidence of potential applications in other therapeutic contexts has led to the investigation of dietary supplements and nutraceuticals for COVID-19. Such products include vitamin C, vitamin D, omega 3 polyunsaturated fatty acids, probiotics, and zinc, all of which are currently under clinical investigation. In this review, we critically appraise the evidence surrounding dietary supplements and nutraceuticals for the prophylaxis and treatment of COVID-19. Overall, further study is required before evidence-based recommendations can be formulated, but nutritional status plays a significant role in patient outcomes, and these products could help alleviate deficiencies. For example, evidence indicates that vitamin D deficiency may be associated with greater incidence of infection and severity of COVID-19, suggesting that vitamin D supplementation may hold prophylactic or therapeutic value. A growing number of scientific organizations are now considering recommending vitamin D supplementation to those at high risk of COVID-19. Because research in vitamin D and other nutraceuticals and supplements is preliminary, here we evaluate the extent to which these nutraceutical and dietary supplements hold potential in the COVID-19 crisis.

10.2 Importance

Sales of dietary supplements and nutraceuticals have increased during the pandemic due to their perceived “immune-boosting” effects. However, little is known about the efficacy of these dietary supplements and nutraceuticals against the novel coronavirus (SARS-CoV-2) or the disease it causes, COVID-19. This review provides a critical overview of the potential prophylactic and therapeutic value of various dietary supplements and nutraceuticals from the evidence available to date. These include vitamin C, vitamin D, and zinc, which are often perceived by the public as treating respiratory infections or supporting immune health. Consumers need to be aware of misinformation and false promises surrounding some supplements, which may be subject to limited regulation by authorities. However, considerably more research is required to determine whether dietary supplements and nutraceuticals exhibit prophylactic and therapeutic value against SARS-CoV-2 infection and COVID-19. This review provides perspective on which nutraceuticals and supplements are involved in biological processes that are relevant to recovery from or prevention of COVID-19.

10.3 Introduction

The year 2020 saw scientists and the medical community scrambling to repurpose or discover novel host-directed therapies against the coronavirus disease 2019 (COVID-19) pandemic caused by the spread of the novel Severe acute respiratory syndrome-related coronavirus 2 (SARS-CoV-2). This rapid effort led to the identification of some promising pharmaceutical therapies for hospitalized patients, such as remdesivir and dexamethasone. Furthermore, most societies have adopted non-pharmacological preventative measures such as utilizing public health strategies that reduce the transmission of SARS-CoV-2. However, during this time, many individuals sought additional protections via the consumption of various dietary supplements and nutraceuticals that they believed to confer beneficial effects. While a patient’s nutritional status does seem to play a role in COVID-19 susceptibility and outcomes (1757–1761), the beginning of the pandemic saw sales of vitamins and other supplements soar despite a lack of any evidence supporting their use against COVID-19. In the United States, for example, dietary supplement and nutraceutical sales have shown modest annual growth in recent years (approximately 5%, or a $345 million increase in 2019), but during the six-week period preceding April 5, 2020, they increased by 44% ($435 million) relative to the same period in 2019 (1762). While growth subsequently leveled off, sales continued to boom, with a further 16% ($151 million) increase during the six weeks preceding May 17, 2020 relative to 2019 (1762). In France, New Zealand, India, and China, similar trends in sales were reported (1763–1766). The increase in sales was driven by a consumer perception that dietary supplements and nutraceuticals would protect consumers from infection and/or mitigate the impact of infection due to the various “immune-boosting” claims of these products (1767, 1768).

Due to the significant interest from the general public in dietary additives, whether and to what extent nutraceuticals or dietary supplements can provide any prophylactic or therapeutic benefit remains a topic of interest for the scientific community. Nutraceuticals and dietary supplements are related but distinct non-pharmaceutical products. Nutraceuticals are classified as supplements with health benefits beyond their basic nutritional value (1769, 1770). The key difference between a dietary supplement and a nutraceutical is that nutraceuticals should not only supplement the diet, but also aid in the prophylaxis and/or treatment of a disorder or disease (1771). However, dietary supplements and nutraceuticals, unlike pharmaceuticals, are not subject to the same regulatory protocols that protect consumers of medicines. Indeed, nutraceuticals do not entirely fall under the responsibility of the Food and Drug Administration (FDA), but they are monitored as dietary supplements according to the Dietary Supplement Health and Education Act of 1994 (DSHEA) (1772) and the Food and Drug Administration Modernization Act of 1997 (FDAMA) (1773). Due to increases in sales of dietary supplements and nutraceuticals, in 1996 the FDA established the Office of Dietary Supplement Programs (ODSP) to increase surveillance. Manufacturers of novel products or nutraceuticals must now submit a new dietary ingredient notification to the ODSP for review. There are significant concerns that this legislation does not adequately protect the consumer, as it ascribes responsibility to the manufacturers to ensure the safety of the product before manufacturing or marketing (1774). Manufacturers are not required to register or even seek approval from the FDA to produce or sell food supplements or nutraceuticals. Health or nutrient content claims for labeling purposes are approved based on an authoritative statement from the Academy of Sciences or relevant federal authorities once the FDA has been notified and on the basis that the information is known to be true and not deceptive (1774). Therefore, there is often a gap between the American public's perceptions of a nutraceutical or dietary supplement and the actual clinical evidence surrounding its effects.

Despite differences in regulations, similar challenges exist outside of the United States. In Europe, the safety of supplements is monitored by the European Union (EU) under Directive 2002/46/EC (1775); however, nutraceuticals are not directly mentioned. Consequently, nutraceuticals can be generally described as either a medicinal product under Directive 2004/27/EC (1776) or as a ‘foodstuff’ under Directive 2002/46/EC of the European Council. In order to harmonize the various existing pieces of legislation, Regulation EC 1924/2006 on nutrition and health claims was put into effect to assure customers of the safety and efficacy of products and to deliver understandable information to consumers. However, specific legislation for nutraceuticals is still elusive. Health claims are permitted on a product label only following compliance and authorization according to the European Food Safety Authority (EFSA) guidelines on nutrition and health claims (1777). EFSA does not currently distinguish between food supplements and nutraceuticals for health claim applications of new products, as claim authorization is dependent on the availability of clinical data in order to substantiate efficacy (1778). These guidelines seem to provide more protection to consumers than the FDA regulations, but potentially at the cost of innovation in the sector (1779). The situation becomes even more complicated when comparing regulations at a global level, as countries such as China and India have existing regulatory frameworks for traditional medicines and phytomedicines not commonly consumed in Western society (1780). Currently, there is debate among scientists and regulatory authorities surrounding the development of a widespread regulatory framework to deal with the challenges of safety and health claim substantiation for nutraceuticals (1774, 1778), as these products do not necessarily follow the same rigorous clinical trial frameworks used to approve the use of pharmaceuticals. Such regulatory disparities have been highlighted by the pandemic, as many individuals and companies have attempted to profit from the vulnerabilities of others by overstating claims in relation to the treatment of COVID-19 using supplements and nutraceuticals. The FDA has written several letters to prevent companies from marketing or selling products based on false hyperbolic promises about preventing SARS-CoV-2 infection or treating COVID-19 (1781–1783). These letters came in response to efforts to market nutraceutical prophylactics against COVID-19, some of which charged the consumer as much as $23,000 (1784). There have even been some incidents highlighted in the media because of their potentially life-threatening consequences; for example, the use of oleandrin was touted as a potential “cure” by individuals close to the former President of the United States despite its high toxicity (1785). Thus, heterogeneous and at times relaxed regulatory standards have permitted high-profile cases of the sale of nutraceuticals and dietary supplements that are purported to provide protection against COVID-19, despite a lack of research into these compounds.

Notwithstanding concerns about safety, efficacy, and regulatory oversight, some dietary supplements and nutraceuticals have exhibited therapeutic and prophylactic potential. Some have been linked with reduced immunopathology, antiviral and anti-inflammatory activities, or even the prevention of acute respiratory distress syndrome (ARDS) (1767, 1786, 1787). A host of potential candidates have been highlighted in the literature that target various aspects of COVID-19 viral pathology, while others are thought to prime the host immune system. These candidates include vitamins and minerals along with extracts and omega-3 polyunsaturated fatty acids (n-3 PUFA) (1788). In vitro and in vivo studies suggest that nutraceuticals containing phycocyanobilin, N-acetylcysteine, glucosamine, selenium, or phase 2 inductive nutraceuticals (e.g., ferulic acid, lipoic acid, or sulforaphane) can prevent or modulate RNA virus infections via amplification of the signaling activity of mitochondrial antiviral-signaling protein (MAVS) and activation of Toll-like receptor 7 (1789). Phase 2 inductive molecules used in the production of nutraceuticals are known to activate nuclear factor erythroid 2–related factor 2 (Nrf2), a transcriptional regulator of the antioxidant response that induces several antioxidant enzymes, such as gamma-glutamylcysteine synthetase. While promising, further animal and human studies are required to assess the therapeutic potential of these various nutrients and nutraceuticals against COVID-19. For the purpose of this review, we have highlighted some of the main dietary supplements and nutraceuticals that are currently under investigation for their potential prophylactic and therapeutic applications. These include n-3 PUFA, zinc, vitamins C and D, and probiotics.

10.4 n-3 PUFA

One category of supplements that has been explored for beneficial effects against various viral infections is the n-3 PUFAs (1788), commonly referred to as omega-3 fatty acids, which include eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). EPA and DHA intake can come from a diet high in fish or through dietary supplementation with fish oils or purified oils (1790). Other, more sustainable sources of EPA and DHA include algae (1791, 1792), which can also be exploited for their rich abundance of other bioactive compounds such as angiotensin-converting enzyme inhibitor peptides and antiviral agents including phycobiliproteins, sulfated polysaccharides, and calcium-spirulan (1793). n-3 PUFAs have been investigated for many years for their therapeutic potential (1794). Supplementation with fish oils is generally well tolerated (1794), and intake of n-3 PUFAs through dietary sources or supplementation is specifically encouraged for vulnerable groups such as pregnant and lactating women (1795, 1796). As a result, these well-established compounds have drawn significant interest for their potential immune effects and therapeutic potential.

Particular interest has arisen in n-3 PUFAs as potential therapeutics against diseases associated with inflammation. n-3 PUFAs have been found to modulate inflammation by influencing processes such as leukocyte chemotaxis, adhesion molecule expression, and the production of eicosanoids (1797, 1798). This and other evidence indicates that n-3 PUFAs may have the capacity to modulate the adaptive immune response (1770, 1790, 1797); for example, they have been found to influence antigen presentation and the production of CD4(+) Th1 cells, among other relevant effects (1799). Indeed, preliminary evidence from banked blood samples from 100 COVID-19 patients suggests that patients with a higher omega-3 index, a measure of the amount of EPA and DHA in red blood cells, had a lower risk of death due to COVID-19 (1800). Interest has also arisen as to whether nutritional status related to n-3 PUFAs can affect inflammation associated with severe disease, such as ARDS or sepsis (1801, 1802). ARDS and sepsis hold particular concern in the treatment of severe COVID-19; an analysis of 82 deceased COVID-19 patients in Wuhan during January to February 2020 reported that respiratory failure (associated with ARDS) was the cause of death in 69.5% of cases, and that sepsis or multi-organ failure accounted for 28.0% of deaths (742). Research in ARDS prior to the current pandemic suggests that n-3 PUFAs may hold some therapeutic potential. One study randomized 16 consecutive ARDS patients to receive either a fish oil-enriched lipid emulsion or a control lipid emulsion (composed of 100% long-chain triglycerides) under a double-blinded design (1803). The authors reported a statistically significant reduction in leukotriene B4 levels in the group receiving the fish oil-enriched emulsion, suggesting that the fish oil supplementation may have reduced inflammation. However, they also reported that most of their tests were not statistically significant, and additional research using larger sample sizes is therefore required. A recent meta-analysis of 10 randomized controlled trials (RCTs) examining the effects of n-3 PUFAs on ARDS patients did not find evidence of any effect on mortality, although the effect on secondary outcomes could not be determined due to a low quality of evidence (1804). However, another meta-analysis that examined 24 RCTs studying the effects of n-3 fatty acids on sepsis, including ARDS-induced sepsis, did find support for an effect on mortality when n-3 fatty acids were administered via enteral nutrition, although a paucity of high-quality evidence again limited conclusions (1805). Therefore, despite theoretical support for an immunomodulatory effect of n-3 PUFAs in COVID-19, evidence from existing RCTs is insufficient to determine whether supplementation offers an advantage in a clinical setting that would be relevant to COVID-19.
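
For readers unfamiliar with the omega-3 index referenced above, the following minimal Python sketch illustrates how it is conventionally computed (EPA plus DHA expressed as a percentage of total red blood cell membrane fatty acids) and classified. The example measurements and the reference bands used for classification are commonly cited values assumed here purely for illustration; they are not data or thresholds taken from the cited study.

```python
def omega3_index(epa_pct, dha_pct):
    """Omega-3 index: EPA + DHA expressed as a percentage of total red
    blood cell membrane fatty acids (inputs are already percentages)."""
    return epa_pct + dha_pct

# Hypothetical measurements, for illustration only
index = omega3_index(epa_pct=1.2, dha_pct=3.9)

# Commonly cited reference bands (assumed here, not thresholds from the study above)
if index >= 8:
    category = "desirable"
elif index >= 4:
    category = "intermediate"
else:
    category = "low"

print(f"Omega-3 index: {index:.1f}% ({category})")
```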

Another potential mechanism that has led to interest in n-3 PUFAs as protective against viral infections including COVID-19 is their role as precursor molecules for the biosynthesis of endogenous specialized proresolving mediators (SPM), such as protectins and resolvins, that actively resolve inflammation and infection (1806). SPM have exhibited beneficial effects against a variety of lung infections, including some caused by RNA viruses (1807, 1808). Several mechanisms for SPM have been proposed, including preventing the release of pro-inflammatory cytokines and chemokines or increasing phagocytosis of cellular debris by macrophages (1809). In influenza, SPM promote antiviral B lymphocytic activities (1810), and protectin D1 has been shown to increase survival from H1N1 viral infection in mice by affecting the viral replication machinery (1811). It has thus been hypothesized that SPM could aid in the resolution of the cytokine storm and pulmonary inflammation associated with COVID-19 (1812, 1813). Another theory is that some comorbidities, such as obesity, could lead to deficiencies of SPM, which could in turn be related to the occurrence of adverse outcomes for COVID-19 (1814). However, not all studies agree that n-3 PUFAs or their resulting SPM are effective against infections (1815). At a minimum, the effectiveness of n-3 PUFAs against infections would depend on the dosage, the timing, and the specific pathogens responsible (1816). On another level, there is still the question of whether fish oils can raise SPM levels upon ingestion and in response to acute inflammation in humans (1817). Currently, Karolinska University Hospital is running a trial that will measure SPM levels as a secondary outcome following intravenous supplementation of n-3 PUFAs in hospitalized COVID-19 patients to determine whether n-3 PUFAs provide therapeutic value (1818, 1819). Therefore, while this mechanism provides theoretical support for a role for n-3 PUFAs against COVID-19, experimental support is still needed.

A third possible mechanism by which n-3 PUFAs could benefit COVID-19 patients arises from the fact that some COVID-19 patients, particularly those with comorbidities, are at a significant risk of thrombotic complications including arterial and venous thrombosis (105, 1820). Therefore, the use of prophylactic and therapeutic anticoagulants and antithrombotic agents is under consideration (1821, 1822). Considering that there is significant evidence that n-3 fatty acids and other fish oil-derived lipids possess antithrombotic and anti-inflammatory properties (1790, 1823, 1824), they may have therapeutic value against the prothrombotic complications of COVID-19. In particular, concerns have been raised within the medical community about using investigational therapeutics on COVID-19 patients who are already on antiplatelet therapies due to pre-existing comorbidities, because the introduction of such therapeutics could lead to issues with dosing and drug choice and/or negative drug-drug interactions (1821). In such cases, dietary sources of n-3 fatty acids or other nutraceuticals with antiplatelet activities could hold particular value for reducing the risk of thrombotic complications in patients already receiving pharmaceutical antiplatelet therapies. A new clinical trial (1825) is currently recruiting COVID-19 positive patients to investigate the anti-inflammatory activity of a recently developed, highly purified nutraceutical derivative of EPA known as icosapent ethyl (Vascepa™) (1826). Other randomized controlled trials that are in the preparatory stages intend to administer EPA and other bioactive compounds to COVID-19 positive patients in order to observe whether anti-inflammatory effects or disease state improvements occur (1827, 1828). Finally, while there have been studies investigating the therapeutic value of n-3 fatty acids against ARDS in humans, there is still limited evidence of their effectiveness (1829), and the overall lack of human studies in this area means there is limited evidence as to whether these supplements could affect COVID-19. Consequently, the clinical trials that are underway and those that have been proposed will provide valuable insight into whether the anti-inflammatory potential of n-3 PUFAs and their derivatives can be beneficial to the treatment of COVID-19. All the same, while the evidence needed to draw conclusions about whether n-3 PUFAs will be useful in treating COVID-19 is not yet available, there is likely little harm associated with a diet rich in fish oils, and interest in n-3 PUFA supplementation by the general public is unlikely to have negative effects.

10.5 Zinc

Zinc is a nutrient supplement that may exhibit some benefits against RNA viral infections. Zinc is a trace metal obtained from dietary sources or supplementation and is important for the maintenance of immune cells involved in adaptive and innate immunity (1830). Supplements can be administered orally as a tablet or as a lozenge and are available in many forms, such as zinc picolinate, zinc acetate, and zinc citrate. Zinc is also available from dietary sources including meat, seafood, nuts, seeds, legumes, and dairy. The role of zinc in immune function has been extensively reviewed (1830). Zinc is an important signaling molecule, and zinc levels can alter host defense systems. In inflammatory situations such as an infection, zinc can regulate leukocyte immune responses and modulate nuclear factor kappa-light-chain-enhancer of activated B cells (NF-κB), thus altering cytokine production (1831, 1832). In particular, zinc supplementation can increase levels of natural killer cells, which are important for host defense against viral infections (1830, 1833). As a result of these immune-related functions, zinc is also under consideration for possible benefits against COVID-19.

Adequate zinc intake has been associated with reduced incidence of infection (1834) and antiviral immunity (1835). A randomized, double-blind, placebo-controlled trial that administered zinc supplementation to elderly subjects over the course of a year found that zinc supplementation decreased susceptibility to infection and that zinc deficiency was associated with increased susceptibility to infection (1834). Clinical trial data support the utility of zinc to diminish the duration and severity of symptoms associated with the common cold when it is provided within 24 hours of the onset of symptoms (1836, 1837). An observational study showed that COVID-19 patients had significantly lower zinc levels in comparison to healthy controls and that zinc-deficient COVID-19 patients (those with levels less than 80 μg/dl) tended to have more complications (70.4% vs 30.0%, p = 0.009) and potentially prolonged hospital stays (7.9 vs 5.7 days, p = 0.048) relative to patients who were not zinc deficient (1838). In coronaviruses specifically, in vitro evidence has demonstrated that the combination of zinc (Zn2+) and a zinc ionophore (pyrithione) can interrupt the replication mechanisms of SARS-CoV-GFP (a fluorescently tagged SARS-CoV-1) and a variety of other RNA viruses (1839, 1840). Currently, there are over twenty clinical trials registered with the intention to use zinc in a preventative or therapeutic manner for COVID-19. However, many of these trials proposed the use of zinc in conjunction with hydroxychloroquine and azithromycin (1841-1844), and it is not known how the lack of evidence supporting the use of hydroxychloroquine will affect the investigation of zinc. One retrospective observational study of New York University Langone hospitals in New York compared outcomes among hospitalized COVID-19 patients administered hydroxychloroquine and azithromycin with zinc sulfate (n = 411) versus hydroxychloroquine and azithromycin alone (n = 521). Notably, zinc is the only treatment used in this study that remains under consideration as a therapeutic agent, given the lack of efficacy and the potential adverse events associated with hydroxychloroquine and azithromycin against COVID-19 (1845-1847). While the addition of zinc sulfate did not affect the duration of hospitalization, the length of ICU stays, or patient ventilation duration, univariate analyses indicated that zinc did increase the frequency of patients being discharged and decreased the need for ventilation, referral to the ICU, and mortality (1848). However, a smaller retrospective study at Hoboken University Medical Center in New Jersey failed to find an association between zinc supplementation and the survival of hospitalized patients (1849). Therefore, whether zinc contributes to COVID-19 recovery remains unclear. Other trials are now investigating zinc in conjunction with other supplements such as vitamin C or n-3 PUFA (1828, 1850). Though there is, overall, encouraging data for zinc supplementation against the common cold and viral infections, there is currently limited evidence to suggest that zinc supplementation has any beneficial effects against COVID-19; thus, the clinical trials that are currently underway will provide vital information on the efficacious use of zinc in COVID-19 prevention and/or treatment. However, given the limited risk and the potential association between zinc deficiency and illness, maintaining a healthy diet to ensure an adequate zinc status may be advisable for individuals seeking to reduce their likelihood of infection.
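
As a point of reference for the serum zinc cutoff used in the observational study above, the following minimal Python sketch converts the reported μg/dl values into μmol/L (the unit used by many laboratories) and classifies status against the 80 μg/dl threshold. The conversion relies on the standard atomic weight of zinc (~65.38 g/mol), which is general chemistry rather than a value reported by the cited study, and the example serum values are illustrative only.

```python
ZINC_MOLAR_MASS = 65.38  # g/mol, standard atomic weight of zinc

def ug_per_dl_to_umol_per_l(ug_per_dl):
    """Convert serum zinc from ug/dl (as reported above) to umol/L:
    ug/dl * 10 gives ug/L, and dividing by the molar mass gives umol/L."""
    return ug_per_dl * 10 / ZINC_MOLAR_MASS

def zinc_deficient(ug_per_dl, cutoff_ug_per_dl=80):
    """Classify deficiency using the 80 ug/dl cutoff applied in the study above."""
    return ug_per_dl < cutoff_ug_per_dl

for value in (60, 80, 100):  # illustrative serum zinc values, ug/dl
    print(f"{value} ug/dl = {ug_per_dl_to_umol_per_l(value):.1f} umol/L, "
          f"deficient: {zinc_deficient(value)}")
```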

10.6 Vitamin C

Vitamins B, C, D, and E have also been suggested as potential nutrient supplement interventions for COVID-19 (1788, 1851). In particular, vitamin C has been proposed as a potential therapeutic agent against COVID-19 due to its long history of use against the common cold and other respiratory infections (1852, 1853). Vitamin C can be obtained via dietary sources such as fruits and vegetables or via supplementation. Vitamin C plays a significant role in promoting immune function due to its effects on various immune cells. It affects inflammation by modulating cytokine production, decreasing histamine levels, enhancing the differentiation and proliferation of T- and B-lymphocytes, increasing antibody levels, and protecting against the negative effects of reactive oxygen species, among other effects related to COVID-19 pathology (1854-1856). Vitamin C is utilized by the body during viral infections, as evidenced by lower vitamin C concentrations in leukocytes and in urine; post-infection, these levels return to baseline ranges (1857-1861). It has been shown that as little as 0.1 g/d of vitamin C can maintain normal plasma levels of vitamin C in healthy individuals, but higher doses of at least 1-3 g/d are required for critically ill patients in ICUs (1862). Indeed, vitamin C deficiency appears to be common among COVID-19 patients (1863, 1864). COVID-19 is also associated with the formation of microthrombi and coagulopathy (107) that contribute to its characteristic lung pathology (1865), but these symptoms can be ameliorated by early infusions of vitamin C to inhibit endothelial surface P-selectin expression and platelet-endothelial adhesion (1866). Intravenous vitamin C also reduced D-dimer levels in a case study of 17 COVID-19 patients (1867). D-dimer levels are an important indicator of thrombus formation and breakdown and are notably elevated in COVID-19 patients (103, 104). There is therefore preliminary evidence suggesting that vitamin C status and vitamin C administration may be relevant to COVID-19 outcomes.

Larger-scale studies of vitamin C, however, have provided mixed results. A recent meta-analysis found consistent support for regular vitamin C supplementation reducing the duration of the common cold, but supplementation with vitamin C (> 200 mg) failed to reduce the incidence of colds (1868). Individual studies have found vitamin C to reduce the susceptibility of patients to lower respiratory tract infections, such as pneumonia (1869). Another meta-analysis demonstrated that, across twelve trials, vitamin C supplementation reduced the length of stay of patients in intensive care units (ICUs) by 7.8% (95% CI: 4.2% to 11.2%; p = 0.00003). Furthermore, high doses (1-3 g/day) significantly reduced the length of an ICU stay by 8.6% in six trials (p = 0.003). Vitamin C also shortened the duration of mechanical ventilation by 18.2% in three trials in which patients required intervention for over 24 hours (95% CI: 7.7% to 27%; p = 0.001) (1862). Despite these findings, an RCT of 167 patients known as CITRUS ALI failed to show a benefit of a 96-hour infusion of vitamin C for the treatment of ARDS (1870). Clinical trials specifically investigating vitamin C in the context of COVID-19 have now begun, as highlighted by Carr et al. (1853). These trials intend to investigate the use of intravenous vitamin C in hospitalized COVID-19 patients. The first trial to report initial results took place in Wuhan, China (1871). Initial results indicated that the administration of 12 g/12 hr of intravenous vitamin C for 7 days in 56 critically ill COVID-19 patients resulted in a promising reduction in 28-day mortality (p = 0.06) in univariate survival analysis (1872). Indeed, the same study reported a significant decrease in IL-6 levels by day 7 of vitamin C infusion (p = 0.04) (1873). Additional studies that are being conducted in Canada, China, Iran, and the USA will provide additional insight into whether vitamin C supplementation affects COVID-19 outcomes on a larger scale.
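
To make the relative reductions reported in the ICU meta-analysis above more concrete, the short Python sketch below converts them into absolute days under an assumed baseline ICU stay. The 10-day baseline is purely illustrative and is not a figure taken from the cited trials; only the percentages come from the meta-analysis described above.

```python
def absolute_reduction(baseline_days, relative_reduction_pct):
    """Convert a relative reduction in ICU length of stay into absolute days."""
    return baseline_days * relative_reduction_pct / 100

baseline = 10.0  # assumed baseline ICU stay in days (illustrative only)

# Point estimate and 95% CI bounds reported for the overall analysis above
for label, pct in [("point estimate", 7.8), ("lower 95% CI", 4.2), ("upper 95% CI", 11.2)]:
    print(f"{label}: {absolute_reduction(baseline, pct):.2f} fewer days in the ICU")
```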

Even though evidence supporting the use of vitamin C is beginning to emerge, we will not know how effective vitamin C is as a therapeutic for quite some time. Currently (as of January 2021), over fifteen trials registered with clinicaltrials.gov are either recruiting, active, or in preparation. When completed, these trials will provide crucial evidence on the efficacy of vitamin C as a therapeutic for COVID-19. However, the majority of supplementation studies investigate the intravenous infusion of vitamin C in severe patients, so there is a lack of studies investigating the potential prophylactic administration of vitamin C via oral supplementation for healthy individuals or potentially asymptomatic SARS-CoV-2 positive patients. Once again, vitamin C intake is part of a healthy diet and the vitamin likely presents minimal risk, but its potential prophylactic or therapeutic effects against COVID-19 are yet to be determined. To maintain vitamin C status, it would be prudent for individuals to consume the recommended dietary allowance of vitamin C to maintain a healthy immune system (1757). The recommended dietary allowance according to the FDA is 75-90 mg/d, whereas EFSA recommends 110 mg/d (1874).

10.7 Vitamin D

Of all of the supplements currently under investigation, vitamin D has become a leading prophylactic and therapeutic candidate against SARS-CoV-2. Vitamin D can modulate both the adaptive and innate immune systems and is associated with various aspects of immune health and antiviral defense (1875-1879). Vitamin D can be sourced through diet or supplementation, but it is mainly biosynthesized by the body upon exposure to ultraviolet B (UVB) light from sunlight. Vitamin D deficiency is associated with an increased susceptibility to infection (1880). In particular, vitamin D-deficient patients are at risk of developing acute respiratory infections and ARDS (1881). 1,25-dihydroxyvitamin D3 is the active form of vitamin D involved in adaptive and innate responses; however, due to its low concentration and short half-life of a few hours, vitamin D levels are typically measured via the longer-lasting and more abundant precursor 25-hydroxyvitamin D. The vitamin D receptor is expressed in various immune cells, and vitamin D is an immunomodulator of antigen-presenting cells, dendritic cells, macrophages, monocytes, and T- and B-lymphocytes (1880, 1882). Due to its potential immunomodulating properties, vitamin D supplementation may be advantageous for maintaining a healthy immune system.

Early in the pandemic, it was postulated that an individual’s vitamin D status could significantly affect their risk of developing COVID-19 (1883). This hypothesis was derived from the fact that the pandemic emerged in Wuhan, China, during winter, when 25-hydroxyvitamin D concentrations are at their lowest due to a lack of sunlight, whereas in the Southern Hemisphere, where summer was nearing its end and 25-hydroxyvitamin D concentrations would be higher, the number of cases was low. This led researchers to question whether there is a seasonal component to the SARS-CoV-2 pandemic and whether vitamin D levels might play a role (1883-1886). Though COVID-19 is often assumed to be seasonal, multiple other factors that can affect vitamin D levels should also be considered. These factors include an individual’s nutritional status, age, occupation, skin pigmentation, potential comorbidities, and variation in sunlight exposure due to latitude, among others. Indeed, it has been estimated that each degree of latitude north of 28 degrees corresponds to a 4.4% increase in COVID-19 mortality, indirectly linking a person’s vitamin D levels, via exposure to UVB light, to COVID-19 mortality (1884).
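
As a rough illustration of what the reported latitude gradient implies, the minimal Python sketch below computes relative mortality at a chosen latitude under two possible readings of the per-degree estimate (additive versus compounding). The choice between these readings, and the example latitude of 52 degrees north, are assumptions made here for illustration rather than details taken from the cited analysis.

```python
def relative_mortality(latitude_deg_north, pct_per_degree=4.4,
                       reference_latitude=28.0, compounding=False):
    """Relative COVID-19 mortality implied by the reported gradient of
    4.4% per degree of latitude north of 28 degrees, versus the reference
    latitude. Latitudes at or below the reference return 1.0."""
    degrees = max(latitude_deg_north - reference_latitude, 0.0)
    if compounding:
        return (1 + pct_per_degree / 100) ** degrees
    return 1 + degrees * pct_per_degree / 100

# Example: a location at 52 degrees north (chosen arbitrarily for illustration)
print(round(relative_mortality(52.0), 2))                    # additive reading
print(round(relative_mortality(52.0, compounding=True), 2))  # compounding reading
```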

As the pandemic has evolved, additional research of varying quality has investigated some of the potential links identified early in the pandemic (1883) between vitamin D and COVID-19. Indeed, studies are beginning to investigate whether there is any prophylactic and/or therapeutic relationship between vitamin D and COVID-19. A study in Switzerland demonstrated that 27 SARS-CoV-2 positive patients exhibited 25-hydroxyvitamin D plasma concentrations that were significantly lower (11.1 ng/ml) than those of SARS-CoV-2 negative patients (24.6 ng/ml; p = 0.004), an association that also held among patients older than 70 years (1887). These findings seem to be supported by a Belgian observational study of 186 SARS-CoV-2 positive patients exhibiting symptoms of pneumonia, in which 25-hydroxyvitamin D plasma concentrations were measured and CT scans of the lungs were obtained upon hospitalization (1888). A significant difference in 25-hydroxyvitamin D levels was observed between the SARS-CoV-2 patients and 2,717 season-matched hospitalized controls. It is not clear from the study which diseases caused the control subjects to be admitted at the time of their 25-hydroxyvitamin D measurement, which makes it difficult to assess the observations reported. Both female and male patients possessed lower median 25-hydroxyvitamin D concentrations than the control group as a whole (18.6 ng/ml versus 21.5 ng/ml; p = 0.0016) and a higher rate of vitamin D deficiency (58.6% versus 42.5%). However, when comparisons were stratified by sex, evidence of sexual dimorphism became apparent, as female patients had 25-hydroxyvitamin D levels equivalent to those of females in the control group, whereas male patients were deficient in 25-hydroxyvitamin D relative to male controls (67% versus 49%; p = 0.0006). Notably, vitamin D deficiency was progressively more common in males with advancing radiological disease stage (p = 0.001). Nevertheless, these studies are supported by several others indicating that vitamin D status may be an independent risk factor for the severity of COVID-19 (1889-1892) and that 25-hydroxyvitamin D levels are lower in COVID-19 patients relative to population-based controls (1893). Indeed, serum concentrations of 25-hydroxyvitamin D above 30 ng/ml, which indicate vitamin D sufficiency, seem to be associated with a reduction in serum C-reactive protein, an inflammatory marker, along with increased lymphocyte levels, suggesting that vitamin D levels may modulate the immune response by reducing the risk of cytokine storm in response to SARS-CoV-2 infection (1893). A study in India determined that COVID-19 fatality was higher in patients with severe COVID-19 and low serum 25-hydroxyvitamin D levels (mean level 6.2 ng/ml; 97% vitamin D deficient) than in asymptomatic, non-severe patients with higher levels of vitamin D (mean level 27.9 ng/ml; 33% vitamin D deficient) (1894). In the same study, vitamin D deficiency was associated with higher levels of inflammatory markers, including IL-6, ferritin, and tumor necrosis factor α. Collectively, these studies add to a multitude of observational studies reporting potential associations between low levels of 25-hydroxyvitamin D and COVID-19 incidence and severity (1887, 1892, 1893, 1895-1901).
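
Because the studies above report 25-hydroxyvitamin D in ng/ml while many laboratories report nmol/L, the following minimal Python sketch converts between the two units and classifies status. The 2.496 conversion factor follows from the molecular weight of 25-hydroxyvitamin D, the 30 ng/ml sufficiency cutoff is the one cited above, and the 20 ng/ml deficiency cutoff is a commonly used clinical threshold assumed here rather than a value taken from these studies.

```python
NG_PER_ML_TO_NMOL_PER_L = 2.496  # standard conversion factor for 25-hydroxyvitamin D

def to_nmol_per_l(ng_per_ml):
    """Convert serum 25-hydroxyvitamin D from ng/ml (as reported above) to nmol/L."""
    return ng_per_ml * NG_PER_ML_TO_NMOL_PER_L

def status(ng_per_ml):
    """Classify status: >= 30 ng/ml sufficient (cited above); the 20 ng/ml
    deficiency cutoff is an assumed, commonly used clinical threshold."""
    if ng_per_ml >= 30:
        return "sufficient"
    if ng_per_ml >= 20:
        return "insufficient"
    return "deficient"

for value in (6.2, 18.6, 27.9):  # example values reported in the studies above
    print(f"{value} ng/ml = {to_nmol_per_l(value):.0f} nmol/L ({status(value)})")
```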

Despite the large number of studies establishing a link between vitamin D status and COVID-19 severity, an examination of data from the UK Biobank did not support this thesis (1902, 1903). These analyses examined 25-hydroxyvitamin D concentrations alongside SARS-CoV-2 positivity and COVID-19 mortality in over 340,000 UK Biobank participants. They have caused considerable debate that will likely be settled only by further studies (1904, 1905). Overall, while the evidence suggests that there is likely an association between low serum 25-hydroxyvitamin D and COVID-19 incidence, these studies must be interpreted with caution, as there is the potential for reverse causality, bias, and other confounding; in particular, vitamin D deficiency is also associated with numerous pre-existing conditions and risk factors that can increase the risk of severe COVID-19 (1757, 1884, 1906, 1907).

While these studies inform us of the potential importance of vitamin D sufficiency to the risk of SARS-CoV-2 infection and severe COVID-19, they fail to conclusively determine whether vitamin D supplementation can therapeutically affect the clinical course of COVID-19. In one study, 40 vitamin D-deficient, asymptomatic or mildly symptomatic participants were randomized to receive either 60,000 IU of cholecalciferol daily for at least 7 days (n = 16) or a placebo (n = 24), with a target serum 25-hydroxyvitamin D level of >50 ng/ml. At day 7, 10 patients achieved >50 ng/ml, followed by another 2 by day 14. By the end of the study, a greater proportion of the vitamin D-deficient participants in the treatment group tested negative for SARS-CoV-2 RNA, and they had significantly lower fibrinogen levels, potentially indicating a beneficial effect (1908). A pilot study in Spain determined that early administration of high-dose calcifediol (~21,000 IU on days 1-2 and ~11,000 IU on days 3-7 of hospital admission) with hydroxychloroquine and azithromycin to 50 hospitalized COVID-19 patients significantly reduced ICU admissions and may have reduced disease severity versus hydroxychloroquine and azithromycin alone (1909). Although this study received significant criticism from the National Institute for Health and Care Excellence (NICE) in the UK (1910), an independent follow-up statistical analysis supported the findings of the study with respect to the results of calcifediol treatment (1911). Another trial of 986 patients hospitalized for COVID-19 in three UK hospitals administered cholecalciferol (≥ 280,000 IU over a period of 7 weeks) to 151 patients and found an association with a reduced risk of COVID-19 mortality, regardless of baseline 25-hydroxyvitamin D levels (1912). However, a double-blind, randomized, placebo-controlled trial of 240 hospitalized COVID-19 patients in São Paulo, Brazil, administered a single 200,000 IU oral dose of vitamin D. At the end of the study, there was a 24 ng/ml difference in 25-hydroxyvitamin D levels between the treatment and placebo groups (p = 0.001), and 87% of the treatment group were vitamin D sufficient versus ~11% of the placebo group. Supplementation was well tolerated. However, there was no reduction in the length of hospital stay or mortality, and no changes in any other relevant secondary outcomes were reported (1913). These early findings are thus still inconclusive with regards to the therapeutic value of vitamin D supplementation. However, other trials are underway, including one that is investigating the utility of vitamin D as an immune-modulating agent by monitoring whether administration of vitamin D leads to an improvement in the health status of non-severe symptomatic COVID-19 patients and whether vitamin D prevents patient deterioration (1914). Other trials are examining various factors including mortality, symptom recovery, severity of disease, rates of ventilation, inflammatory markers such as C-reactive protein and IL-6, blood cell counts, and the prophylactic capacity of vitamin D administration (1914-1917). Concomitant administration of vitamin D with pharmaceuticals such as aspirin (1918) and bioactive molecules such as resveratrol (1919) is also under investigation.
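
To put the dosing regimens described above side by side, the minimal Python sketch below totals the cumulative dose implied by each description. The totals simply follow the figures quoted above; the 7-day duration for the daily regimen reflects its stated minimum, daily dosing on days 3-7 of the calcifediol regimen is assumed from the wording above, and calcifediol is more potent than cholecalciferol, so the regimens are not directly equivalent on an IU basis.

```python
# Cumulative doses implied by the regimens described above (illustrative arithmetic only).
# Note: the Spanish pilot study used calcifediol, which is more potent than
# cholecalciferol, so its IU total is not directly comparable to the others.

regimens = {
    "Cholecalciferol, 60,000 IU/day for 7 days (stated minimum)": 60_000 * 7,
    "Calcifediol, ~21,000 IU on days 1-2 plus ~11,000 IU on days 3-7 (daily dosing assumed)": 21_000 * 2 + 11_000 * 5,
    "Cholecalciferol, >= 280,000 IU over 7 weeks": 280_000,
    "Cholecalciferol, single 200,000 IU oral dose": 200_000,
}

for description, total_iu in regimens.items():
    print(f"{description}: {total_iu:,} IU total")
```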

The effectiveness of vitamin D supplementation against COVID-19 remains open for debate. All the same, there is no doubt that vitamin D deficiency is a widespread issue and should be addressed not only because of its potential link to SARS-CoV-2 incidence (1920), but also due to its importance for overall health. There is a possibility that safe exposure to sunlight could improve endogenous synthesis of vitamin D, potentially strengthening the immune system, but sun exposure is not sufficient on its own, particularly in the winter months. Indeed, while the possible link between vitamin D status and COVID-19 is further investigated, preemptive supplementation of vitamin D and encouraging people to maintain a healthy diet for optimum vitamin D status are likely to raise serum levels of 25-hydroxyvitamin D while being unlikely to carry major health risks. These principles seem to be the basis of a number of guidelines issued by some countries and scientific organizations that have advised supplementation of vitamin D during the pandemic. The Académie Nationale de Médecine in France recommends rapid testing of 25-hydroxyvitamin D for people over 60 years old to identify those most at risk of vitamin D deficiency and advises them to obtain a bolus dose of 50,000 to 100,000 IU vitamin D to limit respiratory complications; it has also recommended that those under 60 years old take 800 to 1,000 IU daily if they receive a SARS-CoV-2 positive test (1921). In Slovenia, doctors have been advised to provide nursing home patients with vitamin D (1922). Both Public Health England and Public Health Scotland have advised members of the Black, Asian, and minority ethnic communities to supplement with vitamin D in light of evidence that they may be at higher risk for vitamin D deficiency along with other COVID-19 risk factors, a trend that has also been observed in the United States (1923, 1924). However, other UK scientific bodies, including NICE, recommend that individuals supplement with vitamin D as per usual UK government advice but warn that people should not supplement with vitamin D solely to prevent COVID-19; all the same, NICE has provided guidelines for research to investigate the supplementation of vitamin D in the context of COVID-19 (1925). Despite vitamin D deficiency being a widespread issue in the United States (1926), the National Institutes of Health have stated that there are “insufficient data to recommend either for or against the use of vitamin D for the prevention or treatment of COVID-19” (1927). These are just some examples of how public health guidance has responded to the emerging evidence regarding vitamin D and COVID-19. Outside of official recommendations, there is also evidence that individuals may be paying increased attention to their vitamin D levels, as a survey of Polish consumers showed that 56% of respondents used vitamin D during the pandemic (1928). However, some companies have used the emerging evidence surrounding vitamin D to sell products that claim to prevent and treat COVID-19, which in one incident required a federal court to intervene and issue an injunction barring the sale of vitamin-D-related products due to the lack of clinical data supporting these claims (1929). It is clear that further studies and clinical trials are required to conclusively determine the prophylactic and therapeutic potential of vitamin D supplementation against COVID-19. Until such evidence emerges, individuals should follow their national guidelines surrounding vitamin D intake to achieve vitamin D sufficiency.

10.8 Probiotics

Probiotics are “live microorganisms that, when administered in adequate amounts, confer a health benefit on the host” (1930). Some studies suggest that probiotics are beneficial against common viral infections, and there is modest evidence to suggest that they can modulate the immune response (1931, 1932). As a result, it has been hypothesized that probiotics may have therapeutic value worthy of investigation against SARS-CoV-2 (1933). Probiotics and next-generation probiotics, which are more akin to pharmacological-grade supplements, have been associated with multiple potential beneficial effects for allergies, digestive tract disorders, and even metabolic diseases through their anti-inflammatory and immunomodulatory effects (1934, 1935). However, the mechanisms by which probiotics affect these various conditions would likely differ among strains, with the ultimate effect of the probiotic depending on the heterogeneous set of bacteria present (1935). Some of the beneficial effects of probiotics include reducing inflammation by promoting the expression of anti-inflammatory mediators, inhibiting Toll-like receptors 2 and 4, competing directly with pathogens, synthesizing antimicrobial substances or other metabolites, improving intestinal barrier function, and/or favorably altering the gut microbiota and the brain-gut axis (19351937). It is also thought that lactobacilli such as Lactobacillus paracasei, Lactobacillus plantarum and Lactobacillus rhamnosus have the capacity to bind to and inactivate some viruses via adsorptive and/or trapping mechanisms (1938). Other probiotic lactobacilli and even non-viable bacterium-like particles have been shown to reduce both viral attachment to