Neonatal Isoerythrolysis in Equines, Felines, and Other Species
By Sara Su, Animal Science and English ’24
Neonatal Isoerythrolysis: An Overview
Neonatal isoerythrolysis (NI) is an alloimmune disease: an immune response against non-self antigens from an individual of the same species. A non-self antigen is any foreign substance capable of triggering the host’s immune system. The hallmark of NI is hemolytic anemia, in which a newborn’s red blood cells act as the non-self antigens and are targeted and destroyed by maternal antibodies [1, 2, 3]. These antibodies are absorbed when the newborn ingests colostrum, otherwise known as “first milk” [1, 2]. In essence, neonatal isoerythrolysis occurs when an offspring inherits the father’s blood type and the mother carries antibodies against it. Antibodies to red blood cells are commonly referred to as anti-erythrocyte antibodies; if they are passed to another animal via ingestion or transfusion, they attack the recipient’s red blood cells faster than the cells can be replaced. If left untreated, NI is fatal. Neonatal isoerythrolysis is most common in cats and horses, and is rarely observed in other species [1, 2].
Colostrum is highly important because neonates (newborns) ingest the mother’s antibodies to gain passive immunity, the first layer of protection against pathogens after birth. For roughly the first 24 hours of life, neonates are able to absorb antibodies through the gut. After this window, the gut begins to close and absorption decreases dramatically, so it is important that neonates ingest colostrum as soon as possible after birth [2]. Isoerythrolysis occurs when those maternal antibodies are alloantibodies, which specifically target non-self red blood cells (in this case, the foal’s) and cause hemolysis [4].
The most common symptoms are weakness, pale mucous membranes, icterus (jaundice), tachycardia (rapid heartbeat), and tachypnoea (rapid breathing) [1, 5, 6, 7]. Symptoms can appear anywhere from hours to a week after ingestion of colostrum, and death will occur if the condition is left untreated. Across all species, offspring at risk for NI should not drink colostrum from their biological mothers. Ingestion of colostrum containing alloantibodies can be prevented by muzzling the offspring (in the case of foals) or removing them from the mother (in the case of kittens). The mother must be milked periodically to eliminate any remaining colostrum, so that she can later produce milk that is harmless. In the meantime, the newborn(s) should be fed colostrum from a different animal.
Because neonatal isoerythrolysis results in hemolytic anemia, blood transfusions are a common treatment. As with human blood transfusions, horses and cats can receive transfusions from any individual with the same blood type. In horses, however, it is highly recommended to use the mother’s blood after it has been washed with saline and centrifuged to remove maternal alloantibodies [1].
To the untrained eye, the various symptoms of NI might appear to be the result of a contagious pathogen rather than an alloimmune disease. Nonetheless, it is very easy to prevent: conscientious breeders should preemptively blood test the breeding pair, and test the pregnant mother again two weeks before parturition (birth) [5]. This informs breeders about potentially conflicting blood types and prevents the conception of a foal that could become a victim of NI. Additionally, care should be taken when breeding females that have already given birth, or that have received blood transfusions or certain vaccinations. Ultimately, the best method of combating NI is prevention: a simple blood test of a breeding pair can predict whether the neonate will develop this condition [1, 5].
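To make the cross-match logic above concrete, the sketch below encodes it in Python. The data structures and the `ni_risk` helper are illustrative assumptions for this article, not a clinical protocol; real typing panels cover many more blood-group factors.

```python
# Minimal sketch of the pre-breeding cross-match logic described above.
# The antigen names and data structures are illustrative, not clinical.

def ni_risk(sire_antigens: set[str], dam_antigens: set[str],
            dam_alloantibodies: set[str]) -> bool:
    """Flag a pairing as at risk for neonatal isoerythrolysis.

    A foal may inherit any antigen the sire carries; if the dam lacks
    that antigen and carries alloantibodies against it, her colostrum
    can destroy the foal's red blood cells.
    """
    foreign_to_dam = sire_antigens - dam_antigens
    return bool(foreign_to_dam & dam_alloantibodies)

# Example: a mare sensitized to Aa by a prior pregnancy, bred to an
# Aa-positive stallion, is flagged so her colostrum can be tested.
print(ni_risk({"Aa", "Ca"}, {"Ca"}, {"Aa"}))  # True -> withhold colostrum
```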
Figure 1. The red blood cells of a neonate carrying the father’s blood type are at risk of being targeted by anti-erythrocyte antibodies contained in the mother’s colostrum.
Neonatal Isoerythrolysis in Equines
Neonatal isoerythrolysis is most common in horses and mules, although it is also possible in other equids such as zebras. There are 16 blood types for horses; of these types, Aa and Qa are the most likely to become antigenic and cause NI [1, 2, 3]. Reactions to other blood types, such as Qc and Db, have also been observed [6]. When a mare is exposed to these antigens from a pregnancy or blood transfusion, she will produce alloantibodies that will attack the red blood cells of her future foal. This becomes complicated when a mare that has previously given birth to unaffected foals is introduced to an antigen, putting her future foals at risk for NI [1, 2].
Furthermore, donkeys carry a red blood cell antigen unique to their species, known as the “donkey factor” [8]. Because of it, horses cannot receive blood transfusions from donkeys, although donkeys can receive blood transfusions from horses [1, 8, 9]. The donkey factor also makes neonatal isoerythrolysis far more likely in mares pregnant with mule foals: roughly 10% in horse x donkey crosses compared to 1% when breeding horses to horses. However, this is only a problem for mares that have carried more than one mule foal [7, 9].
In horses, NI is diagnosed with a Coombs test, which detects hemolytic anemia by cross-matching the mare’s blood serum with her foal’s red blood cells. If the result is positive, agglutination (clumping in the test tube) will occur [5, 10]. While the Coombs test is a definitive diagnosis that evaluates hemolysis, NI can also be prevented by conducting a jaundiced foal agglutination test (JFAT), a cruder method utilizing the mare’s colostrum and her foal’s erythrocytes. As with the Coombs test, if agglutination occurs there is a high risk of NI and the foal should be prevented from nursing [2, 5]. It is therefore important to run these tests as soon as possible so the foal does not nurse from its biological mother [11]. Colostrum from an unrelated donor should be fed to the foal, and the biological mother should be milked periodically. Fortunately, antibody absorption drops dramatically after the first 24 hours of life, so this solution need not be implemented long-term. However, for foals that have already ingested colostrum and become severely anemic to the point of refusing to nurse, blood transfusions are recommended. The blood should be cross-matched; if that is not possible, it is recommended to transfuse blood from a gelding (a castrated male horse) negative for the Aa and Qa blood types [7]. In all cases, separating a foal from its mother is not recommended because of separation anxiety; the foal should be muzzled instead.
Figure 2. The Coombs test detects hemolytic anemia, a common symptom of neonatal isoerythrolysis, by cross-matching the mare’s blood serum with her foal’s red blood cells.
Neonatal Isoerythrolysis in Felines
Neonatal isoerythrolysis may also occur in cats, and although the condition is much rarer in felines than in equines, the mortality rate is still very high [12]. There are three blood types in cats: A, B, and AB [12, 13]. In the United States, type A is by far the most common and type B is rare; type A is also dominant to type B [12, 14]. Type AB is not well documented, but it is known to be the rarest blood type; it is dominant to type B but recessive to type A [12]. A queen (mother cat) with type B blood naturally produces alloantibodies against blood type A. Neonatal isoerythrolysis occurs when a type B queen is bred with a type A tomcat and the kittens inherit the sire’s type A blood. It should be noted that wild felids share the same AB blood-group system, so diagnosis and treatment may be applied in zoos as well as domestic homes [13].
In cats, diagnosis is based on clinical signs of the kittens and blood typing. Clinical signs include weak nursing, pale or jaundiced mucous membranes, and dark red-brown urine [12]. If blood work is done, it will show anemia. Because feline litter sizes are large, it is common to use placental blood for testing. It should be noted that kittens in a litter may show varying levels of symptom severity; researchers theorize that this is due to different levels of colostrum intake. Unlike equines, felines can be directly removed from their mother and housed with another queen as soon as symptoms appear.
Neonatal Isoerythrolysis in Other Species
Although these instances are much rarer than in cats and horses, neonatal isoerythrolysis has also been observed in cattle, dogs, and pigs [15, 16, 17]. Interestingly, NI is not a natural phenomenon in cattle: it was first observed in 1960 and was caused by homologous blood vaccines administered by veterinarians [17]. The threshold of response to these vaccines was variable, so the number of injections required to induce symptoms differed from individual to individual.
A hemolytic disease similar to neonatal isoerythrolysis exists in humans as erythroblastosis fetalis, or Rh disease [18]. It is named differently because in humans alloantibodies can cross the placenta during pregnancy, whereas in other mammals offspring only come into contact with these antibodies after birth via colostrum. In human blood typing, the “+” and “-” signs refer to the Rh factor: an individual is positive if the Rh protein is present on the surface of their red blood cells and negative if it is absent. Rh disease occurs when the mother develops a hypersensitive immune response, producing anti-Rh antibodies that destroy healthy Rh-positive blood cells. However, Rh disease can only occur under a specific circumstance: the father must be Rh-positive and the mother Rh-negative, yielding an Rh-positive offspring. Even under these conditions, only a later child is in danger of Rh disease; the mother builds antibodies against the Rh factor during her first Rh-positive pregnancy, but the disease itself does not manifest then. To prevent Rh disease, an Rh-negative mother carrying an Rh-positive fetus is given RhIg, an anti-Rh antibody preparation that inhibits her immune response [18, 19].
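The risk rule described above reduces to a small predicate, sketched below in Python. The function and its argument names are illustrative assumptions, not clinical guidance.

```python
# Hedged sketch of the Rh-disease risk rule summarized above.

def rh_disease_risk(mother_rh_positive: bool, fetus_rh_positive: bool,
                    mother_sensitized: bool) -> bool:
    """Risk arises only for an Rh-negative mother carrying an
    Rh-positive fetus after she has been sensitized, typically by a
    previous Rh-positive pregnancy."""
    return (not mother_rh_positive) and fetus_rh_positive and mother_sensitized

# First Rh-positive pregnancy: mother not yet sensitized -> no disease,
# though she may build anti-Rh antibodies during it.
print(rh_disease_risk(False, True, False))  # False
# A later Rh-positive pregnancy after sensitization -> at risk;
# RhIg aims to prevent the sensitization step from ever occurring.
print(rh_disease_risk(False, True, True))   # True
```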
Conclusion
To conclude, neonatal isoerythrolysis can quickly become fatal to the neonate, but it is both preventable and treatable. Preventive measures are particularly important because neonates may begin nursing, and absorbing antibodies that destroy red blood cells, within hours of birth. Because the clinical signs of this disease are so severe, early detection is key to neonate survival. Prevention, however, is extremely easy: a simple blood-type cross-match between a prospective father and mother will suffice. In the end, this is an issue of responsible breeding, so conscientious breeders should not encounter neonatal isoerythrolysis at all.
References
- Jamieson CA, Baillie SL, Johnson JP. 2022. Blood Transfusion in Equids-A Practical Approach and Review. Animals (Basel). 12(17):2162. doi: 10.3390/ani12172162.
- Kähn W, Vaala W, Palmer J. 1991. Die neonatale Isoerythrolyse bei neugeborenen Fohlen [Neonatal isoerythrolysis in newborn foals]. Tierarztl Prax. 19(5):521-9.
- Becht JL, Semrad SD. 1985. Hematology, blood typing, and immunology of the neonatal foal. Vet Clin North Am Equine Pract. 1(1):91-116. doi: 10.1016/s0749-0739(17)30771-x. PMID: 3907769.
- Proverbio D, Perego R, Baggiani L, Ferrucci F, Zucca E, Nobile F, Spada E. 2020. Prevalence of Ca Blood Type and Alloantibodies in a Population of Horses from Italy. Animals (Basel). 10(7):1179. doi: 10.3390/ani10071179.
- Felippe JB. 2017. Equine Neonatal Isoerythrolysis. In: Interpretation of Equine Laboratory Diagnostics. John Wiley & Sons, Ltd. pp. 251–255.
- MacLeay JM. 2001. Neonatal isoerythrolysis involving the Qc and Db antigens in a foal. J Am Vet Med Assoc. 219(1):79-81, 50. doi: 10.2460/javma.2001.219.79.
- Carr EA. 2014. Field triage of the neonatal foal. Vet Clin North Am Equine Pract. 30(2):283-300, vii. doi: 10.1016/j.cveq.2014.05.001.
- McClure JJ, Koch C, Traub-Dargatz J. 1994. Characterization of a red blood cell antigen in donkeys and mules associated with neonatal isoerythrolysis. Anim Genet. 25(2):119-20. doi: 10.1111/j.1365-2052.1994.tb00091.x.
- Blackmer JM. 2010. Strategies for prevention of neonatal isoerythrolysis in horses and mules. Equine Vet Educ. 15:6–10.
- Wardrop KJ. 2005. The Coombs’ test in veterinary medicine: past, present, future. Vet Clin Pathol. 34(4):325-34. doi: 10.1111/j.1939-165x.2005.tb00057.x.
- Becht JL, Page EH, Morter RL, Boon GD, Thacker HL. 1983. Evaluation of a series of testing procedures to predict neonatal isoerythrolysis in the foal. Cornell Vet. 73(4):390-402.
- Silvestre-Ferreira AC, Pastor J. 2010. Feline neonatal isoerythrolysis and the importance of feline blood types. Vet Med Int. 2010:753726. doi: 10.4061/2010/753726.
- Silvestre-Ferreira A, Pastor J. 2021. Wild Felids Blood Group System. Animals (Basel). 11(12):3533. doi: 10.3390/ani11123533.
- Giger U, Kilrain CG, Filippich LJ, Bell K. 1989. Frequencies of feline blood groups in the United States. J Am Vet Med Assoc. 195(9):1230-2.
- Stormont C. 1975. Neonatal isoerythrolysis in domestic animals: a comparative review. Adv Vet Sci Comp Med. 19:23-45.
- Cohn LA, Kaplan-Zattler AJ, Lee JA. 2022. Fluid Therapy for Pediatric Patients. Vet Clin North Am Small Anim Pract. 52(3):707-718. doi: 10.1016/j.cvsm.2022.01.007.
- Stormont, C. 1977. The Etiology of Bovine Neonatal Isoerythrolysis. The Bovine Practitioner. 22–28. https://doi.org/10.21423/bovine-vol1977no12p22-28
- Vossoughi S, Spitalnik SL. 2019. Conquering erythroblastosis fetalis: 50 years of RhIG. Transfusion. 59(7):2195-2196. doi: 10.1111/trf.15307.
- Webb J, Delaney M. 2018. Red Blood Cell Alloimmunization in the Pregnant Patient. Transfus Med Rev. 32(4):213-219. doi: 10.1016/j.tmrv.2018.07.002.
The Use of Remotely Sensed Data in Modeling Cholera Amidst Climate Change
By Shaina Eagle, Global Disease Biology, ‘24
Introduction
Over 300,000 cholera cases were reported worldwide in 2020 [12]. This infectious disease is spread by water or seafood contaminated with the bacterium Vibrio cholerae. V. cholerae can survive in the open ocean in association with phytoplankton [5]. The bacteria also spread into inland water sources such as rivers, reaching people’s drinking water. This spread of cholera is affected by climate variables such as precipitation, temperature, and oceanic conditions [1, 2, 5, 6, 7, 11, 13]. Climate patterns such as the El Niño Southern Oscillation (ENSO) and the Indian Ocean Dipole (IOD) influence local weather in coastal regions, causing more phytoplankton blooms [2, 11]. Climate change also disrupts water, sanitation, and hygiene (WASH) infrastructure [4], creating favorable environmental conditions for V. cholerae to thrive [2]. As climate change causes fluctuations in weather patterns and coastal biology, researchers need a reliable method for tracking and predicting cholera. Early warning systems are key for health officials to take proper preventative measures, from vaccine deployment campaigns to emergency clean water storage, to reduce the prevalence and fatality of cholera.
Satellites are one method of gathering measurements of the variables that affect the spread of cholera. By measuring reflected electromagnetic radiation, satellites provide remotely sensed geophysical data on variables such as temperature, water quality, precipitation, and vegetation [10]. Researchers use remotely sensed data in conjunction with algorithms and statistical analyses to model cholera outbreaks and predict how changing variables will alter disease spread. Satellite data is widely accessible, often free, and covers vast spatial and temporal ranges [5, 10]. Researchers are able to compile their data without being physically near the areas they are studying [10]. This review will analyze how researchers have developed methods for predicting cholera outbreaks using remotely sensed data, and demonstrate how refinement of these techniques will be crucial to combating cholera outbreaks amidst climate change.
Collection of Satellite Data
Natural disasters are increasing in intensity and frequency, raising the risk of cholera epidemics [2, 4]. Cholera epidemics have historically begun after storms, as happened after Hurricane Matthew in Haiti [4]. Hurricanes can destroy WASH infrastructure, allowing cholera to seep into water supplies and leaving people vulnerable to drinking contaminated water [4]. Detecting outbreaks and identifying their sources are crucial steps in managing deaths from cholera, alongside improving sanitation, expanding access to clean drinking water, and increasing vaccination campaigns. These steps can be aided by remotely sensed data that feeds into prediction models [2].
Remotely sensed data measures variables that are known to be connected to cholera incidence. Huq et al. (2017) published research using remotely sensed precipitation, wind swath, geophysical landmarks, and population density after Hurricane Matthew struck Haiti in October 2016 [4]. The researchers created a map that showed areas at high risk for cholera and were able to predict where outbreak hotspots would occur up to four weeks after the hurricane [4].
Other useful variables include sea surface temperature (SST), sea surface salinity (SSS), land surface temperature (LST), precipitation, chlorophyll-a concentration (Ch-a), and soil moisture (SM) [1, 4, 5, 6, 7, 9, 11, 13]. SST and Ch-a are indicators of a habitat that is suitable for Vibrio cholerae growth [5, 6]. Flooding from extreme precipitation can flush seawater carrying V. cholerae into inland rivers, estuaries, or drinking water [4, 5, 6].
Satellites can provide data on climate variables in regions that health officials cannot access safely, or after a natural disaster when researchers cannot collect field data due to accessibility or time constraints [4]. This data could help researchers identify particular regions at risk for a cholera outbreak after an extreme weather event and help policymakers make informed decisions about where to implement vaccination programs or establish WASH infrastructure. And in districts where cholera survives endemically, remotely sensed data could help identify outbreak sources or thresholds for when an outbreak becomes an epidemic. Satellite data on essential climate variables (ECVs) and WASH infrastructure needs to remain publicly and freely available, and will be particularly effective in identifying potential cholera outbreaks as climate change increases the intensity and incidence of natural disasters and of climate patterns that suit V. cholerae proliferation.
Turning Raw Data into Models
Tracking Essential Climate Variables
Satellites provide data across vast geospatial and temporal ranges on the ECVs correlated with cholera outbreaks. Remote sensing systems allow researchers to build models of cholera dynamics based on these relationships [5]. Fooladi et al. (2021) used precipitation data from 1983 to 2016 to compute a non-standardized precipitation index (nSPI) in the Gavkhooni basin in Iran. Their model demonstrates how prior understanding of the environmental conditions that precede cholera outbreaks can be combined with satellite data to make novel predictions about disease outbreaks [3]. For example, an algal bloom is an exponential growth of phytoplankton, which use chlorophyll-a to capture sunlight for photosynthesis, growth, and nutrient production [5]. Phytoplankton is a reservoir of V. cholerae and can be seen from space because of its green pigment. Ch-a is therefore a workable proxy for phytoplankton when modeling levels of V. cholerae bacteria in an area [5]. In 2021, Lai et al. used Landsat images from NASA and Sentinel-2A images from the European Space Agency to measure Ch-a in the Guanting Reservoir, one of the main water supply sources for Beijing, China. They developed a model relating variables in the satellite images (bands, normalized difference vegetation index, surface algal bloom index, Apple algorithm values) to Ch-a [8]. Their predictions for 2016, 2017, and 2019 were correlated with measured chlorophyll-a levels at the 0.05 significance level [8]. This data allowed the researchers to model trends in Ch-a and water nutrition status, which has applications to reservoir eutrophication [8] and thus disease transmission.
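As a rough illustration of this kind of empirical retrieval, the Python sketch below fits a linear model between a band-derived index and in-situ chlorophyll-a. The band choices, sample values, and resulting coefficients are invented for illustration; they do not reproduce Lai et al.’s actual algorithm.

```python
import numpy as np

def nd_index(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Normalized difference of two bands, a common bloom indicator."""
    return (band_b - band_a) / (band_b + band_a)

# Hypothetical training data: per-pixel reflectances in two bands,
# paired with in-situ chlorophyll-a measurements (ug/L).
green = np.array([0.08, 0.10, 0.07, 0.12, 0.09])
nir   = np.array([0.05, 0.09, 0.03, 0.13, 0.06])
chl_a = np.array([4.1, 9.8, 2.5, 14.2, 5.0])

x = nd_index(green, nir)
slope, intercept = np.polyfit(x, chl_a, 1)  # ordinary least squares fit

# Predict Ch-a for a new pixel from its index value alone.
new_pixel = nd_index(np.array([0.09]), np.array([0.08]))
print(slope * new_pixel + intercept)
```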
Machine Learning
Variables such as LST and SM can be linked to cholera outbreaks through machine learning (ML) algorithms. ML elucidates complex relationships between variables, such as the risk of a cholera outbreak and ECVs [1]. By statistically analyzing input data from satellites, ML allows researchers to build models that predict an output (i.e., an outbreak) [9]. Researchers have examined algorithms such as Random Forest (RF), XGBoost, K-Nearest Neighbors, and Linear Discriminant Analysis [1, 9]. Campbell et al. (2020) found RF to be the most effective classifier due to its superior performance on oversampled and imbalanced datasets, yielding a high true positive rate (the probability that an actual outbreak is correctly predicted) of 89.5% when fitting a model combining a season encoder, a location encoder, LST, Ch-a, SSS, sea level anomalies, SM, and their lag values (past values used to predict future ones) [1]. Campbell et al.’s model (2020) combined five ECVs and pulled data across forty districts of India from 2010 to 2018 [1].
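The sketch below shows one way such a setup could look, using scikit-learn’s Random Forest with balanced class weights on synthetic data. The feature set, the simulated outbreak rule, and the lag construction are all assumptions for illustration, not the study’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Synthetic monthly ECVs for one district.
lst, chl, sss, soil = (rng.normal(size=n) for _ in range(4))
# Crude one-step lag features via a circular shift.
X = np.column_stack([lst, chl, sss, soil, np.roll(lst, 1), np.roll(chl, 1)])
# Rare positives (~7%), loosely driven by warm, chlorophyll-rich months.
y = ((lst + chl + rng.normal(scale=0.5, size=n)) > 2.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                            random_state=0).fit(X_tr, y_tr)
# True positive rate (recall): the metric emphasized in the text.
print(recall_score(y_te, rf.predict(X_te)))
```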
In a 2013 study, Jutla acknowledged that Ch-a alone cannot serve as an accurate predictive factor for a cholera outbreak, since other organic matter and detritus not represented by a chlorophyll index can also contribute to the presence of cholera bacteria [6]. To account for this, Jutla developed the Satellite Water Marker (SWM) index, which uses wavelength radiance to characterize coastal conditions and predict cholera outbreaks one to two months in advance [5]. The SWM is based on shifts in the difference between blue (412 nm) and green (555 nm) wavelengths, which determine the turbidity (cloudiness) of water [5]. A high correlation was observed between SWM in the early winter months in the Bay of Bengal and cholera peaks in the spring, likely reflecting multiple coastal conditions rather than Ch-a alone [5]. Jutla et al. (2013) tested the Bay of Bengal SWM in Mozambique, where there is one annual cholera peak as opposed to two, and again found the SWM to be a more accurate indicator of cholera than Ch-a alone. Jutla’s index was used again by Ogata et al. (2021) to determine the specific environmental conditions in previous seasons that precede cholera outbreaks in northern coastal regions of the Bay of Bengal. They linked spring cholera to summer precipitation and the previous fall/winter SWM, while La Niña-driven SST deviations and floods caused by high summer rainfall anticipated fall cholera outbreaks [11]. Variability in climate conditions and SWM over decades indicates that the predictive models are ever-shifting [11]. A clear understanding of shifts in climate patterns over time is thus integral to accurate forecasting.
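Read as a normalized difference between the two wavelengths the text names, the SWM can be sketched as below. The exact normalization and thresholds follow Jutla et al. [5, 6]; this particular form and the sample reflectance values are assumptions for illustration.

```python
import numpy as np

def swm(rrs_412: np.ndarray, rrs_555: np.ndarray) -> np.ndarray:
    """Normalized difference of blue (412 nm) and green (555 nm)
    remote sensing reflectance; turbid, plankton-rich coastal water
    reflects relatively less blue and more green than clear ocean."""
    return (rrs_412 - rrs_555) / (rrs_412 + rrs_555)

clear_ocean  = swm(np.array([0.012]), np.array([0.004]))
turbid_coast = swm(np.array([0.006]), np.array([0.009]))
print(clear_ocean, turbid_coast)  # turbid water -> lower (negative) value
```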
Challenges and Limitations
Remotely sensed data is integral to developing timely and accurate predictive models and early warning systems for cholera outbreaks. There is no set of ECVs or a specific ML technique that can be applied universally, especially when looking at endemic versus epidemic cholera [1, 2, 5, 6, 7, 9]. Many studies struggled with a lack of field data against which to test their models, particularly after extreme weather events which may destroy existing data collection infrastructure [7]. Researchers were also challenged by imbalanced datasets when programming ML algorithms, even with particularly resilient algorithms like RF [1, 9]. Cholera is notoriously difficult to model because it can occur through multiple pathways of transmission, and cholera outbreaks are related to several climate variables through complex relationships [5, 6, 9]. Further testing in diverse regions, under various climate conditions, utilizing assorted ECVs, and employing numerous ML techniques is necessary to make these models as accurate as possible. Future studies should focus on long-term observations of variables known to be connected to cholera and V. cholerae, such as sea surface salinity [1, 11]. Future models also need to take socioeconomic data into account [1, 4].
Conclusion
The purpose of this review was to demonstrate how and why remotely sensed data is being used to predict cholera outbreaks, particularly as climate change makes local weather patterns more unpredictable. Researchers do not report a shortage of satellite or ML technology as the obstacle to making satellite-driven cholera prediction models commonplace. However, different regions around the world have different seasonal and interannual variability in cholera transmission [5], making it difficult to develop a universal model. Therefore, future studies should emphasize testing various ML methods with diverse ECVs worldwide. Future studies should also work to formulate indices, such as the SWM, that can be applied across different geographical regions with minimal alteration. As climate change intensifies, cholera prediction models are vital components of disease prevention. Cholera is unlikely to be eradicated [5], but there are steps that can be taken to control its transmission and minimize its mortality. These steps are more effective the more time officials have to deploy them, so models that can provide significant lead times are critical.
Works Cited
[1] Campbell AM, Racault M-F, Goult S, Laurenson A. 2020. Cholera risk: a machine learning approach applied to essential climate variables. IJERPH. 17(24):9378.
[2] Christaki E, Dimitriou P, Pantavou K, Nikolopoulos GK. 2020. The impact of climate change on cholera: A review on the global status and future challenges. Atmosphere. 11(5):449.
[3] Fooladi M, Golmohammadi MH, Safavi HR, Singh VP. 2021. Fusion-based framework for meteorological drought modeling using remotely sensed datasets under climate change scenarios: resilience, vulnerability, and frequency analysis. Journal of Environmental Management. 297:113283.
[4] Huq A, Anwar R, Colwell R, McDonald MD, Khan R, Jutla A, Akanda S. 2017. Assessment of risk of cholera in Haiti following Hurricane Matthew. The American Journal of Tropical Medicine and Hygiene. 97(3):896–903.
[5] Jutla AS, Akanda AS, Islam S. 2010. Tracking cholera in coastal regions using satellite observations. JAWRA Journal of the American Water Resources Association. 46(4):651–662.
[6] Jutla A, Akanda AS, Huq A, Faruque ASG, Colwell R, Islam S. 2013. A water marker monitored by satellites to predict seasonal endemic cholera. Remote Sensing Letters. 4(8): 822-831.
[7] Khan R, Aldaach H, McDonald C, Alam M, Huq A, Gao Y, Akanda AS, Colwell R, Jutla A. 2019. Estimating cholera risk from an exploratory analysis of its association with satellite-derived land surface temperatures. International Journal of Remote Sensing. 40(13):4898–4909.
[8] Lai Y, Zhang J, Song Y, Gong Z. 2021. Retrieval and evaluation of chlorophyll-a concentration in reservoirs with main water supply function in Beijing, China, Based on Landsat Satellite Images. IJERPH. 18(9):4419.
[9] Leo J, Luhanga E, Michael K. 2019. Machine learning model for imbalanced cholera dataset in Tanzania. The Scientific World Journal. 2019:1–12.
[10] Moore GK. 1979. What is a picture worth? A history of remote sensing / Quelle est la valeur d’une image? Un tour d’horizon de télédétection. Hydrological Sciences Bulletin. 24(4):477–485.
[11] Ogata T, Racault M-F, Nonaka M, Behera S. 2021. Climate precursors of satellite water marker index for spring cholera outbreak in Northern Bay of Bengal coastal regions. International Journal of Environmental Research and Public Health. 18(19):10201.
[12] World Health Organization. 2021. Cholera annual report 2020. Weekly Epidemiological Record. 96:445–460.
[13] Xu M, Cao CX, Wang DC, Kan B, Xu YF, Ni XL, Zhu ZC. 2016. Environmental factor analysis of cholera in China using remote sensing and geographical information systems. Epidemiol Infect. 144(5):940–951.
Breast Cancer Screenings for Transgender Individuals
By Anisha Narsam, Neurobiology, Physiology, and Behavior, ’23
Author’s note: I hope to raise awareness about the barriers that transgender individuals face in order to obtain mammograms, and possible methods for increasing breast cancer screenings in this population. This article is meant for the general public and informs readers about some of the disparities that members of the LGBTQ community face, while also exploring methods that can be used to potentially bridge this gap in care. I chose this topic because I previously read an article about the disparities in cancer screenings in minority communities, and I wanted to research more about this topic specifically for transgender populations in relation to mammograms. Through this article, I hope readers can become more aware of how both transgender individuals and healthcare professionals lack knowledge on mammogram screening requirements, the barriers that can decrease mammogram rates, and methods that can improve breast cancer screening rates in transgender populations.
ABSTRACT
Objective: The aim is to analyze reasons for the gap in transgender breast cancer screenings, including the lack of proper screening guidelines and barriers to obtaining mammograms, and what can be done to alleviate this issue to improve healthcare for transgender individuals.
Methods: This review analyzed primary research articles from the past three years from PubMed and Google Scholar. The sources were found with the key words “transgender breast cancer screening” and “transgender mammograms”, and were used to determine the extent of the disparity of breast cancer screenings for transgender individuals, existing knowledge of transgender mammogram requirements, barriers to obtain screenings, and methods to combat this issue.
Results: Transgender individuals have decreased rates of breast cancer screenings compared to cisgender individuals due to healthcare workers’ lack of knowledge about transgender health and barriers for obtaining mammograms. Having more training for healthcare professionals, encouraging a more inclusive environment, and having organ inventories for patients to ensure all necessary screenings are met are a few ways to combat this issue and decrease disparities for transgender individuals to obtain these crucial mammograms.
INTRODUCTION
Transgender individuals have disproportionately fewer mammograms than cisgender individuals [1]. According to the CDC, mammograms are pictures of the breast taken with an X-ray that physicians use to detect breast cancer [2]. By analyzing the current breast cancer screening rates of transgender individuals, understanding current knowledge of healthcare professionals on transgender health, addressing barriers that can prevent breast cancer screenings, and exploring ways to increase the number of mammograms transgender individuals obtain, this paper aims to encourage mammogram screenings to benefit transgender health [1, 3-11]. These studies are crucial for recognizing the barriers that prevent transgender individuals from getting mammograms, while also exploring the ways to counter such barriers to ensure the transgender population obtains these preventative screenings. Current research presents decreased breast cancer screening rates and lack of concrete breast cancer screening guidelines for transgender patients compared to cisgender patients [1, 9-11]. Moreover, research shows limited understanding of transgender health for healthcare workers [4-5]. One of the main barriers to obtaining mammograms is anxiety, including emotional and financial distress [3, 7, 11]. However, current research also presents potential solutions to the problem, including having more training for healthcare personnel, and having organ inventories for patients to understand each individual’s unique needs to help alleviate this issue by increasing the mammogram rates of transgender individuals [6, 8, 11]. This review focuses on current literature that provides information about access to breast cancer screenings for transgender individuals and how access can be improved.
DISCUSSION
Screening Rates and Knowledge on Screening Requirements
Transgender individuals have decreased rates of mammograms compared to cisgender individuals [1, 9-11]. Both a survey of transgender individuals from Iowa and a similar survey of transgender people from Dallas, Texas found that transgender men have lower breast cancer screening rates than cisgender females [1, 9]. However, the percentage of transgender individuals obtaining mammograms varies across different parts of the country. The Iowa study found that 75% of transgender men had obtained mammograms, a lower percentage than the 94% of cisgender women who had [1]. On the other hand, the survey results from Texas indicate that only 40% of transgender males assigned female at birth have had mammograms at some point in their lives [9]. Thus, breast cancer screening rates for transgender men vary regionally, although in both cases they are lower than those of cisgender individuals [1, 9]. These variations may be due to cultural differences in obtaining screenings, different state guidelines for screenings, or different socioeconomic statuses among survey respondents, which can affect whether they can afford screenings [4, 11]. Moreover, researchers have also determined that rates of clinical breast exams, during which healthcare professionals use their hands to check for lumps in breast tissue, are even lower than the mammogram rates for transgender individuals [1, 12].
Transgender individuals also have limited knowledge of breast cancer screening requirements [9, 11]. In fact, according to researchers, over 65% of the sexual and gender minority community, which includes transgender individuals, is unaware of the screenings they require [11]. Other researchers confirm that there is a lack of knowledge regarding the healthcare needs of transgender individuals, which translates to decreased awareness among transgender populations themselves about the need for breast cancer screenings [8, 10]. Additionally, more than half of transgender individuals who were eligible for mammograms had not completed them, which, according to researchers, demonstrates a limited understanding of organ-specific screening requirements [8]. Healthcare professionals likewise have limited knowledge of transgender mammogram requirements [4-5, 8].
Healthcare Professionals’ Existing Knowledge and Practices
Radiologists, genetic counselors, and healthcare staff have limited information about transgender breast cancer screening requirements and are less comfortable asking patients for their pronouns [4-5, 8]. In fact, 65% of radiologists did not recognize the importance of breast cancer screening guidelines for transgender men without chest surgery, and genetic counselors did not have the proper knowledge about the importance of breast cancer screenings for transgender women on estrogen therapy [4-5]. This demonstrates how more guidelines and protocols in relation to transgender mammograms need to be emphasized in healthcare environments [4, 11]. Both genetic counselors and radiologists agree that more research should be done to improve healthcare for transgender populations [4-5]. Genetic counselors reported in a survey feeling comfortable asking both transgender and cisgender patients about their pronouns [5]. Similarly, most radiologists claimed they felt comfortable asking patients about sex and gender identity for mammograms [4]. However, a survey of radiologists found that only 13% of physicians actually record both the sex and gender of transgender patients [4].
Also, in practice, when researchers observed providers and staff in another study, most healthcare professionals hesitated to ask the patient for their pronouns as they claimed they were uncomfortable and scared of offending the patient [8]. This demonstrates how although some healthcare professionals may claim to be comfortable asking about the gender and sex of patients, they may not execute this in practice, possibly because they are scared of offending the patient [4, 5, 8]. Transgender individuals may still develop breast cancer whether or not they identify with their biological sex, so it is imperative that healthcare providers ask for both gender and sex for each individual so they can obtain the preventative breast cancer screenings they require [3]. However, there can be many barriers to prevent access to such screenings [3, 7, 10-11].
Barriers and Inclusivity
Two of the main barriers to obtaining mammograms are anxiety or emotional distress associated with the appointment, as well as discrimination [3, 7, 10-11]. Transgender individuals have a disproportionately higher prevalence of anxiety. In an interview, one transgender patient stated that he was anxious about getting a breast cancer screening because he did not want to stand out as the only man in a predominantly female division of the clinic, and this anxiety prevented him from getting screened [3, 7]. In fact, around half of transgender individuals do not get preventative cancer screenings because these screenings cause them more emotional distress, on average, than they do cisgender individuals [11]. Such anxiety can also stem from the fear of being treated as their gender assigned at birth rather than the gender they identify as [7]. Another possible source of emotional distress comes from not having proper guidance on what screenings are recommended [4, 11]. Although screening guidelines exist, there are no official nationwide screening recommendations that explain, for example, how hormone therapy or gender-affirming surgeries can affect the types of screenings needed [4, 11]. Financial insecurity is a third source of emotional distress: transgender individuals report lower incomes and less health insurance coverage, causing around 50% of this population to postpone preventative cancer screenings [3-4, 10-11]. Moreover, transgender individuals experience higher rates of discrimination than cisgender groups, with around 52% of transgender individuals experiencing some type of discrimination, both during cancer screenings and after receiving cancer treatment [3-4, 10]. Examples of such discrimination include the 20% of transgender individuals denied healthcare for being gender-nonconforming and the 20% denied physician care for being transgender, which can lead to distrust in healthcare providers and a reduction in mammograms for transgender individuals [11].
Lack of inclusivity in medical settings can be another barrier to obtaining breast cancer screenings, and transgender individuals and outside observers, including researchers, can hold conflicting views on what counts as inclusivity [3, 7]. For example, based on a survey sent to radiology device companies, some of which provide the technology necessary for mammograms, researchers found that all responding companies offered a third gender option of “other” when registering patients [3]. Note that the only companies that responded were those with more gender options than male and female; the companies that did not respond may still offer only two [3]. Based on this information, researchers concluded that having more options than male and female on radiology devices demonstrates a more inclusive environment for transgender patients [3]. However, in an interview, one transgender individual mentioned feeling offended when treated as the “other” just because they do not fit into either of the dominant gender groups, since it makes them feel alienated and unacknowledged [7]. This demonstrates how language that seems positive and inclusive to some people can be alienating to transgender individuals in practice. Such language can, in fact, dissuade transgender individuals from obtaining mammograms, presenting another barrier to receiving the care they need [7, 11]. It is crucial to combat such barriers to improve mammogram rates in transgender populations [3-4, 6-8, 11].
Methods to Improve Mammogram Rates
There are many methods that can be used to improve mammogram rates in transgender populations [3-4, 6-7, 11]. For example, training physicians and other hospital staff in methods for providing a more inclusive environment can help transgender patients feel more comfortable obtaining mammograms [3, 6, 11]. Providing more comfortable spaces, such as waiting rooms and changing rooms, at mammogram centers can also help transgender patients feel more reassured and less anxious [11]. Another method of increasing inclusivity is to offer LGBTQ-specific mammogram services, or specific days when the clinic is open only to LGBTQ individuals, so transgender patients can be reassured that the clinic is LGBTQ-friendly and the experience gender-affirming [7, 11]. It is also crucial for doctors and nurses themselves to be understanding and to listen to patients’ concerns, since this helps build trust between patient and provider [3, 11]. For example, one interviewed transgender patient mentioned that their physician not only affirmed their gender but also did not force or chide them into getting previous gender-related preventative screenings, which made the patient feel far more comfortable and respected [7]. Once this trust is built, transgender patients are more likely to follow the screening recommendations their physician provides, including for mammograms [6, 9]. Physicians can also help build knowledge about mammograms for transgender individuals by enrolling their patients in a national research database that consolidates data on the screenings provided to patients and the outcomes of those screenings. Such a database would give healthcare professionals nationwide a more holistic view of the rates of certain cancers in transgender populations and of which screenings should be emphasized for transgender patients to detect those cancers early, providing more concrete mammogram guidelines for the entire country [4]. Another way to ensure a more holistic understanding of each patient is to recommend organ-specific screenings, since each individual is unique, ensuring that patients do not miss any crucial cancer screenings, including mammograms [8]. In this way, healthcare professionals can ensure that no patient is left behind when it comes to obtaining such screenings.
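To make the organ-inventory idea concrete, the sketch below records which organs a patient has, independent of gender marker, and derives which screenings to discuss. The mapping and helper function are illustrative placeholders invented for this sketch, not clinical guidance.

```python
# Minimal sketch of an organ inventory driving a screening checklist.
# The organ-to-screening mapping is an illustrative placeholder.

SCREENING_BY_ORGAN = {
    "breast tissue": "mammogram",
    "cervix": "Pap test",
    "prostate": "prostate screening discussion",
}

def screenings_due(organ_inventory: set[str]) -> list[str]:
    """Return the screenings to discuss, based only on organs present."""
    return [test for organ, test in SCREENING_BY_ORGAN.items()
            if organ in organ_inventory]

# A transgender man who has not had chest surgery still has breast
# tissue, so a mammogram stays on his preventive-care checklist.
print(screenings_due({"breast tissue", "cervix"}))
```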
CONCLUSION
Overall, breast cancer screening rates are lower in transgender populations than in cisgender populations, accompanied by a lack of knowledge among healthcare professionals about transgender screening requirements and by barriers to screening [1, 3-5, 7-11]. However, it has been shown that promoting physician screening recommendations, keeping organ inventories, and providing a more supportive environment can help combat this issue [3-4, 6-8, 11]. Screening rates can be further improved by training healthcare providers and fostering a more inclusive environment, and further research is needed to show the effects of such training on transgender mammogram rates [3, 6, 11]. Whereas some researchers argue that healthcare environments are inclusive because radiology equipment offers a third gender option of “other”, transgender individuals themselves may not consider this language inclusive; it could instead make them feel sidelined in screening centers [3, 7]. Gathering more input from transgender individuals themselves on how they can be supported in breast cancer screening centers, and during the mammogram procedure itself, can increase comfort in healthcare environments and improve mammogram rates in this population [7].
REFERENCES
- Gilbert PA, Lee AA, Pass L, Lappin L, Thompson L, Sittig KW, Baker E, Hoffman-Zinnel D. 2020. Queer in the heartland: cancer risks, screenings, and diagnoses among sexual and gender minorities in Iowa. J Homosex [Internet]. 69(3):1-17. doi:10.1080/00918369.2020.1826832
- Centers for Disease Control and Prevention. What is a mammogram? Accessed Nov 18, 2022. Available from: https://www.cdc.gov/cancer/breast/basic_info/mammograms.htm.
- Matoori S, Donners R, Nuñez D, Nguyen-Duong S, Riopel C, Baumgartner M, Sartoretti E, Sartoretti T, Sartoretti-Schefer S, Volm T, Fröhlich JM, Forstner R, Koh D, Gutzeit A. 2022. Transgender health and medicine – are radiological devices prepared? EMJ Radiol [Internet]. 151:1-3. doi:10.1016/j.ejrad.2022.110320
- Sonnenblick E, Lebron-Zapata L, Yang R, Dialani V, Dontchos BN, Destounis S, Grimm L. 2022. Breast imaging for transgender individuals: assessment of current practice and needs. J Am Coll Radiol [Internet]. 19(2):221-231. doi:10.1016/j.jacr.2021.09.047
- Berro T, Zayhowski K, Field T, Channaoui N, Sotelo J. 2020. Genetic counselors’ comfort and knowledge of cancer risk assessment for transgender patients. J Genet Couns [Internet]. 29(3):342-351. doi:10.1002/jgc4.1172
- Pratt-Chapman M, Ward A. 2020. Provider recommendations are associated with cancer screening of transgender and gender-nonconforming people: a cross-sectional urban survey. Transgend Health [Internet]. 5(2):80-85. doi:10.1089/trgh.2019.0083
- Kerr L, Fisher CM, Jones T. 2021. “I’m not from another planet”: the alienating cancer care experiences of trans and gender-diverse people. Cancer Nurs [Internet]. 44(6):438-446. doi:10.1097/NCC.0000000000000857
- Ulrich I, Harless C, Seamon G, Kim A, Sullivan L, Caldwell J, Reed L, Knoll H. 2022. Implementation of transgender/gender nonbinary care in a family medicine teaching practice. J Am Board Fam Med [Internet]. 35(2):235-243. doi:10.3122/jabfm.2022.02.210182
- Polizopoulos-Wilson N, Kindratt T, Hansoti E, Pagels P, Cano JP, Day P, Gimpel N. 2021. A need assessment among transgender patients at an LGBTQ service organization in Texas. Transgend Health [Internet]. 6(3):175-183. doi:10.1089/trgh.2020.0048
- Ussher JM, Allison K, Perz J, Power R. 2022. LGBTQI cancer patients’ quality of life and distress: a comparison by gender, sexuality, age, cancer type and geographical remoteness. Front Oncol [Internet]. 12:1-27. doi:10.3389/fonc.2022.873642
- Lombardo J, Ko K, Shimada A, Nelson N, Wright C, Chen J, Maity A, Ruggiero ML, Richard S, Papanagnou D, Mitchell E, Leader A, Simone NL. 2022. Perceptions of and barriers to cancer screening by the sexual and gender minority community: a glimpse into the health care disparity. Cancer Causes Control [Internet]. 33(4):559-582. doi:10.1007/s10552-021-01549-4
- Centers for Disease Control and Prevention. What is breast cancer screening? Accessed Nov 18, 2022. Available from: https://www.cdc.gov/cancer/breast/basic_info/screening.htm.
Current and Potential Therapeutic Options for ALS Individuals
By Anna Truong, Neurobiology, Physiology, and Behavior, ’22
Author’s Note: I wrote this piece for an assignment in my UWP 104F course and felt very connected to it. I chose to write about ALS because my father was diagnosed with it when I was young. At the age of nine, I did not understand the gravity of becoming sick, or how much the world can change when someone important in your life passes away. I did not understand how impactful a disease could be until I experienced it as a family member. ALS has been a topic of interest to me ever since, from class presentations about interesting scientific topics to college research papers and literature reviews. This literature review is something I am proud of because it encompasses ALS as the disease that has affected me and my family. From this review, I hope readers learn more about ALS and how current research can pave the way for future research into its treatment.
Introduction
Amyotrophic Lateral Sclerosis (ALS) is a neurodegenerative disease characterized by progressive degeneration of motor neurons in the spinal cord and brain [1-5]. Motor neuron degeneration inhibits the brain’s ability to send the signals that control muscle movement. Two types of motor neurons are responsible for this communication: upper and lower motor neurons. Lesions in upper motor neurons interrupt the signal cascade to lower motor neurons, which relay the signals that drive muscle movement. This can lead to muscular atrophy, paralysis, and eventually death [1-5]. Various injuries, such as damage to the spinal cord or strokes, as well as other factors like oxidative stress induced by free radicals, contribute to the destruction of motor neurons [6]. Approximately 5 of every 100,000 individuals will be affected by ALS, and the average life expectancy after diagnosis is 2-5 years [2].
Many studies have focused on identifying the cause of motor neuron cell death and the genes involved in the development of the disease. Although the bodily mechanisms by which motor neurons degenerate remain unclear, they are thought to involve a non-cell-autonomous process [3]. The purpose of this literature review is to analyze current and potential treatments that may be effective for individuals with ALS. This article will focus on a current drug treatment called Edaravone, followed by potential treatments: astrocyte-based and cell-based therapies.
Drug Treatments
Edaravone is a free radical-scavenging drug that protects motor neurons from free radicals and oxidative stress damage in the central nervous system (CNS) [2,6]. Edaravone acts on oxidative stress by reducing the number of free radicals, slowing disease progression. In the absence of a cure, such treatment options have mainly contributed to prolonged survival [6].
ALS Functional Rating Scale
In this section, we will analyze the effect of Edaravone on disease progression through scoring of motor function by the revised ALS Functional Rating Scale (ALSFRS-R). The ALSFRS-R is an instrument designed for the clinical evaluation of the functional status of ALS patients and of treatment efficacy in clinical trials [6]. It measures 12 aspects of physical function, such as swallowing, breathing, and walking, scoring each ability from 4 (normal) to 0 (no ability), for a maximum total score of 48 and a minimum of 0.
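The scoring rule above reduces to a simple sum, sketched below in Python. The abbreviated item names follow the published scale, but the function itself is an illustration, not a clinical instrument.

```python
# Minimal sketch of ALSFRS-R scoring: 12 items, each rated 0 (no
# ability) to 4 (normal), summed to a total between 0 and 48.

def alsfrs_r_total(item_scores: dict[str, int]) -> int:
    assert len(item_scores) == 12, "ALSFRS-R has 12 items"
    assert all(0 <= s <= 4 for s in item_scores.values())
    return sum(item_scores.values())

items = ["speech", "salivation", "swallowing", "handwriting",
         "cutting food", "dressing/hygiene", "turning in bed",
         "walking", "climbing stairs", "dyspnea", "orthopnea",
         "respiratory insufficiency"]
print(alsfrs_r_total({item: 4 for item in items}))  # 48 = normal function
```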
Edaravone treatment on ALS patients
During normal disease progression, it is assumed that decline in functioning scores is almost linear [6]. When comparing ALSFRS-R scores between patients who received either placebo treatment or Edaravone treatment, there was a significantly faster decline in functional scores for those who received the placebo. This indicates a considerable loss in the ability to perform everyday tasks [2]. In conjunction with these results, a further study has shown greater improvements in ALSFRS-R scores for patients after beginning Edaravone treatment compared to the pre-treatment period [7]. The pre-treatment period lost an average of 4.7 points on the ALSFRS-R whereas the treatment period showed a smaller average loss of 2.3 points over the same time duration [7]. This indicates possible clinical efficacy for Edaravone due to its ability to effect a more gradual decline.
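The comparison above amounts to a small calculation; under the reported averages, treatment roughly halved the rate of functional decline.

```python
# Worked arithmetic for the pre- vs during-treatment comparison [7]:
# average ALSFRS-R points lost over equal-length periods.
pre_loss, treated_loss = 4.7, 2.3
relative_slowing = (pre_loss - treated_loss) / pre_loss
print(f"{relative_slowing:.0%} slower functional decline")  # ~51%
```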
In addition, compared to placebo, Edaravone remains effective for up to a year, after which survival rates start to decline [2]. Edaravone’s effectiveness is also more pronounced in the early stages of ALS progression, but its long-term effects have not been fully evaluated, so results past a year are unclear. Further limitations of these studies, including the nonlinear difference in decline between early-stage and end-stage ALS, require more research before the long-term health benefits of Edaravone can be affirmed [2, 6, 7]. Therefore, as a marketed drug, its full effectiveness is difficult to confirm given the lack of positive results in the life expectancy of the target population. On the other hand, no detrimental effects or worsening of symptoms due to Edaravone were observed during patient trials beyond a few side effects, including bruising, headaches, and hypoxia [6]. Due to these factors, Edaravone remains a partially beneficial drug.
Potential Therapeutic Options
Although Edaravone’s effectiveness is still actively being deciphered, studies have examined whether other cellular targets within the brain, as well as stem cells, could help slow or halt disease progression and thus serve as effective treatments for ALS; astrocytes are one such target [1,3,4]. Astrocytes are a type of glial cell within the CNS that becomes inflamed in the diseased state [1]. Most of the following research involves SOD1-G93A transgenic mice, which express the human SOD1 gene carrying the G93A mutation. This is an important mouse model for studying ALS because it presents many of the pathological symptoms experienced by patients, including motor impairment and motor neuron death, allowing for an analogous simulation [1].
In current treatment development, SOD1-G93A transgenic mice are utilized to study astrocytes, a promising target for effective treatments. A growing number of studies show astrocytes to be crucially involved in ALS through their influence on motor neuron fate and disease progression. The studies discussed below present multiple experiments on the SOD1-G93A transgene and explain how elimination and/or alteration of this gene can help slow prominent signs of disease and extend lifespan [1,5].
Role of Astrocytes in ALS
Upon a specific signal within the CNS, astrocytes can transform into either their reactive A1 state, characterized by promotion of neurodegeneration and toxicity, or their neuroprotective A2 state, which promotes healing and repair of injury [3,9]. In ALS patients, as in mice carrying the mutant SOD1-G93A transgene, reactive A1 astrocytes are dominant, contributing toxic components that participate in ALS pathology [1].
Among these pathologies, researchers have investigated neuroinflammation, characterized by inflammation of the nervous tissue, to determine whether minimizing it reduces reactive astrocytes [1]. Neuroinflammatory stimuli like lipopolysaccharide (LPS) trigger a signal transduction cascade that secretes immunologically active molecules, including IL-1α, TNFα, and C1q, which transform resting astrocytes into their neurotoxic A1 state [1]. These reactive astrocytes lose their regular functions and secrete factors toxic to neurons [1,3]. Moreover, astrocytes isolated from ALS patients were found to be toxic to healthy, cultured motor neurons [3]. This indicates the involvement of astrocytes in the motor neuron death that drives progressive decline of motor function [3]. Lowering the prevalence of neuroinflammation may therefore decrease motor neuron death and delay the progression of ALS.
Figure. Communication between motor neurons, astrocytes, and immune cells in a healthy individual compared with an ALS individual.
Astrocyte-Based Therapy
To minimize the neuroinflammatory effects of ALS, Guttenplan et al. determined that knockout (the genetically engineered absence) of IL-1α, TNFα, and C1q in SOD1-G93A mice prevented the production of reactive astrocytes [1]. This triple knockout also suggested that neuroinflammatory reactive astrocytes could become a therapeutic target for ALS. The triple knockout mice presented lower levels of the reactive astrocyte marker C3 and had lifespans extended by over 50% compared to regular SOD1-G93A mice [1]. Treatments that implement this mechanism of lowering neuroinflammation could mark a turning point in increasing the efficacy of ALS therapies.
In addition, diagnosis typically occurs only after symptoms appear [1]. One approach to treatment involves restoring the normal functionality of endogenous astrocytes by transplanting healthy astrocytes into patients [3]. These transplanted healthy astrocytes can provide neuroprotection by reducing misfolded proteins in motor neurons. However, they can also transform into neurotoxic A1 astrocytes in the diseased environment of the CNS [3]. The mechanisms through which transplanted astrocytes act are not yet thoroughly understood, yet they provide a promising avenue for an ALS-targeted therapy [1,3]. Delaying disease progression may be more effective with a combination of therapies addressing both reactive astrocytes and motor neurons than with individual therapies [1].
Cell-Based Therapy
Another approach that has been studied as a potential therapy for ALS involves stem cells. Mesenchymal stem cells (MSCs) are adult multipotent precursors that can be prompted to release the neurotrophic factors normally released by A2 astrocytes, and they have been shown to be beneficial in the regeneration of healthy cells [3]. Transplantation of a patient's own MSCs, induced to secrete neurotrophic factors, showed early signs of safety and treatment effectiveness [3].
Furthermore, a specific stem cell therapy called "Neuro-Cells," a combination of MSCs and hematopoietic stem cells (HSCs), was administered along with anti-inflammatory measures to both SOD1-G93A mice and FUS-tg mice, a second ALS model expressing a mutant FUS transgene [4]. In both the SOD1 and FUS-tg models, inflammation contributes to disease progression, allowing the two mutations to be compared [4]. When tested on rats subjected to spinal cord injury, this mixture had an anti-inflammatory effect, improving motor function and decreasing concentrations of proinflammatory cytokines in the cerebrospinal fluid [4]. Muscle degeneration in FUS-tg mice was also assessed during Neuro-Cell injections, and muscular atrophy was found to be partly rescued by the mixed stem cell therapy. To verify these results, Neuro-Cells were administered to SOD1-G93A mice, which showed improved motor function similar to that of the FUS-tg mice, providing further evidence of disease counteraction [4]. These signs of efficacy in preclinical studies of MSC and HSC transplantation indicate that stem cells may offer beneficial treatment through reduced motor neuron death, prolonged survival, and improved motor performance [3]. Accordingly, de Munter et al. argue that stem cell therapy should be utilized as part of cell-based treatment of ALS, given the knowledge already present in this field [4]. Even so, more research is needed to define the anti-inflammatory mechanisms in ALS pathology and the other effects that "Neuro-Cells" have on ALS.
Possible stem cell therapies that could be used to treat ALS
Discussion/Conclusion
While there are drug treatments currently available for ALS, research on these drugs continues in order to better ensure their quality and effectiveness. Edaravone's ability to slow disease progression is minimal in patients past the early stages of ALS and ineffective toward end-stage ALS. Decline in ALS patients occurs non-linearly, with a rapid drop toward the end stage, so the clinical effects of Edaravone may not be beneficial there. Its therapeutic effects, and the extent to which observed improvements are attributable to the drug itself, have yet to be better understood. Although it is currently used as a treatment option, Edaravone's efficacy could be further improved.
In parallel, potential treatment options such as astrocyte-based and cell-based therapies play an important role in the future of ALS. Targeting astrocytes and neuroinflammation and utilizing stem cell therapy can help slow disease progression. However, much as with the current drugs, there is still much to understand about other subpopulations of astrocytes and stem cells that could contribute to ALS pathology. The interplay between therapies is important to note, as combinations could provide greater benefits to patients seeking future treatments. Although treatment options are currently limited, these studies suggest therapeutic approaches that can be optimized to halt or slow disease progression. Stem cells are already encouraged as part of treatment for ALS patients, suggesting their potential to reduce inflammation and thereby minimize motor neuron death. Additionally, astrocytes are becoming a major direction in ALS research due to their direct involvement in the motor neuron death associated with the disease. Astrocytes may become the center of research in the near future and lead to more efficient slowing of disease progression than the currently approved drugs. With more studies, knowledge of the cellular mechanisms contributing to the deterioration of motor neurons in ALS can lead to promising treatments with greater efficacy against the disease. In due time, we can hope to see the average life expectancy after diagnosis rise well beyond the current 2-5 years.
References:
- Guttenplan, K. A., Weigel, M. K., Adler, D. I., et al. (2020). Knockout of reactive astrocyte activating factors slows disease progression in an ALS mouse model. Nature communications, 11(1), 3753. https://doi.org/10.1038/s41467-020-17514-9
- Shefner, J., Heiman-Patterson, T., Pioro, E. P., Wiedau-Pazos, M., Liu, S., Zhang, J., Agnese, W., & Apple, S. (2020). Long-term edaravone efficacy in amyotrophic lateral sclerosis: Post-hoc analyses of Study 19 (MCI186-19). Muscle & nerve, 61(2), 218–221. https://doi.org/10.1002/mus.26740
- Izrael, M., Slutsky, S. G., & Revel, M. (2020). Rising Stars: Astrocytes as a Therapeutic Target for ALS Disease. Frontiers in neuroscience, 14, 824. https://doi.org/10.3389/fnins.2020.00824
- de Munter, J., Shafarevich, I., Liundup, A., et al. (2020). Neuro-Cells therapy improves motor outcomes and suppresses inflammation during experimental syndrome of amyotrophic lateral sclerosis in mice. CNS neuroscience & therapeutics, 26(5), 504–517. https://doi.org/10.1111/cns.13280
- Apolloni, S., Amadio, S., Fabbrizio, P., Morello, G., Spampinato, A. G., Latagliata, E. C., Salvatori, I., Proietti, D., Ferri, A., Madaro, L., Puglisi-Allegra, S., Cavallaro, S., & Volonté, C. (2019). Histaminergic transmission slows progression of amyotrophic lateral sclerosis. Journal of cachexia, sarcopenia and muscle, 10(4), 872–893. https://doi.org/10.1002/jcsm.12422
- Luo, L., Song, Z., Li, X., Huiwang, Zeng, Y., Qinwang, Meiqi, & He, J. (2019). Efficacy and safety of edaravone in treatment of amyotrophic lateral sclerosis-a systematic review and meta-analysis. Neurological sciences : official journal of the Italian Neurological Society and of the Italian Society of Clinical Neurophysiology, 40(2), 235–241. https://doi.org/10.1007/s10072-018-3653-2
- Sawada H. (2017). Clinical efficacy of edaravone for the treatment of amyotrophic lateral sclerosis. Expert opinion on pharmacotherapy, 18(7), 735–738. https://doi.org/10.1080/14656566.2017.1319937
- Liu, J., & Wang, F. (2017). Role of Neuroinflammation in Amyotrophic Lateral Sclerosis: Cellular Mechanisms and Therapeutic Implications. Frontiers in immunology, 8, 1005. https://doi.org/10.3389/fimmu.2017.01005
- Liddelow, S., & Barres, B. (2015). SnapShot: Astrocytes in Health and Disease. Cell, 162(5), 1170–1170.e1. https://doi.org/10.1016/j.cell.2015.08.029
Physiological and Psychological Factors in Developing Comorbid Mood Disorders in Complex Regional Pain Syndrome Patients
By Clara Brewer, Neurobiology, Physiology, and Behavior ’22
Author’s Note:
In 2015, I was diagnosed with a rare pain disorder: complex regional pain syndrome (CRPS). Not only does this disorder cause unimaginable pain, it is also virtually invisible to others, creating a discrepancy between the outside world's perception of CRPS and the actual struggle that CRPS patients deal with, both physically and emotionally. Current trends in CRPS treatment focus on the physical aspects of the disorder: increasing mobility and use of the affected limb. Oftentimes, this approach fails to treat the simultaneous psychological changes that can increase a patient's risk of developing a concurrent mental health disorder. Even though I had access to a top CRPS treatment facility, I still experienced depression and anxiety, which made my recovery from CRPS much harder. While writing a different paper focused on educating newly diagnosed individuals on the causes, symptoms, and available treatment options for CRPS, I realized that most of my findings addressed the physical symptoms and not the psychological changes. This discrepancy between the available treatment options and both my own experiences and those of many others diagnosed with CRPS inspired me to try to understand the connection between CRPS and the increased diagnosis of comorbid mental health disorders.
While many students who read this paper may not go on to change the treatment of one rare disorder, it is important for anyone who wants to go into the medical field to begin reshaping their approach to medicine by reading articles like mine. I hope to shed light on the importance of reevaluating current treatment protocols for a wide range of disorders to include more mental health support for patients, a topic directed at students hoping to pursue a career in the medical field. By viewing a medical diagnosis, in this case CRPS, as an interconnection between mind and body, future medical professionals will be able to address disorders holistically, instead of treating only the more obvious physical symptoms.
Complex regional pain syndrome (CRPS), a neuroinflammatory disease, ranks number one on the McGill Pain Scale, topping fibromyalgia, cancer, and amputation without anesthetic. CRPS typically develops after trauma to an arm or leg (e.g., a broken bone, a dislocated joint, or surgical trauma to a limb) that results in a disproportionately high sensation of pain. This painful response is chronic and characterized by constant pain with additional flare-ups that vary in duration from person to person. As a result, those with CRPS often experience perpetual pain that can be made even worse by stress.
After the initial injury and subsequent chronic pain, a CRPS diagnosis follows the Budapest Diagnostic Criteria, under which patients must report at least one symptom in three of four categories and must display at least one sign in two of four categories at the time of evaluation. The categories are as follows: sensory, vasomotor (relating to blood vessels), sudomotor (relating to sweat glands), and motor/trophic (relating to muscles and bones). Symptoms include swelling of the limbs, skin discoloration, abnormal sweat response, and painful responses to non-painful or slightly painful stimuli, among others [1].
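To make the two category thresholds concrete, below is a minimal sketch of the counting logic in Python, assuming simple yes/no inputs per category. It is purely illustrative: the function name and inputs are hypothetical, and it omits the criteria's other requirements (such as continuing pain disproportionate to the inciting event), so it is not a diagnostic tool.

```python
# Illustrative sketch of the Budapest Criteria's two counting thresholds;
# a hypothetical helper, not a diagnostic tool.
CATEGORIES = ["sensory", "vasomotor", "sudomotor", "motor_trophic"]

def meets_category_thresholds(reported: dict, observed: dict) -> bool:
    """reported/observed map each category to True when at least one
    symptom (patient-reported) or sign (seen at evaluation) is present."""
    n_reported = sum(bool(reported.get(c)) for c in CATEGORIES)
    n_observed = sum(bool(observed.get(c)) for c in CATEGORIES)
    # Symptoms in >= 3 of 4 categories AND signs in >= 2 of 4 categories
    return n_reported >= 3 and n_observed >= 2

# Example: symptoms reported in three categories, signs observed in two
reported = {"sensory": True, "vasomotor": True, "sudomotor": True}
observed = {"sensory": True, "motor_trophic": True}
print(meets_category_thresholds(reported, observed))  # True
```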
CRPS affects 200,000 people in the United States each year. Among those affected, half are also diagnosed with a psychiatric disorder [2]. More specifically, CRPS patients have a much higher prevalence of depression than the general population, with 15.6 percent of CRPS patients diagnosed with depression compared to 3.4 percent of people worldwide [2]. Since the pain of CRPS is so intense and the length of a painful flare-up varies from person to person, many patients come to develop a fear of the pain itself, altering their fear-brain circuits and creating a negative relationship between pain and their psychological state [3]. Mood disorders like depression and anxiety also have a similar pathophysiology to CRPS, so the onset of CRPS can instigate the development of a comorbid mood disorder without an external trigger like grief, loss, or substance abuse. Within the physiology of CRPS, cytokine and astrocyte levels become dysregulated, mimicking the pathophysiology noted in certain psychiatric disorders and thus increasing rates of comorbid psychiatric disorders. Similarly, the fear-brain circuit is altered during the onset and prolonged management of CRPS, eventually transforming pain-related fear into an overall manifestation of anxiety.
While the number of patients being diagnosed with comorbid mood disorders is growing, the current CRPS treatment protocol does not typically include the management of these psychiatric disorders. As more research is conducted to explore the physiological and psychological changes that occur with the onset of CRPS, mounting evidence suggests more mental health resources should be provided to alleviate CRPS-related symptoms, both physiological and psychological, and speed up the recovery timeline [4].
Dysregulation of Biomarkers in CRPS and Mood Disorders
TNF-α and Cortisol
During CRPS, depression, and anxiety, plasma levels of the pro-inflammatory cytokine tumor necrosis factor-α (TNF-α) increase by a statistically significant degree [5]. This cell-signaling protein is integral to maintaining a healthy immune response and stimulates the release of corticotropin-releasing factor. In CRPS, TNF-α activates and sensitizes primary afferent nociceptors, leading to the characteristic excessive pain [5]. TNF-α plays an important role not only in the development of CRPS, but also in the development and severity of depression. In fact, the more severe the reported depressive symptoms, the higher the concentration of plasma TNF-α [6].
The increased concentration of TNF-α may be explained by its role in regulating the hypothalamic-pituitary-adrenal (HPA) axis, which is responsible for neuroendocrine modulation of the body's stress response. With an upregulation of pro-inflammatory cytokines like TNF-α in CRPS, the HPA axis increases the secretion of corticotropin-releasing factor, adrenocorticotropic hormone, and eventually cortisol. Prolonged elevations of cortisol shift tryptophan usage from the tryptophan-serotonin pathway to the tryptophan-kynurenine pathway [7]. The shift away from the tryptophan-serotonin pathway greatly limits the production of serotonin, eventually interfering with mood stabilization, sleep cycle regulation, and neuronal communication [8]. Tryptophan metabolites are instead used in the tryptophan-kynurenine pathway, leading to the production of two neurotoxic chemicals: 3-hydroxyanthranilic acid and quinolinic acid.
Quinolinic acid contributes to the neurodegeneration seen in conditions like depression through free radical formation, mitochondrial malfunction, and energy store depletion. These changes trigger the mass destruction of neuronal cells, which leads to the degeneration of brain functions such as memory and learning [9]. In fact, it has been suggested that elevated TNF-α is one precursor to the development of depression [5]. Therefore, TNF-α could be a potential biomarker for comorbid depression in CRPS patients.
Figure 1: TNF-α's role in the dysregulation of tryptophan metabolism. Under normal conditions, tryptophan is transformed into serotonin in the brain and gut, producing regulatory effects on mood and the sleep cycle as well as promoting healthy communication between neurons. Under inflammatory conditions, like those seen in CRPS and depression, TNF-α secretion inhibits the production of serotonin through the upregulation of adrenocorticotropic hormone. Tryptophan is then utilized in the liver for the production of neurotoxins through the kynurenine pathway.
Catecholamines
Following the acute stage of CRPS, stimulation of the sympathetic nervous system and the resulting release of catecholamines increase the production of another important cytokine, interleukin-6 (IL-6). When the body undergoes acute stress, the sympathetic nervous system is activated and releases two catecholamines, epinephrine and norepinephrine. These hormones increase blood pressure, heart rate, and breathing rate, and dilate the pupils. Catecholamines are also regulators of IL-6, a cytokine that plays an important role in nociceptor sensitivity while also increasing the chances of developing comorbid anxiety. Norepinephrine upregulates the translation of IL-6 by 49-fold, thereby increasing the plasma concentration of IL-6 after sympathetic stimulation [10]. In the case of CRPS, the initial trauma to the affected limb activates this fight-or-flight response and increases IL-6 levels. The chronic elevation of IL-6 not only leads to chronic inflammation, but also increases nociceptor sensitization and the transmission of signals between sensory neurons, both of which are linked to chronically elevated levels of pain [11].
Not only does IL-6 play an important role in inflammatory and pain responses and the onset of CRPS, it also modulates the expression of another cytokine, interleukin-1 (IL-1). IL-1 is critical to the onset of anxiety-type symptoms: it dampens the activation of the endocannabinoid receptor CB1R on GABAergic neurons, which limits GABA's anti-anxiety effects [12]. In fact, patients with general anxiety disorder had significantly elevated levels of IL-6 through environmental stimulation of the sympathetic nervous system [7]. Because CRPS itself excites the sympathetic nervous system, no external stimulation is needed to initiate anxiety symptoms. For this reason, it has been suggested that elevated levels of IL-6 from the onset of CRPS can stimulate IL-1 and induce comorbid anxiety [12].
Figure 2: Sympathetic stimulation following acute CRPS results in an increase in catecholamines. The subsequent upregulation of interleukin-6 and interleukin-1 dampens GABA’s anti-anxiety effects and leads to increased nociceptor activation.
Astrocytes
In conjunction with the changes to cytokines in CRPS, anxiety, and depression, subsequent changes to the functioning of astrocytes have been noted in all three diagnoses [13]. Astrocytes are a subclass of glial cells that serve a supportive function for neurons. Under typical conditions, astrocytes are responsible for modulating neuroendocrine functions, regulating synaptic transmission, and regulating glutamate levels in the body [14]. In CRPS, astrocytes become upregulated and activated by stimulation from excess pro-inflammatory cytokines, changing their gene expression to become A1 reactive astrocytes. A1 reactive astrocytes then secrete neurotoxins and more pro-inflammatory cytokines [15]. The activation of A1 reactive astrocytes also induces higher levels of glial fibrillary acidic protein, thus increasing the number of glial glutamate transporters on astrocytes [16]. This hyperactivity and hypersensitivity of astrocytes to glutamate triggers calcium release that increases neurotransmission within nociceptors and ultimately contributes to the intense and chronic pain associated with CRPS [15].
Additionally, patients with major depressive disorder show a mean increase of 35 μmol/L in cortical glutamate concentration, a statistically significant increase that is indicative of the severity of depressive symptoms [17]. The shift in astrocyte gene expression and the upregulation of glutamate with the onset of CRPS not only increase nociceptor activation but also drive neurodegeneration by reinforcing the tryptophan-kynurenine pathway discussed earlier. Interestingly, the increased concentration of glutamate inhibits the production of kynurenic acid and instead promotes the production of quinolinic acid [18]. As discussed, quinolinic acid is a potent neurotoxic compound that can lead to the neurodegeneration associated with depression. The upregulation and activation of A1 reactive astrocytes in response to inflammation from CRPS increases extrasynaptic glutamate concentrations, causing hyperactivation of nociceptors and increased production of quinolinic acid. With this in mind, there is evidence that the pathophysiology of astrocytes during CRPS may increase the risk of comorbid depression.
CRPS Psychology: Cycle of Pain-Related Anxiety
The fear of pain, also known as harm avoidance, complicates the treatment of chronic pain conditions like CRPS. Certain treatments, like the physical or occupational therapy methods used in CRPS rehabilitation, can quickly become unsuccessful if patients begin to avoid activities that increase symptoms or pain. These treatment techniques include graded motor imagery, range-of-motion exercises, mirror therapy, desensitization, and electrical stimulation. Although these rehabilitation techniques may temporarily increase CRPS-related pain, therapy is an essential part of treatment. By avoiding painful activities, patients not only allow the CRPS to become increasingly worse, they also cognitively reinforce the fear of pain.
This reinforcement is seen in the fear-learning neural pathway, a series of neurons extending from the left amygdala to the hippocampus, cerebellum, brainstem, and other parts of the central nervous system. The chronic pain of CRPS and the subsequent fear of pain continuously activate neurons that feed into this fear-learning circuit, strengthening the connection between the left amygdala, the fear control center, and the hippocampus, the learning and memory center. This repetitive activation ultimately intensifies the fear-brain circuit [19]. Consequently, the cycle of pain-related anxiety begins, transitioning from a fear of pain to an avoidance of pain, which then further reinforces the fear of pain [2]. This phenomenon has been identified in multiple studies, suggesting a more quantitative link between CRPS and anxiety.
In one study, a group of 64 CRPS patients was evaluated to determine psychological comorbidities. Twenty-eight individuals received a psychiatric diagnosis following the onset of CRPS, with 10 of those 28 diagnosed with anxiety disorders [20]. Not only did the study reveal these diagnoses, it also found that increased anxiety was directly linked to increased pain through the fear-brain pathway [13]. In fact, another study reported that 70 percent of CRPS patients had elevated pain-related fear scores [19]. While the anticipation of pain can elicit a pain response, it can also lead to avoidance of daily responsibilities, physical inactivity, disability, poorer long-term recovery, and higher rates of anxiety and other mood disorders [20]. As a result, the altered fear-brain circuits associated with CRPS increase the likelihood of developing comorbid anxiety.
Conclusion
CRPS alters the body's physiology and changes certain psychological processes, increasing the chances of developing a comorbid psychiatric disorder. With the dysregulation of cytokines and astrocytes, the immune system's functioning is disturbed, leading to abnormal levels of kynurenine and serotonin similar to those seen in depression [5]. As the level of kynurenine increases, so does the concentration of extrasynaptic glutamate, upregulating processes that signal both pain and depression [9]. The fear-brain circuit is also altered as pain signals become stronger and more frequent with CRPS, catastrophizing pain and ultimately leading to elevated levels of both pain and anxiety [3]. While pain itself can be debilitating, the simultaneous occurrence of pain and comorbid psychiatric disorders seen in CRPS can lead to avoidance of daily life, thereby worsening both disorders.
Today, most patients managing CRPS are reluctant to express their psychological symptoms or seek help on their own, yet neglecting the psychological side often worsens the symptoms of their diagnosis and makes recovery more difficult [19]. Current research suggests that CRPS treatment should shift toward the psychological components that intensify pain in order to treat CRPS holistically. Over the past few years, more studies have explored the relationship between CRPS and psychiatric disorders, but there has been less research into treatments that would help with both disorders simultaneously. Just as CRPS, an invisible disorder, is often misdiagnosed, psychological symptoms may be overlooked or undertreated when physiological responses take priority. A new perspective on CRPS that acknowledges this association is needed to gain a more comprehensive understanding of the disorder.
References:
- Wie, C., Gupta, R., Maloney, J. et al. 2021. Interventional Modalities to Treat Complex Regional Pain Syndrome. Curr Pain Headache Rep. 25, doi:10.1007/s11916-020-00904-5
- Bean, J., Johnson, H., Heiss-Dunlop, W., Lee, C., & Kydd, R. 2015. Do psychological factors influence recovery from complex regional pain syndrome type 1? A prospective study. Pain. 156(11), 2310–2318, doi:10.1097/j.pain.0000000000000282
- Antunovich, D., Horne, J., Tuck, N., Bean, D. 2021. Are Illness Perceptions Associated with Pain and Disability in Complex Regional Pain Syndrome? A Cross-Sectional Study. Pain Medicine. 22(1): 100–111, doi:10.1093/pm/pnaa320
- Park HY, Jang YE, Oh S, Lee PB. 2020. Psychological Characteristics in Patients with Chronic Complex Regional Pain Syndrome: Comparisons with Patients with Major Depressive Disorder and Other Types of Chronic Pain. J Pain Res. 13:389-398, doi:10.2147/JPR.S230394
- Üçeyler, N., Eberle, T., Rolke, R., Birklein, F., Sommer, C. 2007. Differential expression patterns of cytokines in complex regional pain syndrome. Pain. 132(2007): 195-205. doi: 10.1016/j.pain.2007.07.031
- Zou, W., Feng, R., Yang, Y. 2018. Changes in the serum levels of inflammatory cytokines in antidepressant drug-naive patients with major depression. Plos One. 13(6): e0197267. doi: 10.1371/journal.pone.0197267
- Strong, J., Jeon, S., Jeon, S., Zhang, J., & Kim, Y. 2020. Glial Cells and Pro-inflammatory Cytokines as Shared Neurobiological Bases for Chronic Pain and Psychiatric Disorders. Overlapping Pain and Psychiatric Syndromes: Global Perspectives. doi:10.1093/med/9780190248253.003.0003
- Carhart-Harris, R. & Nutt, D. 2017. Serotonin and brain function: a tale of two receptors. J Psychopharmacol. 31(9): 1091-1120. doi: 10.1177/0269881117725915
- Pérez-De La Cruz, V., Carrillo-Mora, P., & Santamaría, A. 2012. Quinolinic Acid, an endogenous molecule combining excitotoxicity, oxidative stress and other toxic mechanisms. Int J Tryptophan Res. 5: 1–8, doi: 10.4137/IJTR.S8158
- Burger, A., Benicke, M., Deten, A., Zimmer, H. G. (2001). Catecholamines stimulate interleukin-6 synthesis in rat cardiac fibroblasts. Am J Physiol Heart Circ Physiol. 281: H14-H21. doi:10.1152/ajpheart.2001.281.1.H14
- Zhou, YQ., Liu, Z., Liu, ZH., Chen, SP., Li, M., Shahveranov, A., Ye, DW., & Tian, YK. 2016. Interleukin-6: an emerging regulator of pathological pain. J Neuroinflammation. 13: 141. doi: 10.1186/s12974-016-0607-6
- Rossi, S., Sacchetti, L., Napolitano, F., De Chiara, V., Motta, C., Studer, V., Musella, A., Barbieri, F., Bari, M., Bernardi, G., Maccarrone, M., Usiello, A., Centonze, D. 2012. Interleukin-1β causes anxiety by interacting with the endocannabinoid system. J Neurosci. 32(40): 13896-13905. doi: 10.1523/JNEUROSCI.1515-12.2012
- Ji, RR., Donnelly, C.R. & Nedergaard, M. 2019. Astrocytes in chronic pain and itch. Nat Rev Neurosci. 20: 667–685, doi:10.1038/s41583-019-0218-1
- Jauregui-Huerta, F., Ruvalcaba-Delgadillo, Y., Gonzalez-Castaneda, R., Garcia-Estrada, J., Gonzalez-Perez, Luquin, S. 2010. Response of glial cells to stress and glucocorticoids. Curr Immunol Rev. 6(3): 195-204. doi: 10.2174/157339510791823790
- Li, T., Chen, X., Zhang, C., Zhang, Y., & Yao, W. 2019. An update on reactive astrocytes in chronic pain. J Neuroinflammation. 16: 140. doi: 10.1186/s12974-019-1524-2
- Wesseldijk, F., Fekkes, D., Huygen, F., van de Heide-Mulder, M., Zijlstra, F. 2008. Increased plasma glutamate, glycine, and arginine levels in complex regional pain syndrome type 1. Acta Anaesthesiol Scand. 52(5): 688-94. doi: 10.1111/j.1399-6576.2008.01638.x
- Mitani, H., Shirayama, Y., Yamada, T., Maeda, K., Ashby, C., Kawahara, R. 2006. Correlation between plasma levels of glutamate, alanine and serine with severity of depression. Prog Neuropsychopharmacol Biol Psychiatry. 30(6): 1155-1158. doi: 10.1016/j.pnpbp.2006.03.036
- Schwarcz, R. & Stone, T. 2018. The kynurenine pathway and the brain: challenges, controversies and promises. Neuropharmacology. 122(Pt B): 237-247. doi: 10.1016/j.neuropharm.2016.08.003
- Simons, L. E. 2016. Fear of pain in children and adolescents with neuropathic pain and complex regional pain syndrome. Pain. 157: 90-97, doi:10.1097/j.pain.0000000000000377
- Brinkers, M., Rumpelt, P., Lux, A., Kretzschmar, M., & Pfau, G. 2018. Psychiatric Disorders in Complex Regional Pain Syndrome (CRPS): The Role of the Consultation-Liaison Psychiatrist. Pain Res Manag. doi:10.1155/2018/289436
Treatments for Eye Strain From Screen Exposure
By Anisha Narsam, Neurobiology, Physiology and Behavior ‘23
Author’s Note: I hope to raise awareness about treatments for eye strain from screen exposure because of the current pandemic and the increase in online interactions. This article is meant for students and individuals who work on devices with screens, such as computers or tablets, and want to treat their eye strain. I chose this topic because I have noticed an increase in my eye strain since the pandemic began and I wanted to research how to alleviate this condition for myself and for my peers. Through this article, I hope readers can understand the effectiveness of a range of treatments for eye strain from screen exposure with my analysis of seven peer-reviewed journal articles.
Abstract
Excessive screen time, due to remote learning, is dramatically increasing the incidence of eye strain. Since this condition can elevate tiredness and reduce an individual’s ability to concentrate, promising treatments must be considered to avoid further ocular harm. Previous research on eye strain shows that when people spend a lot of time on their computer, they can benefit from taking frequent breaks away from their screens. However, with increased reliance on technology for everyday tasks, these techniques are not enough. As a result, treatments must be developed and implemented to counter the degenerative effects of long-term eye strain, which may eventually lead to headaches and farsightedness. This paper analyzes seven peer-reviewed journal articles centered around the effectiveness of each treatment on eye strain. Supplemental medications and ergonomic techniques show promising results for combatting dry eye and ocular pain, specific symptoms of eye strain. Studies evaluating the effectiveness of blue light glasses demonstrate conflicting results. Further research and implementation of these methods can decrease eye strain symptoms while improving concentration.
Introduction
Eye strain is one of the most common ocular conditions faced by adults and students around the world [1]. As of 2017, around 64 to 90 percent of computer users reported eye strain symptoms [2]. Physiologically, these symptoms result from repetitive rapid eye movements between the keyboard, screen, and other documents [2]. With more interactions taking place virtually because of the COVID-19 pandemic, addressing this condition is crucial for decreasing headaches and improving focus when completing tasks on screens. Many studies have focused only on treatment through minor modifications of a person's behavioral tendencies, such as following the 20/20/20 rule, which asks individuals to look at something 20 feet away from their screens for 20 seconds every 20 minutes [3]. Since taking frequent screen breaks may not be enough for severe eye strain, the purpose of this literature review is to evaluate the effectiveness of medications [1, 4, 5], blue light-blocking glasses [6, 7], and ergonomic techniques [2, 3] in combating this condition. By evaluating these treatments, we can determine reliable methods for alleviating eye fatigue.
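As an aside, readers who want to try the 20/20/20 rule can automate the reminder. The sketch below is a minimal console timer, assuming a Python environment; the constants simply encode the rule and are not drawn from any cited study.

```python
# Minimal 20/20/20 reminder: after every 20 minutes of work, prompt a
# 20-second break looking at something about 20 feet away.
import time

WORK_MINUTES = 20
BREAK_SECONDS = 20

def twenty_twenty_twenty(cycles: int = 3) -> None:
    for i in range(1, cycles + 1):
        time.sleep(WORK_MINUTES * 60)  # work interval
        print(f"Break {i}: look ~20 feet away for {BREAK_SECONDS} seconds")
        time.sleep(BREAK_SECONDS)      # rest interval
        print("Back to work.")

if __name__ == "__main__":
    twenty_twenty_twenty()
```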
Medication
In terms of medication, omega-3 fatty acids (O3Fas) can reduce eye strain [1]. Specifically, O3Fas treat patients with dry eye, a symptom of eye strain. When people stare at one location for extended periods, they often blink less, which results in dry eye because of decreased eye lubrication [2]. On a cellular level, O3Fas have been shown to increase the density of goblet cells, which lubricate the eye. The patients chosen for the study used a computer for more than three hours each day [1]. While 120 patients received the O3Fas treatment, the remaining 236 patients received an olive oil placebo daily, and their symptoms were evaluated. Each week, patients assigned a score between zero and three to symptoms such as blurry vision, dry eyes, and red eyes [1]. At baseline, 60 percent of patients had moderate dry eye symptoms in both the O3Fas and the placebo groups [1]. By the end of the experiment, 70 percent of the O3Fas group and 15 percent of the placebo group were symptom-free. In addition, Schirmer's test was performed, which involves gently placing a strip of filter paper onto the participants' eyes and measuring the length of paper moistened by tears to gauge tear production [1, 4]. A statistically significant improvement in dry eye symptoms was found in the O3Fas group compared to the placebo group, demonstrating the effectiveness of O3Fas in combatting dry eye resulting from computer vision syndrome [1]. O3Fas can also improve epithelial cell morphology in the eyes while decreasing tear evaporation rates, further reducing eye strain symptoms. While this study analyzed the effects of O3Fas, it did not test how O3Fas dosage affects an individual's symptoms.
Besides O3Fas, researchers examined how the dosage of a particular botanical formula can decrease eye strain, using a machine learning-based model to predict an accurate dosage for each patient [4]. Botanical formulas are natural, plant-based ingredients and oils combined to treat or supplement a condition. This specific botanical formula is made of carotenoids, naturally derived plant pigments that support eye health, along with blackcurrant, chrysanthemum, zeaxanthin, and goji berry. These formulas are antioxidants known for their ability to absorb the blue light that typically radiates from visual display units. Researchers split the participants, each of whom was exposed to screens daily, into four groups. The three experimental groups ingested six, ten, or fourteen milligrams of the chewable tablet, while the fourth group received a placebo [4]. Like the O3Fas study, this study asked patients to take these tablets once daily for 90 days while self-reporting how often they felt symptoms such as eye soreness and dry eye [1, 4]. In addition, both studies used Schirmer's test to evaluate dry eye symptoms, which found that both the 10-milligram and 14-milligram groups had increased tear production [1, 4]. Researchers input the symptoms and dosages of 56 of the participants into their machine learning model, XGBoost. The study found the optimal dosage to be 14 milligrams for 39 individuals, 10 milligrams for 17 individuals, and zero milligrams for two participants [4]. Researchers determined that the botanical formula significantly improves eye strain, while outlining the potential for machine learning to determine optimal dosages [4].
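For readers curious what such a dose-prediction model looks like in code, below is a minimal sketch in the spirit of the study's XGBoost approach. The features, data, and hyperparameters are hypothetical stand-ins, not those used by the researchers.

```python
# Hypothetical sketch of a gradient-boosted dose-prediction model;
# invented data stand in for the study's symptom and dosage records.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)

# One row per participant: weekly self-reported symptom scores
# (e.g., eye soreness, dry eye, blurry vision) on a 0-10 scale.
X = rng.integers(0, 11, size=(56, 3)).astype(float)

# Target: the dosage (mg) that best relieved that participant's symptoms.
y = rng.choice([0.0, 6.0, 10.0, 14.0], size=56)

model = XGBRegressor(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X, y)

# Predict a dosage for a new participant with moderate symptoms.
new_patient = np.array([[5.0, 7.0, 4.0]])
print(f"Suggested dose: {model.predict(new_patient)[0]:.1f} mg")
```

With real clinical data, the learned trees would capture which symptom patterns respond to which doses; with this random toy data, the output is only a demonstration of the pipeline.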
Bilberry extract (BE), another naturally derived medication, has proven effective in reducing eye strain caused by acute video display terminal (VDT) loads, i.e., time spent on devices with a display screen [5]. BE is rich in anthocyanins, which are known for their ability to decrease further visual disturbance and eye strain. The participants used VDTs daily and had eye strain symptoms. Over an eight-week period, the experimental group ingested 480 milligrams of BE per day, in contrast to the placebo group [5]. The researchers evaluated symptoms using the Critical Flicker Fusion (CFF) device, which measures the frequency at which a human eye perceives a blinking light as continuous [5, 6]. Lower CFF values correlate with fewer eye strain symptoms. This contrasts with the O3Fas and botanical formula studies, which analyzed the dryness of a patient's eyes through Schirmer's test [1, 4, 5]. Moreover, all three studies focused on patients with existing eye strain [1, 4, 5]. A self-reported questionnaire asked participants to rate the intensity of their symptoms on a scale from one to ten every week. Based on the CFF tests, researchers found a statistically significant lower CFF value, or less eye strain, in the BE group compared to the baseline [5]. This CFF reduction was not observed in the placebo group. Participants in the BE group also felt less ocular pain compared to the control group [6]. The researchers argued that BE supplements can reduce eye strain caused by VDT loads, while further research can eventually clarify how BE works [5].
(a) Blackcurrant, (b) chrysanthemum, and (c) goji berry compose a botanical formula that can significantly improve eye strain [4].
Blue Light-Blocking Glasses
Medications are important treatments to consider, but behavioral changes may also decrease eye strain symptoms. One such change is wearing blue light-blocking glasses. Blue light is short-wavelength electromagnetic radiation, ranging from around 400 to 500 nm in wavelength, and carries some of the highest energy in the visible spectrum [7]. Several previously considered hypotheses suggest that blue light could cause retinal damage, especially to the aging eye. In fact, studies in animals show that increasing blue light exposure can increase cell apoptosis in the eyes [7]. It is important to note that blue light is emitted by the sun onto Earth's surface; it is the excessive exposure to blue light from screens that can have negative effects [6]. Based on these previous experiments, researchers aim to understand whether blocking blue light is effective in preventing eye strain and retinal damage by testing the effects of blue light-blocking glasses.
Lin et al. assigned 36 participants to groups wearing clear placebo lenses, low-blocking glasses, or high-blocking glasses [6]. Participants then performed a 2-hour task on identical computers in similar controlled environments. Researchers used the CFF device and a participant-reported survey to evaluate symptoms of eye strain and fatigue [6]. The CFF showed significantly less eye fatigue in the high-blocking group compared to the low-blocking and placebo groups. Participant surveys suggested that the high-blocking group felt less pain and itchiness in their eyes after the computer task. However, there was no statistically significant correlation between blue light-blocking glasses and eye strain specifically, based on the participants' self-reported scores for the intensity of their eye strain symptoms [6]. Therefore, researchers determined that blocking large amounts of blue light may reduce eye strain from screen exposure, but more research needs to be done to determine whether this effect is substantial. Awareness of these results can encourage the use of blue light-blocking glasses to decrease eye pain and itchiness, while further studies can also evaluate their effects on eye strain and the specific amount of blue light blocked by different glasses [6].
Similar to Lin et al., Leung et al. also presented inconclusive results. They compared the symptoms of patients wearing blue light-filtering glasses, brown-tinted glasses, and a placebo, and found that while blue light filters decrease eye sensitivity, most participants could not detect these changes [7]. A group of 80 computer users wore the lenses for two hours each day for around one month, switching between the lenses during this period. Researchers performed contrast sensitivity tests, evaluating how accurately participants could read a chart in which the contrast of each black letter fades into the white background in small increments [7]. This test found no significant difference in contrast sensitivity between the experimental and placebo groups. Based on weekly questionnaires, more than 45 percent of patients reported no changes in their eyesight or eye strain symptoms, while the majority reported no differences between the blue light-blocking glasses and the control lenses [7]. Researchers concluded that analyzing the effects of blue light-blocking glasses is difficult. Spectral transmittance measurements, which evaluate the amount of blue light blocked, showed that the glasses reflected around 10.3 to 23.6 percent of harmful blue light [7]. Based on this finding, researchers maintained that it is still worthwhile to wear these glasses to block the harmful radiation between 400 and 500 nm, even though participants noticed no benefits. The differences between these two studies show how varied approaches can lead to similarly inconclusive results.
To compare the two studies: Lin et al.'s research included only 36 participants in a two-hour study in identical environments, while Leung et al.'s study had 80 participants and occurred over two months with no supervision [6, 7]. While Leung et al.'s study could analyze the long-term effects of the glasses, Lin et al.'s experiment could analyze only short-term benefits, but in a controlled environment. Additionally, Lin et al.'s study allowed each participant to try only one type of lens, while Leung et al.'s experiment allowed participants to wear each of the lenses [6, 7]. As a result, Leung et al.'s design eliminated differences in personal opinion between participants by evaluating how a single group responded to each of the variables. The outcomes of both studies suggest that blue light-blocking glasses could relieve eye strain, although more research still needs to be done on this topic.
Ergonomics
While there is conflicting evidence for wearing blue light glasses as a behavioral modification technique, the ergonomic approach shows more promise. To determine the benefits of ergonomics on computer vision syndrome (CVS), Mowatt et al. evaluated the prevalence of eye strain in students at The University of the West Indies (UWI) [2]. Specifically, researchers analyzed how the angle of a computer screen affects eye strain. In this cross-sectional study, 409 students answered a questionnaire on how often they use a computer, the severity of their eye strain symptoms, and the angle of their screens relative to their eyes [2]. The results showed that severe eye strain occurred in 63 percent of the students who looked down at their device, compared to 21 percent of the participants who kept their device at eye level [2]. However, the data did not present a relationship between the prevalence of eye strain and the length of time spent on a computer. These results support the use of ergonomic practices, such as keeping a screen at eye level, to reduce eye fatigue. Increased awareness of such behavioral modification techniques, especially by universities, can prevent eye strain in students [2]. A similar study also used a survey to analyze practices among individuals who work on computers daily.
Using surveys, researchers analyzed the correlation between ergonomics and eye strain symptoms. Office workers answered a questionnaire about eye strain symptoms and workplace conditions [3]. Researchers found that a higher angle of gaze toward a monitor is associated with greater CVS prevalence [3]. In addition, looking upward at a screen should be avoided, as it strains the trapezius and neck muscles. This contrasts with the study at UWI, which determined that patients who looked down at their screens, at relatively large angles from eye level, tended to have more strained eyes [2, 3]. Based on the results of both studies, placing the screen between eye level and a small downward angle of about 10 degrees may be the best resolution. Moreover, using a monitor with a filter and adjusting the brightness of the screen to match the environment are correlated with less CVS [3]. Although these results may seem to be solutions for CVS, they are based on surveys rather than controlled studies [2, 3]. Therefore, no definite causal link can be drawn between a given ergonomic practice and eye strain.
Conclusion
When looking at all the possible treatments for eye strain from screen exposure, there are many different medications [1, 4, 5], types of blue light-blocking glasses [6, 7], and ergonomic techniques [2, 3] that can reduce symptoms. O3Fas and the presented botanical formula both show reduction in eye strain symptoms when evaluated with Schirmer’s test for dry eye [1, 4]. The BE study also showed promising results in reducing symptoms of eye fatigue through the CFF test, which focuses more on the temporal processing ability of the eyes [5]. Though blue light-blocking glasses show positive results on the CFF tests and through measured spectral transmittance data, there are mixed results as to whether or not participants detect any changes in eye strain when wearing these glasses [6, 7]. Further testing can be done to evaluate the effects of blue light glasses, such as by examining a larger population or through a longitudinal study. Ergonomic techniques are correlated with less eye strain, according to recent surveys [2, 3]. Clinical trials in controlled environments can show more direct implications of ergonomic practices on eye strain from screen exposure. These treatments combined have the potential to reduce eye strain symptoms, leading to fewer headaches and improved concentration.
References:
- Bhargava R, Kumar P, Phogat H, Kaur A, Kumar M. 2015. Oral Omega-3 Fatty Acids Treatment in Computer Vision Syndrome Related Dry Eye. Cont Lens Anterior Eye [Internet]. 38(3):206-210. doi:10.1016/j.clae.2015.01.007
- Mowatt L, Gordon C, Santosh ABR, Jones T. 2018. Computer Vision Syndrome and Ergonomic Practices Among Undergraduate University Students. Int J Clin Pract [Internet]. 72(1):10.1111/ijcp.13035. doi:10.1111/ijcp.13035
- Ranasinghe P, Wathurapatha WS, Perera YS, et al. 2016. Computer vision syndrome among computer office workers in a developing country: an evaluation of prevalence and risk factors. BMC Res Notes [Internet]. 9:150. doi:10.1186/s13104-016-1962-1
- Kan J, Li A, Zou H, Chen L, Du J. 2020. A Machine Learning Based Dose Prediction of Lutein Supplements for Individuals With Eye Fatigue. Front Nutr [Internet]. 7:577923. doi:10.3389/fnut.2020.577923
- Ozawa Y, Kawashima M, Inoue S, et al. 2015. Bilberry Extract Supplementation for Preventing Eye Fatigue in Video Display Terminal Workers. J Nutr Health Aging [Internet]. 19(5):548-554. doi:10.1007/s12603-014-0573-6
- Lin JB, Gerratt BW, Bassi CJ, Apte RS. 2017. Short-Wavelength Light-Blocking Eyeglasses Attenuate Symptoms of Eye Fatigue. Invest Ophthalmol Vis Sci [Internet]. 58(1):442-447. doi:10.1167/iovs.16-20663
- Leung, T. W., Li, R. W., & Kee, C. S. 2017. Blue-Light Filtering Spectacle Lenses: Optical and Clinical Performances. PloS one [Internet]. 12(1): e0169114. doi:10.1371/journal.pone.0169114
Review: The role of gut microbiota on Autism Spectrum Disorder (ASD) and clinical implications
By Nikita Jignesh Patel, Neurobiology, Physiology, & Behavior ’22
Author’s Note: Ever since I took BIS2C at UC Davis, I have been intrigued by how our gut microbiome plays such a huge role in our homeostasis beyond digestion; in particular, the correlation between decreased microbiome diversity and allergies that we learned about in the lab fascinated me. I recently stumbled upon the term “gut-brain axis” and was in awe that this connection between our gut microbes and our brain even exists, and I learned that gut microbiome diversity is implicated in a plethora of mental disorders, from depression and anxiety to autism. I decided to write this review to share what I have learned about how the gut microbiome can change the brain and potentially contribute to Autism Spectrum Disorder (ASD), because this correlation is not widely known; even as a physiology major, I never learned about the gut-brain axis in my courses. Moreover, the cause of autism is still largely undefined, and the gut microbiome may provide a possible explanation for ASD onset in some individuals. I believe a wide range of students will find this upcoming research interesting, but my intended audience is those who research autism or work with autistic individuals, as it may provide an explanation for ASD and seems to be a likely target for clinical autism therapy in the future. Above all, I want my readers to take away a better understanding of the gut-brain axis and how its imbalance can be implicated in brain disorders like autism.
Introduction
Autism Spectrum Disorder (ASD) is a lifelong neurodevelopmental disorder characterized by a range of symptoms including difficulty with communication, social interaction, and restricted and repetitive behaviors that present differently in every individual [1]. Although 1 in 54 children are estimated to be on the autism spectrum according to the CDC [2], the etiology of the condition remains poorly understood. Factors including genetics and certain maternal environmental conditions have been identified as potential contributors to the development of ASD in children, but the exact cause is still unknown [3].
A common comorbidity experienced by ASD individuals is gastrointestinal (GI) problems, including abdominal pain, constipation, and diarrhea; as such, autism research is pivoting toward studying the gut microbiome [1]. Specifically, a link between the composition of the gut microbiome and brain development, termed the “gut-brain axis,” has been established in recent years, and it appears to be the future of autism research. This literature review aims to identify the role of the human gut microbiome in the development of autism-like behavior and to investigate whether therapies targeting the gut microbiome can be effective clinical treatments for Autism Spectrum Disorder (ASD). The article will first define the differences observed in gut microbiota between autistic and neurotypical individuals, then discuss how these differences in composition may affect brain development, and finally propose clinical implications targeting gut microbiota that appear promising in the treatment and diagnosis of ASD-related behaviors.
Gut Microbiome of ASD Patients Differ From Neurotypical Individuals
The gut microbiomes of Autism Spectrum Disorder (ASD) patients have defining characteristics that significantly differ from those of neurotypical individuals. The human gut microbiome consists of a diverse array of predominantly bacteria, but also archaea, eukarya, and viruses, that possess unique microbial enzymes to aid humans in digestion and a variety of other physiological functions [4]. The three phyla Firmicutes, Bacteroidetes, and Actinobacteria [5] encompass the majority of bacteria present in the gastrointestinal (GI) tract that aid in these functions. However, an imbalance in the ratio of Firmicutes to Bacteroidetes is found in autistic individuals compared to the microbiome composition of neurotypical subjects; in particular, patients with autism tend to have an overexpression of Firmicutes in their gut [6,7]. Other studies have demonstrated an excess of the Clostridium genus in the ASD microbiome [8] as well as an overexpression of the genus Bacilli in the mouths and guts of autistic individuals [7]. While the cause behind this imbalance is unknown, these findings signify a consistent pattern of microbial imbalance in the autistic gut microbiome. In fact, a Random Forest prediction model, an algorithm that classifies data using an ensemble of decision trees, was able to distinguish ASD children from neurotypical children with a high degree of certainty from microbiome sequencing data alone [7], demonstrating that this dysbiosis is predictable by machine learning.
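As a rough illustration of this kind of classification, below is a minimal sketch using scikit-learn's Random Forest on an invented abundance table; the taxa counts, labels, and settings are hypothetical, not the data or pipeline from the cited study.

```python
# Hypothetical sketch: classifying ASD vs. neurotypical samples from
# relative taxa abundances with a Random Forest (invented data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# 40 children x 5 taxa; each row of relative abundances sums to 1.
abundances = rng.dirichlet(alpha=np.ones(5), size=40)
labels = rng.integers(0, 2, size=40)  # 1 = ASD, 0 = neurotypical

forest = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(forest, abundances, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

On real sequencing data the same pipeline would use measured abundances and clinical labels; with this random toy data, accuracy should hover near chance.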
Figure 1: This illustrates the difference between species richness and species abundance. Species richness, a measure of alpha diversity, informs on how many species are present in a sample. Species abundance describes how many organisms of each species are present.
Along with microbial imbalance, termed dysbiosis, autistic children also tend to have decreased alpha diversity [7,9], a measure of mean species diversity within a sample, as well as significantly lower gut species richness [7], the number of species present, when compared to age- and sex-matched neurotypical children. One study found that for neurotypical children, alpha diversity, species richness, and species abundance all increased between the age groups 2-3 and 7-11; yet for ASD children, no significant development in microbial composition was observed with increasing age [7]. Since autism has been found to slow brain development as children age [9], this reduced development of the microbiome mirrors the altered brain development characteristic of ASD pathophysiology, suggesting an association between decreased microbial diversity and autism.
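To clarify the two measures, below is a minimal sketch computing species richness and the Shannon index, one common alpha-diversity metric, from hypothetical species counts in a single sample.

```python
# Richness and Shannon alpha diversity from hypothetical species counts.
import math

counts = [120, 45, 30, 5, 0, 0]  # organisms observed per species

# Species richness: the number of species present at all.
richness = sum(1 for c in counts if c > 0)

# Shannon index: H = -sum(p_i * ln p_i) over species with nonzero counts,
# where p_i is each species' share of the total.
total = sum(counts)
shannon = -sum((c / total) * math.log(c / total) for c in counts if c > 0)

print(f"Richness: {richness}, Shannon diversity: {shannon:.2f}")
```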
Due to this observed correlation between dysbiosis and ASD, the question of whether gut dysbiosis is truly causal for autism has arisen. In a preliminary study, Sharon et al. transplanted fecal microbiota from autistic donors into otherwise germ-free mice (mice with a sterile gut) and observed their offspring's behavior compared to the offspring of mice inoculated with microbiota from neurotypical donors. Notably, mice with the ASD microbiome, characterized by decreased alpha and beta diversity and decreased Bacteroidetes, exhibited behaviors paralleling those of autism, including repetitive behaviors, decreased locomotion, and decreased communication [9]. This demonstrated that gut dysbiosis can in fact induce the behavioral deficits observed in ASD, significant evidence for the theory that gut dysbiosis contributes to ASD and an important finding that changes our current understanding of the etiology of autism.
Figure 2: Above is a visual depiction of the study conducted by Sharon and colleagues, in which germ-free mice were inoculated with gut microbiota from either autistic or neurotypical donors. Offspring of the mice transplanted with the ASD microbiome were shown to exhibit autism-like behavior.
How Microbiota Imbalance Affects Brain Function: The Gut-Brain Axis
Since the microbial dysbiosis common in ASD patients contributes to behavioral deficits, several mechanisms have been proposed for how this altered microbial environment can affect brain development.
Intestinal permeability
The microbes that line the GI tract provide structural and protective benefits to our intestines, including stimulating epithelial cell regeneration and mucus production by the intestinal walls. When microbial diversity is decreased, the integrity of the intestinal walls may be compromised, which can lead to increased intestinal permeability [8]. This may allow lipopolysaccharide (LPS), a pro-inflammatory endotoxin found in gram-negative bacterial cell walls, to escape the GI tract into the bloodstream. Serum levels of LPS are indeed found to be significantly higher in autistic individuals [12]. LPS causes inflammation in the central nervous system (CNS) and has been found to impair cognition and motivation in the mouse model. Specifically, circulating LPS has been associated with impaired continuous attention and curiosity behaviors, along with modulation of other brain areas such as the central amygdala [11]. Therefore, altered intestinal permeability is a possible mechanism by which dysbiosis modulates brain inflammation, a hallmark of autism thought to contribute to its behavioral symptoms.
Microbial metabolites
As gut microbes carry out cellular functions inside their human hosts, they secrete compounds as by-products of metabolism. Two such metabolites are 5-aminovaleric acid (5AV) and taurine, which are secreted by the gut bacterium Bacteroides xylanisolvens and other bacteria. 5AV and taurine levels are significantly lower in autistic individuals [13,14] as well as in mice transplanted with the ASD microbiome [9], likely due to dysbiosis. Both 5AV and taurine are gamma-aminobutyric acid (GABA) receptor agonists, meaning that lower levels of these circulating microbial metabolites can alter the inhibitory signaling of GABA in the nervous system [9]. GABA regulates various developmental processes in the brain, including cell differentiation and synapse formation, so dysfunction in GABA signaling is thought to account for ASD symptoms [15]. Oral administration of taurine and 5AV in a mouse model of ASD with an altered microbiome has been shown to reduce repetitive behavior and increase social behavior, suggesting that deficiencies in these metabolites may contribute to the behavioral manifestations of autism [9]. There are other microbial metabolite imbalances in autistic children, involving dopaquinone, pyroglutamic acid, and other molecules involved in neurotransmitter production. These imbalances affect brain signaling pathways and therefore could contribute to the behavioral deficits often present in autistic children. Further, these metabolite imbalances correlate with the levels of certain gut bacteria, further emphasizing the link between the gut microbiome and neurological disorders such as ASD.
Clinical Implications for ASD Diagnosis and Treatment
Today, symptoms of autism are alleviated with behavioral and educational therapy, and no pharmaceutical treatment exists [1]. With the knowledge that the gut microbiome significantly differs in autistic individuals and that these differences are shown to interfere with the nervous system, preliminary research has been done on potential diagnostics and pharmaceutical therapeutics for ASD that target dysbiosis in the gut.
Diagnostics
To date, there is no objective laboratory test to detect Autism Spectrum Disorder (ASD) in children, so autism is primarily diagnosed through a doctor’s evaluation of a patient’s behavior and developmental history. However, the demonstrated ability of a computer program to distinguish the autistic microbiome from the neurotypical microbiome holds potential for ASD clinical risk assessment through analysis of the gut microbiome, with subsequent gut health monitoring for those found to have ASD-like dysbiosis [7]. The strong association between certain bacterial species in the mouth and bacteria in the gut, in particular the significant positive correlation between salivary Chloroflexi and gut Firmicutes, suggests possible oral biomarkers for predicting gut microbial diversity [6]. In addition, the overabundance of certain gut bacteria has been associated with symptoms like allergies and abdominal pain, opening an avenue to improve the ASD diagnosis process through the inclusion of a more objective, laboratory-based test [6].
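To make the idea of such a microbiome-based classifier concrete, below is a minimal sketch in R, the language used for the analyses later in this issue. Everything here is hypothetical: the genus columns, sample counts, group labels, and abundance values are simulated for illustration and are not taken from the cited studies, which do not publish their code.

    library(randomForest)

    set.seed(42)

    # Simulated genus-level relative abundances for 60 stool samples,
    # half labeled ASD-like and half neurotypical (all values hypothetical)
    n <- 60
    abundances <- data.frame(
      Bacteroides      = runif(n, 0, 0.4),
      Faecalibacterium = runif(n, 0, 0.2),
      Prevotella       = runif(n, 0, 0.2),
      group            = factor(rep(c("ASD", "NT"), each = n / 2))
    )

    # Train a random forest to separate the two groups
    fit <- randomForest(group ~ ., data = abundances, ntree = 500,
                        importance = TRUE)

    print(fit)        # out-of-bag error approximates classification accuracy
    importance(fit)   # which genera contribute most to the separation

With real data, the out-of-bag accuracy would indicate how well microbial profiles distinguish the two groups, and the importance scores would point to candidate bacterial biomarkers.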
Microbiota Transfer Therapy
Microbiota Transfer Therapy (MTT) is an emerging therapy that aims to replace the gut microbiome of ASD individuals with a more diverse, healthy gut microbiome. One form of MTT consists of a two-week oral vancomycin antibiotic treatment, followed by a bowel cleanse using MoviPrep, and finally an extended fecal microbiota transplant lasting 7-8 weeks, administered orally or rectally. In a clinical trial involving autistic children, MTT significantly increased gut bacterial diversity 8 weeks after treatment stopped and improved GI symptoms (including abdominal pain, indigestion, diarrhea, and constipation) as measured by the Gastrointestinal Symptom Rating Scale (GSRS). Significant improvements in behavioral ASD symptoms were found post-treatment as well, measured through increases from baseline scores on a variety of exams that evaluate social skills, irritability, hyperactivity, and communication, among other behaviors [16]. These improvements in microbial diversity and, subsequently, in ASD-related behavior were all maintained at a follow-up study two years later, indicating that MTT is a safe and effective therapy with the potential to improve ASD outcomes in the long term [17]. However, further studies on the efficacy of MTT are necessary to establish this connection, as the above study’s sample was small and most symptoms and improvements were self-reported.
Probiotics
Because imbalances in the microbiome are correlated with autism, direct administration of bacterial cultures using probiotics is a potential approach to treating ASD behavioral symptoms. A probiotic treatment combining Streptococcus, Lactobacillus, and Bifidobacterium was found to be effective in improving core behavioral symptoms of ASD, specifically adaptive functioning, developmental pathways, and multisensory processing, in autistic children with GI symptoms [18]. Probiotics have also been shown to improve symptoms of mood disorders like anxiety and depression, both of which are associated with dysbiosis and the gut-brain axis [8], warranting further research on probiotics as a treatment for ASD. Therapies that target microbial metabolite imbalances in ASD individuals, such as deficiencies in 5AV and taurine, may also open an avenue for future autism research [9].
Conclusion
The gut microbiome contributes to the maintenance of much of human physiology, with involvement not only in the digestive system but also in the immune system and the brain. Dysbiosis of the gut microbiome has been found to be prevalent in children and adults with Autism Spectrum Disorder (ASD), and this dysbiosis may be linked to the behavioral symptoms observed. Treatments that target the gut microbiome may therefore prove useful in improving behavioral deficits associated with ASD and should be a consideration for future research with more rigorous experimental design.
References:
- Mayo Clinic. Autism Spectrum Disorder. Accessed July 30, 2021. Available from: https://www.mayoclinic.org/diseases-conditions/autism-spectrum-disorder/symptoms-causes/syc-20352928.
- Centers for Disease Control and Prevention. Data & Statistics on Autism Spectrum Disorder. Accessed July 30, 2021. Available from: https://www.cdc.gov/ncbddd/autism/data.html.
- Fattorusso A, Genova L, Dell’Isola G, Mencaroni E, Esposito S. 2019. Autism Spectrum Disorders and the gut microbiota. Nutrients. 11(2):521.
- Kho Z, Lal S. 2018. The human gut microbiome—A potential controller of wellness and disease. Frontiers in Microbiology. 9:1835.
- Thursby E, Juge N. 2017. Introduction to the human gut microbiota. Biochemical Journal. 474(11): 1823-1836.
- Kong X, Liu J, Cetinbas M, Sadreyev R, Koh M, Huang H, Adeseye A, He P, Zhu J, Russell H, Hobbie C, Liu K, Onderdonk A. 2019. New and preliminary evidence on altered oral and gut microbiota in individuals with Autism Spectrum Disorder (ASD): Implications for ASD diagnosis and subtyping based on microbial biomarkers. Nutrients. 11(9): 2128
- Dan Z, Mao X, Liu Q, Guo M, Zhuang Y, Liu Z, Chen K, Chen J, Xu R, Tang J, Qin L, Gu B, Liu K, Su C, Zhang F, Xia Y, Hu Z, Liu X. 2020. Altered gut microbial profile is associated with abnormal metabolism activity of Autism Spectrum Disorder. Gut Microbes. 11(5): 1246-1267
- Mangiola F, Ianiro G, Franceschi F, Fagiuoli S, Gasbarrini G, Gasbarrini, A. 2016. Gut microbiota in autism and mood disorders. World Journal of Gastroenterology. 22(1): 361-368.
- Sharon G, Cruz N, Kang D, Gandal M, Wang B, Kim Y, Zink E, Casey C, Taylor B, Lane C, Bramer L, Isern N, Hoyt D, Noecker C, Sweredoski M, Moradian A, Borenstein E, Jansson J, Knight R, Metz T, Lois C, Geschwind D, Krajmalnik-Brown R, Mazmanian S. 2019. Human gut microbiota from Autism Spectrum Disorder promote behavioral symptoms in mice. Cell. 177(6): 1600-1618
- Hua X, Thompson P, Leow A, Madsen S, Caplan R, Alger J, O’Neill J, Joshi K, Smalley S, Toga A, Levitt J. 2013. Brain growth rate abnormalities visualized in adolescents with autism. Human Brain Mapping. 34(2):425-36.
- Haba R, Shintani N, Onaka Y, Wang H, Takenaga R, Hayata A, Baba A, Hashimoto H. 2012. Lipopolysaccharide affects exploratory behaviors toward novel objects by impairing cognition and/or motivation in mice: Possible role of activation of the central amygdala. Behavioral Brain Research. 228(2):423-31.
- Emenuele E, Orsi P, Boso M, Broglia D, Brondino N, Barale F, Ucelli di Nemi S, Politi P. 2010. Low-grade endotoxemia in patients with severe autism. Neuroscience Letters. 471(3):162-5
- Ming X, Stein T, Barnes V, Rhodes N, Guo L. 2012. Metabolic perturbance in autism spectrum disorders: a metabolomics study. Journal of Proteome Research. 11(12): 5856-62
- Park E, Cohen I, Gonzalez M, Castellano M, Flory M, Jenkins E, Brown W, Schuller-Levis G. 2017. Is taurine a biomarker in Autistic Spectrum Disorder. Advances in Experimental Medicine and Biology. 975
- Pizzarelli R, Cherubini E. 2011. Alterations of GABAergic signaling in Autism Spectrum Disorders. Neural Plasticity. 2011:297153
- Kang D, Adams J, Gregory A, Borody T, Chittick L, Fasano A, Khoruts A, Geis E, Maldonado J, McDonough-Means S, Pollard E, Roux S, Sadowsky M, Lipson K, Sullivan M, Caporaso J, Brown R. 2017. Microbiota Transfer Therapy alters gut ecosystem and improves gastrointestinal and autism symptoms: an open-label study. Microbiome. 5(1):10
- Kang D, Adams J, Coleman D, Pollard E, Maldonado J, McDonough-Means S, Caporaso J, Krajmalnik-Brown, R. 2019. Long-Term benefit of Microbiota Transfer Therapy on autism symptoms and gut microbiota. Scientific Reports. 9(1):5821
- Santocchi E, Guiducci L, Prosperi M, Calderoni S, Gaggini M, Apicella F, Tancredi R, Billeci L, Mastromarino P, Grossi E, Gastaldelli A, Morales M, Muratori F. 2020. Effects of probiotic supplementation on gastrointestinal, sensory and core symptoms in Autism Spectrum Disorders: A randomized controlled trial. Frontiers in Psychiatry. 11:550593
How Poop is Fighting COVID-19
By Laura Gardner, Biochemistry and Molecular Biology ‘22
Author’s Note: With so much information in the media and online about COVID-19, I find many people get lost in, and fall victim to, false information. I want to reassure the Davis community with factual information on how Davis is fighting COVID-19. Given UC Davis’ strong scientific community, I was curious what tools were being used to mitigate the spread of COVID-19. In January 2021, I attended a virtual COVID-19 symposium called Questions about Tests and Vaccines led by Walter S. Leal, Distinguished Professor in the Department of Molecular and Cellular Biology at the University of California, Davis (UC Davis). In this symposium, I learned about Dr. Heather Bischel’s work testing the sewer system. This testing is another source for early detection of COVID-19. In combination with biweekly testing, I have no doubt that UC Davis is being proactive in its precautions throughout the pandemic, which made me personally feel safer. I hope that this article will shed light on wastewater epidemiology as a tool that can be implemented elsewhere.
Dr. Heather Bischel is an assistant professor in the Department of Civil and Environmental Engineering at the University of California, Davis. Bischel has teamed up with the city of Davis through the Healthy Davis Together initiative to use wastewater epidemiology, a technique for measuring chemicals in wastewater, to monitor the presence of SARS-CoV-2, the virus that causes COVID-19 [6]. When a person defecates, their waste travels through the pipes and is collected in the sewer system. Even in pre-symptomatic and asymptomatic individuals, feces can carry genetic material that indicates the virus is present. This is because SARS-CoV-2 uses angiotensin-converting enzyme 2, also known as ACE2, as a cellular receptor; ACE2 is abundantly expressed in the small intestine, allowing viral replication in the gastrointestinal tract [1]. Wastewater monitoring therefore serves as an early indicator of a possible COVID-19 outbreak and enables quick treatment and isolation, which are important to stop the spread of the disease.
Samples are taken periodically from manholes around campus using a mechanical device called an autosampler. These autosamplers are lowered into manholes and collect samples of the wastewater flow every 15 minutes for 24 hours. Next, the samples are taken to the lab, where genetic material is extracted and the polymerase chain reaction (PCR) is used to detect the virus. Fluorescent markers that attach to the specific genetic sequence of the virus are added to the sample; if viral genetic material is present, the sample fluoresces visible light. This light is the signal that indicates a positive test result.
The samples are collected throughout campus, with a focus on residential halls. An infected person will excrete the virus through their bowel movements before showing symptoms. The test is so sensitive that even if just one person among thousands is sick, the presence of COVID-19 genetic material can still be detected. When a PCR test provides a positive signal, the program works closely with the UC Davis campus to identify whether someone in the building has reported a positive COVID-19 test. If no one from the building is known to be positive, they send a communication email asking all the students of the building to get tested as soon as possible. That way, the infected person can be identified and isolated as soon as possible, eliminating exposure from unidentified cases [4].
In collaboration with the UC Davis campus as well as the city of Davis, Dr. Bischel has implemented wastewater epidemiology throughout the community. Since summer 2020, Dr. Bischel’s team of researchers has collected data that is available online through the Healthy Davis Together initiative [4].
In addition to being an early indicator, this data has also been used to identify trends, which can reveal whether existing efforts to combat the virus are working [2]. Existing efforts include vaccinations, mask wearing, hand washing, maintaining proper social distancing, and staying home when one feels ill. UC Davis has implemented protocols including biweekly testing and a daily symptom survey that must be completed and approved in order to be on campus.
Wastewater epidemiology has been implemented all over the world, at more than 233 universities and in 50 different countries, according to monitoring efforts from UC Merced [3]. This testing has been used in the past to detect polio, but has never before been implemented on the scale of a global pandemic. Inadequate infrastructure, such as ineffective waste disposal systems, open defecation, and poor sanitation, poses global challenges, especially in developing countries [2]. Without tools for early detection, these communities are in danger of an exponential rise in cases.
“Our work enables data-driven decision-making using wastewater infrastructure at city, neighborhood, and building scales,” Dr. Bischel stated proudly in her latest blog post [2]. These decisions are crucial in containing COVID-19 as we continue to push through the pandemic.
Image caption: Summary of how wastewater epidemiology is used to fight COVID-19.
References:
- Aguiar-Oliveira, Maria de Lourdes et al. “Wastewater-Based Epidemiology (WBE) and Viral Detection in Polluted Surface Water: A Valuable Tool for COVID-19 Surveillance-A Brief Review.” International journal of environmental research and public health vol. 17,24 9251. 10 Dec. 2020, doi:10.3390/ijerph17249251
- Bischel, Heather. Catching up with our public-facing COVID-19 wastewater research. Accessed August 15, 2021. Available from H.Bischel.faculty.ucdavis
- Deepshikha Pandey, Shelly Verma, Priyanka Verma, et al. SARS-CoV-2 in wastewater: Challenges for developing countries. International Journal of Hygiene and Environmental Health, Volume 231, 2021, 113634, ISSN 1438-4639, https://doi.org/10.1016/j.ijheh.2020.113634.
- Healthy Davis Together. Accessed February 2, 2021. Available from Healthy Davis Together – Working to prevent COVID-19 in Davis
- UCMerced Researchers. Covid Poops Summary of Global SARS-CoV-2 Wastewater Monitoring Efforts. Accessed February 2, 2021. Available from COVIDPoops19 (arcgis.com)
- Walter S Leal. January 13, 2021. COVID symposium Questions about Tests and Vaccines. Live stream online on zoom.
Investigating Anthelmintics for Vector Control
By Anna Cutshall, Animal Biology, ’21
Author’s Note: When considering the topic of my literature review and analysis, I wanted to select work that I could continue researching in graduate school. As I entered academia, my career and life experiences had prepared me well for the unique intersection of veterinary medicine, ecology, and epidemiology. I have been on a pre-veterinary track for many years and have worked professionally in the veterinary field for more than three years. As an Animal Biology major and Global Disease Biology minor, my coursework largely centered around the emerging threat of zoonotic and vector-borne diseases. These experiences considered, my primary research interests lie in how we may integrate veterinary medicine into One Health practices to better combat emerging disease threats. In this literature review, I investigate the viability of anthelmintic drugs against arthropod vectors of disease. The use of anthelmintics against arthropods is fairly new, and the pool of current literature is limited but promising. This review was written for those, like myself, who are interested in new approaches to the control of tropical diseases, especially through the lens of One Health. I hope to leave readers with a clear picture of what is next for this field, what gaps in the data should be filled, and how we can use the information gained in responsible, sustainable ways to combat both emerging and established vector-borne diseases.
Abstract
This literature review analyzes the efficacy of currently available anthelmintic drugs against key disease-vectoring arthropods. When comparing effective dosages between different drugs and vector genera, we found that relatively low concentrations are effective against most vectors, but there is evidence to suggest that ivermectin resistance has been established in some species (Aedes spp.). The avermectin drug class also displayed limited efficacy over time, as these drugs degrade in vertebrate species faster than the isoxazoline drug class or fipronil. We determined that the current findings related to this method of vector control are promising. However, further research must be conducted before anthelmintics are implemented for mass drug administration as part of integrated vector management.
Keywords: anthelmintics, insecticides, vector, disease vector, mosquito, sandfly, One Health, integrated vector management, mass drug administration
Introduction
Vector-borne diseases threaten the well-being of hundreds of millions of people globally, and this threat is predicted to increase as climate change and human activity facilitate the spread of vector species to previously unoccupied locations. In a press release by the Sacramento-Yolo Mosquito & Vector Control District, it was reported that multiple invasive mosquito species, including Aedes aegypti, had been identified in northern California [1]. Recent literature suggests that these range expansions may be due, in part, to climate change, as these species are able to adapt to broader regions with climates similar to their native ranges [2]. The continued spread of these species leaves unprepared countries at risk for outbreaks of the diseases vectored by invading species. Moreover, most vector-borne diseases remain uncontrolled in endemic regions. The most direct way to mitigate the threat of globalizing tropical vector-borne diseases is to control the species that vector them. Unfortunately, traditional insecticide-based methods of vector control have become ineffective due to the emergence of insecticide resistance. In 2012, the World Health Organization identified the status of insecticide resistance as “widespread,” as most of the globe reported resistance in at least one major malaria vector [3]. Traditional spray and topical insecticides have been compromised by such resistance. Therefore, it is essential that new methods of vector control, without acquired resistance, be discovered, evaluated, and implemented.
There are many new methods of vector control currently under evaluation. These include genetically modifying vectors to render them sterile, the use of entomopathogenic fungi and viruses, trapping, repellents, and environmental modification [4]. As we continue to evaluate each method for its efficacy, the Integrated Vector Management (IVM) method may be our best option for the elimination of many tropical diseases. Through IVM, we take careful and integrated approaches to vector control via intersectional communication between public health officials, governments, non-governmental organizations, and the communities in which we hope to implement our strategies [5]. IVM calls for multiple vector control strategies, increasing control efficacy via synergy between control efforts. Unfortunately, the primary tool utilized for the control of adult mosquitoes, insecticides, has lost efficacy over time. This is a result of vector insect populations developing resistance to common insecticides, such as pyrethroids and organophosphates, that are used to control adult mosquito populations. However, there is a reservoir of insecticides that have not been utilized against human disease vectors and that therefore carry minimal acquired resistance. This class is oral insecticides: insecticides ingested by vertebrates that act when a vector is exposed via a blood meal from a treated animal. The use of oral insecticides has been standard in veterinary medicine for years, in the form of flea and tick prevention. Common classes of oral insecticides include avermectins, isoxazolines, and phenylpyrazoles. These compounds have been standard in human and/or veterinary medicine as ectoparasiticides, demonstrating their safety for use in vertebrates. Avermectins, isoxazolines, and phenylpyrazoles have similar modes of action as neurotoxins, all interrupting the function of GABA-gated chloride ion channels and causing insect paralysis [6, 7]. Importantly, there is still diversity within these classes as tools against vector species, as they bind to different sites on the GABA receptors [7, 8]. Investigating the efficacy of these drugs as insecticides against key vectors of diseases such as malaria, Zika, West Nile virus, leishmaniasis, and African trypanosomiasis could be part of the solution to the increasingly urgent problem of insecticide resistance.
Research is currently underway across the globe to investigate the efficacy of ectoparasiticides against disease vectors. The question still stands, however, whether the approach of oral insecticides is any more effective than the traditional insecticides available. To answer this question, we assessed the current literature on the testing of ectoparasiticides against disease vectors and developed a database of studies testing the efficacy of these compounds against vector insects. This analysis aims to determine the relative efficacy of these compounds and whether these drug classes are worth consideration for use in vector control and management.
Materials and methodology
To establish a database of the relevant literature, we first mined the scientific literature via the UC Davis Library. Using access granted to undergraduate students, the search terms were input as follows: title/abstract contain “vector” AND “veterinary” AND “control” AND “arthropod” in the keyword function. Papers were then selected for further analysis. These articles were input into an AI-based literature analysis tool, “Research Rabbit,” to identify additional relevant studies [9]. In addition, studies were selected from the works cited of previously selected works. Papers not testing the efficacy of oral insecticides on adult disease vectors were excluded from the study. Additionally, papers without comparable data (those that did not supply direct mortality or density data) were excluded.
Each paper was analyzed to extract relevant data on the efficacy of oral insecticides against disease vectors, and the data were collated into a Microsoft Excel spreadsheet. Categories selected for further evaluation included: drug type, the concentrations used, the associated concentration resulting in 50% mortality (the LC50 value), time to mortality, the reduction of vectors present in field studies by visual count (resting density), and drug effects on vector fecundity. However, for the purposes of this study, we focused on LC50 and temporal values, as the other categories were not consistent across publications.
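For readers unfamiliar with the LC50, it is typically estimated by fitting a dose-response curve to mortality data. Below is a minimal sketch in R of how such an estimate can be obtained with the drc package; the doses and mortality counts are hypothetical placeholders, not values extracted from the reviewed studies.

    library(drc)

    # Hypothetical dose-response data: mortality of blood-fed insects
    # at increasing drug concentrations in the host's blood
    dose_response <- data.frame(
      dose  = c(0, 5, 10, 25, 50, 100),  # drug concentration (ng/mL)
      dead  = c(1, 4, 9, 14, 18, 20),    # insects dead after feeding
      total = rep(20, 6)                 # insects tested per dose
    )

    # Fit a two-parameter log-logistic curve to the binomial mortality data
    fit <- drm(dead / total ~ dose, weights = total,
               data = dose_response, fct = LL.2(), type = "binomial")

    ED(fit, 50)  # the estimated LC50 with its standard error

The two-parameter log-logistic model (LL.2) constrains mortality between 0 and 1, which suits proportion data of this kind.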
When creating data visualizations for comparison of different drug types, R’s “ggplot2,” “dplyr,” and “esquisse” packages were used [10-13]. The categories determined to be best for visual comparison were temporal data and LC50 data. After the initial visualizations were made in R, figures were exported to Adobe Illustrator for aesthetic editing, which was limited to modification of font types and caption content [14]. When creating visualizations for LC50 data, both sandfly and mosquito vectors were compared on the same figure, to compare the efficacy of the drugs not only in relation to each other but also in relation to different disease vectors. The drugs moxidectin and NTBC were excluded from this comparison: moxidectin’s LC50 value was too high to allow for reasonable comparison to other drugs, and NTBC only had a value for tsetse flies (Glossina spp.), which were not represented for any other drug. Additionally, Lutzomyia spp. displayed LC50 values too high to be effectively compared to other disease vectors. When visualizing temporal data, only Anopheles spp. and Phlebotomus spp. had enough supporting information in the literature for effective comparisons: 5 studies supplied temporal data for Anopheles spp. and 2 studies for Phlebotomus spp. These temporal data were plotted as day of feeding against mortality, faceted by drug type, and grouped by dose. Two figures were created, one for Phlebotomus spp. and another for Anopheles spp.
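As an illustration of this workflow, the sketch below builds a small “lollipop” plot of the kind described, using ggplot2 as in the original analysis. The genera, drugs, and LC50 values in the data frame are placeholders for illustration only; the published figure was built from values extracted from the reviewed studies.

    library(ggplot2)

    # Placeholder LC50 values by drug class and vector genus
    lc50_data <- data.frame(
      genus = c("Anopheles", "Aedes", "Anopheles", "Phlebotomus"),
      drug  = c("ivermectin", "ivermectin", "fluralaner", "fluralaner"),
      class = c("avermectin", "avermectin", "isoxazoline", "isoxazoline"),
      lc50  = c(25, 120, 40, 90)
    )

    # Each dot is one datapoint; the segment forms the "stick" of the lollipop
    ggplot(lc50_data, aes(x = genus, y = lc50, colour = drug)) +
      geom_segment(aes(xend = genus, y = 0, yend = lc50)) +
      geom_point(size = 3) +
      facet_wrap(~class) +
      labs(x = "Vector genus", y = "LC50 (ng/mL, illustrative)")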
Results
Database Creation
From the initial search in the UC Davis Library system, 15 studies were selected. Then, based on the output from Research Rabbit, an additional 5 studies were selected. Finally, 3 additional papers were identified in the references of the first 20 studies and integrated into the analysis. These 23 studies were evaluated individually from January 2021 through March 2021.
We obtained multiple data categories for comparison between the selected papers. Figure 1 displays a summary of the resulting database. Due to the recent nature of this research, the resources from which to draw for our database were limited. Table 1 shows a summary of the data types and the number of papers each data type was collected from. Some papers investigated efficacy against multiple vector genera while others investigated only one. Based on the data we were able to collect, we moved forward with direct comparisons of temporal data and LC50 data.
Comparing Effectiveness of Dosages Between Drugs
We sought to compare the concentrations of drugs required to be effective against the disease-vectoring arthropods studied. Figure 2 displays the LC50 values chosen for comparison as a “lollipop” plot. Within the plot, each dot represents a single datapoint taken from a study; 7 studies were compiled to create the plot. In this figure, we are able to compare 2 major drug classes: avermectins and isoxazolines. The isoxazoline drug class had more available data across insect families, and it is clear that the LC50 value varies between genera: sandflies have more resistance to isoxazolines (especially fluralaner) than mosquito species. Amongst the avermectins, Anopheles spp. display the most consistent and relatively low LC50 values, whereas Aedes spp. display higher resistance to ivermectin compared to Anopheles spp.
Comparing Temporal Data
We first compared temporal data involving the different insecticides. In these data, “Days Post Feeding (Day of Blood Meal)” represents the number of days after the initial dosing of the vertebrate animals (for example, “Day 3” indicates mosquitos that fed on an animal 3 days after it was given the drug). We were able to create 2 figures comparing the efficacy of drugs over time at various doses, with doses represented by color. Figure 3 displays the efficacy over time of oral dosing of vertebrates with eprinomectin and ivermectin against Anopheles spp. Both of the drugs in this comparison were of the avermectin class, and neither displayed robust effects on mortality past the 15-day mark after single-dosing of vertebrates. Additionally, we observed great variance in the efficacy of ivermectin, even within dosages (that is, mortality varied within dosages between different studies). Unfortunately, mosquito genera outside of Anopheles did not provide enough data to compare efficacy over time.
Created in a similar fashion, Figure 4 displays the efficacy of oral dosing of vertebrates with fipronil and fluralaner. Here, two separate drug classes were tested for efficacy. While fluralaner (a member of the isoxazoline drug class) acts through a mode of action similar to that of the avermectins, it maintains efficacy over time in sandflies. Because mosquitos displayed more sensitivity to isoxazolines than sandflies (Figure 2), one may predict similar, if not more deadly, effects when isoxazolines are tested over time in mosquitos. The drug fipronil displays varying efficacy between doses. Unfortunately, the study involving fipronil did not collect data past 21 days of administration, but visual interpretation of the figure suggests that some of the dosages may have remained effective. Due to the limited number of studies investigating the efficacy of oral insecticides, we were not able to compare the efficacy of all drugs over time, as other studies used different methods of efficacy measurement.
Discussion
Based on the findings of our review of the currently available literature, oral insecticides certainly show promise as a method of disease vector management. As displayed in Figure 2, we are able to determine effective dosages for each drug as a concentration in blood. However, there was significant variance in the data between taxa and between drugs. The highest resistance was observed in Aedes spp. against ivermectin, which could be evidence of acquired resistance due to the common use of ivermectin in humans as an anthelmintic [15]. Another significant variance observed was the relative resistance of sandflies to isoxazolines, which required approximately twice the concentration or more compared to mosquito species [16]. It is unclear whether this effect carries over to other drugs, as no data are available.
Several analyzed studies were not included in the visual data analyses. These included studies investigating the efficacy of isoxazolines against the kissing bug and of nitisinone against the tsetse fly, a field study, and data from otherwise integrated studies measuring the effect of the drugs on arthropod fecundity [16-22]. The investigation by Loza et al. of isoxazolines against the kissing bug showed temporal results similar to those of the papers investigating isoxazolines against sandflies, visualized in Figure 4 [17]. The isoxazoline drug class, then, has been shown to be effective against 3 major disease vector families. Another drug class also shows promise: a study by Sterkel et al. proposes the use of nitisinone (traditionally used in the treatment of hereditary tyrosinemia type 1, a genetic disorder) as an insecticide dispensed to vertebrates, and investigates its efficacy against the African trypanosomiasis vector, the tsetse fly. This study highlights the importance of looking for alternative methods of vector control, turning a characteristic of a drug originally developed to treat human disease against disease vectors. The 2021 study found that blood concentrations above 0.5 micrograms per milliliter significantly reduced the survival of feeding tsetse flies, and it also characterized the drug’s pharmacokinetics in mouse models [18]. The pharmacokinetic data supplied by the mouse models may assist in later calculations of human dosage. No evidence is available on the effectiveness of nitisinone against other disease vectors.
Three studies supplied data on sublethal effects in adult arthropods, including effects on fecundity. These studies found significant effects on Anopheles spp. fecundity regardless of the vertebrate dosed, observed across multiple doses [20-22]. There are no currently available data on such effects for isoxazolines or phenylpyrazoles. Should similar effects be demonstrated, however, these drugs would show additional promise as vector management tools: when an insecticide exhibits both lethal and sublethal effects, particularly on fecundity, insects that survive the initial exposure produce fewer offspring than their unexposed peers.
In order for these methods to be effective in disease vector management, a considerable number of individuals in the population would need to participate to make a significant impact on the burden of vector-borne diseases [16, 19]. Mass Drug Administration (MDA) is expensive, and cost is a limiting factor in many of the areas where we hope to lower disease burden. Because of this, and because of issues related to the accessibility of MDA, it is important that drugs remain effective for an extended period of time. Fortunately, we found this to be the case. As evident in Figure 4, both isoxazoline drugs display extended efficacy on sandfly mortality more than 40 days after the initial vertebrate dosage. Additionally, fipronil may display a similar effect at the higher tested dosages, following the trajectory of the available data. Unfortunately, the literature on this subject is limited, so we are unable to say with certainty that these effects would carry over to mosquito species. Figure 3 suggests that the avermectin drug class does not have the same long-term effect on arthropod mortality: for both ivermectin and eprinomectin, overall mortality dropped below 50% just 14 days after the initial dosage. For this reason, the isoxazoline and phenylpyrazole drug classes may be more effective for MDA, although their testing for safety in humans is less extensive than ivermectin’s.
Additionally, there is the question of whether MDA should be dispensed to humans or livestock. The field study by Poché et al. applied previous findings to the field, dosing cattle in several tribes in Africa and visually measuring the effects on the density of mosquitos found in nearby homes. They observed that dosing livestock with fipronil reduced the “resting density” of mosquito species known to feed on both cattle and humans, but did not significantly affect the resting density of particularly anthropophilic species [19]. This study highlights the importance of tailoring an MDA to the specific species one wants to impact, by dosing the vertebrates from which that species is likely to take a blood meal.
When considering a drug for use in MDA, the safety of the drug must be thoroughly studied, and current findings are promising: across all of the studies analyzed, vertebrates suffered no severe adverse effects attributed to dosages that were effective against adult arthropods. This strongly suggests that these drugs are safe for use in IVM. Additionally, when considering MDA, taking a “One Health” approach will be key to success. Too often, non-governmental organizations have gone into regions with targeted endemic diseases and neglected to listen to native perspectives on previously used methods of disease control and on basic needs. While investigating the efficacy of these drugs is important to protecting communities against vector-borne disease, giving aid to impoverished communities must first address the baseline health of individuals at risk. Only then can we hope to earn the trust of native populations and continue to help them in sustainable ways. Continuing to thoroughly investigate the efficacy and safety of this sector of vector management before beginning any large implementation will also be essential.
Overall, the small number of studies performed to date makes clear that a substantial amount of research is still needed before oral insecticides are integrated into IVM, especially in humans. What we do know, though, gives promise in the face of mounting resistance to traditional pesticides.
References
- First Detection of Invasive Mosquitoes in Yolo County. Sacramento-Yolo Mosquito & Vector Control https://www.fightthebite.net/news-posts/first-detection-of-invasive-mosquitoes-in-yolo-county/ (2020).
- Caminade, C., McIntyre, K. M. & Jones, A. E. Impact of recent and future climate change on vector‐borne diseases. Ann N Y Acad Sci 1436, 157–173 (2019).
- World Health Organization. Global Malaria Programme & World Health Organization. Global plan for insecticide resistance management in malaria vectors. GPIRM (2012).
- Takken, W. & Knols, B. G. J. Malaria vector control: current and future strategies. Trends in Parasitology 25, 101–104 (2009).
- Beier, J. C. et al. Integrated vector management for malaria control. Malar J 7, S4 (2008).
- Wang, C. C. & Pong, S. S. Actions of avermectin B1a on GABA nerves. Prog Clin Biol Res 97, 373–395 (1982).
- Ratra, G. S. & Casida, J. E. GABA receptor subunit composition relative to insecticide potency and selectivity. Toxicology Letters 122, 215–222 (2001).
- Casida, J. E. & Durkin, K. A. Novel GABA receptor pesticide targets. Pesticide Biochemistry and Physiology 121, 22–30 (2015).
- ResearchRabbit, Version 2.0 Human Intelligence Technologies, Incorporated Available from: https://www.researchrabbit.ai/
- R Core Team (2013). R: A language and environment for statistical computing. R foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org/
- H. Wickham. ggplot2: Elegant Graphics for Data Analysis. Springer-Verlag New York, 2016.
- Hadley Wickham, Romain François, Lionel Henry and Kirill Müller (2021). dplyr: A Grammar of Data Manipulation. R package version 1.0.5. https://CRAN.R-project.org/package=dplyr
- Fanny Meyer and Victor Perrier (2021). esquisse: Explore and Visualize Your Data Interactively. R package version 1.0.1. https://CRAN.R-project.org/package=esquisse
- Adobe Inc. Adobe Illustrator [Internet]. 2019. Available from: https://adobe.com/products/illustrator
- Hadlett, M., Nagi, S. C., Sarkar, M., Paine, M. J. I. & Weetman, D. High concentrations of membrane-fed ivermectin are required for substantial lethal and sublethal impacts on Aedes aegypti. Parasit Vectors 14, 9 (2021).
- Miglianico, M. et al. Repurposing isoxazoline veterinary drugs for control of vector-borne human diseases. Proc Natl Acad Sci U S A 115, E6920–E6926 (2018).
- Loza, A. et al. Systemic insecticide treatment of the canine reservoir of Trypanosoma cruzi induces high levels of lethality in Triatoma infestans, a principal vector of Chagas disease. Parasit Vectors 10, 344 (2017).
- Sterkel, M. et al. Repurposing the orphan drug nitisinone to control the transmission of African trypanosomiasis. PLOS Biology 19, e3000796 (2021).
- Poché, R. M. et al. Preliminary efficacy investigations of oral fipronil against Anopheles arabiensis when administered to Zebu cattle (Bos indicus) under field conditions. Acta Trop 176, 126–133 (2017).
- Belay, A., Petros, B., Gebre-Michael, T. & Balkew, M. Effect of LongRangeTM eprinomectin on Anopheles arabiensis by feeding on calves treated with the drug. Malaria Journal 18, 332 (2019).
- Mekuriaw, W. et al. The effect of ivermectin® on fertility, fecundity and mortality of Anopheles arabiensis fed on treated men in Ethiopia. Malar J 18, 357 (2019).
- Pasay, C. J. et al. Treatment of pigs with endectocides as a complementary tool for combating malaria transmission by Anopheles farauti (s.s.) in Papua New Guinea. Parasites & Vectors 12, 124 (2019).
- Alout, H. et al. Evaluation of ivermectin mass drug administration for malaria transmission control across different West African environments. Malaria Journal 13, 417 (2014).
- Bongiorno, G. et al. A single oral dose of fluralaner (Bravecto®) in dogs rapidly kills 100% of blood-fed Phlebotomus perniciosus, a main visceral leishmaniasis vector, for at least 1 month after treatment. Med Vet Entomol 34, 240–243 (2020).
- Chaccour, C. J. et al. Targeting cattle for malaria elimination: marked reduction of Anopheles arabiensis survival for over six months using a slow-release ivermectin implant formulation. Parasites & Vectors 11, 287 (2018).
- Davey, R. B., Miller, J. A., George, J. E. & Klavons, J. A. Efficacy of a Single Doramectin Injection Against Adult Female Boophilus microplus (Acari: Ixodidae) in the Final Stages of Engorgement Before Detachment. Journal of Medical Entomology 44, 277–282 (2007).
- Foy, B. D. et al. Efficacy and risk of harms of repeat ivermectin mass drug administrations for control of malaria (RIMDAMAL): a cluster-randomised trial. The Lancet 393, 1517–1526 (2019).
- Fritz, M. L., Walker, E. D. & Miller, J. R. Lethal and sublethal effects of avermectin/milbemycin parasiticides on the African malaria vector, Anopheles arabiensis. J Med Entomol 49, 326–331 (2012).
- Gomez, S. A. et al. A randomized, blinded, controlled trial to assess sand fly mortality of fluralaner administered orally in dogs. Parasit Vectors 11, 627 (2018).
- Kobylinski, K. C. et al. The effect of oral anthelmintics on the survivorship and re-feeding frequency of anthropophilic mosquito disease vectors. Acta Trop 116, 119–126 (2010).
- Lozano-Fuentes, S. et al. Evaluation of a topical formulation of eprinomectin against Anopheles arabiensis when administered to Zebu cattle (Bos indicus) under field conditions. Malar J 15, 324 (2016).
- Ouédraogo, A. L. et al. Efficacy and safety of the mosquitocidal drug ivermectin to prevent malaria transmission after treatment: a double-blind, randomized, clinical trial. Clin Infect Dis 60, 357–365 (2015).
- Poché, R. M., Burruss, D., Polyakova, L., Poché, D. M. & Garlapati, R. B. Treatment of livestock with systemic insecticides for control of Anopheles arabiensis in western Kenya. Malaria Journal 14, 351 (2015).
- Poché, R. M., Garlapati, R., Singh, M. I. & Poché, D. M. Evaluation of fipronil oral dosing to cattle for control of adult and larval sand flies under controlled conditions. J Med Entomol 50, 833–837 (2013).
- Pooda, H. S. et al. Administration of ivermectin to peridomestic cattle: a promising approach to target the residual transmission of human malaria. Malaria Journal 14, 496 (2015).
- Smit, M. R. et al. Safety and mosquitocidal efficacy of high-dose ivermectin when co-administered with dihydroartemisinin-piperaquine in Kenyan adults with uncomplicated malaria (IVERMAL): a randomised, double-blind, placebo-controlled trial. Lancet Infect Dis 18, 615–626 (2018).
- Smit, M. R. et al. Pharmacokinetics-Pharmacodynamics of High-Dose Ivermectin with Dihydroartemisinin-Piperaquine on Mosquitocidal Activity and QT-Prolongation (IVERMAL). Clin Pharmacol Ther 105, 388–401 (2019).
COVID-19 survivors can retrain their smell to enjoy food and wine again
By Daniel Erenstein, Neurobiology, Physiology & Behavior ‘21
Author’s Note: Last spring, I enrolled in the inaugural offering of the University Writing Program’s wine writing course. Our instructor, Dr. Alison Bright, encouraged us to report on topics of personal interest through our news stories on the wine industry, viticulture, enology, and more. In this article, which was prepared for an audience of general science enthusiasts, I examine how biologists are making sense of a puzzling COVID-19 symptom — anosmia, or loss of smell — and what COVID-19 patients with this condition can do to overcome it. Eighteen months into this pandemic, scientists continue to study cases of COVID-19-related anosmia with dreams of a treatment on the horizon. I hope that readers feel inspired by this article to follow this in-progress scientific story. I extend my appreciation to Dr. Bright, who throughout the quarter shared approaches to rhetorical awareness that elevated my grasp of effective writing.
Image caption: Anton Ego, the “Grim Eater” from PIXAR’s Ratatouille, is reminded of his childhood by Remy’s rendition of ratatouille, a Provençal dish of stewed vegetables.
With a single bite of Remy’s latest culinary creation, the eyes of Anton Ego, a notoriously harsh food critic, dilate, and Ratatouille’s viewers are transported back in time with Monsieur Ego. The meal, a simple yet elegant serving of ratatouille accompanied by a glass of 1947 Château Cheval Blanc, has triggered a flashback to one singular moment: a home-cooked meal during his childhood. The universal charm of this enduring scene resonates because, in Ego’s reaction, many viewers recognize how our senses of smell and taste can shape a culinary experience.
Imagine how a real-life version of this scene might change for the millions of COVID-19 patients who have lost their sense of smell [1]. Anosmia, the phenomenon of smell loss, has become one of the more perplexing COVID-19 symptoms since first observed in patients during the earliest months of the pandemic [2].
What happens when we lose our sense of smell? During the pandemic, scientists have studied smell loss, which affects more than 85 percent of COVID-19 patients according to research published this year in the Journal of Internal Medicine [3]. In fact, anosmia is so common in COVID-19 patients that physicians were offered guidance for testing olfactory function as an indicator of infection last year [4].
To simplify studies of these complicated senses, taste and smell are often examined independently of one another, even though these senses are usually experienced simultaneously.
“Smell is just — it’s so crucial to taste, which means it’s really crucial to everything that I do,” said Tejal Rao, a New York Times food critic, in a March episode of The Daily [5]. “And it’s really difficult to cook without a sense of smell if you’re not used to it. You don’t know what’s going on. It’s almost like wearing a blindfold.”
Rao, who lost her sense of smell in mid-January after contracting COVID-19, began to search for answers to this mystery from scientists. Rao’s journey started with TikTok “miracle cures” and other aromatherapies — unfortunately, they were too good to be true — but she eventually discovered the work of Dr. Pamela Dalton, a scientist at the Monell Chemical Senses Center in Philadelphia [6]. At the center, Dalton studies the emotions that are triggered by our sense of smell [7].
During simple colds or viral infections, smell is normally affected when the molecules in food and other aromas are physically blocked off from chemoreceptors in our nose by congestion. Scientists have also cited Alzheimer’s and Parkinson’s diseases, head trauma, and chemotherapy as triggers for anosmia [8]. But a separate phenomenon was occurring in the case of COVID-19.
“COVID is different in that way, because most people who lost their sense of smell did so without having any nasal congestion whatsoever,” Dalton told Rao during an interview.
One study published in October of last year by Dr. Nicolas Meunier, a French neuroscientist, aimed to investigate how the SARS-CoV-2 virus, which causes COVID-19, may disrupt sustentacular cells [9]. These structural cells, which support the specialized neurons that transmit smell signals from the nose to the brain, express especially high levels of the ACE2 receptor that the virus hijacks to gain entry into our cells [10].
When Meunier and his team at Paris-Saclay University in France infected hamsters with the virus, tiny hair-like projections known as cilia on the surfaces of olfactory neurons began to peel back from sustentacular cells. This disruption is a possible explanation for the difficulties with smell that COVID-19 patients experience.
If it is true that damage to sustentacular cells causes anosmia, loss of smell is not an irreversible brain condition. In this case, the poor connection between incoming odors and brain networks that typically process these stimuli is at fault, not direct damage to the brain itself. The sudden onset of smell loss in COVID-19 patients supports this thinking.
“It was just like a light bulb got turned off or a switch got flicked to off,” Dalton said. “And one moment they could smell. And the next moment, nothing smelled.”
But because olfactory support cells regularly regenerate, this loss of smell is only temporary, which allows for retraining of our senses. Two months of smell training, also known as olfactory training, allowed Rao to regain her sense of smell.
Olfactory training gradually exposes patients to particularly strong smells. Spices such as cinnamon or cumin, for example, were perfect for Rao’s first smell training session [5], and AbScent, a British charity offering support to patients with anosmia, sells kits with rose, lemon, and eucalyptus [8]. Scientists have found that recurring exposure to these strong scents gives the brain time to recalibrate its networks, a feature known as neuroplasticity [11].
But “you don’t just go from hurt to healed overnight,” Rao said. “It’s more like adjusting and learning how to live in a new space. That’s really just the beginning.”
Our chemical senses have the power to satisfy, to inspire, even to cause our memory to reveal itself, as 20th-century French author Marcel Proust observed in his seven-volume novel À la recherche du temps perdu, or In Search of Lost Time. Researchers have even speculated that our sense of smell can facilitate learning in other sensory domains, including vision [12].
While scientists further investigate how coronavirus causes loss of smell, olfactory training can provide an avenue in the meantime for COVID-19 patients to recover this crucial sense. Indeed, many patients are “in search of lost time,” and smell training can help them to once again experience food and wine in its sensory entirety.
References:
- Allen J, Almukhtar S, Aufrichtig A, Barnard A, Bloch M, Cahalan S, Cai W, Calderone J, Collins K, Conlen M, et al. 2021. Coronavirus in the U.S.: Latest Map and Case Count. New York (NY): New York Times; [accessed 28 July 2021]. https://www.nytimes.com/interactive/2021/us/covid-cases.html.
- Symptoms of COVID-19. 2021. Atlanta (GA): Centers for Disease Control and Prevention, National Center for Immunization and Respiratory Diseases, Division of Viral Diseases; [accessed 28 July 2021]. https://www.cdc.gov/coronavirus/2019-ncov/symptoms-testing/symptoms.html.
- Lechien JR, Chiesa-Estomba CM, Beckers E, Mustin V, Ducarme M, Journe F, Marchant A, Jouffe L, Barillari MR, Cammaroto G, et al. 2021. Prevalence and 6-month recovery of olfactory dysfunction: a multicentre study of 1363 COVID-19 patients. J Intern Med. 290(2):451–461. https://doi.org/10.1111/joim.13209.
- Whitcroft KL, Hummel T. 2020. Olfactory Dysfunction in COVID-19: Diagnosis and Management. JAMA. 323(24):2512–2514. https://doi.org/10.1001/jama.2020.8391.
- Antolini T, Dorr W, Powell D, Schreppel C. 2021. A Food Critic Loses Her Sense of Smell. New York (NY): New York Times; [accessed 28 July 2021]. https://www.nytimes.com/2021/03/23/podcasts/the-daily/coronavirus-smell-food.html.
- Rao T. 2021. Will Fish Sauce and Charred Oranges Return the World Covid Took From Me? New York (NY): New York Times; [accessed 28 July 2021]. https://www.nytimes.com/2021/03/02/dining/covid-loss-of-smell.html.
- What COVID 19 is teaching us about the importance of smell, with Pamela Dalton, PhD. 17 Mar 2021, 33:31 minutes. American Psychological Association; [accessed 28 July 2021]. https://youtu.be/0pG_U13XDog.
- Schoch D. 2021. Distorted, Bizarre Food Smells Haunt Covid Survivors. New York (NY): New York Times; [accessed 28 July 2021]. https://www.nytimes.com/2021/06/15/health/covid-smells-food.html
- Bryche B, St Albin A, Murri S, Lacôte S, Pulido C, Ar Gouilh M, Lesellier S, Servat A, Wasniewski M, Picard-Meyer E, et al. 2020. Massive transient damage of the olfactory epithelium associated with infection of sustentacular cells by SARS-CoV-2 in golden Syrian hamsters. Brain Behav Immun. 89(2):579–586. https://doi.org/10.1016/j.bbi.2020.06.032.
- Brann DH, Tsukahara T, Weinreb C, Lipovsek M, Van den Berge K, Gong B, Chance R, Macaulay IC, Chou HJ, Fletcher RB, et al. 2020. Non-neuronal expression of SARS-CoV-2 entry genes in the olfactory system suggests mechanisms underlying COVID-19-associated anosmia. Sci Adv. 6(31): eabc5801.
- Kollndorfer K, Kowalczyk K, Hoche E, Mueller CA, Pollak M, Trattnig S, Schöpf V. 2014. Recovery of Olfactory Function Induces Neuroplasticity Effects in Patients with Smell Loss. Neural Plast. 1–7. https://doi.org/10.1155/2014/140419.
- Olofsson JK, Ekström I, Lindström J, Syrjänen E, Stigsdotter-Neely A, Nyberg L, Jonsson S, Larsson M. 2020. Smell-Based Memory Training: Evidence of Olfactory Learning and Transfer to the Visual Domain. Chem Senses. 45(7):593–600. https://doi.org/10.1093/chemse/bjaa049.