
Could Training the Nose Be the Solution to Strange Post COVID-19 Odors?

By Bethiel Dirar, Human Biology ’24

Author’s Note: I wrote this article as a New York Times-inspired piece in my UWP102B course, Writing in the Disciplines: Biological Sciences. Having chosen the topic of parosmia treatments as a writing focus for the class, I specifically discuss olfactory training in this article. In the midst of the pandemic, this condition caught my attention once I found out about it through social media. It had me wondering what it would be like to struggle to enjoy food post-COVID infection. I simply hope that readers learn something new from this article!

Ask someone who has had COVID-19 if they’ve had issues with their sense of smell, and they may very well say yes. According to the CDC, one of the most prevalent symptoms of the respiratory disease is loss of smell [1]. However, there is a lesser-understood nasal problem unfolding due to COVID-19: parosmia. Parosmia, as described by the University of Utah, is a condition in which typically pleasant or at least neutral-smelling foods become displeasing or repulsive to smell and taste [2].

As a result of this condition, the comforts and pleasures of having meals, snacks, and drinks disappear. Those who suffer from this condition have shared their experiences through TikTok. In one video that has amassed 9.1 million views, user @hannahbaked describes how parosmia has severely impacted her physical and mental health. She tearfully explains how water became disgusting to her, and discloses hair loss and a reliance on protein shakes as meal replacements. 

The good news, however, is that researchers have now identified a potential solution to this smelly situation, one that involves neither drugs nor invasive procedures: olfactory training.

A new study suggests that rehabilitation through olfactory training could allow patients with COVID-19-induced parosmia to return to enjoying their food and drink. Olfactory training is a therapy in which pleasant scents are administered nasally [3].

Modified olfactory training was explored in a 2022 study as a possible treatment for COVID-19-induced parosmia. Aytug Altundag, MD, and his co-authors recruited 75 COVID-19 patients with parosmia from Acibadem Taksim Hospital in Turkey and sorted them into two groups: one received modified olfactory training, while the other served as a control and received no olfactory training [3]. Modified olfactory training differs from classical olfactory training (COT) in that it expands the number of scents used beyond COT’s four: rose, eucalyptus, lemon, and clove [4]. These four scents were popularized in olfactory training because they represent different categories of odor (floral, resinous, fruity, and spicy, respectively) [5].
For 36 weeks, the treatment group was exposed twice a day to a total of 12 scents that are far from foul, with four scents administered in each 12-week period. For the first 12 weeks, they smelled eucalyptus, clove, lemon, and rose. During the next 12 weeks, the next set was administered: menthol, thyme, tangerine, and jasmine. For the last 12 weeks, they smelled green tea, bergamot, rosemary, and gardenia. Throughout the study, the subjects would smell a scent for 10 seconds, then wait 10 seconds before smelling the next one. The subjects completed the five-minute training sessions around breakfast time and bedtime [3].

To evaluate the results of the study, the researchers implemented a method known as the Sniffin’ Sticks test. This test combines an odor threshold test, an odor discrimination test, and an odor identification test to form a TDI (threshold, discrimination, identification) score. The higher the score, the more normal the state of an individual’s olfactory perception. A composite score between 30.3 and the maximum of 48 indicates normal olfactory function, while scores below 30.3 point to olfactory dysfunction [3].
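The TDI scoring logic can be illustrated with a short calculation. This is a minimal sketch, assuming each of the three subtests is scored out of 16 (so the composite tops out at 48) and using the 30.3 cutoff reported in the study; the function names are illustrative, not part of the Sniffin’ Sticks protocol itself:

```python
# Illustrative TDI (threshold, discrimination, identification) scoring.
# Assumes each subtest contributes 0-16 points, so the composite ranges
# up to 48; the 30.3 normosmia cutoff is the one cited in the study [3].

NORMAL_CUTOFF = 30.3

def tdi_score(threshold: float, discrimination: float, identification: float) -> float:
    """Sum the three subtest scores into a TDI composite."""
    for sub in (threshold, discrimination, identification):
        if not 0 <= sub <= 16:
            raise ValueError("each subtest score should fall between 0 and 16")
    return threshold + discrimination + identification

def classify(tdi: float) -> str:
    """Label olfactory function based on the 30.3 composite cutoff."""
    return "normal olfactory function" if tdi >= NORMAL_CUTOFF else "olfactory dysfunction"

# The study's 9-month averages, 27.9 (trained group) versus 14 (control),
# are both still below the cutoff, though the trained group is far closer.
print(classify(27.9))                   # olfactory dysfunction
print(classify(tdi_score(12, 11, 10)))  # 33 -> normal olfactory function
```

In other words, even the treated group had not, on average, crossed back into normosmia by month nine; the improvement lies in how much closer to the cutoff the training moved them.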

The results of this research are promising. By the ninth month of the study, a statistically significant difference in average TDI scores had been found between the group that received modified olfactory training and the control group (27.9 versus 14) [3]. This has led the researchers to believe that with prolonged periods of the therapy, olfactory training could soon become a proven treatment for COVID-19-induced parosmia. 

With this conclusion, there is greater hope now for those living with this smell distortion. Fifth Sense, a UK charity focusing on smell and taste disorders, has spotlighted stories emphasizing the need for effective treatments for parosmia. One member of the Fifth Sense community and sufferer of parosmia, 24-year-old Abbie, discussed the struggles of dealing with displeasing odors. “I ended up losing over a stone in weight very quickly because I was skipping meals, as trying to find food that I could eat became increasingly challenging,” she recounted to Fifth Sense [6].

If olfactory training becomes an established treatment option, eating and drinking might no longer be a battle for those with parosmia. Countless people suffering from the condition could finally see the improvement in quality of life they so desperately need, especially as COVID becomes endemic.

REFERENCES:

  1. Centers for Disease Control and Prevention. Symptoms of COVID-19. Accessed November 20, 2022. Available from: https://www.cdc.gov/coronavirus/2019-ncov/symptoms-testing/symptoms.html
  2. University of Utah Office of Public Affairs. Parosmia after COVID-19: What Is It and How Long Will It Last? Accessed November 20, 2022. Available from: https://healthcare.utah.edu/healthfeed/postings/2021/09/parosmia.php
  3. Altundag A, Yilmaz E, Kesimli MC. 2022. Modified Olfactory Training Is an Effective Treatment Method for COVID-19 Induced Parosmia. The Laryngoscope [Internet]. 132(7):1433-1438. doi:10.1002/lary.30101
  4. Yaylacı A, Azak E, Önal A, Aktürk DR, Karadenizli A. 2022. Effects of classical olfactory training in patients with COVID-19-related persistent loss of smell. Eur Arch Otorhinolaryngol [Internet]. 280(2):757-763. doi:10.1007/s00405-022-07570-w
  5. AbScent. Rose, lemon, clove and eucalyptus. Accessed February 5, 2023. Available from: https://abscent.org/insights-blog/blog/rose-lemon-clove-and-eucalyptus 
  6. Fifth Sense. Abbie’s Story: Parosmia Following COVID-19 and Tips to Manage It. Accessed November 23, 2022. Available from: https://www.fifthsense.org.uk/stories/abbies-story-covid-19-induced-parosmia-and-tips-to-manage-it/

Western Sandpiper Population Decline on the Pacific Coast of North America

By Emma Hui, Biological Sciences ‘26 

INTRODUCTION

The migration of Western Sandpipers from the high Arctic to Southern California has always been a treasured gem of the fall. Yet as the decades roll by, Western Sandpiper populations have continued to decline, and the rugged coastline of the Pacific Northwest seems lonelier than ever [1]. As a migratory bird species, the Western Sandpiper plays crucial ecological roles, serving as an indicator of ecosystem health and connecting diverse habitats across continents.

The purpose of this essay is to introduce the ongoing decline of Western Sandpiper populations in recent years, with a particular focus on the population decline in North America. This paper will provide an overview of Western Sandpiper migration and population changes, examine the potential causes behind the dynamics, and analyze the decline’s corresponding ecological effects. I will also explore possible remedies for the issue from the perspectives of habitat restoration, conservation, and legislative measures. The ultimate objective of this essay is to raise awareness and promote action for the ecological conservation of Western Sandpipers before it is too late.

Background

Western Sandpipers are small migratory birds that breed in the high Arctic regions of Alaska and Siberia and migrate south to the Pacific coast of North and South America for winter. Their migration covers 15,000 kilometers every year along the Pacific Flyway, which spans from Alaska to South America. During winter, their nonbreeding season, they move to coastal areas with mudflats, estuaries, and beaches, where they can rest and forage for food. In spring, the Western Sandpipers take a similar route in reverse, stopping at critical habitats along the way until they reach the treeless Arctic tundra. Once north, they breed in Northwestern Alaska and Eastern Siberia, where each female lays three to four eggs.

They measure 6 to 7 inches in length and have reddish brown-gold markings on their heads and wings. Their most salient features are their slender, pointed bills and long legs. The bills are adapted for foraging for crustaceans, insects, and mollusks in muddy areas, while their long, thin legs are used for wading in shallow water and sand. These small, darting birds can be seen in tidal areas, foraging in mudflats for invertebrates and biofilms at low and middle tides alongside other shorebird communities.

Having multiple species foraging together makes shorebirds among the most difficult birds to identify, especially with many species being quite similar in morphology as well as call. As they always smoothly blend into the community, it is not surprising that the population decline of the small Western Sandpipers went unnoticed at first and was reported only when changes in population levels became more obvious.

Causes of Western Sandpiper population decline

The decline in the Western Sandpiper population has been continuous throughout the past decade. According to the North American Breeding Bird Survey, which monitors populations of breeding birds across the continent, the Western Sandpiper had a relatively stable population trend in the United States from 1966 to 2015, with an annual population decline of 0.1% over this period [2]. In more recent years, a research team in British Columbia, Canada investigating estuary condition change noticed the decline in Western Sandpipers inhabiting the Fraser River estuary. Observing the Western Sandpiper population during northern migration on the Fraser River estuary, the team reported a 54% decline in Western Sandpipers over the entire study period, which ended in 2019 []. The negative trend in migrating Western Sandpipers across North America is consistent with this Fraser River study. A study using geolocator wetness data to detect periods of migratory flight examined the status and trends of 45 shorebird species in North America, including the Western Sandpiper. The authors found that the U.S. Western Sandpiper population declined by 37% from 1974 to 2014, with an estimated population of 2,450,000 individuals in 2014 compared to 3,900,000 in 1974 [3].

Currently, in the BirdLife International Data Zone, the Western Sandpiper is listed as “Least Concern” because of its wide range, but its population is decreasing. The species faces threats from habitat loss and degradation, pollution, and disturbance, particularly in its wintering and stopover sites along the North American Pacific coast. Habitat loss due to human activities, namely agricultural expansion and oil development, has contributed to the loss and degradation of the Western Sandpiper’s breeding, wintering, and stopover habitats [4]. The loss of these habitats has led to reductions in breeding success, migration stopover times, and overwintering survival. Meanwhile, Western Sandpipers are constantly exposed to various pollutants, including pesticides, heavy metals, oil spills, and plastics. These contaminants affect the Western Sandpiper’s health and reproductive success directly and also impact its prey and predators.

Climate change is also expected to affect the species in the future. One possible shift is in the timing, intensity, and distribution of precipitation. Precipitation shifts have already caused droughts and floods in breeding and stopover habitats for the Western Sandpiper and other shorebirds, leading to reduced breeding success and increased mortality. Climate change also affects sea level, temperature, and the frequency and severity of extreme weather, all of which can degrade the quality of breeding habitat and food availability for Western Sandpipers.

The interactions between these factors are complex and can lead to a feedback loop of negative impacts on the population. As habitat loss leads to reduced food resources, Western Sandpipers’ overall health is negatively impacted, making them even more vulnerable to pollutants and contaminants.

Effects of Western Sandpiper population decline

The decline of the Western Sandpiper population can have significant impacts on ecosystems. As a migratory shorebird, the Western Sandpiper’s ecological role lies in coastal environments; by preying on invertebrates along the shoreline, Western Sandpipers control their prey populations and help balance the ecosystem. A decline in Western Sandpipers can allow prey species such as polychaete worms and bivalves to increase, changing the composition of other species that prey on similar invertebrates and perturbing the ecosystem’s equilibrium. Furthermore, many predator species, such as falcons and owls, depend on the Western Sandpiper as a food source, and its decline will negatively impact these predators.

Aside from predator-prey dynamics, Western Sandpipers also forage alongside many other migratory shorebird species in muddy areas along the coast. These birds, such as the Marbled Godwit and the Red Knot, depend on the same stopover habitats as the Western Sandpiper during their own migrations and thus compete for similar resources. As the Western Sandpiper population declines, changing interspecies dynamics will shift the survival and reproductive success of other species, disturbing the equilibrium of the stopover ecosystems.

Western Sandpipers are popular among birdwatchers and nature enthusiasts, and their migration stopover sites in the Pacific Northwest and Alaska play an important role in ecotourism, with its associated economic and cultural value. Western Sandpiper ecotourism in the Copper River Delta, Alaska was estimated to generate over $1.5 million in revenue and 100 jobs [5].

Overall, the decline of the Western Sandpiper population can have complex and far-reaching impacts on both the ecosystem and human society. Through its interactions with native and migratory species in its natural habitats, the Western Sandpiper is deeply interwoven with the ecosystem.

Conservation efforts and solutions

Conservation efforts to protect and restore Western Sandpiper populations are critical to maintaining ecosystem health. One of the main strategies is to conserve their stopover sites and breeding grounds by monitoring and researching invasive species and coastal development. Aside from consistent restoration of habitats degraded by human disturbance, preventing further human development in Western Sandpiper habitats is also critical to maintaining habitat health.

Educating the public about the importance of Western Sandpipers and their habitats is a crucial aspect of raising awareness and gaining support for conservation efforts. Outreach events such as public lectures, bird festivals, and school tours are great opportunities to connect people with the avian community and improve public consciousness of ecosystem conservation. One example is the Monterey Bay Birding Festival, an annual festival held in California during the fall shorebird migration season, which promotes awareness of shorebirds through educational workshops and bird tours [6].

Currently, conservation efforts for shorebird populations face limitations in funding and coordination. Significant funding is required to restore what has been lost, but limited budgets restrict the scope and effectiveness of conservation approaches. In addition, since conservation efforts are implemented on a site-by-site basis, improved coordination among different agencies is needed to solve problems together. Potential solutions include implementing stronger avian and habitat conservation policies and encouraging sustainable tourism and outreach efforts.

CONCLUSION

The Western Sandpiper population in North American tidal areas has experienced a significant decline in recent years, largely due to human activities and the climate change they drive. Population changes in this small, long-legged shorebird affect the many species that interact and coexist with it in the coastal ecosystem. Western Sandpipers are one of the most abundant shorebird species in North America and play a vital part in the ecological and cultural life of the coast. Population dynamics vary from year to year and between populations, and increased effort in monitoring and conserving the Western Sandpiper community and its habitats is essential to ensuring the species’ survival. We need to investigate the causes behind the population’s recent decline and take action before the damage has gone too far and these ballerinas of the beach are unable to recover.

REFERENCES

[1] Andres, B., Smith, B. D., Morrison, R. I. G., Gratto-Trevor, C., Brown, S. C., Friis, C. A., … Paquet, J. (2013). Population estimates of North American shorebirds, 2012. Wader Study Group Bulletin, 119, 178-194.

[2] The Cornell Lab of Ornithology. (n.d.). Western Sandpiper Overview, All About Birds, Cornell Lab of Ornithology. Cornell University. https://www.allaboutbirds.org/guide/Western_Sandpiper/overview

[3] The Wader Study Group. (n.d.). Geolocator Wetness Data Accurately Detect Periods of Migratory Flight in Two Species of Shorebird. https://www.waderstudygroup.org/article/9619/

[4] Smith, B. D., Andres, B. A., & Morrison, R. I. G. (2017). Declines in shorebird populations in North America. Wader Study, 124(1), 1-11.

[5] Vogt, D. F., Hopey, M. E., Mayfield, G. R. III, Soehren, E. C., Lewis, L. M., Trent, J. A., & Rush, S. A. (2012). Stopover site fidelity by Tennessee warblers at a southern Appalachian high-elevation site. The Wilson Journal of Ornithology, 124(2), 366-370. https://doi.org/10.1676/11-107.1

[6] Cornell Lab of Ornithology. (2019, September 24). Monterey Bay Festival of Birds [Web log post]. All About Birds. https://www.allaboutbirds.org/news/event/monterey-bay-festival-of-birds/#

[7] Haig, S. M., Kaler, R. S. A., & Oyler-McCance, S. J. (2014). Causes of contemporary population declines in shorebirds. The Condor, 116(4), 672-681.

[8] Kallenberg, M. (2021). The 121st Christmas Bird Count in California. Audubon. https://www.audubon.org/news/the-121st-christmas-bird-count-california

[9] Reiter, P. (2001). Climate change and mosquito-borne disease. Environmental Health Perspectives, 109(1). https://doi.org/10.1289/ehp.01109s1141

[10] Sandpipers Go with the Flow: Correlations … (n.d.). Wiley Online Library. https://doi.org/10.1002/ece3.7240

[11] The Wader Study Group. (n.d.). Comparison of Shorebird Abundance and Foraging Rate Estimates from Footprints, Fecal Droppings, and Trail Cameras. https://www.waderstudygroup.org/article/13389/

[12] US Fish and Wildlife Service. (2022). Western Sandpiper (Calidris mauri). https://www.fws.gov/species/western-sandpiper-calidris-mauri

[13] Iwamura, T., Possingham, H. P., et al. (2013). Migratory connectivity magnifies the consequences of habitat loss from sea-level rise for shorebird populations. Proceedings of the Royal Society B, 280(1761), 20130325. https://doi.org/10.1098/rspb.2013.0325

Review of Literature: Use of Deep Learning for Cancer Detection in Endoscopy Procedures

By Nitya Lorber, Biology and Human Physiology ’23

Author’s Note: I think now more than ever, the reality of artificial intelligence is knocking on our doors. We are already seeing how the use of AI programs is becoming more and more normalized in our daily lives. AI is now driving our cars, talking to us through chatbots, and opening our phones with facial recognition. Frankly, I find it both incredible and intimidating to have an artificial, computerized program making decisions with the intent of modeling the reasoning capabilities of the human mind. As an aspiring oncologist, I was really interested to see how AI is being used in the healthcare system, specifically in the field of oncology. So when my biological sciences writing class asked me to write a literature review on a topic of my choice, it was a no-brainer – no AI needed. I hope that readers of this review come away with a sense of comfort that AI is being used to improve cancer detection and potentially save lives.

ABSTRACT

Deep learning is a rapidly developing branch of machine learning designed to emulate and extend aspects of human intellect [1]. With technological improvements and the development of state-of-the-art machine learning algorithms, the applications of deep learning in medicine, specifically in the field of oncology, are endless. Several facilities worldwide are training deep learning systems to recognize lesions, polyps, neoplasms, and other irregularities that may suggest the presence of various cancers. For colorectal cancers, deep learning can aid early detection during colonoscopies, increasing the adenoma detection rate (ADR) and decreasing the adenoma miss rate (AMR), both essential indicators of colonoscopy quality. For gastrointestinal cancers, deep learning systems such as ENDOANGEL, GRAIDS, and A-CNN can help with early detection, giving patients a higher chance of survival. Further research is required to evaluate how these programs will perform in a clinical setting as a potential secondary tool for diagnosis and treatment.

INTRODUCTION 

Artificial intelligence is the ability of a computer to execute functions generally linked to human intelligence, such as the ability to reason, find meaning, summarize information, or learn from experience [2]. Over the years, computing power has significantly improved, and this progress has provided several opportunities for machine learning applications in medicine [1]. Generally, deep learning in medicine uses machine learning models to search medical data and highlight pathways to improve patient health and well-being, most commonly through physician decision support and medical imaging analysis [3]. Machine intelligence collects data and identifies pixel-level features of microscopic structures that are easily overlooked or invisible to the naked eye [1, 4]. Deep learning is a subfield of machine learning that uses artificial neural networks to learn patterns and relationships in data. Its basic structure involves trained interconnected nodes, or “neurons,” organized into layers [1]. What sets deep learning apart from other types of machine learning is the depth of the neural network, which allows it to learn increasingly complex features and relationships in the data. The field of oncology has begun to incorporate deep learning into its cancer screenings by training systems to recognize lesions, polyps, neoplasms, and other irregularities that may suggest the presence of various cancers, including lung, breast, and skin cancers. In experimental trials, deep learning has shown its ability to aid in early cancer detection for a variety of cancers, specifically colorectal and gastrointestinal cancers, and although few studies show its performance in clinical settings, preliminary results illustrate promising applications for deep learning in revolutionizing oncology today.
The traditional approach to detecting colorectal and gastrointestinal cancers is through screening endoscopy procedures, which allow physicians to view internal structures [5-8]. Colonoscopies are a type of endoscopy that inserts a long flexible tube called the colonoscope into the rectum and large intestine to detect abnormalities, such as precancerous and cancerous lesions [7-9]. Advancing diagnostic sensitivity and accuracy of cancer detection through deep learning helps save lives by catching the disease before it progresses too far [1, 4].
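The layered-neuron structure described above can be made concrete with a toy example. The following is a hand-rolled NumPy sketch of a single convolutional-network forward pass (convolution, ReLU, max pooling, and a dense sigmoid output yielding a “lesion probability”); it is purely illustrative: the weights are random rather than learned, the 16x16 “image patch” is fake, and real CADe systems are far deeper and are built with training frameworks rather than by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Slide a small kernel over the image, producing a feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Zero out negative activations."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by keeping the maximum in each size x size block."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def lesion_probability(image, kernel, weights, bias):
    """Toy CNN forward pass: conv -> ReLU -> pool -> dense -> sigmoid."""
    features = max_pool(relu(conv2d(image, kernel))).ravel()
    logit = features @ weights + bias
    return 1.0 / (1.0 + np.exp(-logit))

# A fake 16x16 grayscale patch and random (untrained) parameters.
patch = rng.random((16, 16))
kernel = rng.standard_normal((3, 3))
n_features = max_pool(relu(conv2d(patch, kernel))).size
weights = rng.standard_normal(n_features) * 0.1
prob = lesion_probability(patch, kernel, weights, 0.0)
print(f"lesion probability (untrained): {prob:.3f}")  # some value strictly between 0 and 1
```

In a trained system, the kernel and dense weights would be learned from labeled endoscopy images, and many such conv/pool stages would be stacked; it is that stacking of layers, the “depth,” that lets the network build complex lesion features out of simple pixel-level ones.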

DETECTION OF COLORECTAL CANCERS 

Colorectal cancers (CRC), cancers of the colon and rectum, carry the second-highest cancer death rate for men and women worldwide [5]. Frequent colonoscopy and polypectomy screening can reduce the occurrence of and mortality from CRC by up to 68% [5, 7]. However, several significant factors determine colonoscopy quality: the number of polyps and adenomas found during colonoscopy, procedural factors such as bowel preparation, morphological characteristics of the lesion, and, most importantly, the endoscopist [5-8]. Endoscopist performance can vary with several factors, including level of training, technical and cognitive skills, knowledge, and years of experience inspecting the colorectal mucosa to recognize polypoid (elevated) and non-polypoid (non-elevated) lesions [6, 7].

The most essential and reliable performance indicator for individual endoscopists is their adenoma detection rate (ADR) [5, 6]. ADR is the percentage of average-risk screening colonoscopies in which one or more adenomatous colorectal lesions are found, quantifying the endoscopist’s sensitivity for detecting CRC neoplasia [5, 7]. ADR is inversely related to the incidence and mortality of CRC after routine colonoscopies [5-7]. Another performance indicator commonly used to investigate differences between endoscopists or technologies is the adenoma miss rate (AMR), calculated from tandem colonoscopies on the same subject as the proportion of lesions missed in the first procedure but found in the second [7]. The issue with the current approach to detecting CRC is the variability in performance, leading to widely diverse ADRs and AMRs among endoscopists. This variability often results in missed polyps and overlooked adenomatous lesions, which can have serious consequences [5-8].
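Both quality indicators reduce to simple proportions. Here is a minimal sketch with hypothetical counts (the numbers and variable names are illustrative, not drawn from the cited studies): ADR is the fraction of screening colonoscopies finding at least one adenoma, while AMR comes from tandem (back-to-back) colonoscopies as the fraction of all adenomas that the first pass missed.

```python
def adenoma_detection_rate(adenoma_counts):
    """ADR: share of screening colonoscopies finding >= 1 adenoma.

    adenoma_counts -- one entry per colonoscopy, giving the number of
    adenomatous lesions found in that procedure.
    """
    positives = sum(1 for n in adenoma_counts if n >= 1)
    return positives / len(adenoma_counts)

def adenoma_miss_rate(found_first_pass, found_second_pass):
    """AMR from a tandem study: adenomas missed on the first pass,
    as a fraction of all adenomas found across both passes."""
    total = found_first_pass + found_second_pass
    return found_second_pass / total

# Hypothetical screening program: 8 of 20 colonoscopies found an adenoma.
counts = [0, 1, 0, 0, 2, 0, 1, 0, 3, 0, 0, 1, 0, 0, 1, 0, 2, 0, 1, 0]
print(f"ADR = {adenoma_detection_rate(counts):.0%}")  # ADR = 40%

# Hypothetical tandem study: 45 adenomas found on the first pass, 15 more on the second.
print(f"AMR = {adenoma_miss_rate(45, 15):.0%}")       # AMR = 25%
```

The inverse relationship cited above means that pushing the first number up (and the second down) is exactly what a CADe assistant is meant to accomplish.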

DEEP LEARNING IN COLONOSCOPIES

Deep learning offers a possible solution to the problem of endoscopist performance variability. It could provide a standardized approach to colonoscopy imaging that would help eliminate inaccuracies generated by endoscopists who are distracted, exhausted, or less experienced [6, 8]. Over the past few years, several studies have analyzed deep learning’s impact on endoscopy quality (i.e., ADR and AMR) and its role in reducing the rate of CRCs. Convolutional neural networks (CNNs) excel at image analysis tasks, including finding and categorizing lesions [5]. Another experimental approach involves developing a computer-aided detection (CADe) system using an original CNN-based algorithm to assist endoscopists in detecting colorectal lesions during colonoscopy [7]. Overall, deep learning systems can improve endoscopy quality and possibly reduce the CRC death rate by increasing ADR and polyp detection rates in the general population [5-8].

Evidence that deep learning can increase ADR has prompted several subsequent studies on how this technology may impact the current system. For instance, it was not previously known how the increase in ADR from deep learning relates to physician experience. In trying to determine this relationship, Repici A, et al. (2022) discovered that both experienced and non-experienced endoscopists displayed a similar ADR increase during routine colonoscopies with CADe assistance compared to those without it [6]. Surprisingly, this study concluded that deep learning was a significant factor in the ADR score, while the endoscopist’s level of experience was not [6]. Along with increasing ADR, Kamba S, et al. (2021) explored how deep learning would impact AMR and found a reduced AMR in colonoscopies conducted with CADe assistance compared to standard colonoscopies [7]. This study further supported the conclusions of Repici A, et al.: endoscopists of all experience levels using CADe will benefit from reduced AMR and increased ADR [6, 7].

Moreover, deep learning is exceptionally good at detecting flat lesions, which are often overlooked by endoscopists [6-8]. In evaluating deep learning for detecting Lynch Syndrome (LS), the most common hereditary CRC syndrome, Hüneburg R, et al. found a higher detection rate of flat adenomas using deep learning compared to High-Definition White-Light Endoscopy (HD-WLE), a standard protocol commonly used to examine polyps [8]. However, unlike in other studies, overall ADR was not significantly different between the deep learning and HD-WLE groups, most likely due to the study’s small sample size and exploratory nature [8]. This study was not the only one to observe no significant increase in ADR. Zippelius C, et al. (2022) sought to assess the accuracy and diagnostic performance of a commercially available deep learning system, the GI Genius system, in real-time colonoscopy [5]. Although the GI Genius system performs well in daily clinical practice and could well reduce performance variability and increase overall ADR among less experienced endoscopists [8], it performed no better than expert endoscopists [5]. Overall, deep learning proved superior or equal to standard colonoscopy performance, but never worse [5-8].

DETECTION OF UPPER GASTROINTESTINAL CANCERS

Upper gastrointestinal cancers, including esophageal and gastric cancer, are among the most common malignancies and causes of cancer-related deaths worldwide [4, 10, 11]. Of these, gastric cancer is the fifth most common form of cancer and the third leading cause of cancer-related deaths worldwide, with approximately 730,000 deaths each year [10, 11]. Most upper gastrointestinal cancers are diagnosed at late stages because their signs and symptoms go unnoticed or are too general to produce a correct diagnosis [10]. On the other hand, if these cancers are detected early, the 5-year survival rate can exceed 90% [10, 11]. To diagnose gastrointestinal cancers, endoscopists conduct esophagogastroduodenoscopy (EGD) procedures, examining upper gastrointestinal lesions to find early gastric cancer (EGC) [4, 11]. However, as with colonoscopies, endoscopists require long-term specialized training and experience to accurately detect the difficult-to-see EGC lesions with EGD [4, 11]. EGD quality varies significantly with endoscopist performance and consequently impacts patient health [4, 10, 11]. Because of the subjective, operator-dependent nature of endoscopic diagnosis, many patients are at risk of leaving their examinations with undetected upper gastrointestinal cancers, especially in less developed, remote regions [10]. Rates of undetected upper gastrointestinal cancers run as high as 25.8%, and 73% of these cases result from endoscopists’ mistakes, such as failing to detect a lesion or mischaracterizing a lesion as benign during biopsy [11]. There is a dire need for improved endoscopy quality and reliability, as current examinations rely too heavily on endoscopist knowledge and experience, creating too great a variable in EGC detection [10, 11].

DEEP LEARNING IN ENDOSCOPIES

Deep learning systems may effectively monitor blind spots during EGDs, but very little research on deep learning applications in upper gastrointestinal cancers was conducted before 2019 [4, 11]. Previously, deep learning had mainly been used to distinguish between neoplastic, or monoclonal, and non-neoplastic, or polyclonal, lesions [10, 11]. However, CNNs were not among the algorithms studied, and the systems examined at the time could not sufficiently distinguish between malignant and benign lesions [10, 11]. The first functional deep learning system to specifically detect gastric cancer was the 2019 “original convolutional neural network” (O-CNN), but its low statistical precision rendered it unviable for clinical practice [11]. This gap in research led to the development of three deep learning systems for detecting and diagnosing upper gastrointestinal cancers, in hopes of catching the disease in its early stages: GRAIDS, ENDOANGEL, and A-CNN. 
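As background for the CNN-based systems discussed in this review, the core operation of a convolutional neural network can be sketched in a few lines: a small filter slides across the image, producing a feature map that highlights local patterns such as edges. This is an illustrative, minimal sketch only; the clinical systems above use large trained networks with many such layers.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as in most
    deep learning frameworks) of a grayscale image with a small kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

def relu(feature_map):
    """Elementwise ReLU nonlinearity applied after the convolution."""
    return [[max(0.0, v) for v in row] for row in feature_map]

# A vertical-edge kernel responds strongly where pixel intensity changes,
# loosely analogous to how early CNN layers pick out lesion boundaries.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [[-1, 1], [-1, 1]]
feature_map = relu(conv2d(image, edge_kernel))
print(feature_map)  # each row is [0.0, 2.0, 0.0]: response only at the edge
```

In a trained network the kernel values are learned from labeled endoscopy images rather than hand-chosen, and many stacked layers build up from edges to lesion-level features.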

The first deep learning system developed and validated was the Gastrointestinal Artificial Intelligence Diagnostic System (GRAIDS), a deep learning semantic segmentation model capable of providing the first real-time automated detection of upper gastrointestinal cancers [10]. Luo H, et al. (2019) trained GRAIDS to detect suspicious lesions during endoscopic examination using over one million endoscopy images from six hospitals of varying experience levels across China [10]. GRAIDS provides real-time assistance for diagnosing upper gastrointestinal cancers during endoscopies as well as retrospective assessment of the images [10]. Luo H, et al. (2019) found that GRAIDS could detect upper gastrointestinal cancers both retrospectively and in a prospective observational setting with high accuracy and specificity [10]. GRAIDS’s sensitivity is similar to that of expert endoscopists. However, GRAIDS cannot recognize some gastric contours delineated by experts, leading to an increased risk of false positives and suggesting that the system is most effective as a secondary tool [10]. GRAIDS is seen as a cost-effective method for early cancer detection that can help endoscopists of every experience level [10]. 

The second deep learning diagnostic system is the Advanced Convolutional Neural Network (A-CNN), an upgraded version of O-CNN developed by Namikawa K, et al. (2020) [11]. Improving upon its predecessor, the A-CNN successfully distinguished gastric cancers from gastric ulcers with high accuracy, sensitivity, and specificity [11]. This is an essential improvement because gastric ulcers are often mistaken for cancer, leading to unnecessary cancer treatments. The A-CNN can now help endoscopists make early diagnoses, improving gastric cancer survival rates [11]. In addition, the program helps standardize the endoscopy approach, reducing some of the variability in endoscopist performance [11].

The third deep learning system is ENDOANGEL, developed by Wu L, et al. (2021). Like A-CNN, ENDOANGEL is an upgrade of an older CNN-derived algorithm called WISENSE [4]. Before the update, WISENSE demonstrated the ability to monitor blind spots and create photodocumentation in real time during EGD [4]. Compared to WISENSE, ENDOANGEL achieved real-time monitoring during EGD with fewer endoscopic blind spots, a longer inspection time, and EGC detection with high accuracy, sensitivity, and specificity [4]. The program shows potential for detecting EGC in real clinical settings [4]. 

FUTURE IMPROVEMENT IN DEEP LEARNING DEVELOPMENT

Because deep learning is a relatively new technology, much of the available research is prospective. These studies attempt to determine whether deep learning is a viable approach to reducing endoscopist performance variability, but most require further research to show how the technology will perform in a clinical setting. For example, most studies involved deep learning systems that were not commercially available and were conducted in highly specialized centers, so they cannot indicate how deep learning will perform for lesion detection in daily clinical practice across different populations around the world [4, 6-7, 10-11]. Additionally, studies need larger patient sample sizes before their results can be generalized [7, 8]. Lastly, researchers should still consider endoscopist performance in their trials to explore every option and to ensure that each patient receives the same standard of care regardless of their physician or that physician’s acceptance of deep learning technology [4, 5, 8]. These preliminary studies show potential, but the systems need further improvement and research before they can be used as standalone options [4, 10-11].

CONCLUSION

Overall, deep learning has demonstrated an impressive ability to detect colorectal and gastrointestinal cancers in experimental trial settings. It offers a more standardized approach to colonoscopies and endoscopies that may help deliver efficient screenings for every patient, regardless of their endoscopist. In colorectal cancer, studies have shown increased ADR and decreased AMR using machine learning. In gastrointestinal studies, deep learning has detected cancer as well as expert endoscopists. Despite these advances, neural networks can only partially solve the cancer detection problem at hand: even if they improve the overall accuracy and sensitivity of cancer screenings, those gains are moot if patients do not undergo their recommended screenings at the recommended times. At the moment, human intervention is still required, in conjunction with deep learning support, to give patients their most accurate results. How deep learning will perform in clinical settings as a secondary tool, locally and globally, remains to be fully understood. However, the preliminary studies discussed in this review illustrate promising results for future deep learning applications in revolutionizing oncology.

Willowbrook’s Hepatitis Study

By J Capone, Agriculture and Environmental Education Major, ’24

The Willowbrook State School was a residential institution created by New York State in 1947 to house intellectually disabled children and young adults. At the time, there were few to no public resources for caregivers, and state schools like Willowbrook were created to address that problem. Conditions at Willowbrook were horrific: rampant disease, neglect, and abuse meant most residents lived bleak lives on the grounds of the institution. Robert Kennedy, then a senator from New York, described it as a “snake pit” after his unplanned visit in 1965 exposed the conditions there. Dr. Saul Krugman, a professor of epidemiology, was brought to Willowbrook in 1955 to control and mitigate the infectious diseases that spread like wildfire through its halls. However, Krugman did not simply work to prevent the spread of disease; in some cases, he explicitly encouraged it among residents so he could test his theories about possible treatments. Thus, the Willowbrook Study was born. 

Built on Staten Island to house 4,000 patients, the institution quickly swelled to 6,000 residents, peaking at 6,200 in 1969 [10]. There were often not enough resources, including basic clothing and staff, to go around. Conditions at Willowbrook were grim: some 60% of patients were not toilet trained, and others could not feed or bathe themselves [9]. Abuse and neglect ran rampant, along with infectious diseases fueled by the unsanitary conditions. Facing these problems, the institution’s directors hired Dr. Saul Krugman of the NYU School of Medicine in 1955 to deal with the endemic diseases afflicting Willowbrook’s patients. Dr. Krugman began with an epidemiological survey to gauge the extent of the problem. The survey found that children had a 1 in 2 chance of catching hepatitis in their first year at Willowbrook [6]. Further surveys showed that 90% of patients had markers for hepatitis A, indicating previous infection [6]. Hepatitis A is a disease of the liver that causes jaundice, loss of appetite, and abdominal pain, and it usually spreads via fecal-oral contamination. Dr. Krugman’s studies also identified the hepatitis B strain, a sexually transmitted disease present in both the adult and child populations at Willowbrook. In an institution where over half the population was not toilet trained and caretakers were in short supply, it is no mystery how hepatitis became such an intense, entrenched problem in Willowbrook’s halls [3]. 

Faced with these grim conditions, Dr. Krugman came to a conclusion: discovering a way to inoculate children against hepatitis would help not only those institutionalized at Willowbrook but people around the world. In short, he wanted to use the conditions already established at Willowbrook to develop a hepatitis vaccine. After gathering consent from parents, Dr. Krugman and his team ran several experiments to observe the benefits of injecting gamma globulin, an antibody-containing blood plasma fraction from people who had recovered from hepatitis, into those who had not yet been infected [1]. Some experiments exposed children to hepatitis with no globulin injections; in others, children were injected and then exposed to the virus; and in still others, children were injected and never exposed, all carefully observed by the medical team. One way the study exposed children to the hepatitis virus was by mixing the feces of infected residents into chocolate milk, which participants consumed unwittingly [10]. The children in these studies were housed separately from the rest of the patients, in newer, cleaner facilities with round-the-clock care, while the other residents of Willowbrook still lived in squalor [6]. Krugman’s work later expanded to include measles and rubella studies, resulting in a stay of more than 20 years as a medical director at Willowbrook. Only much later in his career did he receive backlash from the medical community over his study participants and methods.

Krugman claimed he did not choose Willowbrook so he could prey on vulnerable disabled children. Instead, he argued that his experiments helped the children at Willowbrook, and he defended their morality and ethics until the day he died. He argued that hepatitis was already rampant at Willowbrook; anyone who came to live there was bound to contract it at some point. Under his lab’s controlled experiments, Dr. Krugman argued, the patients received better care and had a better chance of survival than in the main facilities at Willowbrook [6]. Study participants had the opportunity to become immune to hepatitis without ever falling seriously ill, and even if they did fall ill, they had access to excellent care from his team of doctors and nurses. However, Krugman missed a key component of any study’s ethical backing. Purposefully infecting children with hepatitis, no matter how “likely” they were to contract it in the future, put them needlessly in harm’s way. No matter how well cared for the patients were, hepatitis is a potentially fatal disease. Additionally, infecting subjects through feces-contaminated food and drink is a deplorable method that hints at possibly worse conditions within the Willowbrook study. Nobody, no matter what they have consented to, should unwittingly drink human excrement. This egregious violation of moral and ethical standards, even by the professional standards of the 1950s and 60s, shows how poorly these patients were treated. Every physician takes an oath to first do no harm; Dr. Krugman violated it in his treatment of the intellectually disabled children at Willowbrook State School. 

Other issues with the Willowbrook Study concerned its methods of informed consent. On paper, everything seemed fine. Prospective participants’ parents were given a tour of the facility, allowed to ask the researchers questions, and educated on the purpose of the study. They met with a social worker and could discuss the study with their private doctors if they wished. Parents knew they could withdraw from the study at any time, and no orphan or ward of the state was allowed to become a participant in the hepatitis studies. Dr. Krugman also asserted that his methods of informed consent were innovative and paved the way for later standards in human experimentation. However, there is evidence that the undue influence of admittance to Willowbrook was a further chink in Krugman’s ethical armor. Willowbrook was already overcrowded, yet it was one of the only state institutions for intellectually disabled children operating in New York State. Parents believed that the only way to get their child into the selective school was to allow them to participate in the hepatitis study, which granted priority registration. Otherwise, these caregivers had practically no options: state institutions were some of the only interventions available at the time, and Willowbrook already had a huge waitlist while operating at 1.5 times capacity. Although sending one’s child to an overpopulated institution may not seem like the best option, sometimes it was all these families had to provide care for their loved ones. This undue influence further establishes how unethical Dr. Krugman’s study was.

What brought down a 20-year study? Geraldo Rivera, an up-and-coming investigative reporter, made a documentary, aired on ABC in 1972, exposing the horrible abuse and neglect endured by the people at Willowbrook State School. Willowbrook: The Last Great Disgrace sparked a call to arms from the people of Staten Island, horrified that such an atrocity could occur on their very shores. A class-action lawsuit was filed that same year, with a final ruling that Willowbrook had to begin closing procedures in 1975. Around this time, other members of the medical field began criticizing Dr. Krugman’s studies and questioning the necessity of human experimentation for immunization studies. Dr. Krugman’s later work developing hepatitis B vaccines using chimpanzees created further conflict in the medical community: chimpanzees are considered an acceptable model for human vaccine-development studies, yet Krugman had opted for human trials before primates. Even after the study at Willowbrook ended, however, Dr. Krugman was still lauded for his advancements in epidemiology, not only for hepatitis but also for the measles and rubella vaccine work later in his time at Willowbrook. Until the very end, Krugman defended his choices and decisions regarding the treatments and methods used at Willowbrook State School. 

Public outrage over the conditions at Willowbrook spurred several laws and acts over the 1970s and 80s. After the 1972 class-action lawsuit was settled in 1975, the Willowbrook Consent Decree was signed, requiring the institution to reduce its population from around 5,000 to 250 within six years, among other reforms to the treatment of patients at the facility. Other legislation, such as the Education for All Handicapped Children Act (1975) and the Developmental Disabilities Assistance and Bill of Rights Act (1975), worked to ensure disabled populations were protected in society; the latter also established the Protection and Advocacy System to further preserve the rights of disabled individuals. The Belmont Report, published in 1979, likewise established ethical standards for human experimentation built on three principles: Respect for Persons, Beneficence, and Justice. 

The tragedy of Willowbrook State School is a permanent mark on the scientific community’s record of mistreating human research participants. The unacceptable treatment and conditions that institutionalized children and adults were forced to endure were a disgrace to scientific research. While many scientific discoveries resulted from this study, including the creation of hepatitis A and B vaccines, the ends never justify the means in human experimentation. 

A Warmer World Leading to a Health Decline

By Abigail Lin, Biological Sciences.

INTRODUCTION

Rising temperatures due to global climate change have several detrimental impacts on the world around us. This paper analyzes the consequences of climate change, specifically temperature changes, within California. The livelihoods of farmers and fishers, the distribution of disease, and fire intensity are examples of how California is affected by this crisis. Climate change is especially visible in California because the state dominates the nation’s fruit and nut production, two water-intensive crop categories. The state’s reliance on large quantities of water to fuel its agricultural system makes it particularly susceptible to drought. The proliferation of detrimental disease vectors, loss of beneficial crops, and elevated dryness point to a complex interaction between California ecosystems and climate change. 

Crops

Many farmers and agricultural workers in California are affected by the changing climate, as the state is a major agricultural hotspot. California produces two-thirds of the nation’s fruits and over one-third of its vegetables [1]. Crops such as apricots, peaches, plums, and walnuts are projected to be unable to grow in 90% or more of the Central Valley by the end of the century because of the increase in disease, pests, and weeds that accompanies rising temperatures [1]. 

Figure 1. Projection of crop failure by the end of the century. Heat increases diseases, pests, and weeds. Plum, apricot, peach and walnut crops will be unable to grow in 90% of Central Valley as a result.

Crop yields drop significantly when heat-sensitive plants are not grown in sufficiently cool conditions. Fruits and nuts require chill hours, hours when the temperature is between 32 and 45 degrees Fahrenheit, to ensure adequate reproduction and development [2]. With increasing temperatures, however, crops are receiving fewer chill hours during the winter. California grows 98% of the country’s pistachios, but changes in chill hours have affected fertilization [3]. A study found that pistachios need 700 chill hours each winter, yet there have been fewer than 500 chill hours over the past four years combined [1]. As a result, in 2015, 70% of pistachio shells were missing the kernel (the edible part of the nut) that should have been inside [3]. 
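Counting chill hours is simple bookkeeping over an hourly temperature record, which makes the effect of a uniformly warmer winter easy to see. The temperature series below is hypothetical and purely illustrative:

```python
def chill_hours(hourly_temps_f, low=32.0, high=45.0):
    """Count chill hours: hours with air temperature between 32 and
    45 degrees Fahrenheit, the range cited for fruit and nut crops [2]."""
    return sum(1 for t in hourly_temps_f if low <= t <= high)

# Hypothetical 90-day winter: 12 cool night hours at 38 F and 12 warm
# day hours at 55 F per day. Only the night hours count as chill hours.
winter = ([38.0] * 12 + [55.0] * 12) * 90
# The same winter shifted 9 F warmer pushes nights above the 45 F
# ceiling, so the chill-hour count collapses to zero.
warmer = [t + 9.0 for t in winter]
print(chill_hours(winter), chill_hours(warmer))  # 1080 0
```

The all-or-nothing collapse in this toy example is exaggerated, but it illustrates why modest warming can cut a winter's chill-hour total well below the roughly 700 hours pistachios need.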

Repeated crop failures have also left farmers mentally taxed. Evidence suggests that suicide rates among farmers are already rising in response to the farm debt that accumulates after poor crop yields [4]. Climate change threatens not only people’s financial well-being but also their mental health. Mental stress will likely rise as climates warm around the world, causing economic loss and upending agricultural careers. 

Crab Fisheries

Crab fisheries and fishers in California are also negatively affected by rising temperatures. Warming oceans have fueled uncontrolled growth of algal blooms, which contaminate crab meat with domoic acid, a potent neurotoxin that causes seizures and memory loss [5]. The spread of this toxin has forced many fisheries to close. During the 2015–2016 season, California fishers lost over half of their regular crab catch and qualified for more than 25 million dollars in federal disaster relief [5]. In response to the financial loss, fishers adapted by catching seafood species other than crab, moving to locations where algal blooms had not contaminated their catch, or, in the worst case, stopping fishing altogether [5]. California crab fishers’ careers have already been dramatically altered by global warming, and algal blooms will only become more frequent if warming continues. 

Disease

Temperature plays a major role in the prevalence of infectious diseases because it increases the activity, growth, development, and reproduction of disease vectors: living organisms that carry infectious agents and transmit them to other organisms. Warm, humid climates are predicted to allow bacteria and viruses, mosquitoes, flies, and rats (all common disease vectors) to thrive [6]. Most animal disease vectors are r-selected, meaning they invest little in individual offspring but produce many. Warm temperatures allow r-selected species to grow quickly and reproduce often. However, warm temperatures also speed up biochemical reactions and place heavy energy demands on an organism’s metabolism [7]. In response, ectothermic disease vectors, organisms that rely on external heat sources to control body temperature, have successfully adapted to changing temperatures. These organisms thermoregulate, carrying out behaviors that maintain body temperature [7]. Behavioral thermoregulation has shifted the geographical distribution of infectious diseases as disease vectors move to the warm environments they favor [7]. 

Initial models of disease distribution and prevalence suggested a net increase in the geographical range of diseases, while more recent models suggest a shift in distribution instead [7]. Recent models recognize that vector species have upper and lower temperature limits that constrain disease distribution [7]. It is estimated that by 2050 there will be 23 million more cases of malaria at higher latitudes, where infections were previously nonexistent, but 25 million fewer cases at lower latitudes, where malaria previously spread rapidly through populations, because the conditions necessary for malaria transmission will shift [7]. 

Figure 2. Shift of malaria disease distribution by 2050. Higher latitudes will have 23 million more cases of malaria, while lower latitudes will have 25 million fewer. Although habitat suitability changes, there is little net change in malaria cases. 

Cases of coccidioidomycosis (Valley fever), an infectious disease contracted by inhaling Coccidioides fungal spores, have recently reached record highs in California [8]. Valley fever is especially prevalent in areas with fluctuating climates that vacillate between extreme drought and high precipitation [8]. After studying 81,000 cases collected over 20 years, researchers identified a causal relationship between major droughts and increasing coccidioidomycosis transmission rates [8]. Initially, drought suppresses disease transmission because it prevents proliferation of the Coccidioides fungi. However, transmission rebounds in the years following a drought because competing bacteria die off in the high heat [8]. Fungi have a number of traits that make them more tolerant of drought than bacteria, including osmolytes for maintaining cell volume, thick cell walls that mitigate water loss, melanin that aids in thermoregulation, and hyphae that extend through the soil to forage for water [9]. Disease spikes are seen after droughts: the wet season between 2016 and 2017, for example, saw about 2,500 more cases of Valley fever than the previous year [8]. 

The role of rising temperatures in increasing Valley fever cases is evident in Kern County, one of the hottest and driest regions of California. Kern County has the highest Valley fever incidence rate in the state, owing to its drought-like conditions: 3,390 cases occurred during a 47-month drought from 2012 to 2016 [8]. As climate change pushes areas of California that are usually cool and wet year-round into alternating dry and wet conditions, Valley fever cases are projected to increase. 

Fires

Climate change is also associated with more intense fire seasons. The Western United States experienced three years of massive wildfires from 2020 to 2022, with each year burning more than 1.2 million acres [10]. The ongoing drought has led to an accumulation of dry trees, shrubs, and grasses [10]. A 2016 study found that this buildup of dry organic plant material has more than doubled the number of large fires in the Western United States since 1984 [10]. One way that dry matter may ignite is lightning: projections show that by 2060, the area burned by lightning-ignited wildfires will increase 30% compared to 2011 [10]. 

Residents of California are in danger of losing their lives and property to fire. A single fire can cause massive destruction: in 2018, the Woolsey Fire burned 96,949 acres and hundreds of homes and killed three people [11]. Over one million buildings in California lie within high-risk fire zones, and this number is projected to grow as temperatures continue to rise [10]. With dry organic matter accumulating and wildfire incidence surging, California will see more property damage and loss of life. High temperatures and extreme weather events make it more likely that people will fall victim to these life-threatening disasters. 

CONCLUSION

Increases in global temperature have a negative effect on human physical health and mental well-being. Climate change is making it more difficult to secure a livelihood, changing the spread of disease, and destroying lives and property. However, projections about rising temperatures give farmers the chance to make informed decisions about which crops to grow, fishers the chance to relocate to areas less affected by algal blooms, health experts the ability to predict when and where outbreaks of certain diseases might occur, and fire protection services the ability to increase their presence in high-risk areas. Projections help people anticipate where and when a climate-change-associated event is likely to occur so that they can respond more quickly and efficiently. The consequences of climate change can be mitigated by using such models as a guide to what to expect in California’s future. 

REFERENCES

  1. James I. 2018. California agriculture faces serious threats from climate change, study finds. The Desert Sun. Accessed January 31, 2023. Available from www.desertsun.com/story/news/environment/2018/02/27/california-agriculture-faces-serious-threats-climate-change-study-finds/377289002/
  2. U.S. Department of Agriculture. Climate Change and WINTER CHILL. Accessed December 23, 2023. Available from www.climatehubs.usda.gov/sites/default/files/Chill%20Hours%20Ag%20FS%20_%20120620.pdf
  3. Zhang S. 2015. Time to Add Pistachios to California’s List of Woes. WIRED. Accessed February 15, 2023. Available from www.wired.com/2015/09/time-add-pistachios-californias-list-problems/
  4. Semuels A. 2019. ‘They’re Trying to Wipe Us Off the Map.’ Small American Farmers Are Nearing Extinction. TIME. Accessed January 31, 2023. Available from time.com/5736789/small-american-farmers-debt-crisis-extinction/
  5. Gross L. 2021. As Warming Oceans Bring Tough Times to California Crab Fishers, Scientists Say Diversifying is Key to Survival. Inside Climate News. Accessed January 31, 2023. Available from insideclimatenews.org/news/01022021/california-agriculture-crab-fishermen-climate-change/
  6. Martens P. 1999. How Will Climate Change Affect Human Health? The question poses a huge challenge to scientists. Yet the consequences of global warming of public health remain largely unexplored. Am Scien. 87(6):534–541. 
  7. Lafferty KD. 2009. The ecology of climate change and infectious diseases. Ecol Soc Amer. 90(4):888-900. 
  8. Hanson N. 2022. Climate change drives another outbreak: In California, it’s a spike in Valley fever cases. Courthouse News Service. Accessed March 8, 2023. Available from www.courthousenews.com/climate-change-drives-another-outbreak-in-california-its-a-spike-in-valley-fever-cases/
  9. Treseder KK, Berlemont R, Allison SD, & Martiny AC. 2018. Drought increases the frequencies of fungal functional genes related to carbon and nitrogen acquisition. PLoS ONE [Internet]. 13(11):e0206441. doi.org/10.1371/journal.pone.0206441
  10. National Oceanic and Atmospheric Administration. 2022. Wildfire climate connection. Accessed January 31, 2023. Available from www.noaa.gov/noaa-wildfire/wildfire-climate-connection#:~:text=Research%20shows%20that%20changes%20in,fuels%20during%20the%20fire%20season
  11. Lucas S. 2019. Los Angeles is the Face of Climate Change. OneZero. Accessed January 31, 2023. Available from onezero.medium.com/los-angeles-is-burning-f9fab1c212cb

Floating Photovoltaics (FPVs): Impacts on Algal Growth in Reservoir Systems

By Benjamin Narwold, Environmental Science and Management major ’23

Author’s Note: I wrote this review paper to learn more about the environmental impacts of floating photovoltaics (FPVs) because the topic directly applies to my work as an undergraduate researcher with the Global Ecology and Sustainability Lab at UC Davis. I focus specifically on the impacts of FPVs on algae because of the biological implications of disturbing ecologically important photosynthesizers in reservoirs. I want readers to develop an understanding of FPVs as a climate change mitigation solution, of how these systems may disturb algae, and of the uncertainties over whether the expected and observed changes in algal growth are beneficial or detrimental to the aquatic environment.

ABSTRACT

Floating photovoltaics (FPVs) are typical photovoltaic modules mounted on plastic pontoon floats and deployed on man-made water bodies. If FPVs were developed to cover 27% of the surface area of US reservoirs, they could provide about 10% of the electricity in the US. Freshwater reservoirs host vulnerable ecosystems; therefore, understanding the water quality impacts of FPVs is necessary for sustainable development. This review aimed to characterize the impacts of FPVs on reservoir aquatic ecology in terms of algal growth and to identify the uncertainties in FPV-induced algae reduction, presenting our current understanding of the environmental impacts of reservoir-based FPVs. The UC Davis Library database was searched for papers from peer-reviewed journals published from 2018 to 2022 that covered “floating photovoltaics”, “algae reduction”, and “environmental impacts”. A consistent result across studies was that FPVs reduce algal growth by reducing the sunlight entering the host waterbody; this can disrupt phytoplankton dynamics and have cascading effects on the broader ecosystem. Modeling and experimental approaches found that 40% FPV coverage of a reservoir is optimal for energy production while maintaining the algae levels needed to support the local ecosystem. The lack of research on the ideal percent coverage that reduces algal growth without disrupting ecosystem dynamics emphasizes the need for future work addressing FPV disturbance of local microclimates, algal responses to reduced sunlight, and the corresponding cascading impacts on other organisms dependent on the products of algal photosynthesis.

Keywords: floating photovoltaics, algae reduction, environmental impacts, water and ecology management, energy and water nexus

Caption: Floating photovoltaic (FPV) system in Altamonte Springs, Florida. One of four sites monitored by the Global Ecology and Sustainability Lab for water quality impacts of FPV.

INTRODUCTION

Climate change is a global problem of increasing intensity that poses challenges to food, water, and energy security. Global climate models predict a 2-4°C increase in global temperatures by 2100, which will degrade human health and threaten ecosystems [1]. Renewable energy is a critical component of reducing anthropogenic greenhouse gas emissions, and the widespread transition away from fossil fuels is becoming increasingly feasible with new technologies. One of these new renewable energy systems is floating photovoltaics (FPVs): standard photovoltaic (solar panel) modules mounted on a polyethylene pontoon float system, positioned just above the water’s surface, and anchored to the bottom or shore of the host waterbody [2]. FPVs represent an intriguing and novel renewable energy solution because they can be deployed on human-constructed water bodies, improving land-use efficiency. Ground-mounted solar projects compete for land with agricultural and urban interests, whereas many artificial and semi-natural water bodies, such as wastewater discharge pools, have no conflicting human uses [3]. FPV development thus presents an opportunity to sustainably increase solar energy production without interfering with agricultural and urban development, which will continue to expand as world populations grow. In addition to optimizing land use, FPVs can produce up to 22% more power than conventional solar due to evaporative cooling [4]. Because the panels sit just above the water’s surface, local evaporation cools them, increasing their efficiency. Electricity generated by FPVs is intended to augment solar power generation capacity and supply more renewable energy to the grid for households and industry.
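The cooling benefit can be sketched with the common linear temperature model of PV output. The panel size, cell temperatures, and the -0.4%/°C coefficient below are typical assumed values, not measurements from [4]; this toy model yields a smaller gain than the reported up-to-22%, which reflects site-specific conditions and other effects.

```python
def pv_power(rated_watts, cell_temp_c, temp_coeff=-0.004, ref_temp_c=25.0):
    """Linear temperature model of PV output: rated power adjusted by a
    temperature coefficient (about -0.4% per degree C for silicon)."""
    return rated_watts * (1 + temp_coeff * (cell_temp_c - ref_temp_c))

# Hypothetical comparison: a land-mounted panel running hot at 60 C
# versus a floating panel evaporatively cooled to 45 C.
land = pv_power(400, 60.0)      # 400 W panel in hot rooftop conditions
floating = pv_power(400, 45.0)  # the same panel, cooled over water
gain_pct = 100 * (floating - land) / land
print(round(land, 1), round(floating, 1), round(gain_pct, 1))  # 344.0 368.0 7.0
```

Even this simplified model shows a meaningful output gain from a 15 °C cooler cell, which is the mechanism behind FPVs' efficiency advantage.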

Among the most abundantly available spaces for this pivotal land-use optimization and climate change mitigation solution are reservoirs: lakes formed by damming a river for water storage and hydropower production. A GIS analysis found that covering 27% of the surface area of reservoirs in the United States with FPVs would generate enough electricity to meet 9.6% (2,116 gigawatts) of the country’s 2016 energy demands [4]. Reservoirs and similar bodies of water are nevertheless vulnerable freshwater ecosystems, so understanding the water quality and species impacts of FPVs is the primary hurdle to informing the sustainable development of these systems.

FPVs reduce the amount of sunlight reaching the surface of their host waterbody, which reduces evaporative water loss and significantly changes algal growth [5]. Several studies have found that FPVs alter phytoplankton dynamics and can have cascading effects on other organisms in the ecosystem [6–8]. A key source of uncertainty surrounding reservoir FPVs is the equilibrium range of algal growth needed to support reservoir food webs. Some reservoir systems experience strong summertime algal blooms. An algal bloom is a rapid increase in or overaccumulation of an algal population; when the algae eventually die and decompose, the result can be an oxygen-depleted waterbody called a “dead zone” [9]. FPV-induced shading can counter harmful algal blooms, providing environmental benefits alongside renewable energy generation. Alternatively, in reservoirs that do not have problematic algal blooms, adding an FPV system may reduce healthy algal populations and cause adverse rippling effects on other species in the ecosystem. Determining what percentage of a reservoir’s surface area FPVs can cover, enough to reduce algal growth and bloom potential but not so much that ecosystem dynamics are disrupted, will require further research. Specifically, future work should assess the disturbance of local microclimates caused by FPVs, the response of algae to reduced sunlight, and the impact on other aquatic species that depend on the ecosystem functions algae provide. Because climate change is expected to raise temperatures and shift precipitation patterns, the water quality impacts of FPVs and their influence on algae must be contextualized against this variability.

Figure 1. Impact of FPVs on algae in reservoir ecosystems. FPV-induced shading can provide additional environmental benefits in reservoirs with algal blooms but may cause adverse effects in healthy reservoirs.

METHODS

This review surveys what we know regarding the impacts of FPVs on algal growth in reservoir systems. The UC Davis Library database was searched for papers from peer-reviewed journals using the following keywords: “floating photovoltaics,” “algae reduction,” and “environmental impacts.” I examined experiments on reservoir-based FPVs from 2018-2022 to analyze plot-scale impacts on algal growth, quantified with chlorophyll-a monitoring data, and assessed global-scale changes in algal growth from a climate change perspective, with consideration of FPV materials and design. Although a study on crystalline solar cells incorporated in this review is from 2016 and falls outside the five-year range of focus, it provides a necessary juxtaposition to semitransparent polymer cell technology. Overall, I analyzed the methods and results of site-specific, laboratory, and global-scale studies to fingerprint the current state of knowledge on the impacts of FPVs on algae and algal blooms to inform reservoir management.

Algal Growth and FPV Coverage Scenarios

Algae produce oxygen in the waterbody, and the impact of FPVs on algal growth depends on the percentage of the waterbody covered by the array. This impact is assessed by measuring differences in chlorophyll-a (ch-a), a pigment present in all photosynthetically active algae that is often used as a proxy for algal growth dynamics within a waterbody [10]. Ch-a is measured using optical sensors and specific wavelengths of light, so it is an indirect measure of algal concentration. FPVs reduce the amount of sunlight reaching the surface of their host waterbody and disrupt phytoplankton dynamics. Haas et al. (2020) and Wang et al. (2022) investigated different FPV coverage scenarios using ch-a as a proxy for algal growth. Haas et al. used the ELCOM-CAEDYM model to evaluate ten different FPV coverage scenarios, and Wang et al. simulated 40% coverage relative to 0% coverage control ponds using black polyethylene weaving nets as a proxy for an FPV array. Both the model output and the experiment-based approach settled on 40% FPV coverage as an equilibrium development target [7, 11]. The results of these studies are consistent; however, Haas et al. did not consider differences in absorption wavelength range among microalgal taxa, and Wang et al. did not use actual solar panels in their experimental design. Additionally, Andini et al. (2022) investigated the difference in algae between 0% and 100% coverage at Mahoni Lake in Indonesia using mesocosms: isolated systems that mimic real-world environmental conditions while allowing control of biological composition by sampling at the same water depths. These researchers found that 100% FPV coverage reduced ch-a by 0-1.25 mg/L, average temperature by 0-2.5℃, and dissolved oxygen by 0-1.5 mg/L, and consistently reduced electrical conductivity in the waterbody.
However, the researchers only considered directly measured water quality variables and did not assess the long-term trophic consequences of 100% FPV coverage [6]. The study was clearly designed to show the contrast between 0% and 100% coverage across several water quality parameters, but the realistic intermediate FPV coverages incorporated into both Haas et al. and Wang et al. were absent. Given these compiled results, future research can continue to work toward the broader question of what percent FPV coverage can be applied to a reservoir to maximize energy production and minimize environmental disturbance.
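The coverage scenarios above can be reduced to a first-order shading calculation: the area-averaged light reaching the water surface scales with the uncovered fraction plus whatever light the panels transmit. This is only an illustrative sketch with hypothetical irradiance values, not the hydrodynamic model used by Haas et al. or the mesocosm setups described above.

```python
# First-order sketch of area-averaged light at the water surface under
# fractional FPV coverage. Values are hypothetical; real studies resolve
# depth, mixing, and spectral effects that this simple average ignores.

def surface_light(i0: float, coverage: float, transmittance: float = 0.0) -> float:
    """Area-averaged irradiance (same units as i0, e.g. W/m^2).

    coverage:      fraction of the waterbody covered by panels (0 to 1)
    transmittance: fraction of light the panels let through (0 = opaque)
    """
    if not 0.0 <= coverage <= 1.0:
        raise ValueError("coverage must be a fraction between 0 and 1")
    return i0 * ((1 - coverage) + coverage * transmittance)

full_sun = surface_light(1000.0, 0.0)    # 0% coverage control pond
forty_pct = surface_light(1000.0, 0.40)  # opaque panels at 40% coverage
```

Under this toy model, opaque panels at 40% coverage cut the area-averaged irradiance by 40%, while semitransparent panels (nonzero transmittance) soften that reduction, which is the design lever discussed in the solar cell section below.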

Algal Blooms and Mitigation Potential

Algal blooms are a product of high-productivity conditions that favor rapid algal growth, and the shading provided by FPV systems could mitigate the intensity and negative impacts of summertime algal blooms. High-productivity conditions include high water temperature, intense sunlight, and abundant nutrients such as nitrogen and phosphorus; the first two variables can be controlled by FPV coverage. In a study of the global change in phytoplankton blooms since the 1980s, Ho et al. (2019) found that most of the 71 large lakes sampled saw an increase in peak summertime bloom intensity over the past three decades, and that the lakes showing improvement in bloom conditions experienced little to no warming. Temperature, precipitation, and fertilizer inputs were the variables considered, and the study could not attribute blooms to any one of these variables exclusively [12]. This null result suggests that the causal agents differ from lake to lake. Thus, conducting site-specific studies and monitoring these water quality variables will help establish the causes of algal blooms and the relative influence of the confounding variables, and therefore whether FPV coverage would be an effective mitigation agent. If the algae in a reservoir are driven by less controllable variables, such as the carbon dioxide concentration in the water or nutrient loading from agricultural runoff, FPV shading will have a negligible effect [6, 7]. Such considerations are critical to informing the potential environmental co-benefits of an FPV installation.

FPV Solar Cell Design

The properties of the solar cells within the photovoltaic panels themselves determine which wavelengths of light interact with the surface of the host waterbody under the panels. Crystalline silicon solar cells absorb radiation wavelengths from 300-1300 nm and have a thick active layer of about 300 µm, responsible for high photon absorption [13]. These properties result in opaque solar panels that do not allow photons to travel through the panel and interact with the waterbody. Conversely, semitransparent polymer solar cells (ST-PSCs) represent an alternative material and technological approach: algal growth can be regulated by engineering the panels to provide specific transmission windows and light intensities. Zhang et al. (2020) found that the growth rate of the algal genus Chlorella was minimized under the opaque treatment; however, the changes in photosynthetic efficiency did not significantly affect the growth rate of Chlorella during the 24-hour experimentation window. While the researchers were able to show variability in the number of photons penetrating the panels from 300-1000 nm across three treatments with different layering of material within the ST-PSCs, the study did not yield a statistically significant result [5]. These results have limited scope because the study was conducted in a laboratory and did not assess full-sized PV panels in the field; nevertheless, it highlights that algal species may prefer light wavelengths for photosynthesis that do not align with the wavelengths an FPV system best controls. It is therefore important to coordinate solar panel material design to reflect and absorb the primary wavelengths that support algal photosynthesis.
The viability of prioritizing this component of FPV design is uncertain; however, new materials and technologies are continually being developed, and this relationship must be considered as we work to maximize FPV coverage in reservoir systems with minimal ecological complications (Figure 2).
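The wavelength ranges above can be put in physical terms: photon energy is E = hc/λ, and crystalline silicon absorbs photons more energetic than its ~1.1 eV band gap, a range that includes the entire ~400-700 nm photosynthetically active radiation (PAR) band. The short sketch below uses standard physical constants; the interpretive comments about shading are my own framing of the paragraph above.

```python
# Back-of-the-envelope photon energies (E = hc / wavelength), comparing the
# 300-1300 nm band absorbed by crystalline silicon with the ~400-700 nm
# photosynthetically active radiation (PAR) band used by algae.

H = 6.626e-34    # Planck constant (J*s)
C = 2.998e8      # speed of light (m/s)
EV = 1.602e-19   # joules per electronvolt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a photon of the given wavelength, in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

e_uv = photon_energy_ev(300.0)    # upper edge of silicon absorption (~4.1 eV)
e_red = photon_energy_ev(700.0)   # red edge of the PAR band (~1.8 eV)
e_ir = photon_energy_ev(1300.0)   # near silicon's ~1.1 eV band gap

# Every PAR photon carries more energy than silicon's band gap, so an
# opaque crystalline panel absorbs the entire band algae rely on, whereas
# ST-PSCs can be tuned to pass part of it.
```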

Figure 2. Relationship between FPV transparency, light profiles entering the waterbody and interacting with algae, and FPV coverage optimization. Solar cell design influences light transmission, and photosynthetic rates in algae vary with light wavelength and intensity, providing site-specific design opportunities.

CONCLUSION

FPVs are a relatively untapped climate change mitigation solution and can potentially reduce algal growth, benefitting water quality in freshwater ecosystems and reservoirs that suffer from strong summertime algal blooms. Algae are critical primary producers in reservoir ecosystems; therefore, areas for future research include the response of microalgae to the reduced sunlight conditions created by FPVs and the ecological role of algal taxa within the reservoir ecosystem. Further laboratory studies of solar panel designs in this context are needed. Future research on FPVs and water quality must also account for climate change, shifting baselines, and environmental variables. From a reservoir management viewpoint, this includes studying whether reservoirs have lower nutrient loading and whether their algae can be managed with FPV arrays, fingerprinting inter-reservoir variability to determine where FPV arrays should be placed and how impacts can be localized, and further modeling the relationship between warming and algal blooms to understand the long-term effectiveness of FPV-based algae management. Climate change will continue to operate in the background, and energy security issues will intensify. Our understanding of the environmental impacts of FPVs is currently too limited to safely approve and construct these systems on most reservoirs; future studies are therefore needed to incorporate this modern technology into the global renewable energy portfolio.

REFERENCES

  1. Ara Begum R, Lempert R, Ali E, Benjaminsen TA, Bernauer T, Cramer W, Cui X, Mach K, Nagy G, Stenseth NC, Sukumar R, Wester P. 2022. Point of Departure and Key Concepts. In: Pörtner HO, Roberts DC, Tignor M, Poloczanska ES, Mintenbeck K, Alegría A, Craig M, Langsdorf S, Löschke S, Möller V, Okem A, Rama B (eds.). Climate Change 2022: Impacts, Adaptation and Vulnerability. Contribution of Working Group II to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge (UK) and New York (NY): Cambridge University Press. 121–196. doi:10.1017/9781009325844.003.
  2. Energy Sector Management Assistance Program, Solar Energy Research Institute of Singapore. 2019. Where Sun Meets Water: Floating Solar Handbook for Practitioners. Washington, DC (USA): World Bank.
  3. Cagle AE, Armstrong A, Exley G, Grodsky SM, Macknick J, Sherwin J, Hernandez RR. 2020. The Land Sparing, Water Surface Use Efficiency, and Water Surface Transformation of Floating Photovoltaic Solar Energy Installations. Sustainability [Internet]. 12(19):8154. doi:10.3390/su12198154
  4. Spencer RS, Macknick J, Aznar A, Warren A, Reese MO. 2019. Floating Photovoltaic Systems: Assessing the Technical Potential of Photovoltaic Systems on Man-Made Water Bodies in the Continental United States. Environ Sci Technol [Internet]. 53(3):1680–1689. doi:10.1021/acs.est.8b04735
  5. Zhang N, Jiang T, Guo C, Qiao L, Ji Q, Yin L, Yu L, Murto P, Xu X. 2020. High-performance semitransparent polymer solar cells floating on water: Rational analysis of power generation, water evaporation and algal growth. Nano Energy [Internet]. 77:105111. doi:10.1016/j.nanoen.2020.105111
  6. Andini S, Suwartha N, Setiawan EA, Ma’arif S. 2022. Analysis of Biological, Chemical, and Physical Parameters to Evaluate the Effect of Floating Solar PV in Mahoni Lake, Depok, Indonesia: Mesocosm Experiment Study. J Ecol Eng [Internet]. 23(4):201–207. doi:10.12911/22998993/146385
  7. Haas J, Khalighi J, de la Fuente A, Gerbersdorf SU, Nowak W, Chen PJ. 2020. Floating photovoltaic plants: Ecological impacts versus hydropower operation flexibility. Energy Convers Manag [Internet]. 206:112414. doi:10.1016/j.enconman.2019.112414
  8. Pimentel Da Silva GD, Branco DAC. 2018. Is floating photovoltaic better than conventional photovoltaic? Assessing environmental impacts. Impact Assess [Internet]. 36(5):390–400. doi:10.1080/14615517.2018.1477498
  9. Sellner KG, Doucette GJ, Kirkpatrick GJ. 2003. Harmful algal blooms: causes, impacts and detection. J Ind Microbiol Biotechnol [Internet]. 30(7):383–406. doi:10.1007/s10295-003-0074-9
  10. Pápista É, Ács É, Böddi B. 2002. Chlorophyll-a determination with ethanol – a critical test. Hydrobiologia [Internet]. 485(1):191–198. doi:10.1023/A:1021329602685
  11. Wang TW, Chang PH, Huang YS, Lin TS, Yang SD, Yeh SL, Tung CH, Kuo SR, Lai HT, Chen CC. 2022. Effects of floating photovoltaic systems on water quality of aquaculture ponds. Aquac Res [Internet]. 53(4):1304–1315. doi:10.1111/are.15665
  12. Ho JC, Michalak AM, Pahlevan N. 2019. Widespread global increase in intense lake phytoplankton blooms since the 1980s. Nature [Internet]. 574(7780):667–670. doi:10.1038/s41586-019-1648-7
  13. Battaglia C, Cuevas A, De Wolf S. 2016. High-efficiency crystalline silicon solar cells: status and perspectives. Energy Environ Sci [Internet]. 9(5):1552–1576. doi:10.1039/C5EE03380B

How does prenatal nicotine exposure increase the chance of a child developing asthma?

By Madhulika Appajodu, Cell Biology ’24

Author’s Note: My name is Madhulika Appajodu and I am a 3rd Year Cell Biology major at UC Davis. I am a pre-medical student and hope to go on to medical school. I chose Cell Biology as a major because I found the focus on cell organization and function to be very interesting. I am a volunteer at Shifa Community Clinic and a member of MEDLIFE, SEND4C, and H4H. I am also a BioLaunch Mentor and a Learning Assistant for the Physics Department. I wrote this piece to answer the question: “How does prenatal nicotine exposure increase the risk of asthma in offspring?” I wrote this for undergraduate students in the field of epigenetics and prenatal exposures and for experts and professors in the field, but also for members of the general public who have some background in science. I chose this topic in particular because epigenetics interests me greatly. I find that environmental factors likely play a large part in the life outcomes of people who may be genetically similar but grew up in different environments. I hope that readers will understand how important environmental factors are in the grand scheme of physical, emotional, and mental health, not only for themselves but also for their future families, should they choose to have them.

ABSTRACT

Previous studies have examined prenatal nicotine exposure and its effects, which follow offspring over the course of their lives. One of these effects is asthma. Asthma is a chronic respiratory condition characterized by the narrowing of one’s airways in response to an allergen or irritant. It is a widespread condition, currently affecting over 25 million people in the US alone. The mechanisms of asthma and its causes are still being investigated. However, researchers agree that prenatal nicotine exposure substantially increases the risk of asthma in offspring.

There is currently no cure for asthma, only methods to lessen the intensity of asthmatic episodes, such as through the use of an inhaler. This literature review details the mechanisms through which prenatal nicotine exposure increases the risk of asthma in offspring, according to current research. The three potential causes of this increased risk are placental damage, epigenetic alteration, and nicotine exposure alone. The mechanisms will be evaluated through a synthesis of experimental and survey data in mice and human models in studies done in the past seven years. Comparisons will be drawn between articles that cite the same mechanism as the cause of the increased risk of asthma. Once the mechanism(s) are identified, research can be done to identify a solution so asthma due to prenatal nicotine exposure can be prevented. 

INTRODUCTION 

In the United States, approximately 25 million people are currently diagnosed with asthma [1]. Asthma is a respiratory condition characterized by difficulty breathing due to narrowing airways, caused by inflammation and excess mucus production. This inflammatory response is often triggered by viruses or air-borne allergens. Researchers are currently investigating the underlying immune mechanisms that cause the intense inflammatory response, which is often more intense when someone has been subjected to risk factors such as prenatal nicotine exposure. Since there is currently no cure for asthma, research about the underlying mechanisms of the inflammatory response is vital so that asthma can be prevented rather than simply managed. 

Researchers have studied prenatal nicotine exposure and its effects on offspring for decades, focusing on human subjects who smoked while pregnant. Over the past thirty years, there has been a shift toward using animal trials to investigate the mechanisms associated with the risk factors for asthma.

The primary model in mouse asthma research is the house dust mite (HDM) model. The HDM model involves exposing one group of pregnant mice to tobacco smoke-infused air and another group of pregnant mice to filtered air. The offspring of both groups are exposed to house dust mites (a common allergen) and their inflammatory immune response is examined. There are variations of the model, such as exposing the fathers to nicotine prior to mating or exposing the female mice to nicotine prior to or during pregnancy.

Current literature cites three main factors that contribute to an increased risk of asthma: nicotine smoke exposure alone, placental damage induced by nicotine, and epigenetic alterations induced by nicotine. Nicotine passes from the mother’s blood to the fetus through the umbilical cord during pregnancy. Nicotine can also damage the placenta through vasoconstriction of blood vessels and alter the fetus’ epigenetic markers through DNA methylation. 

The purpose of this literature review is to examine precisely how prenatal nicotine exposure increases the risk of asthma, first in experimental data using the HDM model and then in experimental & survey data regarding humans. 

Prenatal Nicotine Smoke Exposure 

In 2015, Eyring et al. proposed that nicotine use in pregnant women increases the risk of asthma in offspring through epigenetic alterations [2]. Eyring et al. exposed one group of female mice to environmental tobacco smoke (ETS) for five weeks, mated them with male mice, and continued the ETS exposure until the pregnant mice gave birth; a control group of female mice was exposed to filtered air and mated with male mice. The offspring of the ETS-exposed group did display an increased inflammatory response to house dust mites compared to the control group. However, the gene expression levels of the two groups were not statistically different. Thus, Eyring et al. concluded that prenatal nicotine exposure can increase the risk of asthma in offspring, but were unable to identify the mechanism through which prenatal ETS causes an increased inflammatory response [2]. It is possible that the bisulfite sequencing equipment available at the time of Eyring et al.’s study was not sensitive enough to detect the differences in methylation that newer studies observed.

Figure 1. Expression levels of IL-5 (a Th2 cytokine) are the same for the CS (ETS-exposed) and FA (filtered air) mice when exposed to house dust mites (HDM), indicating that gene expression levels were not affected by ETS.

A three-generation survey study on human subjects found a correlation between maternal smoking and an increased risk of asthma in offspring, as well as a correlation between grandmothers smoking during pregnancy and their grandchildren having an increased risk of asthma, regardless of the intermediate generation’s smoking habits [3]. The researchers also found a correlation between paternal smoking and an increased risk of asthma in offspring [3]. They hypothesized that paternal smoking alters microRNA (miRNA) in the sperm. MiRNAs are small RNA molecules that regulate gene expression. During fertilization, this altered miRNA can change the gene expression of the progeny, increasing the offspring’s risk of asthma. The study concluded that maternal, paternal, and grandmaternal nicotine exposure are all correlated with an increased risk of asthma in offspring. The researchers also proposed epigenetic alteration as the mechanism of increased asthma risk, but due to the nature of the study, they were unable to confirm this hypothesis [3].

Placental Damage 

A survey by Zacharasiewicz et al. of mothers who smoked and mothers who did not concluded that prenatal exposure to nicotine causes placental damage by decreasing nutrient delivery to the fetus [4]. Prenatal nicotine exposure decreases alveolar surface area, thereby decreasing the tidal volume of the lungs after birth [5]. Tidal volume is the amount of air that enters the lungs per breath. A decreased tidal volume results in less oxygen entering the body under standard conditions and a vastly reduced amount of oxygen entering the body during exposure to an allergen. Placental damage also accelerates the aging of the fetus’s lungs: pulmonary cells perform less glycogenolysis and glycolysis, causing them to die prematurely [6]. The premature death of lung cells leaves the lungs weaker, unable to exchange a normal amount of oxygen, and therefore more prone to intense allergic reactions.

Similarly, a study by Cahill et al. using the HDM mouse model found that inhaling nicotine causes vasoconstriction (the narrowing of blood vessels) in the mother, resulting in less oxygen and fewer nutrients delivered to the fetus [7]. They also found that placental HSD2, a crucial enzyme in fetal development, is decreased when pregnant mothers are exposed to nicotine. Cahill et al. also observed placental damage from nicotine use, which resulted in decreased birth weights and lung sizes in fetuses [7]. Decreased lung size leads to intense asthmatic episodes because the airways are smaller and narrower than those of an individual not exposed to nicotine prenatally. Ultimately, Zacharasiewicz and Cahill came to the same conclusion: nicotine consumption or exposure in pregnant women increases the risk of asthma in their offspring by negatively affecting the offspring’s lungs [4,7].

Epigenetic Alteration 

Researchers agree that DNA methylation is one of the mechanisms leading to increased asthma risk [8]. DNA methylation, the primary form of epigenetic alteration that occurs when a fetus is exposed to nicotine, is a chemical reaction in which a methyl (-CH3) group is added to a cytosine base. The methyl group blocks transcription factors from binding to DNA and recruits repressor proteins, reducing the gene’s expression; conversely, a loss of methylation permits overexpression, which in this case manifests as a disproportionate inflammatory response. However, there is disagreement among researchers about which genes are being alternatively methylated. Christensen et al. conducted an HDM mouse study and found that methylation of genes that produce and regulate Th2 cytokines was decreased in the offspring of mothers exposed to ETS [9]. Cytokines are small proteins that regulate the immune response; Th2 cells produce cytokines that encourage inflammation. Thus, the increased expression of Th2 cytokines intensifies the inflammatory response to the asthma trigger of house dust mites. Christensen et al. found that Th1 cytokine levels remained constant and their methylation was unaffected [9].
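As a concrete illustration of how such methylation differences are quantified, the sketch below computes per-site methylation levels from bisulfite sequencing read counts: bisulfite conversion turns unmethylated cytosines into uracil (read as T), while methylated cytosines remain C. The read counts and the scenario (a Th2 cytokine promoter site in exposed versus control offspring) are invented for illustration and are not data from the cited studies.

```python
# Hypothetical sketch of quantifying per-site DNA methylation from bisulfite
# sequencing: unmethylated C converts and reads as T, methylated C stays C,
# so methylation level = C reads / (C reads + T reads). Counts are invented.

def methylation_level(c_reads: int, t_reads: int) -> float:
    """Fraction of reads methylated at one CpG site."""
    total = c_reads + t_reads
    if total == 0:
        raise ValueError("no sequencing coverage at this CpG site")
    return c_reads / total

# Invented scenario: a CpG in a Th2 cytokine promoter, control offspring vs.
# offspring of smoke-exposed mothers (hypomethylation permits overexpression).
control = methylation_level(c_reads=18, t_reads=2)  # 0.90, largely silenced
exposed = methylation_level(c_reads=6, t_reads=14)  # 0.30, hypomethylated
```

Low sequencing depth makes such ratios noisy, which is one reason older instruments, as noted for the Eyring et al. study above, could miss modest methylation differences.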

Conversely, Singh et al. found that Th1 cytokine levels decreased due to hypermethylation [10]. Singh et al. also found that Th2 cytokine levels increased due to hypomethylation, which concurs with the findings of Christensen et al. [9-10].

Figure 2. Expression levels of IL-3 (a Th2-associated cytokine gene) in the groups exposed to tobacco smoke-infused air (SS) or filtered air (FA). There is a statistically significant increase in expression in the SS group, indicating a decrease in methylation.

Christensen et al. exposed pregnant female mice to either tobacco smoke-infused air or filtered air and then examined the offspring [9]. Singh et al. exposed both male and female mice to tobacco smoke-infused air or filtered air prior to mating and then examined the offspring [10]. This variation in experimental methods could contribute to the difference seen in the methylation of Th1 cytokine-producing genes. However, both researchers concluded that the nicotine-induced DNA methylation levels changed in genes that produced inflammatory responses to allergens [9-10]. 

Zakarya et al. found that DNA methylation levels were altered in genes associated with fetal growth and nicotine detoxification [11]. This review examined epigenome-wide association studies (EWAS) of patients suffering from asthma whose mothers smoked or vaped during pregnancy. These studies showed increased methylation in placental, whole blood, and fetal lung genes [12]. These results differed from the research done by Singh et al. and Christensen et al. both in the affected genes and in the direction of the methylation changes [9-10]. The difference in results can be attributed to the difference between mice and humans as well as to variation in experimental design. Christensen and Singh used the HDM model in mice and controlled the levels of nicotine the mice were exposed to [9-10]. Zakarya et al. used data from children of women who reported smoking during pregnancy [11]; the levels of nicotine these subjects were exposed to were not controlled and varied greatly. These differences in the studied species and experimental design could explain the different conclusions that the researchers drew.

CONCLUSION 

There is no simple answer to how nicotine use during pregnancy increases the risk of asthma in offspring. However, both epigenetic alterations and placental damage due to nicotine exposure play a role in the increased risk.

Research citing nicotine-induced epigenetic alteration as the main cause of the increased risk of asthma identifies various genes being altered by DNA methylation. The HDM studies cited in this review conclude that genes producing cytokines had a decrease in methylation, while a study using human subjects concluded that genes involving fetal growth and nicotine detoxification had an increase in methylation. Further research should determine which altered genes are increasing the risk of asthma so that methylation can be induced or repressed in those genes as a preventive measure for asthma. Further research should also focus on which aspect of nicotine-induced placental damage is the biggest factor in the increased risk of asthma so that a solution can be found to address that aspect. 

Future research studies should continue to investigate the two presented mechanisms and identify the factors that are increasing the risk of asthma so that nicotine-induced asthma can be prevented in future generations.

Your genes and you: Examining the effect of direct-to-consumer genetic testing visualizations on conceptions of identity

By Adyasha Padhi, Biochemistry & Molecular Biology and Sociocultural Anthropology ’25

Author’s Note: I wrote this paper for my ANT 109: Visualization in Science Course and we chose a specific visualization and entity connected to it to focus on. 23&Me has always been a company that has interested me and in looking deeper into their business practices, I think that it’s really important that we consider how our identities and our perception of our identity has changed, especially in the 21st century.

Introduction
In recent years, direct-to-consumer (DTC) genetic testing has become widespread, and with it, consumers have gained more access to their genetic code than ever before in human history. More than 26 million people, roughly 8% of the US population, have taken at-home DNA tests, and the multi-billion-dollar DTC market is rapidly expanding. 23&Me, a personal genomics and biotech company based in California, was the first company to offer autosomal genetic testing for ancestry, and it remains a giant in the field, near ubiquitous in the DTC market and in the minds of many consumers.

23&Me, as stated on its website, aims to provide its customers “DNA testing with the most comprehensive DNA breakdown,” allowing them to “know [their] personal story, in a whole new way.” For consumers, who are typically not geneticists themselves, this analysis and breakdown of their DNA is what they are primarily looking for: they expect to learn what their genes mean, from their ancestry to their health. The interpretation and visualization of DNA test results are nearly every DTC company’s main product and selling point; more specifically, the companies sell the idea that they can give the consumer a way to know themselves better and understand their ancestry and family history on a deeper level.

Because of this, the way that companies create and present genetic information is paramount to understanding how DTC testing shapes consumers’ and wider society’s conceptions of ancestry and identity. This review examines a specific case study, 23&Me’s “Ancestry Composition” visualization: how it is created, how consumers interact with it, and what it communicates about ancestry and identity. In doing so, it considers the broader impact of quantitative tools on personal and community identity, and how our genes affect us both on a biological level and through the way our understanding of genetics influences how we move through the world.

23&Me’s “Ancestry Composition” Visualization:

Figure 1: A sample “Ancestry Composition” report from 23&Me’s website

23andMe’s “Ancestry Composition” visualization is typical of genetic ancestry reports in the field and is composed of three main parts: a pie chart representing the consumer’s percent ancestry; a list breaking those percentages down by world region and then by country or ethnic group; and a map that colors different regions of the world according to the ancestry found. With the rise of DTC testing, this iconography now dominates most visual communication about ancestry. 

First, it is important to understand what DNA is. Deoxyribonucleic acid, or DNA for short, is a complex molecule that carries the genetic information for the development and functioning of an organism, acting as the hereditary material in nearly all organisms through sequences of nucleotides. In a sense, DNA is the blueprint an organism’s cells use to create more cells, growing from a single cell into a fetus and eventually a full human being. As hereditary material, genes are passed from parents to their biological offspring; the complete set of genetic material present in a cell or organism is known as the genome, with genes organized into chromosomes. DNA that codes for functional molecules called proteins is the most commonly known, but such coding DNA makes up only a tiny fraction of the total genome, about 1-5%, with the rest composed of non-coding regions. In addition, genetic material is constantly changing, not only through mutations but also through epigenetic changes. These modify chemical marks on the DNA, collectively called the epigenome, and change how genes are expressed, and consequently a person’s phenotype, without altering the genetic sequence itself. In some cases, epigenetic changes can be inherited, such as through germ-line transmission of altered epigenomes between generations in the absence of continued environmental exposures (Nilsson 2015). As such, analyzing and drawing conclusions from DNA is a complex process, and not as simple as it may seem. 

23andMe takes the DNA sample the consumer provides through a multi-step process to produce an accessible visualization, translating raw DNA into ancestry information the consumer can understand. Specifically, 23andMe analyzes your DNA by looking for genetic variants across your entire genome, including autosomal DNA, the sex chromosomes, and mitochondrial DNA (mtDNA). The locations in the genome that vary from person to person are called single nucleotide polymorphisms (SNPs for short), and the different versions of a SNP are called alleles. Everyone carries two alleles at most SNPs, one from each parent. While each individual SNP contains only a small amount of information, by combining information across many SNPs the company’s algorithm can develop a picture of your genetic ancestry. It is not the SNPs themselves, but their variation over time within populations, that can be used to map human migration, isolation, and population development (Henn 2012). As such, ethnicities cannot be determined from single genes. 
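To make the idea of combining information across many SNPs concrete, here is a toy sketch in Python. The population names, allele frequencies, and five-SNP panel are all invented for illustration and bear no relation to 23andMe's actual reference panels or proprietary algorithm; real pipelines use hundreds of thousands of SNPs, many reference populations, and far more sophisticated statistical models.

```python
import math

# Hypothetical frequency of the "A" allele at five SNPs for two invented
# reference populations. Real panels span hundreds of thousands of SNPs.
REF_FREQS = {
    "pop_1": [0.90, 0.10, 0.80, 0.30, 0.70],
    "pop_2": [0.20, 0.85, 0.25, 0.90, 0.15],
}

def genotype_log_likelihood(genotype, freqs):
    """Log-likelihood of a genotype (count of 'A' alleles per SNP: 0, 1, 2),
    assuming independent SNPs and Hardy-Weinberg proportions."""
    ll = 0.0
    for g, p in zip(genotype, freqs):
        if g == 2:
            prob = p * p
        elif g == 1:
            prob = 2 * p * (1 - p)
        else:
            prob = (1 - p) * (1 - p)
        ll += math.log(prob)
    return ll

def assign_population(genotype):
    """Assign whichever reference population makes the genotype most likely."""
    return max(REF_FREQS,
               key=lambda pop: genotype_log_likelihood(genotype, REF_FREQS[pop]))

sample = [2, 0, 2, 1, 2]  # a genotype resembling pop_1's allele frequencies
print(assign_population(sample))
```

Even in this toy version, no single SNP decides the assignment; it is the accumulated evidence across the whole panel that tips the likelihood toward one population.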

There are six main steps that 23andMe goes through when determining ancestry composition and creating this visualization: preparing for genotyping (amplifying the DNA from the provided sample); training machine learning algorithms on reference data sets; phasing, or determining which genetic information was inherited together on the same chromosome; estimating ancestry for each window of the genome; smoothing window assignments (making adjustments so that the result is more cohesive and understandable); and calibrating and returning the results to the individual in the form of the “Ancestry Composition” visualization (Durand 2021).
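The "smoothing window assignments" step can be illustrated with a small sketch. This is a hypothetical majority-vote smoother, not 23andMe's published method (Durand 2021 describes a more sophisticated pipeline); it simply shows how an isolated, likely-spurious ancestry call for one window can be overridden by its neighbors. The three-letter ancestry labels are placeholders.

```python
def smooth_windows(assignments, window=3):
    """Replace each window's ancestry call with the majority vote of a small
    neighborhood around it, suppressing isolated calls (ties broken arbitrarily)."""
    smoothed = []
    for i in range(len(assignments)):
        lo = max(0, i - window // 2)
        hi = min(len(assignments), i + window // 2 + 1)
        neighborhood = assignments[lo:hi]
        smoothed.append(max(set(neighborhood), key=neighborhood.count))
    return smoothed

calls = ["EUR", "EUR", "EAS", "EUR", "EUR", "AFR", "AFR", "AFR"]
print(smooth_windows(calls))  # the isolated "EAS" call is smoothed away
```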

Social context of direct-to-consumer genetic ancestry tests 

DTC genetic testing addresses a series of existing social desires with new technological means, combining modern enthusiasm for science with a primal interest in asserting the “naturalness” of one’s identity and a postmodern emphasis on radical individualism (Lang and Winkler 2021). It is just the latest of the many ways humans have tried to understand our relationships with others, and looking into that history lends insight into the practice in its current form. Throughout history, ancestry has been used to solidify relations, and thus power, in many societies, such as hierarchical monarchies or caste systems (Lang and Winkler 2021). Biological relations grant membership in communities and in structures of power, so being able to prove ancestry, and to keep a record of it, has long been important. Fundamentally, as humans, we have always been trying to make sense of ourselves and the world around us.

However, alongside this desire to organize the world arise structures of power and groups who seek it; one of the easiest ways to gain power is to divide people and create hierarchies. This is where racism, eugenics, and similar movements serve as justification for dehumanization and violence, creating harm that cannot be easily dismantled because it is no longer individual-to-individual but part of a wider pattern of systemic violence. This includes historical slavery, colonialism, and recent racially motivated violence.  

Impact of DTC on the social construction of Ancestry & Identity 

To understand the impact of DTC testing on consumer identity, we can start by examining the sociotechnical architecture of 23andMe. The products and visualizations created by DTC companies are often structured so that the user is not given sufficient context to understand the results they receive. As seen in the sample results, the most prominent consumer-facing pages provide limited information, primarily showing just the percentage of the consumer’s DNA associated with a certain heritage. This can be attributed in part to 23andMe’s consumer-facing information architecture and UX design more generally. Just as a building’s architecture is an organization of materials and components that together define the building, a technology’s sociotechnical architecture describes how its technical aspects (the physical system and the task it aims to do) interact with its social aspects (its structure and organization, and how it affects people cognitively and socially). 

While 23andMe does disclose the difficulty of quantifying ancestry, its marketing and product presentation do not do enough to recognize the broader socio-cultural and historical context of which they are a part. Furthermore, compared to similar companies, 23andMe provides as much raw information to its consumers as possible and builds on the idea that users possess the expertise and autonomy to determine the reliability and utility of the test results presented to them. This absolves the company of responsibility for misinterpretation and downplays the difficulty of understanding SNP test results (Parthasarathy 2010). As a whole, by presenting results in a highly quantitative manner and pushing these ideas in its marketing while providing little accessible information alongside those results, 23andMe’s products can instill in customers a genetic essentialist bias: cognitive biases arising from exposure to beliefs that genes determine behavior, condition, and social grouping (Dar-Nimrod & Heine 2011). This leads to the erroneous perception that conditions with genetic attributions are more immutable, determined, homogeneous, and natural. 

Another core aspect of this process is the pool of reference genotypes used at multiple points in producing the visualization. The groups most represented in these reference panels are people of European ancestry (Wapner 2020). This is true for a range of reasons, one being the structures of power that gave those populations access to resources and thus allowed their ancestry records and methods of ancestry remembrance to be preserved. The data and information these tests provide are not trivial, especially for 23andMe’s other half, genetic health testing. Marginalized groups therefore deserve better accessibility and representation, and thus more accurate use of these tools, though it is also important to recognize the flaws in this system and not blindly encourage individuals to hand their data to these companies without understanding the full picture. It also bears repeating that there are no genes specifically associated with specific ethnic groups. 

More broadly, research investigating the impact of genetic ancestry tests on racial essentialism found that while there was no significant average effect of testing on essentialist views, there were significant differences between individuals with high genetic knowledge and those with the least. Roth found that “essentialist beliefs significantly declined after testing among individuals with high genetic knowledge, but increased among those with the least genetic knowledge,” and that this trend was not affected by the specific ancestry results returned, demonstrating that the difference stemmed from differing understandings of genetics (Roth 2020). Recognizing that those with the least genetic knowledge are the most likely to develop essentialist beliefs shows how important it is that education about the process behind genetic testing, and how its results are generated, be easily accessible and more prominent in DTC companies’ products and marketing. 

Conclusion 

As direct-to-consumer genetic testing becomes more and more prevalent, it is reshaping the way we communicate about and conceptualize ancestry, promoting the construction of essentialist identities at every stage of the process, from marketing to the final visualization. The impacts of this push disproportionately affect members of marginalized communities, and increased education about genetics and how these systems work, both within the companies themselves and in wider society, is essential to combating essentialism.

Works Referenced

News Articles: 

  1. Bahrampour, Tara. “They considered themselves white, but DNA tests told a more complex story.” The Washington Post, 6 February 2018, https://www.washingtonpost.com/local/social-issues/they-considered-themselves-white-but-dna-tests-told-a-more-complex-story/2018/02/06/16215d1a-e181-11e7-8679-a9728984779c_story.html.
  2. Brown, Kristen V. “23andMe to Use DNA Tests to Make Cancer Drugs.” Bloomberg.com, 4 November 2021, https://www.bloomberg.com/news/features/2021-11-04/23andme-to-use-dna-tests-to-make-cancer-drugs
  3. Copeland, Libby. “Opinion | DNA and Race: What Ancestry and 23andMe Reveal.” The New York Times, 16 February 2021, https://www.nytimes.com/2021/02/16/opinion/23andme-ancestry-race.html.
  4. Molla, Rani. “What 23andMe and other genetic testing tools can do with your data.” Vox, 13 December 2019, https://www.vox.com/recode/2019/12/13/20978024/genetic-testing-dna-consequences-23andme-ancestry.
  5. Pomerantz, Dorothy. “23andMe had devastating news about my health. I wish a person had delivered it.” STAT News, 8 August 2019, https://www.statnews.com/2019/08/08/23andme-genetic-test-revealed-high-cancer-risk/.
  6. Servick, Kelly. “Frustrated U.S. FDA Issues Warning to 23andMe.” Science Insider, 25 November 2013, https://www.science.org/content/article/frustrated-us-fda-issues-warning-23andme.

Scientific Articles: 

  1. Bryc, Katarzyna, et al. “The Genetic Ancestry of African Americans, Latinos, and European Americans across the United States.” AJHG, vol. 96, no. 1, 2015, pp. 37-53, https://www.cell.com/ajhg/fulltext/S0002-9297(14)00476-5.
  2. Durand, Eric Y., et al. “Ancestry Composition: A Novel, Efficient Pipeline for Ancestry Deconvolution.” bioRxiv, 2014, https://www.biorxiv.org/content/biorxiv/early/2014/10/18/010512.full.pdf.
  3. Durand, Eric Y., et al. “Reducing Pervasive False-Positive Identical-by-Descent Segments Detected by Large-Scale Pedigree Analysis.” Molecular Biology & Evolution, vol. 31, no. 8, 2014, pp. 2212-2222, https://academic.oup.com/mbe/article/31/8/2212/2925728.
  4. Durand, Eric Y., et al. “A scalable pipeline for local ancestry inference using tens of thousands of reference haplotypes.” bioRxiv, 2021, https://www.biorxiv.org/content/10.1101/2021.01.19.427308v1.
  5. Henn, Brenna M., et al. “Cryptic Distant Relatives Are Common in Both Isolated and Cosmopolitan Genetic Samples.” Plos One, 2012, https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0034267.
  6. Henn, Brenna M., et al. “Hunter-gatherer genomic diversity suggests a southern African origin for modern humans.” PNAS, vol. 108, no. 13, 2011, pp. 5154-5162, https://www.pnas.org/doi/full/10.1073/pnas.1017511108.
  7. Kim, Soyeon, et al. “Shared genetic architectures of subjective well-being in East Asian and European ancestry populations.” Nature Human Behaviour, 2022, https://pubmed.ncbi.nlm.nih.gov/35589828/#affiliation-1.

Documentaries: 

  1. DNA Testing: The Promise & the Peril. Performance by Scott Wapner, 2020, https://www.peacocktv.com/watch/asset/tv/dna-testing-the-promise-and-the-peril/55fc9111-fb6e-399f-a921-5c036dfe54f3?orig_ref=https://www.google.com/.
  2. “Identity | Tribeca.” Tribeca Film Festival, https://tribecafilm.com/studios/identity-short-film-series.
  3. Gray, Edward. “Secrets in our DNA | NOVA.” PBS, 13 January 2021, https://www.pbs.org/wgbh/nova/video/secrets-in-our-dna/. 

STS Articles: 

  1. Abel, Sarah. “Reading DNA ancestry portraits against the grain.” Slaveries and Post-Slaveries, 2020, https://journals.openedition.org/slaveries/2343.
  2. Boas, Franz. “The Race Problem in Modern Society.” 1909, https://www.jstor.org/stable/1634659#metadata_info_tab_contents.
  3. Dar-Nimrod, Ilan, and Steven J. Heine. “Genetic Essentialism: On the Deceptive Determinism of DNA.” Psychol Bull., 2011. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3394457/.
  4. Duello, Theresa M. “Race and genetics versus ‘race’ in genetics – PMC.” NCBI, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8604262/.
  5. Gelman, Susan A. “Essentialism in everyday thought.” Psychological Science Agenda, 2005. American Psychological Association, https://www.apa.org/science/about/psa/2005/05/gelman.
  6. Heine, Steven J., et al. “Making Sense of Genetics: The Problem of Essentialism.” Genetic Essentialism and Its Vicissitudes, 2019, https://onlinelibrary.wiley.com/doi/full/10.1002/hast.1013.
  7. Lang, Alexander, and Florian Winkler. “Co-constructing ancestry through direct-to-consumer genetic testing.” https://irihs.ihs.ac.at/id/eprint/5817/1/Lang-Winkler-2021-co-constructing-ancestry-through-direct-to-consumer-genetic-testing.pdf.
  8. Montagu, M.F. Ashley. “The Concept of Race in the Human Species in the Light of Genetics.” Journal of Heredity, https://academic.oup.com/jhered/article-abstract/32/8/243/817951.
  9. Oh, Jeongmin, and Uichin Lee. “Exploring UX issues in Quantified Self technologies.” IEEE, https://ieeexplore.ieee.org/document/7061028.
  10. Parthasarathy, Shobita. “Assessing the social impact of direct-to-consumer genetic testing: Understanding socio-technical architectures.” Genetics in Medicine, vol. 12, 2010, pp. 544–547, https://www.nature.com/articles/gim201090.
  11. Prainsack, Barbara. “Understanding Participation: The ‘Citizen Science’ of Genetics | 17 |.” Taylor & Francis eBooks, 2014, https://www.taylorfrancis.com/chapters/edit/10.4324/9781315584300-17/understanding-participation-citizen-science-genetics-barbara-prainsack.
  12. Templeton, Alan R. “Biological Races in Humans – PMC.” NCBI, 16 May 2013, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3737365/.
  13. Roth, Wendy D., et al. “Do genetic ancestry tests increase racial essentialism? Findings from a randomized controlled trial.” Edited by Mellissa H. Withers. PLoS One, 2020, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6988910/.
  14. Swan, Melanie. “The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery.” Big Data, vol. 1, no. 2, 2013, https://www.liebertpub.com/doi/10.1089/big.2012.0002

Miscellaneous:  

  1. “The 23andMe Ancestry Algorithm Gets an Upgrade.” 23andMe Blog, https://blog.23andme.com/articles/algorithm-gets-an-upgrade.
  2. “Ancestry Composition.” 23andMe, https://www.23andme.com/ancestry-composition-guide/
  3. “Understanding The Difference Between Your Ancestry Percentages And Your Genetic Groups.” 23andMe Customer Care, https://customercare.23andme.com/hc/en-us/articles/5328923468183-Understanding-The-Difference-Between-Your-Ancestry-Percentages-And-Your-Genetic-Groups. 

The Use of Remotely Sensed Data in Modeling Cholera Amidst Climate Change

By Shaina Eagle, Global Disease Biology, ‘24

Introduction 

Over 300,000 people reported having cholera in 2020 [12]. This infectious disease is spread by water or seafood contaminated with the bacterium Vibrio cholerae. V. cholerae can survive in the open ocean within phytoplankton [5]. The bacteria also spread into inland water sources such as rivers, reaching people’s drinking water. The spread of cholera is affected by climate variables such as precipitation, temperature, and oceanic conditions [1, 2, 5, 6, 7, 11, 13]. Climate patterns such as the El Niño Southern Oscillation (ENSO) and the Indian Ocean Dipole (IOD) influence local weather patterns in coastal regions, causing more phytoplankton blooms [2, 11]. Climate change also disrupts water, sanitation, and hygiene (WASH) infrastructure [4], creating favorable environmental conditions for V. cholerae to thrive [2]. As climate change causes fluctuations in weather patterns and coastal biology, researchers need a reliable method for tracking and predicting cholera. Early warning systems are key for health officials to take proper preventative measures, from vaccine deployment campaigns to emergency clean water storage, to reduce the prevalence and fatality of cholera.

Satellites are one way to gather measurements of the variables that affect the spread of cholera. By measuring reflected electromagnetic radiation, satellites provide remotely sensed geophysical data on variables such as temperature, water quality, precipitation, and vegetation [10]. Researchers use remotely sensed data in conjunction with algorithms and statistical analyses to model cholera outbreaks and predict how changing variables will alter disease spread. Satellite data is widely accessible, often free, and covers huge temporal-spatial ranges [5, 10]. Researchers can compile their data without being physically near the areas they are studying [10]. This review will analyze how researchers have developed methods for predicting cholera outbreaks using remotely sensed data, and demonstrate why refining these techniques will be crucial to combating cholera outbreaks amidst climate change. 

Collection of Satellite Data

Natural disasters are increasing in intensity and frequency, heightening the opportunity for a cholera epidemic [2, 4]. Cholera epidemics have historically begun after storms, as after Hurricane Matthew in Haiti [4]. Hurricanes can destroy WASH infrastructure, allowing cholera to seep into water supplies and leaving people vulnerable to drinking contaminated water [4]. Detecting outbreaks and identifying their sources are crucial steps in managing deaths from cholera, as are improving sanitation and access to clean drinking water and increasing vaccination campaigns. These steps can be aided by remotely sensed data that feeds into prediction models [2].

Remotely sensed data measures variables that are known to be connected to cholera incidence. Huq et al. (2017) published research using remotely sensed precipitation, wind swath, geophysical landmarks, and population density after Hurricane Matthew struck Haiti in October 2016 [4]. The researchers created a map that showed areas at high risk for cholera and were able to predict where outbreak hotspots would occur up to four weeks after the hurricane [4].

Other useful variables include sea surface temperature (SST), sea surface salinity (SSS), land surface temperature (LST), precipitation, chlorophyll-a concentration (Ch-a), and soil moisture (SM) [1, 4, 5, 6, 7, 9, 11, 13]. SST and Ch-a are indicators of a habitat that is suitable for Vibrio cholerae growth [5, 6]. Flooding from extreme precipitation can flush seawater carrying V. cholerae into inland rivers, estuaries, or drinking water [4, 5, 6]. 

Satellites can provide data on climate variables in regions that health officials cannot access safely, or after a natural disaster when researchers cannot collect field data due to accessibility or time constraints [4]. This data could help researchers identify particular regions at risk for a cholera outbreak after an extreme weather event, and help policymakers make informed decisions about where to implement vaccination programs or establish WASH infrastructure. In districts where cholera is endemic, remotely sensed data could help identify outbreak sources or the thresholds at which an outbreak becomes an epidemic. Satellite data on essential climate variables and WASH infrastructure needs to remain publicly and freely available, and will be particularly effective in identifying potential cholera outbreaks as climate change increases the intensity and incidence of natural disasters and of climate patterns that suit V. cholerae proliferation.

Turning Raw Data into Models

Tracking Essential Climate Variables

Satellites provide data across vast geospatial and temporal ranges on the Essential Climate Variables (ECVs) correlated with cholera outbreaks. Remote sensing systems allow researchers to build models of cholera dynamics based on these relationships [5]. Fooladi et al. (2021) used precipitation data from 1983 to 2016 to compute a non-standardized precipitation index (nSPI) in the Gavkhooni basin in Iran. Their model demonstrates how prior understanding of the environmental conditions that precede cholera outbreaks can be combined with satellite data to make novel predictions about disease outbreaks [3]. For example, an algal bloom is an exponential growth of phytoplankton, which use chlorophyll-a to capture sunlight for photosynthesis, growth, and nutrient production [5]. Phytoplankton are a reservoir of V. cholerae and can be seen from space because of their green pigment. Therefore, Ch-a is a close enough proxy for phytoplankton to model the levels of V. cholerae bacteria in an area [5]. In 2021, Lai et al. used Landsat images from NASA and Sentinel-2A images from the European Space Agency to measure Ch-a in the Guanting Reservoir, one of the main water supply sources for Beijing, China. They developed a model relating variables in the satellite images (bands, normalized difference vegetation index, surface algal bloom index, Apple algorithm values) to Ch-a [8]. Across their 2016, 2017, and 2019 studies, predicted Ch-a correlated with measured chlorophyll-a levels at the 0.05 significance level [8]. This data allowed the researchers to model trends in Ch-a and water nutrition status, which has applications to reservoir eutrophication [8] and thus disease transmission.
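As an illustration of the general approach (not Lai et al.'s actual model, which used multiple image-derived variables), the sketch below fits an ordinary least-squares line relating a single hypothetical band ratio to matched in-situ chlorophyll-a measurements, then uses it to estimate Ch-a for a new pixel. All values are invented for illustration.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical band-ratio values from imagery, with matched in-situ
# chlorophyll-a measurements (ug/L) for the same locations.
band_ratio = [0.8, 1.0, 1.2, 1.5, 1.9]
chl_a      = [4.1, 5.0, 6.2, 7.4, 9.6]

a, b = fit_line(band_ratio, chl_a)
predicted = a + b * 1.3  # estimated Ch-a for a new pixel's band ratio
```

Real retrieval models combine several such image-derived predictors and must be validated against field samples, as the Guanting Reservoir studies did.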

Machine Learning

Variables such as LST and SM can be linked to cholera outbreaks through machine learning (ML) algorithms. ML elucidates complex relationships between variables, such as the risk of a cholera outbreak and ECVs [1]. By statistically analyzing input data from satellites, ML allows researchers to build models that predict an output (i.e., an outbreak) [9]. Researchers have examined algorithms such as Random Forest (RF), XGBoost, K-Nearest Neighbors, and Linear Discriminant Analysis [1, 9]. Campbell et al. (2020) found RF to be the most effective classifier due to its superior performance on oversampled and imbalanced datasets, yielding a high true positive rate (the probability that an actual trend is correctly predicted) of 89.5% when fitting a model combining a season encoder, a location encoder, LST, Ch-a, SSS, sea level anomalies, SM, and their lag values (past values used to predict future ones) [1]. Their model combined five ECVs and drew on data from forty districts of India from 2010 to 2018 [1]. 
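To show how such a classifier works in principle, here is a minimal k-nearest-neighbors sketch in pure Python (KNN being one of the algorithm families the studies examined). The features, labels, and values are invented toy data, not the datasets or tuned models from the cited work.

```python
import math

# Toy training data: (land surface temperature in deg C, chlorophyll-a in
# mg/m^3), labeled 1 for districts that saw an outbreak and 0 otherwise.
train = [
    ((24.0, 0.5), 0), ((25.5, 0.8), 0), ((23.0, 0.4), 0),
    ((31.0, 2.5), 1), ((32.5, 3.1), 1), ((30.0, 2.2), 1),
]

def knn_predict(features, k=3):
    """Majority vote among the k training points nearest to `features`."""
    nearest = sorted(train, key=lambda item: math.dist(features, item[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

print(knn_predict((31.5, 2.8)))  # a warm, chlorophyll-rich district
```

Real applications face exactly the challenges the studies describe: many more features, lagged values, and heavily imbalanced classes, which is why ensemble methods like Random Forest performed best.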

In a 2013 study, Jutla acknowledged that Ch-a alone cannot serve as an accurate predictor of a cholera outbreak, since other organic matter and detritus not represented by a chlorophyll index can also contribute to the presence of cholera bacteria [6]. To account for this, Jutla developed the Satellite Water Marker (SWM) index, which uses wavelength radiance to characterize coastal conditions and predict cholera outbreaks one to two months in advance [5]. SWM is based on shifts in the difference between blue (412 nm) and green (555 nm) wavelengths, which determine the turbidity (impurity) of water [5]. A high correlation was observed between SWM in the early winter months in the Bay of Bengal and cholera peaks in the spring, likely related to multiple coastal conditions, not just Ch-a [5]. Jutla et al. (2013) tested the Bay of Bengal SWM in Mozambique, where there is one annual cholera peak rather than two. They again found the SWM to be a more accurate indicator of cholera than Ch-a alone. Jutla’s index was used again by Ogata et al. (2021) to determine the specific environmental conditions in previous seasons that precede cholera outbreaks in northern coastal regions of the Bay of Bengal. They linked spring cholera to summer precipitation and the previous fall/winter SWM, while La Niña-driven SST deviations and floods caused by high summer rainfall anticipated fall cholera outbreaks [11]. Variability in climate conditions and SWM over decades indicates that the predictive models are ever-shifting [11]; a clear understanding of shifts in climate patterns over time is thus integral to accurate forecasting.
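The idea of a blue-green contrast index can be sketched as follows. This is a schematic normalized-difference formulation for illustration only, not the published SWM formula, and the reflectance values are assumptions; it shows only the qualitative behavior, that clearer water reflects relatively more blue, while turbid, plankton- and sediment-rich water reflects relatively more green.

```python
def blue_green_index(r_blue_412, r_green_555):
    """Schematic turbidity-style index: normalized contrast between blue
    (412 nm) and green (555 nm) reflectance. Higher values indicate clearer
    water; lower values indicate turbid, productive coastal water."""
    return (r_blue_412 - r_green_555) / (r_blue_412 + r_green_555)

clear = blue_green_index(0.012, 0.004)   # open-ocean-like reflectances
turbid = blue_green_index(0.006, 0.010)  # coastal, turbid-water-like values
```

Tracking how such an index shifts over the seasons, rather than its absolute value, is what gives it predictive lead time for outbreaks.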

Challenges and Limitations

Remotely sensed data is integral to developing timely and accurate predictive models and early warning systems for cholera outbreaks. There is no set of ECVs or a specific ML technique that can be applied universally, especially when looking at endemic versus epidemic cholera [1, 2, 5, 6, 7, 9]. Many studies struggled with a lack of field data against which to test their models, particularly after extreme weather events which may destroy existing data collection infrastructure [7]. Researchers were also challenged by imbalanced datasets when programming ML algorithms, even with particularly resilient algorithms like RF [1, 9].  Cholera is notoriously difficult to model because it can occur through multiple pathways of transmission, and cholera outbreaks are related to several climate variables through complex relationships [5, 6, 9]. Further testing in diverse regions, under various climate conditions, utilizing assorted ECVs, and employing numerous ML techniques is necessary to make these models as accurate as possible. Future studies should focus on long-term observations of variables known to be connected to cholera and V. cholerae, such as sea surface salinity [1, 11]. Future models also need to take socioeconomic data into account [1, 4].

Conclusion

The purpose of this review was to demonstrate how and why remotely sensed data is being used to predict cholera outbreaks, particularly as climate change makes local weather patterns more unpredictable. Researchers do not report any shortfall in the satellite or ML technology needed to make satellite-driven cholera prediction models commonplace. However, different regions of the world have different seasonal and interannual variability in cholera transmission [5], making it difficult to develop a universal model. Therefore, future studies should emphasize testing various ML methods with diverse ECVs worldwide. Future studies should also work to formulate indices, such as the SWM, that can be applied across different geographical regions with minimal alteration. As climate change intensifies, cholera prediction models are vital components of disease prevention. Cholera is unlikely to be eradicated [5], but steps can be taken to control its transmission and minimize its mortality. These steps are more effective the more time officials have to deploy them, so models that provide significant lead times are critical.

Works Cited

[1] Campbell AM, Racault M-F, Goult S, Laurenson A. 2020. Cholera risk: a machine learning approach applied to essential climate variables. IJERPH. 17(24):9378.

[2] Christaki E, Dimitriou P, Pantavou K, Nikolopoulos GK. 2020. The impact of climate change on cholera: A review on the global status and future challenges. Atmosphere. 11(5):449.

[3] Fooladi M, Golmohammadi MH, Safavi HR, Singh VP. 2021. Fusion-based framework for meteorological drought modeling using remotely sensed datasets under climate change scenarios: resilience, vulnerability, and frequency analysis. Journal of Environmental Management. 297:113283.

[4] Huq A, Anwar R, Colwell R, McDonald MD, Khan R, Jutla A, Akanda S. 2017. Assessment of risk of cholera in Haiti following Hurricane Matthew. The American Journal of Tropical Medicine and Hygiene. 97(3):896–903.

[5] Jutla AS, Akanda AS, Islam S. 2010. Tracking cholera in coastal regions using satellite observations 1. JAWRA Journal of the American Water Resources Association. 46(4):651–662.

[6] Jutla A, Akanda AS, Huq A, Faruque ASG, Colwell R, Islam S. 2013. A water marker monitored by satellites to predict seasonal endemic cholera. Remote Sensing Letters. 4(8): 822-831.

[7] Khan R, Aldaach H, McDonald C, Alam M, Huq A, Gao Y, Akanda AS, Colwell R, Jutla A. 2019. Estimating cholera risk from an exploratory analysis of its association with satellite-derived land surface temperatures. International Journal of Remote Sensing. 40(13):4898–4909.

[8] Lai Y, Zhang J, Song Y, Gong Z. 2021. Retrieval and evaluation of chlorophyll-a concentration in reservoirs with main water supply function in Beijing, China, Based on Landsat Satellite Images. IJERPH. 18(9):4419.

[9] Leo J, Luhanga E, Michael K. 2019. Machine learning model for imbalanced cholera dataset in Tanzania. The Scientific World Journal. 2019:1–12.

[10] Moore GK. 1979. What is a picture worth? A history of remote sensing / Quelle est la valeur d’une image? Un tour d’horizon de télédétection. Hydrological Sciences Bulletin. 24(4):477–485.

[11] Ogata T, Racault M-F, Nonaka M, Behera S. 2021. Climate precursors of satellite water marker index for spring cholera outbreak in Northern Bay of Bengal coastal regions. International Journal of Environmental Research and Public Health. 18(19):10201.

[12] World Health Organization. 2021. Cholera annual report 2020. Weekly Epidemiological Record, Volume 96, page 445-460. 

[13] Xu M, Cao CX, Wang DC, Kan B, Xu YF, Ni XL, Zhu ZC. 2016. Environmental factor analysis of cholera in China using remote sensing and geographical information systems. Epidemiol Infect. 144(5):940–951.

Current threats to the Greater Everglades Ecosystem by invasive Burmese pythons

By Jessica Baggott, Evolution Ecology and Biodiversity Major, Professional Writing Minor, ’23

Author’s note: I wrote this piece in the Spring Quarter of 2022 for UWP 102B, Writing in the Disciplines: Biology. I wrote this piece partially because I have always fostered an interest in invasive species — how they enter, alter, and succeed in ecosystems. And, how we as scientists and policymakers address these threats to native ecosystems. I was also compelled to write this review because of the abundance of recent literature and the lack of another review, to my knowledge, that covered the same topics as I intended to.

I hope that readers walk away from this piece with a greater understanding of the Burmese python in the Florida Everglades — their invasion, success, and alterations to a fragile and precious ecosystem. I wish for readers to recognize the connections that I have made, combing through the literature, and I wish for them to make their own connections, too. There is no greater gift than your engagement with my work.

 

INTRODUCTION

Southern Florida’s Greater Everglades Ecosystem (GEE) once included over 8 million acres of 0.5-2.0 foot deep wetland from the Kissimmee Chain of Lakes just south of Orlando to the southern tip of Florida Bay [1]. Now, the GEE is estimated to be half of its historical size and is fragmented into various national, state, regional, and local parks as well as more than 12 wildlife refuges and marine preserves [2, 3, 4]. Everglades National Park (ENP), one of the federally protected regions of the GEE, includes only 1.5 million acres of this vast ecosystem [5]. However, even within the protected region of ENP, canals, pump stations, and roads have been constructed to increase human accessibility to the Everglades, severely altering precise hydrological processes [1, 6]. These hydrological alterations, encroaching human settlements, degraded water quality, anthropogenic climate change, and the introduction of invasive species all pose significant threats to the GEE, and act in conjunction to compound their negative effects [4]. 

Perhaps the most infamous invasive species in the U.S., the Burmese python (Python molurus bivittatus) is the best known threat to the GEE. The snakes’ long lifespan, high fecundity (reproductive output), and generalist lifestyle, which allows them to adapt their behavior and diet to their environment, have allowed a small number of pythons to establish and thrive in the GEE [7]. Currently, Burmese pythons are drastically altering trophic structures as well as introducing and transmitting disease in the GEE. Furthermore, Burmese pythons have extended their range northward and have the potential to extend it further, putting other ecosystems and species at risk. A comprehensive literature review is required to inform policy decisions and assess the risk posed by Burmese pythons beyond the GEE.

Background

Native to Southeast Asia, the Burmese python was introduced into the GEE in the 1980s during a boom in the exotic pet trade, when owners released the snakes into the Everglades [7]. Since it was first recorded in ENP in 2000, the invasive range of the Burmese python has rapidly expanded to the entirety of ENP and much of Big Cypress National Preserve [8]. However, population estimates have been hindered by a combination of cryptic python behavior (including long periods of inactivity), excellent natural camouflage, and park management practices that remove every python encountered without necessarily documenting the numbers removed [9]. These factors have produced extremely low python detection probabilities, ranging from 0.0001 to 0.0146 in visual surveys and radio-transmitter studies [9]. Given such low detection probability, population estimates range from tens of thousands to hundreds of thousands [9, 10]. Better population estimates are required for effective management strategies and for monitoring changing python populations [9].
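The link between low detection probability and the wide range of population estimates can be illustrated with a back-of-the-envelope calculation (a hypothetical sketch: the observed count is invented, and this is not the estimator used in the cited studies). If C pythons are detected and each individual has detection probability p, a naive estimate of the true population is N ≈ C / p:

```python
# Hypothetical illustration of why low detection probability yields wide
# population estimates. The observed count is invented for illustration;
# the detection probabilities come from the range reported above.

def naive_population_estimate(count: int, detection_prob: float) -> float:
    """Naive estimate: true population ~ observed count / detection probability."""
    return count / detection_prob

observed = 15  # hypothetical number of pythons detected in a survey season
for p in (0.0146, 0.0010, 0.0001):
    estimate = naive_population_estimate(observed, p)
    print(f"p = {p}: estimated population ~ {estimate:,.0f}")
```

With the same hypothetical count, the estimate swings from roughly a thousand animals at the highest reported detection probability to 150,000 at the lowest, which is why published estimates span tens of thousands to hundreds of thousands.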

Northward Range Expansion

Burmese pythons exhibit seasonal habitat preference, primarily choosing covered habitats close to water, though recent studies have found evidence that they may also be attracted to human development [11-17]. Smith et al. (2021) found that within their native range in Thailand, Burmese pythons do not avoid human-dominated landscapes. Similarly, Bartoszek et al. (2021) found that in a northwest portion of ENP, within their invasive range, Burmese python hotspots were on average only 515 meters from urban development. Researchers attributed this proximity to the abundance of readily available prey in these areas, in the form of livestock and of birds attracted to artificial lakes [11, 16]. However, egg clutches deposited in or near urban areas may exhibit lower survival rates than those in other habitats [8]. Though juveniles can travel long distances, particularly through agricultural canals, Pittman & Bartoszek (2021) hypothesize that adult pythons, with their more sophisticated navigational capacities, are in fact the segment of the population driving expansion [18]. Adults’ tolerance of, and even attraction to, urban environments indicates that northward Burmese python expansion may not be hindered by human settlements. 

Besides suitable habitat, the range of ectotherms such as the Burmese python is typically limited by climate and/or by behavioral adaptations such as retreating into underground refugia during winter months [19]. Though a conservative estimate allows Burmese pythons to survive for short periods at 5 °C, temperatures must be above 16 °C in order for them to maintain digestion [19]. In isolation, these requirements make further expansion of the Burmese python into more northern parts of Florida extremely unlikely without the additional development of hibernation behaviors [19]. However, other researchers have found evidence of rapid adaptation for increased thermal tolerance after an extreme cold event in 2010 that caused high python mortality [20]. Adaptations included the maintenance of an active digestive system and changes in gene expression related to regenerative organ growth and behavior [20]. This rapid evolution by natural selection may permit Burmese pythons to expand their range northward into more temperate climates. 
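The two thermal thresholds above can be captured in a toy winter-suitability check (a hypothetical sketch: the thresholds paraphrase the conservative estimates cited, while the temperature series, function name, and tolerated number of cold days are invented for illustration, not taken from any cited model):

```python
# Toy winter-suitability check based on the two thresholds discussed above:
# short-term survival near 5 C, and digestion maintained only above 16 C.
# The temperature series and tolerated cold-day count are invented.

SURVIVAL_MIN_C = 5.0    # conservative short-term survival threshold
DIGESTION_MIN_C = 16.0  # minimum temperature to maintain digestion

def classify_winter(daily_lows_c, max_cold_days=2):
    """Classify a site's winter as 'unsuitable', 'marginal', or 'suitable'."""
    cold_days = sum(t < SURVIVAL_MIN_C for t in daily_lows_c)
    if cold_days > max_cold_days:
        return "unsuitable"  # too many days below the survival threshold
    if all(t >= DIGESTION_MIN_C for t in daily_lows_c):
        return "suitable"    # warm enough to keep digesting all winter
    return "marginal"        # survivable, but digestion is interrupted

print(classify_winter([18, 17, 19, 16, 20]))  # prints "suitable"
print(classify_winter([4, 3, 7, 2, 9]))       # prints "unsuitable"
print(classify_winter([10, 12, 14, 11, 13]))  # prints "marginal"
```

The "marginal" category reflects the paragraph's point that hibernation-like behavior, or the rapid cold-tolerance adaptation described next, would be needed for pythons to persist where winters interrupt digestion without being immediately lethal.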

However, no studies in the last decade have examined the Burmese python’s potential for northward expansion, despite advances in climate and habitat models and tracking, and a greater understanding of Burmese python cold physiology. The studies that do exist are inconclusive, and their results vary greatly: Rodda et al. (2009) [21] and Pyron et al. (2008) [22] reached opposing estimates of the potential range. Rodda et al. (2009) concluded that the potential Burmese python range could include most of the southern U.S., from California through North Carolina. In contrast, Pyron et al. (2008) included only southern Florida and extreme southern Texas as the potential range of Burmese python expansion. Previous studies examining the potential Burmese python range largely agreed with Pyron et al. (2008), and all but two directly refuted the range suggested by Rodda et al. (2009) [19, 23-25]. Furthermore, climate change is projected to decrease the frequency and intensity of cold events in North America, allowing tropical species historically found at or near the equator, such as the Burmese python, to move poleward [26]. A literature review examining the potential northward expansion of tropical organisms as a whole, with brief mentions of the Burmese python in Florida, posits that Burmese python range expansion is likely given the evidence for rapid cold-tolerance adaptation presented by Card et al. (2018) [26]. However, a complete understanding of the adaptive capacity of species, ecosystems, and biomes under climate change remains lacking [26]. 

In addition to rapidly adapting to cold temperatures, Burmese pythons have shown evidence of hybridizing with a closely related invasive species, the Indian python (Python molurus) [27]. Hybridization has increased the population’s genetic diversity, allowing Burmese pythons to mitigate founder and bottleneck effects, i.e., the loss of genetic diversity due to a small founding population size or environmental pressures [27]. Additionally, Hunter et al. (2018) found evidence of multiple paternity (the insemination of a female by more than one male during a single reproductive event) in Burmese pythons, further increasing genetic diversity. Together, these mechanisms increase genetic diversity and likely fitness, raising the probability of northward expansion. 


Figure: Burmese python presence (1979–2016), from Conyers & Sen Roy (2021) [13].

Disease 

The invasion of the Burmese python in the GEE has introduced at least one pathogen, a lung parasite known as Raillietiella orientalis. Because North American hosts lack a coevolutionary history with this parasite, its spread and severity in native species have increased. The parasite now affects 13 species of native snakes and has extended beyond the python range into north-central Alachua County, Florida, approximately 170 miles from the northernmost point of the GEE [28-30]. Researchers observed higher infection intensity and prevalence, and larger R. orientalis body size, in native snakes than in Burmese pythons, as native snakes do not share evolutionary history with R. orientalis and are therefore immunologically naive [29]. Infection by R. orientalis may be lethal or sublethal, and may be the cause of population decline in the pygmy rattlesnake [29, 31]. Additionally, R. orientalis’ native snake hosts have the highest rate of competence; that is, they are the most likely to transmit a resultant infection to a new host or vector after being exposed to the parasite. Furthermore, as R. orientalis’ native snake hosts include three of the most abundant snakes in North America, the parasite has a high likelihood of continued expansion throughout North America and possibly beyond [29]. Since the snakes of North America have not coevolved with R. orientalis, infections will be more severe and may cause population-wide declines, potentially resulting in devastating trophic cascades. The negative effects of the introduced parasite compound those of Burmese python predation, creating weakened native populations that are more susceptible to parasitism, disease, and other stressors. More research is needed to ascertain the complete range of R. orientalis, its expansion rate, intermediate hosts, sublethal effects on native snakes, and population-level impacts. 

In addition to introducing a novel pathogen, Burmese pythons are competent hosts of at least one native pathogen and are suspected to be competent hosts of more [28, 32]. As a competent host of native pathogens, the Burmese python likely acts as a reservoir for them, increasing transmission to native species and humans [28, 32]. Burmese pythons can also change disease transmission by altering host communities through predation. Such is the case with the endemic Everglades virus (EVEV), which can cause clinical encephalitis, an inflammation of the active tissues of the brain, in humans. Decreased mammal diversity resulting from Burmese python predation was found to increase blood meals on amplifying hosts (hosts in which infectious agents multiply rapidly to high levels), increasing EVEV infection in mosquitoes [12]. Thus, it is possible that Burmese pythons could increase disease prevalence in humans as well, though contact with infected hosts is required for spread, so human disease may be driven by different factors than those in the mosquito-rodent cycle [12]. The complex dual role of the Burmese python, as both a predator of host species and a host itself, remains poorly understood for many other important diseases and presents an opportunity for future research. Additionally, studies should be conducted to estimate the human risk resulting from Burmese python alteration of host communities.

Further disease spillback is mediated by elevated rates of mosquito feeding on Burmese pythons [32]. The mosquitoes that prefer feeding on Burmese pythons also feed on a range of other species, including mammals, birds, reptiles, and amphibians [32]. Additionally, mosquito ranges extend beyond that of the Burmese python [32]. Thus, through both preferential mosquito feeding on pythons and the large range of mosquitoes, the introduction of the Burmese python into the Everglades has increased disease spread beyond the python range.

Predation

The Burmese python has more than 40 documented prey species in the Everglades, including a wide range of mammals and birds, and occasionally American alligators [33]. Given their appetite and potentially large population, Burmese pythons are able to exert control over prey populations. The decline of particular species relative to others can then cause ecosystem-wide cascades. Pythons have been found to cause severe mammal population declines through predation in their invasive range, including 99.3%, 98.9%, and 87.5% decreases in observation frequency of raccoons, opossums, and bobcats, respectively [33, 34]. Additionally, pythons have caused the complete local extinction of marsh rabbits, once one of the most commonly seen animals in ENP [33, 35, 36]. When reintroduced to ENP, marsh rabbits established a breeding population within five months of translocation, but by 11 months after reintroduction, 77% of deaths were attributed to Burmese pythons and the population was unable to reestablish [35]. This disproportionate predation makes the reestablishment of this and similarly affected species impossible as long as the python persists. Similarly, an analysis of anthropogenic stressors and those posed by pythons found that the strongest predictor of marsh rabbit occurrence was distance from the epicenter of the python invasion [36]. These results indicate that pythons have profound effects on ecosystem composition through predation and are able to cause trophic cascades that damage the ecosystem. Additionally, as with marsh rabbits, species may be unable to reestablish in the core invasion area even with translocation efforts. This demonstrates that without removal of Burmese pythons from the GEE, the biodiversity and community composition of the GEE may be irreparably damaged.

Large, highly fecund species with wide habitat breadths were found to be the least susceptible to increased pressure from pythons, so the decline of a highly fecund habitat generalist such as the marsh rabbit is especially concerning [37]. Using trait relationships, researchers predicted exclusively negative responses in occupancy probabilities to the presence of Burmese pythons for five unobserved species of concern: the Everglades mink, feral hog, gray fox, red fox, and Key Largo woodrat [37]. Though rodent populations were previously thought to be resistant to the effects of pythons, declines in these populations have also been observed, and due to their lack of evolutionary history with pythons, one species, the eastern woodrat, has even been suggested to be attracted to python scent [34, 38]. These results, together with research on mammal resilience to pythons, show little evidence of resilience among mammals within the core invasion area, which only further contributes to the homogenization of the ecosystem [34]. Additionally, the loss of diversity and competition will likely allow other invasive species to establish more easily [34]. These findings show the need for continued monitoring of species to analyze trends, research on responses to novel predators, and investigation of the mechanisms behind native species’ negative responses to Burmese pythons. Furthermore, they suggest that removal or significant population reduction of Burmese pythons may be the only way to curb their negative impacts. 

CONCLUSION

The purpose of this review was to examine the effects of the Burmese python in the GEE through predation, the introduction and alteration of disease transmission, and potential range expansion. It is evident from this review that the Burmese python, through predation and trophic alteration, has had severe effects on the native fauna of the GEE. Ultimately, it is the lack of coevolution between the Burmese python and native fauna that has led to the acute and persistent problems in the GEE. Burmese python establishment in the GEE has proved extremely detrimental to an ecosystem already facing considerable anthropogenic stressors. Given this, special attention should be paid to curbing further Burmese python expansion to avoid similar ecological catastrophes elsewhere. Further studies should be conducted on native resilience and recovery as populations eventually enter the third stage of invasion. Additionally, studies should better quantify python density in order to frame future understanding of ecosystem dynamics. The Burmese python is one prime example among many invasive species across the globe, so better understanding these aspects of python success and native fauna response is not only critical in itself; the results may also apply to the broader effort to manage invasive species. 

REFERENCES

  1. South Florida Water Management District. 2022. History of the Greater Everglades Ecosystem: Role of the Everglades in the Greater Everglades ecosystem.
  2. Office of Economic & Demographic Research. 2022. Annual Assessment of The Everglades. 5:1-19.
  3. Congressional Research Service. 2017. Everglades Restoration: Federal Funding and Implementation progress.
  4. Defenders of Wildlife. Greater Everglades.
  5. National Park Service. 2021. Everglades National Park Frequently Asked Questions.
  6. Florio. 2021. Removing the cork in the bottle: Reconstructing Tamiami Trail to restore water flow to Everglades National Park. 
  7. Willson, et al. 2011. Biol Invasions. 13: 1493-1504. 
  8. Pittman & Bartoszek. 2021. BMC Zoology. 6(33).
  9. Nafus, et al. 2020. Journal of Herpetology. 54(1): 24-30. 
  10. Janos. 2020. How Burmese Pythons Took Over the Florida Everglades. History. 
  11. Bartoszek, et al. 2021. Ecosphere. 12(6). 
  12. Burkett-Cadena, et al. 2021. Communications Biology. 4(804). 
  13. Conyers & Sen Roy. 2021. Spatial Information Research. 29: 749–760. 
  14. Hart, et al. 2015. Animal Biotelemetry. 3(8).
  15. Mutascio, et al. 2018. Landscape Ecology. 33, 257–274. 
  16. Smith, et al. 2021. Sci Rep-UK. 11(7014). 
  17. Walters, et al. 2016. Journal of Herpetology. 50(1): 50-56.
  18. Pittman, et al. 2014. Biology Letters. 10(3). 
  19. Jacobson, et al. 2012. Integrative Zoology. 7(3): 271-285. 
  20. Card, et al. 2018. Molecular Ecology. 27(23): 4744-4757. 
  21. Rodda, et al. 2009. Biol Invasions. 11: 241–252. 
  22. Pyron, et al. 2008. PLOS ONE. 3(8): e2931.
  23. Avery, et al. 2010. Biol Invasions. 12: 3649–3652.
  24. Dorcas, et al. 2011. Biol Invasions. 13: 793–802.
  25. Mazzotti, et al. 2016. Ecosphere. 7(8): e01439. 
  26. Osland, et al. 2021. Glob Change Biol. 27(13): 3009-3034. 
  27. Hunter, et al. 2018. Ecol Evol. 8(17): 9034-9047. 
  28. Miller, et al. 2018. Ecol Evol. 8(2): 830–840. 
  29. Miller, et al. 2020. Ecosphere. 11(6): e03153. 
  30. Walden, et al. 2020. Frontiers in Veterinary Science. 7:467.
  31. Farrell, et al. 2019. Herpetological Review. 50(1): 73-76.
  32. Reeves, et al. 2018. PLOS ONE. 13(1): e0190633. 
  33. Dorcas, et al. 2012. PNAS. 109(7): 2418-2422. 
  34. Taillie, et al. 2021. Biol Conserv. 261: 109290.
  35. McCleery, et al. 2015. P. Roy Soc B-Biol Sci. 282(1805). 
  36. Sovie, et al. 2016. Biol Invasions. 18: 3309–3318. 
  37. Soto-Shoender, et al. 2020. Biol Invasions. 22: 2671–2684. 
  38. Beckmann, et al. 2021. J Mammal. 103(1): 136–145.