Below are some good papers for each project that will help you get a feel for what the project is about.

Acoustic Species ID

    • [1] J. Ayers, Y. Jandali, Y. J. Hwang, G. Steinberg, E. Joun, M. Tobler, I. Ingram, R. Kastner, and C. Schurgers, “Challenges in Applying Audio Classification Models to Datasets Containing Crucial Biodiversity Information,” in 38th International Conference on Machine Learning, Jul. 2021. Available at: https://www.climatechange.ai/papers/icml2021/14

      The acoustic signature of a natural soundscape can reveal consequences of climate change on biodiversity. Hardware costs, human labor time, and the expertise needed to label audio are impediments to conducting acoustic surveys across a representative portion of an ecosystem. These barriers are quickly eroding with the advent of low-cost, easy-to-use, open-source hardware and the expansion of machine learning, which provides pre-trained neural networks to test on retrieved acoustic data. One consistent challenge in passive acoustic monitoring (PAM) is that neural networks which show promising results on publicly available training and test sets perform unreliably on field recordings that contain crucial biodiversity information. To demonstrate this challenge, we tested a hybrid recurrent neural network (RNN) and convolutional neural network (CNN) binary classifier, trained for bird presence/absence, on two Peruvian bird audio sets. The RNN achieved an area under the receiver operating characteristic curve (AUROC) of 95% on a dataset collected from Xeno-canto and Google’s AudioSet ontology, in contrast to 65% on a stratified random sample of field recordings collected from the Madre de Dios region of the Peruvian Amazon. In an attempt to alleviate this discrepancy, we applied various audio data augmentation techniques during training, which raised the AUROC to 77% on the field recordings.

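      The AUROC figures quoted above have a simple probabilistic reading: the probability that a randomly chosen positive clip is scored higher than a randomly chosen negative one. A minimal sketch of that reading, using made-up labels and scores rather than anything from the paper:

```python
# Toy illustration of the AUROC metric reported above.
# AUROC equals the probability that a randomly chosen positive example
# receives a higher score than a randomly chosen negative one
# (the Mann-Whitney U formulation); ties count as half.

def auroc(labels, scores):
    """Area under the ROC curve for binary labels and real-valued scores."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need at least one positive and one negative example")
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Made-up presence/absence labels and classifier scores, not data from the paper
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(auroc(labels, scores))  # 8 of the 9 positive/negative pairs are ranked correctly
```

      In these terms, the 95% vs. 65% gap in the abstract is the drop in correctly ranked present/absent pairs when moving from curated audio to field recordings.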

    • [2] S. Kahl, T. Denton, H. Klinck, H. Reers, F. Cherutich, H. Glotin, H. Goëau, W.-P. Vellinga, R. Planqué, and A. Joly, “Overview of BirdCLEF 2023: Automated bird species identification in Eastern Africa,” Working Notes of CLEF, 2023. Available at: https://www.researchgate.net/publication/373603820_Overview_of_BirdCLEF_2023_Automated_Bird_Species_Identification_in_Eastern_Africa_40_International_CC_BY_40

    • [3] S. Kahl, M. Clapp, W. A. Hopping, H. Goëau, H. Glotin, R. Planqué, W.-P. Vellinga, and A. Joly, “Overview of BirdCLEF 2020: Bird sound recognition in complex acoustic environments,” in CLEF 2020 Conference and Labs of the Evaluation Forum, 2020. Available at: https://ceur-ws.org/Vol-2696/paper_262.pdf

FishSense

Mangrove Monitoring

    • [1] A. J. Hsu, E. Lo, J. Dorian, K. Qi, M. T. Costa, and B. G. Martinez, “Lessons on monitoring mangroves,” UC San Diego: Aburto Lab, 2019. Available at: https://escholarship.org/uc/item/3bg3206z

Maya Archeology

    • [1] K. Guo, S. Ramaniyer, and T. Sharkey, “Digital Preservation of Maya Archaeological Sites Using Virtual Reality,” Jun. 2022. Available at: https://kastner.ucsd.edu/ryan/wp-content/uploads/sites/5/2022/06/admin/maya-vr.pdf

      There’s an ongoing tension between the preservation of cultural heritage sites and the need for usable land. As a result, there have been several efforts to preserve these sites using current advances in scanning, modeling, and visualization technologies, notably RGB-D cameras, scene reconstruction pipelines, and Virtual Reality (VR). However, these individual technologies have mostly been developed independently, and little effort has been dedicated to integrating them. This paper presents an evolution of an existing system for reconstructing and displaying cultural heritage sites in Virtual Reality environments. To achieve this, we develop a pipeline to (i) track the cameras using visual-inertial SLAM, (ii) perform a 3D reconstruction using registered depth and RGB data, (iii) facilitate loading and displaying the reconstruction in VR, and (iv) create virtual voice-guided tours. We discuss the details of this automated pipeline and demonstrate its ability to take an unskilled user from data capture to an immersive virtual tour, without the need for third-party or command-line tools, by designing a desktop GUI. We hope that this automated scanning and reconstruction pipeline will aid the digitization of cultural heritage sites and education about them.

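      The four pipeline stages (i)-(iv) above can be read as a simple linear dataflow. A hypothetical sketch of that structure; the stage names, types, and return values are illustrative only, not the authors' actual API:

```python
# Hypothetical sketch of the four-stage pipeline described in the abstract.
# Stage names and data types are illustrative, not the paper's real code.
from dataclasses import dataclass, field

@dataclass
class ReconstructionJob:
    frames: list                                   # registered RGB-D frames from the scanner
    poses: list = field(default_factory=list)      # camera poses estimated by SLAM
    mesh: object = None                            # reconstructed 3D model
    tour: list = field(default_factory=list)       # ordered voice-guided tour stops

def track_cameras(job):          # (i) visual-inertial SLAM
    job.poses = [f"pose-{i}" for i, _ in enumerate(job.frames)]
    return job

def reconstruct(job):            # (ii) fuse registered depth + RGB into a mesh
    job.mesh = f"mesh-from-{len(job.poses)}-poses"
    return job

def export_for_vr(job):          # (iii) package the mesh for loading in the VR viewer
    return job

def add_voice_tour(job, stops):  # (iv) attach voice-guided tour waypoints
    job.tour = list(stops)
    return job

job = ReconstructionJob(frames=["rgbd-0", "rgbd-1", "rgbd-2"])
job = add_voice_tour(export_for_vr(reconstruct(track_cameras(job))), ["entrance", "altar"])
print(job.mesh, job.tour)
```

      The point of the paper is that a desktop GUI drives this whole chain, so an unskilled user never touches the intermediate stages directly.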

Radio Telemetry Tracking

    • [1] N. T. Hui, E. K. Lo, J. B. Moss, G. P. Gerber, M. E. Welch, R. Kastner, and C. Schurgers, “A more precise way to localize animals using drones,” Journal of Field Robotics, 2021, doi: https://doi.org/10.1002/rob.22017.

      Radio telemetry is a commonly used technique in conservation biology and ecology, particularly for studying the movement and range of individuals and populations. Traditionally, most radio telemetry work is done using handheld directional antennae together with either direction-finding and homing techniques or radio-triangulation techniques. Over the past couple of decades, efforts have been made to use unmanned aerial vehicles to make radio-telemetry tracking more efficient or to cover more area. However, many of these approaches are complex and have not been rigorously field-tested. To provide scientists with reliable, high-quality tracking data, tracking systems need to be rigorously tested and characterized. In this paper, we present a novel drone-based radio-telemetry tracking method for tracking the broad-scale movement paths of animals over multiple days, along with its implementation and deployment under field conditions. During a two-week field period in the Cayman Islands, we demonstrated this system’s ability to localize multiple targets simultaneously in daily 10-minute tracking sessions, generating more precise estimates than comparable efforts using manual triangulation techniques.

      Keywords: aerial robotics, environmental monitoring, exploration, rotorcraft

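      The manual triangulation baseline mentioned above combines compass bearings taken from several observer positions into one position fix. A minimal least-squares sketch of that idea, with made-up observer positions and a hypothetical `triangulate` helper; this is not the paper's method or code:

```python
import math

# Hypothetical sketch of classical radio triangulation: each observation is
# (x0, y0, theta), an observer position and the measured bearing toward the
# transmitter (radians, counterclockwise from the +x axis).

def triangulate(observations):
    """Least-squares intersection of bearing lines.

    A bearing from (x0, y0) at angle theta defines the line
    sin(theta)*(x - x0) - cos(theta)*(y - y0) = 0.
    Stacking one such row per observation gives an overdetermined linear
    system; we solve its 2x2 normal equations for (x, y).
    """
    a11 = a12 = a22 = b1 = b2 = 0.0
    for x0, y0, theta in observations:
        s, c = math.sin(theta), math.cos(theta)
        r = s * x0 - c * y0          # right-hand side for row [s, -c]
        a11 += s * s
        a12 += -s * c
        a22 += c * c
        b1 += s * r
        b2 += -c * r
    det = a11 * a22 - a12 * a12
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; position is unobservable")
    # Invert the symmetric 2x2 normal matrix analytically
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Two observers with exact bearings toward a transmitter at (10, 5)
obs = [(0.0, 0.0, math.atan2(5, 10)), (20.0, 0.0, math.atan2(5, -10))]
print(triangulate(obs))
```

      With noisy real-world bearings the residuals no longer vanish, which is why precision depends so strongly on bearing quality and observer geometry; the drone-based system in the paper is aimed at exactly that limitation.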

Smartfin

    • [1] P. Bresnahan, T. Cyronak, R. J. W. Brewin, A. Andersson, T. Wirth, T. Martz, T. Courtney, N. Hui, R. Kastner, A. Stern, T. McGrain, D. Reinicke, J. Richard, K. Hammond, and S. Waters, “A high-tech, low-cost, Internet of Things surfboard fin for coastal citizen science, outreach, and education,” Continental Shelf Research, vol. 242, p. 104748, 2022, doi: https://doi.org/10.1016/j.csr.2022.104748.

      Coastal populations and hazards are escalating simultaneously, leading to an increased importance of coastal ocean observations. Many well-established observational techniques are expensive, require complex technical training, and offer little to no public engagement. Smartfin, an oceanographic sensor-equipped surfboard fin and citizen science program, was designed to alleviate these issues. Smartfins are typically used by surfers and paddlers in surf zone and nearshore regions where they can help fill gaps between other observational assets. Smartfin user groups can provide data-rich time series in confined regions. Smartfin comprises temperature, motion, and wet/dry sensing, GPS location, and cellular data transmission capabilities for the near-real-time monitoring of coastal physics and environmental parameters. Smartfin’s temperature sensor has an accuracy of 0.05 °C relative to a calibrated Sea-Bird temperature sensor. Data products for quantifying ocean physics from the motion sensor and additional sensors for water quality monitoring are in development. Over 300 Smartfins have been distributed around the world and have been in use for up to five years. The technology has been proven to be a useful scientific research tool in the coastal ocean—especially for observing spatiotemporal variability, validating remotely sensed data, and characterizing surface water depth profiles when combined with other tools—and the project has yielded promising results in terms of formal and informal education and community engagement in coastal health issues with broad international reach. In this article, we describe the technology, the citizen science project design, and the results in terms of natural and social science analyses. We also discuss progress toward our outreach, education, and scientific goals.

      Keywords: Coastal oceanography, Citizen science, Surfing, Sea surface temperature, Outreach

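      The 0.05 °C accuracy figure above is the kind of number obtained by co-deploying a fin with a calibrated reference sensor and comparing readings. A toy sketch of that comparison, with made-up readings rather than data from the paper:

```python
# Toy calibration check: compare Smartfin-style temperature readings against
# a co-located reference sensor. All numbers below are invented.

def mean_abs_error(fin, reference):
    """Mean absolute difference between paired temperature readings (deg C)."""
    if len(fin) != len(reference) or not fin:
        raise ValueError("need equal-length, non-empty reading series")
    return sum(abs(a - b) for a, b in zip(fin, reference)) / len(fin)

fin_readings = [18.02, 18.11, 18.25, 18.31]   # hypothetical fin samples
ref_readings = [18.00, 18.08, 18.22, 18.35]   # hypothetical Sea-Bird samples
print(round(mean_abs_error(fin_readings, ref_readings), 3))
```

      A fin passing such a check within the stated 0.05 °C band is what makes the crowd-sourced time series usable for validating remotely sensed sea surface temperature.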

    • [2] N. Hui, “Smartfin Current Efforts.” GitHub, Sep. 2023. Available at: https://github.com/UCSD-E4E/smartfin-docs/blob/master/current_efforts.md

Research Support Group

    • [1] “E4E Hardware Group.” GitHub. Available at: https://github.com/UCSD-E4E/e4e-hw

    • [2] “E4E Engineering Support Group.” GitHub. Available at: https://github.com/UCSD-E4E/engineering_support_group
