Onboarding Papers
Below are some good papers for each project that will help you get a feel for what the project is about.
Acoustic Species ID
- [1] J. Ayers, Y. Jandali, Y. J. Hwang, G. Steinberg, E. Joun, M. Tobler, I. Ingram, R. Kastner, and C. Schurgers, “Challenges in Applying Audio Classification Models to Datasets Containing Crucial Biodiversity Information,” in 38th International Conference on Machine Learning, Jul. 2021. Available at: https://www.climatechange.ai/papers/icml2021/14
The acoustic signature of a natural soundscape can reveal consequences of climate change on biodiversity. Hardware costs, human labor time, and the expertise required to label audio are impediments to conducting acoustic surveys across a representative portion of an ecosystem. These barriers are quickly eroding with the advent of low-cost, easy-to-use, open-source hardware and the expansion of the machine learning field, which provides pre-trained neural networks to test on retrieved acoustic data. One consistent challenge in passive acoustic monitoring (PAM) is that neural networks showing promising results on publicly available training and test sets prove unreliable on audio recordings collected in the field that contain crucial biodiversity information. To demonstrate this challenge, we tested a hybrid recurrent neural network (RNN) and convolutional neural network (CNN) binary classifier trained for bird presence/absence on two Peruvian bird audio sets. The RNN achieved an area under the receiver operating characteristic curve (AUROC) of 95% on a dataset collected from Xeno-canto and Google’s AudioSet ontology, in contrast to 65% across a stratified random sample of field recordings collected from the Madre de Dios region of the Peruvian Amazon. In an attempt to alleviate this discrepancy, we applied various audio data augmentation techniques in the network’s training process, which led to an AUROC of 77% across the field recordings.
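As a quick primer on the AUROC metric this abstract reports, here is a minimal pure-Python sketch using the pairwise-ranking (Mann-Whitney) formulation; the labels and scores are made-up toy values, not data from the paper:

```python
def auroc(labels, scores):
    """AUROC as the probability that a randomly chosen positive example
    is scored higher than a randomly chosen negative one (ties = half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 * (p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy presence/absence labels and classifier scores (illustrative only)
labels = [0, 0, 1, 1]
scores = [0.10, 0.40, 0.35, 0.80]
print(auroc(labels, scores))  # 0.75
```

An AUROC of 0.5 corresponds to chance-level ranking and 1.0 to a perfect separation of bird-present from bird-absent clips, which is why the drop from 95% to 65% on field recordings is so significant.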
- [2] S. Kahl, T. Denton, H. Klinck, H. Reers, F. Cherutich, H. Glotin, H. Goëau, W.-P. Vellinga, R. Planqué, and A. Joly, “Overview of BirdCLEF 2023: Automated bird species identification in Eastern Africa,” Working Notes of CLEF, 2023. Available at: https://www.researchgate.net/publication/373603820_Overview_of_BirdCLEF_2023_Automated_Bird_Species_Identification_in_Eastern_Africa_40_International_CC_BY_40
- [3] S. Kahl, M. Clapp, W. A. Hopping, H. Goëau, H. Glotin, R. Planqué, W.-P. Vellinga, and A. Joly, “Overview of BirdCLEF 2020: Bird sound recognition in complex acoustic environments,” in CLEF 2020 (Conference and Labs of the Evaluation Forum), 2020. Available at: https://ceur-ws.org/Vol-2696/paper_262.pdf
FishSense
- [1] P. Tueller, R. Maddukuri, P. Paxson, V. Suresh, A. Ashok, M. Bland, R. Wallace, J. Guerrero, B. Semmens, and R. Kastner, “FishSense: Underwater RGBD Imaging for Fish Measurement and Classification,” in OCEANS 2021 MTS/IEEE SAN DIEGO, IEEE, Sep. 2021. Available at: https://agu.confex.com/agu/OVS21/meetingapp.cgi/Paper/787405
There is a need for reliable underwater fish monitoring systems that can provide oceanographers and researchers with valuable data about life underwater. Most current methods rely heavily on human observation, which is both error prone and costly. FishSense provides a solution that accelerates the use of depth cameras underwater, opening the door to 3D underwater imaging that is fast, accurate, cost effective, and energy efficient. FishSense is a sleek handheld underwater imaging device that captures both depth and color images. This data has been used to calculate the length of fish, which can be used to derive biomass and health. The FishSense platform has been tested through two separate deployments. The first deployment imaged a toy fish of known length and volume within a controlled testing pool. The second deployment was conducted within a 70,000-gallon aquarium tank with multiple species of fish. A Receiver Operating Characteristic (ROC) curve has been computed based on the detector’s performance across all images, and the mean and standard deviation of the length measurements of the detections have been computed.
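The abstract above derives fish length from paired depth and color images. A hedged sketch of the underlying geometry, using standard pinhole back-projection with hypothetical camera intrinsics and keypoints (not FishSense’s actual pipeline):

```python
import math

def backproject(u, v, z, fx, fy, cx, cy):
    # Pinhole model: pixel (u, v) at depth z -> 3D point in camera coordinates
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)

def length_between(px_a, px_b, z_a, z_b, fx, fy, cx, cy):
    # Straight-line distance between two back-projected keypoints,
    # e.g. a fish's snout and tail as located in the color image
    a = backproject(*px_a, z_a, fx, fy, cx, cy)
    b = backproject(*px_b, z_b, fx, fy, cx, cy)
    return math.dist(a, b)

# Hypothetical intrinsics (focal lengths in pixels) and two keypoints
# imaged one metre from the camera
fx = fy = 600.0
cx, cy = 320.0, 240.0
print(length_between((200, 240), (440, 240), 1.0, 1.0, fx, fy, cx, cy))  # ≈ 0.4 m
```

The depth channel supplies the `z` values, which is exactly why an RGBD camera makes absolute length measurement possible where a single color camera cannot.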
- [2] E. Wong, I. Humphrey, S. Switzer, C. Crutchfield, N. Hui, C. Schurgers, and R. Kastner, “Underwater Depth Calibration Using a Commercial Depth Camera,” in Proceedings of the 16th International Conference on Underwater Networks & Systems, 2022. doi: 10.1145/3567600.3568158.
Depth cameras are increasingly used in research and industry in underwater settings. However, cameras that have been calibrated in air are notably inaccurate in depth measurements when placed underwater, and little research has been done to explore pre-existing depth calibration methodologies and their effectiveness in underwater environments. We used four methods of calibration on a low-cost, commercial depth camera both in and out of water. For each of these methods, we compared the predicted distance and length of objects from the camera with manually measured values to get an indication of depth and length accuracy. Our findings indicate that the standard methods of calibration in air are largely ineffective for underwater calibration and that custom calibration techniques are necessary to achieve higher accuracy.
Keywords: depth camera calibration, underwater stereo vision
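As background for why in-air calibration fails underwater: to first order, refraction at a flat housing port makes objects appear closer by roughly the refractive index of water. The sketch below shows only that first-order scaling, as an assumption for intuition; it is not the calibration method the paper evaluates:

```python
N_WATER = 1.33  # approximate refractive index of water

def correct_depth(z_apparent, n=N_WATER):
    # First-order flat-port refraction correction: light bends at the
    # water/air interface, so an object at true range z appears at z / n.
    # Valid only near the optical axis; real calibrations (as in the paper)
    # fit richer models to measured data.
    return n * z_apparent

print(correct_depth(0.75))  # ≈ 1.0 m true range for 0.75 m apparent
```

This simple scaling already explains the systematic underestimation the paper observes, and motivates the custom underwater calibration techniques it recommends.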
Mangrove Monitoring
- [1] A. J. Hsu, E. Lo, J. Dorian, K. Qi, M. T. Costa, and B. G. Martinez, “Lessons on monitoring mangroves,” in UC San Diego: Aburto Lab, UC San Diego, 2019. Available at: https://escholarship.org/uc/item/3bg3206z
- [2] D. Hicks, R. Kastner, C. Schurgers, A. Hsu, and O. Aburto, “Mangrove Ecosystem Detection using Mixed-Resolution Imagery with a Hybrid-Convolutional Neural Network,” in Thirty-fourth Conference on Neural Information Processing Systems Workshop: Tackling Climate Change with Machine Learning, 2020. Available at: https://www.climatechange.ai/papers/neurips2020/23/paper.pdf
Mangrove forests are rich in biodiversity and are a large contributor to carbon sequestration critical in the fight against climate change. However, they are currently under threat from anthropogenic activities, so monitoring their health, extent, and productivity is vital to our ability to protect these important ecosystems. Traditionally, lower-resolution satellite imagery or high-resolution unmanned aerial vehicle (UAV) imagery has been used independently to monitor mangrove extent, each offering helpful features for predicting it. To take advantage of both of these data sources, we propose the use of a hybrid neural network, which combines a Convolutional Neural Network (CNN) feature extractor with a Multilayer Perceptron (MLP), to accurately detect mangrove areas using both medium-resolution satellite and high-resolution drone imagery. We present a comparison of our novel Hybrid CNN with algorithms previously applied to mangrove image classification on a dataset of dwarf mangroves we collected with consumer UAVs in Baja California Sur, Mexico, and show a 95% intersection over union (IOU) score for mangrove image classification, outperforming all our baselines.
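For reference, the intersection-over-union (IOU) score reported above can be computed as follows; the binary masks here are toy values, not the paper’s data:

```python
def iou(mask_a, mask_b):
    # Intersection over union of two binary masks given as flat 0/1 lists:
    # pixels labelled mangrove in both, divided by pixels labelled in either
    inter = sum(a & b for a, b in zip(mask_a, mask_b))
    union = sum(a | b for a, b in zip(mask_a, mask_b))
    return inter / union if union else 1.0

pred  = [1, 1, 1, 0, 0, 0]  # hypothetical predicted mangrove mask
truth = [0, 1, 1, 1, 0, 0]  # hypothetical ground-truth mask
print(iou(pred, truth))  # 2 / 4 = 0.5
```

Unlike per-pixel accuracy, IOU is insensitive to the large background areas in aerial imagery, which is why it is the standard metric for this kind of segmentation task.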
Radio Telemetry Tracking
- [1] N. T. Hui, E. K. Lo, J. B. Moss, G. P. Gerber, M. E. Welch, R. Kastner, and C. Schurgers, “A more precise way to localize animals using drones,” Journal of Field Robotics, 2021. doi: 10.1002/rob.22017.
Radio telemetry is a commonly used technique in conservation biology and ecology, particularly for studying the movement and range of individuals and populations. Traditionally, most radio telemetry work is done using handheld directional antennae together with either direction-finding and homing techniques or radio-triangulation techniques. Over the past couple of decades, efforts have been made to utilize unmanned aerial vehicles to make radio-telemetry tracking more efficient or to cover more area. However, many of these approaches are complex and have not been rigorously field-tested. To provide scientists with reliable, quality tracking data, tracking systems need to be rigorously tested and characterized. In this paper, we present a novel drone-based radio-telemetry tracking method for tracking the broad-scale movement paths of animals over multiple days, along with its implementation and deployment under field conditions. During a two-week field period in the Cayman Islands, we demonstrated this system’s ability to localize multiple targets simultaneously in daily 10-minute tracking sessions, generating more precise estimates than comparable efforts using manual triangulation techniques.
Keywords: aerial robotics, environmental monitoring, exploration, rotorcraft
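As background on the manual radio-triangulation baseline the abstract mentions, the sketch below intersects two compass bearings in a local 2D (east, north) plane; all coordinates and bearings are made up for illustration, and real field triangulation must additionally handle bearing error:

```python
import math

def triangulate(p1, brg1, p2, brg2):
    # Intersect two bearing rays in a local 2D plane (east, north).
    # Bearings are compass degrees (0 = north, 90 = east).
    d1 = (math.sin(math.radians(brg1)), math.cos(math.radians(brg1)))
    d2 = (math.sin(math.radians(brg2)), math.cos(math.radians(brg2)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# An observer at the origin hears the tag at 45° (north-east); a second
# observer 100 m to the east hears it at 315° (north-west).
print(triangulate((0, 0), 45.0, (100, 0), 315.0))  # ≈ (50.0, 50.0)
```

Small bearing errors translate into large position errors when the rays intersect at shallow angles, which is part of why the paper’s drone-based method can outperform manual triangulation.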
Smartfin
- [1] P. Bresnahan, T. Cyronak, R. J. W. Brewin, A. Andersson, T. Wirth, T. Martz, T. Courtney, N. Hui, R. Kastner, A. Stern, T. McGrain, D. Reinicke, J. Richard, K. Hammond, and S. Waters, “A high-tech, low-cost, Internet of Things surfboard fin for coastal citizen science, outreach, and education,” Continental Shelf Research, vol. 242, p. 104748, 2022. doi: 10.1016/j.csr.2022.104748.
Coastal populations and hazards are escalating simultaneously, leading to an increased importance of coastal ocean observations. Many well-established observational techniques are expensive, require complex technical training, and offer little to no public engagement. Smartfin, an oceanographic sensor–equipped surfboard fin and citizen science program, was designed to alleviate these issues. Smartfins are typically used by surfers and paddlers in surf zone and nearshore regions where they can help fill gaps between other observational assets. Smartfin user groups can provide data-rich time-series in confined regions. Smartfin comprises temperature, motion, and wet/dry sensing, GPS location, and cellular data transmission capabilities for the near-real-time monitoring of coastal physics and environmental parameters. Smartfin’s temperature sensor has an accuracy of 0.05 °C relative to a calibrated Sea-Bird temperature sensor. Data products for quantifying ocean physics from the motion sensor and additional sensors for water quality monitoring are in development. Over 300 Smartfins have been distributed around the world and have been in use for up to five years. The technology has been proven to be a useful scientific research tool in the coastal ocean—especially for observing spatiotemporal variability, validating remotely sensed data, and characterizing surface water depth profiles when combined with other tools—and the project has yielded promising results in terms of formal and informal education and community engagement in coastal health issues with broad international reach. In this article, we describe the technology, the citizen science project design, and the results in terms of natural and social science analyses. We also discuss progress toward our outreach, education, and scientific goals.
Keywords: Coastal oceanography, Citizen science, Surfing, Sea surface temperature, Outreach
- [2] N. Hui, “Smartfin Current Efforts.” GitHub, September 2023. Available at: https://github.com/UCSD-E4E/smartfin-docs/blob/master/current_efforts.md