Merge branch 'main' into homepage-layout

Sean1572 authored May 23, 2024
2 parents a4a848f + 072be0b commit 71a9545
Showing 36 changed files with 248 additions and 61 deletions.
48 changes: 48 additions & 0 deletions README.md
@@ -160,6 +160,17 @@ Fill in permalink, title, and category with the same values used in the respective…

Your post feed page on the website will be found at `/insert-project-link/project-updates`

#### OPTIONAL: Add onboarding papers
- Add your bib file for your project to `_bibliography/onboarding_papers`
- In `onboarding_papers.md`, add the following to the front matter's paper list:

```
---
- bib_file: name_of_bib_file
  name: Project Name
  url: /insert-project-link
---
```

The `url` MUST match the `permalink` set in your project page's front matter.
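
For example, a hypothetical entry for the `acoustic_species_id.bib` file added in this commit (the display name and link below are placeholders, not the site's actual values):

```
---
- bib_file: acoustic_species_id
  name: Acoustic Species Identification
  url: /acoustic-species-id
---
```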

## Components (Includes)
Jekyll allows components to be embedded in markdown files. See [https://jekyllrb.com/docs/includes/](https://jekyllrb.com/docs/includes/)
@@ -235,3 +246,40 @@ Format:
enable_nav=true or false
%}
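
As an illustrative sketch only — the parameter names below are inferred from `_includes/carousel.html` further down this diff, which reads `carousels.images`, and are not a documented call signature — embedding the front-page carousel might look like:

```
{% include carousel.html carousels=site.data.carousels.front_carousel enable_nav=true %}
```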

## Adding blog posts
Create a new file in `/_posts` with the following name: `{year}-{month}-{day}-{hyphenated-title}.md`. For example: `2024-05-10-e4e-releases-new-jekyll-website.md`.

At the top of the file, add the following:
```
---
date: {year}-{month}-{day} {hour}:{minute}-{timezone offset}:00
layout: blog-post
title: {title}
categories:
- news-and-updates
author: {your name}
featuredImage: {relative path to featured image}
tags:
- {additional tags}
---
```

For example:
```
---
date: 2024-05-07 21:45-07:00
layout: blog-post
title: Ronan Wallace Awarded Fulbright for Floods of Lubra
categories:
- news-and-updates
author: Nathan Hui
featuredImage: assets/floods_of_lubra/fieldwork-nepal.jpg
tags:
- floods-of-lubra
- fulbright
---
```

Add the contents of your blog post after this preamble, using the appropriate components.

Commit this and any included images to a new branch (we recommend naming the branch after the blog post file, as sketched below). Request a review from one of the website admins and enable auto-merge.
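
A minimal command-line sketch of that workflow, reusing the example file name from above (adjust the paths and commit message for your own post):

```
# Create a branch named after the post file
git checkout -b 2024-05-10-e4e-releases-new-jekyll-website
# Stage the post and any images it references
git add _posts/2024-05-10-e4e-releases-new-jekyll-website.md assets/
git commit -m "Add blog post: E4E releases new Jekyll website"
git push -u origin 2024-05-10-e4e-releases-new-jekyll-website
```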
26 changes: 26 additions & 0 deletions _bibliography/onboarding_papers/acoustic_species_id.bib
@@ -0,0 +1,26 @@
@INPROCEEDINGS{ayers_jandali_hwang_et_al_icml_2021,
author = {Ayers, Jacob and Jandali, Yaman and Hwang, Yoo Jin and Steinberg, Gabriel and Joun, Erika and Tobler, Mathias and Ingram, Ian and Kastner, Ryan and Schurgers, Curt},
title = {Challenges in Applying Audio Classification Models to Datasets Containing Crucial Biodiversity Information},
booktitle = {38th International Conference on Machine Learning},
year = {2021},
volume={38},
abstract={The acoustic signature of a natural soundscape can reveal consequences of climate change on biodiversity. Hardware costs, human labor time, and expertise dedicated to labeling audio are impediments to conducting acoustic surveys across a representative portion of an ecosystem. These barriers are quickly eroding away with the advent of low-cost, easy to use, open source hardware and the expansion of the machine learning field providing pre-trained neural networks to test on retrieved acoustic data. One consistent challenge in passive acoustic monitoring (PAM) is a lack of reliability from neural networks on audio recordings collected in the field that contain crucial biodiversity information that otherwise show promising results from publicly available training and test sets. To demonstrate this challenge, we tested a hybrid recurrent neural network (RNN) and convolutional neural network (CNN) binary classifier trained for bird presence/absence on two Peruvian bird audiosets. The RNN achieved an area under the receiver operating characteristics (AUROC) of 95% on a dataset collected from Xeno-canto and Google’s AudioSet ontology in contrast to 65% across a stratified random sample of field recordings collected from the Madre de Dios region of the Peruvian Amazon. In an attempt to alleviate this discrepancy, we applied various audio data augmentation techniques in the network’s training process which led to an AUROC of 77% across the field recordings},
month={July},
url={https://www.climatechange.ai/papers/icml2021/14/paper.pdf},
eprint={https://www.climatechange.ai/papers/icml2021/14}}
@article{kahl_denton_klinck_et_al_CLEF_2023,
title={Overview of BirdCLEF 2023: Automated bird species identification in Eastern Africa},
author={Kahl, Stefan and Denton, Tom and Klinck, Holger and Reers, Hendrik and Cherutich, Francis and Glotin, Herv{\'e} and Go{\"e}au, Herv{\'e} and Vellinga, Willem-Pier and Planqu{\'e}, Robert and Joly, Alexis},
journal={Working Notes of CLEF},
url={https://www.researchgate.net/publication/373603820_Overview_of_BirdCLEF_2023_Automated_Bird_Species_Identification_in_Eastern_Africa_40_International_CC_BY_40},
year={2023}
}
@inproceedings{kahl_clapp_hopping_et_al_CLEF_2020,
title={Overview of BirdCLEF 2020: Bird sound recognition in complex acoustic environments},
author={Kahl, Stefan and Clapp, Mary and Hopping, W Alexander and Go{\"e}au, Herv{\'e} and Glotin, Herv{\'e} and Planqu{\'e}, Robert and Vellinga, Willem-Pier and Joly, Alexis},
booktitle={CLEF 2020-Conference and Labs of the Evaluation Forum},
volume={2696},
number={262},
url={https://ceur-ws.org/Vol-2696/paper_262.pdf},
year={2020}
}
26 changes: 26 additions & 0 deletions _bibliography/onboarding_papers/fishsense.bib
@@ -0,0 +1,26 @@
@INPROCEEDINGS{tueller_maddukuri_paxson_et_al_oceans_2021,
author = {Peter Tueller and Raghav Maddukuri and Patrick Paxson and Vivaswat Suresh and Arjun Ashok and Madison Bland and Ronan Wallace and Julia Guerrero and Brice Semmens and Ryan Kastner},
title = {FishSense: Underwater RGBD Imaging for Fish Measurement and Classification},
booktitle = {OCEANS 2021 MTS/IEEE SAN DIEGO},
year = {2021},
month={September},
publisher={IEEE},
abstract={There is a need for reliable underwater fish monitoring systems that can provide oceanographers and researchers with valuable data about life underwater. Most current methods rely heavily on human observation which is both error prone and costly. FishSense provides a solution that accelerates the use of depth cameras underwater, opening the door to 3D underwater imaging that is fast, accurate, cost effective, and energy efficient. FishSense is a sleek handheld underwater imaging device that captures both depth and color images. This data has been used to calculate the length of fish, which can be used to derive biomass and health. The FishSense platform has been tested through two separate deployments. The first deployment imaged a toy fish of known length and volume within a controlled testing pool. The second deployment was conducted within a 70,000 gallon aquarium tank with multiple species of fish. A Receiver Operating Characteristic (ROC) curve has been computed based on the detector’s performance across all images, and the mean and standard deviation of the length measurements of the detections have been computed.},
url={https://agu.confex.com/agu/OVS21/meetingapp.cgi/Paper/787405}}
@ARTICLE{wong_humphrey_switzer_wuwnet_2022,
author = {Wong, Emily and Humphrey, Isabella and Switzer, Scott and Crutchfield, Christopher and Hui, Nathan and Schurgers, Curt and Kastner, Ryan},
url = {https://doi.org/10.1145/3567600.3568158},
publisher = {Association for Computing Machinery},
title = {Underwater Depth Calibration Using a Commercial Depth Camera},
isbn = {9781450399524},
year = {2022},
address = {New York, NY, USA},
doi = {10.1145/3567600.3568158},
abstract = {Depth cameras are increasingly used in research and industry in underwater settings. However, cameras that have been calibrated in air are notably inaccurate in depth measurements when placed underwater, and little research has been done to explore pre-existing depth calibration methodologies and their effectiveness in underwater environments. We used four methods of calibration on a low-cost, commercial depth camera both in and out of water. For each of these methods, we compared the predicted distance and length of objects from the camera with manually measured values to get an indication of depth and length accuracy. Our findings indicate that the standard methods of calibration in air are largely ineffective for underwater calibration and that custom calibration techniques are necessary to achieve higher accuracy.},
booktitle = {Proceedings of the 16th International Conference on Underwater Networks & Systems},
articleno = {22},
numpages = {5},
keywords = {depth camera calibration, Underwater stereo vision},
location = {Boston, MA, USA},
series = {WUWNet '22}
}
18 changes: 18 additions & 0 deletions _bibliography/onboarding_papers/mangrove_monitoring.bib
@@ -0,0 +1,18 @@
@INCOLLECTION{hsu_lo_dorian_et_al_ucsd_2019,
author = {Astrid J. Hsu and Eric Lo and John Dorian and Katherine Qi and Matthew T. Costa and Benigno Guerrero Martinez},
title = {Lessons on monitoring mangroves},
booktitle = {UC San Diego: Aburto Lab},
publisher = {UC San Diego},
year = {2019},
url={https://escholarship.org/uc/item/3bg3206z}}
@inproceedings{hicks_kastner_schurgers_et_all_neurips_2020,
title={Mangrove Ecosystem Detection using Mixed-Resolution Imagery with a Hybrid-Convolutional Neural Network},
author={Hicks, Dillon and Kastner, Ryan and Schurgers, Curt and Hsu, Astrid and Aburto, Octavio},
year={2020},
booktitle={Thirty-fourth Conference on Neural Information Processing Systems Workshop: Tackling Climate Change with Machine Learning},
url={https://www.climatechange.ai/papers/neurips2020/23/paper.pdf},
abstract={Mangrove forests are rich in biodiversity and are a large contributor to carbon sequestration critical in the fight against climate change. However, they are currently under threat from anthropogenic activities, so monitoring their health, extent, and productivity is vital to our ability to protect these important ecosystems. Traditionally, lower resolution satellite imagery or high resolution unmanned air vehicle (UAV) imagery has been used independently to monitor mangrove extent, both offering helpful features to predict mangrove extent. To take advantage of both of these data sources, we propose the use of a hybrid neural network, which combines a Convolutional Neural Network (CNN) feature extractor with a Multilayer-Perceptron (MLP), to accurately detect mangrove areas using both medium resolution satellite and high resolution drone imagery. We present a comparison of our novel Hybrid CNN with algorithms previously applied to mangrove image classification on a data set we collected of dwarf mangroves from consumer UAVs in Baja California Sur, Mexico, and show a 95% intersection over union (IOU) score for mangrove image classification, outperforming all our baselines}}
10 changes: 10 additions & 0 deletions _bibliography/onboarding_papers/radio_telemetry_tracking.bib
@@ -0,0 +1,10 @@
@article{hui_lo_moss_et_al_jfr_2021,
author = {Hui, Nathan T. and Lo, Eric K. and Moss, Jen B. and Gerber, Glenn P. and Welch, Mark E. and Kastner, Ryan and Schurgers, Curt},
title = {A more precise way to localize animals using drones},
journal = {Journal of Field Robotics},
year = {2021},
keywords = {aerial robotics, environmental monitoring, exploration, rotorcraft},
doi = {https://doi.org/10.1002/rob.22017},
url = {https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.22017},
eprint = {https://onlinelibrary.wiley.com/doi/pdf/10.1002/rob.22017},
abstract = {Radio telemetry is a commonly used technique in conservation biology and ecology, particularly for studying the movement and range of individuals and populations. Traditionally, most radio telemetry work is done using handheld directional antennae and either direction-finding and homing techniques or radio-triangulation techniques. Over the past couple of decades, efforts have been made to utilize unmanned aerial vehicles to make radio-telemetry tracking more efficient, or cover more area. However, many of these approaches are complex and have not been rigorously field-tested. To provide scientists with reliable quality tracking data, tracking systems need to be rigorously tested and characterized. In this paper, we present a novel, drone-based, radio-telemetry tracking method for tracking the broad-scale movement paths of animals over multiple days and its implementation and deployment under field conditions. During a 2-week field period in the Cayman Islands, we demonstrated this system's ability to localize multiple targets simultaneously, in daily 10 min tracking sessions over a period of 2 weeks, generating more precise estimates than comparable efforts using manual triangulation techniques.}}
24 changes: 24 additions & 0 deletions _bibliography/onboarding_papers/smartfin.bib
@@ -0,0 +1,24 @@
@article{bresnehan_cyronak_brewin_et_al_csr_2022,
title = {A high-tech, low-cost, Internet of Things surfboard fin for coastal citizen science, outreach, and education},
journal = {Continental Shelf Research},
volume = {242},
pages = {104748},
year = {2022},
issn = {0278-4343},
doi = {https://doi.org/10.1016/j.csr.2022.104748},
url = {https://www.sciencedirect.com/science/article/pii/S0278434322001029},
author = {Philip Bresnahan and Tyler Cyronak and Robert J.W. Brewin and Andreas Andersson and Taylor Wirth and Todd Martz and Travis Courtney and Nathan Hui and Ryan Kastner and Andrew Stern and Todd McGrain and Danica Reinicke and Jon Richard and Katherine Hammond and Shannon Waters},
keywords = {Coastal oceanography, Citizen science, Surfing, Sea surface temperature, Outreach},
abstract = {Coastal populations and hazards are escalating simultaneously, leading to an increased importance of coastal ocean observations. Many well-established observational techniques are expensive, require complex technical training, and offer little to no public engagement. Smartfin, an oceanographic sensor–equipped surfboard fin and citizen science program, was designed to alleviate these issues. Smartfins are typically used by surfers and paddlers in surf zone and nearshore regions where they can help fill gaps between other observational assets. Smartfin user groups can provide data-rich time-series in confined regions. Smartfin comprises temperature, motion, and wet/dry sensing, GPS location, and cellular data transmission capabilities for the near-real-time monitoring of coastal physics and environmental parameters. Smartfin's temperature sensor has an accuracy of 0.05 °C relative to a calibrated Sea-Bird temperature sensor. Data products for quantifying ocean physics from the motion sensor and additional sensors for water quality monitoring are in development. Over 300 Smartfins have been distributed around the world and have been in use for up to five years. The technology has been proven to be a useful scientific research tool in the coastal ocean—especially for observing spatiotemporal variability, validating remotely sensed data, and characterizing surface water depth profiles when combined with other tools—and the project has yielded promising results in terms of formal and informal education and community engagement in coastal health issues with broad international reach. In this article, we describe the technology, the citizen science project design, and the results in terms of natural and social science analyses. We also discuss progress toward our outreach, education, and scientific goals.}
}

@Misc{current_efforts,
author = {Nathan Hui},
howpublished = {GitHub},
month = sep,
title = {Smartfin Current Efforts},
year = {2023},
url = {https://github.com/UCSD-E4E/smartfin-docs/blob/master/current_efforts.md},
}

@Comment{jabref-meta: databaseType:bibtex;}
11 changes: 11 additions & 0 deletions _bibliography/onboarding_papers/support_group.bib
@@ -0,0 +1,11 @@
@Misc{e4e-hw,
title = {E4E Hardware Group},
url = {https://github.com/UCSD-E4E/e4e-hw},
}

@Misc{e4e_esg,
title = {E4E Engineering Support Group},
url = {https://github.com/UCSD-E4E/engineering_support_group},
}

@Comment{jabref-meta: databaseType:bibtex;}
70 changes: 30 additions & 40 deletions _data/carousels/front_carousel.yml
@@ -1,42 +1,32 @@
 title: front_carousel
 images:
-- image: {
-    "url": /assets/banner_guatemala.jpg,
-    "title": "Mapping a Maya Temple in Guatemala"
-  }
-- image: {
-    "url": /assets/banner_guatemala.jpg,
-    "title": "Launching the Autonomous Airplane at Scripps"
-  }
-- image: {
-    "url": /assets/banner_tiger.jpg,
-    "title": "Intelligent Camera Trap at San Diego Zoo"
-  }
-- image: {
-    "url": /assets/banner_laketahoe.jpg,
-    "title": "In the Field at Lake Tahoe, CA"
-  }
-- image: {
-    "url": /assets/banner_dustin_tunnel.jpg,
-    "title": "Scanning Tunnels in Maya Temples"
-  }
-- image: {
-    "url": /assets/banner_spherecam.jpg,
-    "title": "Testing the SphereCam in La Jolla Cove"
-  }
-- image: {
-    "url": /assets/banner_balloon.jpg,
-    "title": "Aerial Camera Platform and Fallen Star at UCSD"
-  }
-- image: {
-    "url": /assets/banner_barges.png,
-    "title": "Imaging Barges in Emerald Bay, Lake Tahoe"
-  }
-- image: {
-    "url": /assets/kayak_banner.jpg,
-    "title": "Flying Copters Off of Kayaks"
-  }
-- image: {
-    "url": /assets/banner_mayatemple.jpg,
-    "title": "On top of a Maya Temple in Guatemala"
-  }
+- url: assets/fishsense/2023-07-17_nathan_fishsense_lite.jpg
+  title: "Staff Engineer Nathan Hui diving the FishSense Lite Camera"
+  crop: "4000x1754+0+550"
+- url: assets/smartfin/2022-07-13_Robert_OBrien_heads_out_to_ocean_to_test_smartfin.jpg
+  title: "Robert O'Brien heads out to the ocean to test the device"
+  crop: "2800x1228+0+0"
+- url: assets/acoustic_species_id/2021-08-10_expedition_team.jpg
+  title: "Sean Perry, Jacob Ayers, and Mugen Blue at the Scripps Reserve placing Audiomoths"
+  crop: "2736x1200+0+0"
+- url: assets/2024-01-26_e4e_cse_open_house.jpg
+  title: E4E at the 2024 CSE Open House
+  crop: 4080x1789+0+1000
+- url: assets/2023-08-10_sealab_reu_group_photo.jpg
+  title: 2023 REU Students in Sealab
+  crop: 6048x2652+0+650
+- url: assets/fishsense/2023-08-07_anna_jordan_hamish_testing_fishsense_in_pool.jpg
+  title: Anna Perez, Jordan Reichhardt, and Hamish Grant testing the FishSense Lite camera in a pool
+  crop: 6048x2652+0+100
+- url: assets/fishsense/2023-08-07_jordan_holding_fishsense_camera.jpg
+  title: Jordan Reichhardt calibrating a FishSense Lite camera
+  crop: 6048x2652+0+100
+- url: assets/mangrove/2022-03-21_crocodile.jpg
+  title: Crocodiles encountered in Jamaica while flying drones
+  crop: 4080x1789+0+1000
+- url: assets/mangrove/2022-03-17_jamaica_expedition_team.jpg
+  title: Jamaica Mangrove Expedition Team
+  crop: 2866x1257+303+1200
+- url: assets/fishsense/2022-08-18_emily_testing_realsense.jpg
+  title: Emily Wong demonstrating the Intel RealSense camera
+  crop: 6000x2631+0+00
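
Each new-format entry pairs an image path with a caption and a crop. Judging from the ImageMagick-style resize command in `_includes/carousel.html` below, the `crop` value reads as `{width}x{height}+{x_offset}+{y_offset}`, though this commit does not document it. A hypothetical entry (the image path is a placeholder):

```
- url: assets/example/2024-05-23_banner.jpg  # placeholder path
  title: "Caption displayed with the slide"
  crop: "4000x1754+0+550"  # width x height + x-offset + y-offset, applied before the 1140x500 resize
```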
18 changes: 0 additions & 18 deletions _data/carousels/test_carousel.yml

This file was deleted.

5 changes: 3 additions & 2 deletions _includes/carousel.html
@@ -12,9 +12,10 @@
 <ul class="carousel__track">
   {% for item in carousels.images %}
   <div class="carousel__slide_title">
-    <li class="carousel__slide" style="background-image: url('{{ item.image['url'] }}'); | resize: ',webp'"></li>
+    {% capture resize_cmd %}1140x500!,jpg,{% if item['crop'] %},{{ item['crop'] }}{% endif %}{% endcapture %}
+    <li class="carousel__slide" style="background-image: url('{{ item['url'] | resize: resize_cmd }}');"></li>
     <div class="carousel_titles">
-      <p class="carousel_title"> {{item.image['title']}} </p>
+      <p class="carousel_title"> {{item['title']}} </p>
     </div>
   </div>
   {% endfor %}
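
As a worked example of the new `capture` logic: for a slide with `crop: "4000x1754+0+550"`, `resize_cmd` evaluates to the string below — note the doubled comma produced by the unconditional `jpg,` plus the literal `,` inside the `if`, which the `resize` filter presumably tolerates (an inference; the plugin itself is not shown in this diff):

```
1140x500!,jpg,,4000x1754+0+550
```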
