Data from: RapidBenthos – Automated segmentation and multi-view classification of coral reef communities from photogrammetric reconstruction

James Cook University
Remmers, Tiny; Boutros, Nader; Wyatt, Mathew

Licence & Rights:

Non-Commercial Licence
CC-BY-NC-SA

CC BY-NC-SA 4.0: Attribution-Noncommercial-Share Alike 4.0 International
http://creativecommons.org/licenses/by-nc-sa/4.0

Access:

Open: free access under license


Full description

Background (extract from related publication): Underwater photogrammetry is routinely used to monitor large areas of complex and heterogeneous ecosystems, such as coral reefs.  However, deriving data on benthic constituents from photogrammetry products has remained challenging due to the highly time-consuming process of manual extraction. We developed a machine learning approach to quantify benthic community composition in coral reefs from orthomosaics, which requires no manual delineation of benthic constituents for training or implementation.

Methods: The RapidBenthos workflow was applied to 12 plots, segmenting and classifying benthic constituents on the orthomosaics. These sites were selected to assess the performance of RapidBenthos in different environmental conditions and reef habitats: they ranged from high-visibility offshore reefs to turbid inshore reefs, spanned depths from 5 to 15 meters, and included a range of intra-reefal environments (i.e., reef front, flank, back, and lagoon). The method extracted community composition and colony-level metrics (i.e., colony planar area and colony frequency).
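
The segment-level output schema is not documented in this record, but as an illustration of how community composition and colony-level metrics of the kind described above could be summarised, the hedged Python sketch below computes percent cover, colony frequency, and mean planar area per class from a hypothetical table of classified segments (the column names "plot_id", "label", and "planar_area_m2" are assumptions, not the repository's actual format):

    # Minimal sketch, assuming a hypothetical segment table; not the repository's
    # actual output schema. Computes per-class percent cover, colony frequency,
    # and mean colony planar area for one plot.
    import pandas as pd

    def summarise_plot(segments: pd.DataFrame) -> pd.DataFrame:
        """Per-class community composition and colony-level metrics."""
        total_area = segments["planar_area_m2"].sum()
        summary = (
            segments.groupby("label")
            .agg(
                colony_frequency=("planar_area_m2", "size"),
                total_planar_area_m2=("planar_area_m2", "sum"),
                mean_planar_area_m2=("planar_area_m2", "mean"),
            )
            .reset_index()
        )
        summary["percent_cover"] = 100 * summary["total_planar_area_m2"] / total_area
        return summary

    # Toy example for a single plot:
    segments = pd.DataFrame({
        "plot_id": ["P01"] * 4,
        "label": ["Acropora", "Acropora", "Porites", "Sand"],
        "planar_area_m2": [0.12, 0.08, 0.30, 1.50],
    })
    print(summarise_plot(segments))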

Research significance: The significance of this research lies in developing a workflow that automatically extracts community composition information from close-range photogrammetry in any coral reef environment. We found that our method was 195 times faster than manual segmentation and classification, allowing 3D photogrammetry monitoring to be scaled sustainably, in both the replication and the size of reefs surveyed, compared to manual data extraction.

Data record: The data available in this repository are the RapidBenthos Python code, outputs, and validation data created for the publication titled “RapidBenthos – Automated segmentation and multi-view classification of coral reef communities from photogrammetric reconstruction”. All 3D photogrammetric models and underlying images needed to replicate the method can be accessed on request (AIMS metadata: https://apps.aims.gov.au/metadata/edit/5905b3eb-aad0-4f9c-a03e-a02fb3488082).

Software/equipment used to create/collect the data: All sites were imaged using a standardized diver-rig photogrammetry workflow described by Gordon et al. (2023). High-resolution benthic images (5686 x 3217 pixels) were captured using two Nikon D850 DSLR cameras with 20 mm Nikkor prime lenses shooting at 0.5-second intervals (full camera settings are described in Gordon et al., 2023). Cameras were housed in Nauticam underwater housings with 8-inch dome ports and were mounted on an aluminum rig with a distance of 57 cm between lenses (60% overlap between adjacent images, Figure 1c). Each site was imaged by a single diver on SCUBA over a period of 10-15 minutes to capture approximately 3,000 photos. Nadiral and oblique imagery was captured at an altitude of approximately 1.5 m using a “lawn-mowing” swim pattern consisting of 5 longitudinal passes and an additional 4-8 perpendicular passes. The swim pattern and speed used ensured a minimum overlap of 80% and 60% between temporally and spatially adjacent photos, respectively (Figure 1d). Six GPCs were distributed across the depth gradient of each site prior to imaging to scale the resultant models in the X, Y, and Z axes (details provided in Gordon et al., 2023). Depth was also recorded for each GPC to incorporate bathymetric information into 3D model building.
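
As a rough orientation to the survey geometry described above, the sketch below applies a simple in-air pinhole model to estimate the nadiral image footprint and the overlap implied by the 57 cm lens baseline and the 0.5 s shooting interval. The sensor width (~35.9 mm for a full-frame D850), the example swim speed, and the neglect of dome-port refraction and oblique views are all assumptions, so the numbers are indicative only and are not the survey's measured overlaps:

    # Back-of-envelope pinhole geometry for the survey design above. Sensor width,
    # swim speed, and the in-air pinhole model are assumptions; dome-port refraction
    # and oblique imagery are ignored, so results are indicative only.

    def footprint_width_m(sensor_width_mm: float, focal_length_mm: float, altitude_m: float) -> float:
        """Across-track ground footprint of a nadiral image (simple pinhole model)."""
        return sensor_width_mm / focal_length_mm * altitude_m

    def overlap_fraction(footprint_m: float, baseline_m: float) -> float:
        """Shared-footprint fraction between two images separated by baseline_m."""
        return max(0.0, 1.0 - baseline_m / footprint_m)

    footprint = footprint_width_m(sensor_width_mm=35.9, focal_length_mm=20.0, altitude_m=1.5)
    stereo = overlap_fraction(footprint, baseline_m=0.57)          # twin lenses mounted 57 cm apart
    temporal = overlap_fraction(footprint, baseline_m=0.3 * 0.5)   # assumed 0.3 m/s swim, 0.5 s interval
    print(f"footprint ~ {footprint:.2f} m, stereo overlap ~ {stereo:.0%}, temporal overlap ~ {temporal:.0%}")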

Software/equipment used to manipulate/analyse the data: The computational processing conducted in this study used Python 3.9.16 on a high-performance computer with an NVIDIA A100 PCIe 40 GB GPU node.
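
The record does not list the full dependency stack; as one possible sanity check before running the code on a GPU node such as the A100 described above, the following sketch (using PyTorch purely as an example way to query CUDA, not as a confirmed dependency) verifies the interpreter version and device visibility:

    # Quick environment check before running the pipeline. PyTorch is used here
    # only as one common way to query CUDA; it is not stated to be part of the
    # repository's dependency stack.
    import sys
    import torch

    assert sys.version_info[:2] == (3, 9), f"Expected Python 3.9.x, got {sys.version}"
    if torch.cuda.is_available():
        print("CUDA device:", torch.cuda.get_device_name(0))
    else:
        print("No CUDA device visible; processing would fall back to CPU.")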

Created: 2024-05-17

Data time period: October 2021 to 31 December 2022

This dataset is part of a larger collection


Spatial coverage (WGS84 points):

dcmiPoint: east=142.954573; north=-9.998606; projection=WGS84

dcmiPoint: east=143.404986; north=-9.753724; projection=WGS84

dcmiPoint: east=143.40224; north=-9.863334; projection=WGS84

dcmiPoint: east=145.446662; north=-14.698018; projection=WGS84

dcmiPoint: east=146.223601; north=-16.876504; projection=WGS84

dcmiPoint: east=146.506937; north=-18.553509; projection=WGS84

dcmiPoint: east=146.504878; north=-18.614363; projection=WGS84

dcmiPoint: east=147.653386; north=-18.829129; projection=WGS84

dcmiPoint: east=150.971005; north=-23.199935; projection=WGS84

dcmiPoint: east=151.926038; north=-23.45679; projection=WGS84

dcmiPoint: east=152.404431; north=-23.893999; projection=WGS84

text: Great Barrier Reef, Australia

text: Torres Strait Islands
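
For orientation only, the following sketch derives a rough bounding box from the WGS84 coverage points listed above (longitude = "east", latitude = "north"); it is a convenience illustration, not part of the dataset:

    # Convenience sketch only: rough bounding box of the WGS84 coverage points
    # listed above (longitude = "east", latitude = "north").
    points = [
        (142.954573, -9.998606), (143.404986, -9.753724), (143.40224, -9.863334),
        (145.446662, -14.698018), (146.223601, -16.876504), (146.506937, -18.553509),
        (146.504878, -18.614363), (147.653386, -18.829129), (150.971005, -23.199935),
        (151.926038, -23.45679), (152.404431, -23.893999),
    ]
    lons, lats = zip(*points)
    print(f"Longitude {min(lons)} to {max(lons)} E, latitude {min(lats)} to {max(lats)} N")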

Identifiers
  • DOI : 10.25903/HK08-QX50
  • Local : researchdata.jcu.edu.au//published/ce168ce0118d11ef9cdb556d1c39e52c