GLORIA

GEOMAR Library Ocean Research Information Access

  • 1
    Publication Date: 2023-11-08
    Description: The data are counts of megafaunal specimens in seabed photographs captured with a Teledyne Gavia autonomous underwater vehicle deployed from the RRS James Cook in May 2019 at a site in the UK sector of the Central North Sea (Connelly, 2019), as part of the Strategies for Environmental Monitoring of Marine Carbon Capture and Storage (STEMM-CCS) project. The seabed photographs were captured using a GRAS-14S5M-C camera with a Tamron TAM 23FM08-L lens mounted on the Gavia autonomous underwater vehicle. The camera captured photographs at a temporal frequency of 1.875 frames per second, a resolution of 1280 x 960 pixels, and a target altitude of 2 m above the seafloor. Overlapping photographs were removed. Megafaunal specimens (>1 cm) in the non-overlapping images were detected using the MAIA machine learning algorithm in BIIGLE. The potential specimens detected with this method were manually reviewed to remove false positives and classified into morphotypes. Counts by morphotype, latitude and longitude (in degrees), camera altitude (m above seafloor), and seabed area (m²) are provided for each photo; a minimal loading sketch is given after this result list. The following additional unchecked raw data are also provided: date, time, AUV mission number, and AUV heading, pitch, and roll. Acknowledgements: We thank the crew and operators of the RRS James Cook and the Gavia autonomous underwater vehicle. The project was funded by the European Union's Horizon 2020 research and innovation programme under grant agreement No. 654462.
    Keywords: Actiniaria indeterminata; Aphrodita aculeata; Area; Asterias rubens; Astropecten irregularis; Autonomous underwater vehicle (Gavia); AUV; Bolocera tuediae; Cancer pagurus; Counting; DATE/TIME; Device type; Dive number; Eledone cirrhosa; Event label; fish; Fish; Heading; HEIGHT above ground; Hippasteria phrygiana; Image number/name; James Cook; JC180; JC180_AUV-5; JC180_AUV-7; JC180_AUV-8; LATITUDE; LONGITUDE; megafauna; Metridium senile; Myxine glutinosa; Nephrops; Nephrops norvegicus; North Sea; Pagurus sp.; Pennatula phosphorea; Pitch angle; Porifera; Resolution; Roll angle; seabed photograph; Spatangoida; STEMM-CCS; Strategies for Environmental Monitoring of Marine Carbon Capture and Storage; Unknown
    Type: Dataset
    Format: text/tab-separated-values, 80342 data points
  • 2
    ISSN: 1546-1718
    Source: Nature Archives 1869 - 2009
    Topics: Biology, Medicine
    Notes: [Excerpt] Analysis of classical mouse mutations has been useful in the identification and study of many genes. We previously mapped Sox18, encoding an SRY-related transcription factor, to distal mouse chromosome 2 (ref. 2). This region contains a known mouse mutation, ragged (Ra), that affects the coat ...
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Springer
    Mammalian Genome 11 (2000), pp. 1147-1149
    ISSN: 1432-1777
    Source: Springer Online Journal Archives 1860-2000
    Topics: Biology, Medicine
    Type of Medium: Electronic Resource
  • 4
    Publication Date: 2020-11-04
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, peerRev
  • 5
    Publication Date: 2021-12-15
    Description: Carbon capture and storage (CCS) is a key technology to reduce carbon dioxide (CO2) emissions from industrial processes in a feasible, substantial, and timely manner. For geological CO2 storage to be safe, reliable, and accepted by society, robust strategies for CO2 leakage detection, quantification and management are crucial. The STEMM-CCS (Strategies for Environmental Monitoring of Marine Carbon Capture and Storage) project aimed to provide techniques and understanding to enable and inform cost-effective monitoring of CCS sites in the marine environment. A controlled CO2 release experiment was carried out in the central North Sea, designed to mimic an unintended emission of CO2 from a subsurface CO2 storage site to the seafloor. A total of 675 kg of CO2 were released into the shallow sediments (~3 m below seafloor), at flow rates between 6 and 143 kg/d. A combination of novel techniques, adapted versions of existing techniques, and well-proven standard techniques were used to detect, characterise and quantify gaseous and dissolved CO2 in the sediments and the overlying seawater. This paper provides an overview of this ambitious field experiment. We describe the preparatory work prior to the release experiment, the experimental layout and procedures, the methods tested, and summarise the main results and the lessons learnt.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, isiRev, info:eu-repo/semantics/article
    Format: application/pdf
  • 6
    Publication Date: 2021-02-08
    Description: Digital imaging has become one of the most important techniques in environmental monitoring and exploration. In the marine environment, mobile platforms such as autonomous underwater vehicles (AUVs) are now equipped with high-resolution cameras to capture huge collections of images from the seabed. However, the timely evaluation of all these images presents a bottleneck, as tens of thousands or more images can be collected during a single dive. This makes computational support for marine image analysis essential. Computer-aided analysis of environmental images (and marine images in particular) with machine learning algorithms is promising, but challenging and different to other imaging domains because training data and class labels cannot be collected as efficiently and comprehensively as in other areas. In this paper, we present Machine learning Assisted Image Annotation (MAIA), a new image annotation method for environmental monitoring and exploration that overcomes the obstacle of missing training data. The method uses a combination of autoencoder networks and a Mask Region-based Convolutional Neural Network (Mask R-CNN), which allows human observers to annotate large image collections much faster than before (a rough sketch of the candidate-detection idea is given after this result list). We evaluated the method with three marine image datasets featuring different types of background, imaging equipment and object classes. Using MAIA, we were able to annotate objects of interest with an average recall of 84.1%, more than twice as fast as with “traditional” annotation methods, which are purely based on software-supported direct visual inspection and manual annotation. The speed gain increases proportionally with the size of a dataset. The MAIA approach represents a substantial improvement on the path to greater efficiency in the annotation of large benthic image collections.
    Type: Article, PeerReviewed
    Format: text
    Format: archive
  • 7
    Publication Date: 2022-01-31
    Description: The evaluation of large amounts of digital image data is of growing importance for biology, including the exploration and monitoring of marine habitats. However, only a tiny percentage of the image data collected is evaluated by marine biologists, who manually interpret and annotate the image contents, which can be slow and laborious. To overcome the bottleneck in image annotation, two strategies are increasingly proposed: “citizen science” and “machine learning”. In this study, we investigated how the combination of citizen science, to detect objects, and machine learning, to classify megafauna, could be used to automate the annotation of underwater images. For this purpose, multiple large data sets of citizen science annotations with different degrees of the common errors and inaccuracies observed in citizen science data were simulated by modifying “gold standard” annotations done by an experienced marine biologist (an illustrative perturbation sketch is given after this result list). The parameters of the simulation were determined on the basis of two citizen science experiments. This allowed us to analyze the relationship between the outcome of a citizen science study and the quality of the classifications of a deep learning megafauna classifier. The results show great potential for combining citizen science with machine learning, provided that the participants are informed precisely about the annotation protocol. Inaccuracies in the position of the annotations had the most substantial influence on the classification accuracy, whereas the size of the markings and false positive detections had a smaller influence.
    Type: Article, PeerReviewed
    Format: text
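
Entry 1 distributes its per-photo counts as tab-separated values. The sketch below shows one way such a table might be loaded and converted to seabed densities; the column names (morphotype, count, seabed_area_m2) and the file name are assumptions for illustration, not the published schema.

```python
# Minimal loading sketch for a per-photo megafauna count table (entry 1).
# Column names are assumed, not taken from the published TSV header.
import pandas as pd

def load_counts(path: str) -> pd.DataFrame:
    """Read the tab-separated count table and add a density column."""
    df = pd.read_csv(path, sep="\t")
    # Hypothetical columns: 'count' (specimens per photo) and 'seabed_area_m2'.
    df["density_per_m2"] = df["count"] / df["seabed_area_m2"]
    return df

if __name__ == "__main__":
    photos = load_counts("megafauna_counts.tsv")
    # Aggregate counts by morphotype across all photographs.
    print(photos.groupby("morphotype")["count"].sum())
```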
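Entry 6 describes MAIA as a combination of autoencoder networks and Mask R-CNN for proposing annotation candidates. The sketch below only illustrates the general idea of the first stage under assumed details: an autoencoder's reconstruction error flags unusual image patches for human review. The architecture, patch size, and threshold are placeholders, not the published configuration.

```python
# Rough sketch of autoencoder-based candidate detection in the spirit of MAIA.
# Patches that the autoencoder reconstructs poorly are proposed as candidate
# objects of interest for human review; all details here are illustrative.
import torch
import torch.nn as nn

class PatchAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2),    # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(16, 3, kernel_size=2, stride=2),     # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def candidate_patches(model, patches, threshold):
    """Return indices of patches whose mean reconstruction error exceeds threshold."""
    with torch.no_grad():
        recon = model(patches)
        errors = ((patches - recon) ** 2).mean(dim=(1, 2, 3))
    return (errors > threshold).nonzero(as_tuple=True)[0]

if __name__ == "__main__":
    model = PatchAutoencoder()
    demo = torch.rand(8, 3, 64, 64)  # stand-in for 64x64 seabed image patches
    print(candidate_patches(model, demo, threshold=0.05))
```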
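Entry 7 simulates citizen-science annotations by perturbing gold-standard annotations with position inaccuracies, marking-size changes, and false-positive detections. The sketch below illustrates that kind of perturbation; pos_sigma, size_sigma, and false_positive_rate are hypothetical values, not the parameters estimated from the two citizen science experiments.

```python
# Illustrative simulation of citizen-science annotation errors: gold-standard
# circle annotations receive position jitter, size scaling, and spurious
# false-positive detections. Parameter values are placeholders.
import random
from dataclasses import dataclass

@dataclass
class Annotation:
    x: float       # centre x in pixels
    y: float       # centre y in pixels
    radius: float  # marking radius in pixels
    label: str

def simulate_citizen_annotations(gold, image_size=(1280, 960),
                                 pos_sigma=10.0, size_sigma=0.2,
                                 false_positive_rate=0.1):
    """Perturb gold-standard annotations to mimic citizen-science output."""
    simulated = []
    for ann in gold:
        simulated.append(Annotation(
            x=ann.x + random.gauss(0, pos_sigma),
            y=ann.y + random.gauss(0, pos_sigma),
            radius=ann.radius * max(0.1, random.gauss(1.0, size_sigma)),
            label=ann.label,
        ))
    # Add spurious detections with no corresponding real object.
    for _ in range(int(len(gold) * false_positive_rate)):
        simulated.append(Annotation(
            x=random.uniform(0, image_size[0]),
            y=random.uniform(0, image_size[1]),
            radius=random.uniform(5, 30),
            label="unknown",
        ))
    return simulated

if __name__ == "__main__":
    gold = [Annotation(640, 480, 20, "Asterias rubens")]
    for ann in simulate_citizen_annotations(gold):
        print(ann)
```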