GLORIA

GEOMAR Library Ocean Research Information Access

Publisher
  • PANGAEA (2)
  • PERGAMON-ELSEVIER SCIENCE LTD (1)
  • 1
    Publication Date: 2023-11-08
    Description: The data are counts of megafaunal specimens in seabed photographs captured with a Teledyne Gavia autonomous underwater vehicle deployed from the RRS James Cook in May 2019 at a site in the UK sector of the Central North Sea (Connelly, 2019), as part of the Strategies for Environmental Monitoring of Marine Carbon Capture and Storage (STEMM-CCS) project. The seabed photographs were captured using a GRAS-14S5M-C camera with a Tamron TAM 23FM08-L lens mounted on the Gavia autonomous underwater vehicle. The camera captured photographs at a frequency of 1.875 frames per second, a resolution of 1280 × 960 pixels, and at a target altitude of 2 m above the seafloor. Overlapping photos were removed. Megafaunal specimens (>1 cm) in the non-overlapping images were detected using the MAIA machine learning algorithm in BIIGLE. The potential specimens detected using this method were manually reviewed to remove false positives and classified into morphotypes. Counts by morphotype, latitude and longitude (in degrees), camera altitude (m above the seafloor), and seabed area (m²) are provided for each photo. The following additional unchecked raw data are also provided: date, time, AUV mission number, and AUV heading, pitch, and roll. Acknowledgements: We thank the crew and operators of the RRS James Cook and the Gavia autonomous underwater vehicle. The project was funded by the European Union's Horizon 2020 research and innovation programme under grant agreement No. 654462.
    Keywords: Actiniaria indeterminata; Aphrodita aculeata; Area; Asterias rubens; Astropecten irregularis; Autonomous underwater vehicle (Gavia); AUV; Bolocera tuediae; Cancer pagurus; Counting; DATE/TIME; Device type; Dive number; Eledone cirrhosa; Event label; fish; Fish; Heading; HEIGHT above ground; Hippasteria phrygiana; Image number/name; James Cook; JC180; JC180_AUV-5; JC180_AUV-7; JC180_AUV-8; LATITUDE; LONGITUDE; megafauna; Metridium senile; Myxine glutinosa; Nephrops; Nephrops norvegicus; North Sea; Pagurus sp.; Pennatula phosphorea; Pitch angle; Porifera; Resolution; Roll angle; seabed photograph; Spatangoida; STEMM-CCS; Strategies for Environmental Monitoring of Marine Carbon Capture and Storage; Unknown
    Type: Dataset
    Format: text/tab-separated-values, 80342 data points
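The record above reports a seabed area (m²) for each photo alongside the camera altitude. For a downward-facing camera over a flat seafloor, that footprint follows from altitude and the camera's field of view; a minimal sketch, with placeholder FOV angles (the actual GRAS-14S5M-C/Tamron FOV is not stated in the record):

```python
import math

def seabed_area(altitude_m, fov_x_deg=42.0, fov_y_deg=32.0):
    """Approximate seabed footprint (m^2) of a downward-facing camera,
    assuming a flat seafloor and a rectangular field of view.
    The default FOV angles are illustrative placeholders, not the
    published specification of this camera/lens pair."""
    width = 2.0 * altitude_m * math.tan(math.radians(fov_x_deg) / 2.0)
    height = 2.0 * altitude_m * math.tan(math.radians(fov_y_deg) / 2.0)
    return width * height
```

Because both footprint dimensions scale linearly with altitude, the imaged area grows with the square of altitude, which is why the AUV held a fixed 2 m target altitude.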
  • 2
    Publication Date: 2024-02-27
    Description: The data consist of hourly observations of sedimentation impacts approximately 31 m southwest of the drilling location, including measurements of proxies of suspended material in the water column, along with observations of the lamellate desmosponge specimen. Acoustic backscatter (1.9-2.0 MHz) and current speed were measured using a Seaguard RCM DW. A time-lapse camera was also deployed: the Nikon E995 camera was set to f/6.0, ISO 200, exposure 1/60 s, with photos of 2048 × 1536 pixels. As another estimate of suspended material in the water column, brightness (as mean RGB) was calculated for the top corners (256 × 256 pixels) of photos in which the corners were not obscured by fish. Settlement of sediment on the sponge specimen was estimated as the brightness of a portion of it (approximately 3600 pixels²) in the images. Movement of the sponge was estimated as the distance between successive xy-positions of the apex of the sponge in the images. Mean values (6- and 12-hourly) centred on the hourly data, and sums of distance over 6- and 12-h periods, were also calculated.
    Keywords: Backscatter; CLASS; Climate Linked Atlantic Sector Science; current meter; Current meter, SeaGuard; Current speed; Current speed as east vector; Current speed as north vector; DATE/TIME; Digital camera, Nikon, E995; Echo backscatter; Experiment duration; iAtlantic; Image brightness, RGB mean value; Image brightness, sponge, RGB mean value; Integrated Assessment of Atlantic Marine Ecosystems in Space and Time; LATITUDE; LONGITUDE; Movement distance, sponge, 2D; North_Atlantic_Hydrocarbon_Drilling; North Atlantic; offshore drilling; Scientific and Environmental ROV Partnership using Existing iNdustrial Technology; SERPENT; sponge; time-lapse photography; Underwater Photography
    Type: Dataset
    Format: text/tab-separated-values, 9829 data points
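The two image-derived proxies described above (corner brightness as mean RGB, and sponge movement as the distance between successive apex positions) are straightforward to reproduce. A minimal sketch in NumPy, assuming an H × W × 3 RGB array and apex positions in pixel coordinates (the function names are illustrative, not from the dataset):

```python
import numpy as np

def corner_brightness(img, size=256):
    """Mean RGB brightness of the two top corners of an RGB image
    (H x W x 3 array), used as a proxy for suspended material."""
    top_left = img[:size, :size]
    top_right = img[:size, -size:]
    return float(np.concatenate([top_left, top_right]).mean())

def movement(positions):
    """Total 2D movement: summed Euclidean distance between
    successive (x, y) apex positions, in pixels."""
    p = np.asarray(positions, dtype=float)
    steps = np.linalg.norm(np.diff(p, axis=0), axis=1)
    return float(steps.sum())
```

For the 6- and 12-hourly summaries, centred rolling means of the brightness series and rolling sums of the step distances would reproduce the aggregation described.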
  • 3
    Publication Date: 2017-06-01
    Description: Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation, the process of transposing objects or events represented in a video or still image to the semantic level, may involve human interactions and computer-assisted solutions. Marine image annotation software (MIAS) has enabled over 500 publications to date. We review the functioning, application trends and developments of such software by comparing general and advanced features of 23 different tools used in underwater image analysis. MIAS requiring human input are essentially graphical user interfaces with a video player or image browser that recognizes a specific time code or image code, allowing the user to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software in their capability to integrate data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS have three main characteristics: annotating events in real time, annotating after acquisition, and interacting with a database. The tools range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow the input and display of data from multiple sensors or multiple annotators via intranet or internet. Posterior human-mediated annotation often includes tools for data display and image analysis (e.g. length, area, image segmentation, point count) and, in a few cases, the possibility of browsing and editing previous dive logs or analyzing annotation data. Interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data.
Progress in the field of automated annotation is mostly in post-processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated processes of pixel recognition through computer-vision modules that compile expert-based knowledge. Important topics aiding the choice of a specific software are outlined, the ideal software is discussed, and future trends are presented.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article
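The core record the reviewed tools share is a time-stamped (and/or geo-referenced) event logged against a video time code or image code, which a database-backed tool can then query and merge across surveys. A minimal sketch of such a record and a time-window query, not any specific tool's schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Annotation:
    """One time-stamped, optionally geo-referenced event: the core
    record that MIAS-style tools log against an image or time code."""
    media_id: str            # image code or video time-code source
    timestamp: datetime
    label: str               # e.g. a morphotype name
    lat: Optional[float] = None
    lon: Optional[float] = None

def events_between(log, start, end):
    """Query a log the way a database-backed tool would:
    all events whose timestamp falls within [start, end)."""
    return [a for a in log if start <= a.timestamp < end]
```

Because every record carries its own media reference and timestamp, annotations from different surveys, repeated annotations, and shared datasets can be merged into one log and queried uniformly, which is the database interaction the review highlights.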