GLORIA

GEOMAR Library Ocean Research Information Access


  • Hu, Leland S.; Kazerouni, Anum S.
    In: Magnetic Resonance in Medicine, Wiley, Vol. 91, No. 5 (2024-05), p. 1803-1821
    Abstract: has often been proposed as a quantitative imaging biomarker for diagnosis, prognosis, and treatment-response assessment for various tumors, yet none of the many software tools for its quantification are standardized. The ISMRM Open Science Initiative for Perfusion Imaging–Dynamic Contrast-Enhanced (OSIPI-DCE) challenge was designed to benchmark methods and support these standardization efforts.
    Methods: A framework was created to evaluate the values produced by DCE-MRI analysis pipelines, enabling benchmarking. The perfusion MRI community was invited to apply their pipelines for quantification in glioblastoma from clinical and synthetic patients. Submissions were required to include the entrants' values, the applied software, and a standard operating procedure. These were evaluated using the proposed score, defined with accuracy, repeatability, and reproducibility components.
    Results: Across the 10 received submissions, the score ranged from 28% to 78%, with a median of 59%. The accuracy, repeatability, and reproducibility scores ranged from 0.54 to 0.92, 0.64 to 0.86, and 0.65 to 1.00, respectively (0–1 = lowest–highest). Manual arterial input function selection markedly affected reproducibility and showed greater variability in analysis than automated methods. Furthermore, providing a detailed standard operating procedure was critical for higher reproducibility.
    Conclusions: This study reports the results of the OSIPI-DCE challenge and highlights the high inter-software variability within estimation, providing a framework for ongoing benchmarking against the scores presented. Through this challenge, the participating teams were ranked based on the performance of their software tools in the particular setting of this challenge; in a real-world clinical setting, many of these tools may perform differently, as they might under a different benchmarking methodology.
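    The abstract states only that the overall score combines accuracy, repeatability, and reproducibility components, each on a 0–1 scale, into a percentage. The sketch below illustrates one such combination; the equal-weight mean is an assumption for illustration, not the published OSIPI-DCE formula, and the example component values are merely drawn from within the ranges reported above.

    ```python
    def overall_score(accuracy: float, repeatability: float, reproducibility: float) -> float:
        """Combine three 0-1 components into a percentage score.

        Assumption: equal weighting via a simple mean. The actual
        OSIPI-DCE weighting is not given in the abstract.
        """
        for component in (accuracy, repeatability, reproducibility):
            if not 0.0 <= component <= 1.0:
                raise ValueError("each component must lie in [0, 1]")
        return 100.0 * (accuracy + repeatability + reproducibility) / 3.0

    # Hypothetical submissions with components inside the reported ranges
    # (accuracy 0.54-0.92, repeatability 0.64-0.86, reproducibility 0.65-1.00).
    examples = [(0.54, 0.64, 0.65), (0.92, 0.86, 1.00)]
    scores = [overall_score(a, r, p) for a, r, p in examples]
    ```

    Under this assumed weighting, a submission scoring 0.92, 0.86, and 1.00 on the three components would receive roughly 93%, which is consistent in magnitude with the 28%–78% range reported, given that real submissions rarely peak on all components at once.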
    Material type: Online resource
    ISSN: 0740-3194, 1522-2594
    Language: English
    Publisher: Wiley
    Publication date: 2024
    ZDB ID: 1493786-4