GLORIA

GEOMAR Library Ocean Research Information Access


Export
  • 1
    In: European Journal of Clinical Investigation, Wiley, Vol. 51, No. 5 (2021-05)
    Abstract: Early diagnosis of cardiac amyloidosis (CA) is warranted to initiate specific treatment and improve outcome. The amyloid light chain (AL) and inferior wall thickness (IWT) scores have been proposed to assess patients referred by haematologists or with unexplained left ventricular (LV) hypertrophy, respectively. These scores are composed of 4 or 5 variables, respectively, including strain data. Methods: Based on 2 variables common to the AL and IWT scores, we defined a simple score named AMYLoidosis Index (AMYLI) as the product of relative wall thickness (RWT) and E/e′ ratio, and assessed its diagnostic performance. Results: In the original cohort (n = 251), CA was ultimately diagnosed in 111 patients (44%). The 2.22 value was selected as rule-out cut-off (negative likelihood ratio [LR−] 0.0). In the haematology subset, AL CA was diagnosed in 32 patients (48%), with 2.36 as rule-out cut-off (LR− 0.0). In the hypertrophy subset, ATTR CA was diagnosed in 79 patients (43%), with 2.22 as the best rule-out cut-off (LR− 0.0). In the validation cohort (n = 691), the same cut-offs proved effective: there were no patients with CA in the whole population, the haematology subset or the hypertrophy subset scoring < 2.22, < 2.36 or < 2.22, respectively. Conclusions: The AMYLI score (RWT*E/e′) may have a role as an initial screening tool for CA. A value < 2.22 excludes the diagnosis in patients undergoing diagnostic screening for CA, while values < 2.36 and < 2.22 may be better considered in the subsets with suspected cardiac AL amyloidosis or unexplained hypertrophy, respectively.
    Type of Medium: Online Resource
    ISSN: 0014-2972, 1365-2362
    Language: English
    Publisher: Wiley
    Publication Date: 2021
    ZDB ID: 2004971-7
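The abstract reduces the AMYLI score to a single product (RWT × E/e′) with setting-specific rule-out cut-offs, which is straightforward to express in code. Below is a minimal sketch of that rule-out logic, assuming only the cut-offs reported above; the function and dictionary names are illustrative, not taken from the paper.

```python
# Minimal sketch of the AMYLI rule-out logic described in the abstract.
# Cut-offs are those reported above; names are illustrative.

CUTOFFS = {
    "screening": 2.22,    # general diagnostic screening for CA
    "haematology": 2.36,  # suspected cardiac AL amyloidosis
    "hypertrophy": 2.22,  # unexplained LV hypertrophy
}

def amyli_score(rwt: float, e_over_e_prime: float) -> float:
    """AMYLI = relative wall thickness (RWT) x mitral E/e' ratio."""
    return rwt * e_over_e_prime

def rules_out_ca(rwt: float, e_over_e_prime: float, setting: str = "screening") -> bool:
    """True if the score falls below the setting-specific rule-out cut-off."""
    return amyli_score(rwt, e_over_e_prime) < CUTOFFS[setting]

# Example: RWT 0.38 and E/e' 5.2 give a score of 1.976, below 2.22,
# so CA would be ruled out at the screening stage.
print(rules_out_ca(0.38, 5.2))  # True
```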
  • 2
    In: ESC Heart Failure, Wiley, Vol. 8, No. 4 (2021-08), p. 3014-3025
    Abstract: Reverse remodelling (RR) is the recovery from left ventricular (LV) dilatation and dysfunction. Many arbitrary criteria for RR have been proposed. We searched for the criteria with the strongest prognostic yield for the hard endpoint of cardiovascular death. Methods and results: We performed a systematic literature search of diagnostic criteria for RR. We evaluated their prognostic significance in a cohort of 927 patients with LV ejection fraction (LVEF) < 50% undergoing two echocardiograms within 12 ± 2 months. These patients were followed for a median of 2.8 years (interquartile interval 1.3–4.9) after the second echocardiogram, recording 123 cardiovascular deaths. Two prognostic models were defined. Model 1 included age, LVEF, N-terminal pro-B-type natriuretic peptide, ischaemic aetiology, cardiac resynchronization therapy, estimated glomerular filtration rate, New York Heart Association class, and LV end-systolic volume (LVESV) index; Model 2 was the validated Cardiac and Comorbid Conditions Heart Failure score. We identified 25 criteria for RR, the most used being LVESV reduction ≥ 15% (12 studies out of 42). In the whole cohort, two criteria proved particularly effective in risk reclassification over Model 1 and Model 2: (i) LVEF increase > 10 units and (ii) LVEF increase ≥ 1 category [severe (LVEF ≤ 30%), moderate (LVEF 31-40%), mild LV dysfunction (LVEF 41-55%), and normal LV function (LVEF ≥ 56%)]. The same two criteria yielded independent prognostic significance and improved risk reclassification even in patients with more severe systolic dysfunction, namely those with LVEF < 40% or LVEF ≤ 35%. Furthermore, LVEF increase > 10 units and LVEF increase ≥ 1 category displayed a greater prognostic value than LVESV reduction ≥ 15%, both in the whole cohort and in the subgroups with LVEF < 40% or LVEF ≤ 35%. For example, LVEF increase > 10 units independently predicted cardiovascular death over Model 1 and LVESV reduction ≥ 15% (hazard ratio 0.40, 95% confidence interval 0.18-0.90, P = 0.026), while LVESV reduction ≥ 15% did not independently predict cardiovascular death (P = 0.112). Conclusions: LVEF increase > 10 units and LVEF increase ≥ 1 category are stronger predictors of cardiovascular death than the most commonly used criterion for RR, namely LVESV reduction ≥ 15%.
    Type of Medium: Online Resource
    ISSN: 2055-5822
    Language: English
    Publisher: Wiley
    Publication Date: 2021
    ZDB ID: 2814355-3
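The two criteria with the strongest prognostic yield, and the LVEF category ladder they rely on, amount to simple threshold checks. The sketch below encodes them using the category boundaries given in the abstract, together with the most-used literature criterion (LVESV reduction ≥ 15%) for comparison; function names and example values are illustrative, not the authors' code.

```python
# Hedged sketch of the reverse-remodelling criteria discussed in the abstract.
# Category boundaries follow the abstract; names are illustrative.

def lvef_category(lvef: float) -> int:
    """0 = severe (<=30%), 1 = moderate (31-40%), 2 = mild (41-55%), 3 = normal (>=56%)."""
    if lvef <= 30:
        return 0
    if lvef <= 40:
        return 1
    if lvef <= 55:
        return 2
    return 3

def rr_lvef_gain(before: float, after: float) -> bool:
    """Criterion (i): LVEF increase > 10 percentage points."""
    return (after - before) > 10

def rr_category_gain(before: float, after: float) -> bool:
    """Criterion (ii): improvement by at least one LVEF category."""
    return lvef_category(after) - lvef_category(before) >= 1

def rr_lvesv_reduction(before: float, after: float) -> bool:
    """Most-used literature criterion: LVESV reduction >= 15%."""
    return (before - after) / before >= 0.15

# Example: LVEF 28% -> 42% meets both criteria (gain of 14 points, severe -> mild).
print(rr_lvef_gain(28, 42), rr_category_gain(28, 42))  # True True
```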
  • 3
    In: Quality and Reliability Engineering International, Wiley, Vol. 31, No. 7 (2015-11), p. 1161-1175
    Abstract: Engineers often face the problem of assessing the lifetime of industrial components on the basis of observed industrial feedback data. Usually, lifetime is modelled as a continuous random variable, for instance exponentially or Weibull distributed. In some cases, however, the features of the equipment under investigation suggest the use of discrete probabilistic models instead. This is the case for equipment that only operates on cycles or on demand: lifetime is then measured in the number of cycles or solicitations before failure, so in theory discrete models should be more appropriate. This article aims to clarify the practical interest, for the reliability engineer, of two of the most popular discrete models: the inverse Pólya distribution (IPD), based on a Pólya urn scheme, and the so-called Weibull-1 model. It is shown that, for different reasons, the practical use of both models should be restricted to specific industrial situations. In particular, when nothing is known a priori about the nature of ageing and/or data are heavily right censored, they can remain of limited interest with respect to more flexible continuous lifetime models such as the usual (continuous) Weibull distribution. Nonetheless, the intuitive meaning of the IPD could favour its use by engineers in low (decelerated) ageing situations. Copyright © 2015 John Wiley & Sons, Ltd.
    Type of Medium: Online Resource
    ISSN: 0748-8017, 1099-1638
    Language: English
    Publisher: Wiley
    Publication Date: 2015
    ZDB ID: 2021089-9
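The "Weibull-1" model named above is often identified with the type-I discrete Weibull of Nakagawa and Osaki (1975); treating it as such is an assumption here, not a claim about the paper's exact parameterisation. The sketch below shows how this discrete model assigns per-cycle survival and failure probabilities, with the shape parameter playing the same ageing role as in the continuous Weibull.

```python
# Hedged sketch: type-I discrete Weibull (Nakagawa & Osaki, 1975), a common
# reading of the "Weibull-1" model; survival P(X > k) = q**(k**beta) for
# cycles k = 0, 1, 2, ... with 0 < q < 1 and beta > 0.

def surv(k: int, q: float, beta: float) -> float:
    """Probability of surviving more than k cycles or demands."""
    return q ** (k ** beta)

def pmf(k: int, q: float, beta: float) -> float:
    """Probability of failing exactly on cycle k (k >= 1)."""
    return surv(k - 1, q, beta) - surv(k, q, beta)

# beta < 1: decelerated ageing; beta = 1: geometric (memoryless) case;
# beta > 1: accelerated ageing -- the discrete analogue of Weibull shapes.
q, beta = 0.95, 1.3
print(sum(pmf(k, q, beta) for k in range(1, 1000)))  # ~1.0 (sanity check)
```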
  • 4
    In: ESC Heart Failure, Wiley, Vol. 8, No. 2 (2021-04), p. 1216-1229
    Abstract: This study aimed to evaluate a novel echocardiographic algorithm for quantitative estimation of pulmonary artery wedge pressure (PAWP) and pulmonary vascular resistance (PVR) in patients with heart failure and pulmonary hypertension (PH) scheduled for right heart catheterization (RHC). Methods and results: In this monocentric study, 795 consecutive patients (427 men; age 68.4 ± 12.1 years) undergoing echocardiography and RHC were evaluated. Multiple regression analysis was performed to identify echocardiographic predictors of PAWP and PVR measured by RHC in the derivation group (the first 200 patients). The diagnostic accuracy of the model was then tested in the validation group (the remaining 595 patients). PH was confirmed by RHC in 507 (63.8%) patients, with 192 (24.2%) cases of precapillary PH, 248 (31.2%) of postcapillary PH, and 67 (8.4%) of combined PH. At regression analysis, tricuspid regurgitation maximal velocity, mitral E/e′ ratio, left ventricular ejection fraction, right ventricular fractional area change, inferior vena cava diameter, and left atrial volume index were included in the model (R = 0.8, P < 0.001). The model showed high diagnostic accuracy in estimating elevated PAWP (area under the receiver operating characteristic curve = 0.97, 92% sensitivity, 93% specificity, P < 0.001) and PVR (area under the receiver operating characteristic curve = 0.96, 89% sensitivity, 92% specificity, P < 0.001), outperforming the 2016 American Society of Echocardiography/European Association of Cardiovascular Imaging recommendations (P < 0.001) and Abbas' equation (P < 0.001). Bland–Altman analysis showed satisfactory limits of agreement between echocardiography and RHC for PAWP (bias 0.7, 95% confidence interval −7.3 to 8.7) and PVR (bias −0.1, 95% confidence interval −2.2 to 1.9 Wood units), without indeterminate cases. Conclusions: A novel quantitative echocardiographic approach for the estimation of PAWP and PVR has high diagnostic accuracy in patients with heart failure and PH.
    Type of Medium: Online Resource
    ISSN: 2055-5822
    Language: English
    Publisher: Wiley
    Publication Date: 2021
    ZDB ID: 2814355-3
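The abstract specifies the six echo predictors but not the fitted coefficients, so the sketch below only illustrates the general recipe: an ordinary least-squares multiple regression of invasive PAWP on those predictors, run here on synthetic placeholder data.

```python
# Sketch of the kind of multivariable model the abstract describes: PAWP
# regressed on six echo predictors (PVR would be handled analogously).
# All data below are synthetic placeholders; the paper's coefficients are
# not given in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # size of the study's derivation group
# Columns: TR max velocity (m/s), mitral E/e', LVEF (%), RV fractional area
# change (%), IVC diameter (mm), LA volume index (mL/m^2).
X = rng.normal(loc=[2.8, 12, 45, 35, 18, 40],
               scale=[0.5, 4, 12, 8, 4, 12], size=(n, 6))
true_beta = np.array([3.0, 0.6, -0.05, -0.08, 0.2, 0.1])  # invented weights
y = 2 + X @ true_beta + rng.normal(0, 2, n)  # synthetic invasive PAWP (mmHg)

# Ordinary least squares with an intercept, as in standard multiple regression
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pawp_hat = A @ coef
print("multiple R:", np.corrcoef(y, pawp_hat)[0, 1])  # the paper reports R = 0.8
```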
  • 5
    In: Quality and Reliability Engineering International, Wiley, Vol. 30, No. 7 (2014-11), p. 921-933
    Abstract: Seismic hazard curves provide the rate (or probability) of exceedance of different levels of a ground motion parameter (e.g., the peak ground acceleration, PGA) at a given geographical point and for a given time frame. To evaluate seismic hazard curves, one therefore needs an occurrence model of earthquakes and a law describing the attenuation of the ground motion with distance. Generally, the input data needed to define the occurrence model consist of magnitude values, either experimentally observed or, in the case of ancient earthquakes, indirectly inferred from historically recorded damage. In this paper, we sketch a full Bayesian methodology for estimating the parameters characterizing the seismic activity in pre-determined seismotectonic zones, given such a catalogue of recorded magnitudes. The statistical model, following the peaks-over-threshold formalism, consists of the distribution of the annual number of earthquakes exceeding a given magnitude, coupled with the probability density of the magnitudes, given that they exceed the threshold. As an example of the possible applications of the proposed methodology, the PGA is then evaluated at several sites of interest, while accounting for the uncertainty affecting the parameters of the magnitude distribution in several seismotectonic zones and the attenuation law. Finally, some perspectives are sketched. Copyright © 2014 John Wiley & Sons, Ltd.
    Type of Medium: Online Resource
    ISSN: 0748-8017, 1099-1638
    Language: English
    Publisher: Wiley
    Publication Date: 2014
    ZDB ID: 2021089-9
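The peaks-over-threshold model described above combines a Poisson occurrence rate, a magnitude density above the threshold, and an attenuation law. The sketch below assembles a toy hazard curve from those three ingredients; the attenuation form and every number are invented placeholders.

```python
# Toy peaks-over-threshold hazard computation in the spirit of the abstract:
# exceedances of a magnitude threshold arrive as a Poisson process, magnitudes
# above the threshold are exponentially distributed (Gutenberg-Richter-like),
# and an attenuation law maps magnitude and distance to PGA at the site.
import numpy as np

rng = np.random.default_rng(1)
lam = 0.5            # events per year above the magnitude threshold m0
m0, beta = 4.0, 1.8  # threshold and exponential rate for (m - m0)
dist_km = 30.0       # site-to-source distance (placeholder)

def pga(m, r, eps):
    """Toy attenuation law: log-PGA linear in magnitude, decaying with distance."""
    return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r) + eps)

mags = m0 + rng.exponential(1 / beta, 100_000)  # magnitudes above threshold
eps = rng.normal(0.0, 0.5, mags.size)           # aleatory attenuation scatter
g = pga(mags, dist_km, eps)
for a in (0.05, 0.10, 0.20):
    # Hazard-curve ordinate: annual rate of exceeding PGA level a
    print(a, lam * np.mean(g > a))
```

A fully Bayesian version, as in the paper, would additionally place priors on the rate and magnitude parameters (lam, beta above) and average the resulting curve over their posterior draws.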
  • 6
    In: Cancer Communications, Wiley, Vol. 42, No. 10 (2022-10), p. 1041-1045
    Type of Medium: Online Resource
    ISSN: 2523-3548
    Language: English
    Publisher: Wiley
    Publication Date: 2022
    ZDB ID: 2922913-3
  • 7
    In: Quality and Reliability Engineering International, Wiley, Vol. 32, No. 6 (2016-10), p. 2043-2054
    Abstract: Complex physical systems are increasingly modeled by computer codes that aim to predict reality as accurately as possible. During the last decade, code validation has attracted considerable interest within the scientific community because of the requirement to assess the uncertainty affecting the code outputs. Building on past contributions to this task, a testing procedure is proposed in this paper to decide whether a pure code prediction or a discrepancy-corrected one should be used to provide the best approximation of the physical system. In the particular case where the computer code depends on uncertain parameters, this problem of model selection can be carried out in a Bayesian setting. It requires the specification of proper prior distributions, which are well known to have a strong impact on the results. An alternative is to specify non-informative priors. However, these are sometimes improper, which is a major barrier to computing the Bayes factor. A way to overcome this issue is to use the so-called intrinsic Bayes factor (IBF) in place of the ill-defined Bayes factor when improper priors are used. For computer codes that depend linearly on their parameters, the computation of the IBF is made easier thanks to explicit marginalization. In the paper, we present a special case where the IBF is equal to the standard Bayes factor when the right-Haar prior is specified on the code parameters and the scale of the code discrepancy. On simulated data, the IBF is computed for several prior distributions. A confounding effect between the code discrepancy and the linear code is pointed out. Finally, the IBF is computed for an industrial computer code used for monitoring power plant production. Copyright © 2016 John Wiley & Sons, Ltd.
    Type of Medium: Online Resource
    ISSN: 0748-8017, 1099-1638
    Language: English
    Publisher: Wiley
    Publication Date: 2016
    ZDB ID: 2021089-9
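The underlying model-selection question can be illustrated without the intrinsic-Bayes-factor machinery: with a proper conjugate prior on a (here constant) discrepancy term, both marginal likelihoods are available in closed form and the standard Bayes factor can be computed directly. The sketch below does exactly that; it is a simplified stand-in, not the paper's IBF computation.

```python
# Simplified stand-in for the model-choice problem in the abstract: compare
# M0 (pure code prediction) against M1 (code plus discrepancy) by a Bayes
# factor. The discrepancy is reduced to a constant bias delta ~ N(0, tau^2)
# with a proper prior, so both marginal likelihoods are closed-form; the
# paper's intrinsic Bayes factor for improper priors is not reproduced here.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
n, sigma = 25, 0.3
x = np.linspace(0.0, 1.0, n)
f = np.sin(2 * np.pi * x)              # stand-in for the computer code output
y = f + 0.4 + rng.normal(0, sigma, n)  # field data with a true bias of 0.4

# M0: y ~ N(f, sigma^2 I)
log_m0 = multivariate_normal.logpdf(y, mean=f, cov=sigma**2 * np.eye(n))

# M1: marginalising the constant delta adds tau^2 to every covariance entry
tau = 1.0
log_m1 = multivariate_normal.logpdf(y, mean=f, cov=sigma**2 * np.eye(n) + tau**2)

print("log Bayes factor (M1 vs M0):", log_m1 - log_m0)  # positive favours M1
```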
  • 8
    In: Environmental Toxicology and Chemistry, Wiley, Vol. 32, No. 3 (2013-03), p. 602-611
    Abstract: The species sensitivity distribution (SSD) approach is recommended for assessing chemical risk. In practice, however, it can be used only for the few substances for which large‐scale ecotoxicological results are available. Indeed, the statistical frequentist approaches used for building SSDs and for deriving hazardous concentrations (HC5) inherently require extensive data to guarantee goodness‐of‐fit. An alternative Bayesian approach to estimating HC5 from small data sets was developed. In contrast to the noninformative Bayesian approaches that have been tested to date, the authors' method used informative priors related to the expected species sensitivity variance. This method was tested on actual ecotoxicological data for 21 well‐informed substances. A cross‐validation compared the HC5 values calculated using frequentist approaches with the results of our Bayesian approach, using both complete and truncated data samples. The authors' informative Bayesian approach was compared with noninformative Bayesian methods published in the past, including those incorporating loss functions. The authors found that even for the truncated sample the HC5 values derived from the informative Bayesian approach were generally close to those obtained using the frequentist approach, which requires more data. In addition, the probability of overestimating an HC5 is rather limited. More robust HC5 estimates can be practically obtained from additional data without impairing regulatory protection levels, which will encourage collecting new ecotoxicological data. In conclusion, the Bayesian informative approach was shown to be relatively robust and could be a good surrogate approach for deriving HC5 values from small data sets. Environ. Toxicol. Chem. 2013;32:602–611. © 2012 SETAC
    Type of Medium: Online Resource
    ISSN: 0730-7268, 1552-8618
    Language: English
    Publisher: Wiley
    Publication Date: 2013
    ZDB ID: 2027441-5
    SSG: 12
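The core of the approach above is a small-sample SSD fit with an informative prior on the between-species variance, from which the HC5 (the 5th percentile of the distribution) is derived. The sketch below assumes a conjugate normal/inverse-gamma setup and invented data; it follows the spirit, not the letter, of the authors' model.

```python
# Hedged sketch of a small-sample HC5 with an informative prior on the
# between-species variance, in the spirit of the abstract. The conjugate
# normal / inverse-gamma setup and all numbers are assumptions, not the
# authors' exact model.
import numpy as np

rng = np.random.default_rng(3)
logx = np.log10([1.2, 3.5, 0.8, 5.1, 2.4])  # toxicity endpoints (mg/L), invented
n, xbar, s2 = len(logx), logx.mean(), logx.var(ddof=1)

# Informative inverse-gamma prior on sigma^2 (expected species sensitivity
# variance), flat prior on mu.
a0, b0 = 4.0, 1.0                    # prior mean of sigma^2 = b0/(a0-1) = 1/3
a_n = a0 + (n - 1) / 2
b_n = b0 + 0.5 * (n - 1) * s2

sig2 = b_n / rng.gamma(a_n, 1.0, 20_000)   # draws of sigma^2 | data
mu = rng.normal(xbar, np.sqrt(sig2 / n))   # draws of mu | sigma^2, data
hc5 = 10 ** (mu - 1.6449 * np.sqrt(sig2))  # 5th percentile of the SSD
print("median posterior HC5:", np.median(hc5))
```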
  • 9
    In: Risk Analysis, Wiley, Vol. 37, No. 7 (2017-07), p. 1315-1340
    Abstract: Models for the assessment of the risk of complex engineering systems are affected by uncertainties due to the randomness of several phenomena involved and the incomplete knowledge about some of the characteristics of the system. The objective of this article is to provide operative guidelines to handle some conceptual and technical issues related to the treatment of uncertainty in risk assessment for engineering practice. In particular, the following issues are addressed: (1) quantitative modeling and representation of uncertainty coherently with the information available on the system of interest; (2) propagation of the uncertainty from the input(s) to the output(s) of the system model; (3) (Bayesian) updating as new information on the system becomes available; and (4) modeling and representation of dependences among the input variables and parameters of the system model. Different approaches and methods are recommended for efficiently tackling each of issues (1)‒(4) above; the tools considered are derived from both classical probability theory as well as alternative, nonfully probabilistic uncertainty representation frameworks (e.g., possibility theory). The recommendations drawn are supported by the results obtained in illustrative applications of literature.
    Type of Medium: Online Resource
    ISSN: 0272-4332, 1539-6924
    Language: English
    Publisher: Wiley
    Publication Date: 2017
    ZDB ID: 2001458-2
    SSG: 25
    SSG: 3,6
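Issue (2) above, propagating input uncertainty to the model output, is the most mechanical of the four and is easy to sketch. The Monte Carlo example below uses an invented performance function and input distributions; comments note where issues (3) and (4) would enter.

```python
# Minimal sketch of issue (2): propagating input uncertainty through a system
# model by Monte Carlo. The performance function and input distributions are
# invented placeholders.
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

def margin(load, capacity):
    """Toy performance function: failure when capacity - load < 0."""
    return capacity - load

load = rng.lognormal(mean=1.0, sigma=0.3, size=N)  # uncertain input 1
capacity = rng.normal(loc=6.0, scale=0.5, size=N)  # uncertain input 2
print("failure probability:", np.mean(margin(load, capacity) < 0))

# Issue (3), Bayesian updating, would revise these input distributions as new
# data arrive; issue (4) would replace the independent draws above with a
# dependence model (e.g., a copula) linking load and capacity.
```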