GLORIA

GEOMAR Library Ocean Research Information Access

  • 1
Electronic Resource
    Springer
Climate Dynamics 12 (1996), pp. 287-297
    ISSN: 1432-0894
    Source: Springer Online Journal Archives 1860-2000
Topics: Geosciences, Physics
Notes: Abstract A case study of the application of recent methods of nonlinear time series analysis is presented. The 1848–1992 biweekly time series of the Great Salt Lake (GSL) volume is analyzed for evidence of low-dimensional dynamics and predictability. The spectrum of Lyapunov exponents indicates that the average predictability of the GSL is a few hundred days. Use of the false nearest neighbor statistic shows that the dynamics of the GSL can be described in time-delay coordinates by four-dimensional vectors with components lagged by about half a year. Local linear maps are used in this embedding of the data, and their skill in forecasting is tested in split-sample mode for a variety of GSL conditions: at average lake volume, near the beginning of a drought, near the end of a drought, and prior to a period of rapid lake rise. Implications for modeling low-frequency components of the hydro-climate system are discussed. (A minimal code sketch of delay embedding and local linear forecasting appears after this results list.)
    Type of Medium: Electronic Resource
  • 2
Electronic Resource
    Springer
Stochastic Environmental Research and Risk Assessment 11 (1997), pp. 523-547
    ISSN: 1436-3259
    Source: Springer Online Journal Archives 1860-2000
Topics: Architecture, Civil Engineering, Surveying; Energy, Environment Protection, Nuclear Power Engineering; Geography; Geosciences
Notes: Abstract Kernel density estimators are useful building blocks for empirical statistical modeling of precipitation and other hydroclimatic variables. Data-driven estimates of the marginal probability density function of these variables (which may have discrete or continuous arguments) provide a useful basis for Monte Carlo resampling and are also useful for posing and testing hypotheses (e.g., bimodality) as to the frequency distributions of the variable. In this paper, some issues related to the selection and design of univariate kernel density estimators are reviewed. Some strategies for bandwidth and kernel selection are discussed in an applied context, and recommendations for parameter selection are offered. This paper complements the nonparametric wet/dry spell resampling methodology presented in Lall et al. (1996). (A minimal code sketch of cross-validated bandwidth selection appears after this results list.)
    Type of Medium: Electronic Resource
  • 3
Electronic Resource
    Springer
Numerical Algorithms 9 (1995), pp. 85-106
    ISSN: 1572-9265
Keywords: Least absolute deviations; robust regression; smoothing and regression splines; thin plate splines; lowess; cross validation; nonparametric estimation
    Source: Springer Online Journal Archives 1860-2000
Topics: Computer Science, Mathematics
Notes: Abstract The computation of $L_1$ smoothing splines on large data sets is often desirable, but computationally infeasible. A locally weighted, LAD smoothing spline based smoother is suggested, and preliminary results are discussed. Specifically, one can seek smoothing splines in the spaces $W_m(D)$, with $[0,1]^n \subseteq D$. We assume data of the form $y_i = f(t_i) + \varepsilon_i$, $i = 1, \ldots, N$, with $\{t_i\}_{i=1}^N \subset D$, where the $\varepsilon_i$ are errors with $E(\varepsilon_i) = 0$ and $f$ is assumed to be in $W_m$. An LAD smoothing spline is the solution, $s_\lambda$, of the optimization problem $$\min_{g \in W_m} \frac{1}{N} \sum_{i=1}^{N} \left| y_i - g(t_i) \right| + \lambda J_m(g),$$ where $J_m(g)$ is the seminorm consisting of the standard sum of the squared $L_2$ norms of the $m$th partial derivatives of $g$. Such an LAD smoothing spline, $s_\lambda$, would be expected to give robust smoothed estimates of $f$ in situations where the $\varepsilon_i$ are from a distribution with heavy tails. For fixed $\lambda > 0$, the solution to such a problem is known to be a thin plate spline on $W_m$, and hence $s_\lambda$ is assumed to be of the form $$s_\lambda = \sum_{\nu=1}^{M} d_\nu \phi_\nu + \sum_{i=1}^{N} c_i \zeta_i,$$ where $\zeta_i(t) = R_1(t_i, t)$, $R(s,t) = R_0(s,t) + R_1(s,t)$ is the reproducing kernel for $W_m(D)$, $R_1(t_i, t) = \operatorname{proj}_{W_m^0} R(t_i, t)$, and the functions $\{\phi_\nu\}_{\nu=1}^{M}$ span $\operatorname{Kern}(\operatorname{proj}_{W_m^0}) = \operatorname{Kern}(J_m)$. Optimality conditions defining $s_\lambda$ as the solution to the problem above yield an algorithm for its computation. However, this computation becomes unwieldy when $N \simeq O(10^3)$. A possible remedy is to solve "local" problems of this form on neighborhoods of "size" $b$, and to blend these locally optimal LAD splines together, producing a globally smooth estimator. The two smoothing parameters (the global value of $\lambda$ and the local neighborhood size $b$) should preferably have a data-driven, cross-validated choice. (A minimal code sketch of the LAD smoothing objective appears after this results list.)
    Type of Medium: Electronic Resource
  • 4
Electronic Resource
    Springer
Numerical Algorithms 5 (1993), pp. 407-417
    ISSN: 1572-9265
Keywords: Least absolute deviations; robust regression; smoothing and regression splines; thin plate splines; radial basis functions; cross validation; nonparametric estimation
    Source: Springer Online Journal Archives 1860-2000
Topics: Computer Science, Mathematics
Notes: Abstract We propose an algorithm for the computation of $L_1$ (LAD) smoothing splines in the spaces $W_M(D)$, with $[0,1]^n \subseteq D$. We assume one is given data of the form $y_i = f(t_i) + \varepsilon_i$, $i = 1, \ldots, N$, with $\{t_i\}_{i=1}^N \subset D$, where the $\varepsilon_i$ are errors with $E(\varepsilon_i) = 0$ and $f$ is assumed to be in $W_M$. The LAD smoothing spline, for fixed smoothing parameter $\lambda \geq 0$, is defined as the solution, $s_\lambda$, of the optimization problem $$\min_{g \in W_M} \frac{1}{N} \sum_{i=1}^{N} \left| y_i - g(t_i) \right| + \lambda J_M(g),$$ where $J_M(g)$ is the seminorm consisting of the sum of the squared $L_2$ norms of the $M$th partial derivatives of $g$. Such an LAD smoothing spline, $s_\lambda$, would be expected to give robust smoothed estimates of $f$ in situations where the $\varepsilon_i$ are from a distribution with heavy tails. The solution to such a problem is a "thin plate spline" of known form. An algorithm for computing $s_\lambda$ is given which is based on considering a sequence of quadratic programming problems whose structure is guided by the optimality conditions for the above convex minimization problem, and which are solved readily if a good initial point is available. The data-driven selection of the smoothing parameter is achieved by minimizing a $CV(\lambda)$ score of the form $$\frac{1}{N}\left[\sum_{i=1}^{N} \left| y_i - s_\lambda(t_i) \right| + \sum_{\operatorname{res}_i = 0} 1\right].$$ The combined LAD-CV smoothing spline algorithm is a continuation scheme in $\lambda \searrow 0$ taken on the above SQPs parametrized in $\lambda$, with the optimal smoothing parameter taken to be that value of $\lambda$ at which the $CV(\lambda)$ score first begins to increase. The feasibility of constructing the LAD-CV smoothing spline is illustrated by an application to a problem in environmental data interpretation. (A minimal code sketch of the $CV(\lambda)$ continuation appears after this results list.)
    Type of Medium: Electronic Resource
  • 5
    Publication Date: 2021-04-22
Description: Pluvial flood risk is mostly excluded from urban flood risk assessment. However, the risk of pluvial flooding is a growing challenge, with a projected increase in extreme rainstorms compounding ongoing global urbanization. Although often considered a flood type with minimal impacts, occurring when rainfall rates exceed the capacity of urban drainage systems, the aftermath of rainfall-triggered flooding during Hurricane Harvey and other events shows the urgent need to assess the risk of pluvial flooding. Due to its local extent and small-scale variations, the quantification of pluvial flood risk requires risk assessments at high spatial resolutions. While flood hazard and exposure information is becoming increasingly accurate, the estimation of losses is still a poorly understood component of pluvial flood risk quantification. We use a new probabilistic multivariable modeling approach to estimate pluvial flood losses of individual buildings, explicitly accounting for the associated uncertainties. Except for water depth, which is the most important predictor throughout, we find the drivers of whether a building suffers any loss and of the degree of loss to be different. Applying this approach to estimate and validate building structure losses during Hurricane Harvey using a property-level data set, we find that the reliability and dispersion of predictive loss distributions vary widely depending on the model and the aggregation level of property-level loss estimates. Our results show that the use of multivariable zero-inflated beta models reduces the 90% prediction intervals for Hurricane Harvey building structure loss estimates on average by 78% (totaling U.S.$3.8 billion) compared to commonly used models. (A minimal code sketch of a zero-inflated beta loss model appears after this results list.)
Description: Key Points: Recent severe pluvial flood events highlight the need to integrate pluvial flooding in urban flood risk assessment. Probabilistic models provide reliable estimation of pluvial flood loss across spatial scales. The beta distribution model reduces the 90% prediction interval for Hurricane Harvey building loss by U.S.$3.8 billion, or 78%.
    Description: Bundesministerium für Bildung und Forschung (BMBF) http://dx.doi.org/10.13039/501100002347
    Description: NSF GRFP
    Description: Fulbright Doctoral Program
Keywords: 551.5; pluvial flooding; loss modeling; urban flooding; probabilistic; Hurricane Harvey; climate change adaptation
    Type: article
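For record 1, a minimal sketch of the generic technique the abstract describes: time-delay embedding of a univariate series and a one-step local linear forecast. The embedding dimension 4 and a lag of about half a year (12 biweekly steps) follow the abstract; the function names, the neighbor count k, and all other details are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): time-delay embedding and a
# local linear one-step forecast for a univariate series such as the
# biweekly Great Salt Lake volume. dim=4 and lag=12 follow the abstract;
# k is an assumed neighbor count.
import numpy as np

def delay_embed(x, dim, lag):
    """Rows are state vectors [x_t, x_{t-lag}, ..., x_{t-(dim-1)*lag}]."""
    x = np.asarray(x, dtype=float)
    t = np.arange((dim - 1) * lag, len(x))
    return np.column_stack([x[t - j * lag] for j in range(dim)])

def local_linear_forecast(x, dim=4, lag=12, k=20):
    """Predict the next value from the k nearest embedded neighbors."""
    x = np.asarray(x, dtype=float)
    V = delay_embed(x, dim, lag)
    query, X = V[-1], V[:-1]
    y = x[(dim - 1) * lag + 1:]               # successor of each state in X
    nn = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    A = np.column_stack([np.ones(k), X[nn]])  # affine local linear map
    coef, *_ = np.linalg.lstsq(A, y[nn], rcond=None)
    return coef @ np.concatenate(([1.0], query))
```

Iterating the one-step map gives multi-step forecasts, whose error growth would reflect the few-hundred-day predictability horizon the abstract reports.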
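For record 2, a minimal univariate kernel density estimate with a data-driven bandwidth chosen by likelihood cross validation, using scikit-learn. The Gaussian kernel, the candidate bandwidth grid, and 5-fold CV are assumptions for demonstration, not the paper's recommendations.

```python
# Illustrative sketch: Gaussian KDE with a cross-validated bandwidth.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KernelDensity

def cv_kde(sample, bandwidths=np.logspace(-2, 1, 30)):
    """Pick a bandwidth by maximum-likelihood cross validation."""
    grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                        {"bandwidth": bandwidths}, cv=5)
    grid.fit(np.asarray(sample, dtype=float).reshape(-1, 1))
    return grid.best_estimator_  # .score_samples(x) gives log-density
```

The fitted estimator can also back a Monte Carlo resampler via its sample() method, in the spirit of the resampling use the abstract mentions.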
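For record 3, a discrete one-dimensional analogue of the LAD smoothing spline objective $\min_g \frac{1}{N}\sum_i |y_i - g(t_i)| + \lambda J_m(g)$: a squared second-difference penalty stands in for $J_m$, and iteratively reweighted least squares stands in for the paper's thin-plate-spline machinery. The eps floor and iteration count are assumptions; this is a sketch of the objective, not the authors' algorithm.

```python
# Illustrative sketch: l1 fidelity + squared second-difference roughness
# penalty, solved by IRLS (|r| is approximated by r^2 / |r_old|).
import numpy as np

def lad_smooth(y, lam=1.0, iters=50, eps=1e-6):
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)            # second-difference operator
    g = y.copy()
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - g), eps)   # IRLS weights for |r|
        A = np.diag(w) + lam * n * (D.T @ D)       # weighted normal equations
        g = np.linalg.solve(A, w * y)
    return g
```

The "local" remedy the abstract proposes would apply such a fit on neighborhoods of size b and blend the pieces; both b and the global lambda would then be chosen by cross validation.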
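For record 4, a sketch of the $CV(\lambda)$ idea: track the score $\frac{1}{N}[\sum_i |y_i - s_\lambda(t_i)| + \#\{i : \operatorname{res}_i = 0\}]$ along a decreasing lambda path and stop at the first increase. It reuses the illustrative lad_smooth() above in place of the paper's SQP-based thin plate spline solver; the lambda grid and the zero-residual tolerance are assumptions.

```python
# Illustrative sketch: continuation in lambda -> 0, stopping when the
# CV(lambda) score first begins to increase.
import numpy as np

def cv_score(y, g, tol=1e-8):
    r = np.asarray(y, dtype=float) - g
    return (np.sum(np.abs(r)) + np.sum(np.abs(r) < tol)) / len(r)

def select_lambda(y, lambdas=np.logspace(1, -4, 20)):  # decreasing path
    best_lam, best = lambdas[0], np.inf
    for lam in lambdas:
        score = cv_score(y, lad_smooth(y, lam=lam))
        if score > best:                               # first increase: stop
            break
        best_lam, best = lam, score
    return best_lam
```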
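For record 5, a minimal zero-inflated beta sketch: a logistic model for the probability that a building has any loss, and a beta distribution for the degree of (relative) loss given that loss occurs. The paper's multivariable approach also conditions the beta part on predictors, which this sketch omits; the feature matrix X (e.g., water depth) and all names are hypothetical.

```python
# Illustrative sketch of a zero-inflated beta loss model.
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression

def fit_zi_beta(X, rel_loss):
    """X: predictors (e.g., water depth); rel_loss: relative loss in [0, 1)."""
    rel_loss = np.asarray(rel_loss, dtype=float)
    has_loss = rel_loss > 0
    clf = LogisticRegression().fit(X, has_loss)         # P(loss > 0 | X)
    z = np.clip(rel_loss[has_loss], 1e-6, 1 - 1e-6)     # keep inside (0, 1)
    a, b, _, _ = stats.beta.fit(z, floc=0, fscale=1)    # degree of loss
    return clf, (a, b)

def predictive_mean(clf, ab, X):
    """E[relative loss | X] = P(loss > 0 | X) * E[beta]."""
    a, b = ab
    return clf.predict_proba(X)[:, 1] * (a / (a + b))
```

Prediction intervals like those in the abstract would come from sampling the two parts jointly, for example a Bernoulli draw from the logistic part times a beta draw.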