GLORIA

GEOMAR Library Ocean Research Information Access

  • 1
    Online Resource
    Berlin, Heidelberg: Springer Berlin / Heidelberg,
    Keywords: Neurobiology. ; Electronic books.
    Type of Medium: Online Resource
    Pages: 1 online resource (488 pages)
    Edition: 1st ed.
    ISBN: 9783642545931
    DDC: 612.8
    Language: English
    Note: Intro -- Preface -- The Neural Field: A Framework for Brain Data Integration? -- References -- Contents -- 1 Tutorial on Neural Field Theory -- 1.1 Background -- 1.1.1 Synaptic Processing -- 1.1.2 Dendritic Processing -- 1.2 Tissue Level Firing Rate Models with Axo-Dendritic Connections -- 1.2.1 Turing Instability Analysis -- 1.2.2 Weakly Nonlinear Analysis: Amplitude Equations -- 1.2.2.1 Amplitude Equations for Planar Neural Fields -- 1.2.3 Brain Wave Equations -- 1.3 Travelling Waves and Localised States -- 1.3.1 Travelling Front -- 1.3.2 Stationary Bump -- 1.3.3 Interface Dynamics -- 1.4 Inverse Neural Modelling -- 1.4.1 Inverse Problems -- 1.4.2 Cognitive Modelling -- Appendix 1 -- Appendix 2 -- Appendix 3 -- Appendix 4 -- References -- Part I Theory of Neural Fields -- 2 A Personal Account of the Development of the Field Theory of Large-Scale Brain Activity from 1945 Onward -- 2.1 Introduction -- 2.2 Lotka-Volterra Dynamics -- 2.2.1 A Neural Analogy -- 2.3 Population Dynamics -- 2.3.1 The Wilson-Cowan Equations -- 2.4 A Master Equation Approach -- 2.5 Field Theory -- 2.5.1 The Operator Map -- 2.5.2 Coherent States -- 2.5.3 Moment Generating Equations and Spin-Coherent States -- 2.5.4 A Neural Network Path Integral -- 2.6 Back to the Master Equation -- 2.6.1 A System-Size Expansion of the Master Equation -- 2.7 Another Look at Path Integrals -- 2.7.1 Bosonic Path-Integrals for Neural Networks -- 2.7.2 Observables of Neural Activity -- 2.7.3 The Effective Spiking Model -- 2.7.4 Variational Techniques -- 2.7.5 Renormalizing the Action -- 2.7.6 Avalanches -- 2.7.7 Calculations Concerning Brain Rhythms and Spike Statistics -- 2.7.8 Closed Moment Equations -- 2.8 Stochastic Wilson-Cowan Equations -- 2.8.1 The E/I Master Equation -- 2.8.2 Work in Progress -- 2.9 Concluding Remarks -- References. 
, 3 Heaviside World: Excitation and Self-Organization of Neural Fields -- 3.1 Introduction -- 3.2 Dynamics of Excitation in a Homogeneous Neural Field -- 3.2.1 1D 1-Layer Field -- 3.2.2 1D Field with Two Layers -- 3.2.3 2D Field of Neural Excitation -- 3.3 Self-Organization of Neural Fields -- 3.3.1 Field Model of Self-Organization -- 3.3.2 Dynamics of the Receptive Field -- 3.3.3 Equilibrium Solution of Learning -- 3.3.4 Stability of the Equilibrium Solution -- 3.3.5 Kohonen Map -- 3.4 Conclusions -- References -- 4 Spatiotemporal Pattern Formation in Neural Fields with Linear Adaptation -- 4.1 Introduction -- 4.2 Bifurcations from the Homogeneous State -- 4.2.1 One Spatial Dimension -- 4.2.1.1 Zero Eigenvalue -- 4.2.1.2 Imaginary Eigenvalues -- 4.2.2 Two Spatial Dimensions -- 4.2.2.1 Zero Eigenvalue -- 4.2.2.2 Imaginary Eigenvalues -- 4.2.3 Summary of Pattern Formation -- 4.3 Response to Inputs in the Ring Network -- 4.3.1 Existence of Stationary Bumps -- 4.3.2 Linear Stability of Stationary Bumps -- 4.3.3 Existence of Traveling Bumps -- 4.3.4 Linear Stability of Traveling Bumps -- 4.4 Activity Bumps on the Infinite Line -- 4.4.1 Natural and Stimulus-Induced Stationary Activity Bumps -- 4.4.2 Natural and Stimulus-Locked Traveling Activity Bumps -- References -- 5 PDE Methods for Two-Dimensional Neural Fields -- 5.1 Introduction -- 5.2 Results in One Spatial Dimension -- 5.3 Two-Dimensional Bumps and Rings -- 5.4 Spiral Waves -- 5.5 Conclusion -- References -- 6 Numerical Simulation Scheme of One- and Two-Dimensional Neural Fields Involving Space-Dependent Delays -- 6.1 Introduction -- 6.2 The Novel Principle -- 6.3 The Numerical Implementation in Two Spatial Dimensions -- 6.4 Conclusion -- References -- 7 Spots: Breathing, Drifting and Scattering in a Neural Field Model -- 7.1 Introduction -- 7.2 Heaviside Firing Rate and Interface Dynamics.
, 7.3 Equivalent PDE Model and Numerical Bifurcation Analysis -- 7.4 Amplitude Equations for Breathing -- 7.5 Drifting -- 7.6 Center Manifold Reduction: Particle Description -- 7.7 Scattering -- 7.8 Discussion -- Appendix -- References -- 8 Heterogeneous Connectivity in Neural Fields: A Stochastic Approach -- 8.1 Introduction -- 8.2 Local Inhomogeneity in Connection Density -- 8.2.1 Travelling Fronts and Pulses -- 8.2.2 Persistent Fluctuations -- 8.2.2.1 Simple Deterministic Connection Function -- 8.2.3 Activity Patterns and Correlations -- 8.2.4 Persistent Fluctuations in 2D -- 8.3 Long Range Connections -- 8.4 Conclusions -- References -- 9 Stochastic Neural Field Theory -- 9.1 Introduction -- 9.2 Population Density Method and Mean Field Theory -- 9.3 Stochastic Rate-Based Models -- 9.3.1 Neural Langevin Equation -- 9.3.2 Neural Master Equation -- 9.3.3 Continuum Limit -- 9.4 Traveling Waves in Stochastic Neural Fields -- 9.4.1 Traveling Fronts in a Deterministic Neural Field -- 9.4.2 Stochastic Neural Field with Extrinsic Noise -- 9.4.3 Explicit Results for a Heaviside Rate Function -- 9.5 Path Integral Representation of a Stochastic Neural Field -- 9.5.1 Pulled Fronts, Absorbing States and Extinction Events -- 9.5.2 Derivation of Path Integral Representation -- 9.5.3 Hamilton-Jacobi Dynamics and Population Extinction in the Weak-Noise Limit -- References -- 10 On the Electrodynamics of Neural Networks -- 10.1 Introduction -- 10.2 Pyramidal Neuron Model -- 10.2.1 General Solution of the Circuit Equations -- 10.2.2 Observation Model -- 10.2.3 Neurodynamics -- 10.3 Leaky Integrate-and-Fire Model -- 10.3.1 Simplification -- 10.3.2 Simulation -- 10.4 Continuum Neural Field Model -- 10.4.1 Rate Model -- 10.4.2 Neuroelectrodynamics -- 10.5 Discussion -- References -- Part II Applications of Neural Fields -- 11 Universal Neural Field Computation.
, 11.1 Introduction -- 11.2 Principles of Universal Computation -- 11.2.1 Variables and Data Types -- 11.2.2 Algorithms and Sequential Processes -- 11.3 Dynamic Field Automata -- 11.3.1 Turing Machines -- 11.3.2 Nonlinear Dynamical Automata -- 11.3.3 Neural Field Computation -- 11.4 Discussion -- References -- 12 A Neural Approach to Cognition Based on Dynamic Field Theory -- 12.1 Introduction -- 12.2 Grounding DFT in Neurophysiology -- 12.3 Dynamic Field Theory -- 12.4 DFT as an Approach to Cognition -- 12.5 Modeling Visual Working Memory and Change Detection with Dynamic Neural Fields -- 12.6 Conclusions -- References -- 13 A Dynamic Neural Field Approach to Natural and Efficient Human-Robot Collaboration -- 13.1 Introduction -- 13.2 Dynamic Neural Field Model of Joint Action -- 13.3 Model Details -- 13.4 Setup of Human-Robot Experiments -- 13.5 Results -- 13.5.1 Selection Based on an Anticipatory Model of Action Observation and Shared Task Knowledge -- 13.5.2 Understanding Partially Occluded Actions -- 13.6 Discussion -- References -- 14 Neural Field Modelling of the Electroencephalogram: Physiological Insights and Practical Applications -- 14.1 Introduction -- 14.2 A Mean Field Model of Electrocortical Rhythmogenesis -- 14.2.1 Model Extensions -- 14.3 Dynamical Features -- 14.3.1 A Novel Route to Chaos -- 14.3.2 Metabifurcations -- 14.3.3 Multistability -- 14.4 Physiological Relevance -- 14.4.1 The Resting Alpha Rhythm -- 14.4.1.1 Endogenous Pharmacological Modulation -- 14.4.1.2 Exogenous Pharmacological Modulation -- 14.4.2 Mass Action and the Monitoring of Anaesthetic Action -- 14.5 Conclusion -- References -- 15 Equilibrium and Nonequilibrium Phase Transitions in a Continuum Model of an Anesthetized Cortex -- 15.1 Introduction -- 15.2 Induction of Anesthesia as an Equilibrium Phase Transition. 
, 15.3 Critical Fluctuations at the "Opalescent" Point -- 15.4 Nonequilibrium Phase Transitions in the Cortical Model -- 15.4.1 Emergence of Hopf Instability -- 15.4.2 Emergence of Turing Instability -- 15.4.3 Mixed Modes: Hopf-Turing Interactions -- 15.5 Discussion -- 15.5.1 Critical Fluctuations -- 15.5.2 Nonequilibrium Phase Transitions in the Cortex -- 15.5.3 Effect of Anesthesia on Sub-brain Structures -- 15.5.4 Multiple Equilibria and Cortical Self-Organization -- Appendix -- Model Equations -- Modeling Propofol Anesthesia -- Linear Stability Analysis for Homogeneous Stationary States -- References -- 16 Large Scale Brain Networks of Neural Fields -- 16.1 Introduction -- 16.2 Mathematical Formulation of a Large-Scale Brain Network of Neural Fields -- 16.3 Homogeneous Approximations of Neural Fields -- 16.4 Homogeneous Modeling of Neural Fields and First Heterogeneous Extensions -- 16.5 Full Brain Network Modeling Using the Connectome -- 16.6 The Virtual Brain -- 16.7 Conclusions and Final Remarks -- References -- 17 Neural Fields, Masses and Bayesian Modelling -- 17.1 Introduction -- 17.1.1 Neural Masses and Fields -- 17.1.2 Neural Fields as Models for Empirical Data -- 17.2 Generative Models for Cross Spectral Densities -- 17.2.1 The Jansen and Rit Model -- 17.2.2 The Canonical Microcircuit Model -- 17.2.3 Neural Field Extensions of the Canonical Microcircuit and Jansen-Rit Models -- 17.2.4 Power Spectra of Neural Fields -- 17.3 Neural Fields as Dynamic Causal Models -- 17.3.1 Probabilistic Models of Empirical Data and Their Inversion -- 17.3.2 LFP Auditory Cortex Data -- 17.3.3 MEG Data from the Visual Cortex -- 17.4 Conclusions -- References -- 18 Neural Field Dynamics and the Evolution of the Cerebral Cortex -- 18.1 Introduction -- 18.1.1 Genetic Expression, Cell Firing and Apoptosis in Cortical Development. , 18.1.2 Cell Firing and Synchronous Oscillation.
    Location Call Number Limitation Availability
  • 2
    Online Resource
    Cambridge: MIT Press,
    Keywords: Brain-Physiology. ; Dynamics. ; Electronic books.
    Description / Table of Contents: Experimental and theoretical approaches to global brain dynamics that draw on the latest research in the field.
    Type of Medium: Online Resource
    Pages: 1 online resource (355 pages)
    Edition: 1st ed.
    ISBN: 9780262305587
    Series Statement: Computational Neuroscience Series
    DDC: 612.8/2
    Language: English
    Note: Intro -- Contents -- Series Foreword -- Introduction -- 1 The Dynamical and Structural Basis of Brain Activity -- 1.1 Introduction -- 1.2 Attractors and Brain Dynamics -- 1.3 Autonomous Brain Dynamics -- 1.4 Conclusion -- 2 Functional Connectivity, Neurocognitive Networks, and Brain Dynamics -- 2.1 A Connectivity and Network Perspective on Cognition -- 2.2 Identifying Major Cognitive Networks -- 2.3 Intrinsic Connectivity Networks -- 2.4 Three Core Neurocognitive Networks -- 2.5 Dynamics of Signaling in the CEN, SN, and DMN -- 2.6 A Dynamical Network Model of Saliency, Attention, and Control -- 2.7 Conclusion -- 3 Decoding Mental States from Patterns of Brain Activity -- 3.1 Introduction -- 3.2 Multivariate Decoding -- 3.3 Decoding Dynamics -- 3.4 Conclusion -- 4 Transient Brain Dynamics -- 4.1 Transients versus Attractors in Perception -- 4.2 Robustness through Metastability: Winnerless Competition -- 4.3 Hierarchical Competition: Canonical Models -- 4.4 Modulation Instability and Resting-State Dynamics -- 4.5 Modeling Psychopathology -- 4.6 Conclusion -- 5 A Dynamic Field Account of Language-Related Brain Potentials -- 5.1 Introduction -- 5.2 ERP Experiment -- 5.3 Dynamic Cognitive Modeling -- 5.4 Conclusion -- 6 Recognition of Sequences of Sequences Using Nonlinear Dynamical Systems -- 6.1 Introduction -- 6.2 Stable Heteroclinic Channels -- 6.3 Hierarchies of Stable Heteroclinic Channels -- 6.4 Bayesian Recognition Using SHC Hierarchies -- 6.5 A Model of Speech Recognition -- 6.6 Some Simulated Examples -- 6.7 Discussion -- 7 The Stability of Information Flows in the Brain -- 7.1 How to Define an Information Flow in the Brain? -- 7.2 Working Memory Capacity -- 7.3 Limits of Sequential Language Processing -- 7.4 Interaction of Heteroclinic Channels: Binding Problem -- 7.5 Sequential Decision Making -- 7.6 Future Directions. 
, 8 Multiscale Electroencephalographic Dynamics and Brain Function -- 8.1 Neuronal Correlates of Electroencephalographic and Local Field Potential Rhythms -- 8.2 Measuring Multiscale Field Dynamics -- 8.3 Spatiotemporal Field Dynamics in Sleep -- 8.4 Mechanisms of Neuronal Synchronization -- 8.5 Effects of Electrical Fields on Neuronal Activity during Sleep -- 8.6 Conclusion -- 9 Mapping the Multiscale Information Content of Complex Brain Signals -- 9.1 Brain as a Complex System -- 9.2 Information-Theoretic Tools -- 9.3 Information and Nonlinear Dynamics -- 9.4 Approximate and Sample Entropy -- 9.5 Estimation of Sample Entropy -- 9.6 Multiscale Entropy -- 9.7 Applications of Multiscale Entropy -- 9.8 Complexity and Spectral Power -- 9.9 Complexity and Nonstationarity -- 9.10 Complexity and Network Structure -- 9.11 Conclusion -- 10 Connectivity and Dynamics of Neural Information Processing -- 10.1 Introduction -- 10.2 Structured Flows as Functional Units of Processes -- 10.3 Structured Flows on Manifolds -- 10.4 Emergence of SFMs from Spiking Neuron Networks -- 10.5 Functional Architectures: SFM and Timescale Hierarchies -- 10.6 Discussion -- 11 Transient Motor Behavior and Synchronization in the Cortex -- 11.1 Motor Behavior and Loss of Stability -- 11.2 Behavioral Models -- 11.3 A Model for Cortical Areas -- 11.4 Neural Mass Models - From Coupled Oscillators to Network Activity -- 11.5 Conclusion -- 12 Free Energy and Global Dynamics -- 12.1 Introduction -- 12.2 The Free-Energy Principle -- 12.3 Action and Its Observation -- 12.4 Conclusion -- 13 Perception, Action, and Utility: The Tangled Skein -- 13.1 Introduction -- 13.2 The Classical View -- 13.3 Segregation of Probability and Utility in the Brain? -- 13.4 Approximate Inference -- 13.5 New Vistas -- 13.6 Conclusion -- 14 Short Guide to Modern Nonlinear Dynamics -- 14.1 Dynamical Systems. , 14.2 Chaotic Dynamics -- 14.3 Homoclinic and Heteroclinic Dynamics -- 14.4 Conclusion -- Contributors -- Index.
  • 3
    Publication Date: 2024-03-06
    Description: Abstract: The usually short lifetime of convective storms and their rapid development during unstable weather conditions make forecasting these storms challenging. It is necessary, therefore, to improve the procedures for estimating the storms' expected life cycles, including their lifetime, size, and intensity development. We present an analysis of the life cycles of convective cells in Germany, focusing on the relevance of the prevailing atmospheric conditions. Using data from the radar-based cell detection and tracking algorithm KONRAD of the German Weather Service, the life cycles of isolated convective storms are analysed for the summer half-years from 2011 to 2016. In addition, numerous convection-relevant atmospheric ambient variables (e.g., deep-layer shear, convective available potential energy, lifted index), calculated from high-resolution COSMO-EU assimilation analyses (0.0625°), are combined with the life cycles. The statistical analyses of the life cycles reveal that rapid initial area growth supports wider horizontal expansion of a cell in its subsequent development and, indirectly, a longer lifetime. Specifically, the initial horizontal cell area is the most important predictor of the lifetime and of the expected maximum cell area during the life cycle. However, its predictive skill turns out to be moderate at most, though still considerably higher than that of any ambient variable. Of the latter, measures of midtropospheric mean wind and vertical wind shear are most suitable for distinguishing between convective cells with short lifetimes and those with long lifetimes. Higher thermal instability is associated with faster initial growth, thus favouring larger and longer-living cells. A detailed objective correlation analysis between ambient variables, coupled with analyses discriminating groups of different lifetime and maximum cell area, yields new insights into their statistical connections. The results of this study provide guidance for predictor selection and advancements of nowcasting applications.
    Description: Based on a combination of data from the cell tracking algorithm KONRAD of the German Weather Service and COSMO-EU model analyses for the summer half-years from 2011 to 2016, statistical relationships between storm attributes (lifetime and maximum horizontal area), ambient variables, and the storms' history are quantified. The initial growth of the cell area is a better indicator of the lifetime and maximum area than the ambient variables are. Of the latter, measures of the midtropospheric wind and vertical wind shear, in particular, are most suitable for distinguishing between convective cells with short and long lifetimes, whereas higher convective instability favours larger cells.
    Description: Bundesministerium für Digitales und Verkehr http://dx.doi.org/10.13039/100008383
    Keywords: ddc:551.6 ; convective storms ; life cycle ; multisource data ; nowcasting ; statistics ; weather prediction
    Language: English
    Type: doc-type:article
  • 4
    Publication Date: 2023-01-04
    Description: Based on the numerical weather prediction model COSMO of Germany's national meteorological service (Deutscher Wetterdienst, DWD), regional reanalysis datasets have been developed with grid spacings as fine as 2 km. This development started as a fundamental research activity within the Hans-Ertel-Centre for Weather Research (HErZ) at the University of Bonn and the University of Cologne. Today, COSMO reanalyses are an established product of the DWD and have been widely used in applications at the European and German national levels. Successful applications of COSMO reanalyses include renewable energy assessments as well as meteorological risk estimates. The COSMO reanalysis datasets are now publicly available and provide spatio-temporally consistent data of atmospheric parameters covering both near-surface conditions and vertical profiles. This article reviews the status of the COSMO reanalyses, including evaluation results and applications. In many studies, evaluations of the COSMO reanalyses point to an overall good quality and often an added value compared to contemporary global reanalysis datasets. We further outline current plans for the further development and application of regional reanalyses in the HErZ research group Cologne/Bonn in collaboration with the DWD.
    Type: Article, PeerReviewed
    Format: text
  • 5
    Publication Date: 2022-03-31
    Description: The local ensemble transform Kalman filter (LETKF) proposed by Hunt et al. (2007) is a very popular method for ensemble data assimilation and is the operational method for convective-scale data assimilation at Deutscher Wetterdienst (DWD), where three-dimensional volume radar observations are assimilated with the LETKF for the operational ICON-D2 model. However, one major challenge for the LETKF is the situation where the observations show precipitation (reflectivity) whereas no ensemble member shows such reflectivity at a given point in space. In this case, the LETKF has no sensitivity with respect to the observations, and the analysis increment based on the observed reflectivity is zero. The goal of this work is to develop a targeted covariance inflation (TCI) for the assimilation of 3D-volume radar data based on the LETKF, adding artificial sensitivity and making the LETKF react properly to the radar observations. The basic idea of the TCI is to employ an additive covariance inflation as the entry point for the LETKF. Here, we construct perturbations to the simulated observations which are used by the core LETKF assimilation step. The perturbations are constructed such that they exhibit a correlation between humidity and reflectivity. This leads to a change in humidity such that precipitation is more likely to occur. We describe and demonstrate the theoretical basis of the method and then present a case study in which targeted covariance inflation leads to a clear improvement of the LETKF and of the precipitation forecast. All examples are based on the German radar network and the ICON-D2 model over Central Europe.
    Description: The goal of this work is to develop a targeted covariance inflation (TCI) for the assimilation of 3D‐volume radar data based on the local ensemble transform Kalman filter (LETKF), adding artificial sensitivity and making the LETKF react properly to the radar observations. Perturbations to the simulated observations are constructed such that they exhibit an empirically derived correlation between humidity and reflectivity. This leads to a change in humidity in such a way that precipitation is more likely to occur.
    Description: Deutsche Forschungsgemeinschaft http://dx.doi.org/10.13039/501100001659
    Keywords: ddc:551.5
    Language: English
    Type: doc-type:article
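The zero-increment problem and the effect of observation-space perturbations described in this abstract can be illustrated with a toy scalar ensemble update. This is a minimal sketch under Gaussian assumptions; the variable names, ensemble size, and the coupling factor of 3.0 are illustrative choices, not values from the operational DWD system:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 40  # ensemble size

# State: specific humidity (g/kg); simulated observation: reflectivity (dBZ).
q = 8.0 + rng.normal(0.0, 0.5, k)        # humidity ensemble with spread
z_sim = np.zeros(k)                      # no member produces any reflectivity
z_obs, r = 25.0, 4.0 ** 2                # observed reflectivity, obs-error variance

def increment(q, z_sim):
    """Scalar ensemble Kalman update of humidity from a reflectivity observation."""
    dq = q - q.mean()
    dz = z_sim - z_sim.mean()
    pzz = dz @ dz / (len(q) - 1)         # simulated-observation variance
    pqz = dq @ dz / (len(q) - 1)         # humidity/reflectivity cross-covariance
    gain = pqz / (pzz + r)
    return gain * (z_obs - z_sim.mean())

print(increment(q, z_sim))               # 0.0: no sensitivity, no analysis increment

# Targeted inflation (illustrative): perturb the simulated observations so they
# correlate positively with the humidity perturbations, giving the filter an
# artificial sensitivity to the radar observation.
z_pert = z_sim + 3.0 * (q - q.mean())    # factor 3.0 is an assumed tuning value
print(increment(q, z_pert))              # > 0: humidity is increased
```

With all-zero simulated reflectivities, both the variance and cross-covariance vanish and the gain is exactly zero; the correlated perturbations make the gain, and hence the humidity increment, positive.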
  • 6
    Publication Date: 2021-10-12
    Description: A realistic simulation of the atmospheric boundary layer (ABL) depends on an accurate representation of the land-atmosphere coupling. Land surface temperature (LST) plays an important role in this context, and the assimilation of LST can lead to improved estimates of the boundary layer and its processes. We assimilated synthetic satellite LST retrievals, derived from a nature run taken as truth, into a fully coupled, state-of-the-art land-atmosphere numerical weather prediction model. A local ensemble transform Kalman filter was used as the assimilation system, and the control vector was augmented with the soil temperature and humidity. To evaluate the concept of the augmented control vector, two-day case studies with different control vector settings were conducted for clear-sky periods in March and August 2017. These experiments with hourly LST assimilation were validated against the nature run; overall, the RMSEs of the first-guess (and analysis) atmospheric and soil temperatures were reduced. The temperature estimate of the ABL was particularly improved during daytime, as was the estimate of the soil temperature over the whole diurnal cycle. The best impact of LST assimilation on the soil and the ABL was achieved with the augmented control vector. Through the coupling between the soil and the atmosphere, the assimilation of LST can have a positive impact on the temperature forecast of the ABL even after 15 hr, owing to the memory of the soil. These encouraging results motivate further work towards the assimilation of real satellite LST retrievals.
    Keywords: 551.5 ; data assimilation ; land–atmosphere coupling ; land surface temperature ; LETKF
    Language: English
    Type: map
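The role of the augmented control vector can be sketched with a toy two-component ensemble update, in which a single LST observation updates both an atmospheric and a soil temperature through their ensemble cross-covariances. All numbers, the linear observation operator h, and the coupling coefficient are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 100  # ensemble size

# Augmented control vector per member: [boundary-layer T, soil T] in kelvin.
# Soil and atmosphere are coupled, so their perturbations are correlated.
t_soil = 290.0 + rng.normal(0.0, 1.0, k)
t_atm = 285.0 + 0.6 * (t_soil - t_soil.mean()) + rng.normal(0.0, 0.5, k)
x = np.vstack([t_atm, t_soil])            # shape (2, k)

h = np.array([0.3, 0.7])                  # toy LST operator: mix of both temperatures
y_sim = h @ x                             # simulated LST per member
y_obs, r = 292.0, 1.0 ** 2                # LST retrieval and its error variance

dx = x - x.mean(axis=1, keepdims=True)    # state perturbations
dy = y_sim - y_sim.mean()                 # observation-space perturbations
gain = (dx @ dy) / (k - 1) / (dy @ dy / (k - 1) + r)   # Kalman gain, shape (2,)
incr = gain * (y_obs - y_sim.mean())      # analysis increment for both components
print(incr)  # both atmospheric and soil temperature receive a positive increment
```

Because the soil temperature is part of the control vector, the warm LST innovation updates the soil as well as the atmosphere, which is the mechanism behind the improved diurnal-cycle estimates reported above.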
  • 7
    In: Quarterly Journal of the Royal Meteorological Society, 145(723), pp. 2335-2365
    Publication Date: 2019-09-24
    Description: Particle filters contain the promise of fully nonlinear data assimilation. They have been applied in numerous science areas, including the geosciences, but their application to high-dimensional geoscience systems has been limited by their inefficiency in such systems in standard settings. However, huge progress has been made, and this limitation is disappearing fast owing to recent developments in proposal densities, ideas from (optimal) transportation, localization, and intelligent adaptive resampling strategies. Furthermore, powerful hybrids between particle filters, ensemble Kalman filters, and variational methods have been developed. We present a state-of-the-art discussion of current efforts to develop particle filters for high-dimensional nonlinear geoscience state-estimation problems, with an emphasis on atmospheric and oceanic applications; the discussion includes many new ideas, derivations and unifications, highlights hidden connections, provides pseudo-code, and is intended as a tool and guide for the community. Initial experiments show that particle filters can be competitive with present-day methods for numerical weather prediction, suggesting that they will become mainstream soon.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, isiRev
    Format: application/pdf
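The bootstrap particle filter, the simplest member of the family this review discusses, can be written in a few lines for a scalar autoregressive model. This is an illustrative textbook sketch with multinomial resampling, not one of the high-dimensional variants the article develops; the model and all parameter values are assumed:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_pf(y, n=1000, a=0.9, q=1.0, r=0.5):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    x = rng.normal(0.0, 1.0, n)                      # initial particle cloud
    means = []
    for yt in y:
        x = a * x + rng.normal(0.0, np.sqrt(q), n)   # propose from the model (prior)
        logw = -0.5 * (yt - x) ** 2 / r              # Gaussian log-likelihood weights
        w = np.exp(logw - logw.max())
        w /= w.sum()                                 # normalised importance weights
        means.append(w @ x)                          # weighted posterior-mean estimate
        x = x[rng.choice(n, n, p=w)]                 # multinomial resampling
    return np.array(means)

# Synthetic truth and observations drawn from the same model
t_steps = 200
x_true = np.zeros(t_steps)
for t in range(1, t_steps):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0.0, 1.0)
y = x_true + rng.normal(0.0, np.sqrt(0.5), t_steps)

est = bootstrap_pf(y)
# The filtered estimate should track the truth more closely than the raw observations.
print(np.mean((est - x_true) ** 2), np.mean((y - x_true) ** 2))
```

The inefficiency in high dimensions mentioned in the abstract shows up here as weight degeneracy: with many observations per step, a single particle tends to capture almost all the weight, which is what proposal densities, localization, and adaptive resampling are designed to mitigate.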