GLORIA

GEOMAR Library Ocean Research Information Access



  • 1
    Keywords: Forschungsbericht (research report)
    Type of Medium: Online Resource
    Pages: Online resource (PDF file: 22 pp., 661 KB), graphical illustrations
    Series Statement: Technische Berichte des Instituts für Informatik, Bericht Nr. 1101 (January 2011)
    Language: English
    Note: System requirements: Acrobat Reader.
  • 2
    Electronic Resource
    Springer
    Archiv der Mathematik 54 (1990), pp. 65-72
    ISSN: 1420-8938
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
  • 3
    Publication Date: 2020-02-06
    Description: Background: For single-cell or metagenomic sequencing projects, it is necessary to sequence with a very high mean coverage in order to make sure that all parts of the sample DNA get covered by the reads produced. This leads to huge datasets with lots of redundant data. Filtering this data prior to assembly is advisable. Brown et al. (2012) presented the algorithm Diginorm for this purpose, which filters reads based on the abundance of their k-mers. Methods: We present Bignorm, a faster and quality-conscious read filtering algorithm. An important new algorithmic feature is the use of phred quality scores together with a detailed analysis of the k-mer counts to decide which reads to keep. Results: We qualify and recommend parameters for our new read filtering algorithm. Guided by these parameters, we remove a median of 97.15% of the reads while keeping the mean phred score of the filtered dataset high. Using the SPAdes assembler, we produce assemblies of high quality from these filtered datasets in a fraction of the time needed for an assembly from the datasets filtered with Diginorm. Conclusions: We conclude that read filtering is a practical and efficient method for reducing read data and for speeding up the assembly process. This applies not only to single-cell assembly, as shown in this paper, but also to other projects with high mean coverage datasets, such as metagenomic sequencing projects. Our Bignorm algorithm allows assemblies of competitive quality in comparison to Diginorm, while being much faster. Bignorm is available for download at https://git.informatik.uni-kiel.de/axw/Bignorm.
    Type: Article , PeerReviewed
    Format: text
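The abundance-based filtering idea behind Diginorm, which Bignorm refines, can be sketched in a few lines. This is a minimal illustration, not the published implementation: it accepts a read only while the median count of its k-mers over the reads accepted so far is below a cutoff, it omits Bignorm's phred-score analysis, and the function names and toy parameters (k = 4, cutoff = 3) are invented for the example.

```python
from collections import Counter

def kmers(read, k):
    """Yield every length-k substring (k-mer) of a read."""
    for i in range(len(read) - k + 1):
        yield read[i:i + k]

def diginorm_filter(reads, k=4, cutoff=3):
    """Digital normalization: accept a read only if the median count of
    its k-mers among the reads accepted so far is below the cutoff, so
    redundant high-coverage reads are discarded."""
    counts = Counter()
    kept = []
    for read in reads:
        ks = list(kmers(read, k))
        if not ks:
            continue
        median = sorted(counts[km] for km in ks)[len(ks) // 2]
        if median < cutoff:
            kept.append(read)
            counts.update(ks)
    return kept

# Ten identical reads collapse to three kept copies; the rare read survives.
reads = ["ACGTACGT"] * 10 + ["TTTTGGGG"]
kept = diginorm_filter(reads)
```

The point of the sketch is that redundant coverage is dropped while low-abundance reads, which may carry unique information, are retained.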
  • 4
    Copernicus Publications (EGU)
    In: Geoscientific Model Development, 11 (3), pp. 1181-1198.
    Publication Date: 2021-02-08
    Description: Biogeochemical models, capturing the major feedbacks of the pelagic ecosystem of the world ocean, are today often embedded into Earth system models which are increasingly used for decision making regarding climate policies. These models contain poorly constrained parameters (e.g., maximum phytoplankton growth rate), which are typically adjusted until the model shows reasonable behavior. Systematic approaches determine these parameters by minimizing the misfit between the model and observational data. In most common model approaches, however, the underlying functions mimicking the biogeochemical processes are nonlinear and non-convex. Thus, systematic optimization algorithms are likely to get trapped in local minima and might lead to non-optimal results. To judge the quality of an obtained parameter estimate, we propose determining a preferably large lower bound for the global optimum that is relatively easy to obtain and that will help to assess the quality of an optimum generated by an optimization algorithm. Due to the unavoidable noise component in all observations, such a lower bound is typically larger than zero. We suggest deriving such lower bounds based on typical properties of biogeochemical models (e.g., a limited number of extremes and a bounded time derivative). We illustrate the applicability of the method with two real-world examples. The first example uses real-world observations of the Baltic Sea in a box model setup. The second example considers a three-dimensional coupled ocean circulation model in combination with satellite chlorophyll a data.
    Type: Article , PeerReviewed
    Format: text
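The noise argument for a lower bound larger than zero can be made concrete with a short numerical sketch (illustrative only: the linear "model", noise level and least-squares misfit below are invented for the example). With observations y_i = f(t_i) + e_i and e_i drawn from N(0, sigma^2), even the true model leaves an expected least-squares misfit of about n * sigma^2, so no optimizer can do systematically better than that bound.

```python
import random

random.seed(42)

n, sigma = 1000, 0.5
truth = [0.1 * t for t in range(n)]                    # "perfect" model output
obs = [y + random.gauss(0.0, sigma) for y in truth]    # noisy observations

# Even evaluating the misfit at the true model leaves the noise behind,
# so roughly n * sigma**2 is a lower bound for any parameter choice.
misfit = sum((o - y) ** 2 for o, y in zip(obs, truth))
lower_bound = n * sigma ** 2
```

Any optimization run whose final misfit sits close to such a bound can be judged near-optimal, which is the use the abstract proposes.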
  • 5
    Copernicus Publications (EGU)
    In: Geoscientific Model Development, 10, pp. 127-154.
    Publication Date: 2020-02-06
    Description: Global biogeochemical ocean models contain a variety of different biogeochemical components and often much simplified representations of complex dynamical interactions, which are described by many (≈10–≈100) parameters. The values of many of these parameters are empirically difficult to constrain, because in the models they represent processes for a range of different groups of organisms at the same time, while even for single species parameter values are often difficult to determine in situ. Therefore, these models are subject to a high level of parametric uncertainty. This may be of consequence for their skill with respect to accurately describing the relevant features of the present ocean, as well as their sensitivity to possible environmental changes. We here present a framework for the calibration of global biogeochemical ocean models on short and long time scales. The framework combines an offline approach for the transport of biogeochemical tracers with an Estimation of Distribution Algorithm (the Covariance Matrix Adaptation Evolution Strategy, CMA-ES). We explore the performance and capability of this framework through five different optimizations of six biogeochemical parameters of a global biogeochemical model. First, a twin experiment explores the feasibility of this approach. Four optimizations against a climatology of observations of annual mean dissolved nutrients and oxygen determine the extent to which different setups of the optimization influence the model's fit and parameter estimates. Because the misfit function applied focuses on the large-scale distribution of inorganic biogeochemical tracers, parameters that act on large spatial and temporal scales are determined earliest, and with the least spread. Parameters more closely tied to surface biology, which act on shorter time scales, are more difficult to determine. In particular, the search for optimum zooplankton parameters can benefit from sound knowledge of maximum and minimum parameter values, leading to a more efficient optimization. It is encouraging that, although the misfit function does not contain any direct information about biogeochemical turnover, the optimized models nevertheless provide a better fit to observed global biogeochemical fluxes.
    Type: Article , PeerReviewed
    Format: text
    Format: archive
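The sample-and-select loop that CMA-ES builds on can be illustrated with a stripped-down (mu, lambda) evolution strategy. This is a sketch, not CMA-ES itself: the toy misfit function, the fixed step-size decay and all parameter values are invented for the example, whereas CMA-ES additionally adapts a full covariance matrix and the step size from the search path.

```python
import random

random.seed(1)

def misfit(params):
    """Toy stand-in for a model-versus-observation misfit function."""
    target = [0.5, -1.0, 2.0]
    return sum((p - t) ** 2 for p, t in zip(params, target))

def evolution_strategy(dim=3, mu=5, lam=20, sigma=0.5, generations=200):
    """Minimal (mu, lam) evolution strategy: sample lam candidates around
    the current mean, recombine the mu best into the new mean, and shrink
    the step size a little each generation."""
    mean = [0.0] * dim
    for _ in range(generations):
        offspring = [[m + random.gauss(0.0, sigma) for m in mean]
                     for _ in range(lam)]
        offspring.sort(key=misfit)
        mean = [sum(ind[i] for ind in offspring[:mu]) / mu
                for i in range(dim)]
        sigma *= 0.98  # crude fixed decay instead of real adaptation
    return mean

estimate = evolution_strategy()
```

Because only misfit rankings are used, the same loop works for the non-convex, derivative-free objectives that arise when calibrating biogeochemical parameters.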
  • 6
    Copernicus Publications (EGU)
    In: Atmospheric Measurement Techniques, 3 (3), pp. 557-567.
    Publication Date: 2019-09-23
    Description: The recent, rapid development of whole-sky imagers enables sky observations with high temporal and spatial resolution. One application already widely performed is the estimation of fractional sky cover. A distinction between different cloud types, however, is still work in progress. Here, an automatic cloud classification algorithm is presented, based on a set of mainly statistical features describing the color as well as the texture of an image. The k-nearest-neighbour classifier is used due to its high performance in solving complex issues, simplicity of implementation and low computational complexity. Seven different sky conditions are distinguished: high thin clouds (cirrus and cirrostratus), high patched cumuliform clouds (cirrocumulus and altocumulus), stratocumulus clouds, low cumuliform clouds, thick clouds (cumulonimbus and nimbostratus), stratiform clouds and clear sky. Based on leave-one-out cross-validation, the algorithm achieves an accuracy of about 97%. In addition, a test run on random images is presented, still outperforming previous algorithms by yielding a success rate of about 75%, or up to 88% if only "serious" errors with respect to radiation impact are considered. Reasons for the decrease in accuracy are discussed, and ideas to further improve the classification results, especially in problematic cases, are investigated.
    Type: Article , PeerReviewed
    Format: text
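The classifier at the core of this approach is simple enough to sketch. Illustrative only: the two-dimensional "colour/texture" features and the two labels below are made up, whereas the paper uses a larger set of statistical image features and seven sky conditions.

```python
from collections import Counter

def knn_classify(train, query, k=3):
    """Label the query point with the majority vote among its k nearest
    training points (squared Euclidean distance)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: sqdist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented two-dimensional feature vectors for two of the seven classes.
train = [((0.90, 0.10), "clear sky"),
         ((0.80, 0.20), "clear sky"),
         ((0.20, 0.70), "stratiform"),
         ((0.30, 0.80), "stratiform"),
         ((0.25, 0.75), "stratiform")]
label = knn_classify(train, (0.28, 0.72))
```

No training phase beyond storing the labelled examples is needed, which is the simplicity and low computational cost the abstract cites.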
  • 7
    Publication Date: 2023-11-03
    Description: We present a new quantum-inspired evolutionary algorithm, the attractor population QEA (apQEA). Our benchmark problem is a classical and difficult problem from Combinatorics, namely finding low-discrepancy colorings in the hypergraph of arithmetic progressions on the first n integers, which is a massive hypergraph (e.g., with approx. 3.88 × 10^11 hyperedges for n = 250 000). Its optimal low-discrepancy coloring bound is known and it has been a long-standing open problem to give practically and/or theoretically efficient algorithms. We show that apQEA outperforms known QEA approaches and the classical combinatorial algorithm (Sárközy 1974) by a large margin. Regarding practicability, it is also far superior to the SDP-based polynomial-time algorithm of Bansal (2010), the latter being a breakthrough work from a theoretical point of view. Thus we give the first practical algorithm to construct optimal colorings in this hypergraph, up to a constant factor. We hope that our work will spur further applications of Algorithm Engineering to Combinatorics.
    Type: Article , PeerReviewed
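The objective being optimized can be made concrete: the discrepancy of a ±1 coloring over the hypergraph of arithmetic progressions is the largest absolute colour imbalance attained on any progression. A brute-force evaluator (far too slow for n = 250 000, but fine for illustrating the definition on a small n; the alternating example is invented) might look like:

```python
def ap_discrepancy(coloring):
    """Discrepancy of a +/-1 coloring of {0, ..., n-1} over arithmetic
    progressions: the largest absolute colour imbalance on any progression
    a, a+d, a+2d, ... truncated at any point."""
    n = len(coloring)
    worst = 0
    for a in range(n):
        for d in range(1, n):
            s = 0
            for x in range(a, n, d):
                s += coloring[x]        # running imbalance along the AP
                worst = max(worst, abs(s))
    return worst

# The alternating coloring looks balanced, but every even step size
# yields monochromatic progressions, so its imbalance grows linearly.
n = 16
alternating = [(-1) ** i for i in range(n)]
disc = ap_discrepancy(alternating)
```

The known optimum grows only like n^(1/4), which is what makes finding a good coloring over so many hyperedges a hard search problem.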
  • 8
    Publication Date: 2023-11-03
    Description: We introduce and study the Travelling Salesman Problem with Multiple Time Windows and Hotel Selection (TSP-MTWHS), which generalises the well-known Travelling Salesman Problem with Time Windows and the recently introduced Travelling Salesman Problem with Hotel Selection. The TSP-MTWHS consists in determining a route for a salesman (e.g., an employee of a services company) who visits various customers at different locations and within different time windows. The salesman may require a several-day tour during which he may need to stay in hotels. The goal is to minimise the tour costs, consisting of wage, hotel costs, travelling expenses and penalty fees for possibly omitted customers. We present a mixed integer linear programming (MILP) model for this practical problem and a heuristic combining cheapest insert, 2-OPT and randomised restarting. We show on random instances and on real-world instances from industry that the MILP model can be solved to optimality in reasonable time with a standard MILP solver for several small instances. We also show that the heuristic gives the same solutions for most of the small instances and is fast, efficient and practical for large instances.
    Type: Article , PeerReviewed
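Of the heuristic's ingredients, 2-OPT is the easiest to sketch in isolation. This is a plain symmetric-TSP version on made-up points: the actual heuristic additionally handles time windows, hotel choices and penalty fees, and wraps moves like this in cheapest insertion and randomised restarts.

```python
import math

def tour_length(points, tour):
    """Total length of the closed tour over the given point indices."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """2-OPT local search: keep reversing a segment of the tour whenever
    the reversal shortens it, until no improving move remains."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour) + 1):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(points, candidate) < tour_length(points, tour) - 1e-12:
                    tour, improved = candidate, True
    return tour

# A crossing tour on the unit square is untangled into the perimeter.
points = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = two_opt(points, [0, 2, 1, 3])
```

Each accepted reversal removes two crossing edges and reconnects the tour, which is why 2-OPT reliably untangles routes produced by a greedy construction heuristic.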
  • 9
    Publication Date: 2023-11-03
    Description: Methods and results for parameter optimization and uncertainty analysis for a one-dimensional marine biogeochemical model of NPZD type are presented. The model, developed by Schartau and Oschlies, simulates the distribution of nitrogen, phytoplankton, zooplankton and detritus in a water column and is driven by ocean circulation data. Our aim is to identify parameters and fit the model output to given observational data. For this model, it has been shown that a satisfactory fit could not be obtained, and that parameters with comparable fits can vary significantly. Since these results were obtained by evolutionary algorithms (EA), we used a wider range of optimization methods: a special type of EA (called quantum-EA) with coordinate line search, and a quasi-Newton SQP method where exact gradients were generated by Automatic/Algorithmic Differentiation. Both methods are parallelized and can be viewed as instances of a hybrid, mixed evolutionary and deterministic optimization algorithm that we present in detail. This algorithm provides a flexible and robust tool for parameter identification and model validation. We show how the obtained parameters depend on data sparsity and given data error. We present an uncertainty analysis of the optimized parameters with respect to Gaussian-perturbed data. We show that the model is well suited for parameter identification if the data are attainable. On the other hand, the result that it cannot be fitted to the real observational data without extension or modification is confirmed.
    Type: Article , PeerReviewed
    Format: text
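The "exact gradients generated by Automatic/Algorithmic Differentiation" mentioned above can be illustrated with the classic dual-number construction. This is a minimal forward-mode sketch supporting only addition and multiplication; the toy polynomial stands in for a parameter-to-misfit mapping and is invented for the example.

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers: carrying
    a (value, derivative) pair through arithmetic yields exact derivatives,
    the kind used to drive a quasi-Newton SQP method."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__

def model_output(p):
    # Invented stand-in for a model evaluation depending on parameter p.
    return 3 * p * p + 2 * p + 1

x = Dual(2.0, 1.0)     # seed the derivative d/dp = 1 at p = 2
y = model_output(x)    # y.val is f(2), y.dot is f'(2) = 6*2 + 2
```

Unlike finite differences, the derivative comes out exact to machine precision, which matters for the convergence of gradient-based methods.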
  • 10
    Publication Date: 2023-11-03
    Description: Frequency-selective, linear FIR filters are considered, as single systems and within analysis-synthesis filter banks. In the single-channel case they are usually designed to fulfill tolerances in the Chebyshev sense, or, in near-perfect-reconstruction filter banks, to minimize a reconstruction-error measure. If hardware is limited, fixed-point coefficient quantization is needed. In general, it causes tolerance violations or a larger reconstruction error. Discrete re-optimization may help. A recent technique, also able to handle large filter orders, is successfully applied and newly extended to filter banks. Even better are randomized strategies, introduced and examined in the mathematical-optimization community over the past 15 years; in particular, randomized rounding is very effective. Good results are thereby found for both single-system and filter-bank designs. We further introduce a new random subset selection within the above re-optimization. Like randomized rounding, it allows a trade-off between computational effort and solution quality. Clear improvements over deterministic heuristics are obtained by both randomized algorithms.
    Type: Article , PeerReviewed
    Format: text
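Randomized rounding, the most effective of the strategies mentioned, can be sketched as follows. Illustrative only: a real design would score each trial by its frequency-response tolerance violation or reconstruction error, whereas this sketch scores by a plain coefficient-domain error, and the step size, trial count and coefficients are made up.

```python
import random

random.seed(7)

def randomized_round(coeffs, step=1 / 256, trials=200):
    """Randomized rounding to a fixed-point grid: each coefficient is
    rounded up with probability equal to its fractional part (in units
    of the step); the best of many independent trials is kept."""
    def quantize_once():
        out = []
        for c in coeffs:
            lo = (c // step) * step      # grid point at or below c
            frac = (c - lo) / step       # fractional position in [0, 1)
            out.append(lo + step if random.random() < frac else lo)
        return out

    return min((quantize_once() for _ in range(trials)),
               key=lambda q: sum(abs(a - b) for a, b in zip(q, coeffs)))

coeffs = [0.1234, -0.5678, 0.9012]
q = randomized_round(coeffs)
```

Raising the trial count trades computation for solution quality, which is exactly the trade-off the abstract attributes to both randomized algorithms.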