GLORIA

GEOMAR Library Ocean Research Information Access

Export
  • 1
    Online Resource
    Forschungszentrum Jülich, Zentralbibliothek ; 2019
    In: Collective Dynamics, Forschungszentrum Jülich, Zentralbibliothek, Vol. 4 (2019-09-03)
    Abstract: Pedestrian dynamics is an interdisciplinary field of research. Psychologists, sociologists, traffic engineers, physicists, mathematicians and computer scientists all strive to understand the dynamics of a moving crowd. In principle, computer simulations offer means to further this understanding. Yet, unlike for many classic dynamical systems in physics, there is no universally accepted locomotion model for crowd dynamics. On the contrary, a multitude of approaches, with very different characteristics, compete. Often only the experts in one special model type are able to assess the consequences these characteristics have on a simulation study. Therefore, scientists from all disciplines who wish to use simulations to analyze pedestrian dynamics need a tool to compare competing approaches. Developers, too, would profit from an easy way to get insight into an alternative modeling ansatz. Vadere meets this interdisciplinary demand by offering an open-source simulation framework that is lightweight in its approach and in its user interface while offering pre-implemented versions of the most widely spread models.
    Type of Medium: Online Resource
    ISSN: 2366-8539
    Language: Unknown
    Publisher: Forschungszentrum Jülich, Zentralbibliothek
    Publication Date: 2019
    ZDB ID: 2854776-7
  • 2
    Online Resource
    Forschungszentrum Jülich, Zentralbibliothek ; 2020
    In: Collective Dynamics, Forschungszentrum Jülich, Zentralbibliothek, Vol. 5 (2020-03-27)
    Abstract: In most agent-based simulators, pedestrians navigate from origins to destinations. Consequently, destinations are essential input parameters to the simulation. While many other relevant parameters, such as positions, speeds and densities, can be obtained from sensors like cameras, destinations cannot be observed directly. Our research question is: can we obtain this information from video data using machine learning methods? We use density heatmaps, which indicate the pedestrian density within a given camera cutout, as input to predict the destination distributions. For our proof of concept, we train a Random Forest predictor on an exemplary data set generated with the Vadere microscopic simulator. The scenario is a crossroad where pedestrians can head left, straight or right. In addition, we gain first insights into suitable placement of the camera. The results motivate an in-depth analysis of the methodology.
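    (A hedged code sketch of this heatmap-to-destination prediction appears after the result list below.)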
    Type of Medium: Online Resource
    ISSN: 2366-8539
    Language: Unknown
    Publisher: Forschungszentrum Jülich, Zentralbibliothek
    Publication Date: 2020
    ZDB ID: 2854776-7
  • 3
    Online Resource
    IOP Publishing ; 2022
    In: Journal of Statistical Mechanics: Theory and Experiment, IOP Publishing, Vol. 2022, No. 5 (2022-05-01), p. 053401-
    Abstract: Knowing the origins and destinations of pedestrians’ paths is key to the initialization of crowd simulations. Unfortunately, they are difficult to measure in the real world. This is one major challenge for live predictions during events such as festivals, soccer games, protest marches, and many others. Sensor data can be used to feed real-world observations into simulations in real time. As input data for this study, we use density heatmaps generated from real-world trajectory data obtained from stereo sensors. Density information is compact, of constant size, and in general easier to obtain than, e.g., individual trajectories. Therefore, the information limitation improves the applicability to other scenarios. We include the absolute pedestrian trip counts from origins to destinations during a brief time interval in an OD matrix, including unknown destinations due to sensor errors. Our goal is to estimate these OD matrices from a series of density heatmaps for the same interval. For this, we compute the ground-truth OD matrices and density heatmaps using real-world trajectory data from a train station. We employ linear regression as a statistical learning method for estimation. We observe that the linear share of the relationship between density and OD matrix is estimated successfully. Nevertheless, a portion of the data remains that cannot be explained. We attempt to overcome this difficulty with random forest as a nonlinear model. The results indicate that both a linear and a nonlinear model can estimate some features of the OD matrices. However, there is no clear winner in terms of the chosen metric, the R² score. Overall, our findings are a strong indicator that OD matrices can indeed be estimated from density heatmaps extracted automatically from sensors.
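    (A hedged code sketch of this linear regression vs. random forest comparison appears after the result list below.)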
    Type of Medium: Online Resource
    ISSN: 1742-5468
    Language: Unknown
    Publisher: IOP Publishing
    Publication Date: 2022
    ZDB ID: 2138944-5
  • 4
    Online Resource
    Wiley ; 2019
    In: ChemViews, Wiley (2019)
    Type of Medium: Online Resource
    ISSN: 2190-3735
    Language: Unknown
    Publisher: Wiley
    Publication Date: 2019
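
A minimal, illustrative sketch of the destination-prediction approach described in result 2. This is not the authors' code: the data below are random placeholders (the paper's data set was generated with the Vadere simulator), and the heatmap size, the three destinations (left, straight, right), and all variable names are assumptions made only for illustration.

```python
# Illustrative sketch only: predict destination distributions from
# flattened density heatmaps with a Random Forest (scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: each sample is a flattened density heatmap (assumed 16x16
# grid of pedestrian densities in the camera cutout); the target is the share
# of pedestrians heading left, straight, or right.
n_samples, grid = 500, 16
X = rng.random((n_samples, grid * grid))        # density heatmaps
y = rng.dirichlet(np.ones(3), size=n_samples)   # destination distributions

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)                     # multi-output regression

print("test R^2:", model.score(X_test, y_test))
```

Because the placeholder targets are random, the reported score is meaningless here; with real simulator or camera data, it would indicate how much of the destination distribution the heatmaps explain.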
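A minimal sketch of the comparison described in result 3: estimating flattened origin-destination (OD) matrices from density heatmaps with a linear and a nonlinear model, both judged by the R² score. The array shapes, the synthetic Poisson counts, and the extra "unknown" destination column are assumptions for illustration; the study itself used real trajectory data from a train station.

```python
# Illustrative sketch only: linear regression vs. random forest for
# estimating OD-matrix entries from density heatmaps (scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Assumed setup: per brief time interval, one density heatmap (64 cells) and
# one flattened OD matrix (4 origins x 5 destinations, the 5th column standing
# in for destinations left unknown by sensor errors).
n_intervals, n_cells, n_od = 400, 64, 4 * 5
X = rng.random((n_intervals, n_cells))                        # density heatmaps
y = rng.poisson(5.0, size=(n_intervals, n_od)).astype(float)  # OD trip counts

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

linear = LinearRegression().fit(X_train, y_train)
forest = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_train, y_train)

print("linear R^2:", r2_score(y_test, linear.predict(X_test)))
print("forest R^2:", r2_score(y_test, forest.predict(X_test)))
```

On these random placeholders both scores are uninformative; the sketch only shows how the two estimators and the R² comparison fit together.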