Keywords:
Information theory -- Statistical methods;
Electronic books.
Description / Table of Contents:
Advances in Info-Metrics expands the study of info-metrics - a framework for modeling, reasoning, and drawing inferences under conditions of insufficient information - across disciplines. This volume explores the mathematical and philosophical foundations of information-theoretic inference and demonstrates how to solve problems using new cross-disciplinary case studies and examples.
Type of Medium:
Online Resource
Pages:
1 online resource (557 pages)
Edition:
1st ed.
ISBN:
9780190636715
URL:
https://ebookcentral.proquest.com/lib/geomar/detail.action?docID=6380730
DDC:
003.54
Language:
English
Note:
Cover -- Advances in Info-Metrics: Information and Information Processing across Disciplines -- Copyright -- Contents -- Preface -- The Info-Metrics Institute -- Acknowledgments -- Contributor List -- PART I: INFORMATION, MEANING, AND VALUE -- 1: Information and Its Value -- 1. Background and Basic Questions -- 2. Information -- 2.1 Definition and Types -- 2.2 Processing, Manipulating, and Converting to Knowledge -- 2.3 Observer and Receiver -- 3. The Value of Information -- 3.1 Utility and Value -- 3.2 Prices and Value -- 3.3 Hedonic Values: The Factors that Determine the Value -- 3.4 Digital Information and Value -- 3.5 Absolute versus Relative Value -- 3.6 Prices, Quantities, Probabilities, and Value -- 3.7 A Comment on Ethics versus Value -- 4. Risk, Decisions, Prices, and Value -- 5. The Value of Disinformation -- 6. Concluding Thoughts -- Acknowledgments -- References -- 2: A Computational Theory of Meaning -- 1. Introduction -- 1.1 Chapter Overview -- 1.2 Learning, Data Compression, and Model Selection -- 1.3 Some Examples -- 2. From Computation to Models -- 2.1 Two-Part Code Optimization -- 2.2 Reasons to Doubt the Validity of Two-Part Code Optimization -- 2.3 There Are Many Different Models That Could Be Selected as Optimal by Two-Part Code Optimization -- 2.4 Two-Part Code Optimization Selects Models That Have No Useful Stochastic Interpretation -- 2.5 Empirical Justification for Two-Part Code Optimization -- 3. Meaning as Computation -- 4. From a Single Computing Agent to Communities of Interactive Learning Agents -- 4.1 Semantics for Turing Machines -- 4.2 Descriptions of Turing Machines -- 4.3 Names for Turing Machines -- 4.4 Turing Frames -- 4.5 Variance of Turing Frames -- 4.6 Meaningful Information -- 4.7 Types of Agents -- 5. Discussion -- Acknowledgments -- References -- PART II: INFORMATION THEORY AND BEHAVIOR.
3: Inferring the Logic of Collective Information Processors -- 1. Introduction -- 2. First Task: Infer Individual-to-Aggregate Mapping -- 2.1 Inference Challenge 1: An Abundance of Potentially Relevant Detail-Solved by Large-Scale Reverse Engineering -- 2.2 Inference Challenge 2: Structural Uncertainty Due to Limited Data-Solved by Hierarchical Model Selection and Regularization -- 2.2.1 Maximum Entropy Modeling -- 2.2.2 Dynamical Inference -- 2.3 Inference Challenge 3: Parameter Uncertainty Due to Scale Separation and Sloppiness-Solved by Bayes and Not Focusing on Individual Parameters -- 3. Second Task: Find Abstract System Logic -- 3.0.1 Why Do We Want to Do This? Advantages of Coarse-Grained, Dimensionality-Reduced Description -- 3.0.2 Do We Expect to Be Able to Compress? What Does "Logic" Look Like? -- 3.1 Logic Approach 1: Emergent Grouped Logic Processors: Clustering, Modularity, Sparse Coding, and Motifs -- 3.2 Logic Approach 2: Instability, Bifurcations, and Criticality -- 3.2.1 Fisher Information and Criticality -- 3.2.2 Dynamical Systems and Bifurcations -- 3.3 Logic Approach 3: Explicit Model Reduction -- 4. The Future of the Science of Living Systems -- Acknowledgments -- References -- 4: Information-Theoretic Perspective on Human Ability -- 1. Introduction -- 2. Information-Theoretic Perspective on Human Ability -- 2.1 The Maximum Entropy Principle and Information Capacity -- 2.1.1 Neoclassical Models and Bounded Rationality -- 2.1.2 Information Acquisition -- 2.1.3 Information Processing -- 2.1.4 Discerning Incorrect Information -- 2.2 Rational Inattention Theory and Extensions -- 2.3 Information Capacity and the Big Five Personality Traits -- 3. An Empirical Study on Information Capacity -- 3.1 Data -- 3.2 Information Acquisition and Processing -- 3.3 Accumulation of Wealth and the Big Five Personality Traits.
4. Conclusion -- Acknowledgments -- References -- 5: Information Recovery Related to Adaptive Economic Behavior and Choice -- 1. Introduction -- 2. Causal Entropy Maximization -- 3. An Information Recovery Framework -- 3.1 Examples of Two Information-Theoretic Behavioral Models -- 3.2 Convex Entropic Divergences -- 4. Further Examples-Applications -- 4.1 A Stochastic State-Space Framework -- 4.2 Network Behavior Recovery -- 4.3 Unlocking the Dynamic Content of Time Series -- 5. Summary Comments -- References -- PART III: INFO-METRICS AND THEORY CONSTRUCTION -- 6: Maximum Entropy: A Foundation for a Unified Theory of Ecology -- 1. Introduction -- 2. Ecological Theory -- 2.1 The Ecologist's Dilemma -- 2.2 Nonmechanistic Ecological Theory -- 2.3 The Logic of Inference -- 3. The Maximum Entropy Theory of Ecology: Basics and a Simple Model Realization -- 4. Failures of the Static ASNE Model of METE -- 4.1 Energy Equivalence -- 4.2 METE Fails in Rapidly Changing Systems -- 5. Hybrid Vigor in Ecological Theory -- 5.1 DynaMETE: A Natural Extension of the Static Theory -- 6. The Ultimate Goal -- Appendix: Some Epistemological Considerations -- References -- 7: Entropic Dynamics: Mechanics without Mechanism -- 1. Introduction -- 2. The Statistical Model -- 2.1 Choosing the Prior -- 2.2 The Constraints -- 3. Entropic Time -- 3.1 Time as an Ordered Sequence of Instants -- 3.2 The Arrow of Entropic Time -- 3.3 Duration: A Convenient Time Scale -- 4. The Information Metric of Configuration Space -- 5. Diffusive Dynamics -- 6. Hamiltonian Dynamics -- 6.1 The Ensemble Hamiltonian -- 6.2 The Action -- 7. Information Geometry and the Quantum Potential -- 8. The Schrödinger Equation -- 9. Some Final Comments -- 9.1 Is ED Equivalent to Quantum Mechanics? -- 9.2 Is ED a Hidden-Variable Model? -- 9.3 On Interpretation -- Acknowledgments -- References.
PART IV: INFO-METRICS IN ACTION I: PREDICTION AND FORECASTS -- 8: Toward Deciphering of Cancer Imbalances: Using Information-Theoretic Surprisal Analysis for Understanding of Cancer Systems -- 1. Background -- 2. Information-Theoretic Approaches in Biology -- 3. Theory of Surprisal Analysis -- 4. Using Surprisal Analysis to Understand Intertumor Heterogeneity -- 4.1 A Thermodynamic-Based Interpretation of Protein Expression Heterogeneity in Glioblastoma Multiforme Tumors Identifies Tumor-Specific Unbalanced Processes -- 5. Toward Understanding Intratumor Heterogeneity -- 6. Using Surprisal Analysis to Predict a Direction of Change in Biological Processes -- 7. Summary -- References -- 9: Forecasting Socioeconomic Distributions on Small-Area Spatial Domains for Count Data -- 1. Introduction -- 2. Spatial Perspectives for Areal Data -- 2.1 Spatial Dependence -- 2.2 Spatial Heterogeneity -- 2.3 The Role of the Map -- 3. Spatial Models for Count Data -- 4. Information-Theoretic Methods for Spatial Count Data Models: GCE Area Level Estimators -- 5. Simulation Experiments -- 5.1 Scenario 1: GCE in a Spatial Homogeneous Process -- 5.2 Scenario 2: GCE in a Spatial Heterogeneous Process -- 5.3 Scenario 3: GCE in a Process of Spatial Dependence -- 6. An Empirical Application: Estimating Unemployment Levels of Immigrants at Municipal Scale in Madrid, 2011 -- 7. Conclusions -- References -- 10: Performance and Risk Aversion of Funds with Benchmarks: A Large Deviations Approach -- 1. Introduction -- 2. An Index of Outperformance Probability -- 2.1 Entropic Interpretation -- 2.2 Time-Varying Gaussian Log Returns -- 2.3 Familiar Performance Measures as Approximations -- 3. Nonparametric Estimation of the Performance Measure -- 3.1 Empirical Results -- 3.1.1 Fund Performance -- 4. Outperformance Probability Maximization as a Fund Manager Behavioral Hypothesis.
4.1 Scientific Principles for Evaluating Managerial Behavioral Hypotheses -- 5. Conclusions -- Acknowledgments -- References -- 11: Estimating Macroeconomic Uncertainty and Discord: Using Info-Metrics -- 1. Introduction -- 2. The Data -- 3. Aggregate Uncertainty, Aggregate Variance and Their Components -- 3.1 Fitting Continuous Distributions to Histogram Forecasts -- 3.2 Uncertainty Decomposition-Decomposing Aggregate Variance -- 3.3 Estimation of Variance -- 3.4 Correcting for Bin Size for Variance Decomposition -- 3.5 Variance Decomposition Results -- 4. Uncertainty and Information Measures -- 5. Time Series of Uncertainty Measures -- 6. Information Measure and "News" -- 7. Impact of Real Output Uncertainty on Macroeconomic Variables -- 8. Summary and Conclusions -- References -- 12: Reduced Perplexity: A Simplified Perspective on Assessing Probabilistic Forecasts -- 1. Introduction -- 2. Probability, Perplexity, and Entropy -- 3. Relationship between the Generalized Entropy and the Generalized Mean -- 4. Assessing Probabilistic Forecasts Using a Risk Profile -- 5. Discussion and Conclusion -- Appendix: Modeling Risk as a Coupling of Statistical States -- Acknowledgments -- References -- PART V: INFO-METRICS IN ACTION II: STATISTICAL AND ECONOMETRICS INFERENCE -- 13: Info-metric Methods for the Estimation of Models with Group-Specific Moment Conditions -- 1. Introduction -- 2. GMM and IM/GEL -- 3. The Pseudo-Panel Data Approach to Estimation Based on Repeated Cross-Sectional Data -- 4. IM Estimation with Group-Specific Moment Conditions -- 5. Statistical Properties and Inference -- 5.1 Simulation Study -- 5.2 Empirical Example -- 6. Concluding Remarks -- References -- 14: Generalized Empirical Likelihood-Based Kernel Estimation of Spatially Similar Densities -- 1. Introduction -- 2. Weighted Kernel Density Estimation.
3. Spatially Smoothed Moment Constraints.