Edition:
Editors: Min Chen, J. Michael Dunn, Amos Golan, Aman Ullah
Series:
ISBN: 0190636688, 9780190636685
Publisher: OUP USA
Year: 2021
Pages: 552 [557]
Language: English
File format: PDF (can be converted to EPUB or AZW3 at the user's request)
File size: 17 MB
To have the file of Advances in Info-Metrics: Information and Information Processing across Disciplines converted to PDF, EPUB, AZW3, MOBI, or DJVU, notify the support team and they will convert the file for you.
Note that Advances in Info-Metrics: Information and Information Processing across Disciplines is in its original language; it is not a Persian translation. The International Library website offers original-language books only and does not carry any books translated into or written in Persian.
Info-metrics is a framework for modeling, reasoning, and drawing inferences under conditions of noisy and insufficient information. It is an interdisciplinary framework situated at the intersection of information theory, statistical inference, and decision-making under uncertainty. In Advances in Info-Metrics, Min Chen, J. Michael Dunn, Amos Golan, and Aman Ullah bring together a group of thirty experts to expand the study of info-metrics across the sciences and demonstrate how to solve problems using this interdisciplinary framework. Building on the theoretical underpinnings of info-metrics, the volume sheds new light on statistical inference, information, and general problem solving. The book explores the basis of information-theoretic inference and its mathematical and philosophical foundations. It emphasizes the interrelationship between information and inference and includes explanations of model building, theory creation, estimation, prediction, and decision making. Each of the nineteen chapters provides the necessary tools for using the info-metrics framework to solve a problem. The collection covers recent developments in the field, as well as many new cross-disciplinary case studies and examples. Designed to be accessible for researchers, graduate students, and practitioners across disciplines, this book provides a clear, hands-on experience for readers interested in solving problems when presented with incomplete and imperfect information.
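At the heart of that framework is maximum entropy estimation under moment constraints. As a flavor of the machinery (an illustrative sketch only, not code from the book; the function name and the numerical bracket are assumptions made here), the following Python snippet solves Jaynes' classic dice problem: recovering a full probability distribution over six die faces given nothing but the observed mean.

```python
import numpy as np

def maxent_die(target_mean, faces=np.arange(1, 7)):
    """Maximum entropy distribution over die faces given only the mean.

    Illustrative sketch: the solution has the exponential-family form
    p_i proportional to exp(-lam * i); the Lagrange multiplier lam is
    found by bisection, since the mean implied by lam is strictly
    decreasing in lam.
    """
    def implied(lam):
        z = -lam * faces
        z = z - z.max()          # stabilize the exponentials
        p = np.exp(z)
        return p / p.sum()

    lo, hi = -50.0, 50.0         # generous (assumed) bracket for lam
    for _ in range(200):         # bisection well past machine precision
        mid = 0.5 * (lo + hi)
        if (faces * implied(mid)).sum() > target_mean:
            lo = mid             # implied mean too high: need larger lam
        else:
            hi = mid
    return implied(0.5 * (lo + hi))

p = maxent_die(4.5)
print(np.round(p, 4))                                 # tilted toward high faces
print(round(float((np.arange(1, 7) * p).sum()), 3))   # approx. 4.5
```

With the constraint set to 3.5, maxent_die returns the uniform distribution, the maximum entropy answer when the data impose nothing beyond normalization; at 4.5 the recovered distribution tilts exponentially toward the high faces while exactly matching the observed mean.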
Cover
Advances in Info-Metrics: Information and Information Processing across Disciplines
Copyright
Contents
Preface
The Info-Metrics Institute
Acknowledgments
Contributor List

PART I: INFORMATION, MEANING, AND VALUE

1: Information and Its Value
  1. Background and Basic Questions
  2. Information
    2.1 Definition and Types
    2.2 Processing, Manipulating, and Converting to Knowledge
    2.3 Observer and Receiver
  3. The Value of Information
    3.1 Utility and Value
    3.2 Prices and Value
    3.3 Hedonic Values: The Factors that Determine the Value
    3.4 Digital Information and Value
    3.5 Absolute versus Relative Value
    3.6 Prices, Quantities, Probabilities, and Value
    3.7 A Comment on Ethics versus Value
  4. Risk, Decisions, Prices, and Value
  5. The Value of Disinformation
  6. Concluding Thoughts
  Acknowledgments
  References

2: A Computational Theory of Meaning
  1. Introduction
    1.1 Chapter Overview
    1.2 Learning, Data Compression, and Model Selection
    1.3 Some Examples
  2. From Computation to Models
    2.1 Two-Part Code Optimization
    2.2 Reasons to Doubt the Validity of Two-Part Code Optimization
    2.3 There Are Many Different Models That Could Be Selected as Optimal by Two-Part Code Optimization
    2.4 Two-Part Code Optimization Selects Models That Have No Useful Stochastic Interpretation
    2.5 Empirical Justification for Two-Part Code Optimization
  3. Meaning as Computation
  4. From a Single Computing Agent to Communities of Interactive Learning Agents
    4.1 Semantics for Turing Machines
    4.2 Descriptions of Turing Machines
    4.3 Names for Turing Machines
    4.4 Turing Frames
    4.5 Variance of Turing Frames
    4.6 Meaningful Information
    4.7 Types of Agents
  5. Discussion
  Acknowledgments
  References

PART II: INFORMATION THEORY AND BEHAVIOR

3: Inferring the Logic of Collective Information Processors
  1. Introduction
  2. First Task: Infer Individual-to-Aggregate Mapping
    2.1 Inference Challenge 1: An Abundance of Potentially Relevant Detail—Solved by Large-Scale Reverse Engineering
    2.2 Inference Challenge 2: Structural Uncertainty Due to Limited Data—Solved by Hierarchical Model Selection and Regularization
      2.2.1 Maximum Entropy Modeling
      2.2.2 Dynamical Inference
    2.3 Inference Challenge 3: Parameter Uncertainty Due to Scale Separation and Sloppiness—Solved by Bayes and Not Focusing on Individual Parameters
  3. Second Task: Find Abstract System Logic
    3.0.1 Why Do We Want to Do This? Advantages of Coarse-Grained, Dimensionality-Reduced Description
    3.0.2 Do We Expect to Be Able to Compress? What Does "Logic" Look Like?
    3.1 Logic Approach 1: Emergent Grouped Logic Processors: Clustering, Modularity, Sparse Coding, and Motifs
    3.2 Logic Approach 2: Instability, Bifurcations, and Criticality
      3.2.1 Fisher Information and Criticality
      3.2.2 Dynamical Systems and Bifurcations
    3.3 Logic Approach 3: Explicit Model Reduction
  4. The Future of the Science of Living Systems
  Acknowledgments
  References

4: Information-Theoretic Perspective on Human Ability
  1. Introduction
  2. Information-Theoretic Perspective on Human Ability
    2.1 The Maximum Entropy Principle and Information Capacity
      2.1.1 Neoclassical Models and Bounded Rationality
      2.1.2 Information Acquisition
      2.1.3 Information Processing
      2.1.4 Discerning Incorrect Information
    2.2 Rational Inattention Theory and Extensions
    2.3 Information Capacity and the Big Five Personality Traits
  3. An Empirical Study on Information Capacity
    3.1 Data
    3.2 Information Acquisition and Processing
    3.3 Accumulation of Wealth and the Big Five Personality Traits
  4. Conclusion
  Acknowledgments
  References

5: Information Recovery Related to Adaptive Economic Behavior and Choice
  1. Introduction
  2. Causal Entropy Maximization
  3. An Information Recovery Framework
    3.1 Examples of Two Information-Theoretic Behavioral Models
    3.2 Convex Entropic Divergences
  4. Further Examples-Applications
    4.1 A Stochastic State-Space Framework
    4.2 Network Behavior Recovery
    4.3 Unlocking the Dynamic Content of Time Series
  5. Summary Comments
  References

PART III: INFO-METRICS AND THEORY CONSTRUCTION

6: Maximum Entropy: A Foundation for a Unified Theory of Ecology
  1. Introduction
  2. Ecological Theory
    2.1 The Ecologist's Dilemma
    2.2 Nonmechanistic Ecological Theory
    2.3 The Logic of Inference
  3. The Maximum Entropy Theory of Ecology: Basics and a Simple Model Realization
  4. Failures of the Static ASNE Model of METE
    4.1 Energy Equivalence
    4.2 METE Fails in Rapidly Changing Systems
  5. Hybrid Vigor in Ecological Theory
    5.1 DynaMETE: A Natural Extension of the Static Theory
  6. The Ultimate Goal
  Appendix: Some Epistemological Considerations
  References

7: Entropic Dynamics: Mechanics without Mechanism
  1. Introduction
  2. The Statistical Model
    2.1 Choosing the Prior
    2.2 The Constraints
  3. Entropic Time
    3.1 Time as an Ordered Sequence of Instants
    3.2 The Arrow of Entropic Time
    3.3 Duration: A Convenient Time Scale
  4. The Information Metric of Configuration Space
  5. Diffusive Dynamics
  6. Hamiltonian Dynamics
    6.1 The Ensemble Hamiltonian
    6.2 The Action
  7. Information Geometry and the Quantum Potential
  8. The Schrödinger Equation
  9. Some Final Comments
    9.1 Is ED Equivalent to Quantum Mechanics?
    9.2 Is ED a Hidden-Variable Model?
    9.3 On Interpretation
  Acknowledgments
  References

PART IV: INFO-METRICS IN ACTION I: PREDICTION AND FORECASTS

8: Toward Deciphering of Cancer Imbalances: Using Information-Theoretic Surprisal Analysis for Understanding of Cancer Systems
  1. Background
  2. Information-Theoretic Approaches in Biology
  3. Theory of Surprisal Analysis
  4. Using Surprisal Analysis to Understand Intertumor Heterogeneity
    4.1 A Thermodynamic-Based Interpretation of Protein Expression Heterogeneity in Glioblastoma Multiforme Tumors Identifies Tumor-Specific Unbalanced Processes
  5. Toward Understanding Intratumor Heterogeneity
  6. Using Surprisal Analysis to Predict a Direction of Change in Biological Processes
  7. Summary
  References

9: Forecasting Socioeconomic Distributions on Small-Area Spatial Domains for Count Data
  1. Introduction
  2. Spatial Perspectives for Areal Data
    2.1 Spatial Dependence
    2.2 Spatial Heterogeneity
    2.3 The Role of the Map
  3. Spatial Models for Count Data
  4. Information-Theoretic Methods for Spatial Count Data Models: GCE Area Level Estimators
  5. Simulation Experiments
    5.1 Scenario 1: GCE in a Spatial Homogeneous Process
    5.2 Scenario 2: GCE in a Spatial Heterogeneous Process
    5.3 Scenario 3: GCE in a Process of Spatial Dependence
  6. An Empirical Application: Estimating Unemployment Levels of Immigrants at Municipal Scale in Madrid, 2011
  7. Conclusions
  References

10: Performance and Risk Aversion of Funds with Benchmarks: A Large Deviations Approach
  1. Introduction
  2. An Index of Outperformance Probability
    2.1 Entropic Interpretation
    2.2 Time-Varying Gaussian Log Returns
    2.3 Familiar Performance Measures as Approximations
  3. Nonparametric Estimation of the Performance Measure
    3.1 Empirical Results
      3.1.1 Fund Performance
  4. Outperformance Probability Maximization as a Fund Manager Behavioral Hypothesis
    4.1 Scientific Principles for Evaluating Managerial Behavioral Hypotheses
  5. Conclusions
  Acknowledgments
  References

11: Estimating Macroeconomic Uncertainty and Discord: Using Info-Metrics
  1. Introduction
  2. The Data
  3. Aggregate Uncertainty, Aggregate Variance and Their Components
    3.1 Fitting Continuous Distributions to Histogram Forecasts
    3.2 Uncertainty Decomposition—Decomposing Aggregate Variance
    3.3 Estimation of Variance
    3.4 Correcting for Bin Size for Variance Decomposition
    3.5 Variance Decomposition Results
  4. Uncertainty and Information Measures
  5. Time Series of Uncertainty Measures
  6. Information Measure and "News"
  7. Impact of Real Output Uncertainty on Macroeconomic Variables
  8. Summary and Conclusions
  References

12: Reduced Perplexity: A Simplified Perspective on Assessing Probabilistic Forecasts
  1. Introduction
  2. Probability, Perplexity, and Entropy
  3. Relationship between the Generalized Entropy and the Generalized Mean
  4. Assessing Probabilistic Forecasts Using a Risk Profile
  5. Discussion and Conclusion
  Appendix: Modeling Risk as a Coupling of Statistical States
  Acknowledgments
  References

PART V: INFO-METRICS IN ACTION II: STATISTICAL AND ECONOMETRICS INFERENCE

13: Info-metric Methods for the Estimation of Models with Group-Specific Moment Conditions
  1. Introduction
  2. GMM and IM/GEL
  3. The Pseudo-Panel Data Approach to Estimation Based on Repeated Cross-Sectional Data
  4. IM Estimation with Group-Specific Moment Conditions
  5. Statistical Properties and Inference
    5.1 Simulation Study
    5.2 Empirical Example
  6. Concluding Remarks
  References

14: Generalized Empirical Likelihood-Based Kernel Estimation of Spatially Similar Densities
  1. Introduction
  2. Weighted Kernel Density Estimation
  3. Spatially Smoothed Moment Constraints
  4. Monte Carlo Simulations
  5. Empirical Application
  6. Concluding Remarks
  References

15: Rényi Divergence and Monte Carlo Integration
  1. Introduction
  2. Rényi Divergence
    2.1 Definition and Properties
    2.2 Power Concentration
  3. Acceptance and Importance Sampling
    3.1 Acceptance Sampling
    3.2 Importance Sampling
    3.3 Approximation of Rényi Divergence
      3.3.1 Direct or Acceptance Sampling
      3.3.2 Importance Sampling
  4. Sequential Monte Carlo
    4.1 The Algorithm
    4.2 Bayesian Inference with Data Tempering
    4.3 Bayesian Inference with Power Tempering
    4.4 Optimization with Power Tempering
  5. Illustrations
    5.1 Context
      5.1.1 Time Series Model
    5.2 Data Tempering
    5.3 Power Tempering
  6. Appendix: Proofs
    6.1 Proof of Proposition 2
    6.2 Proof of Proposition 3
    6.3 Proof of Proposition 4
  References

PART VI: INFO-METRICS, DATA INTELLIGENCE, AND VISUAL COMPUTING

16: Cost-Benefit Analysis of Data Intelligence—Its Broader Interpretations
  1. Introduction
  2. The Cost-Benefit Metric for Data Intelligence
    2.1 Processes and Transformations
    2.2 Alphabets and Letters
    2.3 Cost-Benefit Analysis
  3. Relating the Metric to Encryption and Compression
  4. Relating the Metric to Model Development
  5. Relating the Metric to Perception and Cognition
  6. Relating the Metric to Languages and News Media
  7. Conclusions
  References

17: The Role of the Information Channel in Visual Computing
  1. Introduction
  2. Information Measures and Information Channel
    2.1 Basic Information-Theoretic Measures
    2.2 Information Channel
    2.3 Jensen–Shannon Divergence and Information Bottleneck Method
  3. The Viewpoint Selection Channel
    3.1 Viewpoint Entropy and Mutual Information
    3.2 Results
  4. Image Information Channel
  5. Scene Visibility Channel
    5.1 The Radiosity Method and Form Factors
    5.2 Discrete Scene Visibility Channel
    5.3 The Refinement Criterion
  6. Conclusions
  References

PART VII: INFO-METRICS AND NONPARAMETRIC INFERENCE

18: Entropy-Based Model Averaging Estimation of Nonparametric Models
  1. Introduction
  2. Entropy-Based Model Averaging
    2.1 Averaging for Linear Models
    2.2 Averaging for Nonparametric Models
  3. Simulation
    3.1 Linear Models
    3.2 Nonlinear Models
  4. An Empirical Example
  5. Conclusion
  Acknowledgments
  References

19: Information-Theoretic Estimation of Econometric Functions
  1. Introduction
  2. Estimation of Distribution, Regression, and Response Functions
    2.1 Maximum Entropy Distribution Estimation: Bivariate and Marginal
    2.2 Regression and Response Functions
    2.3 Recursive Integration
  3. Simulation and Empirical Examples
    3.1 Data-Generating Process 1: Nonlinear Function
    3.2 Data-Generating Process 2: Linear Function
    3.3 Empirical Study: Canadian High School Graduate Earnings
  4. Asymptotic Properties of IT Estimators and Test for Normality
    4.1 Asymptotic Normality
    4.2 Testing for Normality
  5. Conclusions
  Appendix A: Calculations
    A.1. Recursive Integration
    A.2. Finite Integral Range
    A.3. Empirical Study: An Illustration of Calculations
  Appendix B: Asymptotic Properties of IT Estimators
    B.1. Proof of Proposition 1 and (Eq. [19.24])
    B.2. Asymptotic Normality of Maximum Entropy Joint Density, Regression Function, and Response Function
  Acknowledgments
  References

Index