Edition:
Authors: Arieh Ben-Naim
Series:
ISBN: 3031212754, 9783031212758
Publisher: Springer
Year of publication: 2023
Number of pages: 241 [242]
Language: English
File format: PDF
File size: 8 MB
Note: Information Theory and Selected Applications is the original English edition; no Persian translation is offered. The file is supplied as PDF and can be converted to EPUB, AZW3, MOBI, or DJVU on request.
This book focuses on analysing the applications of the Shannon Measure of Information (SMI). The book introduces the concept of frustration and discusses the question of the quantification of this concept within information theory (IT), while it also focuses on the interpretation of the entropy of systems of interacting particles in terms of the SMI and of mutual information. The author examines the question of the possibility of measuring the extent of frustration using mutual information and discusses some classical examples of processes of mixing and assimilation for which the entropy changes are interpreted in terms of SMI. A description of a few binding systems and the interpretation of cooperativity phenomena in terms of mutual information are also presented, along with a detailed discussion on the general method of using maximum SMI in order to find the “best-guess” probability distribution. This book is a valuable contribution to the field of information theory and will be of great interest to any scientist who is interested in IT and in its potential applications.
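The quantities at the heart of the description above can be made concrete with a small sketch. The following Python snippet (illustrative only, not code from the book) computes the SMI of a distribution, the mutual information of two random variables, and a maximum-SMI "best-guess" distribution under a mean constraint in the spirit of Chapter 6; the die values and the target mean of 4.5 are assumed for the example.

```python
from math import exp, log2

def smi(p):
    """Shannon Measure of Information in bits: H = -sum_i p_i * log2(p_i)."""
    return -sum(x * log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a nested list joint[i][j]."""
    px = [sum(row) for row in joint]         # marginal of X
    py = [sum(col) for col in zip(*joint)]   # marginal of Y
    pxy = [v for row in joint for v in row]  # flattened joint
    return smi(px) + smi(py) - smi(pxy)

def best_guess(values, target_mean, lo=-50.0, hi=50.0):
    """Maximum-SMI ("best-guess") distribution over `values` subject to a
    prescribed mean: p_i is proportional to exp(lam * v_i), with lam found
    by bisection so that the mean constraint is satisfied."""
    def mean_at(lam):
        w = [exp(lam * v) for v in values]
        return sum(v * wi for v, wi in zip(values, w)) / sum(w)
    for _ in range(200):  # mean_at is increasing in lam, so bisect
        mid = (lo + hi) / 2
        if mean_at(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    w = [exp(((lo + hi) / 2) * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Two independent fair coins share no information ...
indep = [[0.25, 0.25], [0.25, 0.25]]
# ... while two perfectly correlated coins share exactly 1 bit.
corr = [[0.5, 0.0], [0.0, 0.5]]

# Best-guess distribution for an unfair six-face die whose only known
# property is its mean, here taken to be 4.5 (an assumed value).
p = best_guess([1, 2, 3, 4, 5, 6], 4.5)
```

With no constraint beyond normalization the best guess reduces to the uniform distribution; the mean constraint tilts it exponentially toward the larger faces, which is the "best-guess" logic the book applies to micelle-size distributions.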
Preface
Contents
Abbreviations

1 Introduction and Caveats
1.1 A Bit of Information About the Bit in Information Theory and the Binary Digit
1.2 Misinterpretation of Probability as SMI and SMI as Probability
1.3 SMI, in General, Is Not Entropy. Entropy Is a Special Case of SMI
1.4 The "Vennity" of Using Venn Diagrams in Representing Dependence Between Random Variables
1.4.1 The Case of Two Coins with Magnets
1.4.2 The Case of Two Regions on a Board at Which a Dart Is Thrown
1.5 The Frustrating Search for a Measure of Frustration
1.5.1 Three Coins with Magnets
1.5.2 Three Regions on a Board
1.5.3 A Caveat to the Caveat on Frustration
1.6 Levels of Confusion: Information, SMI and Bit
1.7 Information May Be Either Subjective or Objective. The SMI (as Well as Entropy) Is Always an Objective Concept
Appendix 1: Venn Diagram for a Pair of Events, and for a Pair of Random Variables
Appendix 2: The Monty Hall and the Equivalent Three Prisoners' Problem
  The Three Prisoners' Problem
  Two Solutions to the Three Prisoners' Problem
  A More General, but Easier to Solve, Problem
References

2 Intermolecular Interactions, Correlations, and Mutual Information
2.1 Introduction
2.2 The General Expression for the SMI of Interacting Particles
2.2.1 First Step: The Locational SMI of a Particle in a 1D Box of Length L
2.2.2 Second Step: The Velocity SMI of a Particle in a 1D "Box" of Length L
2.2.3 Third Step: Combining the SMI for the Location and Momentum of a Particle in a 1D System. Addition of a Correction Due to Uncertainty
2.2.4 The SMI of One Particle in a Box of Volume V
2.2.5 Fourth Step: The SMI of Locations and Momenta of N Independent Particles in a Box of Volume V. Adding a Correction Due to Indistinguishability of the Particles
2.2.6 The Entropy of a System of Interacting Particles. Correlations Due to Intermolecular Interactions
2.3 The SMI of a System of Interacting Particles in Pairs Only
2.4 Entropy Change in Phase Transitions
2.4.1 Solid–Gas Transition
2.4.2 Liquid–Gas Transition
2.4.3 Solid–Liquid Transition
2.5 Liquid Water
2.6 Aqueous Solutions of Inert Gases
2.7 Entropy and Mutual Information in One-Dimensional Fluids
2.7.1 The General Expression for the Entropy of a 1D Fluid
2.7.2 The General Behavior of the Probability Density Pr(r)
2.7.3 The Entropy Change Due to Turning On the Interaction Energy
2.7.4 Conclusion
Appendix 1: Solvation Entropy of a Solute as a Difference in SMI
References

3 Application of Multivariate Mutual Information to Study Spin Systems
3.1 Definition of Multivariate MI Based on Total Correlations
3.2 Definition of Multivariate MI Based on Conditional Information
3.3 Relationship Between the Conditional MI and the Various SMI
3.4 The Formal Connection Between the TI and CI
3.5 Reinterpretation of the CI in Terms of MIs
3.6 Generalization to Any N Random Variables
3.7 Some Properties of the Multivariate MI
3.8 A Three-Spin System
3.8.1 Probabilities
3.8.2 SMI and Conditional SMI
3.8.3 The Various Mutual Information for Three-Spin Systems
3.8.4 The Three-Spin System with an External Field
3.8.5 The Three-Spin System with Different Geometries
3.9 Systems with Four Interacting Spins
3.9.1 Four-Spin Systems; Perfect Square
3.9.2 The Parallelogram Arrangement
3.9.3 The Rectangular Configurations
3.10 Five-Spin Systems
3.10.1 Pentagon with One Additional Interaction
3.10.2 Pentagon with Two Additional Interactions
3.10.3 Pentagon with All Pair Interactions
3.11 Six-Spin Systems
References

4 Entropy of Mixing and Entropy of Assimilation, an Informational-Theoretical Approach
4.1 "Entropy of Mixing" of Two Different Ideal Gases
4.2 Entropy of Assimilation
4.3 Is There a Pure De-assimilation Process?
4.4 Racemization as a Pure De-assimilation Process
4.5 An Example of the Entropy Formulation of the Second Law
4.6 An Example of the Gibbs-Energy Formulation of the Second Law
4.7 A Baffling Experiment in Systems of Interacting Particles
4.8 Communal SMI and Communal Entropy
References

5 Information Transmission Between Molecules in Binding Systems
5.1 The Method of Generating Probabilities
5.2 Adsorbing on a Single-Site Molecule
5.3 Two-Site Systems
5.3.1 Direct Communication via Ligand–Ligand Interaction
5.3.2 Indirect Communication via Conformational Changes in the Adsorbent
5.3.3 Indirect Communication Due to Conformational Changes in Each Subunit
5.4 Three-Site Systems
5.4.1 Direct Interaction Between Three Identical Sites
5.4.2 Direct Pair and Triplet Interactions
5.4.3 Direct, But Different, Pair Interactions
5.5 Four-Site Systems with Direct Interactions
5.5.1 Multivariate Mutual Information (MI)
5.5.2 A Perfect Square with Equal Nearest-Neighbor Interactions
5.5.3 A Perfect Square with Unequal Pair Interactions
5.5.4 Parallelogram with Five Equal Pair Interactions
5.5.5 Tetrahedral Arrangement; Direct Interaction Between All Pairs of Sites
5.6 Four-Site Systems with Indirect Interactions Only
5.6.1 Binding Isotherms
5.6.2 The Various SMI in the Three Arrangements
5.6.3 Total Mutual Information
References

6 Calculations of the "Best-Guess" Probability Distribution Using Shannon's Measure of Information
6.1 The Probability Distribution of an Unfair Coin
6.2 A Six-Face Die with Three Different Numbers of Dots: 1, 2 and 3
6.3 Probability Distribution of an Unfair Regular Six-Face Die
6.4 Calculation of an Approximate Micelle-Size Distribution Using Experimental Data
6.5 Computation of the Monomer Concentration
6.6 Calculation of the Moments of the Micelle-Size Distribution
6.7 Computation of the "Best-Guess" MSD
6.8 Conclusion
References

Index