Edition:
Authors: Akinori Tanaka, Akio Tomiya, Koji Hashimoto
Series: Mathematical Physics Studies
ISBN: 9789813361072, 9789813361089
Publisher: Springer
Publication year: 2021
Number of pages: 211
Language: English
File format: PDF (converted to PDF, EPUB, or AZW3 on user request)
File size: 7 MB
To have the file of the book Deep Learning and Physics converted to PDF, EPUB, AZW3, MOBI, or DJVU format, you can notify the support team, who will convert the requested file.
Please note that Deep Learning and Physics is in its original language and is not a Persian translation. The International Library website provides books in their original language only and does not offer books translated into or written in Persian.
Preface
Acknowledgments
Contents
1 Forewords: Machine Learning and Physics
  1.1 Introduction to Information Theory
  1.2 Physics and Information Theory
  1.3 Machine Learning and Information Theory
  1.4 Machine Learning and Physics
Part I Physical View of Deep Learning
2 Introduction to Machine Learning
  2.1 The Purpose of Machine Learning
    2.1.1 Mathematical Formulation of Data
  2.2 Machine Learning and Occam's Razor
    2.2.1 Generalization
  2.3 Stochastic Gradient Descent Method
  Column: Probability Theory and Information Theory
    Joint and Conditional Probabilities
    Relative Entropy
3 Basics of Neural Networks
  3.1 Error Function from Statistical Mechanics
    3.1.1 From Hamiltonian to Neural Network
    3.1.2 Deep Neural Network
  3.2 Derivation of Backpropagation Method Using Bracket Notation
  3.3 Universal Approximation Theorem of Neural Network
  Column: Statistical Mechanics and Quantum Mechanics
    Canonical Distribution in Statistical Mechanics
    Bracket Notation in Quantum Mechanics
4 Advanced Neural Networks
  4.1 Convolutional Neural Network
    4.1.1 Convolution
    4.1.2 Transposed Convolution
  4.2 Recurrent Neural Network and Backpropagation
  4.3 LSTM
  Column: Edge of Chaos and Emergence of Computability
    Sorting Algorithm Implementation Using Recurrent Neural Network
    KdV Equation and Box-Ball System
    Critical State and Turing Completeness of One-Dimensional Cellular Automata
5 Sampling
  5.1 Central Limit Theorem and Its Role in Machine Learning
  5.2 Various Sampling Methods
    5.2.1 Inverse Transform Sampling
    5.2.2 Rejection Sampling
    5.2.3 Markov Chain
    5.2.4 Master Equation and the Principle of Detailed Balance
    5.2.5 Expectation Value Calculation Using Markov Chains, and Importance Sampling
  5.3 Sampling Method with the Detailed Balance
    5.3.1 Metropolis Method
    5.3.2 Heatbath Method
  Column: From Ising Model to Hopfield Model
6 Unsupervised Deep Learning
  6.1 Unsupervised Learning
  6.2 Boltzmann Machine
    6.2.1 Restricted Boltzmann Machine
  6.3 Generative Adversarial Network
    6.3.1 Energy-Based GAN
    6.3.2 Wasserstein GAN
  6.4 Generalization in Generative Models
  Column: Self-Learning Monte Carlo Method
Part II Applications to Physics
7 Inverse Problems in Physics
  7.1 Inverse Problems and Learning
  7.2 Regularization in Inverse Problems
  7.3 Inverse Problems and Physical Machine Learning
  Column: Sparse Modeling
8 Detection of Phase Transition by Machines
  8.1 What Is Phase Transition?
  8.2 Detecting Phase Transition by a Neural Network
  8.3 What the Neural Network Sees
9 Dynamical Systems and Neural Networks
  9.1 Differential Equations and Neural Networks
  9.2 Representation of Hamiltonian Dynamical System
10 Spinglass and Neural Networks
  10.1 Hopfield Model and Spinglass
  10.2 Memory and Attractor
  10.3 Synchronization and Layering
11 Quantum Manybody Systems, Tensor Networks and Neural Networks
  11.1 Neural Network Wave Function
  11.2 Tensor Networks and Neural Networks
    11.2.1 Tensor Network
    11.2.2 Tensor Network Representation of Restricted Boltzmann Machines
12 Application to Superstring Theory
  12.1 Inverse Problems in String Theory
    12.1.1 Compactification as an Inverse Problem
    12.1.2 The Holographic Principle as an Inverse Problem
  12.2 Curved Spacetime Is a Neural Network
    12.2.1 Neural Network Representation of Field Theory in Curved Spacetime
    12.2.2 How to Choose Input/Output Data
  12.3 Emergent Spacetime on Neural Networks
    12.3.1 Is AdS Black Hole Spacetime Learned?
    12.3.2 Emergent Spacetime from Material Data
  12.4 Spacetime Emerging from QCD
  Column: Black Holes and Information
13 Epilogue
  13.1 Neural Network, Physics and Technological Innovation (Akio Tomiya)
  13.2 Why Does Intelligence Exist? (Akinori Tanaka)
  13.3 Why Do Physical Laws Exist? (Koji Hashimoto)
Bibliography
Index