Edition: 1
Author: Alexandre Bergel
ISBN: 1484253833, 9781484253830
Publisher: Apress
Publication year: 2020
Number of pages: 394
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 8 MB
If you would like the file of Agile Artificial Intelligence in Pharo: Implementing Neural Networks, Genetic Algorithms, and Neuroevolution converted to PDF, EPUB, AZW3, MOBI, or DJVU, contact support and they will convert the file for you.
Please note that Agile Artificial Intelligence in Pharo: Implementing Neural Networks, Genetic Algorithms, and Neuroevolution is the original English-language edition, not a Persian translation. The International Library website offers original-language books only and does not provide books translated into or written in Persian.
Table of Contents

About the Author
About the Technical Reviewer
Acknowledgments
Introduction

Part I: Neural Networks

Chapter 1: The Perceptron Model
1.1 Perceptron as a Kind of Neuron
1.2 Implementing the Perceptron
1.3 Testing the Code
1.4 Formulating Logical Expressions
1.5 Handling Errors
1.6 Combining Perceptrons
1.7 Training a Perceptron
1.8 Drawing Graphs
1.9 Predicting and 2D Points
1.10 Measuring the Precision
1.11 Historical Perspective
1.12 Exercises
1.13 What Have We Seen in This Chapter?
1.14 Further Reading About Pharo

Chapter 2: The Artificial Neuron
2.1 Limit of the Perceptron
2.2 Activation Function
2.3 The Sigmoid Neuron
2.4 Implementing the Activation Functions
2.5 Extending the Neuron with the Activation Functions
2.6 Adapting the Existing Tests
2.7 Testing the Sigmoid Neuron
2.8 Slower to Learn
2.9 What Have We Seen in This Chapter?

Chapter 3: Neural Networks
3.1 General Architecture
3.2 Neural Layer
3.3 Modeling a Neural Network
3.4 Backpropagation
3.4.1 Step 1: Forward Feeding
3.4.2 Step 2: Error Backward Propagation
3.4.3 Step 3: Updating Neuron Parameters
3.5 What Have We Seen in This Chapter?

Chapter 4: Theory on Learning
4.1 Loss Function
4.2 Gradient Descent
4.3 Parameter Update
4.4 Gradient Descent in Our Implementation
4.5 Stochastic Gradient Descent
4.6 The Derivative of the Sigmoid Function
4.7 What Have We Seen in This Chapter?
4.8 Further Reading

Chapter 5: Data Classification
5.1 Training a Network
5.2 Neural Network as a Hashmap
5.3 Visualizing the Error and the Topology
5.4 Contradictory Data
5.5 Classifying Data and One-Hot Encoding
5.6 The Iris Dataset
5.7 Training a Network with the Iris Dataset
5.8 The Effect of the Learning Curve
5.9 Testing and Validation
5.10 Normalization
5.11 Integrating Normalization into the NNetwork Class
5.12 What Have We Seen in This Chapter?

Chapter 6: A Matrix Library
6.1 Matrix Operations in C
6.2 The Matrix Class
6.3 Creating the Unit Test
6.4 Accessing and Modifying the Content of a Matrix
6.5 Summing Matrices
6.6 Printing a Matrix
6.7 Expressing Vectors
6.8 Factors
6.9 Dividing a Matrix by a Factor
6.10 Matrix Product
6.11 Matrix Subtraction
6.12 Filling the Matrix with Random Numbers
6.13 Summing the Matrix Values
6.14 Transposing a Matrix
6.15 Example
6.16 What Have We Seen in This Chapter?

Chapter 7: Matrix-Based Neural Networks
7.1 Defining a Matrix-Based Layer
7.2 Defining a Matrix-Based Neural Network
7.3 Visualizing the Results
7.4 Iris Flower Dataset
7.5 What Have We Seen in This Chapter?

Part II: Genetic Algorithms

Chapter 8: Genetic Algorithms
8.1 Algorithms Inspired from Natural Evolution
8.2 Example of a Genetic Algorithm
8.3 Relevant Vocabulary
8.4 Modeling Individuals
8.5 Crossover Genetic Operations
8.6 Mutation Genetic Operations
8.7 Parent Selection
8.8 Evolution Monitoring
8.9 The Genetic Algorithm Engine
8.10 Terminating the Algorithm
8.11 Testing the Algorithm
8.12 Visualizing Population Evolution
8.13 What Have We Seen in This Chapter?

Chapter 9: Genetic Algorithms in Action
9.1 Fundamental Theorem of Arithmetic
9.2 The Knapsack Problem
9.2.1 The Unbounded Knapsack Problem Variant
9.2.2 The 0-1 Knapsack Problem Variant
9.2.3 Coding and Encoding
9.3 Meeting Room Scheduling Problem
9.4 Mini Sodoku
9.5 What Have We Seen in This Chapter?

Chapter 10: The Traveling Salesman Problem
10.1 Illustration of the Problem
10.2 Relevance of the Traveling Salesman Problem
10.3 Naive Approach
10.4 Adequate Genetic Operations
10.5 The Swap Mutation Operation
10.6 The Ordered Crossover Operation
10.7 Revisiting Our Large Example
10.8 What Have We Seen in This Chapter?

Chapter 11: Exiting a Maze
11.1 Encoding the Robot’s Behavior
11.2 Robot Definition
11.3 Map Definition
11.4 Example
11.5 What Have We Seen in This Chapter?

Chapter 12: Building Zoomorphic Creatures
12.1 Modeling Join Points
12.2 Modeling Platforms
12.3 Defining Muscles
12.4 Generating Muscles
12.5 Defining the Creature
12.6 Creating Creatures
12.6.1 Serialization and Materialization of a Creature
12.6.2 Accessors and Utility Methods
12.7 Defining the World
12.8 Cold Run
12.9 What Have We Seen in This Chapter?

Chapter 13: Evolving Zoomorphic Creatures
13.1 Interrupting a Process
13.2 Monitoring the Execution Time
13.3 The Competing Conventions Problem
13.4 The Constrained Crossover Operation
13.5 Moving Forward
13.6 Serializing the Muscle Attributes
13.7 Passing Obstacles
13.8 Climbing Stairs
13.9 What Have We Seen in This Chapter?

Part III: Neuroevolution

Chapter 14: Neuroevolution
14.1 Supervised, Unsupervised Learning, and Reinforcement Learning
14.2 Neuroevolution
14.3 Two Neuroevolution Techniques
14.4 The NeuroGenetic Approach
14.5 Extending the Neural Network
14.6 NeuroGenetic by Example
14.7 The Iris Dataset
14.8 Further Reading About NeuroGenetic
14.9 What Have We Seen in This Chapter?

Chapter 15: Neuroevolution with NEAT
15.1 Vocabulary
15.2 The Node Class
15.3 Different Kinds of Nodes
15.4 Connections
15.5 The Individual Class
15.6 Species
15.7 Speciation
15.8 The Crossover Operation
15.9 Abstract Definition of Mutation
15.10 Structural Mutation Operations
15.10.1 Adding a Connection
15.10.2 Adding a Node
15.11 Non-Structural Mutation Operation
15.12 Logging
15.13 NEAT
15.14 Visualization
15.15 The XOR Example
15.16 The Iris Example
15.17 What Have We Seen in This Chapter?

Chapter 16: The MiniMario Video Game
16.1 Character Definition
16.2 Modeling Mario
16.3 Modeling an Artificial Mario Player
16.4 Modeling Monsters
16.5 Modeling the MiniMario World
16.6 Building the Game’s Visuals
16.7 Running MiniMario
16.8 NEAT and MiniMario
16.9 What Have We Seen in This Chapter?

Afterword
Last Words
Index