Edition:
Authors: Saad Subair (editor), Christopher Thron (editor)
Series: Studies in Computational Intelligence, 782
ISBN: 3030378292, 9783030378295
Publisher: Springer
Publication year: 2020
Number of pages: 288
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 9 MB
If the author is Iranian, the book cannot be downloaded and the payment will be refunded.
To have the file of Implementations and Applications of Machine Learning (Studies in Computational Intelligence, 782) converted to PDF, EPUB, AZW3, MOBI, or DJVU, you can notify support and the file will be converted for you.
Note that this is the original-language edition of Implementations and Applications of Machine Learning (Studies in Computational Intelligence, 782), not a Persian translation. The International Library website provides original-language books only and does not offer books translated into or written in Persian.
This book provides step-by-step explanations of
successful implementations and practical applications of
machine learning. The book’s GitHub page contains software
code to assist readers in adapting materials and methods for
their own use. A wide variety of applications are discussed,
including wireless mesh network and power systems
optimization; computer vision; image and facial recognition;
protein prediction; data mining; and data discovery. Numerous
state-of-the-art machine learning techniques are employed
(with detailed explanations), including biologically-inspired
optimization (genetic and other evolutionary algorithms,
swarm intelligence); Viola–Jones face detection; Gaussian
mixture modeling; support vector machines; deep convolutional
neural networks with performance enhancement techniques
(including network design, learning rate optimization, data
augmentation, transfer learning); spiking neural networks and
timing-dependent plasticity; frequent itemset mining; binary
classification; and dynamic programming. This book provides
valuable information on effective, cutting-edge techniques
and approaches for students, researchers, practitioners, and
teachers in the field of machine learning.
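
As a flavor of the techniques listed above, the following minimal sketch (in Python with NumPy; it is an illustration only and is not taken from the book or its GitHub repository) shows the integral-image trick that Viola–Jones face detection relies on to evaluate Haar-like rectangle features with just four array lookups per rectangle.

```python
# Illustrative sketch only (not the book's code): the integral-image trick used by
# Viola-Jones face detection to evaluate Haar-like rectangle features in constant time.
import numpy as np

def integral_image(img):
    # Cumulative sums over rows and then columns, padded with a leading zero
    # row/column so that ii[r, c] equals the sum of img[:r, :c].
    ii = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return np.pad(ii, ((1, 0), (1, 0)), mode="constant")

def rect_sum(ii, top, left, height, width):
    # Sum of the pixels in a height-by-width rectangle whose top-left corner is
    # at (top, left), recovered from four integral-image lookups.
    return (ii[top + height, left + width] - ii[top, left + width]
            - ii[top + height, left] + ii[top, left])

# Example: a two-rectangle (horizontal edge) Haar-like feature on a random 24x24 patch.
patch = np.random.rand(24, 24)
ii = integral_image(patch)
feature = rect_sum(ii, 0, 0, 12, 24) - rect_sum(ii, 12, 0, 12, 24)
print(feature)
```

Because each rectangle sum costs the same regardless of its size, a detector can evaluate thousands of candidate Haar-like features per window, which is what makes an AdaBoost rejection cascade practical.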
Preface
Reference
Contents

Parallel 3-Parent Genetic Algorithm with Application to Routing in Wireless Mesh Networks
1 Introduction 2 P3PGA Algorithm 3 Simulated Performance, Results, and Discussion 4 P3PGA for Minimal Cost Route Evaluation 5 Implementation and Performance of the Proposed Approach 5.1 Comparative Performance of 100 Node Client WMNs 5.2 Comparative Performance of 500 Node Client WMNs 5.3 Comparative Performance of 1000 Node Client WMNs 5.4 Comparative Performance of 2000 Node Client WMNs 5.5 Comparative Performance of 2500 Node Client WMNs 5.6 Overall Performance Considering all Networks 6 Conclusions References

Application of Evolutionary Algorithms to Power System Stabilizer Design
1 Introduction 1.1 Oscillations in Electrical Power Systems and Power Systems Stabilizers 1.2 Algorithms for Parameter Optimization: Differential Evolution and Population-Based Incremental Learning 2 Problem Statement 2.1 Overview 2.2 State-Space Representation 2.3 Linearization 2.4 Modal Analysis 3 The Differential Evolution Algorithm 3.1 Overview 3.2 Detailed DE Algorithm Description 3.2.1 Population Structure 3.2.2 Initialization 3.2.3 Mutation 3.2.4 Crossover 3.2.5 Selection 3.2.6 Termination 3.3 Self-Adaptive DE Algorithms 4 Population-Based Incremental Learning (PBIL) 4.1 Overview 4.2 Binary Encoding, Probability Vector, and Population 4.3 Mutation 4.4 Learning Process 4.5 Termination 5 Application of DE and PBIL to PSS Design 5.1 Overview 5.2 System Configurations 5.3 Single Machine Infinite Bus System: Results of Optimization 5.4 Two-Area Multimachine System: Results of Optimization 5.5 Sensitivity of Differential Evolution to Algorithm Control Parameters 5.5.1 Effects of F and CR Parameters on DE Convergence 5.5.2 Effect of Population Size 5.6 Application of Adaptive DE to PSS Design 5.7 Performance Summary 6 Chapter Summary References

Automatic Sign Language Manual Parameter Recognition (I): Survey
1 Background and Motivation 2 Skin Detection 2.1 Static Skin Detection 2.2 Parametric Skin Detection 2.3 Non-parametric Skin Detection 3 Hand Tracking 3.1 Approaches to Tracking a Single Hand 3.2 Approaches to Tracking Both Hands 4 Hand Shape and Finger-spelling Recognition 4.1 Rule-Based Approaches 4.2 Machine Learning Approaches 5 Hand Motion/Gesture Recognition 6 Summary and Conclusions References

Automatic Sign Language Manual Parameter Recognition (II): Comprehensive System Design
1 Introduction 2 Hand Retrieval 2.1 Input Capture 2.2 Hand Detection 2.3 Skin Detection 2.4 Face Detection 2.5 Face Histogram Computation 2.6 Enhanced Skin Highlighting Principle and Its Application to the Left and Right Hands 2.7 Computation of Enhanced Histograms for the Hands and Integration into the Face Histogram 2.8 Enhanced Skin Highlighting for the Final Skin Image 2.9 Motion Detection 2.10 Combination of Skin and Motion Images 2.11 Hand Tracking 2.11.1 Data Association for Object Tracking 2.11.2 Tracking Initialisation 2.11.3 Tracking Update 3 Manual Parameter Representation and Recognition 3.1 Hold Detection for Motion Representation 3.1.1 Determining When the Hand Starts Moving 3.1.2 Determining Stops or Changes in Direction of the Hand 4 Hand Segmentation 5 Feature Representation 6 Hand Orientation and Shape Recognition 7 SignWriting Lookup and Transcription 8 Summary References

Computer Vision Algorithms for Image Segmentation, Motion Detection, and Classification
1 Introduction 2 Image Segmentation 2.1 Adaptive Gaussian Thresholding and Image Inversion 2.2 Cross Correlation Template Matching 2.3 Viola–Jones Face Detection 2.3.1 Haar-Like Wavelet Features and Their Computation 2.3.2 Integral Image Representation for Haar-Like Wavelet Computation 2.3.3 Selection of Features Using AdaBoost and Arrangement into a Rejection Cascade 3 Motion Detection Using Gaussian Mixture Modeling 4 Feature Representation Using the Histogram of Oriented Gradients Feature Descriptor 5 Support Vector Machine Classification 5.1 Support Vector Machine Classification Principle 5.2 Mapping onto Higher-Dimensional Spaces 5.3 Multi-Class SVM Classification Techniques 5.3.1 One-Versus-All 5.3.2 One-Versus-One 5.3.3 Directed Acyclic Graph Support Vector Machine 5.4 n-Fold Cross-Validation 6 Conclusion References

Overview of Deep Learning in Facial Recognition
1 Introduction 2 Neural Nets: Basic Structure and Function 2.1 History 2.2 Basic Concepts and Constructs in Deep Learning 2.2.1 Single-Layer Perceptron 2.2.2 The Multilayer Perceptron 2.2.3 Training of MLPs 2.3 Underlearning and Overlearning 2.4 Convolutional Neural Networks (CNN) 2.4.1 Convolutional Layers 2.4.2 Guiding Principles of Convolutional Layer Design 2.4.3 CNN Layer Hyperparameters: Window Size, Depth, Stride, and Padding 2.4.4 Pooling 2.4.5 Classifiers on CNN Outputs 3 Neural Net Enhancements and Optimizations 3.1 Producing Probability Outputs with Softmax 3.2 Loss Functions 3.2.1 Cross-Entropy Loss 3.2.2 Contrastive Loss 3.2.3 Center Loss and Contrastive Center Loss 3.2.4 Triplet Loss 3.2.5 Loss Functions Based on Angular Distances 3.3 Optimization of Learning Rate 3.3.1 Adaptive Gradient Descent (AdaGrad) 3.3.2 Delta Adaptive Gradient Descent (AdaDelta) 3.4 Enhanced Training Techniques 3.4.1 Bagging 3.4.2 Boosting 3.4.3 Dropout 4 Facial Recognition 4.1 Convolutional Neural Net Models for Facial Recognition 4.1.1 DeepFace 4.1.2 DeepID (2015) 4.1.3 FaceNet (2015) 4.1.4 VGGFace (2015) 4.1.5 SphereFace (2017) 4.1.6 CosFace (2018) 4.1.7 ArcFace (2018) 4.2 Facial Recognition Without Constraint Using Deep Learning 4.2.1 Data Variability Issues 4.3 Facial Recognition Datasets 4.3.1 Labeled Faces in the Wild (LFW) 4.3.2 CASIA-WebFace 4.3.3 VGGFace and VGGFace2 4.3.4 Similar Looking Labeled Faces in the Wild (SLLFW) 5 Conclusion References

Improving Deep Unconstrained Facial Recognition by Data Augmentation
1 Introduction 2 Facial Recognition System Design Elements 2.1 Overview 2.2 Data Augmentation 2.2.1 Data Augmentation Overview 2.2.2 3-D Face Reconstruction 2.2.3 Lighting Variation 2.3 CNN Training for Classification 2.3.1 Overview 2.3.2 Features Extraction 3 Experimental Setup 3.1 Computational Platform 3.2 Description of CNN Model 3.2.1 Inputs 3.2.2 Filters 3.2.3 Subsampling (Pooling) 3.3 Datasets Used 3.3.1 Labeled Faces in the Wild (LFW) 3.3.2 ORL Database 3.3.3 Yale Face Database B 3.4 Experimental Training and Testing Configurations 3.4.1 Experiment 1: LFW Without Data Augmentation 3.4.2 Experiment 2: LFW with Data Augmentation 4 Results and Interpretation 4.1 Evaluation on ORL 4.2 Evaluation on YaleB 5 Conclusion References

Improved Plant Species Identification Using Convolutional Neural Networks with Transfer Learning and Test Time Augmentation
1 Introduction 2 Convolutional Neural Networks 3 CNN Architectures 4 Experimental Setup 5 Results and Discussion 6 Summary References

Simulation of Biological Learning with Spiking Neural Networks
1 Introduction 2 Mathematical Neuron Models 2.1 Integrate and Fire (IF) Model 2.2 Leaky Integrate and Fire (LIF) Model 2.3 Conductance-Based Neuron Model 3 Spike-time-dependent plasticity learning algorithm 3.1 Description of STDP 3.2 Handwritten digit recognition using STDP 4 SNN Simulation Software 4.1 Overview 4.2 Brian2 Simulator 4.3 NEURON Simulator 4.4 GENESIS Simulator 4.5 NEST Simulator 5 Hardware Implementations 5.1 Overview 5.2 IBM TrueNorth 5.3 Reconfigurable On-Line Learning Spiking (ROLLS) neuromorphic processor 5.4 NeuroGrid 5.5 SpiNNaker 6 Conclusion References

An Efficient Algorithm for Mining Frequent Itemsets and Association Rules
1 Introduction 1.1 Problem Decomposition 2 Outline of the Binary-Based ARM Algorithm 2.1 Binary Data Representation 2.2 Masks and Bitwise Operations 2.3 Itemset Pruning via Merging Operation 2.4 Binary-Based Algorithm Description 2.4.1 Top-Level Description 2.4.2 Binary Data Representation 2.4.3 Procedure for Finding Frequent 1-Itemsets 2.4.4 Procedure for Generating Frequent Itemsets with Multiple Items 2.4.5 Phase II: Extracting Association Rules 2.5 Datasets 2.6 Software and Hardware Specifications 2.7 Execution Time Benchmarking 2.8 Memory Usage Benchmarking 2.9 Summary References

Receiver Operating Characteristic Curves in Binary Classification of Protein Secondary Structure Data
1 Introduction 2 Classification of Protein Shape 3 Sensitivity and Specificity 4 Receiver Operating Characteristics (ROC) Curves 5 A Practical Example: Assessment of NN-GORV-II Algorithm for Structure Identification 6 Summary References

Budget Reconciliation Through Dynamic Programming
1 Introduction 1.1 Discrepancies in Military Accounting 1.2 Dynamic Programming Overview, and a Simple Example from Biochemistry 1.2.1 Budget Reconciliation with Dynamic Programming 2 Methods 2.1 Dynamic Programming Algorithm Step-by-Step Description 2.2 Code Structure 2.2.1 Initialization 2.2.2 Generation of Simulated Commits and Obligations 2.2.3 Loop over N: Dynamic Programming Process 2.2.4 Recovery of Solution and Output of Statistics 3 Results 3.1 Simulation 3.2 Application of Algorithm to Real Budget Data 4 Conclusions References

Index