Edition: 1
Author: Andre Ye
ISBN: 1484274121, 9781484274125
Publisher: Apress
Publication year: 2021
Pages: 463
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 12 MB
If you would like the file for Modern Deep Learning Design and Application Development: Versatile Tools to Solve Deep Learning Problems converted to PDF, EPUB, AZW3, MOBI, or DJVU, notify support and they will convert it for you.
Please note that this book is the original English edition, not a Persian translation. The International Library website offers original-language books only and does not provide books translated into or written in Persian.
Table of Contents
About the Author
About the Technical Reviewer
Acknowledgments
Introduction

Chapter 1: A Deep Dive into Keras
  Why Keras?
  Installing and Importing Keras
  The Simple Keras Workflow
    Step 1: Define Architecture
    Step 2: Compile (Loss Functions, Optimizers, Metrics)
    Step 3: Fit and Evaluate
  Visualizing Model Architectures
  Functional API
    Translating a Sequential to a Functional Model
    Building Nonlinear Topologies
  Dealing with Data
    TensorFlow Dataset from Loaded Data
    TensorFlow Dataset from Image Files
    Automatic Image Dataset from Directory
    ImageDataGenerator
  Key Points

Chapter 2: Pretraining Strategies and Transfer Learning
  Developing Creative Training Structures
    The Power of Pretraining
    Transfer Learning Intuition
    Self-Supervised Learning Intuition
  Transfer Learning Practical Theory
    Transfer Learning Models and Model Structure
    The ImageNet Dataset
    ResNet, InceptionV3, MobileNet, EfficientNet, Other Models
    Changing Pretrained Model Architectures
    Neural Network "Top" Inclusivity
    Layer Freezing
  Implementing Transfer Learning
    General Implementation Structure: A Template
    No Architecture or Weight Changes
    Transfer Learning Without Layer Freezing
    Transfer Learning with Layer Freezing
    Accessing PyTorch Models
  Implementing Simple Self-Supervised Learning
  Case Studies
    Transfer Learning Case Study: Adversarial Exploitation of Transfer Learning
    Self-Supervised Learning Case Study: Predicting Rotations
    Self-Supervised Learning Case Study: Learning Image Context and Designing Nontrivial Pretraining Tasks
  Key Points

Chapter 3: The Versatility of Autoencoders
  Autoencoder Intuition and Theory
  The Design of Autoencoder Implementation
    Autoencoders for Tabular Data
    Autoencoders for Image Data
      Image Data Shape Structure and Transformations
      Convolutional Autoencoder Without Pooling
      Convolutional Autoencoder Vector Bottleneck Design
      Convolutional Autoencoder with Pooling and Padding
    Autoencoders for Other Data Forms
  Autoencoder Applications
    Using Autoencoders for Denoising (Intuition and Theory; Implementation; Inducing Noise; Using Denoising Autoencoders)
    Using Autoencoders for Pretraining (Intuition; Implementation)
    Using Autoencoders for Dimensionality Reduction (Intuition; Implementation)
    Using Autoencoders for Feature Generation (Intuition; Implementation)
    Using Variational Autoencoders for Data Generation (Intuition; Implementation)
  Case Studies
    Autoencoders for Pretraining Case Study: TabNet
    Denoising Autoencoders Case Study: Chinese Spelling Checker
    Variational Autoencoders Case Study: Text Generation
  Key Points

Chapter 4: Model Compression for Practical Deployment
  Introduction to Model Compression
  Pruning
    Pruning Theory and Intuition
    Pruning Implementation
      Setting Up Data and Benchmark Model
      Creating Cost Metrics (Storage Size, Latency, Parameter Metrics)
      Pruning an Entire Model
      Pruning Individual Layers
    Pruning in Theoretical Deep Learning: The Lottery Ticket Hypothesis
  Quantization
    Quantization Theory and Intuition
    Quantization Implementation (Quantizing an Entire Model; Quantizing Individual Layers)
  Weight Clustering
    Weight Clustering Theory and Intuition
    Weight Clustering Implementation (On an Entire Model; On Individual Layers)
  Collaborative Optimization
    Sparsity Preserving Quantization
    Cluster Preserving Quantization
    Sparsity Preserving Clustering
  Case Studies
    Extreme Collaborative Optimization
    Rethinking Quantization for Deeper Compression
    Responsible Compression: What Do Compressed Models Forget?
  Key Points

Chapter 5: Automating Model Design with Meta-optimization
  Introduction to Meta-optimization
  General Hyperparameter Optimization
    Bayesian Optimization Intuition and Theory
    Hyperopt Syntax, Concepts, and Usage
      Hyperopt Syntax Overview: Finding the Minimum of a Simple Objective Function
      Using Hyperopt to Optimize Training Procedure
      Using Hyperopt to Optimize Model Architecture
    Hyperas Syntax, Concepts, and Usage
      Using Hyperas to Optimize Training Procedure
      Using Hyperas to Optimize Model Architecture
  Neural Architecture Search
    NAS Intuition and Theory
    Auto-Keras
      Auto-Keras System
      Simple NAS
      NAS with Custom Search Space
      NAS with Nonlinear Topology
  Case Studies
    NASNet
    Progressive Neural Architecture Search
    Efficient Neural Architecture Search
  Key Points

Chapter 6: Successful Neural Network Architecture Design
  Nonlinear and Parallel Representation
    Residual Connections
    Branching and Cardinality
    Case Study: U-Net
  Block/Cell Design
    Sequential Cell Design
    Nonlinear Cell Design
    Case Study: InceptionV3
  Neural Network Scaling
    Input Shape Adaptable Design
    Parametrization of Network Dimensions
    Case Study: EfficientNet
  Key Points

Chapter 7: Reframing Difficult Deep Learning Problems
  Data Representation: DeepInsight
  Corrupted Data: Negative Learning with Noisy Labels
  Limited Data: Siamese Networks
  Key Points and Epilogue

Index
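To give a flavor of the compression material in Chapter 4, magnitude pruning zeroes out the weights with the smallest absolute values, on the assumption that they contribute least to the model's output. The sketch below is a minimal plain-Python illustration of that idea, not code from the book (the book works with Keras and the TensorFlow Model Optimization Toolkit); the function name `magnitude_prune` is our own.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a flat weight list."""
    k = int(len(weights) * sparsity)
    # Indices of the k weights with the smallest absolute value
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    drop = set(order[:k])
    return [0.0 if i in drop else w for i, w in enumerate(weights)]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = magnitude_prune(w, 0.5)  # 50% sparsity: 3 of 6 weights zeroed
```

After pruning, the sparse weight matrix can be stored and served more cheaply, which is the cost-metric angle (storage size, latency, parameter counts) the chapter measures.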
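Chapter 4's quantization sections rest on the same core trick everywhere: represent 32-bit float weights as small integers plus a shared scale and offset. As a hedged sketch of uniform affine quantization in plain Python (again our own illustration, not the book's TensorFlow-based implementation; `quantize`/`dequantize` are hypothetical names):

```python
def quantize(weights, bits=8):
    """Uniform affine quantization: map floats onto integers in [0, 2**bits - 1]."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2 ** bits - 1) or 1.0  # guard against constant inputs
    codes = [round((w - lo) / scale) for w in weights]
    return codes, scale, lo

def dequantize(codes, scale, lo):
    """Recover approximate floats from the integer codes."""
    return [c * scale + lo for c in codes]

w = [-0.7, 0.01, 0.4, 0.9]
codes, scale, lo = quantize(w)
recovered = dequantize(codes, scale, lo)
```

Each 8-bit code replaces a 32-bit float (roughly 4x smaller storage), at the cost of a reconstruction error of at most about half of `scale` per weight, which is the accuracy-versus-size trade-off the chapter's case studies push to the extreme.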