Authors: Sireesha Muppala, Randy DeFauw, Shelbee Eigenbrode
ISBN: 1801070520, 9781801070522
Publisher: Packt Publishing
Publication year: 2021
Number of pages: 348
Language: English
File format: PDF (can be converted to PDF, EPUB, or AZW3 on request)
File size: 20 MB
If you would like the book Amazon SageMaker Best Practices: Proven tips and tricks to build successful machine learning solutions on Amazon SageMaker converted to PDF, EPUB, AZW3, MOBI, or DJVU format, you can notify support to have the file converted for you.
Please note that this book is the original English-language edition and has not been translated into Persian. The International Library website offers original-language books only and does not provide any books translated into or written in Persian.
Overcome advanced challenges in building end-to-end ML solutions by leveraging the capabilities of Amazon SageMaker for developing and integrating ML models into production
Amazon SageMaker is a fully managed AWS service that provides the ability to build, train, deploy, and monitor machine learning models. The book begins with a high-level overview of Amazon SageMaker capabilities that map to the various phases of the machine learning process to help set the right foundation. You'll learn efficient tactics to address data science challenges such as processing data at scale, data preparation, connecting to big data pipelines, identifying data bias, running A/B tests, and model explainability using Amazon SageMaker. As you advance, you'll understand how you can tackle the challenge of training at scale, including how to use large datasets while saving costs, monitor training resources to identify bottlenecks, speed up long training jobs, and track multiple models trained for a common goal. Moving ahead, you'll find out how you can integrate Amazon SageMaker with other AWS services to build reliable, cost-optimized, and automated machine learning applications. In addition to this, you'll build ML pipelines integrated with MLOps principles and apply best practices to build secure and performant solutions.
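For readers new to the service, the following minimal sketch (not taken from the book) illustrates the basic build-train-deploy cycle the description refers to, using the SageMaker Python SDK. The IAM role ARN, the S3 training path, and the train.py entry-point script are hypothetical placeholders.

```python
# Minimal sketch of a managed SageMaker training job followed by real-time
# deployment. All resource names below are hypothetical placeholders.
import sagemaker
from sagemaker.pytorch import PyTorch

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical IAM role

# Managed training job: SageMaker provisions the instance, runs train.py,
# and uploads the resulting model artifact to S3.
estimator = PyTorch(
    entry_point="train.py",            # hypothetical training script
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    framework_version="1.12",
    py_version="py38",
    sagemaker_session=session,
)
estimator.fit({"train": "s3://my-bucket/train/"})  # hypothetical dataset location

# Real-time hosting: deploy the trained model behind a managed HTTPS endpoint.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")

# Delete the endpoint when done to avoid ongoing hosting charges.
predictor.delete_endpoint()
```

The same estimator and deployment pattern extends to the topics the book covers in depth, such as distributed training, hyperparameter tuning, and endpoint production variants.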
By the end of the book, you'll confidently be able to apply Amazon SageMaker's wide range of capabilities to the full spectrum of machine learning workflows.
This book is for expert data scientists responsible for building machine learning applications using Amazon SageMaker. Working knowledge of Amazon SageMaker, machine learning, and deep learning, as well as experience with Jupyter notebooks and Python, is expected. Basic knowledge of AWS services related to data, security, and monitoring will help you make the most of the book.
Front matter: Cover; Title Page; Copyright and Credits; Dedication; Contributors; Table of Contents; Preface

Section 1: Processing Data at Scale

Chapter 1: Amazon SageMaker Overview
  Technical requirements; Preparing, building, training and tuning, deploying, and managing ML models; Discussion of data preparation capabilities (SageMaker Ground Truth, SageMaker Data Wrangler, SageMaker Processing, SageMaker Feature Store, SageMaker Clarify); Feature tour of model-building capabilities (SageMaker Studio, SageMaker notebook instances, SageMaker algorithms, BYO algorithms and scripts); Feature tour of training and tuning capabilities (SageMaker training jobs, Autopilot, HPO, SageMaker Debugger, SageMaker Experiments); Feature tour of model management and deployment capabilities (Model Monitor, Model endpoints, Edge Manager); Summary

Chapter 2: Data Science Environments
  Technical requirements; Machine learning use case and dataset; Creating data science environments; Creating repeatability through IaC/CaC (Amazon SageMaker notebook instances, Amazon SageMaker Studio); Providing and creating data science environments as IT services (Creating a portfolio in AWS Service Catalog, Amazon SageMaker notebook instances, Amazon SageMaker Studio); Summary; References

Chapter 3: Data Labeling with Amazon SageMaker Ground Truth
  Technical requirements; Challenges with labeling data at scale; Addressing unique labeling requirements with custom labeling workflows (A private labeling workforce, Listing the data to label, Creating the workflow); Improving labeling quality using multiple workers; Using active learning to reduce labeling time; Security and permissions; Summary

Chapter 4: Data Preparation at Scale Using Amazon SageMaker Data Wrangler and Processing
  Technical requirements; Visual data preparation with Data Wrangler; Bias detection and explainability with Data Wrangler and Clarify; Data preparation at scale with SageMaker Processing (Loading the dataset, Drop columns, Converting data types, Scaling numeric fields, Featurizing the date, Simulating labels for air quality, Encoding categorical variables, Splitting and saving the dataset); Summary

Chapter 5: Centralized Feature Repository with Amazon SageMaker Feature Store
  Technical requirements; Amazon SageMaker Feature Store essentials (Creating feature groups, Populating feature groups, Retrieving features from feature groups); Creating reusable features to reduce feature inconsistencies and inference latency; Designing solutions for near real-time ML predictions; Summary; References

Section 2: Model Training Challenges

Chapter 6: Training and Tuning at Scale
  Technical requirements; ML training at scale with SageMaker distributed libraries (Choosing between data and model parallelism, Scaling the compute resources, SageMaker distributed libraries); Automated model tuning with SageMaker hyperparameter tuning; Organizing and tracking training jobs with SageMaker Experiments; Summary; References

Chapter 7: Profile Training Jobs with Amazon SageMaker Debugger
  Technical requirements; Amazon SageMaker Debugger essentials (Configuring a training job to use SageMaker Debugger, Analyzing the collected tensors and metrics, Taking action); Real-time monitoring of training jobs using built-in and custom rules; Gaining insight into the training infrastructure and training framework (Training a PyTorch model for weather prediction, Analyzing and visualizing the system and framework metrics generated by the profiler, Analyzing the profiler report generated by SageMaker Debugger, Analyzing and implementing recommendations from the profiler report, Comparing the two training jobs); Summary; Further reading

Section 3: Manage and Monitor Models

Chapter 8: Managing Models at Scale Using a Model Registry
  Technical requirements; Using a model registry; Choosing a model registry solution (Amazon SageMaker model registry, Building a custom model registry, Utilizing a third-party or OSS model registry); Managing models using the Amazon SageMaker model registry (Creating a model package group, Creating a model package); Summary

Chapter 9: Updating Production Models Using Amazon SageMaker Endpoint Production Variants
  Technical requirements; Basic concepts of Amazon SageMaker Endpoint Production Variants; Deployment strategies for updating ML models with Amazon SageMaker Endpoint Production Variants (Standard deployment, A/B deployment, Blue/Green deployment, Canary deployment, Shadow deployment); Selecting an appropriate deployment strategy (Selecting a standard deployment, Selecting an A/B deployment, Selecting a Blue/Green deployment, Selecting a Canary deployment, Selecting a Shadow deployment); Summary

Chapter 10: Optimizing Model Hosting and Inference Costs
  Technical requirements; Real-time inference versus batch inference (Batch inference, Real-time inference, Cost comparison); Deploying multiple models behind a single inference endpoint (Multiple versions of the same model, Multiple models); Scaling inference endpoints to meet inference traffic demands (Setting the minimum and maximum capacity, Choosing a scaling metric, Setting the scaling policy, Setting the cooldown period); Using Elastic Inference for deep learning models; Optimizing models with SageMaker Neo; Summary

Chapter 11: Monitoring Production Models with Amazon SageMaker Model Monitor and Clarify
  End-to-end architectures for monitoring ML models (Data drift monitoring, Model quality drift monitoring, Bias drift monitoring, Feature attribution drift monitoring); Technical requirements; Basic concepts of Amazon SageMaker Model Monitor and Amazon SageMaker Clarify; Best practices for monitoring ML models; Summary; References

Section 4: Automate and Operationalize Machine Learning

Chapter 12: Machine Learning Automated Workflows
  Considerations for automating your SageMaker ML workflows (Typical ML workflows, Considerations and guidance for building SageMaker workflows and CI/CD pipelines, AWS-native options for automated workflow and CI/CD pipelines); Building ML workflows with Amazon SageMaker Pipelines; Building your SageMaker pipeline (Data preparation step, Model build step, Model evaluation step, Conditional step, Register model step(s), Creating the pipeline, Executing the pipeline, Pipeline recommended practices); Creating CI/CD pipelines using Amazon SageMaker Projects (SageMaker projects recommended practices); Summary

Chapter 13: Well-Architected Machine Learning with Amazon SageMaker
  Best practices for operationalizing ML workloads (Ensuring reproducibility, Tracking ML artifacts, Automating deployment pipelines, Monitoring production models); Best practices for securing ML workloads (Isolating the ML environment, Disabling internet and root access, Enforcing authentication and authorization, Securing data and model artifacts, Logging, monitoring, and auditing, Meeting regulatory requirements); Best practices for reliable ML workloads (Recovering from failure, Tracking model origin, Automating deployment pipelines, Handling unexpected traffic patterns, Continuous monitoring of deployed model, Updating model with new versions); Best practices for building performant ML workloads (Rightsizing ML resources, Monitoring resource utilization, Rightsizing hosting infrastructure, Continuous monitoring of deployed model); Best practices for cost-optimized ML workloads (Optimizing data labeling costs, Reducing experimentation costs with models from AWS Marketplace, Using AutoML to reduce experimentation time, Iterating locally with small datasets, Rightsizing training infrastructure, Optimizing hyperparameter-tuning costs, Saving training costs with Managed Spot Training, Using insights and recommendations from Debugger, Saving ML infrastructure costs with SavingsPlan, Optimizing inference costs, Stopping or terminating resources); Summary

Chapter 14: Managing SageMaker Features across Accounts
  Examining an overview of the AWS multi-account environment; Understanding the benefits of using multiple AWS accounts with Amazon SageMaker; Examining multi-account considerations with Amazon SageMaker (Considerations for SageMaker features); Summary; References

Back matter: About PACKT; Other Books You May Enjoy; Index