

Download the book Amazon SageMaker Developer Guide

Book details

Amazon SageMaker Developer Guide

Edition:
Authors:
Series:

Publisher: Amazon Web Services
Publication year: 2021
Pages: [2621]
Language: English
File format: PDF (can be converted to PDF, EPUB, or AZW3 at the user's request)
File size: 66 MB

Price (Toman): 32,000





If you would like the Amazon SageMaker Developer Guide converted to PDF, EPUB, AZW3, MOBI, or DJVU format, you can notify support and they will convert the file for you.

Please note that the Amazon SageMaker Developer Guide is the original English-language edition, not a Persian translation. The International Library website offers only original-language books and does not provide any books translated into or written in Persian.


About the book (in the original language)



Table of Contents

Amazon SageMaker
Table of Contents
What Is Amazon SageMaker?
	Amazon SageMaker Features
	Amazon SageMaker Pricing
	Are You a First-time User of Amazon SageMaker?
		How Amazon SageMaker Works
	Machine Learning with Amazon SageMaker
	Explore, Analyze, and Process Data
	What Is Fairness and Model Explainability for Machine Learning Predictions?
		Best Practices for Evaluating Fairness and Explainability in the ML Lifecycle
		Sample Notebooks
		Guide to the SageMaker Clarify Documentation
	Train a Model with Amazon SageMaker
	Deploy a Model in Amazon SageMaker
		Deploy a Model on SageMaker Hosting Services
			Best Practices for Deploying Models on SageMaker Hosting Services
	Get Inferences for an Entire Dataset with Batch Transform
	Validate a Machine Learning Model
	Monitoring a Model in Production
	Use Machine Learning Frameworks, Python, and R with Amazon SageMaker
		Use Apache MXNet with Amazon SageMaker
			What do you want to do?
		Use Apache Spark with Amazon SageMaker
			Download the SageMaker Spark Library
			Integrate Your Apache Spark Application with SageMaker
			Example 1: Use Amazon SageMaker for Training and Inference with Apache Spark
				Use Custom Algorithms for Model Training and Hosting on Amazon SageMaker with Apache Spark
				Use the SageMakerEstimator in a Spark Pipeline
			SDK examples: Use Amazon SageMaker with Apache Spark
		Use Chainer with Amazon SageMaker
			What do you want to do?
		Use Hugging Face with Amazon SageMaker
			Training
				How to run training with the Hugging Face Estimator
			Inference
				How to deploy an inference job using the Hugging Face Deep Learning Containers
			What do you want to do?
		Use PyTorch with Amazon SageMaker
			What do you want to do?
		R User Guide to Amazon SageMaker
			R Kernel in SageMaker
			Get Started with R in SageMaker
			Example Notebooks
		Use Scikit-learn with Amazon SageMaker
			What do you want to do?
		Use SparkML Serving with Amazon SageMaker
		Use TensorFlow with Amazon SageMaker
			Use TensorFlow Version 1.11 and Later
				What do you want to do?
			Use TensorFlow Legacy Mode for Versions 1.11 and Earlier
	Supported Regions and Quotas
		Request a service quota increase for SageMaker resources
Get Started with Amazon SageMaker
	Set Up Amazon SageMaker
		Create an AWS Account
		Create an IAM Administrator User and Group
	Onboard to Amazon SageMaker Studio
		Onboard to Amazon SageMaker Studio Using Quick Start
		Onboard to Amazon SageMaker Studio Using AWS SSO
			Set Up AWS SSO for Use with Amazon SageMaker Studio
		Onboard to Amazon SageMaker Studio Using IAM
		Choose a VPC
		Delete an Amazon SageMaker Studio Domain
			Delete a SageMaker Studio Domain (Studio)
			Delete a SageMaker Studio Domain (CLI)
	SageMaker JumpStart
		Using JumpStart
		Solutions
		Models
			Text Models
			Vision Models
		Deploy a model
		Model Deployment Configuration
		Fine-Tune a Model
		Fine-Tuning Data Source
		Fine-Tuning deployment configuration
		Hyperparameters
		Training Output
		Next Steps
		Amazon SageMaker Studio Tour
	Get Started with Amazon SageMaker Notebook Instances
		Machine Learning with the SageMaker Python SDK
		Tutorial Overview
		Step 1: Create an Amazon SageMaker Notebook Instance
			(Optional) Change SageMaker Notebook Instance Settings
			(Optional) Advanced Settings for SageMaker Notebook Instances
		Step 2: Create a Jupyter Notebook
		Step 3: Download, Explore, and Transform a Dataset
			Load Adult Census Dataset Using SHAP
			Overview the Dataset
			Split the Dataset into Train, Validation, and Test Datasets
			Convert the Train and Validation Datasets to CSV Files
			Upload the Datasets to Amazon S3
		Step 4: Train a Model
			Choose the Training Algorithm
			Create and Run a Training Job
		Step 5: Deploy the Model to Amazon EC2
			Deploy the Model to SageMaker Hosting Services
			(Optional) Use SageMaker Predictor to Reuse the Hosted Endpoint
			(Optional) Make Prediction with Batch Transform
		Step 6: Evaluate the Model
			Evaluate the Model Deployed to SageMaker Hosting Services
		Step 7: Clean Up
Amazon SageMaker Studio
	Amazon SageMaker Studio UI Overview
		Left sidebar
		File and resource browser
		Main work area
		Settings
	Use the Amazon SageMaker Studio Launcher
		Notebooks and compute resources
		Utilities and files
	Studio Entity Status
	Use Amazon SageMaker Studio Notebooks
		How Are Amazon SageMaker Studio Notebooks Different from Notebook Instances?
		Get Started
			Log In from the Amazon SageMaker console
			Next Steps
		Create or Open an Amazon SageMaker Studio Notebook
			Open a Studio notebook
			Create a Notebook from the File Menu
			Create a Notebook from the Launcher
		Use the SageMaker Studio Notebook Toolbar
		Share and Use an Amazon SageMaker Studio Notebook
			Share a Notebook
			Use a Shared Notebook
		Get Notebook and App Metadata
			Get Notebook Metadata
			Get App Metadata
		Get Notebook Differences
			Get the Difference Between the Last Checkpoint
			Get the Difference Between the Last Commit
		Manage Resources
			Change an Instance Type
			Change a Kernel
			Shut Down Resources
				Shut Down an Open Notebook
				Shut Down Resources
		Usage Metering
		Available Resources
			Available SageMaker Studio Instance Types
			Available Amazon SageMaker Images
			Available Amazon SageMaker Kernels
	Bring your own SageMaker image
		Create a custom SageMaker image (Console)
		Attach a custom SageMaker image (Control Panel)
			Attach an existing image version to your domain
			Detach a custom SageMaker image
		Launch a custom SageMaker image in SageMaker Studio
		Bring your own custom SageMaker image tutorial
			Add a Studio-compatible container image to Amazon ECR
			Create a SageMaker image from the ECR container image
			Attach the SageMaker image to a new domain
			Attach the SageMaker image to your current domain
			View the attached image in the Studio control panel
			Clean up resources
		Custom SageMaker image specifications
	Set Up a Connection to an Amazon EMR Cluster
	Perform Common Tasks in Amazon SageMaker Studio
		Upload Files to SageMaker Studio
		Clone a Git Repository in SageMaker Studio
		Stop a Training Job in SageMaker Studio
		Use TensorBoard in Amazon SageMaker Studio
			Prerequisites
			Set Up TensorBoardCallback
			Install TensorBoard
			Launch TensorBoard
		Manage Your EFS Storage Volume in SageMaker Studio
		Provide Feedback on SageMaker Studio
		Update SageMaker Studio and Studio Apps
			Update SageMaker Studio
			Update Studio Apps
	Amazon SageMaker Studio Pricing
	Troubleshooting Amazon SageMaker Studio
Use Amazon SageMaker Notebook Instances
	Amazon Linux 2 vs Amazon Linux notebook instances
		AL1 Maintenance Phase Plan
		Available Kernels
		Migrating to Amazon Linux 2
	Create a Notebook Instance
	Access Notebook Instances
	Update a Notebook Instance
	Customize a Notebook Instance Using a Lifecycle Configuration Script
		Lifecycle Configuration Best Practices
		Install External Libraries and Kernels in Notebook Instances
			Package installation tools
				Conda
				Pip
				Unsupported
		Notebook Instance Software Updates
		Control an Amazon EMR Spark Instance Using a Notebook
	Example Notebooks
		Use or View Example Notebooks in Jupyter Classic
		Use or View Example Notebooks in Jupyterlab
	Set the Notebook Kernel
	Associate Git Repositories with SageMaker Notebook Instances
		Add a Git Repository to Your Amazon SageMaker Account
			Add a Git Repository to Your SageMaker Account (Console)
			Add a Git Repository to Your Amazon SageMaker Account (CLI)
		Create a Notebook Instance with an Associated Git Repository
			Create a Notebook Instance with an Associated Git Repository (Console)
			Create a Notebook Instance with an Associated Git Repository (CLI)
		Associate a CodeCommit Repository in a Different AWS Account with a Notebook Instance
		Use Git Repositories in a Notebook Instance
	Notebook Instance Metadata
	Monitor Jupyter Logs in Amazon CloudWatch Logs
Automate model development with Amazon SageMaker Autopilot
	Get started with Amazon SageMaker Autopilot
		Samples: Explore modeling with Amazon SageMaker Autopilot
		Videos: Use Autopilot to automate and explore the machine learning process
			Start an AutoML job with Amazon SageMaker Autopilot
			Review data exploration and feature engineering automated in Autopilot
			Tune models to optimize performance
			Choose and deploy the best model
			Amazon SageMaker Autopilot walkthrough
		Tutorials: Get started with Amazon SageMaker Autopilot
	Create an Amazon SageMaker Autopilot experiment
	Amazon SageMaker Autopilot problem types
		Regression
		Binary classification
		Multiclass classification
	Model support and validation
		Autopilot algorithm support
		Autopilot cross-validation
	Amazon SageMaker Autopilot model deployment
	Amazon SageMaker Autopilot explainability
	Models generated by Amazon SageMaker Autopilot
	Amazon SageMaker Autopilot notebooks generated to manage AutoML tasks
		Data exploration notebook
		Candidate definition notebook
	Configure inference output in Autopilot-generated containers
		Inference container definitions for regression and classification problem types
		Select inference response for classification models
	Amazon SageMaker Autopilot quotas
		Quotas that you can increase
		Resource quotas
	API reference guide for Amazon SageMaker Autopilot
		SageMaker API reference
		Amazon SageMaker Python SDK
		AWS Command Line Interface (CLI)
		AWS SDK for Python (Boto)
		AWS SDK for .NET
		AWS SDK for C++
		AWS SDK for Go
		AWS SDK for Java
		AWS SDK for JavaScript
		AWS SDK for PHP V3
		AWS SDK for Ruby V3
Label Data
	Use Amazon SageMaker Ground Truth to Label Data
		Are You a First-time User of Ground Truth?
		Getting started
			Step 1: Before You Begin
				Next
			Step 2: Create a Labeling Job
				Next
			Step 3: Select Workers
				Next
			Step 4: Configure the Bounding Box Tool
				Next
			Step 5: Monitoring Your Labeling Job
		Label Images
			Bounding Box
				Creating a Bounding Box Labeling Job (Console)
				Create a Bounding Box Labeling Job (API)
					Provide a Template for Bounding Box Labeling Jobs
				Bounding Box Output Data
			Image Semantic Segmentation
				Creating a Semantic Segmentation Labeling Job (Console)
				Create a Semantic Segmentation Labeling Job (API)
					Provide a Template for Semantic Segmentation Labeling Jobs
				Semantic Segmentation Output Data
			Auto-Segmentation Tool
				Tool Preview
				Tool Availability
			Image Classification (Single Label)
				Create an Image Classification Labeling Job (Console)
				Create an Image Classification Labeling Job (API)
					Provide a Template for Image Classification Labeling Jobs
				Image Classification Output Data
			Image Classification (Multi-label)
				Create a Multi-Label Image Classification Labeling Job (Console)
				Create a Multi-Label Image Classification Labeling Job (API)
					Provide a Template for Multi-label Image Classification
				Multi-label Image Classification Output Data
			Image Label Verification
		Use Ground Truth to Label Text
			Named Entity Recognition
				Create a Named Entity Recognition Labeling Job (Console)
				Create a Named Entity Recognition Labeling Job (API)
					Provide a Template for Named Entity Recognition Labeling Jobs
				Named Entity Recognition Output Data
			Text Classification (Single Label)
				Create a Text Classification Labeling Job (Console)
				Create a Text Classification Labeling Job (API)
					Provide a Template for Text Classification Labeling Jobs
				Text Classification Output Data
			Text Classification (Multi-label)
				Create a Multi-Label Text Classification Labeling Job (Console)
				Create a Multi-Label Text Classification Labeling Job (API)
					Create a Template for Multi-label Text Classification
				Multi-label Text Classification Output Data
		Label Videos and Video Frames
			Video Classification
				Create a Video Classification Labeling Job (Console)
				Create a Video Classification Labeling Job (API)
					Provide a Template for Video Classification
				Video Classification Output Data
			Label Video Frames
				Video Frame Object Detection
					Preview the Worker UI
					Create a Video Frame Object Detection Labeling Job
						Create a Labeling Job (Console)
						Create a Labeling Job (API)
					Create Video Frame Object Detection Adjustment or Verification Labeling Job
					Output Data Format
				Video Frame Object Tracking
					Preview the Worker UI
					Create a Video Frame Object Tracking Labeling Job
						Create a Labeling Job (Console)
						Create a Labeling Job (API)
					Create a Video Frame Object Tracking Adjustment or Verification Labeling Job
					Output Data Format
				Video Frame Labeling Job Overview
					Input Data
					Job Completion Times
					Task Types
					Workforces
					Worker User Interface (UI)
						Label Category and Frame Attributes
							Label Category Attributes
							Frame level Attributes
						Worker Instructions
						Declining Tasks
					Video Frame Job Permission Requirements
						Add a CORS Permission Policy to S3 Bucket
			Worker Instructions
				Work on Video Frame Object Tracking Tasks
					Your Task
					Navigate the UI
					Bulk Edit Label and Frame Attributes
					Tool Guide
					Icons Guide
					Shortcuts
					Release, Stop and Resume, and Decline Tasks
					Saving Your Work and Submitting
				Work on Video Frame Object Detection Tasks
					Your Task
					Navigate the UI
					Bulk Edit Label and Frame Attributes
					Tool Guide
					UI Icon Guide
					Shortcuts
					Release, Stop and Resume, and Decline Tasks
					Saving Your Work and Submitting
		Use Ground Truth to Label 3D Point Clouds
			3D Point Clouds
				LiDAR
				Sensor Fusion
			Label 3D Point Clouds
				Assistive Labeling Tools for Point Cloud Annotation
			Next Steps
			3D Point Cloud Task types
				3D Point Cloud Object Detection
					View the Worker Task Interface
					Create a 3D Point Cloud Object Detection Labeling Job
						Create a Labeling Job (Console)
						Create a Labeling Job (API)
					Create a 3D Point Cloud Object Detection Adjustment or Verification Labeling Job
					Output Data Format
				3D Point Cloud Object Tracking
					View the Worker Task Interface
						Worker Tools
					Create a 3D Point Cloud Object Tracking Labeling Job
						Create a Labeling Job (API)
						Create a Labeling Job (Console)
					Create a 3D Point Cloud Object Tracking Adjustment or Verification Labeling Job
					Output Data Format
				3D Point Cloud Semantic Segmentation
					View the Worker Task Interface
					Create a 3D Point Cloud Semantic Segmentation Labeling Job
						Create a Labeling Job (Console)
						Create a Labeling Job (API)
					Create a 3D Point Cloud Semantic Segmentation Adjustment or Verification Labeling Job
					Output Data Format
			3D Point Cloud Labeling Jobs Overview
				Job Pre-processing Time
				Job Completion Times
				Workforces
				Worker User Interface (UI)
					Label Category Attributes
						Label Category Attributes
						Frame Attributes
					Worker Instructions
					Declining Tasks
				3D Point Cloud Labeling Job Permission Requirements
					Add a CORS Permission Policy to S3 Bucket
			Worker Instructions
				3D Point Cloud Semantic Segmentation
					Your Task
					Navigate the UI
					Icon Guide
					Shortcuts
					Release, Stop and Resume, and Decline Tasks
					Saving Your Work and Submitting
				3D Point Cloud Object Detection
					Your Task
					Navigate the UI
					Icon Guide
					Shortcuts
					Release, Stop and Resume, and Decline Tasks
					Saving Your Work and Submitting
				3D Point Cloud Object Tracking
					Your Task
					Navigate the UI
						Delete Cuboids
					Bulk Edit Label Category and Frame Attributes
					Icon Guide
					Shortcuts
					Release, Stop and Resume, and Decline Tasks
					Saving Your Work and Submitting
		Verify and Adjust Labels
			Requirements to Create Verification and Adjustment Labeling Jobs
			Create a Label Verification Job (Console)
				Create an Image Label Verification Job (Console)
				Create a Point Cloud or Video Frame Label Verification Job (Console)
			Create a Label Adjustment Job (Console)
				Create an Image Label Adjustment Job (Console)
				Create a Point Cloud or Video Frame Label Adjustment Job (Console)
			Start a Label Verification or Adjustment Job (API)
					Bounding Box and Semantic Segmentation
					3D Point Cloud and Video Frame
			Label Verification and Adjustment Data in the Output Manifest
			Cautions and Considerations
				Color Information Requirements for Semantic Segmentation Jobs
				Filter Your Data Before Starting the Job
		Creating Custom Labeling Workflows
			Step 1: Setting up your workforce
				Next
			Step 2: Creating your custom worker task template
				Starting with a base template
				Developing templates locally
				Using External Assets
				Track your variables
				A simple sample
				Adding automation with Liquid
					Variable filters
						Autoescape and explicit escape
						escape_once
						skip_autoescape
						to_json
						grant_read_access
				End-to-end demos
				Next
			Step 3: Processing with AWS Lambda
				Pre-annotation and Post-annotation Lambda Function Requirements
					Pre-annotation Lambda
						Examples of Pre-annotation Lambda Functions
					Post-annotation Lambda
				Required Permissions To Use AWS Lambda With Ground Truth
					Grant Permission to Create and Select an AWS Lambda Function
					Grant IAM Execution Role Permission to Invoke AWS Lambda Functions
					Grant Post-Annotation Lambda Permissions to Access Annotation
				Create Lambda Functions for a Custom Labeling Workflow
				Test Pre-Annotation and Post-Annotation Lambda Functions
					Prerequisites
					Test the Pre-annotation Lambda Function
					Test the Post-Annotation Lambda Function
			Demo Template: Annotation of Images with crowd-bounding-box
				Starter Bounding Box custom template
				Your own Bounding Box custom template
				Your manifest file
				Your pre-annotation Lambda function
				Your post-annotation Lambda function
				The output of your labeling job
			Demo Template: Labeling Intents with crowd-classifier
				Starter Intent Detection custom template
				Your Intent Detection custom template
					Styling Your Elements
				Your pre-annotation Lambda function
				Your post-annotation Lambda function
				Your labeling job output
			Custom Workflows via the API
		Create a Labeling Job
			Built-in Task Types
			Creating Instruction Pages
				Short Instructions
				Full Instructions
				Add example images to your instructions
			Create a Labeling Job (Console)
				Next Steps
			Create a Labeling Job (API)
				Examples
			Create a Streaming Labeling Job
				Create Amazon SNS Input and Output Topics
					Create an Input Topic
					Create an Output Topic
						Add Encryption to Your Output Topic (Optional)
					Subscribe an Endpoint to Your Amazon SNS Output Topic
				Set up Amazon S3 Bucket Event Notifications
				Create a Manifest File (Optional)
				Example: Use SageMaker API To Create Streaming Labeling Job
				Stop a Streaming Labeling Job
			Create a Labeling Category Configuration File with Label Category and Frame Attributes
				Label Category Configuration File Schema
					Label and label category attribute quotas
				Example: Label Category Configuration Files for 3D Point Cloud Labeling Jobs
				Example: Label Category Configuration Files for Video Frame Labeling Jobs
				Creating Worker Instructions
		Use Input and Output Data
			Input Data
				Use an Input Manifest File
				Automated Data Setup
				Supported Data Formats
				Ground Truth Streaming Labeling Jobs
					How It Works
					Send Data to a Streaming Labeling Job
						Send Data Objects Using Amazon SNS
						Send Data Objects using Amazon S3
					Manage Labeling Requests with an Amazon SQS Queue
					Receive Output Data from a Streaming Labeling Job
					Duplicate Message Handling
						Specify A Deduplication Key and ID in an Amazon SNS Message
						Find Deduplication Key and ID in Your Output Data
				Input Data Quotas
					Input File Size Quota
					Input Image Resolution Quotas
					Label Category Quotas
					3D Point Cloud and Video Frame Labeling Job Quotas
				Filter and Select Data for Labeling
					Use the Full Dataset
					Choose a Random Sample
					Specify a Subset
			3D Point Cloud Input Data
				Accepted Raw 3D Data Formats
					Compact Binary Pack Format
					ASCII Format
					Point Cloud Resolution Limits
				Create an Input Manifest File for a 3D Point Cloud Labeling Job
					Create a Point Cloud Frame Input Manifest File
						Include Vehicle Pose Information in Your Input Manifest
						Include Camera Data in Your Input Manifest
						Point Cloud Frame Limits
					Create a Point Cloud Sequence Input Manifest
						Parameters for Individual Point Cloud Frames
						Include Vehicle Pose Information in Your Input Manifest
						Include Camera Data in Your Input Manifest
						Sequence File and Point Cloud Frame Limits
				Understand Coordinate Systems and Sensor Fusion
					Coordinate System Requirements for Labeling Jobs
					Using Point Cloud Data in a World Coordinate System
						What is a World Coordinate System?
						Convert 3D Point Cloud Data to a WCS
					Sensor Fusion
						Extrinsic Matrix
						Intrinsic Matrix
						Image Distortion
						Ego Vehicle
						Pose
					Compute Orientation Quaternions and Position
					Ground Truth Sensor Fusion Transformations
						LiDAR Extrinsic
						Camera Calibrations: Extrinsic, Intrinsic and Distortion
							Camera Extrinsic
							Intrinsic and Distortion
			Video Frame Input Data
				Choose Video Files or Video Frames for Input Data
					Provide Video Frames
					Provide Video Files
				Input Data Setup
					Automated Video Frame Input Data Setup
						Provide Video Files and Extract Frames
						Provide Video Frames
					Manual Input Data Setup
						Create a Video Frame Input Manifest File
							Create a Video Frame Sequence Input Manifest
							Create a Video Frame Sequence File
			Output Data
				Output Directories
					Active Learning Directory
					Annotations Directory
					Inference Directory
					Manifest Directory
					Training Directory
				Confidence Score
				Worker Metadata
				Output Metadata
				Classification Job Output
				Multi-label Classification Job Output
				Bounding Box Job Output
				Named Entity Recognition
				Label Verification Job Output
				Semantic Segmentation Job Output
				Video Frame Object Detection Output
				Video Frame Object Tracking Output
				3D Point Cloud Semantic Segmentation Output
				3D Point Cloud Object Detection Output
				3D Point Cloud Object Tracking Output
		Enhanced Data Labeling
			Control the Flow of Data Objects Sent to Workers
				Use MaxConcurrentTaskCount to Control the Flow of Data Objects
				Use Amazon SQS to Control the Flow of Data Objects to Streaming Labeling Jobs
			Consolidate Annotations
				Create Your Own Annotation Consolidation Function
					Assess Similarity
					Assess the Most Probable Label
			Automate Data Labeling
				How it Works
					Accuracy of Automated Labels
				Create an Automated Data Labeling Job (Console)
				Create an Automated Data Labeling Job (API)
				Amazon EC2 Instances Required for Automated Data Labeling
				Set up an active learning workflow with your own model
			Chaining Labeling Jobs
				Key Term: Label Attribute Name
				Start a Chained Job (Console)
					Job Overview Panel
				Start a Chained Job (API)
				Use a Partially Labeled Dataset
		Ground Truth Security and Permissions
			CORS Permission Requirement
			Assign IAM Permissions to Use Ground Truth
				Use IAM Managed Policies with Ground Truth
				Grant IAM Permission to Use the Amazon SageMaker Ground Truth Console
					Ground Truth Console Permissions
					Custom Labeling Workflow Permissions
					Private Workforce Permissions
					Vendor Workforce Permissions
				Create a SageMaker Execution Role for a Ground Truth Labeling Job
					Built-In Task Types (Non-streaming) Execution Role Requirements
					Built-In Task Types (Streaming) Execution Role Requirements
					Execution Role Requirements for Custom Task Types
					Automated Data Labeling Permission Requirements
				Encrypt Output Data and Storage Volume with AWS KMS
					Encrypt Output Data using KMS
					Encrypt Automated Data Labeling ML Compute Instance Storage Volume
			Output Data and Storage Volume Encryption
				Use Your KMS Key to Encrypt Output Data
				Use Your KMS Key to Encrypt Automated Data Labeling Storage Volume (API Only)
			Workforce Authentication and Restrictions
				Restrict Access to Workforce Types
		Monitor Labeling Job Status
			Send Events to CloudWatch Events
			Set Up a Target to Process Events
			Labeling Job Expiration
			Declining Tasks
	Create and Manage Workforces
		Using the Amazon Mechanical Turk Workforce
			Use Mechanical Turk with Ground Truth
			Use Mechanical Turk with Amazon A2I
			When is Mechanical Turk Not Supported?
		Managing Vendor Workforces
		Use a Private Workforce
			Create and Manage Amazon Cognito Workforce
				Create a Private Workforce (Amazon Cognito)
					Create a Private Workforce (Amazon SageMaker Console)
						Create an Amazon Cognito Workforce When Creating a Labeling Job
						Create an Amazon Cognito Workforce Using the Labeling Workforces Page
					Create a Private Workforce (Amazon Cognito Console)
				Manage a Private Workforce (Amazon Cognito)
					Manage a Workforce (Amazon SageMaker Console)
						Create a Work Team Using the SageMaker Console
							Subscriptions
						Add or Remove Workers
							Add Workers to the Workforce
							Add a Worker to a Work Team
							Disable and Remove a Worker from the Workforce
					Manage a Private Workforce (Amazon Cognito Console)
						Create Work Teams (Amazon Cognito Console)
							Subscriptions
						Add and Remove Workers (Amazon Cognito Console)
							Add a Worker to a Work Team
							Disable and Remove a Worker From a Work Team
			Create and Manage OIDC IdP Workforce
				Create a Private Workforce (OIDC IdP)
					Send Required and Optional Claims to Ground Truth and Amazon A2I
					Create an OIDC IdP Workforce
						Configure your OIDC IdP
					Validate Your OIDC IdP Workforce Authentication Response
					Next Steps
				Manage a Private Workforce (OIDC IdP)
					Prerequisites
					Add work teams
					Add or remove IdP groups from work teams
					Delete a work team
					Manage Individual Workers
					Update, Delete, and Describe Your Workforce
			Manage Private Workforce Using the Amazon SageMaker API
				Find Your Workforce Name
				Restrict Worker Access to Tasks to Allowable IP Addresses
				Update OIDC Identity Provider Workforce Configuration
				Delete a Private Workforce
			Track Worker Performance
				Enable Tracking
				Examine Logs
				Use Log Metrics
			Create and manage Amazon SNS topics for your work teams
				Create the Amazon SNS topic
				Manage worker subscriptions
	Crowd HTML Elements Reference
		SageMaker Crowd HTML Elements
			crowd-alert
				Attributes
					dismissible
					type
				Element Hierarchy
				See Also
			crowd-badge
				Attributes
					for
					icon
					label
				Element Hierarchy
				See Also
			crowd-button
				Attributes
					disabled
					form-action
					href
					icon
					icon-align
					icon-url
					loading
					target
					variant
				Element Hierarchy
				See Also
			crowd-bounding-box
				Attributes
					header
					initial-value
					labels
					name
					src
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					boundingBoxes
					inputImageProperties
				See Also
			crowd-card
				Attributes
					heading
					image
				Element Hierarchy
				See Also
			crowd-checkbox
				Attributes
					checked
					disabled
					name
					required
					value
				Element Hierarchy
				Output
				See Also
			crowd-classifier
				Attributes
					categories
					header
					name
				Element Hierarchy
				Regions
					classification-target
					full-instructions
					short-instructions
				Output
				See Also
			crowd-classifier-multi-select
				Attributes
					categories
					header
					name
					exclusion-category
				Element Hierarchy
				Regions
					classification-target
					full-instructions
					short-instructions
				Output
				See Also
			crowd-entity-annotation
				Attributes
					header
					initial-value
					labels
					name
					text
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					entities
				See Also
			crowd-fab
				Attributes
					disabled
					icon
					label
					title
				Element Hierarchy
				See Also
			crowd-form
				Element Hierarchy
				Element Events
				See Also
			crowd-icon-button
				Attributes
					disabled
					icon
				Element Hierarchy
				See Also
			crowd-image-classifier
				Attributes
					categories
					header
					name
					overlay
					src
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
					worker-comment
						header
						link-text
						placeholder
				Output
				See Also
			crowd-image-classifier-multi-select
				Attributes
					categories
					header
					name
					src
					exclusion-category
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
				See Also
			crowd-input
				Attributes
					allowed-pattern
					auto-focus
					auto-validate
					disabled
					error-message
					label
					max-length
					min-length
					name
					placeholder
					required
					type
					value
				Element Hierarchy
				Output
				See Also
			crowd-instance-segmentation
				Attributes
					header
					labels
					name
					src
				initial-value
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					labeledImage
					instances
					inputImageProperties
				See Also
			crowd-instructions
				Attributes
					link-text
					link-type
				Element Hierarchy
				Regions
					detailed-instructions
					negative-example
					positive-example
					short-summary
				See Also
			crowd-keypoint
				Attributes
					header
					initial-value
					labels
					name
					src
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					inputImageProperties
					keypoints
				See Also
			crowd-line
				Attributes
					header
					initial-value
					labels
					label-colors
					name
					src
				Regions
					full-instructions
					short-instructions
				Element Hierarchy
				Output
					inputImageProperties
					lines
				See Also
			crowd-modal
				Attributes
					link-text
					link-type
				Element Hierarchy
				See Also
			crowd-polygon
				Attributes
					header
					labels
					name
					src
					initial-value
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					polygons
					inputImageProperties
				See Also
			crowd-polyline
				Attributes
					header
					initial-value
					labels
					label-colors
					name
					src
				Regions
					full-instructions
					short-instructions
				Element Hierarchy
				Output
					inputImageProperties
					polylines
				See Also
			crowd-radio-button
				Attributes
					checked
					disabled
					name
					value
				Element Hierarchy
				Output
				See Also
			crowd-radio-group
				Attributes
				Element Hierarchy
				Output
				See Also
			crowd-semantic-segmentation
				Attributes
					header
					initial-value
					labels
					name
					src
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Output
					labeledImage
					labelMappings
					initialValueModified
					inputImageProperties
				See Also
			crowd-slider
				Attributes
					disabled
					editable
					max
					min
					name
					pin
					required
					secondary-progress
					step
					value
				Element Hierarchy
				See Also
			crowd-tab
				Attributes
					header
				Element Hierarchy
				See Also
			crowd-tabs
				Attributes
				Element Hierarchy
				See Also
			crowd-text-area
				Attributes
					auto-focus
					auto-validate
					char-counter
					disabled
					error-message
					label
					max-length
					max-rows
					name
					placeholder
					rows
					value
				Element Hierarchy
				Output
				See Also
			crowd-toast
				Attributes
					duration
					text
				Element Hierarchy
				See Also
			crowd-toggle-button
				Attributes
					checked
					disabled
					invalid
					name
					required
					value
				Element Hierarchy
				Output
				See Also
		Augmented AI Crowd HTML Elements
			crowd-textract-analyze-document
				Attributes
					header
					src
					initialValue
					blockTypes
					keys
					no-key-edit
					no-geometry-edit
				Element Hierarchy
				Regions
					full-instructions
					short-instructions
				Example of a Worker Template Using the crowd Element
				Output
			crowd-rekognition-detect-moderation-labels
				Attributes
					header
					src
					categories
					exclusion-category
				Element Hierarchy
				AWS Regions
					full-instructions
					short-instructions
				Example Worker Template with the crowd Element
				Output
Prepare and Analyze Datasets
	Detect Pretraining Data Bias
		Amazon SageMaker Clarify Terms for Bias and Fairness
		Sample Notebooks
		Measure Pretraining Bias
			Class Imbalance (CI)
			Difference in Proportions of Labels (DPL)
			Kullback-Leibler Divergence (KL)
			Jensen-Shannon Divergence (JS)
			Lp-norm (LP)
			Total Variation Distance (TVD)
			Kolmogorov-Smirnov (KS)
			Conditional Demographic Disparity (CDD)
		Generate Reports for Bias in Pretraining Data in SageMaker Studio
	Prepare ML Data with Amazon SageMaker Data Wrangler
		Get Started with Data Wrangler
			Prerequisites
			Access Data Wrangler
			Update Data Wrangler
			Demo: Data Wrangler Titanic Dataset Walkthrough
				Upload Dataset to S3 and Import
				Data Flow
					Prepare and Visualize
						Data Exploration
						Drop Unused Columns
						Clean up Missing Values
						Custom Pandas: Encode
					Custom SQL: SELECT Columns
				Export
					Export to Data Wrangler Job Notebook
					Training XGBoost Classifier
					Shut down Data Wrangler
		Import
			Import data from Amazon S3
			Import data from Athena
			Import data from Amazon Redshift
			Import data from Snowflake
				Administrator Guide
					Configure Snowflake with Data Wrangler
					What information needs to be provided to the Data Scientist
				Data Scientist Guide
				Private Connectivity between Data Wrangler and Snowflake via AWS PrivateLink
					Create a VPC
					Set up Snowflake AWS PrivateLink Integration
					Configure DNS for Snowflake Endpoints in your VPC
					Configure Route 53 Resolver Inbound Endpoint for your VPC
					SageMaker VPC Endpoints
			Imported Data Storage
				Amazon Redshift Import Storage
				Amazon Athena Import Storage
		Create and Use a Data Wrangler Flow
			Instances
			The Data Flow UI
			Add a Step to Your Data Flow
			Delete a step from your Data Flow
		Transform Data
			Transform UI
			Join Datasets
			Concatenate Datasets
			Custom Transforms
			Custom Formula
			Encode Categorical
				Ordinal Encode
				One-Hot Encode
			Featurize Text
				Character Statistics
				Vectorize
			Featurize Date/Time
			Format String
			Handle Outliers
				Robust standard deviation numeric outliers
				Standard Deviation Numeric Outliers
				Quantile Numeric Outliers
				Min-Max Numeric Outliers
				Replace Rare
			Handle Missing Values
				Fill Missing
				Impute Missing
				Add Indicator for Missing
				Drop Missing
			Manage Columns
			Manage Rows
			Manage Vectors
			Process Numeric
			Search and Edit
			Parse Value as Type
			Validate String
		Analyze and Visualize
			Histogram
			Scatter Plot
			Table Summary
			Quick Model
			Target Leakage
			Bias Report
			Create Custom Visualizations
		Export
			Export to a Data Wrangler Job
			Export to SageMaker Pipelines
				Use A Jupyter Notebook to Create a Pipeline
			Export to Python Code
			Export to the SageMaker Feature Store
				Use A Jupyter Notebook to Add Features to a Feature Store
			Export to Amazon S3
		Shut Down Data Wrangler
		Update Data Wrangler
		Security and Permissions
			Add a Bucket Policy To Restrict Access to Datasets Imported to Data Wrangler
			Grant an IAM Role Permission to Use Data Wrangler
			Snowflake and Data Wrangler
			Data Encryption with KMS-CMK
				Amazon S3 CMK setup for Data Wrangler imported data storage
		Release Notes
		Troubleshoot
Process Data
	Use Amazon SageMaker Processing Sample Notebooks
	Monitor Amazon SageMaker Processing Jobs with CloudWatch Logs and Metrics
	Data Processing with Apache Spark
		Running a Spark Processing Job
	Data Processing with scikit-learn
	Use Your Own Processing Code
		Run Scripts with Your Own Processing Container
		Build Your Own Processing Container (Advanced Scenario)
			How Amazon SageMaker Processing Runs Your Processing Container Image
			How Amazon SageMaker Processing Configures Input and Output For Your Processing Container
			How Amazon SageMaker Processing Provides Logs and Metrics for Your Processing Container
			How Amazon SageMaker Processing Configures Your Processing Container
			Save and Access Metadata Information About Your Processing Job
			Run Your Processing Container Using the SageMaker Python SDK
Create, Store, and Share Features with Amazon SageMaker Feature Store
	How Feature Store Works
	Create Feature Groups
	Find, Discover, and Share Features
	Real-Time Inference for Features Stored in the Online Store 
	Offline Store for Model Training and Batch Inference
	Feature Data Ingestion
	Get started with Amazon SageMaker Feature Store
		Feature Store Concepts
		Create Feature Groups
			Introduction to Feature Store
				Step 1: Set Up
				Step 2: Inspect your data
				Step 3: Create feature groups
				Step 4: Ingest data into a feature group
				Step 5: Clean up
				Step 6: Next steps
				Step 7: Programmers note
			Fraud Detection with Feature Store
				Step 1: Set Up Feature Store
				Step 2: Load Datasets and Partition Data into Feature Groups
				Step 3: Set Up Feature Groups
				Step 4: Set Up Record Identifier and Event Time Features
				Step 5: Load Feature Definitions
				Step 6: Create a Feature Group
				Step 7: Work with Feature Groups
					Describe a Feature Group
					List Feature Groups
					Put Records in a Feature Group
					Get Records from a Feature Group
					Generate Hive DDL Commands
					Build a Training Dataset
					Write and Execute an Athena Query
					Delete a Feature Group
		Adding required policies to your IAM role
			Step 1: Access AWS Management Console
			Step 2: Choose Roles
			Step 3: Find your role
			Step 4: Attach policy
		Use Amazon SageMaker Feature Store with Amazon SageMaker Studio
			Create a Feature Group in Studio
			View Feature Group Details in Studio
	Data Sources and Ingestion
		Stream Ingestion
		Data Wrangler with Feature Store
	Query Feature Store with Athena and AWS Glue
		Sample Athena Queries
	Cross-Account Offline Store Access
		Step 1: Set Up the Offline Store Access Role in Account A
		Step 2: Set up an Offline Store S3 Bucket in Account B
		Step 3: Set up an Offline Store KMS Encryption Key in Account A
		Step 4: Create a Feature Group in Account A
	Quotas, Naming Rules and Data Types
		Limits and Quotas
		Naming Rules
		Data Types
	Amazon SageMaker Feature Store Offline Store Data Format
	Amazon SageMaker Feature Store Notebook Examples
		Feature Store sample notebooks
Train Models
	Choose an Algorithm
		Choose an algorithm implementation
			Use a built-in algorithm
			Use script mode in a supported framework
			Use a custom Docker image
		Problem types for the basic machine learning paradigms
			Supervised learning
			Unsupervised learning
			Reinforcement learning
		Use Amazon SageMaker Built-in Algorithms
			Supervised Learning
			Unsupervised Learning
			Textual Analysis
			Image Processing
			Common Information About Built-in Algorithms
				Docker Registry Paths and Example Code
					Docker Registry Paths and Example Code for US East (Ohio) (us-east-2)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for US East (N. Virginia) (us-east-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for US West (N. California) (us-west-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for US West (Oregon) (us-west-2)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Africa (Cape Town) (af-south-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Hong Kong) (ap-east-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Mumbai) (ap-south-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Seoul) (ap-northeast-2)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Singapore) (ap-southeast-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Sydney) (ap-southeast-2)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Asia Pacific (Tokyo) (ap-northeast-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Canada (Central) (ca-central-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for China (Beijing) (cn-north-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for China (Ningxia) (cn-northwest-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (Frankfurt) (eu-central-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (Ireland) (eu-west-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (London) (eu-west-2)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Ray PyTorch (DLC)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						VW (algorithm)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (Paris) (eu-west-3)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (Stockholm) (eu-north-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Europe (Milan) (eu-south-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for Middle East (Bahrain) (me-south-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for South America (São Paulo) (sa-east-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Clarify (algorithm)
						Data Wrangler (algorithm)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						Model Monitor (algorithm)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
					Docker Registry Paths and Example Code for AWS GovCloud (US-West) (us-gov-west-1)
						BlazingText (algorithm)
						Chainer (DLC)
						Debugger (algorithm)
						DeepAR Forecasting (algorithm)
						Factorization Machines (algorithm)
						Hugging Face (algorithm)
						IP Insights (algorithm)
						Image classification (algorithm)
						Inferentia MXNet (DLC)
						Inferentia PyTorch (DLC)
						K-Means (algorithm)
						KNN (algorithm)
						LDA (algorithm)
						Linear Learner (algorithm)
						MXNet (DLC)
						MXNet Coach (DLC)
						NTM (algorithm)
						Neo Image Classification (algorithm)
						Neo MXNet (DLC)
						Neo PyTorch (DLC)
						Neo Tensorflow (DLC)
						Neo XGBoost (algorithm)
						Object Detection (algorithm)
						Object2Vec (algorithm)
						PCA (algorithm)
						PyTorch (DLC)
						Random Cut Forest (algorithm)
						Scikit-learn (algorithm)
						Semantic Segmentation (algorithm)
						Seq2Seq (algorithm)
						Spark (algorithm)
						SparkML Serving (algorithm)
						Tensorflow (DLC)
						Tensorflow Coach (DLC)
						Tensorflow Inferentia (DLC)
						Tensorflow Ray (DLC)
						XGBoost (algorithm)
				Common Data Formats for Built-in Algorithms
					Common Data Formats for Training
						Content Types Supported by Built-In Algorithms
						Using Pipe Mode
						Using CSV Format
						Using RecordIO Format
						Trained Model Deserialization
					Common Data Formats for Inference
						Convert Data for Inference Request Serialization
						Convert Data for Inference Response Deserialization
						Common Request Formats for All Algorithms
							JSON Request Format
							JSONLINES Request Format
							CSV Request Format
							RECORDIO Request Format
						Use Batch Transform with Built-in Algorithms
				Instance Types for Built-in Algorithms
				Logs for Built-in Algorithms
					Common Errors
			BlazingText algorithm
				Input/Output Interface for the BlazingText Algorithm
					Training and Validation Data Format
						Training and Validation Data Format for the Word2Vec Algorithm
						Training and Validation Data Format for the Text Classification Algorithm
							Train with File Mode
							Train with Augmented Manifest Text Format
					Model Artifacts and Inference
						Model Artifacts for the Word2Vec Algorithm
							Sample JSON Request
						Model Artifacts for the Text Classification Algorithm
							Sample JSON Request
				EC2 Instance Recommendation for the BlazingText Algorithm
				BlazingText Sample Notebooks
				BlazingText Hyperparameters
					Word2Vec Hyperparameters
					Text Classification Hyperparameters
				Tune a BlazingText Model
					Metrics Computed by the BlazingText Algorithm
					Tunable BlazingText Hyperparameters
						Tunable Hyperparameters for the Word2Vec Algorithm
						Tunable Hyperparameters for the Text Classification Algorithm
			DeepAR Forecasting Algorithm
				Input/Output Interface for the DeepAR Algorithm
				Best Practices for Using the DeepAR Algorithm
				EC2 Instance Recommendations for the DeepAR Algorithm
				DeepAR Sample Notebooks
				How the DeepAR Algorithm Works
					How Feature Time Series Work in the DeepAR Algorithm
				DeepAR Hyperparameters
				Tune a DeepAR Model
					Metrics Computed by the DeepAR Algorithm
					Tunable Hyperparameters for the DeepAR Algorithm
				DeepAR Inference Formats
					DeepAR JSON Request Formats
					DeepAR JSON Response Formats
					Batch Transform with the DeepAR Algorithm
			Factorization Machines Algorithm
				Input/Output Interface for the Factorization Machines Algorithm
				EC2 Instance Recommendation for the Factorization Machines Algorithm
				Factorization Machines Sample Notebooks
				How Factorization Machines Work
				Factorization Machines Hyperparameters
				Tune a Factorization Machines Model
					Metrics Computed by the Factorization Machines Algorithm
					Tunable Factorization Machines Hyperparameters
				Factorization Machines Response Formats
					JSON Response Format
					JSONLINES Response Format
					RECORDIO Response Format
			Image Classification Algorithm
				Input/Output Interface for the Image Classification Algorithm
					Train with RecordIO Format
					Train with Image Format
					Train with Augmented Manifest Image Format
					Incremental Training
					Inference with the Image Classification Algorithm
				EC2 Instance Recommendation for the Image Classification Algorithm
				Image Classification Sample Notebooks
				How Image Classification Works
				Image Classification Hyperparameters
				Tune an Image Classification Model
					Metrics Computed by the Image Classification Algorithm
					Tunable Image Classification Hyperparameters
			IP Insights
				Input/Output Interface for the IP Insights Algorithm
				EC2 Instance Recommendation for the IP Insights Algorithm
					GPU Instances for the IP Insights Algorithm
					CPU Instances for the IP Insights Algorithm
				IP Insights Sample Notebooks
				How IP Insights Works
				IP Insights Hyperparameters
				Tune an IP Insights Model
					Metrics Computed by the IP Insights Algorithm
					Tunable IP Insights Hyperparameters
				IP Insights Data Formats
					IP Insights Training Data Formats
						IP Insights Training Data Input Formats
							INPUT: CSV
					IP Insights Inference Data Formats
						IP Insights Input Request Formats
							INPUT: CSV Format
							INPUT: JSON Format
							INPUT: JSONLINES Format
						IP Insights Output Response Formats
							OUTPUT: JSON Response Format
							OUTPUT: JSONLINES Response Format
			K-Means Algorithm
				Input/Output Interface for the K-Means Algorithm
				EC2 Instance Recommendation for the K-Means Algorithm
				K-Means Sample Notebooks
				How K-Means Clustering Works
					Step 1: Determine the Initial Cluster Centers
					Step 2: Iterate over the Training Dataset and Calculate Cluster Centers
					Step 3: Reduce the Clusters from K to k
				K-Means Hyperparameters
				Tune a K-Means Model
					Metrics Computed by the K-Means Algorithm
					Tunable K-Means Hyperparameters
				K-Means Response Formats
					JSON Response Format
					JSONLINES Response Format
					RECORDIO Response Format
					CSV Response Format
			K-Nearest Neighbors (k-NN) Algorithm
				Input/Output Interface for the k-NN Algorithm
				k-NN Sample Notebooks
				How the k-NN Algorithm Works
					Step 1: Sample
					Step 2: Perform Dimension Reduction
					Step 3: Build an Index
					Serialize the Model
				EC2 Instance Recommendation for the k-NN Algorithm
					Instance Recommendation for Training with the k-NN Algorithm
					Instance Recommendation for Inference with the k-NN Algorithm
				k-NN Hyperparameters
				Tune a k-NN Model
					Metrics Computed by the k-NN Algorithm
					Tunable k-NN Hyperparameters
				Data Formats for k-NN Training Input
					CSV Data Format
					RECORDIO Data Format
				k-NN Request and Response Formats
					INPUT: CSV Request Format
					INPUT: JSON Request Format
					INPUT: JSONLINES Request Format
					INPUT: RECORDIO Request Format
					OUTPUT: JSON Response Format
					OUTPUT: JSONLINES Response Format
					OUTPUT: VERBOSE JSON Response Format
					OUTPUT: RECORDIO-PROTOBUF Response Format
					OUTPUT: VERBOSE RECORDIO-PROTOBUF Response Format
					SAMPLE OUTPUT for the k-NN Algorithm
			Latent Dirichlet Allocation (LDA) Algorithm
				Choosing between Latent Dirichlet Allocation (LDA) and Neural Topic Model (NTM)
				Input/Output Interface for the LDA Algorithm
				EC2 Instance Recommendation for the LDA Algorithm
				LDA Sample Notebooks
				How LDA Works
				LDA Hyperparameters
				Tune an LDA Model
					Metrics Computed by the LDA Algorithm
					Tunable LDA Hyperparameters
			Linear Learner Algorithm
				Input/Output interface for the linear learner algorithm
				EC2 instance recommendation for the linear learner algorithm
				Linear learner sample notebooks
				How linear learner works
					Step 1: Preprocess
					Step 2: Train
					Step 3: Validate and set the threshold
					Step 4: Deploy a trained linear model
				Linear learner hyperparameters
				Tune a linear learner model
					Metrics computed by the linear learner algorithm
					Tuning linear learner hyperparameters
				Linear learner response formats
					JSON response formats
					JSONLINES response formats
					RECORDIO response formats
			Neural Topic Model (NTM) Algorithm
				Input/Output Interface for the NTM Algorithm
				EC2 Instance Recommendation for the NTM Algorithm
				NTM Sample Notebooks
				NTM Hyperparameters
				Tune an NTM Model
					Metrics Computed by the NTM Algorithm
					Tunable NTM Hyperparameters
				NTM Response Formats
					JSON Response Format
					JSONLINES Response Format
					RECORDIO Response Format
			Object2Vec Algorithm
				I/O Interface for the Object2Vec Algorithm
				EC2 Instance Recommendation for the Object2Vec Algorithm
					Instance Recommendation for Training
					Instance Recommendation for Inference
				Object2Vec Sample Notebooks
				How Object2Vec Works
					Step 1: Process Data
					Step 2: Train a Model
					Step 3: Produce Inferences
				Object2Vec Hyperparameters
				Tune an Object2Vec Model
					Metrics Computed by the Object2Vec Algorithm
						Regressor Metrics Computed by the Object2Vec Algorithm
						Classification Metrics Computed by the Object2Vec Algorithm
					Tunable Object2Vec Hyperparameters
				Data Formats for Object2Vec Training
					Input: JSON Lines Request Format
				Data Formats for Object2Vec Inference
					GPU optimization: Classification or Regression
					Input: Classification or Regression Request Format
					Output: Classification or Regression Response Format
				Encoder Embeddings for Object2Vec
					GPU optimization: Encoder Embeddings
					Input: Encoder Embeddings
					Output: Encoder Embeddings
			Object Detection Algorithm
				Input/Output Interface for the Object Detection Algorithm
					Train with the RecordIO Format
					Train with the Image Format
					Train with Augmented Manifest Image Format
					Incremental Training
				EC2 Instance Recommendation for the Object Detection Algorithm
				Object Detection Sample Notebooks
				How Object Detection Works
				Object Detection Hyperparameters
				Tune an Object Detection Model
					Metrics Computed by the Object Detection Algorithm
					Tunable Object Detection Hyperparameters
				Object Detection Request and Response Formats
					Request Format
					Response Formats
					OUTPUT: JSON Response Format
			Principal Component Analysis (PCA) Algorithm
				Input/Output Interface for the PCA Algorithm
				EC2 Instance Recommendation for the PCA Algorithm
				PCA Sample Notebooks
				How PCA Works
					Mode 1: Regular
					Mode 2: Randomized
				PCA Hyperparameters
				PCA Response Formats
					JSON Response Format
					JSONLINES Response Format
					RECORDIO Response Format
			Random Cut Forest (RCF) Algorithm
				Input/Output Interface for the RCF Algorithm
				Instance Recommendations for the RCF Algorithm
				RCF Sample Notebooks
				How RCF Works
					Sample Data Randomly
					Train an RCF Model and Produce Inferences
					Choose Hyperparameters
					References
				RCF Hyperparameters
				Tune an RCF Model
					Metrics Computed by the RCF Algorithm
					Tunable RCF Hyperparameters
				RCF Response Formats
					JSON Response Format
					JSONLINES Response Format
					RECORDIO Response Format
			Semantic Segmentation Algorithm
				Semantic Segmentation Sample Notebooks
				Input/Output Interface for the Semantic Segmentation Algorithm
					How Training Works
					Training with the Augmented Manifest Format
					Incremental Training
					Produce Inferences
				EC2 Instance Recommendation for the Semantic Segmentation Algorithm
				Semantic Segmentation Hyperparameters
				Tuning a Semantic Segmentation Model
					Metrics Computed by the Semantic Segmentation Algorithm
					Tunable Semantic Segmentation Hyperparameters
			Sequence-to-Sequence Algorithm
				Input/Output Interface for the Sequence-to-Sequence Algorithm
				EC2 Instance Recommendation for the Sequence-to-Sequence Algorithm
				Sequence-to-Sequence Sample Notebooks
				How Sequence-to-Sequence Works
				Sequence-to-Sequence Hyperparameters
				Tune a Sequence-to-Sequence Model
					Metrics Computed by the Sequence-to-Sequence Algorithm
					Tunable Sequence-to-Sequence Hyperparameters
			XGBoost Algorithm
				Supported versions
				How to Use SageMaker XGBoost
				Input/Output Interface for the XGBoost Algorithm
				EC2 Instance Recommendation for the XGBoost Algorithm
				XGBoost Sample Notebooks
				How XGBoost Works
				XGBoost Hyperparameters
				Tune an XGBoost Model
					Evaluation Metrics Computed by the XGBoost Algorithm
					Tunable XGBoost Hyperparameters
				Deprecated Versions of XGBoost and their Upgrades
					Upgrade XGBoost Version 0.90 to Version 1.2
						Upgrade SageMaker Python SDK Version 1.x to Version 2.x
						Change the image tag to 1.2-2
						Change Docker Image for Boto3
						Update Hyperparameters and Learning Objectives
					XGBoost Version 0.72
						Input/Output Interface for the XGBoost Release 0.72
						EC2 Instance Recommendation for the XGBoost Release 0.72
						XGBoost Release 0.72 Sample Notebooks
						XGBoost Release 0.72 Hyperparameters
						Tune an XGBoost Release 0.72 Model
							Metrics Computed by the XGBoost Release 0.72 Algorithm
							Tunable XGBoost Release 0.72 Hyperparameters
		Use Reinforcement Learning with Amazon SageMaker
			What are the differences between reinforcement, supervised, and unsupervised learning paradigms?
			Why is Reinforcement Learning Important?
			Markov Decision Process (MDP)
			Key Features of Amazon SageMaker RL
			Reinforcement Learning Sample Notebooks
			Sample RL Workflow Using Amazon SageMaker RL
			RL Environments in Amazon SageMaker
				Use OpenAI Gym Interface for Environments in SageMaker RL
				Use Open-Source Environments
				Use Commercial Environments
			Distributed Training with Amazon SageMaker RL
			Hyperparameter Tuning with Amazon SageMaker RL
	Manage Machine Learning with Amazon SageMaker Experiments
		SageMaker Experiments Features
			Organize Experiments
			Track Experiments
			Compare and Evaluate Experiments
			Amazon SageMaker Autopilot
		Create an Amazon SageMaker Experiment
		View and Compare Amazon SageMaker Experiments, Trials, and Trial Components
			View Experiments, Trials, and Trial Components
			Compare Experiments, Trials, and Trial Components
		Track and Compare Tutorial
			Open the Notebook in Studio
			Install the Experiments SDK and Import Modules
			Transform and Track the Input Data
			Create and Track an Experiment
			Compare and Analyze Trials
		Search Experiments Using Amazon SageMaker Studio
			Search Experiments, Trials, and Trial Components
			Search the SageMaker Studio Leaderboard
			Search by Tag
		Clean Up Amazon SageMaker Experiment Resources
			Clean Up Using the Experiments SDK
			Clean Up Using the Python SDK (Boto3)
		Search Using the Amazon SageMaker Console and API
			Sample Notebooks for Managing ML Experiments
			Organize, Find, and Evaluate Training Jobs (Console)
				Use Tags to Track Training Jobs (Console)
				Find Training Jobs (Console)
				Evaluate Models (Console)
			Find and Evaluate Training Jobs (API)
				Find Training Jobs (API)
				Evaluate Models (API)
				Get Suggestions for a Search (API)
			Verify the Datasets Used by Your Training Jobs
			Trace Model Lineage
				Trace Model Lineage (Console)
				Trace Model Lineage (API)
	Amazon SageMaker Debugger
		Amazon SageMaker Debugger Features
		Supported Frameworks and Algorithms
			Use Debugger with Custom Training Containers
			Debugger Open-Source GitHub Repositories
		Amazon SageMaker Debugger Architecture
		Get Started with Debugger Tutorials
			Debugger Tutorial Videos
				Analyze, Detect, and Get Alerted on Problems with Training Runs Using Amazon SageMaker Debugger
				Debug Models with Amazon SageMaker Debugger in Studio
				Deep Dive on Amazon SageMaker Debugger and SageMaker Model Monitor
			Debugger Example Notebooks
				Debugger Example Notebooks for Profiling Training Jobs
				Debugger Example Notebooks for Analyzing Model Parameters
			Debugger Advanced Demos and Visualization
				Train and Tune Your Models with Amazon SageMaker Experiments and Debugger
				Using SageMaker Debugger to Monitor a Convolutional Autoencoder Model Training
				Using SageMaker Debugger to Monitor Attentions in BERT Model Training
				Using SageMaker Debugger to Visualize Class Activation Maps in Convolutional Neural Networks (CNNs)
		Configure Debugger Using Amazon SageMaker Python SDK
			Construct a SageMaker Estimator with Debugger
			Configure Debugger to Monitor Hardware System Resource Utilization
			Configure Debugger Framework Profiling
				Start a Training Job with the Default System Monitoring and Framework Profiling
				Start a Training Job with the Default System Monitoring and Customized Framework Profiling for Target Steps or a Target Time Range
				Start a Training Job with the Default System Monitoring and Customized Framework Profiling with Different Profiling Options
			Updating Debugger System Monitoring and Framework Profiling Configuration while a Training Job is Running
			Configure Debugger Hook to Save Tensors
				Configure Debugger Tensor Collections Using the CollectionConfig API Operation
				Configure Debugger Hook to Save Tensors
				Example Notebooks and Code Samples to Configure Debugger Hook
					Tensor Visualization Example Notebooks
					Save Tensors Using Debugger Built-in Collections
					Save Tensors Using Debugger Modified Built-in Collections
					Save Tensors Using Debugger Custom Collections
			Configure Debugger Built-in Rules
				Use Debugger Built-in Rules with the Default Parameter Settings
				Use Debugger Built-in Rules with Custom Parameter Values
				Example Notebooks and Code Samples to Configure Debugger Rules
					Debugger Built-in Rules Example Notebooks
					Debugger Built-in Rules Example Code
					Use Debugger Built-in Rules with Parameter Modifications
			Turn Off Debugger
			Useful SageMaker Estimator Classmethods for Debugger
		Configure Debugger Using Amazon SageMaker API
			JSON (AWS CLI)
				To configure a Debugger rule for debugging model parameters
				To configure a Debugger built-in rule for profiling system and framework metrics
				Update Debugger Profiling Configuration Using the UpdateTrainingJob API Operation
				Add Debugger Custom Rule Configuration to the CreateTrainingJob API Operation
			AWS Boto3
				To configure a Debugger rule for debugging model parameters
				To configure a Debugger built-in rule for profiling system and framework metrics
				Update Debugger Profiling Configuration Using the UpdateTrainingJob API Operation
				Add Debugger Custom Rule Configuration to the CreateTrainingJob API Operation
		List of Debugger Built-in Rules
			Debugger ProfilerRule
			Debugger Rule
			ProfilerReport
			BatchSize
			CPUBottleneck
			GPUMemoryIncrease
			IOBottleneck
			LoadBalancing
			LowGPUUtilization
			OverallSystemUsage
			MaxInitializationTime
			OverallFrameworkMetrics
			StepOutlier
			CreateXgboostReport
			DeadRelu
			ExplodingTensor
			PoorWeightInitialization
			SaturatedActivation
			VanishingGradient
			WeightUpdateRatio
			AllZero
			ClassImbalance
			LossNotDecreasing
			Overfit
			Overtraining
			SimilarAcrossRuns
			StalledTrainingRule
			TensorVariance
			UnchangedTensor
			CheckInputImages
			NLPSequenceRatio
			Confusion
			FeatureImportanceOverweight
			TreeDepth
		Create Debugger Custom Rules for Training Job Analysis
			Prerequisites for Creating Debugger Custom Rules
			Use the Debugger Client Library smdebug to Create a Custom Rule Python Script
			Use the Debugger APIs to Run Your Own Custom Rules
		Use Debugger with Custom Training Containers
			Prepare to Build a Custom Training Container
			Register Debugger Hook to Your Training Script
			Create and Configure a Dockerfile
			Build and Push the Custom Training Container to Amazon ECR
			Run and Debug Training Jobs Using the Custom Training Container
		Action on Amazon SageMaker Debugger Rules
			Debugger Built-in Actions for Rules
				Step 1: Set Up Amazon SNS, Create an SMDebugRules Topic, and Subscribe to the Topic
				Step 2: Set Up Your IAM Role to Attach Required Policies
				Step 3: Configure Debugger Rules with the Built-in Actions
				Considerations for Using the Debugger Built-in Actions
			Create Actions on Rules Using Amazon CloudWatch and AWS Lambda
				CloudWatch Logs for Debugger Rules and Training Jobs
				Set Up Debugger for Automated Training Job Termination Using CloudWatch and Lambda
					Step 1: Create a Lambda Function
					Step 2: Configure the Lambda function
					Step 3: Create a CloudWatch Events Rule and Link to the Lambda Function for Debugger
				Run Example Notebooks to Test Automated Training Job Termination
				Disable the CloudWatch Events Rule to Stop Using the Automated Training Job Termination
		SageMaker Debugger on Studio
			Open Amazon SageMaker Debugger Insights Dashboard
			SageMaker Debugger Insights Dashboard Controller
				SageMaker Debugger Insights Controller UI
				Enable and Configure Debugger Profiling for Detailed Insights
			SageMaker Debugger Insights Dashboard Walkthrough
				Debugger Insights – Overview
					Training job summary
					Resource utilization summary
					Resource intensive operations
					Insights
				Debugger Insights – Nodes
			Shut Down the SageMaker Debugger Insights Instance
			SageMaker Debugger on Studio Experiments
				Visualize Tensors Using SageMaker Debugger and Studio
					Loss Curves While Training Is in Progress
					Analyzing Training Jobs: Comparing Loss Curves Across Multiple Jobs
					Rules Triggering and Logs from Jobs
		SageMaker Debugger Interactive Reports
			SageMaker Debugger Profiling Report
				Download a Debugger Profiling Report
				Debugger Profiling Report Walkthrough
					Training Job Summary
					System Usage Statistics
					Framework metrics summary
						Overview: CPU Operators
						Overview: GPU Operators
					Rules Summary
					Analyzing the Training Loop – Step Durations
					GPU Utilization Analysis
					Batch Size
					CPU Bottlenecks
					I/O Bottlenecks
					LoadBalancing in Multi-GPU Training
					GPU Memory Analysis
			SageMaker Debugger XGBoost Training Report
				Construct a SageMaker XGBoost Estimator with the Debugger XGBoost Report Rule
				Download the Debugger XGBoost Training Report
				Debugger XGBoost Training Report Walkthrough
					Distribution of True Labels of the Dataset
					Loss versus Step Graph
					Feature Importance
					Confusion Matrix
					Evaluation of the Confusion Matrix
					Accuracy Rate of Each Diagonal Element Over Iteration
					Receiver Operating Characteristic Curve
					Distribution of Residuals at the Last Saved Step
					Absolute Validation Error per Label Bin Over Iteration
		Analyze Data Using the SMDebug Client Library
			Access the Monitoring and Profiling Data
			Plot the System Metrics and Framework Metrics Data
			Access the Profiling Data Using the Pandas Data Parsing Tool
			Access the Python Profiling Stats Data
			Merge Timelines of Different Profiling Trace Files
			Profiling Data Loader
		Visualize Amazon SageMaker Debugger Output Tensors in TensorBoard
		Best Practices for Amazon SageMaker Debugger
			Choose a Machine Learning Framework
			Use Studio Debugger Insights Dashboard
			Download Debugger Reports and Gain More Insights
			Capture Data from Your Training Job and Save Data to Amazon S3
			Analyze the Data with a Fleet of Debugger Built-in Rules
			Take Actions Based on the Built-in Rule Status
			Dive Deep into the Data Using the SMDebug Client Library
			Monitoring System Utilization and Detecting Bottlenecks
			Profiling Framework Operations
			Debugging Model Parameters
		Amazon SageMaker Debugger Advanced Topics and Reference Documentation
			Amazon SageMaker Debugger API Operations
			Use Debugger Docker Images for Built-in or Custom Rules
				Amazon SageMaker Debugger Registry URLs for Built-in Rule Evaluators
				Amazon SageMaker Debugger Registry URLs for Custom Rule Evaluators
			Amazon SageMaker Debugger Exceptions
			Considerations for Amazon SageMaker Debugger
				Considerations for Distributed Training
				Considerations for Monitoring System Bottlenecks and Profiling Framework Operations
				Considerations for Debugging Model Output Tensors
			Amazon SageMaker Debugger Usage Statistics
				Debugger Profiling Report Usage
					(Recommended) Option 1: Opt Out before Running a Training Job
					Option 2: Opt Out after a Training Job Has Completed
	Perform Automatic Model Tuning with SageMaker
		How Hyperparameter Tuning Works
			Random Search
			Bayesian Search
		Define Metrics
		Define Hyperparameter Ranges
			Hyperparameter Scaling
		Tune Multiple Algorithms with Hyperparameter Optimization to Find the Best Model
			Get Started
			Create a Hyperparameter Optimization Tuning Job for One or More Algorithms (Console)
				Define job settings
				Create Training Job Definitions
					Configure algorithm and parameters
					Define Data Input and Output
					Configure Training Job Resources
					Add or Clone a Training Job
				Configure Tuning Job Resources
				Review and Create HPO Tuning Job
			Manage Hyperparameter Tuning and Training Jobs
		Example: Hyperparameter Tuning Job
			Prerequisites
			Create a Notebook
				Next Step
			Get the Amazon SageMaker Boto 3 Client
				Next Step
			Get the SageMaker Execution Role
				Next Step
			Specify an S3 Bucket to Upload Training Datasets and Store Output Data
				Next Step
			Download, Prepare, and Upload Training Data
				Download and Explore the Training Dataset
				Prepare and Upload Data
				Next Step
			Configure and Launch a Hyperparameter Tuning Job
				Specify the Hyperparameter Tuning Job Settings
				Configure the Training Jobs
				Name and Launch the Hyperparameter Tuning Job
				Next Step
			Monitor the Progress of a Hyperparameter Tuning Job
				View the Status of the Hyperparameter Tuning Job
				View the Status of the Training Jobs
				View the Best Training Job
					Next Step
			Clean up
		Stop Training Jobs Early
			How Early Stopping Works
			Algorithms That Support Early Stopping
		Run a Warm Start Hyperparameter Tuning Job
			Types of Warm Start Tuning Jobs
			Warm Start Tuning Restrictions
			Warm Start Tuning Sample Notebook
			Create a Warm Start Tuning Job
				Create a Warm Start Tuning Job (Low-level SageMaker API for Python (Boto 3))
				Create a Warm Start Tuning Job (SageMaker Python SDK)
		Resource Limits for Automatic Model Tuning
		Best Practices for Hyperparameter Tuning
			Choosing the Number of Hyperparameters
			Choosing Hyperparameter Ranges
			Using Logarithmic Scales for Hyperparameters
			Choosing the Best Number of Concurrent Training Jobs
			Running Training Jobs on Multiple Instances
	Amazon SageMaker Distributed Training Libraries
		Get Started with Distributed Training
		Basic Distributed Training Concepts
		Advanced Concepts
		Strategies
			Train with Data Parallel and Model Parallel
		Optimize Distributed Training
			Batch Size
			Mini-Batch Size
		Scenarios
			Scaling from a Single GPU to Many GPUs
			Scaling from a Single Instance to Multiple Instances
			Availability Zones and Network Backplane
			Optimized GPU, Network, and Storage
			Custom Training Scripts
		SageMaker Built-In Distributed Training Features
		SageMaker's Distributed Data Parallel Library
			Introduction to SageMaker's Distributed Data Parallel Library
				Why Use SageMaker Distributed Data Parallel Library?
				Training Benchmarks
				Optimal Bandwidth Use with Balanced Fusion Buffer
				Optimal GPU Usage with Efficient AllReduce Overlapping with a Backward Pass
				SageMaker Distributed Data Parallel Architecture
			Modify Your Training Script Using the SageMaker Data Parallel Library
				Script Modification Overview
				Modify a TensorFlow Training Script
				Modify a PyTorch Training Script
			Run a SageMaker Distributed Data Parallel Training Job
				Use SageMaker's Distributed Data Parallel Library
				Use the Data Parallel Library with SageMaker's Python SDK
				TensorFlow Estimator
				PyTorch Estimator
			SageMaker distributed data parallel Configuration Tips and Pitfalls
				Data Preprocessing
				Single vs Multiple Nodes
				Debug Scaling Efficiency with Debugger
				Batch Size
				Custom MPI Options
			Data Parallel Library FAQ
			Data Parallel Troubleshooting
				Considerations for Using SageMaker Distributed Data Parallel with SageMaker Debugger and Checkpoints
				An Unexpected Prefix (model for example) Is Attached to state_dict keys (model parameters) from a PyTorch Distributed Training Job
		SageMaker's Distributed Model Parallel
			Introduction to Model Parallelism
				What is Model Parallelism?
				Important Considerations when Using Model Parallelism
			Core Features of SageMaker Distributed Model Parallel
				Automated Model Splitting
					How It Works
						Automated Model Splitting with PyTorch
						Automated Model Splitting with TensorFlow
						Comparison of Automated Model Splitting Between Frameworks
				Manual Model Splitting
				Pipeline Execution Schedule
					Interleaved Pipeline
					Simple Pipeline
					Pipelining Execution in Specific Frameworks
						Pipeline Execution with TensorFlow
						Pipeline Execution with PyTorch
			Modify Your Training Script Using SageMaker's Distributed Model Parallel Library
				Modify a TensorFlow Training Script
					Unsupported Framework Features
					TensorFlow
					TensorFlow with Horovod
					Manual partitioning with TensorFlow
				Modify a PyTorch Training Script
					Important Considerations
					Unsupported Framework Features
					PyTorch
					Manual Partitioning with PyTorch
			Run a SageMaker Distributed Model Parallel Training Job
				Launch a Training Job with the SageMaker Python SDK
				Extend or Adapt a Docker Container that Contains SageMaker's Distributed Model Parallel Library
			SageMaker distributed model parallel Configuration Tips and Pitfalls
			Model Parallel Troubleshooting
				Considerations for Using SageMaker Debugger with SageMaker Distributed Model Parallel
				Saving Checkpoints
				Convergence Using Model Parallel and TensorFlow
		Amazon SageMaker Distributed Training Notebook Examples
			Blogs and Case Studies
			PyTorch Examples
			TensorFlow Examples
			HuggingFace Examples
			How to Access or Download the SageMaker Distributed Training Notebook Examples
				Option 1: Use a SageMaker notebook instance
				Option 2: Clone the SageMaker example repository to SageMaker Studio or notebook instance
	Detect Posttraining Data and Model Bias
		Sample Notebooks
		Measure Posttraining Data and Model Bias
			Difference in Positive Proportions in Predicted Labels (DPPL)
			Disparate Impact (DI)
			Difference in Conditional Acceptance (DCAcc)
			Difference in Conditional Rejection (DCR)
			Recall Difference (RD)
			Difference in Acceptance Rates (DAR)
			Difference in Rejection Rates (DRR)
			Accuracy Difference (AD)
			Treatment Equality (TE)
			Conditional Demographic Disparity in Predicted Labels (CDDPL)
			Counterfactual Fliptest (FT)
		Configure an Amazon SageMaker Clarify Processing Job for Fairness and Explainability
			Prerequisites
			Getting Started with a SageMaker Clarify Container
			How It Works
			Configure a Processing Job Container's Input and Output Parameters
			Configure the Analysis
				Example Analysis Configuration JSON File for a CSV Dataset
				Example Analysis Configuration JSON File for a JSONLines Dataset
		Run SageMaker Clarify Processing Jobs for Bias Analysis and Explainability
			Compute Resources Required for SageMaker Clarify Processing Jobs
			Run the Clarify Processing Job
			Run the Clarify Processing Job with Spark
			Get the Analysis Results
		Troubleshoot SageMaker Clarify Processing Jobs
			Processing job fails to finish
			Processing job finishes without results and you get a CloudWatch warning message
			Error message for invalid analysis configuration
			Bias metric computation fails for several or all metrics
			Mismatch between analysis config and dataset/model input/output
			Model returns 500 Internal Server Error or container falls back to per-record predictions due to model error
			Execution role is invalid
			Failed to download data
			Could not connect to SageMaker
	Model Explainability
		Feature Attributions that Use Shapley Values
		SHAP Baselines for Explainability
		Create Feature Attribute Baselines and Explainability Reports
	Incremental Training in Amazon SageMaker
		Perform Incremental Training (Console)
		Perform Incremental Training (API)
	Managed Spot Training in Amazon SageMaker
		Using Managed Spot Training
		Managed Spot Training Lifecycle
	Use Checkpoints in Amazon SageMaker
		Checkpoints for Frameworks and Algorithms in SageMaker
		Enable Checkpointing
		Browse Checkpoint Files
		Resume Training From a Checkpoint
		Considerations for Checkpointing
	Provide Dataset Metadata to Training Jobs with an Augmented Manifest File
		Augmented Manifest File Format
		Stream Augmented Manifest File Data
		Use an Augmented Manifest File (Console)
		Use an Augmented Manifest File (API)
	Monitor and Analyze Training Jobs Using Metrics
		Training Metrics Sample Notebooks
		Defining Training Metrics
			Defining Regular Expressions for Metrics
			Defining Training Metrics (Low-level SageMaker API)
			Defining Training Metrics (SageMaker Python SDK)
			Define Training Metrics (Console)
		Monitoring Training Job Metrics (Console)
		Monitoring Training Job Metrics (SageMaker Console)
		Example: Viewing a Training and Validation Curve
Deploy Models for Inference
	Prerequisites
	What do you want to do?
	Manage Model Deployments
	Deploy Your Own Inference Code
	Guide to SageMaker
	Use Amazon SageMaker Elastic Inference (EI)
		How EI Works
		Choose an EI Accelerator Type
		Use EI in a SageMaker Notebook Instance
		Use EI on a Hosted Endpoint
		Frameworks that Support EI
		Use EI with SageMaker Built-in Algorithms
		EI Sample Notebooks
		Set Up to Use EI
			Set Up Required Permissions
			Use a Custom VPC to Connect to EI
				Set up Security Groups to Connect to EI
				Set up a VPC Interface Endpoint to Connect to EI
		Attach EI to a Notebook Instance
			Set Up to Use EI
			Use EI in Local Mode in SageMaker
				Use EI in Local Mode with SageMaker TensorFlow Estimators and Models
				Use EI in Local Mode with SageMaker Apache MXNet Estimators and Models
				Use EI in Local Mode with SageMaker PyTorch Estimators and Models
		Use EI on Amazon SageMaker Hosted Endpoints
			Use EI with a SageMaker TensorFlow Container
				Use an Estimator Object
				Use a Model Object
			Use EI with a SageMaker MXNet Container
				Use an Estimator Object
				Use a Model Object
			Use EI with a SageMaker PyTorch Container
				Use an Estimator Object
				Use a Model Object
			Use EI with Your Own Container
				Import the EI Version of TensorFlow, MXNet, or PyTorch into Your Docker Container
				Create an EI Endpoint with AWS SDK for Python (Boto 3)
					Create an Endpoint Configuration
					Create an Endpoint
	Asynchronous Inference
		How It Works
		How Do I Get Started?
		Create, Invoke, and Update an Asynchronous Endpoint
			Prerequisites
			Create an Asynchronous Inference Endpoint
				Create a Model
				Create an Endpoint Configuration
				Create Endpoint
			Invoke an Asynchronous Endpoint
			Update an Asynchronous Endpoint
			Delete an Asynchronous Endpoint
		Monitor Asynchronous Endpoint
			Monitoring with CloudWatch
				Common Endpoint Metrics
				Asynchronous Inference Endpoint Metrics
			Logs
		Check Prediction Results
			Amazon SNS Topics
			Check Your S3 Bucket
		Autoscale an Asynchronous Endpoint
			Define a Scaling Policy
			Define a Scaling Policy that Scales to 0
	Use Batch Transform
		Use Batch Transform to Get Inferences from Large Datasets
		Speed up a Batch Transform Job
		Use Batch Transform to Test Production Variants
		Batch Transform Errors
		Batch Transform Sample Notebooks
		Associate Prediction Results with Input Records
			Workflow for Associating Inferences with Input Records
			Use Data Processing in Batch Transform Jobs
			Supported JSONPath Operators
			Batch Transform Examples
				Example: Output Only Inferences
				Example: Output Input Data and Inferences
				Example: Output an ID Column with Results and Exclude the ID Column from the Input (CSV)
				Example: Output an ID Attribute with Results and Exclude the ID Attribute from the Input (JSON)
	Host Multiple Models with Multi-Model Endpoints
		Supported Algorithms and Frameworks
		Sample Notebooks for Multi-Model Endpoints
		How Multi-Model Endpoints Work
		Setting SageMaker Multi-Model Endpoint Model Caching Behavior
		Instance Recommendations for Multi-Model Endpoint Deployments
		Create a Multi-Model Endpoint
			Create a Multi-Model Endpoint (Console)
			Create a Multi-Model Endpoint (AWS SDK for Python (Boto))
		Invoke a Multi-Model Endpoint
			Retry Requests on ModelNotReadyException Errors
		Add or Remove Models
		Build Your Own Container with Multi Model Server
			Use the SageMaker Inference Toolkit
			Contract for Custom Containers to Serve Multiple Models
				Load Model API
				List Model API
				Get Model API
				Unload Model API
				Invoke Model API
		Multi-Model Endpoint Security
		CloudWatch Metrics for Multi-Model Endpoint Deployments
	Deploy multi-container endpoints
		Create a multi-container endpoint (Boto 3)
		Update a multi-container endpoint
		Delete a multi-container endpoint
		Use a multi-container endpoint with direct invocation
			Invoke a multi-container endpoint with direct invocation
			Security with multi-container endpoints with direct invocation
			Metrics for multi-container endpoints with direct invocation
			Autoscale multi-container endpoints
			Troubleshoot multi-container endpoints
				Ping Health Check Errors
				Missing accept-bind-to-port=true Docker label
		Deploy an Inference Pipeline
			Sample Notebooks for Inference Pipelines
			Feature Processing with Spark ML and Scikit-learn
				Feature Processing with Spark ML
				Feature Processing with Scikit-learn
			Create a Pipeline Model
			Run Real-time Predictions with an Inference Pipeline
				Create and Deploy an Inference Pipeline Endpoint
				Request Real-Time Inference from an Inference Pipeline Endpoint
				Real-time Inference Pipeline Example
			Run Batch Transforms with Inference Pipelines
			Inference Pipeline Logs and Metrics
				Use Metrics to Monitor Multi-container Models
				Use Logs to Monitor an Inference Pipeline
			Troubleshoot Inference Pipelines
				Troubleshoot Amazon ECR Permissions for Inference Pipelines
				Use CloudWatch Logs to Troubleshoot SageMaker Inference Pipelines
				Use Error Messages to Troubleshoot Inference Pipelines
	Automatically Scale Amazon SageMaker Models
		Prerequisites
			Autoscaling policy overview
			Target metric for autoscaling
			Minimum and maximum capacity
			Cooldown period
			Permissions
			Service-linked role
		Configure model autoscaling with the console
		Register a model
			Register a model with the AWS CLI
			Register a model with the Application Auto Scaling API
		Define a scaling policy
			Use a predefined metric
			Use a custom metric
			Add a cooldown period
		Apply a scaling policy
			Apply a scaling policy (AWS CLI)
			Apply a scaling policy (Application Auto Scaling API)
		Edit a scaling policy
			Scale-in
				Disable scale-in activity
			Scale-out
				Disable scale-out activity
			Edit a scaling policy (Console)
			Edit a scaling policy (AWS CLI or Application Auto Scaling API)
		Delete a scaling policy
			Delete a scaling policy (Console)
			Delete a scaling policy (AWS CLI or Application Auto Scaling API)
				Delete a scaling policy (AWS CLI)
				Delete a scaling policy (Application Auto Scaling API)
		Query Endpoint Autoscaling History
			How To Query Endpoint Autoscaling Actions
			How to Identify Blocked AutoScaling Due to Instance Quotas
		Update or delete endpoints that use automatic scaling
			Update endpoints that use automatic scaling
			Delete endpoints configured for automatic scaling
		Load testing your autoscaling configuration
			Determine the performance characteristics
			Calculate the target load
		Use AWS CloudFormation to update autoscaling policies
	Host Instance Storage Volumes
	Test models in production
		Test models by specifying traffic distribution
		Test models by invoking specific variants
		Model A/B test example
			Step 1: Create and deploy models
			Step 2: Invoke the deployed models
			Step 3: Evaluate model performance
			Step 4: Increase traffic to the best model
	Troubleshoot Amazon SageMaker Model Deployments
		Detection Errors in the Active CPU Count
	Deployment Best Practices
		Deploy Multiple Instances Across Availability Zones
	Amazon SageMaker Model Monitor
		How Model Monitor Works
			Model Monitor Sample Notebooks
		Monitor Data Quality
			Create a Baseline
			Schema for Statistics (statistics.json file)
			CloudWatch Metrics
			Schema for Violations (constraint_violations.json file)
		Monitor Model Quality
			Create a Model Quality Baseline
			Schedule Model Quality Monitoring Jobs
			Ingest Ground Truth Labels and Merge Them With Predictions
			Model Quality Metrics
				Regression Metrics
				Binary Classification Metrics
				Multiclass Metrics
			Model Quality CloudWatch Metrics
		Monitor Bias Drift for Models in Production
			Model Monitor Sample Notebook
			Create a Bias Drift Baseline
			Schedule Bias Drift Monitoring Jobs
			Inspect Reports for Data Bias Drift
		Monitor Feature Attribution Drift for Models in Production
			Model Monitor Example Notebook
			Create a SHAP Baseline for Models in Production
			Schedule Feature Attribute Drift Monitoring Jobs
			Inspect Reports for Feature Attribute Drift in Production Models
		Capture Data
		Schedule Monitoring Jobs
			The cron Expression for Monitoring Schedule
		Amazon SageMaker Model Monitor Prebuilt Container
		Interpret Results
			List Executions
			Inspect a Specific Execution
			List Generated Reports
			Violations Report
		Visualize Results in Amazon SageMaker Studio
		Advanced Topics
			Customize Monitoring
				Preprocessing and Postprocessing
					Postprocessing Script
					Preprocessing Script
				Bring Your Own Containers
					Container Contract Inputs
					Container Contract Outputs
						Schema for Statistics (statistics.json file)
						Schema for Constraints (constraints.json file)
					CloudWatch Metrics for Bring Your Own Containers
			Create a Monitoring Schedule with an AWS CloudFormation Custom Resource
				Custom Resource
				Lambda Custom Resource Code
	Register and Deploy Models with Model Registry
		Create a Model Group
			Create a Model Package Group (Boto3)
			Create a Model Package Group (SageMaker Studio)
		Register a Model Version
			Register a Model Version (SageMaker Pipelines)
			Register a Model Version (Boto3)
		View Model Groups and Versions
			View a List of Model Versions in a Group
				View a List of Model Versions in a Group (Boto3)
				View a List of Model Versions in a Group (SageMaker Studio)
		View the Details of a Model Version
			View the Details of a Model Version (Boto3)
			View the Details of a Model Version (SageMaker Studio)
		Update the Approval Status of a Model
			Update the Approval Status of a Model (Boto3)
			Update the Approval Status of a Model (SageMaker Studio)
		Deploy a Model in the Registry
			Deploy a Model in the Registry (SageMaker SDK)
			Deploy a Model in the Registry (Boto3)
		Deploy a Model Version from a Different Account
		View the Deployment History of a Model
	Compile and Deploy Models with Neo
		What is SageMaker Neo?
		How it Works
		Neo Sample Notebooks
		Use Neo to Compile a Model
			Prepare Model for Compilation
				What input data shapes does SageMaker Neo expect?
					Keras
					MXNet/ONNX
					PyTorch
					TensorFlow
					TFLite
					XGBoost
				Saving Models for SageMaker Neo
					Keras
					MXNet
					PyTorch
					TensorFlow
					Built-In Estimators
			Compile a Model (AWS Command Line Interface)
			Compile a Model (Amazon SageMaker Console)
			Compile a Model (Amazon SageMaker SDK)
		Cloud Instances
			Supported Instance Types and Frameworks
				Cloud Instances
				Instance Types
				AWS Inferentia
				Amazon Elastic Inference
			Deploy a Model
				Prerequisites
				Deploy a Compiled Model Using SageMaker SDK
					If you compiled your model using the SageMaker SDK
					If you compiled your model using MXNet or PyTorch
					If you compiled your model using Boto3, SageMaker console, or the CLI for TensorFlow
				Deploy a Compiled Model Using Boto3
					Deploy the Model
				Deploy a Compiled Model Using the AWS CLI
					Deploy the Model
						Create a Model
						Create an Endpoint Configuration
						Create an Endpoint
				Deploy a Compiled Model Using the Console
					Deploy the Model
			Request Inferences from a Deployed Service
				Request Inferences from a Deployed Service (Amazon SageMaker SDK)
					PyTorch and MXNet
					TensorFlow
				Request Inferences from a Deployed Service (Boto3)
				Request Inferences from a Deployed Service (AWS CLI)
			Inference Container Images
				Amazon SageMaker XGBoost
				TensorFlow
				MXNet
				PyTorch
		Edge Devices
			Supported Frameworks, Devices, Systems, and Architectures
				Supported Frameworks
				Supported Devices, Chip Architectures, and Systems
					Devices
					Systems and Chip Architectures
				Tested Models
					DarkNet
					MXNet
					Keras
					ONNX
					PyTorch (FP32)
					TensorFlow
					TensorFlow-Lite
			Deploy Models
				Deploy a Compiled Model (DLR)
				Deploy a Model (AWS IoT Greengrass)
			Getting Started with Neo on Edge Devices
				Prerequisites
				Step 1: Compile the Model
				Step 2: Set Up Your Device
				Step 3: Make Inferences on Your Device
		Troubleshoot Errors
			Error Classification Types
				Client permission error
				Load error
				Compilation error
			Troubleshoot Neo Compilation Errors
				How to Use This Page
				Framework-Related Errors
					TensorFlow
					Keras
					MXNet
				Infrastructure-Related Errors
			Troubleshoot Neo Inference Errors
			Troubleshoot Ambarella Errors
				Setting up the Configuration File
				Calibration Images
				Mean and Scale
	SageMaker Edge Manager
		Why Use Edge Manager?
		How Does it Work?
		How Do I Use SageMaker Edge Manager?
		Getting Started
			Setting Up
			Train, Compile, and Package Your Model
			Create and Register Fleets and Authenticate Devices
			Download and Set Up Edge Manager
			Run Agent
		Set Up Devices and Fleets
			Create a Fleet
				Create a Fleet (Boto3)
				Create a Fleet (Console)
			Register a Device
				Register a Device (Boto3)
				Register a Device (Console)
			Check Status
		Package Model
			Prerequisites
			Package a Model (Amazon SageMaker Console)
			Package a Model (Boto3)
		Edge Manager Agent
			Download and Set Up Edge Manager Agent Manually
				How the Agent Works
				Installing Edge Manager Agent
				Running SageMaker Edge Manager Agent
			Deploy Model Package and Edge Manager Agent with AWS IoT Greengrass
				Prerequisites
				Create AWS IoT Greengrass V2 Components
					Autogenerated Component
					Create a Hello World custom component
				Deploy Components to Your Device
					To deploy your components (console)
					To deploy your components (AWS CLI)
		Manage Model
			Load Model
			Unload Model
			List Models
			Describe Model
			Capture Data
			Get Capture Status
			Predict
Using Docker containers with SageMaker
	Scenarios for Running Scripts, Training Algorithms, or Deploying Models with SageMaker
	Docker Container Basics
	Use Prebuilt SageMaker Docker images
		Prebuilt SageMaker Docker Images for Deep Learning
			Using the SageMaker Python SDK
			Extending Prebuilt SageMaker Docker Images
		Prebuilt Amazon SageMaker Docker Images for Scikit-learn and Spark ML
			Using the SageMaker Python SDK
			Specifying the Prebuilt Images Manually
				Finding Available Images
		Train a Deep Graph Network
			What Is a Deep Graph Network?
			Get Started
			Run a Graph Network Training Example
				Examples
				Use a Deep Learning Container with DGL
				Bring Your Own Container with DGL
		Extend a Prebuilt Container
			Requirements to Extend a Prebuilt Container
			Extend SageMaker Containers to Run a Python Script
				Step 1: Create a SageMaker Notebook Instance
				Step 2: Create and Upload the Dockerfile and Python Training Scripts
				Step 3: Build the Container
				Step 4: Test the Container
				Step 5: Push the Container to Amazon Elastic Container Registry (Amazon ECR)
				Step 6: Clean up Resources
	Adapting Your Own Docker Container to Work with SageMaker
		Individual Framework Libraries
		Using the SageMaker Training and Inference Toolkits
			SageMaker Toolkits Containers Structure
			Single Versus Multiple Containers
		Adapting Your Own Training Container
			Step 1: Create a SageMaker notebook instance
			Step 2: Create and upload the Dockerfile and Python training scripts
			Step 3: Build the container
			Step 4: Test the container
			Step 5: Push the container to Amazon Elastic Container Registry (Amazon ECR)
			Step 6: Clean up resources
		Adapting Your Own Inference Container
			Step 1: Create an Inference Handler
				The model_fn Function
				The input_fn Function
				The predict_fn Function
				The output_fn Function
			Step 2: Implement a Handler Service
			Step 3: Implement an Entrypoint
			Step 4: Write a Dockerfile
			Step 5: Build and Register Your Container
	Create a container with your own algorithms and models
		Use Your Own Training Algorithms
			How Amazon SageMaker Runs Your Training Image
			How Amazon SageMaker Provides Training Information
				Hyperparameters
				Environment Variables
				Input Data Configuration
				Training Data
				Distributed Training Configuration
			Run Training with EFA
				Prerequisites
				Install EFA and required packages
				Considerations when creating your container
				Verify that your EFA device is recognized
				Running a training job with EFA
			How Amazon SageMaker Signals Algorithm Success and Failure
			How Amazon SageMaker Processes Training Output
		Use Your Own Inference Code
			Use Your Own Inference Code with Hosting Services
				How SageMaker Runs Your Inference Image
				How SageMaker Loads Your Model Artifacts
				How Containers Serve Requests
				How Your Container Should Respond to Inference Requests
				How Your Container Should Respond to Health Check (Ping) Requests
				Use a Private Docker Registry for Real-Time Inference Containers
					Store Images in a Private Docker Registry other than Amazon Elastic Container Registry
					Use an Image from a Private Docker Registry for Real-time Inference
					Allow SageMaker to authenticate to a private Docker registry
					Create the Lambda function
					Give your execution role permission to Lambda
					Create an interface VPC endpoint for Lambda
			Use Your Own Inference Code with Batch Transform
				How SageMaker Runs Your Inference Image
				How SageMaker Loads Your Model Artifacts
				How Containers Serve Requests
				How Your Container Should Respond to Inference Requests
				How Your Container Should Respond to Health Check (Ping) Requests
	Example Notebooks: Use Your Own Algorithm or Model
		Setup
		Host Models Trained in Scikit-learn
		Package TensorFlow and Scikit-learn Models for Use in SageMaker
		Train and Deploy a Neural Network on SageMaker
		Training Using Pipe Mode
		Bring Your Own R Model
		Extend a Prebuilt PyTorch Container Image
		Train and Debug Training Jobs on a Custom Container
	Troubleshooting your Docker containers
SageMaker Workflows
	Amazon SageMaker Model Building Pipelines
		SageMaker Pipelines Overview
			Pipeline Structure
			Access Management
				Pipeline Role Permissions
				Pipeline Step Permissions
				Service Control Policies with Pipelines
			Pipeline Parameters
			Pipeline Steps
				Step Types
					Processing Step
					Training Step
					Tuning Step
					CreateModel Step
					RegisterModel Step
					Transform Step
					Condition Step
					Callback Step
					Lambda Step
				Step Properties
				Data Dependency Between Steps
				Custom Dependency Between Steps
				Use a Custom Image in a Step
			Property Files and JsonGet
			Caching Pipeline Steps
				Enabling Step Caching
			Amazon EventBridge Integration
				Schedule a Pipeline with Amazon EventBridge
					Prerequisites
					Create an EventBridge rule using the EventBridge console
					Create an EventBridge rule using the AWS CLI
			Amazon SageMaker Experiments Integration
				Default Behavior
				Disable Experiments Integration
				Specify a Custom Experiment Name
				Specify a Custom Trial Name
			SageMaker Pipelines Quotas
				Pipeline Quotas
				Executions Quotas
				Step Quotas
				Parameters Quotas
				Condition Step Quotas
				Property Files Quotas
				Metadata Quotas
			Troubleshooting Amazon SageMaker Model Building Pipelines
		Create and Manage SageMaker Pipelines
			Define a Pipeline
				Prerequisites
					Set Up Your Environment
				Create a Pipeline
					Step 1: Download the Dataset
					Step 2: Define Pipeline Parameters
					Step 3: Define a Processing Step for Feature Engineering
					Step 4: Define a Training Step
					Step 5: Define a Processing Step for Model Evaluation
					Step 6: Define a CreateModelStep for Batch Transformation
					Step 7: Define a TransformStep to Perform Batch Transformation
					Step 8: Define a RegisterModel Step to Create a Model Package
					Step 9: Define a Condition Step to Verify Model Accuracy
					Step 10: Create a Pipeline
			Run a Pipeline
				Prerequisites
				Step 1: Start the Pipeline
				Step 2: Examine a Pipeline Execution
				Step 3: Override Default Parameters for a Pipeline Execution
				Step 4: Stop and Delete a Pipeline Execution
			View, Track, and Execute SageMaker Pipelines in SageMaker Studio
				View a Pipeline
				View a Pipeline Execution
				View Experiment Entities Created by SageMaker Pipelines
				Execute a Pipeline
				Track the Lineage of a SageMaker ML Pipeline
	Automate MLOps with SageMaker Projects
		What is a SageMaker Project?
			When Should You Use a SageMaker Project?
			Do I Need to Create a Project to Use SageMaker Pipelines?
		Why Should You Use MLOps?
			Challenges with MLOps
			Benefits of MLOps
		SageMaker Studio Permissions Required to Use Projects
		Create an MLOps Project using Amazon SageMaker Studio
		MLOps Project Templates
			Use SageMaker Provided Project Templates
				MLOps template for model building, training, and deployment
				MLOps template for model building, training, and deployment with third-party Git repositories using CodePipeline
				MLOps template for model building, training, and deployment with third-party Git repositories using Jenkins
				Update SageMaker Projects to Use Third-Party Git Repositories
			Create Custom Project Templates
		View Project Resources
		SageMaker MLOps Project Walkthrough
			Step 1: Create the Project
			Step 2: Clone the Code Repository
			Step 3: Make a Change in the Code
			Step 4: Approve the Model
			(Optional) Step 5: Deploy the Model Version to Production
			Step 6: Clean Up Resources
	Amazon SageMaker ML Lineage Tracking
		Tracking Entities
		Amazon SageMaker Created Tracking Entities
			Tracking Entities for SageMaker Jobs
			Tracking Entities for Model Packages
			Tracking Entities for Endpoints
		Manually Create Tracking Entities
			Manually Create Entities
			Manually Track a Workflow
			Limits
	Kubernetes Orchestration
		SageMaker Operators for Kubernetes
			What is an operator?
				Prerequisites
				Permissions overview
			IAM role-based setup and operator deployment
				Cluster-scoped deployment
					Create an OpenID Connect Provider for Your Cluster
					Get the OIDC ID
					Create an IAM Role
					Attach the AmazonSageMakerFullAccess Policy to the Role
					Deploy the Operator
						Deploy the Operator Using YAML
						Deploy the Operator Using Helm Charts
					Verify the operator deployment
				Namespace-scoped deployment
					Create an OpenID Connect Provider for Your Amazon EKS cluster
					Get your OIDC ID
					Create your IAM Role
					Attach the AmazonSageMakerFullAccess Policy to your Role
					Deploy the Operator to Your Namespace
						Deploy the Operator to Your Namespace Using YAML
						Deploy the Operator to Your Namespace Using Helm Charts
					Verify the operator deployment to your namespace
				Install the SageMaker logs kubectl plugin
			Delete operators
				Delete cluster-based operators
					Operators installed using YAML
					Operators installed using Helm Charts
				Delete namespace-based operators
					Operators installed with YAML
					Operators installed with Helm Charts
			Troubleshooting
				Debugging a Failed Job
				Deleting an Operator CRD
			Images and SMlogs in each Region
			Using Amazon SageMaker Jobs
				TrainingJob operator
					Create a TrainingJob Using a YAML File
					Create a TrainingJob Using a Helm Chart
						Create the TrainingJob
						Verify Your Training Helm Chart
					List TrainingJobs
						TrainingJob Status Values
						Secondary Status Values
					Describe a TrainingJob
					View Logs from TrainingJobs
					Delete TrainingJobs
				HyperParameterTuningJob operator
					Create a HyperparameterTuningJob Using a YAML File
					Create a HyperparameterTuningJob using a Helm Chart
						Create the HyperparameterTuningJob
						Verify Chart Installation
					List HyperparameterTuningJobs
						Hyperparameter Tuning Job Status Values
						Status Counters
						Best TrainingJob
						Spawned TrainingJobs
					Describe a HyperparameterTuningJob
					View Logs from HyperparameterTuningJobs
					Delete a HyperparameterTuningJob
				BatchTransformJob operator
					Create a BatchTransformJob Using a YAML File
					Create a BatchTransformJob Using a Helm Chart
						Get the Helm installer directory
						Configure the Helm Chart
						Create a BatchTransformJob
					List BatchTransformJobs
						Batch Transform Status Values
					Describe a BatchTransformJob
					View Logs from BatchTransformJobs
					Delete a BatchTransformJob
				HostingDeployment operator
					Configure a HostingDeployment Resource
					Create a HostingDeployment
					List HostingDeployments
						HostingDeployment Status Values
					Describe a HostingDeployment
					Invoking the Endpoint
					Update HostingDeployment
					Delete the HostingDeployment
				ProcessingJob operator
					Create a ProcessingJob Using a YAML File
					List ProcessingJobs
					Describe a ProcessingJob
					Delete a ProcessingJob
				HostingAutoscalingPolicy (HAP) Operator
					Create a HostingAutoscalingPolicy Using a YAML File
						Sample 1: Apply a Predefined Metric to a Single Endpoint Variant
						Sample 2: Apply a Custom Metric to a Single Endpoint Variant
						Sample 3: Apply a Scaling Policy to Multiple Endpoints and Variants
						Considerations for HostingAutoscalingPolicies for Multiple Endpoints and Variants
					List HostingAutoscalingPolicies
					Describe a HostingAutoscalingPolicy
					Update a HostingAutoscalingPolicy
					Delete a HostingAutoscalingPolicy
					Update or Delete an Endpoint with a HostingAutoscalingPolicy
		SageMaker Components for Kubeflow Pipelines
			What is Kubeflow Pipelines?
			Kubeflow Pipeline components
				What do SageMaker Components for Kubeflow Pipelines provide?
					Training components
					Inference components
					Ground Truth components
			IAM permissions
			Converting Pipelines to use SageMaker
			Using SageMaker Components
				Setup
					Set up a gateway node
					Set up an Amazon EKS cluster
					Install Kubeflow Pipelines
					Access the KFP UI
						Set up port forwarding to the KFP UI service
						Access the KFP UI service
					Create IAM Users/Roles for KFP pods and the SageMaker service
						Create a KFP execution role
						Create a SageMaker execution role
					Add access to additional IAM users or roles
				Running the Kubeflow Pipeline
					Prepare datasets
					Create a Kubeflow Pipeline using SageMaker Components
						Input Parameters
					Compile and deploy your pipeline
						Install KFP SDK
						Compile your pipeline
						Upload and run the pipeline using the KFP CLI
						Upload and run the pipeline using the KFP UI
					Running predictions
						Configure permissions to run predictions
						Run predictions
					View results and logs
					Cleanup
Using Amazon Augmented AI for Human Review
	Get Started with Amazon Augmented AI
		Core Components of Amazon A2I
			Task Types
			Human Review Workflow (Flow Definition)
			Human Loops
		Prerequisites to Using Augmented AI
		Tutorial: Get Started in the Amazon A2I Console
			Prerequisites
			Step 1: Create a Work Team
			Step 2: Create a Human Review Workflow
			Step 3: Start a Human Loop
			Step 4: View Human Loop Status in Console
			Step 5: Download Output Data
		Tutorial: Get Started Using the Amazon A2I API
			Create a Private Work Team
			Create a Human Review Workflow
				Create a Human Task UI
				Create JSON to specify activation conditions
				Create a human review workflow
			Create a Human Loop
	Use Cases and Examples Using Amazon A2I
		Use SageMaker Notebook Instance with Amazon A2I Jupyter Notebook
		Use Amazon Augmented AI with Amazon Textract
			Get Started: Integrate a Human Review into an Amazon Textract Analyze Document Job
			End-to-End Example Using Amazon Textract and Amazon A2I
			A2I Textract Worker Console Preview
		Use Amazon Augmented AI with Amazon Rekognition
			Get Started: Integrate a Human Review into an Amazon Rekognition Image Moderation Job
			End-to-end Demo Using Amazon Rekognition and Amazon A2I
			A2I Rekognition Worker Console Preview
		Use Amazon Augmented AI with Custom Task Types
			End-to-end Tutorial Using Amazon A2I Custom Task Types
	Create a Human Review Workflow
		Create a Human Review Workflow (Console)
			Next Steps
		Create a Human Review Workflow (API)
			Next Steps
		JSON Schema for Human Loop Activation Conditions in Amazon Augmented AI
			Use Human Loop Activation Conditions JSON Schema with Amazon Textract
				ImportantFormKeyConfidenceCheck Inputs and Results
				MissingImportantFormKey Inputs and Results
				Sampling Inputs and Results
				Examples
			Use Human Loop Activation Conditions JSON Schema with Amazon Rekognition
				ModerationLabelConfidenceCheck Inputs
				Sampling Inputs
				Examples
	Delete a Human Review Workflow
		Delete a Flow Definition Using the Console or the SageMaker API
	Create and Start a Human Loop
		Create and Start a Human Loop for a Built-in Task Type
			Create an Amazon Textract Human Loop
			Create an Amazon Rekognition Human Loop
		Create and Start a Human Loop for a Custom Task Type
		Next Steps
	Delete a Human Loop
		Human Loop Data Retention and Deletion
		Stop and Delete a Flow Definition Using the Console or the Amazon A2I API
	Create and Manage Worker Task Templates
		Create and Delete Worker Task Templates
			Create a Worker Task Template
			Delete a Worker Task Template
		Create Custom Worker Task Templates
			Develop Templates Locally
			Use External Assets
			Track Your Variables
			Custom Template Example for Amazon Textract
			Custom Template Example for Amazon Rekognition
			Add Automation with Liquid
				Use Variable Filters
					Autoescape and Explicit Escape
					escape_once
					skip_autoescape
					to_json
					grant_read_access
			Preview a Worker Task Template
		Creating Good Worker Instructions
			Create Good Worker Instructions
			Add Example Images to Your Instructions
	Monitor and Manage Your Human Loop
	Amazon A2I Output Data
		Output Data From Built-In Task Types
		Output Data From Custom Task Types
		Track Worker Activity
	Permissions and Security in Amazon Augmented AI
		CORS Permission Requirement
		Add Permissions to the IAM Role Used to Create a Flow Definition
		Create an IAM User That Can Invoke Amazon A2I API Operations
		Create an IAM User With Permissions to Invoke Amazon A2I, Amazon Textract, and Amazon Rekognition API Operations
		Enable Worker Task Template Previews
		Using Amazon A2I with AWS KMS Encrypted Buckets
		Additional Permissions and Security Resources
	Use Amazon CloudWatch Events in Amazon Augmented AI
		Send Events from Your Human Loop to CloudWatch Events
		Set Up a Target to Process Events
		Use Human Review Output
		More Information
	Use APIs in Amazon Augmented AI
		Programmatic Tutorials
Buy and Sell Amazon SageMaker Algorithms and Models in AWS Marketplace
	Topics
	SageMaker Algorithms
	SageMaker Model Packages
	Use your own algorithms and models with the AWS Marketplace
		Create Algorithm and Model Package Resources
			Create an Algorithm Resource
				Create an Algorithm Resource (Console)
				Create an Algorithm Resource (API)
			Create a Model Package Resource
				Create a Model Package Resource (Console)
				Create a Model Package Resource (API)
		Use Algorithm and Model Package Resources
			Use an Algorithm to Run a Training Job
				Use an Algorithm to Run a Training Job (Console)
				Use an Algorithm to Run a Training Job (API)
				Use an Algorithm to Run a Training Job (Amazon SageMaker Python SDK)
			Use an Algorithm to Run a Hyperparameter Tuning Job
				Use an Algorithm to Run a Hyperparameter Tuning Job (Console)
				Use an Algorithm to Run a Hyperparameter Tuning Job (API)
				Use an Algorithm to Run a Hyperparameter Tuning Job (Amazon SageMaker Python SDK)
			Use a Model Package to Create a Model
				Use a Model Package to Create a Model (Console)
				Use a Model Package to Create a Model (API)
				Use a Model Package to Create a Model (Amazon SageMaker Python SDK)
	Sell Amazon SageMaker Algorithms and Model Packages
		Topics
		Develop Algorithms and Models in Amazon SageMaker
			Develop Algorithms in SageMaker
			Develop Models in SageMaker
		List Your Algorithm or Model Package on AWS Marketplace
	Find and Subscribe to Algorithms and Model Packages on AWS Marketplace
		Use Algorithms and Model Packages
Security in Amazon SageMaker
	Access Control
		Access control and SageMaker Studio notebooks
		Control root access to a SageMaker notebook instance
	Data Protection in Amazon SageMaker
		Protect Data at Rest Using Encryption
			Studio notebooks
			Notebook instances and SageMaker jobs
		Protecting Data in Transit with Encryption
			Protect Communications Between ML Compute Instances in a Distributed Training Job
				Enable Inter-Container Traffic Encryption (API)
				Enable Inter-Container Traffic Encryption (Console)
					Enable Inter-Container Traffic Encryption in a Training Job
					Enable Inter-Container Traffic Encryption in a Hyperparameter Tuning Job
		Key Management
		Internetwork Traffic Privacy
	Identity and Access Management for Amazon SageMaker
		Audience
		Authenticating with Identities
			AWS account root user
			IAM Users and Groups
			IAM Roles
		Managing Access Using Policies
			Identity-Based Policies
			Resource-Based Policies
			Access Control Lists (ACLs)
			Other Policy Types
			Multiple Policy Types
		How Amazon SageMaker Works with IAM
			SageMaker Identity-Based Policies
				Actions
				Resources
				Condition Keys
					Examples
				SageMaker Resource-Based Policies
				Authorization Based on SageMaker Tags
				SageMaker IAM Roles
					Using Temporary Credentials with SageMaker
					Service-Linked Roles
					Service Roles
					Choosing an IAM Role in SageMaker
		Amazon SageMaker Identity-Based Policy Examples
			Policy Best Practices
			Using the SageMaker Console
				Permissions Required to Use the Amazon SageMaker Console
				Permissions Required to Use the Amazon SageMaker Ground Truth Console
				Permissions Required to Use the Amazon Augmented AI (Preview) Console
			Allow Users to View Their Own Permissions
			Control Creation of SageMaker Resources with Condition Keys
				Control Access to SageMaker Resources by Using File System Condition Keys
					Restrict an IAM User to Specific Directories and Access Modes
					Restrict an IAM User to a Specific File System
				Restrict Training to a Specific VPC
				Restrict Access to Workforce Types for Ground Truth Labeling Jobs and Amazon A2I Human Review Workflows
				Enforce Encryption of Input Data
				Enforce Encryption of Notebook Instance Storage Volume
				Enforce Network Isolation for Training Jobs
				Enforce a Specific Instance Type for Training Jobs
				Enforce a Specific EI Accelerator for Training Jobs
				Enforce Disabling Internet Access and Root Access for Creating Notebook Instances
			Control Access to the SageMaker API by Using Identity-based Policies
				Restrict Access to SageMaker API and Runtime to Calls from Within Your VPC
			Limit Access to SageMaker API and Runtime Calls by IP Address
			Limit Access to a Notebook Instance by IP Address
			Control Access to SageMaker Resources by Using Tags
			Require the Presence or Absence of Tags for API Calls
		SageMaker Roles
			Get execution role
				Add Additional Amazon S3 Permissions to a SageMaker Execution Role
			Passing Roles
			CreateAutoMLJob API: Execution Role Permissions
			CreateDomain API: Execution Role Permissions
			CreateImage and UpdateImage APIs: Execution Role Permissions
			CreateNotebookInstance API: Execution Role Permissions
			CreateHyperParameterTuningJob API: Execution Role Permissions
			CreateProcessingJob API: Execution Role Permissions
			CreateTrainingJob API: Execution Role Permissions
			CreateModel API: Execution Role Permissions
		AWS Managed Policies for Amazon SageMaker
			AmazonSageMakerFullAccess
			AmazonSageMakerReadOnly
			SageMaker Updates to AWS Managed Policies
		Amazon SageMaker API Permissions: Actions, Permissions, and Resources Reference
		Troubleshooting Amazon SageMaker Identity and Access
			I Am Not Authorized to Perform an Action in SageMaker
			I Am Not Authorized to Perform iam:PassRole
			I Want to View My Access Keys
			I'm an Administrator and Want to Allow Others to Access SageMaker
			I Want to Allow People Outside of My AWS Account to Access My SageMaker Resources
	Logging and Monitoring
	Compliance Validation for Amazon SageMaker
	Resilience in Amazon SageMaker
	Infrastructure Security in Amazon SageMaker
		SageMaker Scans AWS Marketplace Training and Inference Containers for Security Vulnerabilities
		Connect to Resources in a VPC
			Connect SageMaker Studio Notebooks to Resources in a VPC
				Default communication with the internet
				VPC only communication with the internet
					Requirements to use VPC only mode
			Connect a Notebook Instance to Resources in a VPC
				Default communication with the internet
				VPC communication with the internet
				Security and Shared Notebook Instances
		Run Training and Inference Containers in Internet-Free Mode
			Network Isolation
				Network isolation with a VPC
		Connect to SageMaker Through a VPC Interface Endpoint
			Create a VPC Endpoint Policy for SageMaker
			Create a VPC Endpoint Policy for Amazon SageMaker Feature Store
			Connect to SageMaker Studio Through an Interface VPC Endpoint
				Create a VPC Endpoint Policy for SageMaker Studio
				Allow Access Only from Within Your VPC
			Connect to a Notebook Instance Through a VPC Interface Endpoint
				Connect Your Private Network to Your VPC
				Create a VPC Endpoint Policy for SageMaker Notebook Instances
				Restrict Access to Connections from Within Your VPC
			Connect Your Private Network to Your VPC
		Give SageMaker Access to Resources in your Amazon VPC
			Give SageMaker Processing Jobs Access to Resources in Your Amazon VPC
				Configure a Processing Job for Amazon VPC Access
				Configure Your Private VPC for SageMaker Processing
					Ensure That Subnets Have Enough IP Addresses
					Create an Amazon S3 VPC Endpoint
					Use a Custom Endpoint Policy to Restrict Access to S3
						Restrict Package Installation on the Processing Container
					Configure Route Tables
					Configure the VPC Security Group
					Connect to Resources Outside Your VPC
			Give SageMaker Training Jobs Access to Resources in Your Amazon VPC
				Configure a Training Job for Amazon VPC Access
				Configure Your Private VPC for SageMaker Training
					Ensure That Subnets Have Enough IP Addresses
					Create an Amazon S3 VPC Endpoint
					Use a Custom Endpoint Policy to Restrict Access to S3
						Restrict Package Installation on the Training Container
					Configure Route Tables
					Configure the VPC Security Group
					Connect to Resources Outside Your VPC
			Give SageMaker Hosted Endpoints Access to Resources in Your Amazon VPC
				Configure a Model for Amazon VPC Access
				Configure Your Private VPC for SageMaker Hosting
					Ensure That Subnets Have Enough IP Addresses
					Create an Amazon S3 VPC Endpoint
					Use a Custom Endpoint Policy to Restrict Access to Amazon S3
						Restrict Package Installation on the Model Container with a Custom Endpoint Policy
					Add Permissions for Endpoint Access for Containers Running in a VPC to Custom IAM Policies
					Configure Route Tables
					Connect to Resources Outside Your VPC
			Give Batch Transform Jobs Access to Resources in Your Amazon VPC
				Configure a Batch Transform Job for Amazon VPC Access
				Configure Your Private VPC for SageMaker Batch Transform
					Ensure That Subnets Have Enough IP Addresses
					Create an Amazon S3 VPC Endpoint
					Use a Custom Endpoint Policy to Restrict Access to S3
						Restrict Package Installation on the Model Container
					Configure Route Tables
					Configure the VPC Security Group
					Connect to Resources Outside Your VPC
			Give Amazon SageMaker Clarify Jobs Access to Resources in Your Amazon VPC
				Configure a SageMaker Clarify Job for Amazon VPC Access
					SageMaker Clarify Job Amazon VPC Subnets and Security Groups
					Configure a Model Amazon VPC for Inference
				Configure Your Private Amazon VPC for SageMaker Clarify jobs
					Connect to Resources Outside Your Amazon VPC
					Configure the Amazon VPC Security Group
			Give SageMaker Compilation Jobs Access to Resources in Your Amazon VPC
				Configure a Compilation Job for Amazon VPC Access
				Configure Your Private VPC for SageMaker Compilation
					Ensure That Subnets Have Enough IP Addresses
					Create an Amazon S3 VPC Endpoint
					Use a Custom Endpoint Policy to Restrict Access to S3
						Add Permissions for a Compilation Job Running in an Amazon VPC to Custom IAM Policies
					Configure Route Tables
					Configure the VPC Security Group
Monitor Amazon SageMaker
	Monitor Amazon SageMaker with Amazon CloudWatch
		SageMaker Endpoint Invocation Metrics
		SageMaker Multi-Model Endpoint Metrics
		SageMaker Jobs and Endpoint Metrics
		SageMaker Ground Truth Metrics
		SageMaker Feature Store Metrics
		SageMaker Pipelines Metrics
	Log Amazon SageMaker Events with Amazon CloudWatch
	Log Amazon SageMaker API Calls with AWS CloudTrail
		SageMaker Information in CloudTrail
		Operations Performed by Automatic Model Tuning
		Understanding SageMaker Log File Entries
	Automating Amazon SageMaker with Amazon EventBridge
		Training job state change
		Hyperparameter tuning job state change
		Transform job state change
		Endpoint state change
		Feature group state change
		Model package state change
		Pipeline execution state change
		Pipeline step state change
		SageMaker image state change
		SageMaker image version state change
Availability Zones
API Reference Guide for Amazon SageMaker
	Overview
	Programming Model for Amazon SageMaker
Document History for Amazon SageMaker
AWS glossary



