
You can reach us by call or SMS at the mobile numbers below:

09117307688
09117179751

If calls go unanswered, contact support via SMS.

Unlimited access

For registered users

Money-back guarantee

If the description does not match the book

Support

From 7 a.m. to 10 p.m.

Download the Book: C++ Template Metaprogramming in Practice: A Deep Learning Framework

Book Details

C++ Template Metaprogramming in Practice: A Deep Learning Framework

Edition:
Author(s):
Series:
ISBN: 2020024624, 9781003102311
Publisher:
Year of publication: 2020
Pages: 339
Language: English
File format: PDF (can be converted to EPUB or AZW3 on request)
File size: 13 MB

Price (toman): 52,000





If you need the file for C++ Template Metaprogramming in Practice: A Deep Learning Framework converted to PDF, EPUB, AZW3, MOBI, or DJVU, let support know and they will convert the file for you.

Please note that C++ Template Metaprogramming in Practice: A Deep Learning Framework is the original English-language edition, not a Persian translation. The International Library website offers books in their original language only and does not provide books translated into or written in Persian.


About the Book



Table of Contents

Cover
Half Title
Title Page
Copyright Page
Contents
Preface
Acknowledgment
PART I: INTRODUCTION
	1. Basic Tips
		1.1. Metafunction and type_traits
			1.1.1. Introduction to Metafunctions
			1.1.2. Type Metafunction
			1.1.3. Various Metafunctions
			1.1.4. type_traits
			1.1.5. Metafunctions and Macros
			1.1.6. The Nominating Method of Metafunctions in This Book
		1.2. Template Template Parameters and Container Templates
			1.2.1. Templates as the Input of Metafunctions
			1.2.2. Templates as the Output of Metafunctions
			1.2.3. Container Templates
		1.3. Writing of Sequences, Branches, and Looping Codes
			1.3.1. Codes Executed in Sequence Order
			1.3.2. Codes Executed in Branches
				1.3.2.1. Implementing Branches Using std::conditional and std::conditional_t
				1.3.2.2. Implementing Branches with (Partial) Specialization
				1.3.2.3. Implementing Branches Using std::enable_if and std::enable_if_t
				1.3.2.4. Compile-time Branches with Different Return Types
				1.3.2.5. Simplify Codes with if constexpr
			1.3.3. Codes Executed in Loops
			1.3.4. Caution: Instantiation Explosion and Compilation Crash
			1.3.5. Branch Selection and Short Circuit Logic
		1.4. Curiously Recurring Template Pattern (CRTP)
		1.5. Summary
		1.6. Exercises
	2. Heterogeneous Dictionaries and Policy Templates
		2.1. Introduction to Named Arguments
		2.2. Heterogeneous Dictionaries
			2.2.1. How to Use the Module
			2.2.2. The Representation of the Keys
			2.2.3. Implementation of Heterogeneous Dictionaries
				2.2.3.1. External Framework
				2.2.3.2. Implementation of the Function Create
				2.2.3.3. The Main Frame of Values
				2.2.3.4. Logic Analysis of NewTupleType
			2.2.4. A Brief Analysis of VarTypeDicts Performance
			2.2.5. std::tuple as the Cache
		2.3. Policy Templates
			2.3.1. Introduction to Policies
				2.3.1.1. Policy Objects
				2.3.1.2. Policy Object Templates
			2.3.2. Defining Policies and Policy Objects (Templates)
				2.3.2.1. Policy Grouping
				2.3.2.2. Declarations of Macros and Policy Objects (Templates)
			2.3.3. Using Policies
			2.3.4. Background Knowledge: Dominance and Virtual Inheritance
			2.3.5. Policy Objects and Policy Dominance Structures
			2.3.6. Policy Selection Metafunction
				2.3.6.1. Main Frame
				2.3.6.2. The Metafunction MinorCheck_
				2.3.6.3. Construct the Final Return Type
			2.3.7. Simplifying Declarations of Policy Objects with Macros
		2.4. Summary
		2.5. Exercises
PART II: THE DEEP LEARNING FRAMEWORK
	3. A Brief Introduction to Deep Learning
		3.1. Introduction to Deep Learning
			3.1.1. From Machine Learning to Deep Learning
			3.1.2. A Wide Variety of Artificial Neural Networks
				3.1.2.1. Artificial Neural Networks and Matrix Operations
				3.1.2.2. Deep Neural Network
				3.1.2.3. Recurrent Neural Networks
				3.1.2.4. Convolutional Neural Networks
				3.1.2.5. Components of Neural Networks
			3.1.3. Organization and Training of Deep Learning Systems
				3.1.3.1. Network Structure and Loss Function
				3.1.3.2. Model Training
				3.1.3.3. Predictions with Models
		3.2. The Framework Achieved in This Book: MetaNN
			3.2.1. From Computing Tools of Matrices to Deep Learning Frameworks
			3.2.2. Introduction to MetaNN
			3.2.3. What We Will Discuss
				3.2.3.1. Data Representation
				3.2.3.2. Matrix Operations
				3.2.3.3. Layers and Automatic Derivation
				3.2.3.4. Evaluation and Performance Optimization
			3.2.4. Topics Not Covered in This Book
		3.3. Summary
	4. Type System and Basic Data Types
		4.1. The Type System
			4.1.1. Introduction to the Type System
			4.1.2. Classification Systems of Iterators
			4.1.3. Use Tags as Template Parameters
			4.1.4. The Type System of MetaNN
			4.1.5. Metafunctions Related to the Type System
				4.1.5.1. Metafunction IsXXX
				4.1.5.2. Metafunction DataCategory
		4.2. Design Concepts
			4.2.1. Support for Different Computing Devices and Computing Units
			4.2.2. Allocation and Maintenance of Storage Space
				4.2.2.1. Class Template Allocator
				4.2.2.2. Class Template ContinuousMemory
			4.2.3. Shallow Copy and Detection of Write Operations
				4.2.3.1. Data Types without Requirements of Support for Element-level Reading and Writing
				4.2.3.2. Element-level Writing and Shallow Copy
			4.2.4. Expansion of Underlying Interfaces
			4.2.5. Type Conversion and Evaluation
			4.2.6. Data Interface Specifications
		4.3. Scalars
			4.3.1. Declaration of Class Templates
			4.3.2. A Specialized Version Based on CPU
				4.3.2.1. Type Definitions and Data Members
				4.3.2.2. Construction, Assignment, and Movement
				4.3.2.3. Reading and Writing Elements
				4.3.2.4. Evaluating Related Interfaces
			4.3.3. The Principal Type of Scalars
		4.4. Matrix
			4.4.1. Class Template Matrix
				4.4.1.1. Declarations and Interfaces
				4.4.1.2. Dimensional Information and Element-level Reading and Writing
				4.4.1.3. Submatrix
				4.4.1.4. Underlying Access Interfaces of Matrix
			4.4.2. Special Matrices: Trivial Matrix, Zero Matrix, and One-hot Vector
				4.4.2.1. Trivial Matrix
				4.4.2.2. Zero Matrix
				4.4.2.3. One-hot Vector
			4.4.3. Introducing a New Matrix Class
		4.5. List
			4.5.1. Template Batch
			4.5.2. Template Array
				4.5.2.1. Introduction of the Template Array
				4.5.2.2. Class Template ArrayImp
				4.5.2.3. Metafunction IsIterator
				4.5.2.4. Construction of Array Objects
			4.5.3. Duplication and Template Duplicate
				4.5.3.1. Introduction of the Template Duplicate
				4.5.3.2. Class Template DuplicateImp
				4.5.3.3. Construction of the Duplicate Object
		4.6. Summary
		4.7. Exercises
	5. Operations and Expression Templates
		5.1. Introduction to Expression Templates
		5.2. The Design Principles of Operation Templates in MetaNN
			5.2.1. Problems of the Operation Template Add
			5.2.2. Behavior Analysis of Operation Templates
				5.2.2.1. Validation and Derivation of Types
				5.2.2.2. Division of Object Interfaces
				5.2.2.3. Auxiliary Class Templates
		5.3. Classification of Operations
		5.4. Auxiliary Templates
			5.4.1. The Auxiliary Class Template OperElementType_/OperDeviceType_
			5.4.2. The Auxiliary Class Template OperXXX_
			5.4.3. The Auxiliary Class Template OperCateCal
			5.4.4. The Auxiliary Class Template OperOrganizer
				5.4.4.1. Specialization for Scalars
				5.4.4.2. Specialization for Lists of Scalars
				5.4.4.3. Other Specialized Versions
			5.4.5. The Auxiliary Class Template OperSeq
		5.5. Framework for Operation Templates
			5.5.1. Category Tags for Operation Templates
			5.5.2. Definition of UnaryOp
		5.6. Examples of Operation Implementations
			5.6.1. Sigmoid Operation
				5.6.1.1. Function Interface
				5.6.1.2. Template OperSigmoid_
				5.6.1.3. User Calls
			5.6.2. Operation Add
				5.6.2.1. Function Interface
				5.6.2.2. The Implementation Framework of OperAdd_
				5.6.2.3. The Implementation of OperAdd_::Eval
			5.6.3. Operation Transpose
			5.6.4. Operation Collapse
		5.7. The List of Operations Supported by MetaNN
			5.7.1. Unary Operations
			5.7.2. Binary Operations
			5.7.3. Ternary Operations
		5.8. The Trade-off and Limitations of Operations
			5.8.1. The Trade-off of Operations
			5.8.2. Limitations of Operations
		5.9. Summary
		5.10. Exercises
	6. Basic Layers
		6.1. Design Principles of Layers
			6.1.1. Introduction to Layers
			6.1.2. Construction of Layer Objects
				6.1.2.1. Information Delivered through Constructors
				6.1.2.2. Information Specified through Template Parameters
			6.1.3. Initialization and Loading of Parameter Matrices
			6.1.4. Forward Propagation
			6.1.5. Preservation of Intermediate Results
			6.1.6. Backward Propagation
			6.1.7. Update of Parameter Matrices
			6.1.8. Acquisition of Parameter Matrices
			6.1.9. Neutral Detection of Layers
		6.2. Auxiliary Logic for Layers
			6.2.1. Initializing the Module
				6.2.1.1. Using the Initialization Module
				6.2.1.2. MakeInitializer
				6.2.1.3. Class Template ParamInitializer
				6.2.1.4. Class Template Initializer
			6.2.2. Class Template DynamicData
				6.2.2.1. Base Class Template DynamicCategory
				6.2.2.2. Derived Class Template DynamicWrapper
				6.2.2.3. Encapsulating Behaviors of Pointers with DynamicData
				6.2.2.4. Category Tags
				6.2.2.5. Auxiliary Functions and Auxiliary Metafunctions
				6.2.2.6. DynamicData and Dynamic Type System
			6.2.3. Common Policy Objects for Layers
				6.2.3.1. Parameters Relevant to Update and Backward Propagation
				6.2.3.2. Parameters Relevant to Input
				6.2.3.3. Parameters Relevant to Operations
			6.2.4. Metafunction InjectPolicy
			6.2.5. Universal I/O Structure
			6.2.6. Universal Operation Functions
		6.3. Specific Implementations of Layers
			6.3.1. AddLayer
			6.3.2. ElementMulLayer
				6.3.2.1. Recording Intermediate Results
				6.3.2.2. Forward and Backward Propagation
				6.3.2.3. Neutral Detection
			6.3.3. BiasLayer
				6.3.3.1. Basic Framework
				6.3.3.2. Initialization and Loading of Parameters
				6.3.3.3. Obtaining Parameters
				6.3.3.4. Forward and Backward Propagation
				6.3.3.5. Collecting Parameter Gradients
				6.3.3.6. Neutral Detection
		6.4. Basic Layers Achieved in MetaNN
		6.5. Summary
		6.6. Exercises
	7. Composite and Recurrent Layers
		7.1. Interfaces and Design Principles of Composite Layers
			7.1.1. Basic Structure
			7.1.2. Syntax for Structural Description
			7.1.3. The Inheritance Relationship of Policies
			7.1.4. Correction of Policies
			7.1.5. Constructors of Composite Layers
			7.1.6. An Example of Complete Construction of a Composite Layer
		7.2. Implementation of Policy Inheritance and Correction
			7.2.1. Implementation of Policy Inheritance
				7.2.1.1. The Container SubPolicyContainer and the Functions Related
				7.2.1.2. The Implementation of PlainPolicy
				7.2.1.3. The Metafunction SubPolicyPicker
			7.2.2. Implementation of Policy Correction
		7.3. The Implementation of ComposeTopology
			7.3.1. Features
			7.3.2. Introduction to the Topological Sorting Algorithm
			7.3.3. Main Steps Included in ComposeTopology
			7.3.4. Clauses for Structural Description and Their Classification
			7.3.5. Examination of Structural Validity
			7.3.6. Implementation of Topological Sorting
				7.3.6.1. Pre-processing of Topological Sorting
				7.3.6.2. Main Logic
				7.3.6.3. Post-processing of Topological Sorting
			7.3.7. The Metafunction to Instantiate Sublayers
				7.3.7.1. Calculation of Policies for Each Sublayer
				7.3.7.2. Examination of the Output Gradient
				7.3.7.3. Policy Correction
				7.3.7.4. Instantiations of Sublayers
		7.4. The Implementation of ComposeKernel
			7.4.1. Declarations of Class Templates
			7.4.2. Management of Sublayer Objects
			7.4.3. Parameter Acquisition, Gradient Collection, and Neutrality Detection
			7.4.4. Initialization and Loading of Parameters
			7.4.5. Forward Propagation
				7.4.5.1. The Interface ComposeKernel::FeedForward
				7.4.5.2. Saving the Calculation Results of Sublayers
				7.4.5.3. FeedForwardFun
				7.4.5.4. Constructing Input Containers of Sublayers
				7.4.5.5. The Implementation Logic of InputFromInConnect
				7.4.5.6. The Implementation Logic of InputFromInternalConnect
				7.4.5.7. Forward Propagation and Filling in Results of Output
			7.4.6. Backward Propagation
		7.5. Examples of Composite Layer Implementations
		7.6. Recurrent Layers
			7.6.1. GruStep
			7.6.2. Building a RecurrentLayer Class Template
				7.6.2.1. Main Definitions of RecurrentLayer
				7.6.2.2. How to Use RecurrentLayer
				7.6.2.3. Implementations of Functions Such as SaveWeights
				7.6.2.4. Forward Propagation
				7.6.2.5. Backward Propagation
				7.6.2.6. The Function FeedStepBackward
			7.6.3. The Use of RecurrentLayer
		7.7. Summary
		7.8. Exercises
	8. Evaluation and Its Optimization
		8.1. Evaluation Models of MetaNN
			8.1.1. Hierarchy of Operations
			8.1.2. Module Division of Evaluation Subsystems
				8.1.2.1. Overview of an Evaluation Process
				8.1.2.2. EvalPlan
				8.1.2.3. EvalPool
				8.1.2.4. EvalUnit
				8.1.2.5. EvalGroup
				8.1.2.6. EvalHandle
				8.1.2.7. EvalBuffer
				8.1.2.8. The Auxiliary Function Evaluate
		8.2. Basic Evaluation Logic
			8.2.1. Evaluation Interface of Principal Types
			8.2.2. Evaluation of Non-principal Basic Data Types
			8.2.3. Evaluation of Operation Templates
			8.2.4. DynamicData and Evaluation
		8.3. Optimization of Evaluation Processes
			8.3.1. Avoiding Repetitive Computations
			8.3.2. Merging Similar Computations
			8.3.3. Multi-operation Co-optimization
				8.3.3.1. Background
				8.3.3.2. MetaNN Solutions
				8.3.3.3. Matching Evaluation Structures at Compile Time
				8.3.3.4. Equality Judgment of Objects in MetaNN
				8.3.3.5. Auto-trigger Optimization
		8.4. Summary
		8.5. Exercises
Postscript
Index
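
Illustrative Code Sketches

To give a flavor of the techniques named in the table of contents, here are three minimal C++17 sketches. They are our own illustrations of the general ideas, not code from the book. The first relates to Sections 1.1 and 1.3: a type metafunction built on type_traits, plus a compile-time branch written with if constexpr. The names MakeUnsigned_ and Describe are invented for this sketch.

#include <iostream>
#include <type_traits>

// A type metafunction: maps an integral type to its unsigned
// counterpart and leaves every other type unchanged.
template <typename T, typename = void>
struct MakeUnsigned_ { using type = T; };

template <typename T>
struct MakeUnsigned_<T, std::enable_if_t<std::is_integral_v<T>>> {
    using type = std::make_unsigned_t<T>;
};

// Alias that hides the ::type access, a common convenience.
template <typename T>
using MakeUnsigned = typename MakeUnsigned_<T>::type;

// A compile-time branch (cf. Section 1.3.2.5): each branch may even
// have a different return type, because only one is instantiated.
template <typename T>
auto Describe(const T& value) {
    if constexpr (std::is_floating_point_v<T>)
        return value * 2.0;   // floating-point branch
    else
        return value + 1;     // integral branch
}

int main() {
    static_assert(std::is_same_v<MakeUnsigned<int>, unsigned int>);
    static_assert(std::is_same_v<MakeUnsigned<double>, double>);
    std::cout << Describe(3) << ' ' << Describe(1.5) << '\n';  // prints: 4 3
}

The trailing underscore on MakeUnsigned_ follows a common convention of separating a metafunction's implementation from its user-facing alias; the book lays out its own naming rules in Section 1.1.6.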
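
Section 1.4 covers the Curiously Recurring Template Pattern (CRTP), which the layer chapters later build on. Below is a generic CRTP sketch; the layer names are borrowed from Chapter 6's headings, but the interface shown is invented for illustration and is not MetaNN's actual API.

#include <iostream>

// CRTP: the base class template takes the derived class as a template
// parameter, so the base can call derived-class methods with static
// dispatch and no virtual-function overhead.
template <typename TDerived>
struct LayerBase {
    void FeedForward() {
        static_cast<TDerived*>(this)->FeedForwardImp();
    }
};

struct AddLayer : LayerBase<AddLayer> {
    void FeedForwardImp() { std::cout << "AddLayer forward pass\n"; }
};

struct BiasLayer : LayerBase<BiasLayer> {
    void FeedForwardImp() { std::cout << "BiasLayer forward pass\n"; }
};

int main() {
    AddLayer add;
    BiasLayer bias;
    add.FeedForward();   // resolved at compile time to AddLayer::FeedForwardImp
    bias.FeedForward();  // resolved at compile time to BiasLayer::FeedForwardImp
}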
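
Chapter 5 treats operations as expression templates: constructing an operation records its operands but performs no arithmetic, deferring the work until evaluation. A self-contained sketch of that idea follows. The names Vec, AddExpr, and MakeAdd are invented, and operands are stored by value here to sidestep lifetime questions that a real framework handles more carefully.

#include <cstddef>
#include <iostream>
#include <vector>

// A trivial vector wrapper with element read access.
struct Vec {
    std::vector<double> data;
    double operator[](std::size_t i) const { return data[i]; }
    std::size_t size() const { return data.size(); }
};

// An expression node: stores its operands and computes an element
// only when that element is read.
template <typename TLhs, typename TRhs>
struct AddExpr {
    TLhs lhs;  // stored by value, for simplicity in this sketch
    TRhs rhs;
    double operator[](std::size_t i) const { return lhs[i] + rhs[i]; }
    std::size_t size() const { return lhs.size(); }
};

template <typename TLhs, typename TRhs>
AddExpr<TLhs, TRhs> MakeAdd(const TLhs& l, const TRhs& r) { return {l, r}; }

int main() {
    Vec a{{1, 2, 3}};
    Vec b{{10, 20, 30}};
    auto expr = MakeAdd(a, MakeAdd(a, b));  // builds a tree; no additions yet
    for (std::size_t i = 0; i < expr.size(); ++i)
        std::cout << expr[i] << ' ';        // computed on read: 12 24 36
    std::cout << '\n';
}

MetaNN's real operation templates add category tags, evaluation buffers, and optimization hooks on top of this basic deferral idea, as Chapters 5 and 8 of the contents suggest.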



