
Download the book Machine Reading Comprehension

Book details

Machine Reading Comprehension

Edition:
Series:
ISBN: 9780323901185
Publisher:
Year of publication: 2021
Number of pages: 272
Language: English
File format: PDF (converted to PDF, EPUB, or AZW3 on request)
File size: 3 MB

Price (Toman): 46,000





If you would like the Machine Reading Comprehension file converted to PDF, EPUB, AZW3, MOBI, or DJVU, contact support and they will convert the file for you.

Please note that this is the original English edition of Machine Reading Comprehension, not a Persian translation. The International Library website offers books in their original language only and does not provide any books translated into or written in Persian.


Description of the book (in the original language)



Table of contents

Front Cover
Machine Reading Comprehension
Copyright Page
Contents
About the author
Foreword by Xuedong Huang
Foreword by Zide Du
Preface
	Acknowledgment
Recommendation
I. Foundation
	1 Introduction to machine reading comprehension
		1.1 The machine reading comprehension task
			1.1.1 History of machine reading comprehension
			1.1.2 Application of machine reading comprehension
		1.2 Natural language processing
			1.2.1 The status quo of natural language processing
			1.2.2 Existing issues
				1.2.2.1 The ambiguity of language
				1.2.2.2 Common sense and reasoning skills
		1.3 Deep learning
			1.3.1 Features of deep learning
			1.3.2 Achievements of deep learning
		1.4 Evaluation of machine reading comprehension
			1.4.1 Answer forms
			1.4.2 Recall-oriented understudy for gisting evaluation: metric for evaluating freestyle answers
		1.5 Machine reading comprehension datasets
			1.5.1 Single-paragraph datasets
				1.5.1.1 RACE
				1.5.1.2 NewsQA
				1.5.1.3 CNN/DailyMail
				1.5.1.4 SQuAD
				1.5.1.5 CoQA
			1.5.2 Multiparagraph datasets
				1.5.2.1 MS MARCO
				1.5.2.2 DuReader
				1.5.2.3 QAngaroo
				1.5.2.4 HotpotQA
			1.5.3 Corpus-based datasets
				1.5.3.1 AI2 reasoning challenge
		1.6 How to make a machine reading comprehension dataset
			1.6.1 Generation of articles and questions
				1.6.1.1 Generating questions from articles
				1.6.1.2 Generate articles from questions
			1.6.2 Generation of correct answers
			1.6.3 How to build a high-quality machine reading comprehension dataset
				1.6.3.1 Distinguishing comprehension-based and matching-based models
				1.6.3.2 Evaluate the reasoning capability
				1.6.3.3 Assess common sense
				1.6.3.4 Other comprehension skills
					1.6.3.4.1 List/enumeration
					1.6.3.4.2 Mathematical operations
					1.6.3.4.3 Coreference resolution
					1.6.3.4.4 Logical reasoning
					1.6.3.4.5 Analogy
					1.6.3.4.6 Spatial–temporal relations
					1.6.3.4.7 Causal relations
					1.6.3.4.8 Common sense reasoning
					1.6.3.4.9 Schematic/rhetorical clause relations
					1.6.3.4.10 Special sentence structure
		1.7 Summary
		References
	2 The basics of natural language processing
		2.1 Tokenization
			2.1.1 Byte pair encoding
		2.2 The cornerstone of natural language processing: word vectors
			2.2.1 Word vectorization
				2.2.1.1 One-hot embedding
				2.2.1.2 Distributed representation
			2.2.2 Word2vec
				2.2.2.1 Skip-gram
				2.2.2.2 Implementation details of word2vec
		2.3 Linguistic tagging
			2.3.1 Named entity recognition
				2.3.1.1 Rule-based named entity recognition
				2.3.1.2 Feature-based named entity recognition
				2.3.1.3 Named entity recognition based on deep learning
			2.3.2 Part-of-speech tagging
				2.3.2.1 Estimate probabilities in hidden Markov model
				2.3.2.2 Maximize probabilities in hidden Markov model
				2.3.2.3 Named entity recognition and part-of-speech tagging in Python
		2.4 Language model
			2.4.1 N-gram model
			2.4.2 Evaluation of language models
		2.5 Summary
		Reference
	3 Deep learning in natural language processing
		3.1 From word vector to text vector
			3.1.1 Using the final state of recurrent neural network
			3.1.2 Convolutional neural network and pooling
			3.1.3 Parametrized weighted sum
		3.2 Answer multiple-choice questions: natural language understanding
			3.2.1 Network structure
			3.2.2 Implementing text classification
		3.3 Write an article: natural language generation
			3.3.1 Network architecture
			3.3.2 Implementing text generation
			3.3.3 Beam search
		3.4 Keep focused: attention mechanism
			3.4.1 Attention mechanism
			3.4.2 Implementing attention function
			3.4.3 Sequence-to-sequence model
		3.5 Summary
II. Architecture
	4 Architecture of machine reading comprehension models
		4.1 General architecture of machine reading comprehension models
		4.2 Encoding layer
			4.2.1 Establishing the dictionary
			4.2.2 Character embeddings
			4.2.3 Contextual embeddings
		4.3 Interaction layer
			4.3.1 Cross-attention
			4.3.2 Self-attention
			4.3.3 Contextual embeddings
		4.4 Output layer
			4.4.1 Construct the question vector
			4.4.2 Generate multiple-choice answers
			4.4.3 Generate extractive answers
			4.4.4 Generate freestyle answers
				4.4.4.1 Application of attention mechanism
				4.4.4.2 Copy-generate mechanism
		4.5 Summary
		References
	5 Common machine reading comprehension models
		5.1 Bidirectional attention flow model
			5.1.1 Encoding layer
			5.1.2 Interaction layer
				5.1.2.1 Attend from article to question
				5.1.2.2 Attend from question to article
			5.1.3 Output layer
		5.2 R-NET
			5.2.1 Gated attention-based recurrent network
			5.2.2 Encoding layer
			5.2.3 Interaction layer
			5.2.4 Output layer
		5.3 FusionNet
			5.3.1 History of word
			5.3.2 Fully-aware attention
			5.3.3 Encoding layer
			5.3.4 Interaction layer
				5.3.4.1 Word-level attention layer
				5.3.4.2 Reading layer
				5.3.4.3 Question understanding layer
				5.3.4.4 Fully-aware multilevel fusion layer
				5.3.4.5 Fully-aware self-boosted fusion layer
			5.3.5 Output layer
		5.4 Essential-term-aware retriever–reader
			5.4.1 Retriever
			5.4.2 Reader
				5.4.2.1 Relation embedding
				5.4.2.2 Feature embedding
				5.4.2.3 Attention layer
				5.4.2.4 Sequence modeling layer
				5.4.2.5 Fusion layer
				5.4.2.6 Choice interaction layer
				5.4.2.7 Output layer
		5.5 Summary
		References
	6 Pretrained language models
		6.1 Pretrained models and transfer learning
		6.2 Translation-based pretrained language model: CoVe
			6.2.1 Machine translation model
			6.2.2 Contextual embeddings
		6.3 Pretrained language model ELMo
			6.3.1 Bidirectional language model
			6.3.2 How to use ELMo
		6.4 The generative pretraining language model: generative pre-training (GPT)
			6.4.1 Transformer
				6.4.1.1 Multihead attention
				6.4.1.2 Positional encoding
				6.4.1.3 Layer normalization
				6.4.1.4 Feed-forward network
			6.4.2 GPT
			6.4.3 Apply GPT
		6.5 The phenomenal pretrained language model: BERT
			6.5.1 Masked language model
			6.5.2 Next sentence prediction
			6.5.3 Configurations of BERT pretraining
			6.5.4 Fine-tuning BERT
				6.5.4.1 Text classification tasks
				6.5.4.2 Sequence labeling tasks
			6.5.5 Improving BERT
				6.5.5.1 Better pretraining tasks
				6.5.5.2 Multitask fine-tuning
				6.5.5.3 Multiphase pretraining
				6.5.5.4 Use BERT as an encoding layer
			6.5.6 Implementing BERT fine-tuning in MRC
		6.6 Summary
		References
III. Application
	7 Code analysis of the SDNet model
		7.1 Multiturn conversational machine reading comprehension model: SDNet
			7.1.1 Encoding layer
			7.1.2 Interaction layer and output layer
		7.2 Introduction to code
			7.2.1 Code structure
			7.2.2 How to run the code
				7.2.2.1 Configuring the docker
				7.2.2.2 Download data
				7.2.2.3 Execute the code
			7.2.3 Configuration file
		7.3 Preprocessing
			7.3.1 Initialization
			7.3.2 Preprocessing
				7.3.2.1 Tokenization
				7.3.2.2 Build the vocabulary
				7.3.2.3 Get word ids
				7.3.2.4 Save the dictionary and data of new format
				7.3.2.5 Load the dictionary and embeddings
		7.4 Training
			7.4.1 Base class
			7.4.2 Subclass
				7.4.2.1 Training function
				7.4.2.2 Forward function
				7.4.2.3 Evaluation
		7.5 Batch generator
			7.5.1 Padding
			7.5.2 Preparing data for Bidirectional Encoder Representations from Transformers
		7.6 SDNet model
			7.6.1 Network class
			7.6.2 Network layers
				7.6.2.1 Attention layer
				7.6.2.2 Fully-aware attention layer
				7.6.2.3 Question vector layer
				7.6.2.4 Output layer
			7.6.3 Generate Bidirectional Encoder Representations from Transformers embeddings
		7.7 Summary
		Reference
	8 Applications and future of machine reading comprehension
		8.1 Intelligent customer service
			8.1.1 Building product knowledge base
			8.1.2 Intent understanding
				8.1.2.1 Modeling
				8.1.2.2 Training
			8.1.3 Answer generation
			8.1.4 Other modules
				8.1.4.1 Application programming interface (API) integration
				8.1.4.2 Contextual understanding
				8.1.4.3 Chit-chat
		8.2 Search engine
			8.2.1 Search engine technology
				8.2.1.1 Crawling
				8.2.1.2 Indexing
				8.2.1.3 Ranking
			8.2.2 Machine reading comprehension in search engine
			8.2.3 Challenges and future of machine reading comprehension in search engine
		8.3 Health care
		8.4 Laws
			8.4.1 Automatic judgement
			8.4.2 Crime classification
		8.5 Finance
			8.5.1 Predicting stock prices
			8.5.2 News summarization
		8.6 Education
		8.7 The future of machine reading comprehension
			8.7.1 Challenges
				8.7.1.1 Knowledge and reasoning
				8.7.1.2 Interpretability
				8.7.1.3 Low-resource machine reading comprehension
				8.7.1.4 Multimodality
					8.7.1.4.1 Question and answering on structured data
					8.7.1.4.2 Visual question answering
			8.7.2 Commercialization
				8.7.2.1 Assisting or partially replacing humans
				8.7.2.2 Completely replacing humans
		8.8 Summary
		References
Appendix A: Machine learning basics
	A.1 Types of machine learning
	A.2 Model and parameters
	A.3 Generalization and overfitting
Appendix B: Deep learning basics
	B.1 Neural network
		B.1.1 Definition
			B.1.1.1 Neuron
			B.1.1.2 Layer and network
		B.1.2 Loss function
			B.1.2.1 Mean squared error
			B.1.2.2 Cross entropy
		B.1.3 Optimization
	B.2 Common types of neural network in deep learning
		B.2.1 Convolutional neural network
		B.2.2 Recurrent neural network
			B.2.2.1 Gated recurrent unit
			B.2.2.2 Long short-term memory
		B.2.3 Dropout
	B.3 The deep learning framework PyTorch
		B.3.1 Installing PyTorch
		B.3.2 Tensor
		B.3.3 Gradient computation
		B.3.4 Network layer
			B.3.4.1 Fully connected layer
			B.3.4.2 Dropout
			B.3.4.3 Convolutional neural network
			B.3.4.4 Recurrent neural network
		B.3.5 Custom network
			B.3.5.1 Implement a custom network
			B.3.5.2 Optimize a custom network
	References
Index
Back Cover
