Edition: 2
Author: Thushan Ganegedara
Series:
ISBN: 1838641351, 9781838641351
Publisher: Packt Publishing
Publication year: 2022
Number of pages: 515
Language: English
File format: PDF (can be converted to PDF, EPUB, or AZW3 on request)
File size: 17 MB
If you would like the book Natural Language Processing with TensorFlow: The definitive NLP book to implement the most sought-after machine learning models and tasks, 2nd Edition converted to PDF, EPUB, AZW3, MOBI, or DJVU, you can notify support to convert the file for you.
Please note that Natural Language Processing with TensorFlow: The definitive NLP book to implement the most sought-after machine learning models and tasks, 2nd Edition is the original English edition, not a Persian translation. The International Library website offers original-language books only and does not provide any books translated into or written in Persian.
From introductory NLP tasks to Transformer models, this new edition teaches you to utilize powerful TensorFlow APIs to implement end-to-end NLP solutions driven by performant ML (Machine Learning) models
Learning how to solve natural language processing (NLP) problems is an important skill to master due to the explosive growth of data combined with the demand for machine learning solutions in production. Natural Language Processing with TensorFlow, Second Edition, will teach you how to solve common real-world NLP problems with a variety of deep learning model architectures.
The book starts by getting readers familiar with NLP and the basics of TensorFlow. Then, it gradually teaches you different facets of TensorFlow 2.x. In the following chapters, you then learn how to generate powerful word vectors, classify text, generate new text, and generate image captions, among other exciting use-cases of real-world NLP.
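For a sense of the kind of model involved, the snippet below is a minimal, illustrative sketch of a Keras text classifier; the toy sentences, labels, vocabulary size, and layer sizes are invented for illustration and are not taken from the book.

```python
# Illustrative sketch only (not code from the book): a minimal Keras text
# classifier; the toy sentences, labels, and layer sizes below are invented.
import tensorflow as tf

texts = ["the movie was great", "what a waste of time",
         "loved every minute", "truly awful"]
labels = [1.0, 0.0, 1.0, 0.0]  # 1 = positive, 0 = negative (toy data)

# Map raw strings to integer token IDs.
vectorizer = tf.keras.layers.TextVectorization(max_tokens=1000,
                                               output_sequence_length=8)
vectorizer.adapt(texts)

model = tf.keras.Sequential([
    vectorizer,                                                 # strings -> token IDs
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16),   # IDs -> word vectors
    tf.keras.layers.GlobalAveragePooling1D(),                   # average vectors per sentence
    tf.keras.layers.Dense(1, activation="sigmoid"),             # binary sentiment score
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(tf.constant(texts), tf.constant(labels), epochs=5, verbose=0)
```

The same pattern (vectorize, embed, pool, predict) scales up to the larger architectures covered in later chapters.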
TensorFlow has evolved to be an ecosystem that supports a machine learning workflow through ingesting and transforming data, building models, monitoring, and productionization. We will then read text directly from files and perform the required transformations through a TensorFlow data pipeline. We will also see how to use a versatile visualization tool known as TensorBoard to visualize our models.
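As a rough illustration of that workflow, the sketch below reads lines of text through a tf.data pipeline and attaches a TensorBoard callback; the file path data/corpus.txt, the transformations, and the log directory are hypothetical placeholders rather than the book's own example.

```python
# A minimal sketch of the workflow described above, assuming a plain-text corpus
# at "data/corpus.txt" (hypothetical path) with one example per line.
import tensorflow as tf

# tf.data pipeline: read text straight from the file and transform it.
dataset = (
    tf.data.TextLineDataset("data/corpus.txt")
    .map(tf.strings.lower)                              # example transformation
    .filter(lambda line: tf.strings.length(line) > 0)   # drop empty lines
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)

# Write logs that TensorBoard can display (run: tensorboard --logdir logs).
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs")
# The batched dataset can then feed a Keras model, passing callbacks=[tensorboard_cb]
# to model.fit so training curves appear in TensorBoard.
```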
By the end of this NLP book, you will be comfortable with using TensorFlow to build deep learning models with many different architectures, and efficiently ingest data using TensorFlow. Additionally, you'll be able to confidently use TensorFlow throughout your machine learning workflow.
This book is for Python developers and programmers with a strong interest in deep learning, who want to learn how to leverage TensorFlow to simplify NLP tasks.
Fundamental Python skills are assumed, as well as basic knowledge of machine learning and undergraduate-level calculus and linear algebra. No previous natural language processing experience required.
Cover
Copyright
Contributors
Table of Contents
Preface
Chapter 1: Introduction to Natural Language Processing What is Natural Language Processing? Tasks of Natural Language Processing The traditional approach to Natural Language Processing Understanding the traditional approach Example – generating football game summaries Drawbacks of the traditional approach The deep learning approach to Natural Language Processing History of deep learning The current state of deep learning and NLP Understanding a simple deep model – a fully connected neural network Introduction to the technical tools Description of the tools Installing Anaconda and Python Creating a Conda environment TensorFlow (GPU) software requirements Accessing Jupyter Notebook Verifying the TensorFlow installation Summary
Chapter 2: Understanding TensorFlow 2 What is TensorFlow? Getting started with TensorFlow 2 TensorFlow 2 architecture – What happens during graph build? TensorFlow architecture – what happens when you execute the graph? Café Le TensorFlow 2 – understanding TensorFlow 2 with an analogy Flashback: TensorFlow 1 Inputs, variables, outputs, and operations Defining inputs in TensorFlow Feeding data as NumPy arrays Feeding data as tensors Building a data pipeline using the tf.data API Defining variables in TensorFlow Defining outputs in TensorFlow Defining operations in TensorFlow Comparison operations Mathematical operations Updating (scattering) values in tensors Collecting (gathering) values from a tensor Neural network-related operations Nonlinear activations used by neural networks The convolution operation The pooling operation Defining loss Keras: The model building API of TensorFlow Sequential API Functional API Sub-classing API Implementing our first neural network Preparing the data Implementing the neural network with Keras Training the model Testing the model Summary
Chapter 3: Word2vec – Learning Word Embeddings What is a word representation or meaning? Classical approaches to learning word representation One-hot encoded representation The TF-IDF method Co-occurrence matrix An intuitive understanding of Word2vec – an approach to learning word representation Exercise: does queen = king – he + she? The skip-gram algorithm From raw text to semi-structured text Understanding the skip-gram algorithm Implementing and running the skip-gram algorithm with TensorFlow Implementing the data generators with TensorFlow Implementing the skip-gram architecture with TensorFlow Training and evaluating the model The Continuous Bag-of-Words algorithm Generating data for the CBOW algorithm Implementing CBOW in TensorFlow Training and evaluating the model Summary
Chapter 4: Advanced Word Vector Algorithms GloVe – Global Vectors representation Understanding GloVe Implementing GloVe Generating data for GloVe Training and evaluating GloVe ELMo – Taking ambiguities out of word vectors Downloading ELMo from TensorFlow Hub Preparing inputs for ELMo Generating embeddings with ELMo Document classification with ELMo Dataset Generating document embeddings Classifying documents with document embeddings Summary
Chapter 5: Sentence Classification with Convolutional Neural Networks Introducing CNNs CNN fundamentals The power of CNNs Understanding CNNs Convolution operation Standard convolution operation Convolving with stride Convolving with padding Transposed convolution Pooling operation Max pooling Max pooling with stride Average pooling Fully connected layers Putting everything together Exercise – image classification on Fashion-MNIST with CNN About the data Downloading and exploring the data Implementing the CNN Analyzing the predictions produced with a CNN Using CNNs for sentence classification How data is transformed for sentence classification Implementation – downloading and preparing data Implementation – building a tokenizer The sentence classification CNN model The convolution operation Pooling over time Implementation – sentence classification with CNNs Training the model Summary
Chapter 6: Recurrent Neural Networks Understanding RNNs The problem with feed-forward neural networks Modeling with RNNs Technical description of an RNN Backpropagation Through Time How backpropagation works Why we cannot use BP directly for RNNs Backpropagation Through Time – training RNNs Truncated BPTT – training RNNs efficiently Limitations of BPTT – vanishing and exploding gradients Applications of RNNs One-to-one RNNs One-to-many RNNs Many-to-one RNNs Many-to-many RNNs Named Entity Recognition with RNNs Understanding the data Processing data Defining hyperparameters Defining the model Introduction to the TextVectorization layer Defining the rest of the model Evaluation metrics and the loss function Training and evaluating RNN on NER task Visually analyzing outputs NER with character and token embeddings Using convolution to generate token embeddings Implementing the new NER model Defining hyperparameters Defining the input layer Defining the token-based TextVectorization layer Defining the character-based TextVectorization layer Processing the inputs for the char_vectorize_layer Performing convolution on the character embeddings Model training and evaluation Other improvements you can make Summary
Chapter 7: Understanding Long Short-Term Memory Networks Understanding Long Short-Term Memory Networks What is an LSTM? LSTMs in more detail How LSTMs differ from standard RNNs How LSTMs solve the vanishing gradient problem Improving LSTMs Greedy sampling Beam search Using word vectors Bidirectional LSTMs (BiLSTMs) Other variants of LSTMs Peephole connections Gated Recurrent Units Summary
Chapter 8: Applications of LSTM – Generating Text Our data About the dataset Generating training, validation, and test sets Analyzing the vocabulary size Defining the tf.data pipeline Implementing the language model Defining the TextVectorization layer Defining the LSTM model Defining metrics and compiling the model Training the model Defining the inference model Generating new text with the model Comparing LSTMs to LSTMs with peephole connections and GRUs Standard LSTM Review Gated Recurrent Units (GRUs) Review The model LSTMs with peepholes Review The code Training and validation perplexities over time Improving sequential models – beam search Implementing beam search Generating text with beam search Improving LSTMs – generating text with words instead of n-grams The curse of dimensionality Word2vec to the rescue Generating text with Word2vec Summary
Chapter 9: Sequence-to-Sequence Learning – Neural Machine Translation Machine translation A brief historical tour of machine translation Rule-based translation Statistical Machine Translation (SMT) Neural Machine Translation (NMT) Understanding neural machine translation Intuition behind NMT systems NMT architecture The embedding layer The encoder The context vector The decoder Preparing data for the NMT system The dataset Adding special tokens Splitting training, validation, and testing datasets Defining sequence lengths for the two languages Padding the sentences Defining the model Converting tokens to IDs Defining the encoder Defining the decoder Attention: Analyzing the encoder states Computing Attention Implementing Attention Defining the final model Training the NMT The BLEU score – evaluating the machine translation systems Modified precision Brevity penalty The final BLEU score Visualizing Attention patterns Inference with NMT Other applications of Seq2Seq models – chatbots Training a chatbot Evaluating chatbots – the Turing test Summary
Chapter 10: Transformers Transformer architecture The encoder and the decoder Computing the output of the self-attention layer Embedding layers in the Transformer Residuals and normalization Understanding BERT Input processing for BERT Tasks solved by BERT How BERT is pre-trained Masked Language Modeling (MLM) Next Sentence Prediction (NSP) Use case: Using BERT to answer questions Introduction to the Hugging Face transformers library Exploring the data Implementing BERT Implementing and using the Tokenizer Defining a TensorFlow dataset BERT for answering questions Defining the config and the model Training and evaluating the model Answering questions with BERT Summary
Chapter 11: Image Captioning with Transformers Getting to know the data ILSVRC ImageNet dataset The MS-COCO dataset Downloading the data Processing and tokenizing data Preprocessing data Tokenizing data Defining a tf.data.Dataset The machine learning pipeline for image caption generation Vision Transformer (ViT) Text-based decoder Transformer Putting everything together Implementing the model with TensorFlow Implementing the ViT model Implementing the text-based decoder Defining the self-attention layer Defining the Transformer layer Defining the full decoder Training the model Evaluating the results quantitatively BLEU ROUGE METEOR CIDEr Evaluating the model Captions generated for test images Summary
Appendix A: Mathematical Foundations and Advanced TensorFlow Basic data structures Scalar Vectors Matrices Indexing of a matrix Special types of matrices Identity matrix Square diagonal matrix Tensors Tensor/matrix operations Transpose Matrix multiplication Element-wise multiplication Inverse Finding the matrix inverse – Singular Value Decomposition (SVD) Norms Determinant Probability Random variables Discrete random variables Continuous random variables The probability mass/density function Conditional probability Joint probability Marginal probability Bayes’ rule Visualizing word embeddings with TensorBoard Starting TensorBoard Saving word embeddings and visualizing via TensorBoard Summary
Other Books You May Enjoy
Packt Page
Other Books You May Enjoy
Index