Authors: Chuan Shi, Xiao Wang, Philip S. Yu
Series: Artificial Intelligence: Foundations, Theory, and Algorithms
ISBN: 9811661650, 9789811661655
Publisher: Springer
Year of publication: 2022
Number of pages: 338 [329]
Language: English
File format: PDF (can be converted to PDF, EPUB, or AZW3 on request)
File size: 10 MB
If you would like the book Heterogeneous Graph Representation Learning and Applications converted to PDF, EPUB, AZW3, MOBI, or DJVU format, please notify support and they will convert the file for you.
Please note that Heterogeneous Graph Representation Learning and Applications is the original English-language edition, not a Persian translation. The International Library website offers original-language books only and does not provide books translated into or written in Persian.
Table of contents:
Foreword
Preface
About the Book
Contents
About the Authors
1 Introduction
1.1 Basic Concepts and Definitions
1.2 Graph Representation Learning
1.3 Heterogeneous Graph Representation Learning and Challenges
1.4 Organization of the Book
References
2 The State-of-the-Art of Heterogeneous Graph Representation
2.1 Method Taxonomy
2.1.1 Structure-Preserved Representation
2.1.2 Attribute-Assisted Representation
2.1.3 Dynamic Representation
2.1.4 Application-Oriented Representation
2.2 Technique Summary
2.2.1 Shallow Model
2.2.2 Deep Model
2.3 Open Sources
2.3.1 Benchmark Datasets
2.3.2 Open-Source Code
2.3.3 Available Tools
References
Part I Techniques
3 Structure-Preserved Heterogeneous Graph Representation
3.1 Introduction
3.2 Meta-Path Based Random Walk
3.2.1 Overview
3.2.2 The HERec Model
3.2.2.1 Model Framework
3.2.2.2 Heterogeneous Graph Embedding
3.2.2.3 Integrating Matrix Factorization with Fused HG Embedding for Recommendation
3.2.3 Experiments
3.2.3.1 Experimental Settings
3.2.3.2 Effectiveness Experiments
3.2.3.3 Cold-Start Prediction
3.3 Meta-Path Based Decomposition
3.3.1 Overview
3.3.2 The NeuACF Model
3.3.2.1 Model Framework
3.3.2.2 Aspect-Level Similarity Matrix Extraction
3.3.2.3 Learning Aspect-Level Representations
3.3.2.4 Attention-Based Aspect-Level Representations Fusion
3.3.2.5 NeuACF++: Self-Attention-Based Aspect-Level Representations Fusion
3.3.2.6 Objective Function
3.3.3 Experiments
3.3.3.1 Experimental Settings
3.3.3.2 Performance Analysis
3.4 Relation Structure Awareness
3.4.1 Overview
3.4.2 Preliminary
3.4.3 The RHINE Model
3.4.3.1 Basic Idea
3.4.3.2 Different Models for ARs and IRs
3.4.3.3 A Unified Model for HG Embedding
3.4.4 Experiments
3.4.4.1 Experimental Settings
3.4.4.2 Node Clustering
3.4.4.3 Link Prediction
3.4.4.4 Multi-Class Classification
3.5 Network Schema Preservation
3.5.1 Overview
3.5.2 The NSHE Model
3.5.2.1 Model Framework
3.5.2.2 Preserving Pairwise Proximity
3.5.2.3 Preserving Network Schema Proximity
3.5.2.4 Optimization Objective
3.5.3 Experiments
3.5.3.1 Experimental Settings
3.5.3.2 Node Classification
3.5.3.3 Node Clustering
3.6 Conclusions
References
4 Attribute-Assisted Heterogeneous Graph Representation
4.1 Introduction
4.2 Heterogeneous Graph Attention Network
4.2.1 Overview
4.2.2 The HAN Model
4.2.2.1 Node-Level Attention
4.2.2.2 Semantic-Level Attention
4.2.3 Experiments
4.2.3.1 Experimental Settings
4.2.3.2 Classification
4.2.3.3 Analysis of Hierarchical Attention Mechanism
4.3 Heterogeneous Graph Propagation Network
4.3.1 Overview
4.3.2 Semantic Confusion Analysis
4.3.2.1 Heterogeneous Graph Neural Network
4.3.2.2 Relationship Between HGNNs and Multiple Meta-Paths Based Random Walk
4.3.3 The HPN Model
4.3.3.1 Semantic Propagation Mechanism
4.3.3.2 Semantic Fusion Mechanism
4.3.4 Experiments
4.3.4.1 Experimental Settings
4.3.4.2 Clustering
4.3.4.3 Robustness to Model Depth
4.4 Heterogeneous Graph Structure Learning
4.4.1 Overview
4.4.2 The HGSL Model
4.4.2.1 Model Framework
4.4.2.2 Feature Graph Generator
4.4.2.3 Semantic Graph Generator
4.4.2.4 Optimization
4.4.3 Experiments
4.4.3.1 Experimental Settings
4.4.3.2 Node Classification
4.4.3.3 Importance Analysis of Candidate Graphs
4.5 Conclusions
References
5 Dynamic Heterogeneous Graph Representation
5.1 Introduction
5.2 Incremental Learning
5.2.1 Overview
5.2.2 The DyHNE Model
5.2.2.1 Static Modeling
5.2.2.2 Dynamic Modeling
5.2.2.3 Acceleration
5.2.3 Experiments
5.2.3.1 Experimental Settings
5.2.3.2 Effectiveness of StHNE
5.2.3.3 Effectiveness of DyHNE
5.3 Sequence Information
5.3.1 Overview
5.3.2 The SHCF Model
5.3.2.1 Embedding Layer
5.3.2.2 Sequence-Aware Heterogeneous Message Passing
5.3.2.3 Optimization Objective
5.3.3 Experiments
5.3.3.1 Experimental Settings
5.3.3.2 Performance Comparison
5.4 Temporal Interaction
5.4.1 Overview
5.4.2 The THIGE Model
5.4.2.1 Embedding Layer with Temporal Information
5.4.2.2 Short-Term Preference Modeling
5.4.2.3 Long-Term Preference Modeling
5.4.2.4 Preference Modeling of Items
5.4.2.5 Optimization Objective
5.4.3 Experiments
5.4.3.1 Experimental Settings
5.4.3.2 Performance Comparison
5.5 Conclusion
References
6 Emerging Topics of Heterogeneous Graph Representation
6.1 Introduction
6.2 Adversarial Learning
6.2.1 Overview
6.2.2 The HeGAN Model
6.2.2.1 Model Framework
6.2.2.2 Relation-Aware Discriminator
6.2.2.3 Relation-Aware Generalized Generator
6.2.2.4 Model Training
6.2.3 Experiments
6.2.3.1 Experimental Settings
6.2.3.2 Node Classification
6.2.3.3 Link Prediction
6.3 Importance Sampling
6.3.1 Overview
6.3.2 The HeteSamp Model
6.3.2.1 The General HIGE
6.3.2.2 Batch-Wise Heterogeneous Sampling
6.3.2.3 Type-Dependent Sampling Strategy
6.3.2.4 Type-Fusion Sampling Strategy
6.3.2.5 Heterogeneous Self-Normalized and Adaptive Estimators
6.3.2.6 Optimization Framework
6.3.3 Experiments
6.3.3.1 Experimental Settings
6.3.3.2 Empirical Validation
6.3.3.3 Effectiveness
6.3.3.4 Efficiency
6.4 Hyperbolic Representation
6.4.1 Overview
6.4.2 The HHNE Model
6.4.2.1 Model Framework
6.4.2.2 Hyperbolic HG Embedding
6.4.2.3 Optimization
6.4.3 Experiments
6.4.3.1 Experimental Setup
6.4.3.2 Network Reconstruction
6.4.3.3 Link Prediction
6.5 Conclusion
References
Part II Applications
7 Heterogeneous Graph Representation for Recommendation
7.1 Introduction
7.2 Top-N Recommendation
7.2.1 Overview
7.2.2 The MCRec Model
7.2.2.1 Model Framework
7.2.2.2 Characterizing Meta-path Based Context for Interaction
7.2.2.3 Improving Embeddings for Interaction Via Co-Attention Mechanism
7.2.2.4 Overall Architecture
7.2.3 Experiments
7.2.3.1 Experimental Settings
7.2.3.2 Comparisons and Analysis
7.3 Cold-Start Recommendation
7.3.1 Overview
7.3.2 The MetaHIN Model
7.3.2.1 Model Framework
7.3.2.2 Semantic-Enhanced Task Constructor
7.3.2.3 Co-Adaptation Meta-Learner
7.3.3 Experiments
7.3.3.1 Experimental Settings
7.3.3.2 Comparisons and Analysis
7.4 Author Set Recommendation
7.4.1 Overview
7.4.2 The ASI Model
7.4.2.1 Model Framework
7.4.2.2 Weighted Paper-Ego-Network Construction
7.4.2.3 Optimal Quasi-Clique with Constraint Extraction in Weighted Paper-Ego-Network
7.4.3 Experiments
7.4.3.1 Experimental Settings
7.4.3.2 Comparisons and Analysis
7.5 Conclusions
References
8 Heterogeneous Graph Representation for Text Mining
8.1 Introduction
8.2 Short Text Classification
8.2.1 Overview
8.2.2 HG Modeling for Short Texts
8.2.3 The HGAT Model
8.2.4 Experiments
8.2.4.1 Experimental Settings
8.2.4.2 Main Results
8.2.4.3 Comparison of Variants of HGAT
8.2.4.4 Case Study
8.3 News Recommendation with Long/Short-Term Interest Modeling
8.3.1 Overview
8.3.2 Problem Formulation
8.3.3 The GNewsRec Model
8.3.3.1 Text Information Extractor
8.3.3.2 Long-Term User Interest Modeling and News Modeling
8.3.3.3 Short-Term User Interest Modeling
8.3.3.4 Prediction and Training
8.3.4 Experiments
8.3.4.1 Experimental Settings
8.3.4.2 Comparisons of Different Models
8.3.4.3 Comparisons of GNewsRec Variants
8.4 News Recommendation with Preference Disentanglement
8.4.1 Overview
8.4.2 The GNUD Model
8.4.2.1 News Content Information Extractor
8.4.2.2 Heterogeneous Graph Encoder
8.4.2.3 Model Training
8.4.3 Experiments
8.4.3.1 Experimental Settings
8.4.3.2 Comparison of Different Methods
8.4.3.3 Comparison of GNUD Variants
8.4.3.4 Case Study
8.5 Conclusion
References
9 Heterogeneous Graph Representation for Industry Application
9.1 Introduction
9.2 Cash-Out User Detection
9.2.1 Overview
9.2.2 Preliminaries
9.2.3 The HACUD Model
9.2.3.1 Model Framework
9.2.3.2 Meta-Path Based Neighbors Aggregation
9.2.3.3 Feature Fusion
9.2.3.4 Hierarchical Attention
9.2.3.5 Model Learning
9.2.4 Experiments
9.2.4.1 Experimental Settings
9.2.4.2 Performance Comparison
9.2.4.3 Effects of Hierarchical Attention
9.2.4.4 Impact of Different Meta-Paths
9.3 Intent Recommendation
9.3.1 Overview
9.3.2 Problem Formulation
9.3.3 The MEIRec Model
9.3.3.1 Model Framework
9.3.3.2 Uniform Term Embedding
9.3.3.3 Meta-Path Guided Heterogeneous Graph Neural Network
9.3.3.4 User Modeling
9.3.3.5 Query Modeling
9.3.3.6 Optimization Objective
9.3.4 Experiments
9.3.4.1 Experimental Settings
9.3.4.2 Offline Performance Evaluation
9.3.4.3 Online Experiments
9.4 Share Recommendation
9.4.1 Overview
9.4.2 Problem Formulation
9.4.3 The HGSRec Model
9.4.3.1 Model Framework
9.4.3.2 Initialization with Feature Embedding
9.4.3.3 Tripartite Heterogeneous Graph Neural Networks
9.4.3.4 Dual Co-attention Mechanism
9.4.3.5 Transitive Triplet Representation
9.4.4 Experiments
9.4.4.1 Experimental Settings
9.4.4.2 Offline Performance Evaluation
9.4.4.3 Attention Analysis
9.4.4.4 Online Experiments
9.5 Friend-Enhanced Recommendation
9.5.1 Overview
9.5.2 Preliminaries
9.5.3 The SIAN Model
9.5.3.1 Model Framework
9.5.3.2 Attentive Feature Aggregator
9.5.3.3 Social Influence Coupler
9.5.3.4 Behavior Prediction and Model Learning
9.5.4 Experiments
9.5.4.1 Experimental Settings
9.5.4.2 Experimental Results
9.5.4.3 Analysis on Social Influence in FER
9.6 Conclusions
References
10 Platforms and Practice of Heterogeneous Graph Representation Learning
10.1 Introduction
10.2 Foundation Platforms
10.2.1 Deep Learning Platforms
10.2.1.1 TensorFlow
10.2.1.2 PyTorch
10.2.1.3 MXNet
10.2.1.4 PaddlePaddle
10.2.2 Platforms of Graph Machine Learning
10.2.2.1 DGL
10.2.2.2 PyG
10.2.3 Platforms of Heterogeneous Graph Representation Learning
10.3 Practice of Heterogeneous Graph Representation Learning
10.3.1 Build a New Dataset
10.3.2 Build a New Model
10.3.2.1 Model
10.3.2.2 Trainerflow
10.3.2.3 Task
10.3.2.4 Register in OpenHGNN
10.3.3 Practice of HAN
10.3.3.1 HAN Basemodel
10.3.3.2 HANlayer
10.3.4 Practice of RGCN
10.3.5 Practice of HERec
10.3.5.1 HERec Random Walk
10.3.5.2 Meta-Path Random Walk
10.4 Conclusion
References
11 Future Research Directions
11.1 Introduction
11.2 Preserving HG Structures
11.3 Capturing HG Properties
11.4 Deep Graph Learning on HG Data
11.5 Making HG Representation Reliable
11.6 Technique Deployment in Real-World Applications
11.7 Others
References