
You can contact us by phone call or SMS at the following mobile numbers:

09117307688
09117179751

If your call is not answered, please contact support via SMS.


Download the book An Introduction to IoT Analytics

Book details

An Introduction to IoT Analytics

Edition:
Authors:
Series: Chapman & Hall/CRC Data Science Series
ISBN: 2020038957, 9780367686314
Publisher: CRC Press
Publication year: 2021
Number of pages: 373
Language: English
File format: PDF (can be converted to EPUB or AZW3 at the user's request)
File size: 12 MB

Book price (Toman): 43,000





If you would like the book An Introduction to IoT Analytics converted to PDF, EPUB, AZW3, MOBI, or DJVU format, let the support team know and they will convert the file for you.

Note that An Introduction to IoT Analytics is the original English-language edition, not a Persian translation. The International Library website provides original-language books only and does not offer any books translated into or written in Persian.


Description of the book (in the original language)



Table of Contents

Cover
Half Title
Series Page
Title Page
Copyright Page
Dedication
Table of Contents
Preface
Author
Chapter 1: Introduction
	1.1 The Internet of Things (IoT)
	1.2 IoT Application Domains
	1.3 IoT Reference Model
	1.4 Performance Evaluation and Modeling of IoT Systems
	1.5 Machine Learning and Statistical Techniques for IoT
	1.6 Overview of the Book
	Exercises
	References
Chapter 2: Review of Probability Theory
	2.1 Random Variables
	2.2 Discrete Random Variables
		2.2.1 The Binomial Random Variable
		2.2.2 The Geometric Random Variable
		2.2.3 The Poisson Random Variable
		2.2.4 The Cumulative Distribution
	2.3 Continuous Random Variables
		2.3.1 The Uniform Random Variable
		2.3.2 The Exponential Random Variable
		2.3.3 Mixtures of Exponential Random Variables
		2.3.4 The Normal Random Variable
	2.4 The Joint Probability Distribution
		2.4.1 The Marginal Probability Distribution
		2.4.2 The Conditional Probability
	2.5 Expectation and Variance
		2.5.1 The Expectation and Variance of Some Random Variables
	Exercises
	References
Chapter 3: Simulation Techniques
	3.1 Introduction
	3.2 The Discrete-event Simulation Technique
		3.2.1 Recertification of IoT Devices: A Simple Model
		3.2.2 Recertification of IoT Devices: A More Complex Model
	3.3 Generating Random Numbers
		3.3.1 Generating Pseudo-Random Numbers
		3.3.2 Generating Random Variates
	3.4 Simulation Designs
		3.4.1 The Event List
		3.4.2 Selecting the Unit Time
	3.5 Estimation Techniques
		3.5.1 Collecting Endogenously Created Data
		3.5.2 Transient-State versus Steady-State Simulation
		3.5.3 Estimation of the Confidence Interval of the Mean
		3.5.4 Estimation of the Confidence Interval of a Percentile
		3.5.5 Estimation of the Confidence Interval of a Probability
		3.5.6 Achieving a Required Accuracy
	3.6 Validation of a Simulation Model
	3.7 Simulation Languages
	Exercises
	Simulation Project
	References
Chapter 4: Hypothesis Testing
	4.1 Statistical Hypothesis Testing for a Mean
		4.1.1 The p-Value
		4.1.2 Hypothesis Testing for the Difference between Two Population Means
		4.1.3 Hypothesis Testing for a Proportion
		4.1.4 Type I and Type II Errors
	4.2 Analysis of Variance (ANOVA)
		4.2.1 Degrees of Freedom
	Exercises
	References
Chapter 5: Multivariable Linear Regression
	5.1 Simple Linear Regression
	5.2 Multivariable Linear Regression
		5.2.1 Significance of the Regression Coefficients
		5.2.2 Residual Analysis
		5.2.3 R-Squared
		5.2.4 Multicollinearity
		5.2.5 Data Transformations
	5.3 An Example
	5.4 Polynomial Regression
	5.5 Confidence and Prediction Intervals
	5.6 Ridge, Lasso, and Elastic Net Regression
		5.6.1 Ridge Regression
		5.6.2 Lasso Regression
		5.6.3 Elastic Net Regression
	Exercises
	Regression Project
		Data Set Generation
	References
Chapter 6: Time Series Forecasting
	6.1 A Stationary Time Series
		6.1.1 How to Recognize Seasonality
		6.1.2 Techniques for Removing Non-Stationary Features
	6.2 Moving Average or Smoothing Models
		6.2.1 The Simple Average Model
		6.2.2 The Exponential Moving Average Model
		6.2.3 The Average Age of a Model
		6.2.4 Selecting the Best Value for k and a
	6.3 The Moving Average MA(q) Model
		6.3.1 Derivation of the Mean and Variance of X_t
		6.3.2 Derivation of the Autocorrelation Function of the MA(1)
		6.3.3 Invertibility of MA(q)
	6.4 The Autoregressive Model
		6.4.1 The AR(1) Model
		6.4.2 Stationarity Condition of AR(p)
		6.4.3 Derivation of the Coefficients a_i, i = 1, 2, …, p
		6.4.4 Determination of the Order of AR(p)
	6.5 The Non-Seasonal ARIMA(p, d, q) Model
		6.5.1 Determination of the ARIMA Parameters
	6.6 Decomposition Models
		6.6.1 Basic Steps for the Decomposition Model
	6.7 Forecast Accuracy
	6.8 Prediction Intervals
	6.9 Vector Autoregression
		6.9.1 Fitting a VAR( p)
	Exercises
	Forecasting Project
		Data Set
	References
Chapter 7: Dimensionality Reduction
	7.1 A Review of Eigenvalues and Eigenvectors
	7.2 Principal Component Analysis (PCA)
		7.2.1 The PCA Algorithm
	7.3 Linear and Multiple Discriminant Analysis
		7.3.1 Linear Discriminant Analysis (LDA)
		7.3.2 Multiple Discriminant Analysis (MDA)
	Exercises
	References
Chapter 8: Clustering Techniques
	8.1 Distance Metrics
	8.2 Hierarchical Clustering
		8.2.1 The Hierarchical Clustering Algorithm
		8.2.2 Linkage Criteria
	8.3 The k-Means Algorithm
		8.3.1 The Algorithm
		8.3.2 Determining the Number k of Clusters
			a. Silhouette Scores
			b. Akaike’s Information Criterion (AIC)
	8.4 The Fuzzy c-Means Algorithm
	8.5 The Gaussian Mixture Decomposition
	8.6 The DBSCAN Algorithm
		8.6.1 Determining MinPts and ε
		8.6.2 Advantages and Disadvantages of DBSCAN
	Exercises
	Clustering Project
		Data Set Generation
	References
Chapter 9: Classification Techniques
	9.1 The k-Nearest Neighbor (k-NN) Method
		9.1.1 Selection of k
		9.1.2 Using Kernels with the k-NN Method
		9.1.3 Curse of Dimensionality
		9.1.4 Voronoi Diagrams
		9.1.5 Advantages and Disadvantages of the k-NN Method
	9.2 The Naive Bayes Classifier
		9.2.1 The Simple Bayes Classifier
		9.2.2 The Naive Bayes Classifier
		9.2.3 The Gaussian Naive Bayes Classifier
		9.2.4 Advantages and Disadvantages
		9.2.5 The k-NN Method Using Bayes’ Theorem
	9.3 Decision Trees
		9.3.1 Regression Trees
		9.3.2 Classification Trees
		9.3.3 Pre-Pruning and Post-Pruning
		9.3.4 Advantages and Disadvantages of Decision Trees
		9.3.5 Decision Trees Ensemble Methods
	9.4 Logistic Regression
		9.4.1 The Binary Logistic Regression
		9.4.2 Multinomial Logistic Regression
		9.4.3 Ordinal Logistic Regression
	Exercises
	Classification Project
	References
Chapter 10: Artificial Neural Networks
	10.1 The Feedforward Artificial Neural Network
	10.2 Other Artificial Neural Networks
	10.3 Activation Functions
	10.4 Calculation of the Output Value
	10.5 Selecting the Number of Layers and Nodes
	10.6 The Backpropagation Algorithm
		10.6.1 The Gradient Descent Algorithm
		10.6.2 Calculation of the Gradients
	10.7 Stochastic, Batch, Mini-Batch Gradient Descent Methods
	10.8 Feature Normalization
	10.9 Overfitting
		10.9.1 The Early Stopping Method
		10.9.2 Regularization
		10.9.3 The Dropout Method
	10.10 Selecting the Hyper-Parameters
		10.10.1 Selecting the Learning Rate γ
		10.10.2 Selecting the Regularization Parameter λ
	Exercises
	Neural Network Project
		Data Set Generation
	References
Chapter 11: Support Vector Machines
	11.1 Some Basic Concepts
	11.2 The SVM Algorithm: Linearly Separable Data
	11.3 Soft-Margin SVM (C-SVM)
	11.4 The SVM Algorithm: Non-Linearly Separable Data
	11.5 Other SVM methods
	11.6 Multiple Classes
	11.7 Selecting the Best Values for C and γ
	11.8 ε-Support Vector Regression (ε-SVR)
	Exercises
	SVM Project
		Data Set Generation
	References
Chapter 12: Hidden Markov Models
	12.1 Markov Chains
	12.2 Hidden Markov Models – An Example
	12.3 The Three Basic HMM Problems
		12.3.1 Problem 1 – The Evaluation Problem
		12.3.2 Problem 2 – The Decoding Problem
		12.3.3 Problem 3 – The Learning Problem
	12.4 Mathematical Notation
	12.5 Solution to Problem 1
		12.5.1 A Brute Force Solution
		12.5.2 The Forward–Backward Algorithm
	12.6 Solution to Problem 2
		12.6.1 The Heuristic Solution
		12.6.2 The Viterbi Algorithm
	12.7 Solution to Problem 3
	12.8 Selection of the Number of States N
	12.9 Forecasting O_{T+t}
	12.10 Continuous Observation Probability Distributions
	12.11 Autoregressive HMMs
	Exercises
	HMM Project
		Data Set Generation
	References
Appendix A: Some Basic Concepts of Queueing Theory
Appendix B: Maximum Likelihood Estimation (MLE)
	B.1 The MLE Method
	B.2 Relation of MLE to Bayesian Inference
	B.3 MLE and the Least Squares Method
	B.4 MLE of the Gaussian MA(1)
	B.5 MLE of the Gaussian AR(1)
Index
	A
	B
	C
	D
	E
	F
	G
	H
	I
	J
	K
	L
	M
	N
	O
	P
	Q
	R
	S
	T
	U
	V



