Category: Mechanics: Nonlinear Dynamics and Chaos
Edition:
Authors: Andreas Galka
Series: Advanced Series in Nonlinear Dynamics: Volume 14
ISBN: 9789810241483
Publisher: World Scientific Pub Co Inc
Publication year: 2000
Number of pages: 360
Language: English
File format: PDF (can be converted to PDF, EPUB or AZW3 at the user's request)
File size: 27 MB
Keywords related to the book Topics in Nonlinear Time Series Analysis: With Implications for EEG Analysis: Mathematics, Nonlinear Dynamics
If you would like the book Topics in Nonlinear Time Series Analysis: With Implications for EEG Analysis converted to PDF, EPUB, AZW3, MOBI or DJVU format, you can notify support and they will convert the file for you.
Please note that Topics in Nonlinear Time Series Analysis: With Implications for EEG Analysis is the original-language edition, not a Persian translation. The International Library website offers original-language books only and does not provide any books translated into or written in Persian.
This book provides a thorough review of a class of powerful algorithms for the numerical analysis of complex time series data obtained from dynamical systems. These algorithms are based on the concept of state space representations of the underlying dynamics, as introduced by nonlinear dynamics. In particular, current algorithms for state space reconstruction, correlation dimension estimation, testing for determinism and surrogate data testing are presented — algorithms which have played a central role in the investigation of deterministic chaos and related phenomena since 1980. Special emphasis is given to the much-disputed issue of whether these algorithms can be successfully employed for the analysis of the human electroencephalogram.
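As a rough illustration of two of the techniques named above (a sketch, not code taken from the book), the following Python snippet builds a time-delay embedding of a scalar signal and computes a naive Grassberger-Procaccia correlation sum C(r); the slope of log C(r) against log r over the scaling region estimates the correlation dimension. NumPy is assumed, the noisy sine wave is only a hypothetical stand-in signal, and the brute-force pairwise distances are suitable only for small data sets.

# Minimal sketch: time-delay embedding and a naive Grassberger-Procaccia
# correlation sum. Assumes NumPy; the test signal is a placeholder.
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct state vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def correlation_sum(points, r):
    """Fraction of distinct pairs of reconstructed points closer than r."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)   # distinct pairs only
    return np.mean(dists[iu] < r)

# Example: a noisy sine wave as a stand-in for a measured time series.
t = np.arange(2000) * 0.05
x = np.sin(t) + 0.1 * np.random.randn(len(t))
pts = delay_embed(x, dim=3, tau=10)[::5]          # subsample to keep it cheap
radii = np.logspace(-1.5, 0.5, 10)
C = np.array([correlation_sum(pts, r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C + 1e-12), 1)[0]
print(f"estimated correlation dimension ~ {slope:.2f}")

The book's later chapters deal with exactly the practical questions such a toy sketch ignores: how to choose the delay and embedding dimension, how much data is needed, and how edge effects, noise and autocorrelation bias the estimate.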
Contents 10
Preface 8
Chapter 1 Introduction 18
1.1 Linearity and the beginning of time series analysis 18
1.2 Irregular time series and determinism 20
1.3 The objective of nonlinear time series analysis 21
1.4 Outline of the organisation of the present study 22
Chapter 2 Dynamical systems, time series and attractors 26
2.1 Overview 26
2.2 Dynamical systems and state spaces 26
2.3 Measurements and time series 27
2.4 Deterministic dynamical systems 29
2.4.1 Attractors 29
2.4.2 Linear systems 32
2.4.3 Invariant measures 32
2.4.4 Sensitive dependence on initial conditions 35
2.4.5 Maps and discretised flows 36
2.4.6 Some important maps 38
2.4.7 Some important flows 41
2.5 Stochastic dynamical systems 45
2.5.1 Pure noise time series 46
2.5.2 Noise in dynamical systems 47
2.5.3 Linear stochastic systems 48
2.6 Nonstationarity 50
2.7 Experimental and observational time series 52
2.7.1 Electroencephalograms 53
Chapter 3 Linear methods 56
3.1 Overview 56
3.2 Linear autocorrelation 56
3.3 Fourier spectrum estimation 57
3.3.1 Discrete Fourier transform and power spectrum 57
3.3.2 Practical application of Fourier spectrum estimation 59
3.4 Linear prediction and linear filtering 63
Chapter 4 State space reconstruction: Theoretical foundations 66
4.1 Overview 66
4.2 The reconstruction problem 66
4.3 Definition of an embedding 68
4.4 Measures of the distortion due to embedding 69
4.5 The embedding theorem of Whitney and its generalisation 70
4.6 Time-delay embedding 72
4.7 The embedding theorem of Takens and its generalisation 74
4.8 Some historical remarks 76
4.9 Filtered time-delay embedding 76
4.9.1 Derivatives and Legendre coordinates 77
4.9.2 Principal components: definition and properties 81
4.9.3 Principal components: applications 84
4.10 Other reconstruction methods 86
4.11 Interspike intervals 87
Chapter 5 State space reconstruction: Practical application 90
5.1 Overview 90
5.2 The effect of noise on state space reconstruction 90
5.3 The choice of the time delay 92
5.4 In search of optimal embedding parameters 95
5.4.1 The Fillfactor algorithm 96
5.4.2 Comparing different reconstructions by PCA 99
5.4.3 The Integral Local Deformation (ILD) algorithm 101
5.4.4 Other algorithms for the estimation of optimal embedding parameters 104
Chapter 6 Dimensions: Basic definitions 110
6.1 Overview 110
6.2 Why estimate dimensions? 111
6.3 Topological dimension 112
6.4 Hausdorff dimension 112
6.5 Capacity dimension 114
6.6 Generalisation of the Hausdorff dimension 115
6.7 Generalisation of capacity dimension 117
6.8 Information dimension 119
6.9 Continuous definition of generalised dimensions 120
6.10 Pointwise dimension 120
6.11 Invariance of dimension under reconstruction 121
6.12 Invariance of dimension under filtering 123
6.13 Methods for the calculation of dimensions 124
6.13.1 Box-counting algorithm 124
6.13.2 Pairwise-distance algorithm 126
Chapter 7 Lyapunov exponents and entropies 130
7.1 Overview 130
7.2 Lyapunov exponents 130
7.3 Estimation of Lyapunov exponents from time series 132
7.4 Kaplan-Yorke dimension 133
7.5 Generalised entropies 134
7.6 Correlation entropy for time-delay embeddings 136
7.7 Pesin's theorem and partial dimensions 137
Chapter 8 Numerical estimation of the correlation dimension 140
8.1 Overview 140
8.2 Correlation dimension as a tail parameter 140
8.3 Estimation of the correlation integral 141
8.4 Efficient implementations 143
8.5 The choice of metric 144
8.6 Typical behaviour of C(r) 145
8.7 Dynamical range of C(r) 148
8.8 Dimension estimation in the case of unknown embedding dimension 150
8.9 Global least squares approach 151
8.10 Chord estimator 153
8.11 Local-slopes approach 154
8.11.1 Implementation of the local-slopes approach 155
8.11.2 Typical behaviour of the local-slopes approach 155
8.12 Maximum-likelihood estimators 158
8.12.1 The Takens estimator 158
8.12.2 Extensions to the Takens estimator 160
8.12.3 The binomial estimator 161
8.12.4 The algorithm of Judd 162
8.13 Intrinsic dimension and nearest-neighbour algorithms 164
Chapter 9 Sources of error and data set size requirements 166
9.1 Overview 166
9.2 Classification of errors 166
9.3 Edge effects and singularities 168
9.3.1 Hypercubes with uniform measure 168
9.3.2 Underestimation due to edge effect 169
9.3.3 Data set size requirements for avoiding edge effects 170
9.3.4 Distributions with singularities 172
9.4 Lacunarity 173
9.5 Additive measurement noise 175
9.6 Finite-resolution error 176
9.7 Autocorrelation error 177
9.7.1 Periodic-sampling error 178
9.7.2 Circles 181
9.7.3 Trajectory bias and temporal autocorrelation 183
9.7.4 Space-time separation plots 186
9.7.5 Quasiperiodic signals 186
9.7.6 Topological structure of Nt-tori 189
9.7.7 Autocorrelations in Nt-tori 190
9.7.8 Noise with power-law spectrum 192
9.7.9 Unrepresentativity error 195
9.8 Statistical error 195
9.9 Other estimates of data set size requirements 197
Chapter 10 Monte Carlo analysis of dimension estimation 200
10.1 Overview 200
10.2 Calibration systems 201
10.2.1 Mackey-Glass system 201
10.2.2 Gaussian white noise 203
10.2.3 Filtered noise 205
10.3 Ns-spheres 205
10.3.1 Analytical estimation of statistical error 206
10.3.2 Minimum data set size for Ns-spheres 209
10.3.3 Monte Carlo analysis of statistical error 211
10.3.4 Limited number of reference points 214
10.3.5 Comparison between GPA and JA 215
10.3.6 Results for maximum metric 217
10.4 Multiple Lorenz systems: True state space 219
10.4.1 Monte Carlo analysis of statistical error 220
10.4.2 Comparison between GPA and JA 223
10.4.3 Results for maximum metric 225
10.5 Multiple Lorenz systems: Reconstructed state space 226
10.5.1 Exact derivative coordinates 227
10.5.2 Time-delay coordinates 229
10.5.3 Hybrid coordinates 235
Chapter 11 Surrogate data tests 238
11.1 Overview 238
11.2 Null hypotheses for surrogate data testing 239
11.3 Creation of surrogate data sets 241
11.3.1 Typical-realisation surrogates 241
11.3.2 Constrained-realisation surrogates 243
11.3.3 Surrogates with non-gaussian distribution 247
11.4 Refinements of constrained-realisation surrogate data set creation procedures 249
11.4.1 Improved AAPR surrogates 249
11.4.2 The wraparound artifact 251
11.4.3 Noisy sine waves 252
11.4.4 Limited phase randomisation 255
11.4.5 Remedies against the wraparound artifact 257
11.5 Evaluating the results of surrogate data tests 259
11.6 Interpretation of the results of surrogate data tests 261
11.7 Choice of the test statistic for surrogate data tests 262
11.8 Application of surrogate data testing to correlation dimension estimation 263
Chapter 12 Dimension analysis of the human EEG 266
12.1 Overview 266
12.2 The beginning of dimension analysis of the EEG 267
12.3 Application of dimension analysis to cerebral diseases and psychiatric disorders 268
12.3.1 EEG recordings from epileptic patients 269
12.3.2 EEG recordings from human sleep 269
12.4 Scepticism against finite dimension estimates from EEG recordings 271
12.4.1 Application of GPA to an EEG time series from sleep stage IV 272
12.4.2 Interpretation of the finite estimates found in the literature 275
12.5 Dimension analysis using moving windows 278
12.5.1 Application to nonstationary time series 279
12.5.2 Application to stationary time series 282
12.5.3 Application to a nonstationary EEG time series 284
12.6 Dimension analysis of EEG time series: Valuable or impractical? 288
Chapter 13 Testing for determinism in time series 290
13.1 Overview 290
13.2 The BDS-statistic 291
13.3 The dependence parameters δm by Savit & Green 294
13.3.1 Generalisations of the δm 297
13.3.2 Predictability parameters and the relationship between the δm and entropies 298
13.4 Testing for determinism and minimum embedding dimension 300
13.5 Continuous versus discrete data sets 303
13.6 Reduction of EEG time series to discrete phase information 304
13.7 Savit-Green analysis of ISI series from multiple Lorenz systems 308
13.7.1 Distribution of the dependence parameters δm(r) 308
13.7.2 Surrogate data testing applied to the predictability parameters Šm(r) 310
13.8 Savit-Green analysis of ISI series from nonstationary time series 313
13.9 Savit-Green analysis of ISI series from EEG time series 315
13.9.1 Analysis of an EEG time series from sleep stage IV 316
13.9.2 Analysis of a nonstationary EEG time series 318
13.10 Surrogate data testing of differenced time series 321
Chapter 14 Conclusion 326
Table of notation 332
Bibliography 338
Index 354