
Download: Geometry and Statistics

Book details

Geometry and Statistics

Edition: 
Authors: , , 
Series: Handbook of Statistics, 46
ISBN: 0323913458, 9780323913454
Publisher: Academic Press
Year: 2022
Pages: 488 [490]
Language: English
File format: PDF
File size: 15 MB

Price (Toman): 32,000





About the book

Geometry and Statistics, Volume 46 in the Handbook of Statistics series, highlights new advances in the field, with this new volume presenting interesting chapters written by an international board of authors.



Table of contents

Front Cover
Geometry and Statistics
Copyright
Contents
Contributors
Preface
Section I: Foundations in classical geometry and analysis
	Chapter 1: Geometry, information, and complex bundles
		1. Introduction
		2. Complex planes
			2.1. Important implications of Liouville's theorem
		3. Geometric analysis and Jordan curves
		4. Summary
		References
	Chapter 2: Geometric methods for sampling, optimization, inference, and adaptive agents
		1. Introduction
		2. Accelerated optimization
			2.1. Principle of geometric integration
			2.2. Conservative flows and symplectic integrators
			2.3. Rate-matching integrators for smooth optimization
			2.4. Manifold and constrained optimization
			2.5. Gradient flow as a high friction limit
			2.6. Optimization on the space of probability measures
		3. Hamiltonian-based accelerated sampling
			3.1. Optimizing diffusion processes for sampling
			3.2. Hamiltonian Monte Carlo
		4. Statistical inference with kernel-based discrepancies
			4.1. Topological methods for MMDs
			4.2. Smooth measures and KSDs
				4.2.1. The canonical Stein operator and Poincaré duality
				4.2.2. Kernel Stein discrepancies and score matching
			4.3. Information geometry of MMDs and natural gradient descent
				4.3.1. Minimum Stein discrepancy estimators
				4.3.2. Likelihood-free inference with generative models
		5. Adaptive agents through active inference
			5.1. Modeling adaptive decision-making
				5.1.1. Behavior, agents, and environments
				5.1.2. Decision-making in precise agents
				5.1.3. The information geometry of decision-making
			5.2. Realizing adaptive agents
				5.2.1. The basic active inference algorithm
				5.2.2. Sequential decision-making under uncertainty
				5.2.3. World model learning as inference
				5.2.4. Scaling active inference
		Acknowledgments
		References
	Chapter 3: Equivalence relations and inference for sparse Markov models
		1. Introduction
			1.1. Improved modeling capabilities of sparse Markov models (SMMs)
		2. Fitting SMMs and example applications
			2.1. Model fitting based on a collapsed Gibbs sampler
				2.1.1. Modeling wind speeds
				2.1.2. Modeling a DNA sequence
			2.2. Fitting SMM through regularization
				2.2.1. Application to classifying viruses
		3. Equivalence relations and the computation of distributions of pattern statistics for SMMs
			3.1. Notation
			3.2. Computing distributions in higher-order Markovian sequences
			3.3. Specializing the computation to SMM
			3.4. Application to spaced seed coverage
		4. Summary
		Acknowledgments
		References
Section II: Information geometry
	Chapter 4: Symplectic theory of heat and information geometry
		1. Preamble
		2. Life and seminal work of Souriau on Lie groups thermodynamics
		3. From information geometry to Lie groups thermodynamics
		4. Symplectic structure of Fisher metric and entropy as Casimir function in coadjoint representation
			4.1. Symplectic Fisher Metric structures given by Souriau model
			4.2. Entropy characterization as generalized Casimir invariant function in coadjoint representation and Poisson Cohomology
			4.3. Koszul Poisson Cohomology and entropy characterization
		5. Covariant maximum entropy density by Souriau model
			5.1. Gauss density on Poincaré unit disk covariant with respect to SU(1,1) Lie group
			5.2. Gauss density on Siegel unit disk covariant with respect to SU(N,N) Lie group
			5.3. Gauss density on Siegel upper half plane
		6. Conclusion
		References
		Further reading
	Chapter 5: A unifying framework for some directed distances in statistics
		1. Divergences, statistical motivations, and connections to geometry
			1.1. Basic requirements on divergences (directed distances)
			1.2. Some statistical motivations
			1.3. Incorporating density function zeros
			1.4. Some motivations from probability theory
			1.5. Divergences and geometry
			1.6. Some incentives for extensions
				1.6.1. phi-Divergences between other statistical objects
				1.6.2. Some non-phi-divergences between probability distributions
				1.6.3. Some non-phi-divergences between other statistical objects
		2. The framework
			2.1. Statistical functionals S and their dissimilarity
			2.2. The divergences (directed distances) D
			2.3. The reference measure λ
			2.4. The divergence generator phi
			2.5. The scaling and the aggregation functions m1, m2, and m3
				2.5.1. m1(x) = m2(x) =: m(x), m3(x) = r(x)·m(x) ∈ [0, ∞] for some (measurable) function r: X → R satisfying r(x) ∈ ]−∞, 0[ ∪ ]0, ∞[ for λ- ...
					2.5.1.1. m1(x) = m2(x) := 1, m3(x) = r(x) for some (measurable) function r: X → [0, ∞] satisfying r(x) ∈ ]0, ∞[ for λ-a.a. x ∈ X
					2.5.1.2. m1(x) = m2(x) := Sx(Q), m3(x) = r(x)·Sx(Q) ∈ [0, ∞] for some (measurable) function r: X → R satisfying r(x) ∈ ]−∞, 0[ ∪ ]0, ∞[ for ...
					2.5.1.3. m1(x) = m2(x) := w(Sx(P), Sx(Q)), m3(x) = r(x)·w(Sx(P), Sx(Q)) ∈ [0, ∞[ for some (measurable) functions w: R(S(P)) × R( ...
				2.5.2. m1(x) = Sx(P) and m2(x) = Sx(Q) with statistical functional S ∈ S, m3(x) ≥ 0
			2.6. Auto-divergences
			2.7. Connections with optimal transport and coupling
		3. Aggregated/integrated divergences
		4. Dependence expressing divergences
		5. Bayesian contexts
		6. Variational representations
		7. Some further variants
		Acknowledgments
		References
	Chapter 6: The analytic dually flat space of the mixture family of two prescribed distinct Cauchy distributions
		1. Introduction and motivation
		2. Differential-geometric structures induced by smooth convex functions
			2.1. Hessian manifolds and Bregman manifolds
			2.2. Bregman manifolds: Dually flat spaces
		3. Some illustrating examples
			3.1. Exponential family manifolds
				3.1.1. Natural exponential family
				3.1.2. Fisher–Rao manifold of the categorical distributions
			3.2. Regular cone manifolds
			3.3. Mixture family manifolds
				3.3.1. Definition
				3.3.2. The categorical distributions: A discrete mixture family
		4. Information geometry of the mixture family of two distinct Cauchy distributions
			4.1. Cauchy mixture family of order 1
			4.2. An analytic example with closed-form dual potentials
		5. Conclusion
		Appendix. Symbolic computing notebook in MAXIMA
		References
	Chapter 7: Local measurements of nonlinear embeddings with information geometry
		1. Introduction
		2. α-Divergence and autonormalizing
		3. α-Discrepancy of an embedding
		4. Empirical α-discrepancy
		5. Connections to existing methods
			5.1. Neighborhood embeddings
			5.2. Autoencoders
		6. Conclusion and extensions
		Appendices
		Appendix A. Proof of Lemma 1
		Appendix B. Proof of Proposition 1
		Appendix C. Proof of Proposition 2
		Appendix D. Proof of Theorem 1
		References
Section III: Advanced geometrical intuition
	Chapter 8: Parallel transport, a central tool in geometric statistics for computational anatomy: Application to cardiac m ...
		1. Introduction
			1.1. Diffeomorphometry
			1.2. Longitudinal models
			1.3. Parallel transport for intersubject normalization
			1.4. Chapter organization
		2. Parallel transport with ladder methods
			2.1. Numerical accuracy of Schild's and pole ladders
				2.1.1. Elementary construction of Schild's ladder
				2.1.2. Taylor expansion
				2.1.3. Numerical scheme and convergence
				2.1.4. Pole ladder
				2.1.5. Infinitesimal schemes
			2.2. A short overview of the LDDMM framework
			2.3. Ladder methods with LDDMM
				2.3.1. Validation
		3. Application to cardiac motion modeling
			3.1. The right ventricle and its diseases
			3.2. Motion normalization with parallel transport
				3.2.1. Interaction between shape and deformations: A scale problem
			3.3. An intuitive rescaling of LDDMM parallel transport
				3.3.1. Hypothesis
				3.3.2. Criterion and estimation of λ
				3.3.3. Results
					3.3.3.1. Relationship between λ and VolED
			3.4. Changing the metric to preserve relative volume changes
				3.4.1. Model
				3.4.2. Implementation
				3.4.3. Geodesics
				3.4.4. Results
			3.5. Analysis of the normalized deformations
				3.5.1. Geodesic and spline regression
					3.5.1.1. Results
				3.5.2. Hotelling tests on velocities
		4. Conclusion
		Acknowledgments
		Abbreviations
		References
	Chapter 9: Geometry and mixture models
		1. Introduction
			1.1. Fundamentals of modeling with mixtures
			1.2. Mixtures and the fundamentals of geometry
			1.3. Structure of article
		2. Identification, singularities, and boundaries
			2.1. Mixtures of finite distributions
		3. Likelihood geometry
		4. General geometric structures
		5. Singular learning theory
			5.1. Bayesian methods
			5.2. Singularities and algebraic geometry
			5.3. Singular learning and model selection
		6. Nonstandard testing problems
		7. Discussion
		References
	Chapter 10: Gaussian distributions on Riemannian symmetric spaces of nonpositive curvature
		1. Introduction
		2. Gaussian distributions and RMT
			2.1. From Gauss to Shannon
			2.2. The "right" Gaussian
			2.3. The normalizing factor Z(σ)
			2.4. MLE and maximum entropy
			2.5. Barycenter and covariance
			2.6. Z(σ) from RMT
			2.7. The asymptotic distribution
			2.8. Duality: The Θ distributions
		3. Gaussian distributions and Bayesian inference
			3.1. MAP versus MMS
			3.2. Bounding the distance
			3.3. Computing the MMS
				3.3.1. Metropolis-Hastings algorithm
				3.3.2. The empirical barycenter
			3.4. Proof of Proposition 13
		Appendix A. Riemannian symmetric spaces
			A.1. The noncompact case
			A.2. The compact case
			A.3. Example of Propositions A.1 and A.2
		Appendix B. Convex optimization
			B.1. Convex sets and functions
			B.2. Second-order Taylor formula
			B.3. Taylor with retractions
			B.4. Riemannian gradient descent
				B.4.1. Strictly convex case
				B.4.2. Strongly convex case
		Appendix C. Proofs for Section B
		References
	Chapter 11: Multilevel contours on bundles of complex planes*
		1. Introduction
		2. Infinitely many bundles of complex planes
		3. Multilevel contours in a random environment
			3.1. Behavior of X(z_l(t), l) as l → 0
			3.2. Loss of spaces in bundle B(ℂ)
		4. Islands and holes in B(ℂ)
			4.1. Consequences of B(ℂ)∖ℂ_l on multilevel contours
			4.2. PDEs for the dynamics of lost space
		5. Concluding remarks
		Acknowledgments
		References
Index
Back Cover



